Below is a compact infographic that places the leading civilisation-scale hazards on a single, intuitive axis: “chance of pushing humanity into global societal collapse before 2150.”
The numbers synthesize the most recent expert-elicitation studies, risk-perception surveys, and flagship technical reports (sources cited after the explainer).
How to read the graphic
- Horizontal bars ≈ best-guess probability (%) that the threat — acting alone or as the primary trigger of cascading failures — unravels the institutions, supply chains and knowledge bases on which modern life depends.
- Estimates are purposefully rounded (orders-of-magnitude rather than spurious precision); uncertainty grows toward the right of the chart.
- “Collapse” here means a loss of technological & institutional complexity so deep that global living standards fall and never fully recover, even if some populations survive.
Key take-aways
- Runaway AI sits on top (~10 %). Multiple surveys of thousands of AI researchers and policy analysts now cluster around a 5-to-10 % extinction-or-collapse risk this century if alignment and governance lag capability progress. forum.effectivealtruism.org, aiimpacts.org
- Engineered biothreats (~3 %) outpace natural pandemics because synthetic biology lowers the cost of tailoring pathogens for high transmissibility and lethality. RAND’s 2025 assessment notes a “step-change” in risk as DNA-writing prices fall.
- Full-scale nuclear war (~2 %) remains a live danger: expert-and-superforecaster medians hover between 1 % and 5 % before 2045, with the probability compounded by escalation pathways involving emerging tech and multipolar tensions.
- Climate-driven ecological tipping cascades (~1–2 %) could destabilise food, water and energy systems even without total extinction. IPCC-linked research on Atlantic circulation collapse and Amazon die-back underscores the systemic nature of the threat. www.theguardian.com
- Complex-systems fragility (~1 %) aggregates shocks such as global supply-chain synchronised failure, debt-crises feeding political extremism, and cyber-induced critical-infrastructure knock-ons. These rarely get stand-alone probability estimates, but modelling of coupled networks suggests order-of-1 % plausibility.
- Baseline natural hazards (zoonotic pandemics ≈ 0.5 %, supervolcano ≈ 0.1 %, kilometre-class asteroid ≈ 0.05 %) are small individually thanks to geological rarity and improved surveillance, but not negligible. Toby Ord’s The Precipice remains the canonical synthesis. en.wikipedia.org
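For illustration, the rounded estimates above can be combined into a back-of-envelope aggregate using 1 − Π(1 − pᵢ). This sketch assumes the risks are independent, which the "Many risks interact" point below explicitly warns against, and it assumes a mid-range value of 1.5 % where the text gives the 1–2 % climate range; treat the result as a rough upper-level comparison, not a forecast.

```python
# Naive aggregation of the chart's rounded best-guess probabilities.
# Independence is assumed purely for illustration; real risks interact,
# so the true combined figure could differ substantially.
risks = {
    "runaway AI": 0.10,
    "engineered biothreats": 0.03,
    "full-scale nuclear war": 0.02,
    "climate tipping cascades": 0.015,  # assumed midpoint of the 1-2 % range
    "complex-systems fragility": 0.01,
    "zoonotic pandemic": 0.005,
    "supervolcano": 0.001,
    "kilometre-class asteroid": 0.0005,
}

# P(at least one trigger) = 1 - product of per-risk survival probabilities
p_none = 1.0
for p in risks.values():
    p_none *= 1.0 - p
p_any = 1.0 - p_none

print(f"Naive combined collapse probability by 2150: {p_any:.1%}")  # ~17 %
```

Note how the total is dominated by the AI term: dropping it alone cuts the aggregate roughly in half, which is why the "preventability" discussion below focuses on the left side of the chart.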
Why the spread matters
- Preventability skews left-to-right. AI alignment frameworks, biorisk monitoring, and nuclear-confidence-building each have clear (if politically hard) interventions, whereas asteroid deflection and volcanic-aerosol mitigation are largely capped by physics.
- Many risks interact. A regional nuclear exchange could inject soot that worsens climate shocks; extreme warming could accelerate pathogen spill-over; an AI race dynamic increases nuclear-launch-on-warning pressure.
Sources & further reading
- Ord, T. (2020, 2024 updates) The Precipice best-guess risk table. www.tobyord.com
- World Economic Forum, Global Risks Report 2025 – top-10 global threats by likelihood & impact. reports.weforum.org, www.weforum.org
- Forecasting Research Institute / Open Nuclear Network expert survey on nuclear-war probability.
- RAND, Global Catastrophic Risk Assessment – Pandemics (2025) and AI Catastrophe Commentary (2025).
- Guardian / YouGov 2025 public poll on perceived nuclear-war likelihood.
- IPCC Expert Meeting (2024) on climate tipping-points; Guardian survey of climate scientists (2024). apps.ipcc.ch, www.theguardian.com
- Guardian (2024) & PNAS (2022) analyses of biodiversity-driven collapse.

Remember: all probabilities are speculative; their real value lies in helping governments and citizens compare priorities, not in delivering exact forecasts.