Transition Turbulence

Risk Factor

Model Role: Intermediate Factor
Key Parameters: Economic Stability, Human Agency, Societal Resilience
Character: Process quality (not destination)

Transition Turbulence measures how much disruption occurs as society navigates from current AI to whatever future emerges. Unlike Ultimate Scenarios (which describe what happens), Transition Turbulence describes how rough the journey is—and that roughness affects both whether we survive (Existential Catastrophe) and what world we end up in (Long-term Trajectory).

Even if we ultimately reach a good destination, a turbulent transition causes real suffering along the way. Economic displacement, political instability, and social fragmentation during the transition matter independently of the final outcome.

Why an Intermediate Factor? Transition Turbulence is a background condition that affects Ultimate Scenarios and Ultimate Outcomes, not an endpoint in itself. High turbulence can trigger acute catastrophes (political collapse → loss of control) and constrain long-term trajectory (path dependence, destroyed institutions).


Inherently negative. High turbulence is always worse than low turbulence, all else equal. There’s no “good” version of extreme disruption—the question is how much turbulence we experience, not whether turbulence is desirable.

Level | Description
Low turbulence | Smooth adaptation, minimal disruption, institutions keep pace
Moderate turbulence | Significant disruption but recoverable, adaptation strained
High turbulence | Severe instability, cascading failures, suffering widespread
Catastrophic turbulence | System breakdown triggers existential catastrophes or permanent damage
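For concreteness, the levels can be read as bands on a single disruption score. A minimal sketch in Python, assuming a hypothetical score from 0 to 1 and illustrative thresholds; the model defines only the qualitative levels, not numeric cutoffs:

```python
from enum import Enum

class TurbulenceLevel(Enum):
    LOW = "low"                    # institutions keep pace
    MODERATE = "moderate"          # strained but recoverable
    HIGH = "high"                  # cascading failures
    CATASTROPHIC = "catastrophic"  # system breakdown

# Hypothetical cutoffs on a 0-to-1 disruption score; purely illustrative.
def classify_turbulence(disruption_score: float) -> TurbulenceLevel:
    if disruption_score < 0.25:
        return TurbulenceLevel.LOW
    if disruption_score < 0.55:
        return TurbulenceLevel.MODERATE
    if disruption_score < 0.85:
        return TurbulenceLevel.HIGH
    return TurbulenceLevel.CATASTROPHIC
```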


1. Rapid Capability Growth: AI capabilities advance faster than institutions, labor markets, and social norms can adapt. The faster the change, the more turbulence.

2. Economic Displacement: AI automation displaces workers faster than new roles emerge. Mass unemployment creates economic and political instability.

3. Racing Dynamics: Competition between labs/nations creates pressure to deploy before adequate safety testing, increasing both capability speed and coordination failures.

4. Coordination Failures: Governments, labs, and international bodies fail to coordinate on standards, safety requirements, and transition support. (A toy formalization of the four drivers follows.)
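Taken together, the drivers suggest turbulence rises with the gap between capability growth and institutional adaptation, is amplified by racing, and is dampened by coordination. A minimal sketch, with inputs normalized to [0, 1]; the functional form and weights are assumptions for illustration, not part of the source model:

```python
def turbulence_index(capability_growth: float,
                     adaptation_rate: float,
                     displacement_rate: float,
                     racing_pressure: float,
                     coordination: float) -> float:
    """Toy turbulence index; all inputs assumed normalized to [0, 1]."""
    # Driver 1: turbulence grows with the gap between capability
    # growth and institutional adaptation (floored at zero).
    pace_gap = max(0.0, capability_growth - adaptation_rate)
    # Driver 3: racing pressure amplifies the pace gap.
    # Driver 2: economic displacement adds its own contribution.
    raw = pace_gap * (1.0 + racing_pressure) + 0.5 * displacement_rate
    # Driver 4: coordination dampens the total.
    return raw * (1.0 - 0.5 * coordination)

# Example: fast capabilities, slow adaptation, strong racing, weak coordination.
print(turbulence_index(0.9, 0.3, 0.6, 0.8, 0.2))  # ~1.24 on this toy scale
```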

Domain | Low Turbulence | High Turbulence
Economic | Gradual workforce transition, safety nets absorb displacement | Mass unemployment, inequality spikes, market instability
Political | Democracies adapt, regulation keeps pace | Authoritarian backlash, polarization, institutional collapse
Social | Trust maintained, communities adapt | Fragmentation, loss of shared reality, civil unrest
Institutional | Regulators understand AI, governance effective | Governance captured or overwhelmed, rule of law erodes


High turbulence can trigger acute catastrophes:

  • Political collapse → Loss of control over AI development
  • Racing acceleration → Deployment before adequate safety
  • Institutional breakdown → No capacity to respond to emerging threats
  • Social unrest → Desperate measures, authoritarian responses

A rough enough transition can cause catastrophe on its own, even if AI itself is never misaligned.

Turbulence shapes what futures are reachable through path dependence:

  • Destroyed institutions are hard to rebuild
  • Lost trust takes generations to restore
  • Authoritarian responses to chaos tend to entrench
  • Economic disruption locks in inequality
  • Options foreclosed during crisis rarely reopen

Even if acute catastrophe is avoided, high turbulence constrains the achievable long-term trajectory.


Ultimate Scenario | Relationship
AI Takeover | Turbulence increases risk of loss of control
Human-Caused Catastrophe | Turbulence can trigger state failures and desperate actions
Long-term Lock-in | Turbulent periods often lock in emergency measures

Indicators of increasing turbulence (a toy aggregation sketch follows the list):

  1. Labor market stress: AI-related unemployment rising faster than retraining
  2. Political polarization: AI becoming partisan issue, backlash movements
  3. Regulatory lag: Governance clearly behind capability development
  4. International tension: AI competition framed as zero-sum
  5. Trust decline: Public trust in institutions/tech companies falling
  6. Social instability: Protests, strikes, civil unrest related to AI
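One way to operationalize the indicators is as a weighted dashboard score. A minimal sketch; the indicator keys, weights, and 0-to-1 readings are all hypothetical, since the source lists the warning signs without quantifying them:

```python
# Hypothetical weights for the six warning signs (sum to 1.0).
WARNING_SIGNS = {
    "labor_market_stress": 0.20,
    "political_polarization": 0.15,
    "regulatory_lag": 0.20,
    "international_tension": 0.15,
    "trust_decline": 0.15,
    "social_instability": 0.15,
}

def warning_score(readings: dict[str, float]) -> float:
    """Weighted average of indicator readings, each in [0, 1]."""
    return sum(weight * readings.get(name, 0.0)
               for name, weight in WARNING_SIGNS.items())

# Example: elevated labor-market stress and regulatory lag.
print(warning_score({"labor_market_stress": 0.7, "regulatory_lag": 0.8}))  # 0.30
```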

Economic:

  • Universal basic income or robust safety nets
  • Retraining and education programs
  • Gradual deployment policies
  • Worker transition support

Political:

  • Democratic deliberation processes for AI policy
  • International coordination mechanisms
  • Regulatory capacity building
  • Preventing authoritarian capture

Social:

  • Maintaining epistemic commons (shared facts)
  • Community resilience programs
  • Trust-building between tech and public
  • Preserving human-human social fabric

Technical:

  • Paced deployment (slowing capability rollout)
  • Interoperability requirements
  • Human-in-the-loop requirements
  • Transition period safety measures

Turbulence Level | Assessment
Some turbulence | Almost certain: significant disruption is baseline
High turbulence | Likely without deliberate intervention
Catastrophic turbulence | Possible; depends on speed of capability growth and coordination
Low turbulence | Requires active coordination and paced deployment

Key uncertainty: How fast will transformative capabilities arrive? Faster arrival = more turbulence.