
Trust Decline

Importance: 25
Category: Epistemic Risk
Severity: Medium-high
Likelihood: High
Timeframe: 2025
Maturity: Growing
Type: Epistemic
Status: Ongoing

Trust erosion describes the active process of declining public confidence in institutions, experts, media, and verification systems. While the current state of societal trust is analyzed in the Societal Trust parameter page, this page focuses on trust erosion as a risk—examining the threat model, acceleration mechanisms, and responses.

For comprehensive data and analysis, see Societal Trust, which covers:

  • Current trust levels (US government trust: 77% in 1964 → 22% in 2024)
  • International comparisons and benchmarks
  • AI-driven acceleration mechanisms (liar’s dividend, deepfakes, scale asymmetry)
  • Factors that increase trust (interventions, C2PA standards, media literacy)
  • Trajectory scenarios through 2030

As a risk, trust erosion is assessed as follows:

| Dimension | Assessment | Notes |
| --- | --- | --- |
| Severity | High | Undermines democratic governance and collective action on existential risks |
| Likelihood | Very High | Already occurring; AI accelerating pre-existing trends |
| Timeline | Ongoing | Effects visible now, intensifying over 2-5 years |
| Trend | Accelerating | AI content generation scaling faster than verification capacity |
| Reversibility | Difficult | Rebuilding trust requires sustained effort over decades |

Trust erosion threatens AI safety and existential risk response through several mechanisms:

| Domain | Impact | Evidence |
| --- | --- | --- |
| AI Governance | Regulatory resistance, lab-government distrust | Only ~40% trust government to regulate AI appropriately (OECD 2024) |
| Elections | Contested results, violence | 4 in 10 with high grievance approve hostile activism (Edelman 2025) |
| Public Health | Pandemic response failure | Healthcare trust dropped 30.4 points during COVID-19 |
| Climate Action | Policy paralysis | Only ~40% believe government will reduce emissions effectively |
| International Cooperation | Treaty verification failures | Liar’s dividend undermines evidence-based agreements |

The core dynamic: low trust prevents the coordination needed to address catastrophic risks, while AI capabilities make trust harder to maintain.


The main responses, and their estimated effectiveness:

| Response | Mechanism | Effectiveness |
| --- | --- | --- |
| Content Authentication | Cryptographic verification of content origins (C2PA standard); see the sketch after this table | Medium-High |
| Epistemic Infrastructure | Strengthening fact-checking and verification systems | Medium |
| Epistemic Security | Protecting information ecosystems from manipulation | Medium |
| Deepfake Detection | Technical countermeasures to synthetic media | Medium (cat-and-mouse) |
| Media Literacy Programs | Teaching source evaluation and critical thinking | Medium (d=0.60 effect size) |
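
To make the first row concrete: the core idea behind C2PA-style content authentication is that a publisher signs a digest of the media, so anyone can later check that the bytes are unaltered and that the signature came from the claimed key. The sketch below is a minimal illustration using a generic Ed25519 signature (via the Python `cryptography` package); the "manifest" dictionary is hypothetical and much simpler than an actual C2PA manifest.

```python
# Minimal sketch of provenance verification in the spirit of C2PA-style content
# authentication. Assumption: the publisher signs a SHA-256 digest of the media
# bytes with an Ed25519 key. The manifest format here is illustrative only.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def sign_media(media_bytes: bytes, private_key: Ed25519PrivateKey) -> dict:
    """Produce a toy provenance manifest: content digest plus publisher signature."""
    digest = hashlib.sha256(media_bytes).digest()
    return {"sha256": digest, "signature": private_key.sign(digest)}


def verify_media(media_bytes: bytes, manifest: dict, public_key: Ed25519PublicKey) -> bool:
    """Check that the content matches the manifest and the signature is genuine."""
    digest = hashlib.sha256(media_bytes).digest()
    if digest != manifest["sha256"]:
        return False  # content was altered after signing
    try:
        public_key.verify(manifest["signature"], digest)
        return True
    except InvalidSignature:
        return False  # manifest was not produced by the claimed publisher


if __name__ == "__main__":
    key = Ed25519PrivateKey.generate()
    original = b"raw bytes of a news photograph"
    manifest = sign_media(original, key)
    print(verify_media(original, manifest, key.public_key()))           # True
    print(verify_media(b"tampered bytes", manifest, key.public_key()))  # False
```

A real C2PA manifest goes further, binding signed assertions about capture, edits, and signer identity via a certificate chain; the sketch shows only the hash-and-sign core.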

See Societal Trust for detailed intervention analysis.


Key Acceleration Mechanism: The Liar’s Dividend


The most concerning AI-driven dynamic is the liar’s dividend (Chesney & Citron): the mere possibility of fabricated evidence undermines trust in all evidence. Research shows politicians who falsely claim scandals are “fake news” receive 8-15% higher support than those who apologize (American Political Science Review, 2024).

This creates a double bind where neither belief nor disbelief in evidence can be rationally justified—and the effect will intensify as deepfake capabilities improve.
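
One way to see why the mere possibility of fabrication weakens all evidence is a simple Bayesian sketch (a stylized illustration, not part of the cited research): the evidential weight of a video depends on how likely it would be to exist even if the event it depicts never happened.

```latex
% Posterior odds that event E occurred, given that a video V of it circulates:
\[
\frac{P(E \mid V)}{P(\neg E \mid V)}
  \;=\;
\frac{P(E)}{P(\neg E)} \times \frac{P(V \mid E)}{P(V \mid \neg E)}
\]
% When deepfakes are hard to produce, P(V | not-E) is tiny, the likelihood ratio
% is large, and the video shifts belief strongly toward "E happened."
% As synthetic media become cheap, P(V | not-E) rises toward P(V | E), the ratio
% approaches 1, and the video barely moves beliefs, whether genuine or fabricated.
% The same dynamic that lets real footage be dismissed also stops fake footage
% from persuading: this is the double bind described above.
```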


  • Societal Trust — Comprehensive parameter page with current levels, data, mechanisms, interventions, and scenarios