LAWS Proliferation Model

Last edited: 2025-12-26

Summary: Five-stage model tracking LAWS proliferation from great powers (2015-2023) to potential mass availability (2030+), projecting 60 nations with capabilities by 2030 and non-state actor access by 2032. Quantitative comparison shows LAWS proliferating 4-6x faster than nuclear weapons due to dual-use commercial technology, with 40-50 nations currently possessing some capability across three tiers.
Model Type: Timeline Projection
Target Risk: Autonomous Weapons
Importance: 62
Model Quality: Novelty 3, Rigor 4, Actionability 4, Completeness 4

Lethal Autonomous Weapons Systems (LAWS) represent one of the fastest-proliferating military technologies in history, spreading from advanced militaries to regional powers and increasingly to non-state actors at a pace that dramatically outstrips historical precedents like nuclear weapons. Unlike nuclear technology, which requires rare materials, massive infrastructure, and generates detectable signatures, autonomous weapons rely entirely on dual-use commercial technology—artificial intelligence models, computer vision systems, and consumer drones—that proliferates through normal economic channels and cannot be meaningfully restricted without crippling civilian industries. This creates a fundamentally different proliferation dynamic where barriers to entry drop exponentially as AI capabilities improve and commercial hardware becomes more sophisticated.

This model analyzes LAWS diffusion through five distinct proliferation stages, from great power development to mass individual access, and projects that by 2030, approximately 60 nations will possess autonomous weapons capabilities, with non-state actors achieving regular operational use by the mid-2030s. The central question is whether any form of proliferation control remains feasible given the dual-use nature of enabling technologies, or whether the global security community must accept widespread LAWS proliferation as inevitable and focus instead on damage limitation, defensive countermeasures, and attribution mechanisms.

The key insight emerging from quantitative proliferation modeling is that LAWS follow an S-curve diffusion pattern similar to civilian technologies rather than the linear, barrier-constrained progression of nuclear weapons, suggesting that the 2025-2030 period represents a critical inflection point where proliferation accelerates dramatically and opportunities for control narrow to near-zero. This matters because once autonomous weapons reach widespread availability among both state and non-state actors, the risks of assassination-at-scale, authoritarian repression, and lowered thresholds for violence become structural features of the international system rather than manageable threats.

The proliferation of autonomous weapons follows a five-stage diffusion model that parallels technology adoption curves but proceeds at unprecedented speed due to the dual-use nature of enabling technologies. Each stage is characterized by distinct actor types, capability levels, and deployment patterns, with the transition between stages driven primarily by three factors: declining cost of AI and hardware, increasing availability of technical knowledge through open-source channels, and weakening of export control mechanisms as commercial applications overwhelm regulatory capacity.

[Diagram: five-stage proliferation flow, with military technology transfer and commercial dual-use pathways converging toward non-state access]

The diagram illustrates how proliferation proceeds through multiple parallel pathways—direct military-to-military technology transfer and commercial dual-use diffusion—that converge to enable non-state access unless effective control mechanisms are established during the regional adoption phase. The critical branching point occurs at Stage 3-4 transition, where the window for meaningful proliferation control effectively closes.

| Stage | Timeline | Actor Type | Access Method | Capability Level | Current Status |
|---|---|---|---|---|---|
| Stage 1: Great Power Development | 2015-2023 | U.S., China, Russia, Israel | Indigenous R&D | Advanced autonomy, military-grade | Complete (100%) |
| Stage 2: Regional Power Adoption | 2020-2026 | Turkey, Iran, UK, France, India, S. Korea | Licensed/adapted systems | Mid-tier autonomy | Ongoing (70%) |
| Stage 3: Widespread State Access | 2024-2030 | 30-50 nations | Commercial/export | Variable sophistication | Emerging (30%) |
| Stage 4: Non-State Actor Access | 2026-2032 | Militant groups, terrorist orgs | Commercial drones + DIY | Basic-to-moderate autonomy | Early signs (8%) |
| Stage 5: Mass Availability | 2030+ | Individuals with technical skills | Consumer hardware mods | Basic autonomy | Not reached (2%) |

The table quantifies the progression through proliferation stages, showing that as of 2025, the world is simultaneously in three overlapping stages, with great power capabilities fully mature while non-state access is beginning to emerge. The compression of timelines—Stage 1 through Stage 4 occurring within just 10-15 years—contrasts sharply with nuclear proliferation, which took over 60 years to reach nine state actors and has never achieved non-state access at scale.

Tier 1 (Advanced, Deployed):

  • United States: Loyal Wingman drones, Autonomous Air Defense
  • China: Extensive autonomous drone programs, swarming capabilities
  • Israel: Harpy/Harop loitering munitions, autonomous defense systems
  • Turkey: STM Kargu-2 loitering munitions (documented autonomous use in Libya)
  • Russia: Claimed autonomous systems (verification difficult)

Tier 2 (Developing, Testing):

  • Ukraine: AI-enabled FPV drones, autonomous targeting (4M annual production capacity)
  • Iran: Shahed drones with increasing autonomy
  • UK, France, Germany: Collaborative European programs
  • South Korea: DMZ autonomous sentry guns
  • India: Development programs

Tier 3 (Early Research/Procurement):

  • 20-30 additional nations with announced programs

Total: ~40-50 nations with some level of autonomous weapons capability

1. Direct Military Development: Nations with advanced AI capabilities develop indigenous systems

  • High cost, high capability
  • Examples: U.S., China

2. Technology Transfer & Sales: Advanced nations sell or license systems to allies

  • Medium cost, high capability
  • Examples: Israeli systems sold internationally

3. Commercial Adaptation: Military adaptation of commercial drone/AI technology

  • Low cost, medium capability
  • Examples: Ukraine’s modified commercial drones

4. Reverse Engineering: Captured or crashed systems are reverse-engineered

  • Medium cost, medium capability
  • Examples: Iran’s drone programs

5. Open-Source AI + Commercial Hardware: Public AI models combined with readily available drones

  • Very low cost, low-medium capability
  • Examples: DIY autonomous targeting systems

The speed at which autonomous weapons proliferate can be quantified by comparing proliferation milestones to historical precedents. Nuclear weapons provide the most relevant comparison as the last major weapons technology subject to international control efforts, though the comparison reveals how fundamentally different LAWS proliferation dynamics operate.

Nuclear vs. Autonomous Weapons Proliferation Timelines

| Technology | 5 Nations | 10 Nations | 20 Nations | 50 Nations | Non-State Access |
|---|---|---|---|---|---|
| Nuclear Weapons | 19 years | Never reached | Never reached | Never reached | Never achieved |
| Autonomous Weapons | 3-5 years | 5-7 years | 7-10 years (projected) | 10-15 years (projected) | 10-15 years (projected) |
| Proliferation Rate Multiplier | 4-6x faster | ∞ (nuclear never reached) | Not comparable | Not comparable | Unique to LAWS |

The table reveals that autonomous weapons proliferate at 4-6 times the rate of nuclear weapons in early stages, but the comparison breaks down entirely at higher proliferation levels because nuclear weapons never achieved widespread state proliferation and never reached non-state actors at scale. LAWS are projected to reach more nations by 2032 than nuclear weapons have reached in 80 years, while simultaneously becoming accessible to non-state actors—a proliferation outcome that nuclear weapons control successfully prevented.
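The headline 4-6x multiplier can be reproduced directly from the milestone data. A minimal sketch, using the table's figures (nuclear's 19 years spans the U.S. test in 1945 to China's in 1964):

```python
# Years for each technology to reach 5 nation-state adopters,
# taken from the comparison table above.
nuclear_years_to_5 = 19        # 1945 (U.S.) to 1964 (China)
laws_years_to_5 = (3, 5)       # projected range for autonomous weapons

# Rate multiplier = nuclear timeline / LAWS timeline for the same milestone
multipliers = [nuclear_years_to_5 / y for y in laws_years_to_5]
print(f"LAWS reach 5 adopters {min(multipliers):.1f}x-{max(multipliers):.1f}x "
      f"faster than nuclear weapons")
```

Beyond the 5-nation milestone the ratio is undefined, since nuclear weapons never passed nine state possessors.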

Understanding why LAWS proliferate faster requires analyzing the structural barriers that slow or enable diffusion. The following table quantifies the relative difficulty across five critical barrier categories:

| Barrier Type | Nuclear Weapons | Autonomous Weapons | LAWS Advantage Factor |
|---|---|---|---|
| Material Requirements | Enriched uranium/plutonium (extremely rare) | Computer chips, sensors, cameras (commodity hardware) | 1000x easier |
| Infrastructure Cost | $5B-$50B for weapons program | $50K-$5M for basic capability | 10,000x cheaper |
| Technical Knowledge | Highly classified, restricted | Published in journals, open-source | 100x more accessible |
| Dual-Use Legitimacy | Very low (enrichment primarily military) | Very high (AI/drones overwhelmingly civilian) | Impossible to restrict |
| Detection/Verification | Satellite imagery, radiation signatures | Indistinguishable from civilian activity | 1000x harder to verify |

This barrier analysis explains the divergent proliferation trajectories. Nuclear weapons face compounding barriers—rare materials AND high costs AND classification AND detectability—that multiply to create extremely high total barriers. LAWS face diminishing barriers—each enabling technology becomes cheaper, more accessible, and more legitimately dual-use over time—creating a proliferation environment where barriers approach zero asymptotically. The cost advantage alone (10,000x cheaper) makes LAWS accessible to actors that could never contemplate nuclear weapons development, while the verification impossibility means that even moderately effective control regimes cannot be constructed around detection mechanisms.
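The compounding claim can be made concrete with a toy calculation over the table's order-of-magnitude factors. This is illustrative only: the factors are rough estimates, and the dual-use and verification barriers resist numeric treatment, so they are omitted here.

```python
import math

# Order-of-magnitude LAWS advantage factors from the barrier table.
# Illustrative estimates, not measured quantities.
advantage_factors = {
    "material requirements": 1_000,   # commodity hardware vs enriched uranium
    "infrastructure cost": 10_000,    # $50K-5M vs $5B-50B program
    "technical knowledge": 100,       # open literature vs classified knowledge
}

# Barriers compound multiplicatively: a would-be proliferator must clear
# ALL of them, so relative ease is the product of the individual factors.
combined = math.prod(advantage_factors.values())
print(f"Combined barrier reduction: ~10^{round(math.log10(combined))}")
```

Even under this crude framing, the quantifiable barriers alone differ by roughly nine orders of magnitude before the unquantifiable dual-use and verification advantages are considered.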

The diffusion of autonomous weapons can be modeled using a logistic growth function adapted from technology adoption theory and epidemiological modeling. This approach captures the S-curve pattern observed in technology proliferation, where adoption starts slowly among early adopters, accelerates rapidly during a growth phase, then saturates as the market approaches capacity.

N(t) = \frac{K}{1 + e^{-r(t - t_0)}}

Where:

  • N(t) = number of nations with operational LAWS capability at time t
  • K = maximum potential adopters (saturation point)
  • r = proliferation rate parameter (controls steepness of adoption curve)
  • t_0 = inflection point (year when adoption growth rate peaks)
  • t = time in years since 2015 (baseline year)
| Parameter | Best Estimate | Range | Confidence | Justification |
|---|---|---|---|---|
| K (max adopters) | 120 nations | 100-140 | High | Excludes smallest states lacking military capacity |
| r (growth rate) | 0.35 per year | 0.28-0.45 | Medium | Faster than conventional military tech (0.15-0.25), slower than pure commercial tech (0.5-0.8) |
| t_0 (inflection year) | 2025 | 2024-2027 | Medium | Observable acceleration in procurement and deployment |
| Current adopters (2025) | 20 nations | 15-25 | High | Based on documented programs |

The parameter estimates reflect LAWS’ hybrid nature: proliferation proceeds faster than traditional military technology due to dual-use characteristics but slower than purely commercial technology due to remaining institutional and policy barriers. The relatively high rr value of 0.35 captures the acceleration effect of improving AI capabilities and declining hardware costs, while the inflection point at 2025 reflects the current transition from early adopter phase to mass adoption phase.

| Year | Nations with LAWS | % of Potential Adopters | Confidence Level | New Adopters per Year |
|---|---|---|---|---|
| 2020 | 5 | 4% | High (observed) | 1-2 |
| 2025 | 20 | 17% | High (observed) | 3-4 |
| 2030 | 60 | 50% | Medium | 8-10 |
| 2035 | 95 | 79% | Low | 7-8 |
| 2040 | 110 | 92% | Very Low | 3-4 |

The projections indicate that 2025-2030 represents the maximum growth rate period, with 8-10 new nations achieving LAWS capability annually during the peak proliferation years. By 2030, half of all militarily capable nations will possess autonomous weapons—a proliferation outcome that transforms LAWS from a specialized capability of great powers to a standard military technology. The slowdown after 2035 reflects saturation effects as the remaining non-adopters are primarily nations with limited military capacity or strong normative commitments against autonomous weapons.
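The projections can be reproduced from the logistic function. One caveat: the parameter table gives t_0 = 2025 as the best estimate, but the projection table places the 50% adoption point (which for a logistic curve is the inflection point) at 2030, so this sketch uses t_0 = 2030 to match the projected numbers; treat the outputs as illustrative.

```python
import math

K, R, T0 = 120, 0.35, 2030  # saturation, growth rate/year, inflection year

def laws_adopters(year, K=K, r=R, t0=T0):
    """Logistic diffusion: nations with operational LAWS capability."""
    return K / (1 + math.exp(-r * (year - t0)))

for year in (2020, 2025, 2030, 2035, 2040):
    n = laws_adopters(year)
    print(f"{year}: ~{n:.0f} nations ({n / K:.0%} of potential adopters)")

# For any logistic curve the peak growth rate is r*K/4 new adopters per
# year, reached at t0; here 0.35 * 120 / 4 = 10.5, consistent with the
# table's 8-10 new nations per year around 2030.
print(f"Peak new adopters per year: {R * K / 4:.1f}")
```

With these parameters the curve lands within a few nations of the projected 2025-2030 values and modestly overshoots 2035-2040, consistent with the text's note that late-stage growth slows as the remaining holdouts resist adoption.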

Non-state proliferation follows a different trajectory than state adoption, driven by the lag between commercial technology availability and the technical sophistication required to weaponize it. Historical precedent from commercial drone weaponization provides calibration data for projecting autonomous weapons timelines.

| Capability Milestone | Commercial Drones (2014-2020) | Autonomous Weapons (Projected) | Time Lag Explanation |
|---|---|---|---|
| Initial weaponization | 2014 (ISIS drops grenades) | 2020-2022 (basic autonomous targeting) | Requires AI integration, more complex than mechanical mods |
| Sophisticated tactical use | 2017 (coordinated swarms) | 2025-2027 (autonomous assassination attempts) | 3-5 year lag for AI models to become accessible |
| Regular operational capability | 2020 (routine in conflicts) | 2030-2032 (well-funded groups deploy routinely) | 10-12 year lag from state capabilities to non-state access |
| Mass availability | Not yet reached | 2035-2040 (small groups, potentially individuals) | Depends on open-source AI advancement rate |

The progression from initial weaponization to operational capability took approximately six years for commercial drones. Autonomous weapons are projected to follow a similar but somewhat delayed timeline, with the critical transition to regular use by non-state actors occurring around 2030-2032—roughly 10-12 years behind state-level capabilities. This lag is shorter than the nuclear weapons case (where non-state access never materialized) but longer than the commercial drone case due to the additional technical complexity of integrating AI targeting systems.

| Factor | Impact on Timeline | Current Trend | 2030 Projection |
|---|---|---|---|
| Open-source AI model capabilities | High acceleration | Rapidly improving | Models sufficient for basic autonomous targeting widely available |
| Commercial drone sophistication | Medium acceleration | Steady improvement | Consumer drones with high-res cameras, long range, payload capacity |
| Legal restrictions on sales | Low deceleration | Weak enforcement | Minimal effectiveness due to dual-use nature |
| Technical knowledge barriers | Medium deceleration | Declining rapidly | Basic autonomous targeting within reach of moderately skilled individuals |
| Net effect | Accelerating access | Barriers declining | Non-state operational capability highly likely |

The key uncertainty centers on whether truly lone actors (not just small groups) will gain practical access to autonomous weapons. This depends critically on the trajectory of open-source AI models and the availability of weaponization guides, both of which are currently trending toward increased accessibility with no effective control mechanisms in place.

The international community has attempted various control mechanisms to slow LAWS proliferation, with uniformly disappointing results. The dual-use nature of enabling technologies makes traditional arms control approaches—which rely on restricting access to specialized materials or equipment—essentially ineffective. This section evaluates both implemented and proposed control mechanisms across three dimensions: technical feasibility, political feasibility, and potential impact.

| Mechanism | Technical Feasibility | Political Feasibility | Potential Impact | Overall Effectiveness | Current Status |
|---|---|---|---|---|---|
| Export controls (Wassenaar) | Low | Medium | Low | 5% - Easily circumvented | Implemented, largely ineffective |
| International treaty (CCW) | Medium | Very Low | High (if universal) | 10% - No major power buy-in | Years of negotiations, no result |
| Corporate self-regulation | High | Low | Very Low | 8% - State actors unaffected | Voluntary, limited participation |
| Technical safeguards (kill switches) | Medium | Medium | Low | 15% - Bypassed by military systems | Commercial only |
| AI model export controls | Low | Low | Very Low | 3% - Models recreatable | Proposed, likely ineffective |
| Component-level restrictions | Very Low | Very Low | Negligible | 1% - Too many civilian uses | Not seriously pursued |
| Attribution mechanisms | Medium-High | Medium | Medium | 35% - Best feasible option | Early development |
| Defensive technology focus | High | High | Medium | 40% - Most promising | Underfunded but growing |
| Stigmatization campaign | N/A (normative) | Medium | Medium-High | 25% - Partial effect possible | Ongoing, mixed reception |

The effectiveness percentages represent estimated reduction in proliferation rate or risk if mechanisms were fully implemented. The highest-impact options—international treaties requiring meaningful human control—face insurmountable political obstacles due to major power opposition. The most feasible options—defensive technology and attribution—offer only partial risk reduction. Critically, no combination of these mechanisms appears capable of preventing proliferation beyond 60+ nations by 2030.

Traditional arms control succeeds when technologies have three characteristics: identifiable specialized components, limited civilian applications, and detectable development signatures. LAWS possess none of these characteristics, creating a proliferation control environment fundamentally more challenging than any previous weapons technology including nuclear, chemical, and biological weapons.

The dual-use problem is particularly acute. Restricting AI model development would require shutting down the entire AI research enterprise. Restricting drone sales would eliminate agricultural, delivery, inspection, and emergency response applications worth hundreds of billions of dollars. Restricting sensor technology would cripple consumer electronics. No state will accept these economic costs to slow military proliferation by a few years at most, especially when competitors can gain advantages by defecting from restrictions.

Verification challenges compound these problems. Nuclear weapons programs generate heat signatures, consume enormous amounts of electricity, and require visible enrichment facilities. LAWS development occurs in ordinary office buildings running ordinary computers doing work indistinguishable from civilian AI research. International inspectors cannot determine whether an AI lab is developing autonomous targeting systems or optimizing package delivery routes—and even if they could, the knowledge required transfers instantly once discovered.

Future proliferation trajectories depend primarily on two uncertain variables: the success of international control efforts and the relative pace of offensive versus defensive technology development. Four distinct scenarios capture the plausible outcome space, with probabilities assigned based on current trends and historical precedents.

| Scenario | Probability | 2030 State Proliferation | 2035 Non-State Access | Key Driver | Risk Level |
|---|---|---|---|---|---|
| Uncontrolled Proliferation | 40% | 60-70 nations | Routine operational use | Control mechanisms fail | Very High |
| Partial Control | 35% | 40-50 nations | Limited but growing | Weak international norms | High |
| Effective Control | 15% | 20-30 nations | Rare/prevented | Strong treaty regime | Medium |
| Defensive Dominance | 10% | 60+ nations but ineffective | Irrelevant (weapons don't work) | Counter-LAWS breakthrough | Low |

The scenario probabilities reflect the base rate of arms control failure (75% probability that control mechanisms are weak or ineffective) combined with a small probability (10%) that technological development favors defense so strongly that proliferation becomes strategically irrelevant. The most likely outcome remains uncontrolled or weakly controlled proliferation.
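A probability-weighted expectation over the four scenarios gives a rough central estimate for 2030 proliferation. A minimal sketch, taking the midpoint of each scenario's range (the midpoint choices are my assumption; "60+" is treated as 60):

```python
# (probability, assumed midpoint of 2030 state-proliferation range)
scenarios = {
    "Uncontrolled Proliferation": (0.40, 65),   # 60-70 nations
    "Partial Control":            (0.35, 45),   # 40-50 nations
    "Effective Control":          (0.15, 25),   # 20-30 nations
    "Defensive Dominance":        (0.10, 60),   # 60+ nations (ineffective)
}

# Sanity check: scenario probabilities must sum to 1
assert abs(sum(p for p, _ in scenarios.values()) - 1.0) < 1e-9

expected_2030 = sum(p * n for p, n in scenarios.values())
print(f"Probability-weighted expectation for 2030: ~{expected_2030:.0f} nations")
```

The expectation of roughly 52 nations sits somewhat below the base-case logistic projection of 60, reflecting the 25% combined weight placed on control and defensive-dominance outcomes.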

Scenario 1: Uncontrolled Proliferation (40% probability)


This scenario represents the current default trajectory absent major policy interventions or technological surprises. International control negotiations continue to stall on verification and enforcement mechanisms, while dual-use technology continues to improve and diffuse through commercial channels. By 2030, LAWS capabilities reach 60-70 nations across all regions, with no meaningful technical barriers preventing non-state acquisition.

Proliferation timeline: Following the logistic growth curve with r = 0.40 (slightly faster than base case due to complete absence of control friction). State proliferation reaches 70 nations by 2030, 100+ by 2035. Non-state actors begin routine operational use by 2032, with well-funded groups possessing capabilities rivaling smaller nations. Individual access becomes feasible by 2038 as open-source AI models and commercial drones converge.

Security consequences: Assassination costs drop from millions of dollars (current professional operations) to thousands or hundreds of dollars (autonomous systems), making targeted killing accessible to actors who previously lacked such capabilities. Authoritarian regimes deploy LAWS for internal security, creating surveillance-to-strike infrastructure that operates at machine speed. Terrorist organizations shift from mass casualty attacks (difficult to execute, high probability of interdiction) to high-confidence targeted strikes against specific individuals. Democratic societies face persistent threats from autonomous weapons that can be pre-positioned and activated remotely, fundamentally changing security trade-offs around civil liberties versus protection.

Likelihood assessment: This scenario receives 40% probability because it requires only continuation of current trends—no diplomatic breakthroughs needed, no major technical surprises, no dramatic public opinion shifts. The primary uncertainty is whether some external shock (catastrophic LAWS incident, major power conflict) triggers rapid norm formation, but historical precedent suggests such shocks more often accelerate proliferation than constrain it.

Scenario 2: Partial Control (35% probability)


International norms emerge around LAWS use but lack universal adherence and enforcement mechanisms. Major powers sign treaties with significant loopholes or reservations that preserve their own programs while nominally supporting restrictions. A two-tier system develops where advanced militaries maintain LAWS capabilities under various legal fictions (human-in-the-loop requirements that can be waived, autonomous systems classified as defensive, etc.) while smaller nations and non-state actors face restrictions with uneven enforcement.

Proliferation timeline: International norms slow diffusion by approximately 30%, reducing r to 0.25 in the logistic growth model. State proliferation reaches 45 nations by 2030, 75 by 2035. Non-state access occurs but through black markets and state sponsorship rather than direct commercial acquisition, delaying routine operational use to 2034-2036. Export controls reduce but do not eliminate availability.

Security consequences: The two-tier system creates strategic asymmetries where major powers possess capabilities that regional competitors cannot legally acquire, increasing incentives for covert development and international tensions. Black markets develop for autonomous weapons components and expertise, with state sponsors (Iran, North Korea, others) providing LAWS to proxy forces. Periodic norm violations occur when nations caught face limited consequences, gradually eroding the stigma. Democratic nations face difficult trade-offs between maintaining technological superiority and supporting international norms.

Likelihood assessment: 35% probability reflects this as the most likely “successful” control outcome—meaningful norms emerge but fall short of preventing proliferation. This mirrors historical outcomes for cluster munitions (treaty with major power abstentions) and landmines (treaty with incomplete participation). The scenario requires modest diplomatic success but not the transformative breakthroughs needed for effective control.

Scenario 3: Effective Control (15% probability)


A strong international treaty establishes meaningful human control requirements for lethal autonomous systems, with major power buy-in and credible verification mechanisms. Proliferation slows dramatically as normative, legal, and technical barriers combine to prevent widespread LAWS deployment. Development focuses on defensive systems and human-supervised applications.

Proliferation timeline: Treaty constraints reduce the proliferation rate to r = 0.12, approaching rates for heavily regulated military technologies. State proliferation limited to 25-30 nations by 2030, reaching only 45 nations by 2035. Non-state access effectively prevented through combination of export controls, technical safeguards, and attribution mechanisms. Individual access remains infeasible through 2040.

Security consequences: Military AI development proceeds along supervised and defensive pathways, maintaining human decision-making for lethal force. International security benefits from reduced assassination risks and lower conflict escalation probabilities. However, tensions persist around verification and compliance, with periodic crises when states are suspected of covert development. Some military advantages potentially ceded to non-compliant actors, creating pressure for treaty withdrawal.

Likelihood assessment: Only 15% probability due to the substantial diplomatic and technical barriers. Requires unprecedented major power cooperation on military technology—the U.S., China, and Russia would all need to agree that autonomous weapons risks outweigh military advantages and commit to intrusive verification. No historical precedent exists for this level of great power cooperation on militarily relevant technology. Would likely require a catalyst event (catastrophic LAWS incident) combined with unusual political leadership.

Scenario 4: Defensive Technology Victory (10% probability)


Counter-LAWS technology advances faster than offensive capabilities, creating a defender’s advantage where autonomous weapons become tactically ineffective. Detection systems identify autonomous platforms, jamming disrupts their operation, and active defenses defeat them at high rates. Proliferation continues but military value of LAWS declines sharply.

Proliferation timeline: Follows uncontrolled proliferation curve for acquisition (many nations obtain LAWS) but deployment and operational use remain limited because systems prove unreliable in contested environments. By 2030, 60+ nations possess LAWS but rarely deploy them operationally. Non-state actors acquire systems but find them ineffective against defended targets.

Security consequences: LAWS shift from game-changing weapons to niche capabilities useful only against undefended targets. Arms race dynamics redirect to counter-LAWS measures. Assassination risks increase against soft targets lacking defenses but decrease for protected individuals and installations. Overall risk level substantially lower than proliferation scenarios because weapons don’t function as feared.

Likelihood assessment: 10% probability reflects significant technical uncertainty about offense-defense balance. While possible, historical precedent suggests offense-defense races rarely end in decisive defensive victory—more commonly they create continued cycles of measure and countermeasure. Defensive dominance requires not just good counter-LAWS technology but fundamental asymmetries favoring defense, which seems unlikely given the diversity of autonomous weapons platforms and attack modes.
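The scenario-specific growth rates (r = 0.40, 0.25, and 0.12 for Scenarios 1-3) can be compared without committing to values for K or t_0: for any logistic curve, the time to move from 10% to 90% of saturation is 2·ln(9)/r. A short sketch of that sensitivity:

```python
import math

# Scenario-specific proliferation rates from the narratives above.
scenario_rates = {
    "Uncontrolled (Scenario 1)":      0.40,
    "Partial control (Scenario 2)":   0.25,
    "Effective control (Scenario 3)": 0.12,
}

# For N(t) = K / (1 + e^(-r(t - t0))), setting N/K = 0.1 and 0.9 gives
# t - t0 = -ln(9)/r and +ln(9)/r, so the 10%-to-90% window is 2*ln(9)/r,
# independent of K and t0.
for name, r in scenario_rates.items():
    width = 2 * math.log(9) / r
    print(f"{name}: 10% -> 90% adoption takes ~{width:.0f} years")
```

The comparison makes the stakes concrete: even effective control does not stop proliferation in this framework, but it stretches the main adoption wave from roughly a decade to well over three, buying time for defensive and attribution measures.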

Proliferation level determines the nature and magnitude of autonomous weapons risks. As capabilities spread from great powers to regional states to non-state actors, the risk profile shifts from interstate strategic considerations to diffuse threats against individuals and small groups.

| Proliferation Level | Timeline | Primary Risks | Risk Magnitude | Control Feasibility | Key Threshold Crossed |
|---|---|---|---|---|---|
| Great powers only (5-8 nations) | 2015-2020 | Strategic instability, arms race | Medium | High - small number of actors | Military AI integration begins |
| Regional powers (20-30 nations) | 2020-2027 | Regional conflicts, proxy use | Medium-High | Medium - norms still formable | Routine battlefield deployment |
| Widespread state (60+ nations) | 2027-2032 | Ubiquitous military capability | High | Low - too many actors | LAWS become standard military technology |
| Non-state operational (100+ groups) | 2030-2035 | Terrorism, assassination-at-scale | Very High | Very Low - impossible to monitor | Killing capability democratized |
| Individual access (1000+ individuals) | 2035-2040 | Lone-wolf attacks, vigilantism | Extreme | None - completely uncontrollable | Point of no return |

The table identifies five critical thresholds in the proliferation trajectory. The most dangerous transition occurs between widespread state access and non-state operational capability, where autonomous weapons shift from regulated military systems subject to command authority and international law to tools accessible to actors with no accountability to state structures. Once this threshold is crossed, assassination transforms from a capability requiring state resources and risking attribution to an accessible option for well-funded individuals and organizations.

Current assessment places the world in 2025 between the regional powers and widespread state thresholds, with 5-7 years remaining before the high-risk transition to non-state operational capability. This represents a critical window where control mechanisms might still prevent the most dangerous outcomes, though the analysis above suggests low probability of success.

Primary Strategy: Damage Limitation (Recommended for 75% Probability Scenario)

Given the high likelihood that proliferation cannot be prevented, policy should focus on limiting harm in a post-proliferation world rather than attempting to prevent the inevitable. This represents a fundamental strategic reorientation from control to resilience.

Defensive technology investment: Prioritize detection systems, jamming capabilities, and active defenses against autonomous weapons. Unlike offensive LAWS development, defensive cooperation faces fewer barriers and provides public goods that reduce everyone’s vulnerability. Estimated cost of comprehensive defensive technology program: $5-10B annually across allied nations. Expected impact: 40-60% reduction in LAWS effectiveness against defended targets.

Attribution mechanisms: Develop technical forensics to trace LAWS use to source actors, creating accountability and deterrence. Analogous to nuclear forensics but more challenging due to commercial component ubiquity. Requires international cooperation on component tracking, operating system signatures, and post-incident analysis protocols. Expected impact: Enables retaliation/prosecution in 30-50% of incidents, providing some deterrent effect.

Civilian protection norms: Establish strong international prohibitions on LAWS use against civilians, distinct from general LAWS bans that have failed. More achievable politically because it doesn’t restrict military capabilities against legitimate targets. Compliance likely incomplete but could reduce civilian casualties by 20-40% through combination of normative pressure and fear of attribution.

Counter-proliferation targeting: Accept that universal proliferation prevention is impossible but maintain intelligence and operational capabilities to prevent or delay access by highest-risk actors (designated terrorist organizations, particularly unstable regimes, individuals with assassination/terrorism indicators). Resource-intensive but more tractable than comprehensive control. Can delay high-risk actor access by 3-5 years on average.

Secondary Strategy: Pursue Control Despite Low Probability (Recommended for 25% Possibility)

Even with low probability of success, the benefits of effective control are sufficiently high to justify continued investment in control mechanisms. Portfolio approach: pursue multiple control pathways simultaneously to maximize chance that at least one succeeds.

International treaty negotiation: Continue CCW and other diplomatic processes despite slow progress. Focus on meaningful human control requirements rather than total bans (more politically achievable). Build coalition of committed states even if major powers abstain initially—create template for eventual accession if circumstances change. Expected success probability: 15%, but success would reduce long-term risk by 60-70%.

Export control strengthening: Tighten Wassenaar Arrangement and equivalent regimes where possible. Focus on complete weapon systems rather than components (more enforcement-feasible). Expected impact: Delay proliferation to marginal actors by 2-3 years, minimal impact on determined state programs.

Technical safeguards mandates: Require kill switches, geofencing, and autonomous operation limitations in commercial drones and AI systems. Effective only against non-state actors using commercial systems, easily bypassed by state military programs. Expected impact: 20-30% reduction in non-state actor capability, no impact on state actors.

Stigmatization campaign: Support organizations like the Campaign to Stop Killer Robots in creating normative pressure against LAWS development and use. Analogous to the landmine and cluster munition campaigns, which achieved partial success despite major-power abstention. Expected impact: Slow proliferation by 15-25%, reduce willingness to use by 30-40%.
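The portfolio logic behind pursuing all four pathways can be made concrete: if pathways succeed or fail independently, the chance that at least one works is 1 - prod(1 - p_i). In the sketch below, only the 15% treaty figure comes from the text; the other probabilities are hypothetical placeholders:

```python
# Portfolio sketch: probability that at least one independent control
# pathway succeeds. Only the treaty figure (15%) is from the text;
# the others are ASSUMED placeholders for illustration.

pathway_success = {
    "treaty_negotiation": 0.15,    # from the text
    "export_controls": 0.10,       # assumed
    "technical_safeguards": 0.10,  # assumed
    "stigmatization": 0.10,        # assumed
}

p_none = 1.0
for p in pathway_success.values():
    p_none *= 1.0 - p

p_at_least_one = 1.0 - p_none
print(f"P(at least one pathway succeeds): {p_at_least_one:.0%}")
```

With these placeholder numbers the portfolio yields roughly a 38% chance that at least one pathway succeeds, illustrating why several weak options together can beat betting on any single one.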

| Dimension | Assessment | Quantitative Estimate |
|---|---|---|
| Potential severity | High: fundamentally changes the cost of political violence | Assassination costs drop from $500K-5M to $1K-10K |
| Probability-weighted importance | Very high: proliferation trajectory appears nearly certain | 75-85% probability of 60+ nation proliferation by 2030 |
| Comparative ranking | Top tier among weapons proliferation risks | 4-6x faster than nuclear, 10x more actors by 2035 |

  • Current global investment in LAWS control mechanisms: ~$50M annually
  • Estimated investment needed for meaningful defensive capability: $5-10B annually
  • Gap factor: 100-200x underfunded

Priority interventions by cost-effectiveness:

  1. Attribution technology R&D ($500M-1B, 35% risk reduction)
  2. Defensive counter-LAWS systems ($2-5B, 40-60% effectiveness improvement)
  3. Civilian protection norm building ($100M, 25% stigmatization effect)
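The funding gap and priority list can be turned into a rough impact-per-dollar comparison. The sketch below uses midpoints of the cost and impact ranges stated above; note that the impact percentages measure different quantities (risk reduction, effectiveness improvement, stigmatization), so the ratios are illustrative arithmetic rather than a real prioritization:

```python
# Rough impact-per-dollar sketch for the three priority interventions,
# using midpoints of the cost and impact ranges given in the text.
# Caveat: the impact figures measure different things, so the ratios
# are only loosely comparable.

interventions = [
    # (name, midpoint cost in $B, midpoint impact fraction)
    ("Attribution technology R&D", 0.75, 0.35),
    ("Defensive counter-LAWS systems", 3.5, 0.50),
    ("Civilian protection norm building", 0.1, 0.25),
]

for name, cost_b, impact in interventions:
    print(f"{name}: {impact / cost_b:.2f} impact per $B")
```

On these midpoints the ratios come out to roughly 0.47, 0.14, and 2.50 per $B respectively; the cheap norm-building effort scores highest per dollar even though its absolute impact is smallest.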
| Crux | If True | If False | Current Assessment |
|---|---|---|---|
| Dual-use restriction is impossible | Focus on damage limitation | Arms control still viable | 85% likely true |
| Non-state access by 2032 | Assassination-at-scale becomes reality | Window for control remains | 70% likely true |
| Defensive technology can dominate | Proliferation becomes less dangerous | Full risk realization | 10% likely |
| Major incident triggers norms | Partial control possible | Uncontrolled proliferation | 25%, dependent on events |

This model faces several fundamental constraints that limit confidence in specific quantitative projections while preserving confidence in directional conclusions and relative comparisons.

Technology trajectory uncertainty: The model assumes AI capabilities and commercial drone sophistication continue improving at current rates, but breakthrough advances or unexpected plateaus could substantially alter proliferation timelines. If AI progress slows due to algorithmic limits or compute constraints, the projected 2030-2035 timeline for non-state operational capability could extend by 5-10 years. Conversely, rapid advances in open-source AI models could accelerate timelines by 3-5 years. The logistic growth model captures smooth diffusion but cannot predict discontinuous technology jumps that might rapidly enable or prevent certain proliferation pathways.
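The logistic diffusion model referenced above can be sketched directly. The parameters here are illustrative assumptions chosen to roughly reproduce the headline figures (about 45 nations with some capability in 2023, ~60 by 2030), not fitted values from the model itself:

```python
# Minimal sketch of a logistic diffusion curve for state-level LAWS
# adoption. K, r, and t0 are ASSUMED, picked to roughly match the
# headline figures (about 45 nations in 2023, ~60 by 2030).
import math

K = 100    # assumed saturation: nations that could ultimately field LAWS
r = 0.087  # assumed annual growth rate
t0 = 2025.3  # assumed inflection year

def nations_with_laws(year: float) -> float:
    """Logistic curve N(t) = K / (1 + exp(-r * (t - t0)))."""
    return K / (1.0 + math.exp(-r * (year - t0)))

for year in (2023, 2030, 2035):
    print(year, round(nations_with_laws(year)))  # roughly 45, 60, 70
```

The smoothness of this curve is exactly the limitation noted above: a discontinuous jump (an open-source capability release, or an effective export regime) would show up as a break the logistic form cannot represent.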

Historical analogy limitations: The model relies heavily on nuclear weapons proliferation as a comparison case, but LAWS represent an unprecedented combination of characteristics—dual-use technology, low cost, impossible verification—that may produce dynamics with no historical precedent. The commercial drone weaponization timeline provides more recent calibration data but spans only 10 years and involves simpler technologies. No historical case perfectly matches LAWS proliferation dynamics, creating irreducible uncertainty in projections beyond 10 years.

Non-state actor modeling challenges: State proliferation can be tracked through public announcements, observable procurement, and intelligence reporting. Non-state actor capabilities remain largely invisible until used operationally, creating severe measurement problems. The model projects non-state timelines based on technology availability and historical weapon adoption rates, but actual access depends on factors difficult to quantify—criminal network evolution, state sponsorship decisions, availability of technical expertise in underground markets. Confidence intervals for non-state projections are at least 2x wider than for state proliferation.

Control mechanism effectiveness: The model assigns effectiveness percentages to various control mechanisms based on expert judgment rather than empirical validation. No controlled experiments exist for LAWS proliferation control, and historical arms control outcomes provide limited guidance due to the unique dual-use challenges. The assigned probabilities to different scenarios (40% uncontrolled, 35% partial control, etc.) represent informed estimates but lack rigorous empirical grounding. Different expert panels might assign substantially different probabilities while agreeing on directional conclusions.

Second-order effects omitted: The model focuses narrowly on proliferation dynamics and does not capture important second-order effects that might alter trajectories. Public backlash after a catastrophic incident could shift political feasibility of control mechanisms. Major power conflict could either accelerate proliferation (arms race dynamics) or enable cooperation (mutual threat recognition). Economic shocks affecting AI industry could slow or redirect technology development. These potential discontinuities are acknowledged but not formally modeled, limiting the model’s ability to predict outcomes in turbulent geopolitical environments.
