Epoch AI model database

Summary

Epoch AI analyzed the landscape of large-scale AI models, identifying over 30 models trained with more than 10^25 floating-point operations (FLOP). The analysis covers models from leading AI developers across language, reasoning, and multimodal domains.

Review

The Epoch AI model database comprehensively tracks AI models trained at unprecedented computational scales, making it a critical resource for understanding AI progress. By examining model releases from major AI labs such as OpenAI, Google, and Meta, the researchers developed a systematic methodology to estimate training compute, combining direct reporting, benchmark performance, and expert estimation.

The work is significant for AI safety because it offers transparency into the computational scale of frontier AI models, a key indicator of potential capabilities and risks. By tracking models exceeding 10^25 FLOP, the database helps researchers, policymakers, and AI safety experts monitor the rapid advancement of large AI systems. The study also highlights emerging trends, such as the proliferation of high-compute models (roughly two models per month reached this threshold in 2024), and notes regulatory implications such as the EU AI Act's upcoming requirements for models at this scale.

Key Points

  • Over 30 AI models trained with more than 10^25 FLOP since March 2023
  • Models estimated using benchmark performance, training details, and expert analysis
  • Training such models costs tens of millions of dollars
  • Regulatory frameworks like the EU AI Act will apply to models at this computational scale
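When parameter count and training-token count are known, training compute is commonly approximated with the 6·N·D heuristic (roughly six FLOP per parameter per training token). The sketch below illustrates how a threshold check against 10^25 FLOP might look; the model names and figures are illustrative assumptions, not values from the Epoch database, and the heuristic is a standard community approximation rather than Epoch's exact methodology.

```python
def estimate_training_flop(n_params: float, n_tokens: float) -> float:
    """Approximate training compute with the common 6*N*D heuristic:
    ~6 FLOP per parameter per training token (forward + backward pass)."""
    return 6.0 * n_params * n_tokens

THRESHOLD_FLOP = 1e25  # scale tracked by the database (and the EU AI Act)

# Hypothetical models: (parameters, training tokens) -- illustrative only.
models = {
    "small-lm": (7e9, 2e12),        # 7B params, 2T tokens
    "frontier-lm": (4e11, 1.5e13),  # 400B params, 15T tokens
}

for name, (n, d) in models.items():
    flop = estimate_training_flop(n, d)
    status = "exceeds" if flop > THRESHOLD_FLOP else "is below"
    print(f"{name}: ~{flop:.1e} FLOP, {status} 1e25")
```

Under this approximation, the hypothetical 400B-parameter model lands around 3.6×10^25 FLOP, above the threshold, while the 7B model falls several orders of magnitude short; in practice, Epoch supplements such estimates with direct reporting and benchmark-based inference when training details are undisclosed.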
