Epoch AI OpenAI compute spend
Summary
Epoch AI analyzed OpenAI's 2024 compute spending, estimating $5 billion in R&D compute and $2 billion in inference compute. Most compute was likely used for experimental and unreleased model training.
Review
The Epoch AI analysis provides a comprehensive breakdown of OpenAI's compute expenditure in 2024, showing substantial investment in cloud computing infrastructure. Drawing on reporting from The Information and The New York Times, the researchers estimated OpenAI's total compute spending at approximately $7 billion, with roughly $5 billion going to research and development and $2 billion to inference.
The study's methodology builds per-model estimates of training compute costs for GPT-4.5, GPT-4o, and Sora Turbo, deriving confidence intervals from assumptions about cluster sizes, training durations, and per-GPU costs. The analysis finds that most of OpenAI's compute was likely allocated to experimental and unreleased training runs rather than to final production models. This offers valuable transparency into the computational resources required for cutting-edge AI development and underscores the massive investments needed to maintain leadership in frontier AI technologies.
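To make that estimation approach concrete, the sketch below shows the basic arithmetic such an estimate rests on: cluster size times wall-clock training hours times a per-GPU-hour price, evaluated under low/central/high scenarios as a rough stand-in for a confidence interval. All input numbers here are hypothetical placeholders for illustration, not Epoch AI's actual figures or OpenAI's real cluster parameters.

```python
# Minimal sketch of the cost-estimation logic described above.
# All numbers are hypothetical placeholders, NOT Epoch AI's figures.

def training_cost_usd(num_gpus: float, training_days: float, gpu_hour_cost: float) -> float:
    """Estimate training cost as cluster size x wall-clock hours x per-GPU-hour price."""
    return num_gpus * training_days * 24 * gpu_hour_cost

# Low / central / high scenarios stand in for a confidence interval.
scenarios = {
    "low":     dict(num_gpus=20_000,  training_days=60,  gpu_hour_cost=2.0),
    "central": dict(num_gpus=50_000,  training_days=90,  gpu_hour_cost=3.0),
    "high":    dict(num_gpus=100_000, training_days=120, gpu_hour_cost=4.5),
}

for name, params in scenarios.items():
    cost = training_cost_usd(**params)
    print(f"{name:>7}: ${cost / 1e9:.2f}B")
```

Summing scenario costs like these across released and unreleased training runs is what lets an outside analysis bound total R&D compute spend despite uncertainty in each individual input.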
Key Points
- OpenAI spent approximately $7 billion on compute in 2024
- The majority of compute went to research and experimental training runs
- Estimates are based on investor documents and industry trends