Theo - t3.gg · Anthropic just…wait what
TL;DR
Anthropic partnered with SpaceX to lease 220,000 H100s because unexpected 80x growth left it critically compute-starved, while XAI struck its Cursor deal because it needed Cursor's coding data to compete.
Key Points
1. Anthropic experienced 80x annualized revenue and usage growth in Q1 2025, growth no one could plan for. Dario admitted they conservatively under-bought compute, expecting only 10x growth, which left them unable to serve demand.
2. Anthropic's compute is split across three incompatible hardware architectures: AWS Trainium, Google TPUs, and Nvidia GPUs. Researchers prefer CUDA, so Trainium and TPUs are used mainly for inference to free Nvidia chips for training.
3. The Anthropic-SpaceX deal gives Anthropic access to over 300 megawatts and 220,000 Nvidia H100s from Colossus 1, roughly 70% of its capacity. That leaves XAI only ~100 megawatts for Grok inference, revealing how little Grok is actually used.
4. SpaceX has already moved training to Colossus 2, which is 5x larger at 1.4 million H100 equivalents and 1.5 gigawatts. Because Grok's inference demand is minimal, this freed Colossus 1 to be leased to Anthropic.
5. For Claude Code users, the 5-hour rate limits were doubled and peak-hour reductions were removed. Users hitting weekly limits (heavy daily users, parallel agents) won't benefit; only burst users hitting the 5-hour cap will see improvement.
6. The XAI-Cursor deal is fundamentally about data, not compute. XAI has compute but lacks quality coding-interaction data; Cursor holds the most valuable corpus of human-AI coding exchanges across all major models, exactly the data needed for RL training.
7. The $10B fee in the Cursor deal is effectively the price XAI is paying for Cursor's data alone. The $60B acquisition option covers the full company; if SpaceX declines to acquire, it has paid $10B purely for training-data rights.
8. OpenAI is uniquely positioned with all three critical resources (research, data, and compute), while every other lab has gaps. Anthropic's biggest competitive moat, AWS Bedrock exclusivity, is now dead as OpenAI gains AWS support, and Codex is rapidly closing the coding capability gap.