Microsoft and OpenAI break up (Amazon is pumped)
36:46
Theo - t3.gg·Tech

TL;DR

Microsoft and OpenAI restructured their exclusive partnership, freeing OpenAI to sell models on AWS — driven by Anthropic's enterprise dominance through Bedrock.

Key Points

1. Microsoft became OpenAI's exclusive cloud partner in 2019 with a $1B investment. The deal gave Microsoft exclusive IP licensing rights to all pre-AGI technology and locked OpenAI to Azure as its sole cloud provider, a restriction hinging on an undefined AGI threshold.
2. The breakup's root cause was OpenAI hiding its reasoning-model breakthroughs from Microsoft. After the o1 launch in September 2024, Microsoft AI CEO Mustafa Suleyman yelled at OpenAI staff, including then-CTO Mira Murati, for withholding chain-of-thought documentation, marking the beginning of the end.
3. The 2025 renegotiation extended Microsoft's IP rights to 2032 but stripped exclusivity: Microsoft's license became non-exclusive, the AGI-definition clause was replaced by an independent expert panel, and OpenAI dropped revenue-share payments, effectively gutting Microsoft's leverage.
4. Amazon invested $50B in OpenAI and became its exclusive third-party cloud distribution partner. AWS will host OpenAI's Frontier agent platform, provide 2 gigawatts of Trainium capacity, and co-develop a stateful runtime environment on Amazon Bedrock.
5. Anthropic's explosive enterprise growth, hitting a $30B run rate, is the real reason OpenAI fled Azure. Anthropic's deep AWS Bedrock integration meant enterprises already on AWS defaulted to Claude, a structural sales advantage OpenAI couldn't match while locked to Azure.
6. Anthropic's steep cloud revenue-share deals paradoxically blocked startup credit usage. Theo couldn't spend AWS, GCP, or Azure startup credits on Anthropic models because the cloud providers would owe Anthropic roughly 50% of that spend, making free credits financially untenable for them.
7. Azure's OpenAI inference was catastrophically unreliable, bottoming out at 0.3–2 tokens/second versus 70–240 TPS from OpenAI's own API. After Theo published a public benchmark and threatened to keep reposting it, Microsoft fixed the bug within a day, confirming the issue was a longstanding implementation defect.
8. The next AI war will shift from model competition to silicon competition. Theo predicts the battleground moves from OpenAI vs. Anthropic vs. Gemini to Nvidia vs. AMD vs. AWS Trainium (with Trainium 4 arriving in 2027) vs. Google TPUs.
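The tokens-per-second comparison in point 7 can be reproduced with a very small harness. This is a hedged sketch, not Theo's actual benchmark: the token stream here is simulated, and a real run would iterate over a streaming response from an OpenAI-compatible chat-completions endpoint instead.

```python
import time

def tokens_per_second(token_count: int, elapsed_seconds: float) -> float:
    """The throughput metric behind figures like 0.3 TPS vs. 240 TPS."""
    if elapsed_seconds <= 0:
        raise ValueError("elapsed time must be positive")
    return token_count / elapsed_seconds

def benchmark(stream, clock=time.perf_counter) -> float:
    """Consume a token stream and report overall tokens/second.

    `stream` stands in for a real streaming API response; `clock` is
    injectable so the timing logic can be tested deterministically.
    """
    start = clock()
    count = sum(1 for _token in stream)  # count tokens as they arrive
    return tokens_per_second(count, clock() - start)

if __name__ == "__main__":
    # Simulated stream standing in for a real provider response.
    fake_stream = iter(["tok"] * 500)
    print(f"{benchmark(fake_stream):.0f} TPS")
```

Running the same harness against both an Azure deployment and OpenAI's own endpoint, with an identical prompt, is enough to surface the kind of gap described above.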
