Quit Yapping
The Early Days of Anthropic & How 21 of 22 VCs Rejected It | The Four Bottlenecks in AI | Anj Midha
Duration: 1:15:08
20VC with Harry Stebbings


TL;DR

Anj Midha explains how Anthropic faced 21 VC rejections at seed stage and breaks down the four key bottlenecks limiting AI progress today.

Key Points

  1. Anthropic was rejected by 21 of 22 VCs when raising its seed round. Midha introduced the team to investors up and down Sand Hill Road; most didn't know what GPT-3 was, and the target had to be cut from $500M to $100M.
  2. The four bottlenecks in AI are context feedback, compute, capital, and culture. Midha argues culture is the most important, because the right mission-driven culture attracts top researchers who solve the algorithmic and architectural questions on their own.
  3. Context feedback loops, meaning unique domain-specific data, are where the biggest capability gaps exist. A year ago, models were terrible at physics and chemistry because that data is locked inside national labs and semiconductor plants, not on the open internet.
  4. Scaling laws still hold, but only if you look beyond saturated domains like coding. At Periodic Labs, a materials-science company incubated in Menlo Park, throwing more compute at superconductor discovery is producing super-exponential gains per iteration.
  5. Midha co-founded Periodic Labs to close the science data gap with a physical robot lab: LLMs predict new materials, robots synthesize them, X-ray diffraction machines validate their properties, and the verification data is piped back into the training run.
  6. Mistral's core investment thesis is sovereign AI infrastructure independent of US hyperscalers. Because of the US CLOUD Act, mission-critical European workloads can't legally run on AWS, GCP, or Azure, which makes a fully local French stack strategically essential; the thesis culminated in a gigawatt facility announced in Paris alongside Macron and Jensen Huang.
  7. China is catching up through adversarial distillation and full-stack systems co-design rather than chip superiority: take Western frontier models, distill state-of-the-art capabilities at scale, release the results as open source to collect feedback, and iterate. Chinese labs are now approaching frontier performance using Huawei chips integrated up the full stack.
  8. Midha calls for a Western "Iron Dome" for frontier-model inference to counter distillation attacks: all inference from every company would route through a shared proxy that alerts the ecosystem when adversarial distillation spikes are detected. Today this exists only as an informal group chat among founders that he coordinates.
  9. Compute is not fungible today, creating a GPU wastage bubble alongside an infrastructure shortage. H100, GB200, and GB300 clusters can't share workloads, so stranded pools of unutilized compute exist; there are no open standards doing for compute what TCP/IP did for the internet or AC/DC standards did for electricity.
  10. AMP is building an independent compute grid, analogous to the electrical grid, rather than a cloud provider or a VC firm. It has secured roughly 1.3 gigawatts (about $40B of cloud spend over four years), financed with roughly 20% equity (~$10B) and 80% debt, and provides compute to frontier teams at cost as a public benefit corporation.
  11. Midha argues venture capital must return to a co-founding model in the mold of Arthur Rock at Intel or Bob Swanson at Genentech. He spends three days a week at Periodic Labs, holds daily 8 a.m. standups with the CEO, and says deep operational partnership, not just check-writing, is where value accrues in a frontier-technology era.
