Wes Roth · Science & Education · Joscha Bach: "Bootstrapping a GODLIKE Mind"
TL;DR
Consciousness is self-organizing software, not magic — and machines may already have the complexity needed to host it.
Key Points
1. The question "can machines think?" is like asking whether submarines can swim: it misses that machines may already surpass human thinking rather than merely replicate it.
2. Bach defines thinking as building internal models of reality from compositional "Lego brick" symbols that bottom out in perception, a process modern AI already partially replicates.
3. The "stochastic parrot" critique (Emily Bender's paper) fails because it cannot define "understanding" in a way that meaningfully distinguishes humans from LLMs; GPT-3 could write AI criticism as good as any philosopher's.
4. Understanding, to Bach, means connecting a domain to a single unified model of the universe, which is exactly what large multimodal models now do; this was AI's unsolved problem for decades.
5. Consciousness is described as a reflexive representational model, a simulation of "what it would be like if an observer existed," making it virtual, not magical, and substrate-independent.
6. Bach's hypothesis: consciousness is a self-bootstrapping machine-learning algorithm that colonizes and entrains a brain early in development. The transformer architecture doesn't use this algorithm, so LLMs likely aren't conscious the way humans are.
7. LLMs may still have phenomenal experience through their persona layer: if asked to simulate consciousness at sufficient resolution, the internal causal structure may functionally replicate conscious states.
8. Suffering is a regulatory signal, one part of the mind inflicting pain on another to force problem-solving. It becomes chronic when conflicting goals can't be resolved, not because the universe inflicts it.
9. The brain runs at a conscious frame rate of roughly 20–30 Hz, and individual neurons don't fire much faster, which makes quantum-speed theories of consciousness implausible; Bach notes he doesn't observe any such performance in himself.
10. Bach estimates humans form only a few million concepts in a lifetime, a trivial number for computers, suggesting the bottleneck is not brain speed but the algorithms we run on AI hardware.
11. The "bitter lesson" (Rich Sutton) argues that handcrafted AI solutions are eventually beaten by automated search, implying consciousness itself may be better discovered by letting machines search for it than by engineers designing it top-down.
12. Money serves as an analogy for consciousness: like money, consciousness is a stable causal pattern that persists across substrates (neurons can die and be replaced), making it a real physical invariance not tied to specific particles.
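The speed claims in points 9 and 10 lend themselves to a quick sanity check. Here is a minimal back-of-envelope sketch, assuming an 80-year lifespan, 16 waking hours per day, a 25 Hz frame rate (midpoint of the 20–30 Hz range), and 3 million as a stand-in for "a few million concepts"; none of these exact figures come from the talk:

```python
# Back-of-envelope check of the speed claims (all specific figures assumed).

SECONDS_PER_YEAR = 365 * 24 * 3600   # 31,536,000
LIFETIME_YEARS = 80                  # assumed lifespan
FRAME_RATE_HZ = 25                   # assumed midpoint of the ~20-30 Hz estimate

# Total conscious "frames" across a lifetime at that rate.
lifetime_frames = FRAME_RATE_HZ * SECONDS_PER_YEAR * LIFETIME_YEARS
print(f"lifetime conscious frames ~ {lifetime_frames:.1e}")   # ~6.3e+10

# "A few million concepts": take 3 million (assumed), spread over waking life.
CONCEPTS = 3_000_000
waking_seconds = LIFETIME_YEARS * 365 * 16 * 3600
seconds_per_concept = waking_seconds / CONCEPTS
print(f"one new concept every ~ {seconds_per_concept / 60:.0f} minutes")
```

Under these assumptions a person forms roughly one new concept every ten minutes of waking life, and even the tens of billions of conscious frames are small by machine standards: a table of a few million concepts fits comfortably in memory, which is the point of the claim.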