Hassabis: We Are Three Quarters of the Way to AGI

This is an AI-generated summary. The source video may include demos, visuals and additional context.
In Brief
Demis Hassabis founded DeepMind in 2010 with a mission he still recites from memory: "Step one: solve intelligence. Step two: use it to solve everything else." Speaking at Sequoia's AI Ascent 2026, he says the field is three quarters of the way there. And he is more specific than ever about what waits on the other side.
Secret keepers in 2010
When Hassabis started DeepMind, academia rolled its eyes. Deep learning (see glossary) was barely a recognized term. Reinforcement learning (see glossary) was confined to toy problems. AGI (artificial general intelligence, AI that can do everything a human can) was considered science fiction.
And yet: Hassabis and his colleagues saw something others had missed. They believed that deep learning combined with reinforcement learning held untapped potential, that the computing power was coming, and that the path to AGI was shorter than the world assumed.
"We almost felt like we were keepers of a secret," he says, "because neither academia nor industry really believed that any big progress was possible."
They set themselves a 20-year horizon from 2010. And Hassabis believes the field is broadly on track.
The mission that has not changed
Much in the AI industry shifts quickly. Hassabis's core positions have not.
2030 remains his answer on AGI, as it has been for years. At AI Ascent: "2030. I've been pretty consistent about that."
And he is equally clear on why AGI matters in the first place. Not the technology itself, but what it can be turned toward: curing diseases, advancing materials science, tackling energy. Hassabis is not interested in AGI as an academic achievement. He is interested in AGI as the ultimate scientific instrument.
Drug discovery: from ten years to days
The most concrete example he offers is medicine.
AlphaFold (DeepMind's AI that solved the 50-year protein folding problem, see glossary) is just the beginning. DeepMind spinout Isomorphic Labs is now working to automate the next step: designing the drug candidates themselves.
Discovering a new drug takes roughly ten years on average today. Most of that time is exploration: testing thousands of chemical compounds to find one that binds to the right site on a protein without binding to anything else, which would cause side effects.
Hassabis describes the dream: 99 percent of that exploration can be done in silico, inside a computer. Only the validation step needs a lab. And if that becomes possible, something he believes is within reach in the next few years, the timeline collapses. Ten years to months. Months to weeks. One day, perhaps, to days.
At that point, all diseases are in principle within reach. Personalized medicine becomes possible.
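The screening step Hassabis wants to move in silico can be caricatured in a few lines. The sketch below is purely illustrative: `toy_affinity` is a random stand-in for a real binding predictor (in practice a physics simulation or a learned model), and every compound and protein name is made up.

```python
import random

def toy_affinity(compound, protein):
    """Stand-in for a real binding predictor: returns a deterministic
    pseudo-random 'affinity' between 0 (no binding) and 1 (strong)."""
    return random.Random(f"{compound}|{protein}").random()

def screen(compounds, target, off_targets, bind=0.9, avoid=0.5):
    """Keep compounds that bind the target strongly but none of the
    off-target proteins (off-target binding ~ side effects)."""
    hits = []
    for c in compounds:
        if toy_affinity(c, target) < bind:
            continue  # too weak against the target
        if any(toy_affinity(c, p) >= avoid for p in off_targets):
            continue  # binds somewhere else -> likely side effects
        hits.append(c)
    return hits

compounds = [f"cmpd-{i}" for i in range(100_000)]
hits = screen(compounds, "target-protein", [f"off-{j}" for j in range(5)])
print(f"{len(hits)} candidates survive for lab validation")
```

The two thresholds mirror the sentence above: one for potency (binds the right site) and one for selectivity (binds nothing else). The expensive part, evaluating affinity for thousands of compound-protein pairs, is exactly what Hassabis wants done inside the computer, leaving only the surviving hits for the lab.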
Machine learning is biology's mathematics
Hassabis believes AI-driven science will create entirely new disciplines.
The reasoning is elegant. Physics is powerful because we can write equations. F = ma. E = mc². Mathematics is the language of description for physics. But biology is different. It runs on millions of weak signals, complex chains of cause and effect, and data volumes no human mind can process by hand. Mathematics has not been able to capture it.
According to Hassabis, "machine learning is the perfect description language for biology in the same way maths is for physics." That is why AlphaFold succeeded where everything else had failed. Not because AI is magic, but because certain kinds of systems require statistical learning to be described at all.
The same applies to economics. You cannot raise interest rates a thousand times and observe the results, but you can run an accurate simulator a thousand times. Hassabis suggests that the next generation of research might extract fundamental laws of economics the way Maxwell extracted the laws of electromagnetism. The equations would just need to be pulled out of a simulator rather than derived by a human mind.
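The "run it a thousand times" idea can be made concrete with a toy. Everything below is invented for illustration: the one-line "economy" just pulls inflation toward a made-up rate-dependent target, which is nothing like a real macroeconomic model. The point is only that a simulator turns an impossible experiment into a repeatable one.

```python
import random

def simulate_economy(rate, n_quarters=40, seed=0):
    """Toy model: inflation mean-reverts toward a target that falls
    as the policy rate rises. The relationship is entirely made up."""
    rng = random.Random(seed)
    inflation = 0.05
    target = 0.06 - 0.8 * rate  # invented structural relationship
    for _ in range(n_quarters):
        inflation += 0.5 * (target - inflation) + rng.gauss(0, 0.002)
    return inflation

def average_outcome(rate, n_runs=1000):
    """The experiment no central bank can run: the same policy,
    replayed a thousand times under different random shocks."""
    return sum(simulate_economy(rate, seed=i) for i in range(n_runs)) / n_runs

for rate in (0.01, 0.03, 0.05):
    print(f"policy rate {rate:.2f} -> avg long-run inflation {average_outcome(rate):.3f}")
```

Hassabis's suggestion is the step after this: given enough such runs, one could try to recover the hidden `target = f(rate)` law from the simulator's outputs, the way Maxwell distilled field equations from experimental data.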
DeepMind is already working on what Hassabis calls a virtual cell: a simulation of a biological cell with all its dynamic processes. If it works, it would be a scientific instrument of a kind that has never existed before.
Information is more fundamental than matter and energy
The conversation moves into philosophy, and Hassabis is clearly at home there.
Einstein showed that matter and energy are two sides of the same coin: E = mc². Hassabis believes information is a third, and more fundamental, element. He argues the universe is best understood as an information-processing system, not one where matter and energy are primary with information as a byproduct, but the reverse.
Biology is a natural example: living organisms actively resist entropy (the tendency of systems to fall into disorder) by organizing information. From that perspective, a cell and a neural network are variants of the same phenomenon.
DeepMind sees itself as Turing's heirs. Alan Turing proved that everything computable can be computed by a single, simple machine. AlphaFold demonstrated something surprising: a classical computer could model protein folding, even though proteins are quantum systems at the atomic level. That suggests quantum mechanics may not be necessary to solve many of the problems we assumed required it.
Build the tool first
A recurring question in advanced AI discussions: is it a tool, or is it a subject with inner experience? Hassabis frames this as a question of sequence, not a binary choice. Build the tool first. Then use that tool to address the deeper questions.
Philosopher Daniel Dennett (who died in 2024, and with whom Hassabis had long conversations) believed the consciousness debate comes down to two things: behavior and substrate. Does the system behave like a conscious being? And is it made of the same stuff as us? The first question is empirical. The second is philosophical. And that second gap, Hassabis suggests, will always be there with artificial systems.
Rapid fire
The conference's lightning round surfaces a few answers:
Must-read book post-AGI: The Fabric of Reality by David Deutsch. Hassabis hopes to use AGI to answer the questions Deutsch poses in it.
Proudest DeepMind moment: AlphaFold.
Scientist to pick for a strategy game? John von Neumann. "He's a game theorist," Hassabis says. "He's the best."
The lesson from Elixir Studios: You want to be five years ahead of your time, not fifty. In his twenties, Hassabis's first company tried to simulate an entire country on a Pentium processor. It was a touch too ambitious. But only a touch.
Glossary
| Term | Definition |
|---|---|
| AGI | Artificial general intelligence: AI that can perform any cognitive task as well as a human, as opposed to today's specialized systems |
| AlphaFold | DeepMind's AI that solved the 50-year protein folding problem; the work earned Demis Hassabis and John Jumper the 2024 Nobel Prize in Chemistry |
| Protein folding | The process by which a protein chain folds into a specific 3D shape. The shape determines what the protein does in the body, which is critical for drug design |
| In silico | Latin for "in silicon": experiments conducted as computer simulations rather than in a physical lab |
| Deep learning | A family of machine learning methods loosely inspired by neurons in the brain. The foundation of modern AI, including ChatGPT and AlphaFold |
| Reinforcement learning | A training method where AI learns through trial and error with reward signals, like learning a game by playing it millions of times |
| Turing machine | Alan Turing's theoretical model of a computer, used to show that anything computable can be computed by a single simple machine. The foundation of all modern computing |
| Entropy | The tendency of systems to move toward disorder. Living organisms spend energy actively fighting it, which is itself a form of information processing |
| Quantum system | A system governed by the rules of quantum physics at the atomic scale, where classical physical laws no longer fully apply |
Sources and resources
Want to go deeper? Watch the full video on YouTube →