
Living Brain Cells Play Doom on a Biological Computer

March 10, 2026 · 5 min read · 906 words
AI · biological computing · Cortical Labs CL1 · brain-computer interface · biocomputation
The Cortical Labs CL1 biological computer, a black box containing living human neurons on a microchip.
Image: Screenshot from YouTube.

Key insights

  • Around 200,000 living human neurons inside the CL1 received game inputs as electrical signals and returned motor commands to control a Doom character.
  • The same feat took 18 months with the original hardware. With the new API, an independent researcher implemented it in under a week.
  • Cortical Labs says the interface problem is solved. The next challenge is improving how the neurons learn, not how they connect.
Source: YouTube
Published February 25, 2026
Cortical Labs
Host: Dr. Alon Loeffler

This is an AI-generated summary. The source video includes demos, visuals and context not covered here. Watch the video → · How our articles are made →



In Brief

Cortical Labs, a biological computing company based in Melbourne, Australia, has shown living human brain cells playing the 1993 first-person shooter Doom on its CL1 biological computer (a machine that uses living cells instead of conventional chips to process information). The cells are not taken from anyone's brain: they are neurons (nerve cells) grown from human stem cells (cells that can develop into different cell types) in a laboratory and placed on a silicon microchip. The CL1 houses around 200,000 such neurons, which receive game data as electrical signals and fire back impulses that the system reads as commands: shoot, move, turn. Using a new cloud-based Application Programming Interface (API), an independent researcher named Sean Cole implemented the Doom demo in under a week. The company describes the result as proof that the core interface challenge in biological computing has been solved, and is now inviting developers and researchers to access the platform.

~200K
living human neurons inside the CL1
18 months
to get neurons playing Pong with original hardware
< 1 week
to implement Doom with the new API

What happened

Dr. Alon Loeffler, a postdoctoral scientist at Cortical Labs, presented the Doom demo in a video published on February 25, 2026. The neurons did not merely respond to button presses: they received game state as electrical stimulation and generated movement commands in return.

The setup works like this. The CL1's microchip is a multi-electrode array (MEA): a grid of tiny electrodes that can both deliver electrical pulses to neurons and record the signals the neurons fire back. When a demon appears on the left side of the screen, electrodes stimulate the left side of the neural culture (the collection of lab-grown cells). The neurons fire in response, and the system reads those firing patterns as game commands: shoot, turn right, move forward (3:08).
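The spatial mapping described above can be sketched in a few lines. This is a minimal illustration, not Cortical Labs' actual encoding: the grid size, neighborhood width, and the `stimulation_pattern` function are all assumptions made for the example.

```python
# Hypothetical sketch: mapping an enemy's on-screen position to a
# stimulation pattern on a multi-electrode array (MEA). The electrode
# grid size and pulse layout are illustrative assumptions, not
# Cortical Labs' published parameters.

GRID_COLS = 8          # assumed number of electrode columns across the culture
SCREEN_WIDTH = 320     # Doom's native horizontal resolution

def stimulation_pattern(enemy_x: int) -> list[int]:
    """Return per-column pulse flags: 1 = stimulate, 0 = idle.

    An enemy on the left of the screen activates electrodes on the
    left of the neural culture, mirroring the spatial mapping the
    video describes.
    """
    col = min(enemy_x * GRID_COLS // SCREEN_WIDTH, GRID_COLS - 1)
    pattern = [0] * GRID_COLS
    # stimulate the mapped column and its immediate neighbours
    for c in (col - 1, col, col + 1):
        if 0 <= c < GRID_COLS:
            pattern[c] = 1
    return pattern

# An enemy near the left screen edge drives the leftmost electrodes:
print(stimulation_pattern(10))   # -> [1, 1, 0, 0, 0, 0, 0, 0]
```

The key property is that the encoding preserves spatial structure, so "demon on the left" arrives at the culture as "pulses on the left."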

Sean Cole, an independent researcher and collaborator, translated the Doom video feed into these stimulation patterns using Python commands through the Cortical Labs API. He had a working version running in under a week (3:38).
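The closed loop (frame in, stimulation out, spikes back, command decoded) can be sketched as below. The real Cortical Labs API's method names and signatures are not given in this article, so the client class here is a stand-in with invented names, and the spike response is simulated.

```python
# Hypothetical closed-loop sketch of the Doom pipeline. FakeCultureClient
# stands in for the Cortical Labs cloud API; its method names
# (stimulate, read_spikes) are assumptions for illustration only.
import random

ACTIONS = ["shoot", "turn_left", "turn_right", "move_forward"]

class FakeCultureClient:
    """Simulated culture: stimulated electrode regions fire more."""
    def stimulate(self, pattern: list[int]) -> None:
        self._last = pattern
    def read_spikes(self) -> list[int]:
        # toy model: stimulated regions spike 5-12 times, idle ones 0-2
        return [p * random.randint(5, 10) + random.randint(0, 2)
                for p in self._last]

def decode_action(spikes: list[int]) -> str:
    """Map the most active electrode region to a game command."""
    region = max(range(len(spikes)), key=spikes.__getitem__)
    return ACTIONS[region % len(ACTIONS)]

def game_tick(client, frame_pattern: list[int]) -> str:
    client.stimulate(frame_pattern)      # encode game state as pulses
    spikes = client.read_spikes()        # record the neural response
    return decode_action(spikes)         # translate firing into a command

client = FakeCultureClient()
print(game_tick(client, [0, 1, 0, 0]))  # -> turn_left
```

Run per frame, this loop is the "interface layer" in miniature: everything game-specific lives in how patterns are built and spikes are decoded, which is why swapping Pong for Doom became a software problem once the channel existed.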


Context and background

Cortical Labs' earlier research project, DishBrain, showed in 2021 that neurons could learn to play Pong. That milestone demonstrated what Loeffler describes as "adaptive real-time goal-directed learning" (meaning the cells could adjust their behavior in real time to achieve a goal), but it took over 18 months using the original hardware and software (2:04). Pong also had a simple, direct relationship between inputs and outputs: ball goes up, paddle goes up.

Doom is considerably more complex. It is a three-dimensional environment with enemies, exploration, and no direct input-output relationship. To connect the digital game world to the biological language of neurons, the team had to build an interface layer. That layer translates visual game state into patterns of electrical stimulation and interprets neural activity as action commands.

According to the company's website, the neurons in the CL1 are grown from stem cells directly onto custom silicon chips. The company claims the technology learns from smaller datasets than conventional AI and uses dramatically less energy. The Cortical Cloud platform, which Sean Cole used to build the Doom demo, launched in March 2026 and is now open to developers and researchers.


How to interpret these claims

The neurons play like a beginner who has never seen a computer. The character dies often, but the cells do show signs of learning: they seek out enemies, shoot, and navigate the environment (4:07). Loeffler is explicit that they are nowhere near a competitive level.

The milestone Cortical Labs is claiming is not performance, but connectivity. "We've solved the interface problem," Loeffler says (4:31). The company argues that the hard part was establishing a reliable, programmable channel between software and living neurons. That channel now exists and can be accessed through a standard API.

What remains unsolved is the learning side: better feedback signals, better ways to encode information, and better reward mechanisms. Without improvements there, the neurons cannot progress beyond beginner-level play. The company frames this openly as the next phase of work, not a solved problem.

The broader claims, including potential applications in personalized medicine and cell therapy (treatment using living cells as medicine), are stated on the company's website but are not shown by the Doom experiment. Those remain long-horizon goals. For a different take on how computing itself may be changing, see Karpathy: We're Redoing Computing From the 1960s.


Glossary

Multi-electrode array (MEA): A chip covered in tiny electrodes that can both send electrical pulses to neurons and record the signals neurons fire. Acts as a two-way communication channel between hardware and living cells.
Neuron: A nerve cell. The basic working unit of the brain. Communicates with other neurons by firing electrical signals called spikes.
Spike: A brief electrical signal fired by a neuron. The basic unit of communication in the nervous system.
Biological computer: A computing device that uses living cells, typically neurons, instead of silicon transistors to process information.
Stem cells: Cells that can develop into many different specialized cell types. Used here to grow neurons directly onto the CL1 microchip.
API (Application Programming Interface): A set of instructions that lets one piece of software communicate with another. Here, it lets developers send commands to the CL1 using Python code.
Biocomputation: The use of biological systems, such as living cells, to perform computing tasks.

