
AI Data Centers: Grid Killers or Grid Savers?

April 3, 2026 · 4 min read · 876 words
Tags: AI · AI Infrastructure · AI Research · AI Startups
Ayşe Coskun on the TED stage at TEDAI San Francisco 2025
Image: Screenshot from YouTube.

Key insights

  • Unlike hospitals or homes, AI workloads can be delayed, slowed, or shifted without users noticing. That makes data centers uniquely suited to act as flexible grid assets.
  • Solar output peaks at noon, but electricity demand peaks in the evening. Power-flexible data centers can absorb that mismatch today, without waiting for new infrastructure.
  • In Virginia, AI data centers already face 5-7 year wait times to connect to the grid. Flexibility could let them skip the queue, accelerating AI adoption itself.
Source: YouTube
Published October 21, 2025
TED
Hosts: Ayşe Coskun

This is an AI-generated summary. The source video may include demos, visuals and additional context.

Watch the video · How the articles are generated

In Brief

AI data centers are usually framed as energy hogs. That's true, but Ayşe Coskun, Professor of Electrical and Computer Engineering and Associate Dean at Boston University and Chief Scientist at Emerald AI, says the framing is missing something big. Data centers could be the most flexible electricity consumers on the grid. And that flexibility could help stabilize the very grid everyone is worried about.

Her research, now running on real AI data centers after 12 years of development, makes the case that the technology creating this energy crisis might also be the thing smart enough to fix it.

The problem everyone already knows

New AI data center projects in the US are requesting power loads equal to entire cities. In some regions, utilities simply can't keep up.

The numbers are hard to ignore. Training GPT-4 alone is estimated to have consumed roughly the annual electricity of thousands of US homes. In Ireland, nearly 20% of the nation's electricity is now drawn by data centers. In Virginia's data center alley, residents have already seen their electricity bills climb 20% compared to just a few years ago.

So "energy hog" seems like a fair label. But Coskun says it's only half the story.

The flip side: virtual batteries

Here's the insight that changes the framing. Unlike hospitals or homes, most AI computing jobs can wait. A researcher running AI analysis on hundreds of medical images can tolerate a few extra minutes. A team fine-tuning a model over several days can tolerate a few slower hours. This inherent flexibility in computing is exactly the flexibility a power grid needs.

Coskun describes this as turning data centers into the muscles of the grid, flexing on demand. Think of it like a virtual battery. Rather than physically storing electricity, a data center absorbs excess power when the grid has too much and scales back when the grid is under stress. No new hardware required. No new power plants.
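The virtual-battery idea can be sketched in a few lines: the data center's power target rises when the grid has surplus and falls when the grid is stressed, always clamped to what the facility can tolerate. This is an illustrative sketch under assumed names and units, not Emerald AI's actual control logic.

```python
def virtual_battery_target(baseline_mw: float, grid_surplus_mw: float,
                           max_mw: float, min_mw: float) -> float:
    """Return a power target (MW) for the data center.

    Positive grid_surplus_mw means excess generation on the grid
    (e.g., midday solar); negative means the grid is under stress.
    All parameter names are hypothetical, for illustration only.
    """
    target = baseline_mw + grid_surplus_mw        # absorb surplus, shed under stress
    return max(min_mw, min(max_mw, target))       # clamp to the facility's safe range

# Midday solar glut: ramp up toward the facility ceiling.
print(virtual_battery_target(80.0, 30.0, max_mw=100.0, min_mw=40.0))   # → 100.0
# Evening peak: shed deferrable load down toward the floor.
print(virtual_battery_target(80.0, -50.0, max_mw=100.0, min_mw=40.0))  # → 40.0
```

The battery analogy holds because, from the grid's perspective, raising consumption during surplus and lowering it during stress is indistinguishable from charging and discharging storage.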

This matters even more because the AI boom and the renewable energy boom are arriving at the same time. Wind and solar don't follow human schedules. Solar gives a glut of electricity at noon, but demand peaks in the evening. A power-flexible data center can soak up that midday surplus instead of letting it go to waste, then scale back when the grid tightens later in the day.

How it actually works

Coskun's research produced three core strategies: capping power, shifting workloads, and provisioning data centers as flexible reserves for the grid.

Power capping means setting a ceiling on how much electricity a data center draws during critical grid moments. Workload shifting means moving computing tasks to times when more power is available, or spreading them across multiple data centers in different regions. The key constraint is that user experience stays protected throughout. The system makes performance promises to users and keeps them; it just has more room to maneuver behind the scenes.
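The two strategies can be combined in a toy scheduler: each hour has a power cap, and deferrable jobs are shifted into hours with headroom while still meeting their deadlines, which stands in for the user performance agreement. This is a minimal sketch under assumed job and cap representations, not Coskun's actual system.

```python
def schedule(jobs, caps):
    """jobs: list of (name, energy_needed, deadline_hour) for deferrable work.
    caps: per-hour power caps (energy available under the cap that hour).
    Returns {hour: [(name, amount), ...]}; raises if a deadline can't be met.
    Representation and policy are illustrative assumptions.
    """
    headroom = list(caps)
    plan = {h: [] for h in range(len(caps))}
    # Earliest-deadline-first: the tightest jobs claim capacity first,
    # so keeping the performance promise takes priority over efficiency.
    for name, need, deadline in sorted(jobs, key=lambda j: j[2]):
        for hour in range(deadline + 1):
            take = min(need, headroom[hour])
            if take > 0:
                plan[hour].append((name, take))
                headroom[hour] -= take
                need -= take
            if need == 0:
                break
        if need > 0:
            raise RuntimeError(f"{name} cannot meet its deadline under the caps")
    return plan

# Caps are generous at midday (solar surplus) and tight in the evening.
caps = [10, 50, 50, 10]
jobs = [("finetune", 60, 3), ("batch-infer", 20, 1)]
print(schedule(jobs, caps))
```

In this example the evening hours carry almost no flexible load; the fine-tuning job is pushed into the midday surplus, yet both jobs finish by their deadlines.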

"We reframed the problem. Instead of asking how do we compute as fast as possible, we asked, how do we make computer systems meet the constraints of the power grid, while at the same time still delivering on user performance agreements?"

Prototypes built on real data center servers confirmed it works. And what began as scribbles on a whiteboard 12 years ago is now running in production.

Why the timing is urgent

The power grid's challenge isn't just generating more electricity. It's a timing problem. Batteries can help bridge supply and demand gaps, but scaling them is slow, expensive, and often not environmentally clean. Nuclear takes decades and billions to build.

Meanwhile, AI data centers in places like Virginia face 5-7 year wait times just to connect to the grid. In an industry where technologies shift fundamentally every six months, that's a paralyzing constraint.

The Texas heat wave of August 2023 shows what's at stake on the other side of the equation. During the crisis, wholesale electricity prices spiked over 800% in a single afternoon. Flexible loads, if widely available, could have reduced costs and prevented the emergency alerts sent to consumers.

Power-flexible data centers offer two opportunities at once. They can help stabilize the current grid during peaks and emergencies. And they can get future data centers connected much faster, without waiting for massive infrastructure upgrades.

AI as the conductor

Orchestrating all this flexibility is genuinely hard. Electricity prices change hourly. Workloads arrive unpredictably. Grid rules vary across states and countries. No human operator and no fixed management policy can track all of it in real time.

This is where AI itself re-enters the story. Coskun describes a network of data centers as an orchestra: hundreds of instruments playing at once. Left alone, it sounds like chaos. Bring in a conductor and all that noise becomes music.

"The conductor in this case is AI."

Emerald AI's platform slows down, speeds up, or pauses workloads within a data center, and shifts work between data centers, all while respecting the performance needs of users and cloud providers. The system reads grid conditions, anticipates demand, and coordinates across facilities in real time.
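The conductor's core decision can be caricatured as a mapping from grid stress to per-workload actions, with user-facing work always protected. The workload classes, thresholds, and action names below are assumptions for illustration; they are not Emerald AI's API.

```python
def conduct(workloads, grid_stress: float):
    """workloads: list of (name, flexible) pairs. Flexible work may be
    slowed or paused; inflexible (user-facing) work always runs.
    grid_stress in [0, 1]: 0 = abundant power, 1 = grid emergency.
    Thresholds are illustrative assumptions.
    """
    actions = {}
    for name, flexible in workloads:
        if not flexible:
            actions[name] = "run"        # performance promise always kept
        elif grid_stress > 0.8:
            actions[name] = "pause"      # emergency: shed deferrable load
        elif grid_stress > 0.4:
            actions[name] = "slow"       # tighten the power cap
        else:
            actions[name] = "run"        # surplus: soak up cheap power
    return actions

# During a grid emergency, serving keeps running while training pauses.
print(conduct([("chat-serving", False), ("nightly-finetune", True)], 0.9))
```

The real system must of course track prices, forecasts, and rules across many facilities at once; this sketch only shows the shape of the decision the conductor makes for each workload.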

The result, in Coskun's words, is harmony: reliable electricity, efficient computing, and a system that works together rather than against itself.

Glossary

  • Virtual battery: A data center that acts like a battery by absorbing or reducing power use on demand, without actually storing electricity.
  • Power-flexible data center: A data center that can adjust its electricity consumption up or down based on grid needs.
  • Workload shifting: Moving computing tasks to different times or locations to match available power.
  • Power capping: Setting a ceiling on a data center's electricity draw during grid stress periods.
