
Sam Altman: AI Has Crossed Into Major Economic Utility

March 12, 2026 · 6 min read · 1,140 words
Tags: AI, OpenAI infrastructure investment, Stargate data center, AI economic impact, US-China AI competition
Sam Altman speaking at BlackRock's U.S. Infrastructure Summit in Washington, D.C.
Image: Screenshot from YouTube.

Key insights

  • Altman says AI crossed into 'major economic utility' in recent months, with coding agents now handling multi-hour tasks and startups counting compute instead of employees.
  • OpenAI's $110 billion funding round, four times larger than Saudi Aramco's record IPO, will finance Stargate data centers and a custom inference chip deploying at scale by end of 2026.
  • The cost of answering a hard problem with AI dropped 1,000x from model o1 to GPT-5.4 in roughly 16 months, and Altman expects that pace of cost reduction to continue.
  • Altman predicts a potentially deflationary AI economy where GDP falls but quality of life rises, and calls the coming workforce adjustment 'painful' but sees no long-term jobs doom.
Source: YouTube
Published March 11, 2026
DRM News
Host: Bayo Ogunlesi
Guest: Sam Altman, OpenAI

This is an AI-generated summary. The source video includes demos, visuals, and context not covered here.

In Brief

Sam Altman, CEO of OpenAI, sat down with Bayo Ogunlesi, chairman of Global Infrastructure Partners (GIP/BlackRock) and an OpenAI board member, at BlackRock's U.S. Infrastructure Summit in Washington, D.C. on March 11, 2026. Altman argued that AI has moved from a research curiosity into a genuine economic force, backed by a record-breaking $110 billion funding round and a 1,000-fold drop in the cost of running AI models. He also addressed the US-China race for AI dominance and a new partnership with construction unions, and shared his view that AI could eventually produce a deflationary economy in which costs fall faster than measured output.


What happened

Speaking at an event that gathered leaders from government, technology, and finance, Altman told the audience that something fundamental shifted in the past few months. AI has crossed into "major economic utility," he argued, meaning it now produces real economic value rather than just demonstrating technical potential.

The numbers are striking. OpenAI's tools now reach 900 million people and businesses worldwide. Altman said AI software engineers can handle tasks that take multiple hours today, with multi-day and eventually multi-week autonomous work coming soon. At Y Combinator, the startup accelerator Altman once led, founders no longer pitch with headcount projections. "The question is: how much compute do you have?" he said, describing the rise of "zero-person startups" that run almost entirely on AI capacity rather than employees.

Altman also said the term artificial general intelligence (AGI) has "ceased to have much meaning." AGI refers to AI that matches human ability across virtually all cognitive tasks. He prefers a more concrete milestone: by roughly late 2028, he expects more cognitive capacity to exist inside data centers than outside them, meaning AI systems collectively processing more than all humans combined.


The $110 billion bet on infrastructure

To get there, OpenAI needs physical infrastructure on a scale that has no recent precedent. The company recently closed a $110 billion funding round, which Altman described as roughly four times larger than Saudi Aramco's $25 billion IPO, the largest public offering ever recorded. Strategic partners include Amazon, Nvidia, and SoftBank.

The goal is to make intelligence "too cheap to meter." The phrase comes from early nuclear power advocates who imagined electricity so abundant that billing for it would not be worth the hassle. Altman wants AI to flood the world the way cheap electricity did: invisibly embedded in everything.

The first Stargate data center site in Abilene, Texas is already active. Altman said OpenAI is currently training what he hopes will be "the best model in the world" there. Stargate is a joint venture between OpenAI and SoftBank to build large-scale AI data centers across the United States.

Alongside the data center build-out, OpenAI is developing its own custom chip. Unlike the high-speed graphics processing units (GPUs) from Nvidia that power AI training, this chip is designed purely for inference: generating responses from a trained model. The target is the cheapest chip with the best performance per watt, not the fastest, since inference at scale rewards cost and energy efficiency over raw speed. Altman expects the first custom chips to be deployed at scale by the end of 2026.

How fast are costs actually falling?

Altman shared a figure that explains why the investment makes sense. The cost of reaching a correct answer on a hard problem dropped 1,000-fold from the o1 model to GPT-5.4 in roughly 16 months. A cost curve that steep means spending heavily on infrastructure today could look cheap in hindsight.
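As a back-of-the-envelope check (the 1,000x and 16-month figures are from the talk; the per-month decomposition below is our own illustrative arithmetic, assuming a constant monthly rate), that curve implies costs falling by a factor of about 1.54 every month:

```python
import math

# Figures from the talk: 1,000x cost drop from o1 to GPT-5.4 over ~16 months.
total_reduction = 1000
months = 16

# Assumption (ours, not Altman's): the reduction compounds at a constant
# monthly rate, so each month costs shrink by the same factor.
monthly_factor = total_reduction ** (1 / months)
print(f"Implied monthly cost-reduction factor: {monthly_factor:.2f}x")  # ~1.54x

# Equivalent halving time: months for cost to fall by half at this rate.
halving_months = math.log(2) / math.log(monthly_factor)
print(f"Implied cost halving time: {halving_months:.1f} months")  # ~1.6 months
```

In other words, under a constant-rate assumption, the quoted curve amounts to costs halving roughly every seven weeks.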


Union partnership and the people building AI's backbone

Building data centers at this pace requires skilled construction workers. OpenAI has signed a new partnership with NABTU, the North American Building Trades Unions, to access those workers through established union frameworks. NABTU president Sean McGarvey appeared at the summit alongside Altman.

The partnership is notable because it ties the AI industry's expansion explicitly to organized labor, at least on the construction side. For every data center that processes AI requests, there are welders, electricians, and pipefitters who built it: a workforce that looks nothing like Silicon Valley's typical hiring profile.


The US-China competition

Ogunlesi pressed Altman on the geopolitical dimension of the AI race. Altman described a division of leadership: the United States leads on frontier models, the most capable AI systems at the cutting edge, while China leads on the cheapest inference and open-source AI.

He pushed back on framing deep learning, the core technique behind modern AI, as a technology that one country invented or owns. "This is more like discovering a fundamental property of physics," he said, comparing it to calculus or quantum mechanics: a discovery that any sufficiently advanced civilization would eventually make. The implication is that no one can monopolize the underlying science, only the lead time and the infrastructure built on top of it.

One data point he highlighted on global uptake: in India, usage of Codex, OpenAI's AI coding tool, grew tenfold in a matter of months.


What this means for the economy and jobs

Ogunlesi closed the conversation with the question that hangs over every AI discussion in 2026: what happens to workers? Altman said he is "not a long-term jobs doomer," believing that AI will eventually create more economic activity and employment than it displaces. But he expects the adjustment period coming in the next few years to be "painful".

He also floated a more unusual economic possibility: a "forever deflationary world" where AI drives costs down so fast that measured GDP falls even as actual quality of life rises. Standard economic measures were built around scarcity. If AI makes cognitive labor abundant, those measures may no longer capture what is actually happening to human welfare.
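A toy calculation makes the tension concrete. All numbers below are invented for illustration (they are not from the talk): if the price of a unit of cognitive work falls faster than usage grows, measured spending shrinks even as real output explodes.

```python
# Illustrative only: hypothetical prices and volumes, not figures from the talk.
# Before: $10 per task, 100 tasks performed.
price_before, units_before = 10.0, 100
# After: price falls 1,000x, usage grows 500x.
price_after, units_after = 0.01, 50_000

# Nominal spending, a GDP-style measure of economic activity.
nominal_before = price_before * units_before   # 1000.0
nominal_after = price_after * units_after      # 500.0

# Measured spending falls by half...
assert nominal_after < nominal_before
# ...even though real output (tasks completed) rose 500-fold.
growth_in_real_output = units_after / units_before  # 500.0
print(nominal_before, nominal_after, growth_in_real_output)
```

The gap between the two lines is the measurement problem Altman is pointing at: GDP tracks the shrinking spending figure, while welfare tracks the exploding output figure.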


How to interpret these claims

Altman is an optimist with a direct financial interest in AI adoption, and Ogunlesi is both the interviewer and a board member of OpenAI, which limits the adversarial pressure in this conversation. Several claims deserve scrutiny.

The 1,000x cost reduction is real and well-documented, but it describes a specific benchmark: hard reasoning problems where model o1 and GPT-5.4 both produce correct answers. Costs for other use cases may have fallen by different amounts. The late-2028 data center projection is Altman's personal estimate, not a consensus figure. It relies on contested assumptions about how to quantify human and machine cognition, and about whether the two can be compared at all.

The "too cheap to meter" vision also glosses over energy and water consumption, which scale with compute and are not cheap at all. The Stargate buildout has drawn scrutiny for its land and power demands, a tension Altman did not address at the summit.


Glossary

  • Inference: When an AI model generates a response to a query. The "live" phase, as opposed to training.
  • Training: The process of teaching an AI model by processing massive amounts of data. Happens once (or periodically), then the model is deployed.
  • GPU (Graphics Processing Unit): Specialized chips originally built for rendering video game graphics, now the primary hardware for training and running AI models.
  • Compute: The processing power, measured in GPU-hours or similar units, needed to train and run AI models. Often discussed like a resource to be purchased or rationed.
  • Stargate: A joint venture between OpenAI and SoftBank to build large-scale AI data centers across the United States.
  • AGI (Artificial General Intelligence): AI that matches or exceeds human-level ability across virtually all cognitive tasks, not just specific domains.
  • Scaling laws: Mathematical relationships showing that more compute and more data predictably produce smarter AI models. The basis for the bet that bigger infrastructure equals better AI.
  • NABTU: North American Building Trades Unions, representing skilled construction workers in the US and Canada.
