
Google's TPU Chips Take Aim at NVIDIA's Dominance

April 7, 2026 · 4 min read · 805 words
Google · AI Hardware · NVIDIA · Anthropic · AI Infrastructure
MacKenzie Sigalos reporting on Google's TPU chip business on CNBC
Image: Screenshot from YouTube.

Key insights

  • Google's TPUs were built for internal use. Customers now buy them directly, turning a cost-saving efficiency tool into a standalone revenue line for Alphabet.
  • Anthropic tripling its capacity on Google chips is a major signal. A serious OpenAI competitor choosing TPUs over NVIDIA GPUs validates Google silicon as a credible alternative, not just a cheaper option.
  • Google is the only big tech company with custom AI chips, a cloud platform, top-tier models, and products people use every day (Search, YouTube, Android) all under one roof. No peer can replicate that stack.
  • A year ago many feared AI chatbots would make Google Search obsolete. Analysts now see a potential $900 billion chip opportunity the stock price has not yet reflected.
CNBC Television
Host: Carl Quintanilla, CNBC
Guest: MacKenzie Sigalos, CNBC

This is an AI-generated summary. The source video may include demos, visuals and additional context.


In Brief

Google's custom AI chips have moved from an internal tool to something customers can actually buy. Broadcom confirmed a five-year deal to develop and supply Google's in-house chips, sending Broadcom shares higher on the news. Anthropic, the AI company backed by Google and a direct competitor to OpenAI, is set to more than triple its computing capacity on Google chips by next year. CNBC's MacKenzie Sigalos broke down what this means for Alphabet (Google's parent company): a potential $900 billion market opportunity that analysts say the stock price hasn't caught up with yet.

From Google's internal secret to a product

For years, Google's TPUs were not for sale. A TPU, or Tensor Processing Unit, is a chip Google designed from scratch specifically for AI workloads: training models, running search, and powering Gemini. Unlike NVIDIA's GPUs (Graphics Processing Units), which were originally built for gaming and later adapted for AI, TPUs were built with one job in mind from day one.

The chips have been available through Google Cloud since 2018, but the current shift is different. Broadcom confirmed a deal to develop and supply Google's in-house chips for the next five years, a signal that demand has reached a scale where the supply chain needs long-term commitments. Customers are now leasing and buying TPUs directly, looking for cheaper alternatives to NVIDIA hardware.

The timing is not accidental. It coincides with Gemini's comeback. A year ago Google's AI model was seen as trailing ChatGPT. Since then it has started to outperform in benchmark tests (standardized comparisons used to rank AI models), and interest in the underlying chips followed. Broadcom's purchase orders for Anthropic's TPU capacity reveal exactly how competitive Google's pricing is: Google can beat NVIDIA on cost.

Less flexible, but built for the job

TPUs trade breadth for depth. They are not as flexible as NVIDIA's chips, which can handle a wide range of computing tasks. But for the specific work of training and running AI models, they are less expensive and more efficient.

That specialization is exactly what large AI customers need. Anthropic's computing capacity from Google's chips alone is set to more than triple by next year, a figure that comes from Broadcom's own purchase orders. The numbers confirm what the deal structure implied: Google can offer AI-specific performance at a lower cost, and major customers are committing at scale.

The demand is accelerating, not leveling off. As more AI companies look for cost-effective ways to scale their compute, Google's purpose-built chips become an increasingly attractive alternative to NVIDIA hardware.

A $900 billion market that the stock hasn't priced in

Investment firm D.A. Davidson has put a number on the opportunity. Google's chip business could capture 20% of the AI market over the next few years, making it a roughly $900 billion opportunity, potentially worth more than Google Cloud itself.

That is a striking comparison. Google Cloud is already one of the three largest cloud platforms in the world. An opportunity potentially bigger than that puts Google's chip business in a different category entirely: not a side project, but something that could rival the cloud business Google has spent years building.

The market hasn't caught up yet. Google still trades at a discount to Apple and Amazon, even though analysts now see the chip business as a serious new way to make money. Carl Quintanilla noted on air that last summer the dominant conversation was about the fear that AI chatbots would make Google Search obsolete. The rethink since then has been one of the more striking reversals in how Wall Street talks about a major tech company.

The vertical integration edge nobody else has

This is where Google's position becomes genuinely hard to replicate. Vertical integration means controlling multiple layers of a product stack rather than depending on outside suppliers. Google has the custom chips, the cloud platform, the Gemini models, and the consumer products all under one roof. No other big tech peer has all four.

Microsoft and Amazon both run massive cloud platforms, but neither designs the AI chips that power them at the same depth. NVIDIA makes the dominant AI chips but has no cloud platform or products people use every day. Apple has world-class chips and consumer products, but no cloud infrastructure and no top-tier AI model competing at Gemini's level.

Google's full-stack position turns what looks like a chip story into something structurally bigger. Each layer reinforces the others: better chips make Gemini faster, a stronger Gemini makes Google Cloud more attractive, more cloud customers drive demand for more chips. The Broadcom deal locks in supply for five years. Google is betting the loop will keep tightening.

Glossary

TPU (Tensor Processing Unit): Google's custom-designed AI chip, built specifically for machine learning workloads like training and running AI models.

GPU (Graphics Processing Unit): General-purpose chips, mainly from NVIDIA, used for AI and graphics. More flexible than TPUs but typically more expensive for AI-specific tasks.

Vertical integration: When a company controls multiple layers of its own product stack (chips, cloud, models, apps) instead of relying on outside suppliers.

