Nvidia-Google AI Chip Rivalry Escalates on Report of Meta Talks


Meta Platforms Inc. is in talks to spend billions on Google’s AI chips, the Information reported, adding to a monthslong share rally as the search giant has made the case it can rival Nvidia Corp. as a leader in artificial intelligence technology.

A deal would signal growing momentum for Google’s chips and long-term potential to challenge Nvidia’s market dominance, after the company earlier agreed to supply up to 1 million chips to Anthropic PBC.


Google owner Alphabet Inc. is on track to hit a $4 trillion market valuation for the first time when trading opens in New York on Wednesday. Nvidia’s shares were down about 4% in premarket trading.


Meta is in discussions to use the Google chips — known as tensor processing units, or TPUs — in data centers in 2027, The Information reported, citing an unidentified person familiar with the talks. Meta also may rent chips from Google’s cloud division next year, the news outlet said.

Explainer: How Google’s TPUs May Give Nvidia a Run for Its Money

An agreement would help establish TPUs as an alternative to Nvidia’s chips, the gold standard for big tech firms and startups from Meta to OpenAI that need computing power to develop and run artificial intelligence platforms.

Nvidia’s stock is already facing headwinds as investors fear a broader AI bubble. Michael Burry, immortalized for his bets against the housing market during the 2008 financial crisis, has scrutinized the chipmaker over circular AI deals, hardware depreciation and revenue recognition.

After Google’s Anthropic deal was announced, Seaport analyst Jay Goldberg called it a “really powerful validation” for TPUs. “A lot of people were already thinking about it, and a lot more people are probably thinking about it now,” he said.

“Google Cloud is experiencing accelerating demand for both our custom TPUs and NVIDIA GPUs; we are committed to supporting both, as we have for years,” a spokesperson for Google said.

Representatives for Meta declined to comment.

What Bloomberg Intelligence Says

Meta’s likely use of Google’s TPUs, which are already used by Anthropic, shows third-party providers of large language models are likely to leverage Google as a secondary supplier of accelerator chips for inferencing in the near term. Meta’s capex of at least $100 billion for 2026 suggests it will spend at least $40-$50 billion on inferencing-chip capacity next year, we calculate. Consumption and backlog growth for Google Cloud might accelerate vs. other hyperscalers and neo-cloud peers due to demand from enterprise customers that want to consume TPUs and Gemini LLMs on Google Cloud.

– Mandeep Singh and Robert Biggar, analysts


Asian stocks related to Alphabet surged in early Tuesday trading in Asia. In South Korea, IsuPetasys Co., which supplies multilayered boards to Alphabet, jumped 18% to a new intraday record. In Taiwan, MediaTek Inc. shares rose almost 5%.

A deal with Meta — one of the biggest spenders globally on data centers and AI development — would mark a win for Google. But much depends on whether the tensor chips can demonstrate the power efficiency and computing muscle necessary to become a viable option in the long run.

The tensor chip — first developed more than 10 years ago specifically for artificial intelligence tasks — is gaining momentum outside its home company as a way to train and run complex AI models. Its allure as an alternative has grown at a time when companies around the world worry about an overreliance on Nvidia, in a market where even Advanced Micro Devices Inc. is a distant runner-up.

Graphics processing units, or GPUs, the part of the chip market dominated by Nvidia, were created to speed the rendering of graphics — mainly in video games and other visual-effects applications — but turned out to be well-suited to training AI models because they can handle large amounts of data and computations. TPUs, by contrast, are a type of specialized product known as application-specific integrated circuits, or ASICs — microchips designed for a single, discrete purpose.

The tensor chips were also adapted as accelerators for AI and machine learning tasks in Google’s own applications. Because Google and its DeepMind unit develop cutting-edge AI models like Gemini, the company has been able to take lessons from those teams back to the chip designers. At the same time, the ability to customize the chips has benefited the AI teams.

–With assistance from Riley Griffin and Carmen Arroyo.


©2025 Bloomberg L.P.