Shares of Nvidia, the prominent chipmaker, fell around 4% on Tuesday following reports that Meta Platforms is negotiating a significant purchase of chips from Google, one of Nvidia's rivals.
Meta, which operates Facebook, Instagram, and WhatsApp, could spend billions to use Google's tensor processing units (TPUs) in its data centers by 2027, according to sources involved in the discussions. The sources also said Meta may lease Google chips through Google Cloud as early as next year.
Meta currently relies on Nvidia's graphics processing units (GPUs), so a deal of this scale could shift spending worth roughly 10% of Nvidia's annual revenue toward Google, a considerable windfall for its parent company, Alphabet Inc.
A spokesperson from Google Cloud acknowledged the increasing demand for both custom TPUs and Nvidia GPUs, emphasizing their ongoing commitment to support both technologies. Meanwhile, an Nvidia representative expressed pride in Google’s advancements in AI while reiterating the company’s leading position in the industry.
GPUs are fundamental to AI workloads and are attracting massive investment. Google's TPUs, by contrast, are specialized chips that power many of Google's flagship AI services, including Search and YouTube. Historically, Google has rented its AI chips to clients through Google Cloud, but it now appears to be ramping up efforts to persuade companies like Meta to adopt its chips for internal use.
The competition is heating up, especially since Google is marketing its TPUs as a cost-effective alternative to Nvidia GPUs. Meanwhile, major Nvidia customers like Oracle are struggling to maintain robust profit margins even as they invest heavily in Nvidia's products.
Google is also exploring partnerships amid rising demand for stronger data security, though catching up with Nvidia won't be straightforward: Nvidia holds a dominant position in the AI market, with a market capitalization of roughly $4.2 trillion.
Jensen Huang, Nvidia's CEO, is monitoring Google's moves closely and may pursue a deal with Meta before Google finalizes its arrangement. Even so, Google is making progress in AI, having recently introduced its latest large language model, Gemini 3, to positive feedback.
Meta's discussions with Google reportedly also involve using TPUs for inference — running trained AI models to apply learned patterns and draw conclusions from new data — rather than only for training new models. Analysts suggest that mastering inference chips is crucial to competing effectively with Nvidia.
The sources indicate that Meta is developing its own inference chips to cut costs and lessen its reliance on Nvidia. In addition, Google has built software called TPU Command Center, aimed directly at Nvidia's CUDA software, widely regarded as the industry standard.
Notably, reports suggest that Nvidia's response to Google's advances has already benefited some of Nvidia's key clients. After Google announced a deal to supply up to 1 million TPUs to Anthropic, Huang responded by revealing plans to invest billions in Anthropic. Similarly, when OpenAI disclosed its intention to rent TPUs from Google, Huang quickly arranged a deal to invest up to $100 billion in OpenAI.