
Nvidia (NVDA) Stock: Meta Considers Google TPU Chips for 2027 Data Centers

TLDR

  • Meta is considering using Google’s tensor processing units (TPUs) in its data centers starting in 2027, with potential cloud rentals beginning next year
  • Nvidia shares dropped 3.2% in premarket trading while Alphabet gained 2.1% on the news
  • Meta plans to spend $70 billion to $72 billion on AI infrastructure this year, making it one of the largest spenders globally
  • Google’s TPUs represent growing competition in the AI chip market, with Anthropic already agreeing to purchase up to 1 million units
  • The move reflects tech companies’ efforts to diversify chip suppliers and reduce dependence on Nvidia’s market-leading GPUs

Nvidia took a hit in premarket trading Tuesday, falling 3.2% after reports surfaced that Meta is in talks to use Google’s AI chips. The news sent Alphabet shares up 2.1% as investors digested the potential shift in the AI hardware landscape.


The Information broke the story Monday, reporting that Meta is considering deploying Google’s tensor processing units in its data centers by 2027. The social media giant may also rent TPUs from Google Cloud as early as next year.

For Google, landing Meta as a customer would validate its custom chip technology. Google originally built TPUs for its own internal workloads and began offering them to outside customers through its cloud business in 2018. The chips have evolved through multiple generations, each designed specifically for AI workloads.

The customized nature of TPUs gives Google an edge. Experts point to the efficiency gains that come from chips built for specific tasks rather than general-purpose computing.

Meta ranks among the world’s biggest AI infrastructure spenders. The company projects capital expenditure between $70 billion and $72 billion this year alone. That spending power makes Meta’s chip choices influential across the industry.

Diversification Drives Chip Shopping

Tech companies have been actively seeking alternatives to Nvidia’s graphics processing units. While Nvidia maintains its market leadership, the push for diversification has intensified.

Google recently closed a deal with Anthropic for up to 1 million TPUs. Seaport analyst Jay Goldberg called the agreement a “really powerful validation” for the technology. He noted that many companies were already evaluating TPUs, and even more are likely considering them now.

The architectural differences between the chips matter here. GPUs were originally created to render graphics in video games, and they proved well suited to AI training because they excel at running massive numbers of parallel computations over large volumes of data.

TPUs take a different approach. They are application-specific integrated circuits, built from the ground up for a narrow set of tasks. Google designed them specifically for AI and machine learning workloads.
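
Part of the reason switching is even on the table is software portability. The sketch below is a minimal, hypothetical example (it assumes JAX is installed and is not a description of Meta's or Google's actual stack): frameworks such as JAX compile the same matrix math to whichever accelerator backend they find, so targeting a GPU or a TPU happens in the compiler rather than in application code.

```python
# Minimal, hypothetical sketch (assumes JAX is installed); illustrative only,
# not Meta's or Google's actual software stack. It shows that the same matrix
# math compiles to whichever accelerator backend is present, which is what
# makes moving workloads between GPU and TPU hardware practical in principle.
import jax
import jax.numpy as jnp

# Report which accelerator JAX detected: "gpu", "tpu", or "cpu" as a fallback.
print("Running on:", jax.default_backend())

@jax.jit  # XLA compiles this function for the detected backend.
def dense_layer(x, w):
    # Matrix multiplication dominates modern AI workloads; GPUs and TPUs are
    # judged largely on how fast they run operations like this one.
    return jnp.maximum(x @ w, 0.0)  # a simple ReLU(x @ w)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (1024, 2048))
w = jax.random.normal(key, (2048, 4096))
print(dense_layer(x, w).shape)  # (1024, 4096) on any backend
```

In practice a vendor switch still hinges on performance tuning, networking, and supply, which is partly why the reported deployment timeline runs to 2027.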

Google’s In-House Advantage

Google’s chip development benefits from its AI teams. DeepMind and other units working on models like Gemini provide real-world feedback to chip designers. This creates a cycle of improvement that’s hard for competitors to replicate.

The ability to customize chips for specific AI tasks has proven valuable. Google’s experience running its own AI models means the TPUs reflect actual use cases rather than theoretical requirements.

Bloomberg Intelligence analysts Mandeep Singh and Robert Biggar estimate Meta will spend $40 billion to $50 billion on inferencing chip capacity next year alone. That figure assumes total capital expenditure of at least $100 billion for 2026.
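
As a rough back-of-the-envelope reading of those figures (my arithmetic, not the analysts' own breakdown), inference chips would absorb roughly:

$$ \frac{\$40\ \text{to}\ \$50\ \text{billion}}{\geq \$100\ \text{billion total capex}} \approx 40\%\ \text{to}\ 50\%\ \text{of Meta's projected 2026 spending} $$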

The analysts suggest Google Cloud could see accelerated growth in consumption and backlog. Enterprise customers wanting access to TPUs and Gemini models would need to use Google’s cloud platform.

Asian suppliers connected to Alphabet saw immediate market reactions. IsuPetasys, a South Korean company that supplies multilayer circuit boards to Alphabet, jumped 18% to a new intraday high. Taiwan’s MediaTek rose nearly 5% in early trading.

Advanced Micro Devices remains a distant second to Nvidia in the GPU market. The entrance of Google’s TPUs as a viable third option reshapes competitive dynamics. Companies now have more choices when building AI infrastructure.

Google and Meta representatives declined to comment on the reported discussions. The deals remain under negotiation, with final terms and timing still uncertain.

The post Nvidia (NVDA) Stock: Meta Considers Google TPU Chips for 2027 Data Centers appeared first on CoinCentral.
