
Qualcomm unveils AI data center chips to rival Nvidia and AMD, stock jumps 23%

Qualcomm stock shot up by 23% on Monday, after the company said it’s launching new AI accelerator chips to take on Nvidia and AMD in the most expensive chip war to date.

The announcement, made on October 27, was the company’s loudest statement yet that it’s entering the data center arms race.

The two new chips (AI200, set for release in 2026, and AI250, coming in 2027) won’t be in smartphones. They’ll be powering entire liquid-cooled racks inside massive AI server farms.

According to CNBC, these new chips are a major leap away from Qualcomm’s usual comfort zone of mobile and wireless devices. Both accelerators are designed to fill a full server rack, much like Nvidia’s and AMD’s current systems, which let 72 chips operate as a single computer.

Source: Qualcomm/X

The idea is to give AI labs and hyperscalers the horsepower they need to run massive AI models without relying on Nvidia’s supply chain or settling for AMD’s second-place alternative.

Qualcomm enters full-rack battle with data center AI chips

The AI200 and AI250 are built on the same technology found in Qualcomm’s phone chips: its Hexagon neural processing units (NPUs).

Durga Malladi, the company’s general manager for data center and edge, told reporters last week: “We first wanted to prove ourselves in other domains, and once we built our strength over there, it was pretty easy for us to go up a notch into the data center level.”

These racks are built for inference, not training. That means Qualcomm isn’t trying to build chips that help train models like OpenAI’s GPTs, which were trained on Nvidia GPUs.

Instead, the focus is on running those models faster and cheaper once they’re trained. That’s where most real-world workloads actually happen.
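
For readers unfamiliar with the split, here is a minimal sketch of what an inference-only workload looks like in code, using the open-source Hugging Face transformers library and the small GPT-2 model purely as illustrative stand-ins (neither is tied to Qualcomm’s hardware). The point is that the weights are already trained; the accelerator’s only job is to run them.

# Illustrative only: load an already-trained model and run it (inference).
# No training step appears anywhere; the weights stay fixed.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("AI inference means running a trained model:", max_new_tokens=30)
print(result[0]["generated_text"])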

And there’s money here… real money. McKinsey says the world will spend $6.7 trillion on data centers by 2030, and most of that will go to AI hardware. Nvidia controls more than 90% of that market today and is sitting on a market cap of over $4.5 trillion. But customers are getting restless.

OpenAI recently said it’s buying chips from AMD and might even buy a piece of the company. Google, Amazon, and Microsoft are all designing their own AI accelerators. Everyone wants an option that doesn’t involve waiting in line behind a dozen other AI labs just to get a GPU shipment from Nvidia.

Power draw, flexibility, and memory make Qualcomm stand out

Malladi said the racks draw around 160 kilowatts, which matches the power usage of Nvidia racks. But Qualcomm claims its systems are cheaper to run, especially for cloud service providers.

The company will also sell parts separately, giving clients the freedom to build custom racks. “What we have tried to do is make sure that our customers are in a position to either take all of it or say, ‘I’m going to mix and match,’” Malladi added.

Even Nvidia and AMD could end up buying parts of Qualcomm’s stack. That includes its central processing units (CPUs), which Malladi said will be available as standalone components. The full pricing for chips, cards, and racks hasn’t been disclosed. Qualcomm didn’t confirm how many NPUs can fit in a rack either.

Earlier this year, Qualcomm signed a deal with Saudi Arabia’s Humain, which plans to install Qualcomm inferencing chips across data centers using up to 200 megawatts of power. That deal made Humain one of the first major customers for the rack-scale systems.
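
As a rough back-of-the-envelope check (our arithmetic, not a figure from Qualcomm or Humain), the roughly 160-kilowatt per-rack figure cited above would put a 200-megawatt deployment on the order of 1,250 racks:

# Rough estimate only; assumes all 200 MW goes to racks drawing ~160 kW each.
deal_power_w = 200e6   # 200 megawatts in the Humain deal
rack_power_w = 160e3   # ~160 kilowatts per rack, per Malladi
print(deal_power_w / rack_power_w)  # -> 1250.0 racks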

The company also said its AI cards support 768 gigabytes of memory, more than what Nvidia or AMD currently offer. It also claimed advantages in power efficiency and total cost of ownership, though it didn’t provide exact figures.


Source: https://www.cryptopolitan.com/qualcomm-stock-surges-24-after-new-ai-chips/

