Qualcomm Inc., best known as the world’s leading supplier of smartphone processors, is making a bold move into the high-stakes artificial intelligence data center market. With a fresh lineup of chips and servers, the company aims to take on Nvidia Corp., which currently dominates the fastest-growing segment of the semiconductor industry.
The centerpiece of Qualcomm’s new offering is the AI200, a chip set to launch next year. It will be available in multiple formats: as a standalone processor, as add-in cards for existing systems, or integrated into complete server racks built by Qualcomm itself.
The company’s first major client for this technology will be Humain, a Saudi Arabian AI startup that plans to deploy 200 megawatts of computing power using Qualcomm’s chips starting in 2026.
Qualcomm’s roadmap doesn’t stop there. The company announced that a successor, the AI250, will arrive in 2027. Depending on customer needs, the chips can be installed as accelerators in systems running processors from Nvidia or other competitors, or deployed as complete server solutions that directly challenge those rivals.
The market responded enthusiastically to the news, sending Qualcomm’s stock soaring as much as 15% in New York trading, its biggest intraday gain in more than half a year.
This strategic push represents Qualcomm’s entry into a sector that has fundamentally transformed the semiconductor landscape. As artificial intelligence drives an unprecedented wave of investment, hundreds of billions of dollars are being funneled into data centers to support next-generation AI tools and services. Qualcomm is betting that its strengths, especially in energy-efficient chip design and advanced memory features, will help it stand out, even as a late entrant in the race.
The new chips are built around a neural processing unit (NPU), a technology that first gained prominence in smartphones for handling AI tasks efficiently without draining battery life. Qualcomm has steadily evolved this capability, adapting it for laptops and now scaling it up for use in high-performance computing environments.
Under the leadership of CEO Cristiano Amon, Qualcomm has been steadily working to reduce its dependence on the smartphone market, where growth has slowed significantly in recent years.
The company has already made progress expanding into automotive and PC chips, and its new data center products mark its first major foray into the booming AI infrastructure segment, now the largest market for processors globally.
According to Durga Malladi, Qualcomm’s senior vice president, the company has taken a deliberate and patient approach to developing its AI server business. “We’ve been quiet in this space, building our capabilities methodically,” he explained. Malladi added that Qualcomm is currently in discussions with all major hyperscale customers about potential deployments of its new hardware.
Securing deals with top-tier cloud providers such as Microsoft, Amazon, or Meta Platforms would represent a game-changing opportunity for Qualcomm. These partnerships could significantly diversify its revenue streams and strengthen its presence in the AI ecosystem.
Although Qualcomm has posted solid profit growth over the past two years, investors have tended to favor other chipmakers. The company’s shares are up around 10% in 2025, trailing the Philadelphia Semiconductor Index, which has surged 40% this year.
In contrast, Nvidia continues to dominate the AI hardware space. Analysts estimate its data center division alone will generate more than $180 billion in revenue this year, a figure that surpasses the total annual sales of most of its competitors, including Qualcomm.
Still, Qualcomm believes it can carve out a niche by emphasizing efficiency and innovation. The company claims its new processors will feature unprecedented memory capacity, offering up to 768 gigabytes of low-power double data rate memory (LPDDR). Memory performance, in both speed and capacity, is a critical factor in AI computation, determining how quickly chips can analyze and interpret massive datasets.
Qualcomm’s AI200 and AI250 systems will primarily target inference workloads, which involve running AI models after they’ve already been trained. This is the stage where AI applications deliver real-time results such as generating responses, identifying patterns, or powering large-scale services.
By leveraging its mobile design expertise and expanding into data centers, Qualcomm is positioning itself as a formidable competitor in the next wave of AI-driven computing. While it faces steep competition from Nvidia and other established players, its entry could signal a new phase of innovation and efficiency for the global chip industry.
If Qualcomm succeeds in winning over major cloud clients and proving its chips’ performance advantages, it could unlock a massive new growth engine, one that reshapes its role from smartphone supplier to a serious force in the world of AI infrastructure.
