Source: Qualcomm. Its new rack-based solutions will deliver optimised AI inferencing.
The Qualcomm AI200 is a purpose-built rack-level AI inference solution designed to deliver low total cost of ownership (TCO) and optimised performance for large language model (LLM) and large multimodal model (LMM) inference and other AI workloads. It supports 768 GB of LPDDR memory per card, giving higher memory capacity at lower cost.
The Qualcomm AI250 solution will debut with an innovative memory architecture based on near-memory computing, delivering greater than 10x higher effective memory bandwidth and lower power consumption for AI inferencing workloads.
Both rack solutions feature direct liquid cooling for thermal efficiency, PCIe for scale-up, Ethernet for scale-out, confidential computing for secure AI workloads, and rack-level power consumption of 160 kW.
“With Qualcomm AI200 and AI250, we’re redefining what’s possible for rack-scale AI inference. These innovative new AI infrastructure solutions empower customers to deploy generative AI at unprecedented TCO, while maintaining the flexibility and security modern data centres demand,” said Durga Malladi, SVP & GM, Technology Planning, Edge Solutions & Data Center, Qualcomm Technologies.
“Our rich software stack and open ecosystem support make it easier than ever for developers and enterprises to integrate, manage, and scale already trained AI models on our optimised AI inference solutions. With seamless compatibility for leading AI frameworks and one-click model deployment, Qualcomm AI200 and AI250 are designed for frictionless adoption and rapid innovation.”
Qualcomm's hyperscaler-grade AI software stack supports major machine learning (ML) frameworks, inference engines, generative AI frameworks, and LLM/LMM inference optimisation techniques. Developers benefit from seamless model onboarding and one-click deployment of Hugging Face models via Qualcomm Technologies’ Efficient Transformers Library and the Qualcomm AI Inference Suite. Qualcomm also provides ready-to-use AI applications and agents, tools, libraries, application programming interfaces (APIs), and services for operationalising AI.
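For developers, the starting point Qualcomm describes is an already trained, off-the-shelf Hugging Face checkpoint. The sketch below shows only that generic first step, using the open-source `transformers` library with a placeholder model name; the Qualcomm-specific compilation and deployment steps through the Efficient Transformers Library and the Qualcomm AI Inference Suite are vendor tooling and are not reproduced here.

```python
# Generic starting point for model onboarding: load a stock Hugging Face
# checkpoint and run text generation. Model name and generation settings
# are placeholders; Qualcomm's compile/deploy tooling is not shown.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"  # placeholder checkpoint for illustration
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("AI inference at rack scale means", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```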
Qualcomm Technologies is committed to a data centre roadmap with an annual cadence moving forward, focused on industry-leading AI inference performance, energy efficiency, and TCO.
The company also announced a collaboration between HUMAIN and Qualcomm to deploy advanced AI infrastructure in the Kingdom of Saudi Arabia (KSA). The initiative will offer global AI inferencing services and, Qualcomm said, will be the world's first fully optimised edge-to-cloud hybrid AI deployment. The programme is intended to position KSA as a global AI hub and builds on an announcement the two companies made in May 2025.
Under the programme, HUMAIN is targeting 200 megawatts of Qualcomm AI200 and AI250 rack solutions, starting in 2026, to deliver high-performance AI inference services in KSA and globally. The initiative advances the KSA technology ecosystem by combining HUMAIN's regional infrastructure and full AI stack expertise with Qualcomm Technologies' global leadership in AI and semiconductor innovation, creating a blueprint for how nations can build complete AI capabilities, from data centre operations to commercial AI services.
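For a rough sense of scale, and purely as a back-of-envelope exercise, dividing the announced 200 MW target by the stated 160 kW rack-level power figure suggests on the order of 1,250 racks; the sketch below simply restates that arithmetic and ignores cooling, networking and facility overhead.

```python
# Back-of-envelope only: announced 200 MW target divided by the stated
# 160 kW per-rack power figure. Ignores facility overhead and assumes the
# entire power budget goes to AI200/AI250 racks.
target_power_kw = 200 * 1000   # 200 megawatts, announced from 2026
rack_power_kw = 160            # stated rack-level power consumption
print(round(target_power_kw / rack_power_kw), "racks (approx.)")  # ~1250
```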
"With Qualcomm's world-class AI infrastructure solutions, we're shaping the foundation of the Kingdom's AI future," said Tareq Amin, CEO, HUMAIN.
"This collaboration unites HUMAIN’s deep regional insight and unique full AI stack capabilities with Qualcomm's unmatched semiconductors technology and product leadership. Together, we will enable Saudi Arabia to lead the next wave of global AI and semiconductors innovation."
“By establishing advanced AI data centres powered by Qualcomm's industry-leading inference solutions, we are helping the Kingdom create a technology ecosystem that will accelerate its AI ambitions of becoming a hub of intelligent computing,” said Cristiano Amon, President and CEO at Qualcomm.
“Together with HUMAIN, we are laying the groundwork for transformative AI-driven innovation that will empower enterprises, government organisations and communities in the region and globally.”
The initiative will see the integration of HUMAIN's Saudi-developed ALLaM AI models with Qualcomm's proven AI platforms. Qualcomm and HUMAIN will also develop customer-specific solutions to address the needs of enterprises and government organisations across the Kingdom and beyond.
Details
Qualcomm AI200 and AI250 are expected to be commercially available in 2026 and 2027 respectively.
