G42 Diversifies AI Chips, Taps AMD, Cerebras, Qualcomm for Global AI Hub.

Abu Dhabi's G42 diversifies beyond Nvidia, partnering with AMD, Cerebras, and Qualcomm to power its secure, strategic UAE-U.S. AI Campus.

September 1, 2025

In a significant strategic pivot, Abu Dhabi’s artificial intelligence powerhouse G42 is actively exploring partnerships with a trio of American chipmakers—Advanced Micro Devices (AMD), Cerebras Systems, and Qualcomm—to equip its ambitious AI infrastructure projects. This move signals a deliberate effort to diversify its hardware supply chain and lessen its reliance on Nvidia, the current dominant force in the AI accelerator market. The discussions, confirmed by sources familiar with the matter, are a clear indication that even as G42 continues to work with Nvidia on certain projects, it is building a more resilient and versatile technology stack for the future, driven by a confluence of geopolitical, commercial, and technical considerations. This initiative is unfolding as G42 spearheads the development of the massive UAE-U.S. AI Campus in Abu Dhabi, a project poised to become one of the largest AI-dedicated data center deployments globally.
The impetus behind G42's diversification strategy is multifaceted. Primarily, it addresses the growing industry-wide concern over dependency on a single supplier for critical AI components.[1] The global demand for high-performance GPUs has created supply-chain bottlenecks, and by engaging with multiple vendors, G42 aims to mitigate these risks and ensure a stable hardware pipeline for its large-scale ambitions.[1] This strategy also reflects a significant geopolitical realignment. G42 has been actively strengthening its ties with the United States, a move highlighted by a recent $1.5 billion investment from Microsoft and a commitment to divest from Chinese technology.[2][3] By collaborating with U.S. chipmakers like AMD, Cerebras, and Qualcomm, G42 is cementing its position as a key strategic partner for American technology in the Middle East, aligning with the objectives of the U.S.-UAE AI Acceleration Partnership.[2][4] This framework is designed to ensure the secure and responsible development of AI, with G42's infrastructure serving as a regional hub for U.S. hyperscalers.[4][5]
While the move to diversify is clear, G42's relationship with Nvidia is not one of outright replacement but of strategic augmentation. A significant portion of the new 5-gigawatt UAE-U.S. AI Campus, a 1-gigawatt cluster dubbed "Stargate UAE," will be built using Nvidia's next-generation Grace Blackwell GB300 systems.[6] This flagship project, a collaboration with OpenAI and Oracle, underscores Nvidia's continued importance for cutting-edge, large-scale deployments.[6] However, for the remaining four gigawatts of the campus and other future initiatives, G42 is evaluating a range of specialized hardware.[7][8] This approach allows G42 to select the optimal processor for specific workloads, balancing performance, cost, and power efficiency across its vast operations. The company is currently in deep negotiations with major American tech firms, including Google, Microsoft, Amazon's AWS, Meta, and xAI, to become anchor tenants for the new campus, making the choice of underlying hardware a critical decision.[7][8]
Each of the potential new partners offers a unique value proposition that complements G42's needs. The relationship with Cerebras Systems is already deep and well-established. G42 is a major customer, partner, and investor in the company, and the two have collaborated on building the Condor Galaxy network, a series of interconnected AI supercomputers.[9][10] The latest iteration, Condor Galaxy 3, leverages Cerebras's CS-3 system, powered by the innovative Wafer-Scale Engine 3 (WSE-3).[10][11] This technology, which integrates an entire silicon wafer into a single chip, is purpose-built for training the largest AI models, offering massive on-chip memory and extreme bandwidth that can overcome the bottlenecks associated with traditional GPU clusters.[12][13][14] For G42, Cerebras provides a powerful, specialized solution for pushing the boundaries of foundational model training.[15][9]
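To make the cluster-bottleneck argument concrete, the sketch below models a GPU cluster whose realized training bandwidth is capped by the inter-device fabric rather than by the HBM on each card. The figures and the single min() are illustrative assumptions chosen for clarity, not vendor specifications; the point is simply that wafer-scale designs keep this traffic on-chip, which is the bottleneck the paragraph above describes.

```python
# Back-of-the-envelope model (assumed figures, not vendor specs): when
# gradients and activations must cross the inter-GPU fabric, the fabric,
# not the HBM stacks, sets the pace of a training step.

def fabric_limited_bandwidth_tbps(num_gpus: int,
                                  hbm_tbps_per_gpu: float,
                                  fabric_tbps_per_gpu: float) -> float:
    """Aggregate bandwidth a cluster can sustain, taking the lower of
    local HBM throughput and inter-device fabric throughput."""
    local_total = num_gpus * hbm_tbps_per_gpu
    fabric_total = num_gpus * fabric_tbps_per_gpu
    return min(local_total, fabric_total)

# Hypothetical numbers: 3 TB/s of HBM and 0.9 TB/s of fabric per GPU.
for n in (8, 64, 512):
    bw = fabric_limited_bandwidth_tbps(n, hbm_tbps_per_gpu=3.0,
                                       fabric_tbps_per_gpu=0.9)
    print(f"{n:4d} GPUs -> ~{bw:,.0f} TB/s sustained (fabric-limited)")
```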
AMD and Qualcomm, meanwhile, present compelling alternatives for different segments of the AI workload spectrum. AMD has emerged as Nvidia's most direct competitor with its Instinct MI300 series of accelerators.[16] The MI300X GPU, in particular, is designed for large-scale AI training and inference, boasting a market-leading 192GB of HBM3 memory, which allows it to run massive language models more efficiently than competing hardware.[17][18] An eight-GPU AMD Instinct Platform can offer significantly higher throughput on certain large language models than an equivalent eight-GPU Nvidia H100 system, representing a powerful option for G42's AI cloud services.[17][19] Qualcomm's offering, the Cloud AI 100 Ultra, is focused on the critical area of AI inference, the process of running a trained model to serve predictions.[20] This accelerator is engineered for high performance at very low power consumption, delivering leading performance-per-watt that reduces total cost of ownership (TCO).[20][21][22] The ability of a single 150-watt Cloud AI 100 Ultra card to handle models with over 100 billion parameters makes it an attractive and sustainable option for deploying AI services at scale.[21][23]
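The memory claims above come down to simple arithmetic over parameter counts and numeric precision. The minimal sketch below covers weight storage only (ignoring KV cache and runtime overhead), and the model sizes and bit-widths are round-number assumptions rather than figures from AMD or Qualcomm.

```python
# Rough weight-memory arithmetic (sketch; real deployments also need room
# for the KV cache, activations, and runtime overhead).

def weight_footprint_gb(params_billion: float, bits_per_param: int) -> float:
    """Approximate memory needed just to hold the model weights."""
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return bytes_total / 1e9  # decimal GB, close enough for estimation

# Why 192GB of HBM3 matters: a 70B-parameter model in FP16 needs ~140 GB
# for weights alone, so it can sit on one large-memory accelerator instead
# of being sharded across several smaller-memory GPUs.
print(weight_footprint_gb(70, 16))   # ~140 GB

# Why low-power inference cards lean on quantization: a 100B-parameter
# model drops from ~200 GB at FP16 to ~50 GB at 4-bit precision.
print(weight_footprint_gb(100, 16))  # ~200 GB
print(weight_footprint_gb(100, 4))   # ~50 GB
```

The same arithmetic explains both claims: at FP16 a 70-billion-parameter model already needs roughly 140GB for weights alone, which is why a 192GB accelerator can host it on a single device, while serving 100-billion-parameter-class models on a low-power card typically implies aggressive 8-bit or 4-bit quantization.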
In conclusion, G42's exploration of hardware from AMD, Cerebras, and Qualcomm represents a sophisticated and forward-looking strategy that is becoming a new industry norm. It is a calculated response to the realities of the current AI landscape: concentrated supply chains, intense geopolitical dynamics, and the increasingly diverse computational demands of modern artificial intelligence. While Nvidia remains a central partner, G42's multi-vendor approach will provide it with greater operational flexibility, supply chain security, and the ability to optimize its infrastructure for a range of specialized AI tasks, from training colossal models on Cerebras's wafer-scale engines to running efficient, large-scale inference with Qualcomm's power-sipping chips. This strategic diversification not only strengthens G42's own position as a global AI leader but also signals a broader maturation of the AI market, where choice and specialization are beginning to challenge the dominance of a single architecture.[24][25]
