
Microsoft and OpenAI Will Spend $100 Billion to Wean Themselves Off Nvidia GPUs

The companies are working on an audacious data center for AI that's expected to be operational in 2028.
By Josh Norem
Microsoft data center. Credit: Microsoft

Microsoft was the first big company to throw a few billion at ChatGPT-maker OpenAI. Now, a new report states the two companies are working together on a very ambitious AI project that will cost at least $100 billion. Both companies are currently huge Nvidia customers; Microsoft uses Nvidia hardware for its Azure cloud infrastructure, while OpenAI uses Nvidia GPUs for ChatGPT. However, the new data center will host an AI supercomputer codenamed "Stargate," which might not include any Nvidia hardware at all.

The news of the companies' plans to ditch Nvidia hardware comes from a variety of sources, as noted by Windows Central. The report details a five-phase plan developed by Microsoft and OpenAI to advance the two companies' AI ambitions through the end of the decade, with the fifth phase being the so-called Stargate AI supercomputer. This computer is expected to be operational by 2028 and will reportedly be outfitted with future versions of Microsoft's custom-made Arm Cobalt processors and Maia XPUs, all connected by Ethernet.

Microsoft is reportedly planning on using its custom-built Cobalt and Maia silicon to power its future AI ambitions. Credit: Microsoft

This future data center, which will house Stargate, will allow both companies to pursue their AI ambitions far into the future; reports say it will cost around $115 billion. That level of investment shows neither company plans to take its foot off the AI gas pedal any time soon, and that both expect this market to keep expanding for years to come. TechRadar also notes that the amount required to get this supercomputer running is more than triple what Microsoft spent on capital expenditures last year, so the company is effectively tripling down on AI.

What's also notable is that at least one source says the data center itself will be the computer, rather than merely housing it. Multiple data centers may link together, like Voltron, to form the supercomputer. This futuristic machine will reportedly push the boundaries of AI capabilities, and given how fast the field is advancing, it's hard to predict what that will even mean four years from now.

A scenario in which massive customers abandon Nvidia for custom-made AI accelerators will likely become a significant issue for the company soon. Long wait times and exorbitant pricing for Nvidia GPUs have reportedly pushed many companies to look elsewhere for their AI hardware, which is why Nvidia is already looking to capture the custom silicon market itself. OpenAI CEO Sam Altman is reportedly looking to build a global infrastructure of fabs and power sources to make custom silicon, so OpenAI's plans with Microsoft may be aligned on this front.
