OpenAI’s $11.9 Billion Power Move with CoreWeave Signals a Break from Microsoft

With GPU demand at an all-time high and an IPO on the horizon, CoreWeave is at the center of the AI arms race.

CoreWeave just pulled off a power move that changes the AI infrastructure game, locking in a five-year, $11.9 billion deal with OpenAI. This is not just another cloud computing contract or a reverse acquihire; it puts OpenAI in direct control of its own computing future.

At the heart of this blockbuster agreement is OpenAI’s acquisition of $350 million in CoreWeave equity. OpenAI isn’t just renting GPUs; it now owns a stake in the very company powering its AI ambitions. If there is one organization that should be worried, it’s Microsoft, OpenAI’s biggest investor and (formerly) sole cloud provider.

The OpenAI-Microsoft relationship has been one of convenience, necessity, and increasing tension. Microsoft poured billions into OpenAI, gaining access to its models for Azure customers. But as OpenAI’s dominance in AI grew, so did its appetite for independence. This CoreWeave deal is the loudest signal yet that OpenAI doesn’t want to be tethered to just one cloud provider, especially one that is now competing with its own AI models.

Microsoft has been leveraging CoreWeave’s cloud services to bolster its own AI ambitions, accounting for 62% of CoreWeave’s $1.9 billion revenue in 2024. And now? OpenAI is stepping in as a direct customer, reshaping the battlefield. This isn’t just about securing more GPUs—it’s a strategic play to ensure OpenAI’s compute resources aren’t at Microsoft’s mercy. CEO Sam Altman might as well have said, “If Microsoft can use CoreWeave, why can’t we?”

Just seven years ago, CoreWeave was a crypto mining operation started by three former hedge fund traders. Fast forward to today, and it has emerged as one of the most critical players in AI cloud infrastructure. How? By making a high-stakes bet on GPUs. While traditional cloud giants like AWS and Azure built general-purpose infrastructure, CoreWeave focused on AI-specific workloads, offering highly optimized, GPU-heavy cloud services that enterprises and research labs desperately needed.

CoreWeave’s numbers are staggering. Revenue jumped from $228.9 million in 2023 to $1.9 billion in 2024, a more than eightfold surge. The company now operates 32 data centers running more than 250,000 Nvidia GPUs, and that number is still climbing. In fact, CoreWeave’s latest acquisition spree includes Nvidia’s Blackwell GPUs, tailored for AI reasoning, a nod to the fact that generative AI isn’t just about training models anymore, but deploying them at scale.

This OpenAI deal couldn’t come at a better time for CoreWeave. The company recently filed for an IPO, seeking to raise over $4 billion. But before this deal, CoreWeave had a glaring issue: a dangerous overreliance on Microsoft. With 62% of its revenue coming from a single customer, investors had reason to be nervous. What happens if Microsoft decides to build its own AI cloud solution? What if it shifts more workloads to Azure? The OpenAI contract helps mitigate that risk, offering a new multi-billion-dollar revenue stream and a stamp of approval from one of the most influential AI companies in the world.

That said, CoreWeave isn’t in the clear just yet. The company has $7.9 billion in debt, much of it from private financings that fueled its explosive growth. Its co-founders, who have already cashed out $488 million worth of shares, need this IPO to go smoothly. The stakes couldn’t be higher.

Morgan Stanley, J.P. Morgan, and Goldman Sachs, among others, are betting that CoreWeave’s role as the backbone of AI infrastructure is enough to drive strong demand. But CoreWeave isn’t the only game in town. Competitors like Lambda (another Nvidia-backed AI cloud provider) and traditional hyperscalers like AWS and Google Cloud are ramping up their AI cloud offerings.

There’s one player in this saga that’s quietly winning no matter what: Nvidia. CoreWeave’s entire business is built around Nvidia GPUs. Nvidia, which holds a 6% stake in CoreWeave, has essentially found a way to monetize AI compute at every level: by selling the hardware, backing the cloud providers, and powering the AI models themselves.

CoreWeave’s rise is a testament to Nvidia’s strategy of seeding alternative cloud providers to counterbalance the dominance of AWS, Azure, and Google Cloud. Nvidia’s investments ensure that there’s always a strong market for its GPUs, no matter which AI company is on top.

OpenAI’s push for independence, Microsoft’s counteroffensive with its own AI models, CoreWeave’s bid for Wall Street credibility, and Nvidia’s strategic positioning: all of it points to a landscape where control over compute is just as important as the AI models themselves.

With this agreement, OpenAI is making it clear that it won’t let Microsoft or any single player dictate its AI future. CoreWeave, once an upstart GPU provider, is now at the center of the AI infrastructure boom. And Nvidia? It’s the puppet master ensuring its GPUs remain the foundation of this AI arms race.

Is OpenAI hedging against Microsoft’s control by diversifying its infrastructure? Or is this just another example of AI power players operating in a gray zone, where access to GPUs dictates who gets to dominate the market? And more importantly, what does this say about Microsoft’s own GPU supply chain if OpenAI had to go outside its primary cloud partner?

CoreWeave’s shift from a crypto mining venture to an AI infrastructure provider has turned its three co-founders, Michael Intrator, Brannin McBee, and Brian Venturo, into billionaires, with their company now supplying computing power to Microsoft, Meta, Nvidia, and billion-dollar AI labs like Cohere and Mistral. With GPU demand at an all-time high and an IPO on the horizon, CoreWeave is at the center of the AI arms race. “What we did is we said, ‘Hey, there’s a new and emergent way that compute is going to be used in the future,’” Intrator told Forbes in 2024. “What are they going to need to be as successful as possible?” So far, that bet has paid off.
