Nvidia Scores a Huge Win — Microsoft Will Be First Public Cloud to Adopt Its AI Stack

Nvidia (NVDA 3.00%) just reported third-quarter earnings, and it’s still working through too much inventory and a tough market in China. Results were about what could be expected right now.

The biggest news, though, was a partnership with Microsoft's (MSFT 1.04%) cloud computing segment, Azure, to build a new supercomputer. Nvidia says Microsoft is the first public cloud business to adopt its full, advanced artificial intelligence (AI) stack built from Nvidia hardware and software.

It isn’t the first deal like this announced lately. But it underscores how far Nvidia has come — and how much it still has to gain in the coming years — as its data center segment grows. Here’s what investors need to know.

Nvidia GPUs head for the private and public cloud

If Nvidia's new supercomputer partnership with Microsoft Azure sounds familiar, here's why. Just a few weeks prior, Nvidia said it was partnering with tech stalwart Oracle. Oracle has a public cloud infrastructure and platform service, and it's trying to play catch-up with some of the other big names like Azure, Amazon's AWS, and Alphabet's Google Cloud. Besides adding "tens of thousands" of Nvidia graphics processing units (GPUs) to its service, Oracle will also make Nvidia AI software available to its users.

The deal with Microsoft is slightly different, and not just in size and scope. Microsoft's supercomputer will feature tens of thousands of Nvidia GPUs, but Microsoft will also be the first public cloud provider to fully adopt Nvidia's AI software development tools and services. This "full stack" (which references the hardware, the software that governs how it functions, and actual applications) will be used by Microsoft for its own internal development and also be made available to subscribers to the tech giant's cloud service.

Interestingly, Nvidia’s VP of enterprise computing — the division that’s helping spearhead this project with Microsoft — is Manuvir Das. Das was a general manager at Microsoft in the mid-to-late 2000s, helping to develop the framework for Microsoft Azure.

Microsoft Azure and its other cloud computing segments are massive. The company's intelligent cloud division generated $20.3 billion in revenue last quarter, a 20% year-over-year increase. Nvidia's data center segment, by comparison, hauled in just $3.8 billion, good for nearly 31% year-over-year growth.

How high can Nvidia data centers fly?

As CEO Jensen Huang discussed in the latest earnings call, lots of companies — from large, well-established ones to a slew of new start-ups — want quick and easy-to-use access to Nvidia’s full-stack computing offerings. Thus, obtaining distribution of its hardware and AI software via public clouds like Microsoft and Oracle is a big deal for Nvidia. The quicker it can do so, the better. It’s still early on in generating a recurring revenue stream from cloud subscription access to its AI suite, and distribution via partners like Microsoft and Oracle will help it reach more customers at a faster rate.

But how far could Nvidia take things? After all, it has competition in this space. AMD also has data center computing accelerator chips. Even Intel, though woefully behind at the moment, is trying to make some headway designing GPUs for data center AI applications.

However, Nvidia's strategy of building hardware and software is unique. It's steadily worming its way into the enterprise software market here, creating a diverse set of tools that ranges from building blocks developers can use to craft their own solutions all the way to full-blown applications that are ready to use. Cloud-based enterprise software is worth hundreds of billions of dollars in annual spending worldwide, so suffice it to say there's some room to wiggle in. Even a small slice of this pie would be huge for Nvidia.

Given the incredible number of use cases out there for its software suite, Huang said Nvidia will start disaggregating its data center segment a bit going forward. Rather than just a lump-sum "data center" figure — now far and away the company's largest moneymaker — Nvidia will begin reporting on different industries within the whole. In other words, giant customers like Microsoft aren't Nvidia's only source of growth anymore. Look forward to some extra commentary on specific industry customers and financials spanning automotive, energy, healthcare, and more.

Nvidia is dealing with some tough times right now as its consumer-facing businesses undergo a big adjustment. However, given the breadth and depth of its expertise in the burgeoning AI space, Nvidia remains my top buy-and-hold stock that I expect to stick with for the duration of the 2020s.

John Mackey, CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool’s board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. Nicholas Rossolillo and his clients have positions in Advanced Micro Devices, Alphabet (C shares), Amazon, and Nvidia. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet (A shares), Alphabet (C shares), Amazon, Intel, Microsoft, and Nvidia. The Motley Fool recommends the following options: long January 2023 $57.50 calls on Intel, long January 2025 $45 calls on Intel, and short January 2025 $45 puts on Intel. The Motley Fool has a disclosure policy.