At the start of last week, OpenAI’s technology chief personally thanked Nvidia CEO Jensen Huang for “bringing us the most advanced” chips needed to power the demo at a presentation of the company’s latest artificial intelligence models.
A day later, at Google’s annual developer conference, Alphabet CEO Sundar Pichai highlighted his company’s “longstanding partnership with Nvidia,” and noted that Google Cloud will be using the chipmaker’s Blackwell graphics processing units (GPUs) in early 2025.
And this week, Microsoft, which provides servers to OpenAI, will announce new AI advancements and features that were developed on the company’s massive clusters of Nvidia GPUs. The company is hosting its Build conference in Redmond, Washington.
Heading into its quarterly earnings report on Wednesday, Nvidia finds itself at the center of the action in technology, a position that’s become increasingly commonplace for the 31-year-old company, whose market cap has ballooned past $2 trillion this year.
Nvidia is expected to report year-over-year revenue growth in excess of 200% for a third straight quarter, with analysts projecting a fiscal first-quarter jump of 243% to $24.6 billion, according to LSEG. More than $21 billion of that is expected to come from Nvidia’s data center business, which includes all the advanced processors the company is selling to Google, Microsoft, Meta, Amazon, OpenAI and others.
Nvidia is squeezing so much profit out of its suite of AI products that net income is expected to be up more than fivefold from a year earlier to $13.9 billion.
The stock has soared 91% this year after more than tripling in 2023.
Dan Niles, founder of Niles Investment Management, compared Nvidia’s position in the AI boom to the “internet buildout” of the 1990s and Cisco’s role at the center in those days. Over a three-year stretch, Niles said, Cisco had several dramatic pullbacks, but ultimately increased 4,000% up to its peak in 2000. Nvidia will go through similar cycles, he said.
“We’re still really early in the AI build,” Niles told CNBC’s “Money Movers” on Monday. “I think the revenue will go up three to four times from current levels over the next three to four years, and I think the stock goes with it.”
Google, Amazon, Microsoft, Meta, and Apple are expected to shell out a combined $200 billion in capital expenditures this year, according to an estimate from Bernstein, with a huge portion of the spending going to AI-specific infrastructure like Nvidia chips.
Elsewhere, OpenAI is relying on Nvidia’s technology for its latest chatbot, GPT-4o. Meta announced plans in March to buy and build out computers that will include 350,000 Nvidia GPUs, costing billions of dollars, and CEO Mark Zuckerberg even swapped jackets with Huang and posed for a picture with the Nvidia CEO.
“If you look at today for the AI build out, who’s really driving that?” Niles said. “It’s the most profitable companies on the planet — it’s Microsoft, it’s Google, it’s Meta, and they’re driving this.”
Prior to the recent AI boom, Nvidia was known as the primary maker of chips used for 3D gaming. About a year ago, the chipmaker gave investors their first clue that the company would see a period of historic growth, signaling to Wall Street that it would generate about 50% more in sales than what analysts expected in the July 2023 quarter.
Growth rates have since accelerated. But starting in the second quarter, expansion is expected to slow, with analysts anticipating significant deceleration in each of the next three periods.
“We just don’t know how long this investment cycle lasts and just how much excess capacity will be created over that time in case this AI thing doesn’t materialize as quickly as expected,” Bernstein analysts wrote in a note earlier this month.
That’s not to say that Nvidia is at risk of losing a ton of the AI chip business to rivals. Piper Sandler analysts expect it to keep at least 75% of the AI accelerator market, even as companies like Google build their own custom chips.
“We view the percentage of hyperscaler spend that is dedicated towards compute further rising in 2024 and 2025,” Piper Sandler analyst Harsh Kumar wrote in a note.
One question the company faces is how well the transition is going to its next generation of AI chips, called Blackwell, which are expected to ship later this year. Some worry there could be a lull as clients hold off on buying the older Hopper GPUs like the H100 in favor of Blackwell-based chips such as the GB200.
“To some degree, the setup has shifted,” wrote Morgan Stanley analyst Joseph Moore in a note on Monday. “Six months ago, short term expectations were very strong but there was anxiety about durability. Now, fresh on the back of hyperscalers talking up longer term spending expectations for AI, those longer term views are more positive, but there is anxiety about a pause in front of Blackwell.”