Nvidia: Is This the World’s Most Important Stock, or Just a Hot Chip?

Nvidia CEO Jensen Huang speaks at the COMPUTEX forum in Taipei, Taiwan, on June 4, 2024.

Ann Wang | Reuters

For **Nvidia** investors, the past two years have been a joyride. But recently they’ve been on more of a roller coaster. As the primary beneficiary of the artificial intelligence boom, Nvidia has seen its market cap expand by about ninefold since the end of 2022. But after reaching a record in June and briefly becoming the world’s most valuable public company, Nvidia went on to lose almost 30% of its value over the next seven weeks, shedding roughly $800 billion in market cap. Now it’s in the midst of a rally that has pushed the stock within about 7% of its all-time high.

With the chipmaker set to report quarterly results Wednesday, the stock’s volatility is top of mind for Wall Street. Any indication that AI demand is waning, or that a leading cloud customer is even modestly tightening its belt, could translate into significant revenue slippage.

“It’s the most important stock in the world right now,” EMJ Capital’s Eric Jackson told CNBC’s “Closing Bell” last week. “If they lay an egg, it would be a major problem for the whole market. I think they’re going to surprise to the upside.”

Nvidia’s report comes weeks after its megacap tech peers got through earnings. The company’s name was sprinkled throughout those analyst calls, as **Microsoft**, **Alphabet**, **Meta**, **Amazon** and **Tesla** all spend heavily on Nvidia’s graphics processing units (GPUs) to train AI models and run massive workloads.

Key Takeaways

  • Nvidia’s stock has been volatile in recent months, but it’s currently rallying and is approaching its all-time high.
  • The company’s earnings report this week is highly anticipated by Wall Street, as any sign of weakening AI demand or reduced spending by cloud customers could significantly impact revenue.
  • While analysts expect a strong quarter, growth is expected to slow in the coming quarters, making Nvidia’s forecast for the October quarter crucial.
  • Nvidia’s next-generation AI chips, dubbed **Blackwell**, are facing production issues, which could push back shipments into the first quarter of 2025.
  • Despite the potential delay, Nvidia’s current-generation **Hopper** chips remain in high demand, and the company’s leading customers are still investing heavily in AI infrastructure.

Blackwell timing

In Nvidia’s past three quarters, revenue has more than tripled on an annual basis, with the vast majority of growth coming from the data center business. Analysts expect a fourth straight quarter of triple-digit growth, but at a reduced pace of 112% to $28.7 billion, according to LSEG. From here, year-over-year comparisons get much tougher, and growth is expected to slow in each of the next six quarters.

Investors will be paying particularly close attention to Nvidia’s forecast for the October quarter. The company is expected to guide for growth of about 75% to $31.7 billion. Optimistic guidance would suggest that Nvidia’s deep-pocketed clients remain willing to open their wallets for the AI build-out, while a disappointing forecast could raise concern that infrastructure spending has gotten frothy. “Given the steep increase in hyperscale capex over the past 18 months and the strong near-term outlook, investors frequently question the sustainability of the current capex trajectory,” analysts at Goldman Sachs, who recommend buying the stock, wrote in a note last month.

Much of the optimism heading into the report — the stock is up 8% in August — is due to comments from top customers about how much they’re continuing to shell out for data centers and Nvidia-based infrastructure. Last month, the CEOs of Google and Meta enthusiastically endorsed the pace of their build-outs and said underinvesting was a greater risk than overspending. Former Google CEO Eric Schmidt recently told students at Stanford, in a video that was later removed, that he was hearing from top tech companies that “they need $20 billion, $50 billion, $100 billion” worth of processors.

But while Nvidia’s profit margin has been expanding of late, the company still faces questions about the long-term return on investment its clients will see from devices that cost tens of thousands of dollars each and are being ordered in bulk. During Nvidia’s last earnings call in May, CFO Colette Kress provided data points suggesting that cloud providers, which account for more than 40% of Nvidia’s revenue, would generate $5 in revenue over four years for every $1 spent on Nvidia chips. More such stats are likely on the way. Last month, Goldman analysts wrote, following a meeting with Kress, that the company would share further ROI metrics this quarter “to instill confidence in investors.”

The other major question facing Nvidia is the timeline for its next-generation AI chips, dubbed **Blackwell**. The Information reported earlier this month that the company is facing production issues, which will likely push big shipments back into the first quarter of 2025. Nvidia said at the time that production was on track to ramp in the second half of the year. The report came after Nvidia CEO Jensen Huang surprised investors and analysts in May by saying the company would see “a lot” of Blackwell revenue this fiscal year.

While Nvidia’s current generation of chips, called **Hopper**, remains the premium option for deploying AI applications like ChatGPT, competition is popping up from **Advanced Micro Devices**, Google and a smattering of startups, pressuring Nvidia to maintain its performance lead through a smooth upgrade cycle. Even with a potential Blackwell delay, that revenue could simply get pushed into a future quarter while boosting current Hopper sales, especially of the newer **H200** chip. The first Hopper chips went into full production in September 2022. “That shift in timing doesn’t matter very much, as supply and customer demand has rapidly pivoted to H200,” Morgan Stanley analysts wrote in a note this week.

Many of Nvidia’s leading customers say they need the additional processing power of Blackwell chips to train more advanced next-generation AI models. But they’ll take what they can get. “We expect Nvidia to deemphasize its Blackwell B100/B200 GPU allocation in favor of ramping up its Hopper H200s in” the second half of the year, HSBC analyst Frank Lee wrote in an August note. He has a buy rating on the stock.

Article Reference

Brian Adams
Brian Adams is a technology writer with a passion for exploring new innovations and trends. His articles cover a wide range of tech topics, making complex concepts accessible to a broad audience. Brian's engaging writing style and thorough research make his pieces a must-read for tech enthusiasts.