Here are three bullish aspects of Nvidia’s quarter that the sellers are missing


Investors are selling Nvidia shares Thursday, sending the stock down more than 3% after the artificial intelligence giant's better-than-expected quarterly results the prior evening. What, exactly, is causing the selling? It could be the guidance for the current fiscal 2026 first quarter, which, while strong, wasn't as strong as investors have grown accustomed to in recent years. It could be that the company expects some margin pressure in the short run as its next-generation Blackwell chip platform launches. It's also an angsty moment on Wall Street overall, as investors grapple with President Donald Trump's tariffs and wider tariff threats and question the health of the U.S. economy. The trade policy uncertainty could be hurting Nvidia, too, because it does sell AI chips to China, albeit lower-powered ones due to U.S. restrictions on semiconductor exports. Indeed, shares of Nvidia were initially higher Thursday before Trump clarified that more tariffs are coming next month.

In any case, we liked the quarter and continue to believe Nvidia is an "own it, don't trade it" stock. "It was a monumental quarter," Jim Cramer said during Thursday's Morning Meeting. Here's a closer look at three bullish aspects of Nvidia's quarter and conference call beyond the headlines — things that make us even more confident in Nvidia's future.

Blackwell pain for long-term gain

The Blackwell chip platform encountered various manufacturing and installation challenges in its rollout, primarily with the full server rack version known as the GB200 NVL72. That product is much more than a single chip someone could hold in their hands — each GB200 rack has 1.5 million components, according to Nvidia CEO Jensen Huang. Despite all the hand-wringing on Wall Street about Blackwell's challenges, the struggles may not have been for naught. On Wednesday night's earnings call, Huang acknowledged the "hiccup," saying it "probably cost us a couple of months." Still, Nvidia booked a greater-than-expected $11 billion in Blackwell revenue in the reported November-to-January fiscal 2025 fourth quarter, and supply is increasing to meet the strong customer demand. "The team did an amazing job recovering," he said.

The lessons learned in that recovery could help Nvidia improve execution on its annual roadmap for new data-center AI chips — an important competitive advantage over rivals, if the company can live up to such an aggressive timeline. The Blackwell lineup succeeded the Hopper family, which was first released in 2022. Nvidia is preparing to launch the Blackwell Ultra, sometimes called the GB300, later this year.

"Between Blackwell and Blackwell Ultra, the system architecture is exactly the same," Huang said on the call. "It's a lot harder going from Hopper to Blackwell because we went from an NVLink 8 system to an NVLink 72-based system. So the chassis, the architecture of the system, the hardware, the power delivery, all of that had to change. This was quite a challenging transition. But … Blackwell Ultra will slot right in."

It doesn't stop there: Nvidia is already getting its partners across the supply chain up to speed on Blackwell's successor, which is going to be called Vera Rubin. In a note to clients Thursday, analysts at Morgan Stanley said they believe Nvidia will improve its execution for Rubin. "To the extent that the Blackwell ramp might have been slightly too ambitious, they will course correct with [Blackwell Ultra] and Rubin, which we think will have more of a focus on manufacturability — but even if GB200 was not as manufacturable initially, it is ramping now, which will pressure the competition," the analysts wrote. In other words, the short-term Blackwell pain could help Nvidia maintain its technology leadership over the long term.

'Vast majority' in inference

Nvidia's status as the dominant maker of chips to train AI models has been well-established ever since the launch of ChatGPT in late 2022 sparked the generative AI boom. ChatGPT, from Microsoft-backed OpenAI, used Nvidia chips to create its model. The other part of AI computing is called inference. Training is basically the process of feeding an AI model a lot of data and getting it ready for prime time. Inference is prime time; it is the model being put into action and used by people on a day-to-day basis.

We continue to get new information showing that Nvidia's foothold in inference is quite substantial. That helps cut against a long-standing Nvidia bear case that went something like this: as inference becomes a bigger piece of the AI computing pie, Nvidia will cede a lot of ground to competitors such as Advanced Micro Devices and large technology companies that make their own chips, such as Club name Amazon. AMD and the threat of custom chips have not been vanquished, but there's little doubt that Nvidia is a serious threat in…


