Bull Of The Day: Nvidia
Is NVIDIA (NVDA) a buy in the $120s or has the DeepSeek invasion delivered a serious re-rating to its growth and valuation?
The biggest lever in answering this question is whether the world suddenly needs fewer (and less expensive) GPUs to power its LLMs (large language models) and HPC (high-performance computing) workloads.
As I write this on Wednesday afternoon, we await earnings calls from two of NVIDIA's biggest customers, Microsoft (MSFT) and Meta Platforms (META), that may indicate if their capex plans are being downsized after the suddenly cheaper, better, faster chatterbot from China.
The Data Will Now Speak
Even though I remain a long-term NVDA bull and added shares near $120 on Monday when it fell 17%, the stage is set for a potential near-term peak in its growth trajectory.
Coming into the week, the stock was still a Zacks #2 Rank (Buy) because the consensus EPS estimate for the coming fiscal year (begins February) had moved up 11.7% to $4.21 since the company's last earnings report in November.
The full-year top-line consensus is for $192 billion. I was projecting that number would be over $200B by now. So $192B and $4.21, representing 49% and 43% growth respectively, will be the numbers to watch today. If they get hit, so too might the stock.
So far, I haven't heard of a single downgrade or negative estimate revision. But, again, the analysts have been waiting for the Microsoft and Meta conference calls (last night) to deliver their model adjustments this morning.
I'll tell you this: the analysts are going to be champing at the bit to get answers. This won't be your average quarterly call, and I wouldn't want to be Microsoft or Meta brass having to explain and justify AI capex budgets of $80 billion and $65 billion, respectively.
Tesla Too
And as if that weren't enough excitement, Tesla (TSLA) also reported its quarter on Wednesday. Tesla and Musk are very big players in the GPU-AI market, both for self-driving cars and for Musk's own chatterbot, Grok.
Elon Musk's new xAI data center in Memphis achieved something remarkable when his team assembled 100,000 Nvidia H100 GPUs in only a matter of months.
Nicknamed "Colossus," it was considered in September the largest and one of the most powerful supercomputers in existence. I don't think Musk's appetite for GPUs will slow down much in the next five years.
My NVDA Bull Case
Here's what I told my TAZR Trader members on Tuesday night...
I have read a dozen different experts on AI and the DeepSeek invasion in the past 72 hours and sometimes my head was just spinning.
Because often they don't even agree with each other on the impacts.
That's when I just go back to my thesis on NVIDIA: they dominate the technology stack that enterprises want, and the next stacks they don't even know they need yet (Isaac GR00T robotics, Omniverse, Cosmos, NIM, DIGITS, etc.).
(If you watched Jensen Huang's keynote at CES earlier this month, you got a clinic on all the latest innovations in these platforms.)
I also told my group that the DeepSeek invasion has put the "AI arms race" back in the spotlight....
Even bigger picture: forget the Jevons Paradox stuff (lower cost + higher efficiency = more demand) for a moment and think about who is competing here.
It's USA vs. China, just like I made content around in 2019 after reading Kai-Fu Lee's book AI Superpowers: China, Silicon Valley, and the New World Order.
(end of notes to TAZR members)
So when I put together the innovative platforms coming from NVIDIA with the fact that the AI "arms race" is back in focus, I think NVIDIA sales estimates will not be revised lower.
To round out my view, on Tuesday I also bought Vertiv (VRT), a high-growth provider of datacenter infrastructure, when the stock was down over 30% from its highs. I did this because I don't think the 2-4 year plans to build datacenters are being scrapped.
Finally, part of my NVDA bull thesis after ChatGPT took the world by storm in early 2023 was that every large corporation would want to build their own internal LLMs.
China definitely delivered a wake-up call to hyperscalers and LLM builders about cost efficiency. But I don't think it changes their plans to build on the model that Microsoft, Alphabet, and Meta have established just because China tweaked an existing model that probably cost 100X what they claimed.
Here's what I posted on X about my "50,000 foot view" of the panic...
In May 2023 when Jensen revealed at Computex that GOOG, MSFT, and META were already in line for the coming GH200-DGX systems, I said "Here we go to $200 billion in sales!"
Why? Because the other 1,996 corporations in the Global 2000 would want to build their own INTERNAL LLMs. Same story today.
What I mean is that no major corp wants to have their proprietary data in the hands of any outside entity. They want their own Magic Kingdoms of OZ to mine and model what only they know to serve employees and customers.
The story might change at the margins. But the big builders (MSFT, AMZN, META, GOOG, Tesla, OpenAI, Oracle) are nowhere near done with their grand datacenter plans because their visions of serving the autonomous vehicle, robotics, energy, biotechnology, and custom AI markets are bigger than most of us can imagine.
They will pay for the NVIDIA GPU-DGX systems and clusters now because they see the ROI as massive in five years.
By the time you are reading this on Thursday morning, we should know a lot more as analysts react to MSFT and META.
More By This Author:
Bull Of The Day: Okta
Bull Of The Day: Taiwan Semi
Bull Of The Day: Datadog - Wednesday, August 7
Keith tracks market trends to bring his readers some of the best investment recommendations on the markets today. These trends have been making fortunes for centuries, and they'll likely continue to do so ...