Nvidia After $5 Trillion: Has The AI Trade Really Peaked?



It has been just over three years since the launch of ChatGPT on November 30, 2022. That launch not only kicked off the relentless artificial intelligence (AI) hype and data center buildout, but it also set the stage for transformational gains for chip designer Nvidia Corp. (NVDA).

Nvidia ended 2022 with a negative return of 50.31%, only to impress shareholders with gains of 238.87% in 2023, 171.17% in 2024, and 38.88% in 2025. The growth rate is clearly decelerating, especially after Nvidia crossed the $5 trillion market cap mark as the most valuable company in stock market history.

Following a bout of profit-taking, Nvidia is presently valued at $4.57 trillion, or $191.31 per share. However, after the latest CES 2026 announcements, could the case be made that Nvidia will finish 2026 with higher returns than in 2025?

But first, let’s remind ourselves how Nvidia became synonymous with AI chips and the AI hype itself.


Nvidia’s Engineering of Irreversible AI Lock-In

As we explored previously, Nvidia cornered the AI chip market by providing a full-stack offering. Both AMD and Intel had the know-how and hardware to be picked as AI chip suppliers for the emerging data center infrastructure. However, instead of focusing on selling individual components, Nvidia built an entire ecosystem, delivered as a turnkey supercomputer offering.

Nvidia’s first moat was the Compute Unified Device Architecture (CUDA) software layer, launched in 2006. CUDA let researchers and developers build deep learning libraries like TensorFlow and PyTorch on top of Nvidia hardware, cementing the lock-in for later AI code. This is similar to how Microsoft built dominance with Windows, creating high switching costs.
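To illustrate how that lock-in shows up in practice, here is a minimal, hypothetical PyTorch snippet (not taken from Nvidia's or the frameworks' documentation): research and production code is overwhelmingly written against the CUDA backend, with CPU as a mere fallback, so switching accelerators means revisiting years of such scripts.

```python
# Minimal sketch of the pattern found in countless deep learning scripts:
# CUDA is the default accelerator target, with CPU only as a fallback.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

model = torch.nn.Linear(1024, 10).to(device)   # parameters placed on the GPU if present
batch = torch.randn(32, 1024, device=device)   # inputs allocated on the same device

logits = model(batch)                          # dispatches to CUDA kernels on Nvidia hardware
print(logits.shape, "computed on", device)
```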

On top of CUDA, Nvidia’s future-proofing came in March 2019 with the acquisition of Mellanox for $6.9 billion, a deal that closed in late April 2020. Mellanox supplied Nvidia with InfiniBand, a high-speed, low-latency interconnect architecture for cost-effective AI model training. This allowed Nvidia to launch DGX SuperPODs, clusters of GPUs that represented Nvidia’s first turnkey solution for hyperscalers like Microsoft, Alphabet, and Meta.

In other words, Nvidia moved beyond merely marketing chips and became a systems company. Its chips were also top-performing and the most cost-effective. The H100 served as the catalyst of the initial AI hype, followed by the H200 to address memory bottlenecks, and culminating in Blackwell as the long-term scaler securing Nvidia’s dominance.

But Nvidia’s path to dominance doesn’t end with Blackwell.


Nvidia Beyond Blackwell: CES 2026 Examined

Between January 6 and 9, the Consumer Electronics Show (CES) once again brings together all the heavyweights. As the world’s most important and influential broad-spectrum tech event, CES often sets the agenda for the entire year.

This year, Nvidia’s notable announcements could be grouped into three categories:

Gaming & Graphics: The move from DLSS 3 to DLSS 4 was a huge leap from single-frame generation to multi-frame generation (one rendered frame plus up to three AI-generated frames), boosting frame rates at high resolutions and smoothing out latency. The latest scaling upgrade, named DLSS 4.5, offers 6x dynamic multi-frame generation and further cleans up image artifacting from DLSS 4 with a new transformer model. DLSS 4.5 is set to launch in the spring (a back-of-the-envelope sketch of the frame-generation arithmetic follows below).

Nvidia is also upgrading its monitor technology with G-Sync Pulsar monitors, offering stutter-free, effective motion clarity above 1,000 Hz. Unlike previous variable refresh rate (VRR) launches that required dedicated G-Sync hardware modules, G-Sync Pulsar monitors will have the functionality built into the display scaler, thanks to a collaboration with MediaTek.
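To make the frame-rate math concrete, below is a minimal sketch with purely illustrative numbers; the 40 fps native render rate and the mapping of "6x" to five AI-generated frames per rendered frame are assumptions, not Nvidia figures.

```python
# Back-of-the-envelope sketch with illustrative numbers (not Nvidia benchmarks):
# multi-frame generation inserts AI-generated frames after each natively rendered
# frame, so the displayed frame rate scales with the total frames per render.
def displayed_fps(rendered_fps: float, ai_frames_per_rendered: int) -> float:
    """Displayed frame rate when each rendered frame is followed by AI frames."""
    return rendered_fps * (1 + ai_frames_per_rendered)

native = 40  # hypothetical native 4K render rate, frames per second

print(displayed_fps(native, 1))  # DLSS 3-style single-frame generation  -> 80 fps
print(displayed_fps(native, 3))  # DLSS 4-style 4x multi-frame generation -> 160 fps
print(displayed_fps(native, 5))  # a 6x mode as described above           -> 240 fps
```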

Autonomous Vehicles: Nvidia is launching Alpamayo, a reasoning-based, open-source AI model for self-driving cars aimed at “human-like decision-making”. That is to say, in addition to reacting to environmental cues, the Alpamayo model also reasons about the environment to anticipate the most likely cues before they occur.

Incorporating these two aspects of reality-modeling would go a long way toward making autonomous driving more robust and safer. The Mercedes-Benz CLA, already rated as the safest car by the New Car Assessment Programme (NCAP), will be the first car to launch with Alpamayo, in Q1.

AI & Data Center: In its core data center business, Nvidia is adding the Vera Rubin platform on top of Blackwell. As the new rack-scale AI cluster, Vera Rubin NVL72 unifies 72 Rubin GPUs and 36 Vera CPUs into a single, massive AI supercomputer. While the 88-core Vera CPUs orchestrate data flows and double CPU-to-GPU bandwidth to 1.8 TB/s for more efficient workloads, the Rubin GPU accelerators use a 3rd-gen transformer engine to deliver a cutting-edge 50 petaflops of performance per GPU.

In addition to the upgraded CPU and GPU power, Nvidia fortified its computing lead with a 2nd-gen RAS engine for real-time maintenance and serviceability, alongside a 6th-gen NVLink Switch that offers 3.6 TB/s of GPU-to-GPU bandwidth, potentially scaling up to 260 TB/s for AI workloads. For context, Blackwell scales up to 130 TB/s with NVLink 5, half the bandwidth of the latest Vera Rubin platform (a rough arithmetic sketch of these rack-scale figures follows below).
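To put the quoted per-GPU and interconnect figures in rack-scale context, here is a rough arithmetic sketch; all numbers are the vendor-announced specs cited above, none independently verified.

```python
# Rough arithmetic on the announced Vera Rubin NVL72 figures quoted above
# (vendor-announced specs, not independently verified).
gpus_per_rack = 72        # Rubin GPUs per NVL72 rack
cpus_per_rack = 36        # Vera CPUs per NVL72 rack
pflops_per_gpu = 50       # quoted per-GPU performance, petaflops

rack_pflops = gpus_per_rack * pflops_per_gpu
print(f"Rack compute: {rack_pflops} PF (~{rack_pflops / 1000} exaflops)")  # 3600 PF, ~3.6 EF

# NVLink scale-up bandwidth: 260 TB/s (Rubin) vs. 130 TB/s (Blackwell, NVLink 5)
blackwell_tbps, rubin_tbps = 130, 260
print(f"Scale-up bandwidth gain over Blackwell: {rubin_tbps // blackwell_tbps}x")  # 2x
```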

Nvidia’s press release on this significant upgrade included quotes from major tech CEOs praising the Vera Rubin platform. Here is one from Elon Musk:

“NVIDIA Rubin will be a rocket engine for AI. If you want to train and deploy frontier models at scale, this is the infrastructure you use — and Rubin will remind the world that NVIDIA is the gold standard.”


Taken together, these announcements reinforce Nvidia’s position not just as The GPU vendor for gaming, but as a full-stack platform spanning automotive AI and hyperscale data centers.

As we noted previously, although AI-related debt issuance rose to $428.3 billion in 2025, AI capex is still heading for major inflows. A Goldman Sachs Research report indicated as much in December, revising its 2026 AI capex forecast upward from $465 billion to $527 billion.


The Bottom Line

Despite decelerating annual returns, Nvidia’s post-Blackwell roadmap shows that its AI leadership is not peaking, but compounding at scale. In other words, Nvidia’s valuation slowdown looks more like maturation than exhaustion.

After the profit-taking spree from early investors (some as early as 2023), it now looks likely that Nvidia will yet move significantly beyond the $5 trillion market cap by year’s end. The Wall Street Journal’s forecast data affirms this, with an average NVDA price target of $260.45, roughly 36% above the current price of $191.31 per share.
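As a quick sanity check on that price target, here is a back-of-the-envelope calculation using only the figures quoted above; the share count is implied from market cap and price, not an official figure.

```python
# Back-of-the-envelope check using the figures quoted above; the share count is
# implied from market cap / price, not an official figure.
current_cap = 4.57e12    # current market capitalization, USD
current_price = 191.31   # current share price, USD
target_price = 260.45    # average Wall Street Journal analyst price target, USD

implied_shares = current_cap / current_price   # ~23.9 billion shares
implied_cap = implied_shares * target_price    # market cap at the target price
upside = target_price / current_price - 1      # upside to the target

print(f"Implied market cap at target: ${implied_cap / 1e12:.2f} trillion")  # ~$6.22T
print(f"Upside to target: {upside:.0%}")                                    # ~36%
```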



Disclaimer: The author does not hold or have a position in any securities discussed in the article. All stock prices were quoted at the time of writing.
