AI Firms Scramble for Power as Chip Demand Outpaces Supply



AI firms are grappling with a shortage of the chips that are critical to developing and improving AI systems. The shortfall of much-needed advanced graphics processing units (GPUs) has set off a race among firms to secure capacity, with many leaning on their connections to gain access to more chips. Nvidia, currently the biggest player in the market, recently saw its stock soar 25% in a single day after reporting a strong quarter that surpassed earnings expectations.


Nvidia Is the Biggest Producer of AI Chips

Nvidia (NVDA) currently produces the vast majority of GPUs used in AI development. The company has seen a sharp spike in demand as AI tools require vast amounts of data and enormous processing power, and building just one AI system can require thousands of chips. 

While Nvidia initially focused on manufacturing graphics-processing chips for the video gaming industry, the company has diversified its customer base. In recent years, the chipmaker has expanded into AI and cryptocurrency mining, broadening its range of products and services.

More recently, the company’s AI chips helped its data center division surpass its gaming division in revenues. This has even prompted the chipmaker to offer a new generation of AI chips for data centers that promise a substantial performance upgrade.

Earlier this month, Nvidia announced the shipment of its DGX H100 systems. The product features eight H100 Tensor Core GPUs connected via NVLink, alongside dual Intel Xeon Platinum 8480C processors, 2TB of system memory, and 30TB of NVMe SSD storage, the company said in a blog post.


Nvidia Sees Demand Exceed Supply Despite Ramped-Up Production

Last week, Nvidia CEO Jensen Huang said the company is working on a new generation of advanced chips for AI calculations in data centers. “We are significantly increasing our supply to meet surging demand for them,” he added.

However, the company has so far been unable to keep up with the overwhelming demand for its products. “It’s like toilet paper during the pandemic,” said Sharon Zhou, co-founder and CEO of Lamini, a startup that helps companies build chatbots using AI models.

The shortage has reportedly restricted the processing power that cloud service providers like Microsoft and Amazon can offer clients such as OpenAI, the company behind the popular AI chatbot ChatGPT.

UBS analysts estimate that an earlier version of ChatGPT required up to 10,000 graphics chips, while Elon Musk estimates that an updated version requires three to five times as many of Nvidia’s advanced processors.

Companies in the AI space, in particular, are struggling to secure the computing power needed to develop and operate their increasingly complex models and to help other companies build AI services. Even industry insiders are finding it challenging to secure the necessary computing capacity.

At a recent Wall Street Journal CEO Council Summit, Musk said that GPUs are now harder to get than drugs. Similarly, during a recent congressional hearing on AI, OpenAI CEO Sam Altman said fewer people should use ChatGPT because of the processor bottleneck.


AI Startups Rely on Connections to Find More Chips

Startups are now relying on their networks to find spare computing power, with some founders begging salespeople at Amazon and Microsoft for more capacity. Others are reserving cloud capacity out of fear that they might be unable to access the resources later.

Some AI investors and startups have also started shrinking their AI models to make them more efficient, buying their own physical servers, or switching to less popular cloud providers until the shortage is resolved. Others are orchestrating bulk orders of processors and server capacity that can be shared across their AI startups.

While Nvidia is scaling up production to meet the growing demand, many AI founders expect the shortage to persist until at least next year. Due to the shortage, Nvidia’s advanced GPUs now sell for $33,000 or more on secondary markets.

The chipmaker has also added $220 billion to its market capitalization since the start of the year, with the stock surging 165% in 2023 alone. Moreover, the company has projected $11 billion in sales for the current quarter, far above the $7.2 billion Wall Street estimate and the highest quarterly total ever for the firm.



