Visualizing Big Tech Company Spending On AI Data Centers
Big Tech’s AI Data Center Costs
Big tech companies are aggressively investing billions of dollars in AI data centers to meet the escalating demand for computational power and infrastructure necessary for advanced AI workloads.
This graphic visualizes the total AI capital expenditures and data center operating costs for Microsoft (MSFT), Google (GOOGL), Meta (META), and Amazon (AMZN) from January to August 2024.
AI capital expenditures are one-time or infrequent investments in long-term AI assets and infrastructure.
Data center operating costs are ongoing expenses for running and maintaining AI data centers on a day-to-day basis.
The data comes from New Street Research via JP Morgan and is updated as of August 2024. Figures are in billions. Operating costs include cash operating expenses, software, depreciation, and electricity.
Training AI Models Is Eating Up Costs
Below, we show the total AI capital expenditures and data center operating costs for Microsoft, Google, Meta, and Amazon.
Microsoft leads the pack in total AI data center costs, spending a combined $46 billion on capital expenditures and operating costs as of August 2024.
Microsoft also has the highest number of data centers at 300, followed by Amazon at about 215. However, variations in size and capacity mean the number of facilities doesn’t always reflect total computing power.
In September, Microsoft and BlackRock unveiled a $100 billion plan through the Global Artificial Intelligence Infrastructure Investment Partnership (GAIIP) to develop AI-focused data centers and the energy infrastructure to support them.
Notably, both Google and Amazon currently spend more than twice as much on training their models as they do on running them for end-use customers (inference).
Training a major AI model is getting increasingly expensive, as it requires large data sets, complex calculations, and substantial computational resources, often involving powerful GPUs and significant energy consumption.
However, as AI model deployments grow in frequency and scale, the cumulative cost of inference is likely to surpass these initial training costs, as is already the case for OpenAI’s ChatGPT.