AI Isn’t Inflated, Big Tech’s Earnings Are


Every few years, someone shows up and says: “Wait a second. What we’re celebrating… is not what we think we’re celebrating.” In 2008, that guy was Michael Burry. Today, he’s back, and once again he’s pointing at something everyone else is too dazzled to look at.

And no, it’s not that AI is fake. It’s not that models don’t work. It’s something much simpler, and much more uncomfortable: the numbers look better than the business underneath.

Let me explain what’s going on. Not in accounting language, but in normal human words.


1. The trick is stupidly simple: stretch the life of the hardware.
 

Imagine you buy a €3,000 laptop. Normally you’d say: “I’ll replace it in three years.” So it “costs” you €1,000 per year.

Now imagine you tell your accountant: “You know what? Let’s say it lasts six years.” Poof. Your annual “cost” drops to €500.

Same laptop. Same performance. Same battery dying after 18 months. But on paper? You look more profitable.
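To make the arithmetic concrete, here is a minimal straight-line depreciation sketch in Python. The numbers mirror the laptop example above; they are illustrative only, not anyone's actual figures.

```python
# Straight-line depreciation: spread an asset's cost evenly over its assumed life.
# Illustrative numbers from the laptop example above; not from any filing.

def annual_depreciation(cost: float, useful_life_years: int) -> float:
    """Annual expense under straight-line depreciation."""
    return cost / useful_life_years

cost = 3_000  # EUR, the laptop

for life_years in (3, 6):
    expense = annual_depreciation(cost, life_years)
    print(f"{life_years}-year life: EUR {expense:,.0f} expensed per year")

# Output:
# 3-year life: EUR 1,000 expensed per year
# 6-year life: EUR 500 expensed per year
```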

This is exactly what the biggest AI companies are doing. Not just one of them. All of them.

  • Microsoft (MSFT) → extended server life to 6 years.
  • Google (GOOGL) → extended to 6 years.
  • Amazon (AMZN) → extended to 6 years.
  • Meta (META) → extended to 5–6 years, saving $2.9B this year.
  • Oracle (ORCL) → extended to around 5 years.

For context: these are not washing machines. These are GPUs and AI servers running at full heat, in the middle of the most intense compute boom since the internet was invented.

And yet, overnight, their “useful life” doubled.


2. What does this mean in practice?
 

It means this:

The AI boom is expensive. Very expensive. More expensive than any cloud wave we’ve seen before.

And instead of showing the true cost of that reality in their profit and loss statements, companies are smoothing the impact.

This isn’t illegal. Under GAAP, you can change useful-life estimates as long as you disclose them; the change is simply applied prospectively, from that point forward. And they have disclosed them, in footnotes, earnings calls, and filings.

But legal doesn’t mean neutral. And “allowed” doesn’t mean “honest representation of economic reality.”

Because when you stretch depreciation, you’re not becoming more efficient. You’re just pushing expenses into the future and pulling profits into the present.
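A short sketch makes that timing shift explicit. Year by year, total expense is identical under both assumptions; the longer life just recognizes less of it now and more of it later. Again, these are the illustrative laptop figures, not real company data.

```python
# Same EUR 3,000 of cost under a 3-year vs. a 6-year assumed life.
# Total expense is identical; only its timing changes. Illustrative numbers only.

cost, short_life, long_life = 3_000, 3, 6

for year in range(1, long_life + 1):
    short_exp = cost / short_life if year <= short_life else 0  # old assumption
    long_exp = cost / long_life                                 # stretched assumption
    # Positive shift = profit looks better today; negative = the bill comes due later.
    print(f"Year {year}: 3-yr = {short_exp:>5.0f} | 6-yr = {long_exp:>5.0f} "
          f"| profit shift = {short_exp - long_exp:+5.0f}")
```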


3. This is where Burry comes in.
 

Burry’s point is not that AI doesn’t work. His point is that the accounting narrative looks cleaner than the economic truth.

He looked at the numbers and said something very simple:

If these companies depreciated their hardware at the old rate, 2–3 years, their profits would be significantly lower.

How much lower?

  • Meta’s profits overstated by ~20% by 2028
  • Oracle’s profits overstated by ~26–27%
  • And across the hyperscalers, roughly $176 billion of depreciation goes missing between 2026 and 2028

Not small cosmetic adjustments. Not rounding errors. Not “one-time items.”

We’re talking about earnings that look dramatically better purely because the “life expectancy” of hardware doubled on paper, at the exact moment when competition for compute is exploding.
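Here is a hedged back-of-envelope version of that claim. Every input below is a hypothetical placeholder chosen only to echo the magnitudes above; this is not Burry’s actual model or any company’s real figures, and it ignores the rolling, multi-year asset base a real depreciation schedule would have.

```python
# Back-of-envelope: how stretching useful life inflates reported profit.
# ALL inputs are hypothetical placeholders, not figures from any filing
# and not Burry's model. A real schedule would layer many purchase years.

hardware_capex = 100e9           # assumed annual server/GPU spend, USD
old_life, new_life = 3, 6        # assumed old vs. new useful life, years
reported_pretax_profit = 100e9   # assumed profit as reported under the new life

dep_new = hardware_capex / new_life   # expense actually booked
dep_old = hardware_capex / old_life   # expense under the old assumption
missing_dep = dep_old - dep_new       # depreciation deferred each year

profit_old_basis = reported_pretax_profit - missing_dep
overstatement = missing_dep / profit_old_basis

print(f"Missing depreciation: ${missing_dep / 1e9:.1f}B per year")
print(f"Reported profit overstated by {overstatement:.0%} vs. the old assumption")
# -> Missing depreciation: $16.7B per year
# -> Reported profit overstated by 20% vs. the old assumption
```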

That’s the thing Burry is pointing at. Not the technology. Not the models. Not the innovation.

The financial statements.


4. Why does this matter so much?
 

Because markets don’t price stories. They price earnings.

And if earnings are artificially boosted, then everything downstream becomes inflated:

  • Valuations
  • Stock prices
  • Investor expectations
  • GDP projections
  • AI productivity models
  • Market caps built on forward P/E
  • Even sentiment about AI’s economic “miracle”

Here’s the uncomfortable truth: If depreciation were still 2–3 years, the old standard, we would be having a very different conversation about profitability in Big Tech right now.

Meta’s “margin comeback”? Much smaller. Amazon’s AWS profitability? Much lower. Oracle’s AI narrative? There probably wouldn’t be one at all.

People think the AI revolution is entirely about innovation, breakthroughs, and technological leaps. A big part of it is also about accounting, the quiet, very boring kind of accounting that never makes headlines but moves trillions of dollars.


5. This isn’t a conspiracy. It’s psychology.
 

Why are companies doing this?

Because the alternative is ugly.

If they kept depreciating servers over 2–3 years, then:

  • Profit margins would drop
  • EPS growth would slow
  • Analysts would panic
  • Market multiples would shrink
  • Capex intensity would look frightening
  • Investors would question the entire economics of AI

And when you’re in the middle of the biggest capex race since the dot-com era, you do not want to show shrinking earnings.

So you do the only thing you can: You stretch the hardware life. You smooth the optics. You buy time.

And maybe, genuinely, some hardware will last longer. Some old GPUs can be repurposed. Some servers can remain in lower-priority clusters.

There are valid arguments. But none of them change the fact that this shift massively improves reported profits during the exact years when AI capex is exploding.

That timing is not a coincidence. It’s what happens when everyone faces the same incentives.


6. So what’s the real risk?
 

Not that AI collapses. Not that tech melts down. Not that GPUs become worthless.

The real risk is subtler:

The financial picture of AI might look healthier than the operational reality beneath it.

And when the two finally converge, when capex stops being buried by accounting tweaks, the correction could be sharp.

Not because AI fails. But because expectations were built on polished numbers.

This is what Burry is warning about. The next bubble won’t be in mortgages. It won’t be in software. It will be in infrastructure economics, the most boring part of tech, and therefore the easiest part to hide things in.


7. Final thought.
 

AI is real. The breakthroughs are real. The products will change everything.

But the economics? The economics are not fully real yet. And the gap between “story” and “reality” is being filled by accounting choices that make everything look smoother than it is.

Burry’s warning isn’t an attack on AI. It’s a reminder that innovation doesn’t erase cost. And no amount of GPU enthusiasm can change how quickly hardware wears out when you run it at 100% load.

If you want to understand the next phase of the AI cycle, don’t look at conference keynotes or model demos.

Look at the footnotes in the 10-Ks. The ones that quietly doubled the lifespan of hardware that’s burning itself to death to keep up with demand.

That’s where the real story is hiding.

