Washington Denials And AI Bailouts

There's an old adage in Washington: Don't believe anything until it is officially denied. Now that the Trump administration's so-called artificial intelligence (AI) czar David Sacks has gone on record stating that "[t]here will be no federal bailout for AI," we can begin speculating about what form that bailout might take.
It turns out that the chief financial officer of AI behemoth OpenAI has already put forth an idea regarding the form of such a bailout. Sarah Friar told The Wall Street Journal in a recorded interview that the industry would need federal guarantees in order to make the necessary investments to ensure American leadership in AI development and deployment.
Friar later "clarified" her comments in a LinkedIn post after the pushback from Sacks, saying that she had "muddied" her point by using the word "backstop" and that she really meant that AI leadership will require "government playing their part." That sounds like the government should still do more or less what she said in The Wall Street Journal interview.
Now, maybe you are wondering why the hottest industry on the planet, one flush with hundreds of billions of dollars from investors, needs a federal bailout. It's revealing that AI expert and commentator Gary Marcus predicted 10 months ago that the AI industry would come seeking a government bailout to make up for overspending, bad business decisions, and huge future commitments that the industry is unlikely to be able to meet.
For example, in a recent podcast hosted by an outside investor in OpenAI, the company's CEO, Sam Altman, got tetchy when asked how a company with only $13 billion in annual revenue, one that is running losses, will somehow fulfill $1.4 trillion in spending commitments over the next few years. Altman did not actually answer the question.
So what possible justification could the AI industry dream up for government subsidies, loan guarantees, or other handouts? For years one of the best ways to get Washington's attention has been to say the equivalent of "China bad. Must beat China." So that's what Altman is telling reporters. But that doesn't explain why OpenAI, rather than other companies, should be the target of federal largesse.
In what appears to be damage control, Altman wrote on his X account that OpenAI is not asking for direct federal assistance and then outlined how the government could give it indirect assistance by building a lot of data centers of its own (which could then presumably be leased to the AI industry so the industry doesn't have to make the investment itself).
Maybe I'm wrong, and what we are seeing is not preliminary jockeying between the AI industry and the U.S. government over what sort of subsidy or bailout the industry will receive. But lest you think the industry has so far moved forward without government handouts, the AP noted that more than 30 state governments already offer subsidies to attract data centers.
Not everyone is happy about having data centers in their communities. Those data centers have also sent electricity rates skyward as consumers compete with them for electricity and as utilities seek additional funds to build the capacity needed to power them. Effectively, current electricity customers are subsidizing the AI data center build-out by paying for the new generating capacity and transmission lines that feed those centers.
The larger problem with AI is that, in its current form, it appears to have several limitations that will prevent it from taking over much of the work now done by humans and will preclude it from being incorporated into critical systems (because it makes too many mistakes). All the grandiose claims made by AI boosters are dispatched with actual facts in this very long piece by AI critic Ed Zitron.
I am increasingly thinking of AI as a boondoggle. A boondoggle, according to Dictionary.com, is "a wasteful and worthless project undertaken for political, corporate, or personal gain." So far, the AI industry mostly fits this definition.
But there is a more expansive definition, which I borrow from Dmitri Orlov, author of Reinventing Collapse: A contemporary boondoggle must not only be wasteful; it should, if possible, also create additional problems that can only be addressed by yet more boondoggles, such as the need for vast new electric generating capacity that will prove unnecessary if AI turns out to be far less useful than advertised. AI boosters say that AI is going to have a big impact on society. I couldn't agree more, though not quite in the way these boosters think.