AI Market Structure In The Future: Few Engines, Many Applications


The artificial intelligence industry will be large, but only a few corporations will provide foundation models such as those behind ChatGPT, Claude, and Google’s Gemini family. Many companies, however, will provide tools that apply these models to specialized tasks, serving businesses, governments, and consumers. This is a very early judgment and may prove wrong, but it rests on solid economic theory and experience. What could change are the costs and opportunities of the services provided.

Some industries have few companies, and others have many. Name producers of large commercial airplanes and only two companies come to mind. How about restaurants? My small suburban city shows 24 that belong to the local chamber of commerce. There are hundreds in the metropolitan area. To understand AI’s market structure, we must figure out why some industries have many businesses and some few.

Economies of scale drive much of market structure. Last year Airbus and Boeing between them produced 1,263 aircraft, roughly 630 each. Suppose that production had been spread across 100 manufacturers instead of two: each would build only about 13 airplanes, and the average cost of production would be much higher. Materials costs might be similar, but design costs would be spread over far fewer planes. Specialized fixtures and jigs used in the production process would be less common, leading to more hand work and then more re-work.
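The fixed-cost arithmetic behind this can be made concrete. A minimal sketch, using the 1,263-plane figure from the text and an assumed (hypothetical) design cost:

```python
# Illustrative sketch of fixed-cost amortization.
# The design-cost figure is a hypothetical assumption, not a real number.

def average_fixed_cost(design_cost, total_units, num_firms):
    """Per-plane share of design cost when output is split evenly across firms."""
    units_per_firm = total_units / num_firms
    return design_cost / units_per_firm

DESIGN_COST = 15e9      # assumed one-time design cost per manufacturer (hypothetical)
TOTAL_UNITS = 1263      # combined Airbus + Boeing deliveries, from the text

duopoly = average_fixed_cost(DESIGN_COST, TOTAL_UNITS, 2)
fragmented = average_fixed_cost(DESIGN_COST, TOTAL_UNITS, 100)

print(f"Design cost per plane, 2 firms:   ${duopoly:,.0f}")
print(f"Design cost per plane, 100 firms: ${fragmented:,.0f}")
```

Whatever design cost is assumed, splitting the same output across 100 firms instead of 2 multiplies the per-plane fixed-cost burden by exactly 50.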

Now think about restaurants serving perhaps 100 diners at a time. How much cheaper would it be if 1,000 customers were seated in one large restaurant? Food preparation would probably be more automated, but servers and bussers would run into each other more often. Maintaining the quality of both the food and the experience might be more costly.

Diversity of demand also drives market structure. The many restaurants have many specialties: Mexican, Italian, Thai, etc. But air travelers do not seem interested in too many variations on an airplane design.

The large language models that power ChatGPT and its peers have been very expensive to develop. GPT-3 may have cost $3 million to develop, but GPT-4 cost more than $100 million. The models have greatly improved as they have been trained on larger data sets, requiring far more processing resources. And after a model is developed, the expense of running it to answer queries grows with the size of the model.

The great value proposition of AI comes from harnessing the large language model to specific applications. The large language models are general purpose, which enables their communication in everyday language. A company wants to answer customer inquiries about bill payments; an engineer wants to look at design alternatives that fit certain parameters; a salesperson wants to identify past customers who are likely to buy again. They can all use the same general-purpose AI model, then “bolt on” additional features, data sources, and practices.

Large language models can be fine-tuned to specific applications. In the best-known example, GPT was fine-tuned to sound like a helpful assistant called ChatGPT. In some cases the large language model will simply be an input-output mechanism, translating common English (or another language) into instructions that can be fed into a small, specialized artificial intelligence model. Then the results of that model can be fed back into the large language model so that the user receives the results in everyday language. In other cases the large language model will have a connection to outside information, such as a history of bill payments or a product’s service manual.

Each of these specialized applications will benefit from development by people with in-depth knowledge of the field. A mechanic knows best what information will be useful to another mechanic. There will be many, many of these specialized applications. They can be developed and implemented at relatively low cost. Whereas the large language model will resemble the Airbus-Boeing oligopoly, the applications sector will look like sushi, burgers, pizza, and on and on.

Within one application, such as helping customers return merchandise, there may be just two or three companies providing the service. But the number of specialized services will be huge. Some companies will build expertise developing the specialized applications, and they will have many different products developed with that expertise. But there will always be entrepreneurs looking for opportunities that others have missed, so small app developers will persist.

This view of the future market structure of artificial intelligence could go wrong in several ways, some that can be readily recognized today. First, the large language models could develop greater capability to handle specialized tasks. In 2021 I asked an AI specialist about developing a highly specialized application for an industry, such as finance. He told me that one could spend a year on that, but then the application would be based on a model that was one year old. Better results would probably come from using an up-to-date but nonspecialized AI model. So maybe take a year off in Tahiti and then use the general-purpose model.

The second alternative to this market structure forecast comes from the falling cost of AI model development. One estimate put training costs at 80% lower over a 2.5 year period, equivalent to about a 50% annual decline. Other guesses range from 20% to 70% lower each year. Many articles have noted the high price of the Nvidia chips used in AI. Those prices are high because the chips are so powerful, helping to pull the overall model training costs down. So perhaps large language models will be so cheap that many companies will generate their own. Even if the cost levels out at, say, $100 million, that’s well within the capital budget of many corporations.
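The conversion from a cumulative decline to an annual rate is simple compounding. A quick check of the figure in the text (an 80% drop over 2.5 years means costs fall to 20% of the starting level):

```python
# Converting a cumulative cost decline into an annual rate.
# An 80% drop over 2.5 years means costs fell to 20% of the start:
#   annual_factor ** 2.5 == 0.20

remaining = 1 - 0.80            # fraction of original cost remaining
years = 2.5
annual_factor = remaining ** (1 / years)
annual_decline = 1 - annual_factor

print(f"Annual decline: {annual_decline:.1%}")   # roughly 47%, i.e. about half each year
```

So "about a 50% annual decline" is consistent with the 80%-over-2.5-years estimate.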

Third, regulation could change the market structure, as it has in many other industries. Countries could require or subsidize locally-built AI models. If Canadian television stations must show Canadian-produced content, then perhaps that content should be generated using Canadian AI models. Some of the regulatory schemes being suggested in the United States might well entrench the early incumbents to the disadvantage of start-ups.

Fourth, somebody may develop a proprietary development method unknown to others. Recent research results in AI have often been shared publicly, or one company has figured out the same solution that another firm had previously developed, but it’s always possible that something new and great will be developed and kept secret.

Market concentration can also be affected by other factors that don’t seem to apply to AI as we know it, but that could change. Those other factors include network effects and access to distribution channels.

Businesses in the AI space should be as agnostic as they can be regarding future market structure, but when a bet has to be made the most likely outlook is a few large firms developing large language models with many small companies providing specialized applications.

