Meet the pickaxe vendors of the AI gold rush
There's an old saying that the surest path to profit in a gold rush is to sell pickaxes — and that logic now favors the toolmakers and wholesalers of today's generative AI boom.
Why it matters: It's comparatively easy to see a broad tech trend on the horizon, but often much harder to home in on who will win and over what timeframe. Netscape and BlackBerry serve as cautionary tales.
Software veteran Tom Siebel, who now runs enterprise AI firm C3.ai, sees a wide swath of the tech industry benefiting from the AI boom.
- "Enterprise AI and large language models are poised for exponential growth and wide-scale adoption in coming years," Siebel said in a statement to Axios.
- "Products and services directly coupled with the adoption of pure-play AI offerings will benefit from similar growth — these include enterprise AI software providers like C3 AI, LLM providers, cloud computing providers, providers of GPU or AI chips, systems integrators, and consulting services providers.”
Here are three categories of companies that serve as the modern-day pickaxe vendors of AI.
The cloud computing giants
The first tool needed to dig for AI gold is compute capacity — and a lot of it. The models that power the current generation of generative AI tools like ChatGPT or DALL·E are complex, with billions of parameters.
- As a result, these systems, especially those open to the general public, consume huge amounts of computing power, both during their development (or training) phase and in everyday use.
That augurs well for the giants of cloud computing: Microsoft, Google and Amazon.
- Of those, Microsoft has been at the forefront, combining its Azure cloud with its OpenAI partnership, which is cemented by billions in investment.
- Google has been offering machine learning services for a while, too, and is keen to show it can compete with Microsoft in other areas.
- Amazon, which is the leader in the overall cloud computing business, has been seen as less of a player in generative AI. The company hopes to change that with a flurry of announcements Thursday that included both partnerships and the release of its own foundation models.
The chipmakers
All that computing work means a lot of chips will be needed to power AI servers, which rely on several different kinds: CPUs from the likes of Intel and AMD, as well as graphics processors from companies like Nvidia.
- Many of the cloud providers are also developing their own chips for AI, including Amazon and Google. Meanwhile, chipmakers, including Nvidia, are also developing their own foundation models.
- "We also see a lot of healthy competition on the cloud compute providers between Nvidia, AWS, AMD, Google — lots of new silicon is developed with the goal of accelerating training and cloud inference" (a specific type of logical operation that's central to AI), said Alexandru Costin, Adobe's VP of generative AI. Adobe recently announced a partnership with Nvidia, as did OpenAI.
- AI performance is not so much about pure chip speed as about how well a specific chip performs running a particular model. Many say Nvidia has the edge here over AMD, Intel and a variety of AI chip startups.
- Intel, meanwhile, points to published results from Hugging Face that it says show inference runs faster on Intel's AI hardware accelerators than on any graphics chip on the market.
The human element
It's counterintuitive, but artificial intelligence requires a great deal of human labor — at least in the early phase of development — to check the computer's work and add metadata, as well as to mitigate bias.
- One of the big players here is Scale AI, a startup with a $7.3 billion valuation that helps AI companies annotate their data.
- The AI boom should also benefit the legions of consultants and systems integrators who help large businesses develop and implement new technology strategies.
Yes, but: Many of those directly developing AI technology are also positioning themselves as pickaxe vendors.
- OpenAI, for example, has its mega deal with Microsoft and also provides a wide range of developers with API access to its text-to-image and GPT-4 text models.
- Many others, including Stability AI, maker of Stable Diffusion, aim to enable other firms to build customized consumer and business tools on top of their underlying algorithms and models.