This Week in AI: AI Is Rapidly Being Commoditized
This week, we explore the rapid commoditization of AI. Major players are slashing prices, signaling a shift in the landscape.
What’s driving these changes, and what does it mean for the future of AI? Let’s dig deeper.
Price Reductions by Major Vendors
In early August, we saw a significant shift. Both Google and OpenAI reduced prices on their cost-effective text-generating models. Google cut the input price for Gemini 1.5 Flash by 78% and the output price by 71%, and OpenAI followed suit with significant cuts for its GPT-4o model.
These reductions are not isolated events. According to estimates, the average cost of running AI models is falling by an impressive 86% annually.
Drivers Behind Cost Reduction
So, what’s driving these price cuts? One primary reason is the lack of differentiation among top models in terms of capabilities. Andy Thurai, a principal analyst at Constellation Research, notes, “We expect the pricing pressure to continue with all AI models if there is no unique differentiator.” John Lovelock, VP analyst at Gartner, adds that intense competition is a significant factor. Models have traditionally been priced to cover the immense costs of development and operation. However, as data centers scale, discounts become more feasible.
Cost-Saving Techniques
Vendors are employing techniques like prompt caching and batching to reduce expenses.
Prompt caching stores specific “prompt contexts” for reuse across API calls, while batching groups low-priority requests for asynchronous processing, further driving down costs.
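Conceptually, prompt caching amounts to keying stored context on a repeated prompt prefix, so the expensive work is done once and reused. Here is a minimal Python sketch of the idea; the class and names are illustrative, not any vendor's actual API:

```python
import hashlib

class PromptCache:
    """Toy illustration of prompt caching: reuse stored context
    whenever the same prompt prefix is seen again."""

    def __init__(self):
        self._store = {}
        self.hits = 0
        self.misses = 0

    def get_context(self, prompt_prefix: str) -> str:
        # Key the cache on a hash of the repeated prefix.
        key = hashlib.sha256(prompt_prefix.encode()).hexdigest()
        if key in self._store:
            self.hits += 1
        else:
            self.misses += 1
            # In a real serving system, the vendor would compute and store
            # the model's internal state for this prefix here, instead of
            # a placeholder string.
            self._store[key] = f"context-for-{key[:8]}"
        return self._store[key]

cache = PromptCache()
system_prompt = "You are a helpful assistant."
for question in ["What is 2+2?", "Capital of France?", "Define entropy."]:
    cache.get_context(system_prompt)  # same prefix reused on each call

print(cache.hits, cache.misses)  # → 2 1
```

The first call pays the full cost (a miss); subsequent calls sharing the same system prompt hit the cache, which is why vendors can price cached input tokens at a steep discount.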
Open source models, such as Meta’s Llama 3, also play a role. When run on enterprise infrastructure, these models can be more cost-effective, adding to the competitive pricing landscape.
Sustainability of Price Declines
The question remains: Are these price declines sustainable? Generative AI vendors are currently burning through cash at an alarming rate. OpenAI, for instance, is projected to lose $5 billion this year. Similarly, Anthropic anticipates losses of over $2.7 billion by 2025.
Due to the high capital and operational costs, experts like Lovelock believe vendors may need to adopt new pricing structures. Estimates put the cost of developing next-generation models in the hundreds of millions of dollars.
Legislative and Ethical Issues
In California, a bill known as SB 1047 is garnering support. This legislation requires creators of large AI models to implement safeguards against potential harms. Elon Musk is a notable supporter of this bill.
Additionally, companies like OpenAI, Adobe, and Microsoft are backing legislation that mandates labeling AI-generated content. The final vote on this bill is expected soon.
Impacts on AI Vendor Strategies
The commoditization of AI forces vendors to rethink their strategies. Some, like Inflection, have chosen to cap free access to their chatbots as they shift focus toward enterprise products.
Other vendors are exploring entirely new offerings. Anthropic recently launched its Artifacts feature, which allows users to turn conversations with AI into functional apps and websites. This feature aims to add value beyond simple text generation.
Future of AI Models
Looking ahead, expectations are high. OpenAI is reportedly working on a new product, codenamed Strawberry, to solve complex problems more effectively than existing models.
If successful, such advancements could set a new benchmark for AI capabilities, addressing current limitations in problem-solving and programming.
AI’s rapid commoditization offers both opportunities and challenges. As prices drop, accessibility increases, but sustainability remains a concern.
The ongoing evolution will require strategic adaptations from all players in the industry.