AI Models Are Becoming Commoditized, But App Development Will Drive Next Wave of Value
The rapid development of artificial intelligence (AI), particularly large language models (LLMs) such as the ones behind ChatGPT, has captivated the world. But according to Nandan Nilekani, co-founder and chair of Infosys, the next wave of value in AI won’t come from the models themselves, but from the applications built on top of them. LLMs, which are trained on vast datasets, are becoming increasingly commoditized as more companies build their own variations tailored to specific regions and languages.
Key Takeaways:
- LLMs will become more commonplace, with new models emerging globally.
- The focus will shift to building applications that utilize LLMs, unlocking more value.
- Enterprise AI adoption is slower than consumer AI, requiring companies to rethink their internal operations.
The Rise of Localized LLMs and the Commoditization of Models
Nilekani believes that the rise of LLMs is only the beginning. As the technology matures, the focus will shift towards building customized applications that leverage these powerful models.
"There are many companies now coming up which are building India-specific LLM solutions for Indian languages … so I think what’s going to happen is that because AI is finally dependent on the data it is trained on, every part of the world which has unique data will be required to do something about training the models for that data," Nilekani shared with CNBC.
This localized approach to AI development signals a shift toward broader access to, and use of, the technology. As LLMs are tailored to specific regions and languages, they become more relevant and valuable to local users, which drives adoption and opens the door to a wider range of applications.
The commoditization of LLMs doesn’t make them irrelevant; it marks the next stage in the evolution of the AI landscape. Nilekani predicts that "the models will become more commoditized and the value will switch to the application layer and the whole stack." In other words, while core LLMs become readily accessible, the real innovation and profit will come from building distinctive, effective applications on top of them.
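To make the "application layer" idea concrete, here is a minimal, hypothetical Python sketch. It is not Nilekani's design or any vendor's API: the names `LLM`, `EnglishModel`, `HindiModel`, and `SupportAssistant` are illustrative assumptions. The point it illustrates is that when the model is treated as a swappable commodity behind a common interface, the differentiated logic, and therefore the value, sits in the application code.

```python
# Hypothetical sketch: an application layer over commoditized, swappable LLMs.
# All names here are illustrative assumptions, not real products or APIs.
from typing import Protocol


class LLM(Protocol):
    """Any commoditized model backend the application can plug in."""
    def complete(self, prompt: str) -> str: ...


class EnglishModel:
    # Stand-in for a general-purpose hosted model.
    def complete(self, prompt: str) -> str:
        return f"[english-model] response to: {prompt}"


class HindiModel:
    # Stand-in for a region- and language-specific model of the kind
    # Nilekani describes being built for Indian languages.
    def complete(self, prompt: str) -> str:
        return f"[hindi-model] response to: {prompt}"


class SupportAssistant:
    """The application layer: prompt design, routing, and domain logic
    live here, independent of whichever model sits underneath."""

    def __init__(self, model: LLM):
        self.model = model

    def answer(self, question: str, language: str) -> str:
        prompt = f"Answer the customer question in {language}: {question}"
        return self.model.complete(prompt)


if __name__ == "__main__":
    # The same application code runs unchanged on either backend,
    # which is where the durable, differentiated value accumulates.
    app = SupportAssistant(HindiModel())
    print(app.answer("Where is my order?", "Hindi"))
```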
The Slow Burn of Enterprise AI
While consumer-facing AI applications are proliferating rapidly, Nilekani notes that enterprise AI adoption is a slower process:
"Consumer AI you can get up a chatbot and start working. Enterprise AI requires firms to reinvent themselves internally. So it’s a longer haul, but definitely it’s a huge thing happening right now," Nilekani explained.
This distinction highlights the complexity of integrating AI into business operations. Enterprise AI demands a deeper transformation: companies must re-evaluate their workflows, data management, and overall strategy, and that kind of change requires careful planning and sustained implementation before it delivers tangible benefits.
The Future of AI: From Models to Applications
Nilekani’s insights into the commoditization of LLMs and the shift toward application development paint a clear picture of the future of AI. While powerful LLMs are crucial building blocks, the real potential lies in how they are put to use.
The race now is to develop innovative applications that leverage LLM capabilities to solve real-world problems, improve user experiences, and drive business efficiency. As AI technology continues to evolve, we can expect to see a flourishing ecosystem of diverse applications that cater to every aspect of life, from personal productivity and entertainment to complex business operations.
This shift towards application development also underscores the importance of data. Nilekani emphasizes that the data a model is trained on is crucial to its effectiveness. The rise of localized LLMs further highlights this point, demonstrating how data unique to specific regions can unlock new opportunities for AI-powered solutions.
As AI becomes increasingly integrated into our lives, finding innovative ways to leverage this powerful technology will be essential. The future of AI lies in the development of creative and impactful applications, and those who can build these solutions will ultimately shape the future of the technology.