When it comes to monetizing AI models, startups face a significant disadvantage against large tech companies such as Microsoft and Amazon Web Services (AWS). With their vast resources, these giants can bundle comparable features into their platforms for free, undercutting smaller innovators working on similar problems.
Abhinav Aggarwal, founder of Fluid AI, described a common scenario at the Global IndiaAI Summit on Thursday: “It’s a very small layer; all the magic happens in the LLM if you’re using the ChatGPT API or some mysterious API. And then it becomes a very thin layer to make it available to the end user… Then after six months to a year, that wrapper gets deprecated. Microsoft releases it for free, AWS releases it for free with their solution, and it gets deprecated,” he said.
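The “thin wrapper” Aggarwal describes can be pictured as little more than a pass-through to a hosted model. The sketch below is illustrative only, assuming the OpenAI Python client; the model name and prompt are placeholders, not any particular startup’s product:

```python
# A minimal "thin wrapper": all the heavy lifting happens inside the hosted LLM,
# and the startup's own code is little more than a pass-through.
# Illustrative only -- assumes the OpenAI Python client; the model name is a placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def answer(user_question: str) -> str:
    """Forward the user's question to the hosted model and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[{"role": "user", "content": user_question}],
    )
    return response.choices[0].message.content

print(answer("Summarise this invoice for me."))
```

Because there is so little proprietary logic in such a wrapper, a platform vendor can bundle the same capability into its own product and erase the wrapper’s value, which is the dynamic Aggarwal warns about.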
Aggarwal said that in such a climate it is hard for startups even to survive, let alone thrive. He suggested pivoting towards a more robust application layer. “How do you thicken it? As you go down, you go to the LLM side,” he advised. This shift involves both hardening the application layer to solve end use cases effectively and adding a fine-tuning layer around the language model.
Aggarwal further noted that this hybrid approach has proven beneficial for Fluid AI. “We are solving an end use case, such as making manufacturing plants more efficient, and we are building an application layer for that. But then we build a fine-tuning layer around the language model,” he said. “This strategy not only provides significant value to our customers, but also strengthens the startup against the threat of impending disruption.”
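One way to read the “application layer plus fine-tuning layer” idea is sketched below. The helper functions, plant data and fine-tuned model ID are hypothetical stand-ins for a startup’s own integrations, not Fluid AI’s actual stack:

```python
# A sketch of a "thicker" application layer for a manufacturing-efficiency use case.
# Hypothetical: retrieve_plant_history, validate_against_limits and the fine-tuned
# model ID are stand-ins for a startup's own domain logic and data.
from openai import OpenAI

client = OpenAI()

FINE_TUNED_MODEL = "ft:gpt-4o-mini:example-org:plants:abc123"  # placeholder fine-tuned model ID

def retrieve_plant_history(plant_id: str) -> str:
    """Pull sensor readings, downtime logs, etc. from the customer's systems (stubbed)."""
    return "Line 3 downtime: 4.2 hrs/week; energy use up 11% month-on-month."

def validate_against_limits(recommendation: str) -> str:
    """Apply safety and compliance rules before anything reaches the operator (stubbed)."""
    return recommendation  # real checks would reject or amend unsafe suggestions

def recommend_efficiency_actions(plant_id: str, question: str) -> str:
    context = retrieve_plant_history(plant_id)  # proprietary data integration
    response = client.chat.completions.create(
        model=FINE_TUNED_MODEL,  # the fine-tuning layer around the base LLM
        messages=[
            {"role": "system", "content": "You advise plant managers on efficiency."},
            {"role": "user", "content": f"Plant data:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return validate_against_limits(response.choices[0].message.content)
```

The value here sits in the data integration, the fine-tuned model and the domain checks rather than in the model call itself, which is what makes the layer harder for a platform vendor to replicate for free.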
To address the unique challenges of Indian use cases, Aggarwal said AI models need advanced reasoning capabilities. “What we’re struggling with in Indian models, at least so far, is that over the last year or two, most of these models or approaches have become agentic. Agentic means that the model has to think a little deeper: it has to go through six or seven steps of reasoning before it can give you an answer,” he said.
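A toy illustration of the multi-step reasoning he refers to is sketched below: rather than answering in one shot, the model is prompted through several intermediate steps before producing a final answer. The step count, prompts and model name are assumptions for illustration; real agent frameworks add tool use, memory and self-checking on top of this basic loop:

```python
# Toy "agentic" loop: the model works through several reasoning steps before answering.
# Illustrative only -- step count, prompts and model name are placeholders.
from openai import OpenAI

client = OpenAI()

def chat(prompt: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

def agentic_answer(question: str, steps: int = 6) -> str:
    notes = ""
    for i in range(1, steps + 1):
        # Ask for one more intermediate reasoning step, given everything so far.
        notes += f"\nStep {i}: " + chat(
            f"Question: {question}\nReasoning so far:{notes or ' (none)'}\n"
            f"Give only the next single reasoning step."
        )
    # Only after the intermediate steps does the model produce the final answer.
    return chat(f"Question: {question}\nReasoning:{notes}\nNow give the final answer.")
```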
On what sustainable AI adoption looks like for startups, he said, “When you have nothing is when you have everything. When you’re trying to solve a real problem, AI becomes one of the tools to solve that problem.” He believes this mindset will distinguish successful startups from those that simply chase the next cool trend in generative AI.