
Supercomputer and 12 billion parameters: Krutrim opens a new chapter in Indian AI
Bhavish Aggarwal, the billionaire founder of Ola, has announced a $230 million investment in his AI startup Krutrim, aiming to make India a leader in the global artificial intelligence race. According to sources cited by TechCrunch, the funding comes primarily through Aggarwal’s family office, with total investment planned to reach $1.15 billion by next year.
Krutrim marked the occasion by opening access to its AI models and announcing the construction of India’s largest supercomputer in partnership with Nvidia. The company also introduced Krutrim-2, a 12-billion-parameter language model specialized in processing Indian languages.
The benchmark results are notable: in sentiment analysis the model scored 0.95 versus 0.70 for competing models, and it achieved an 80% success rate on code-generation tasks. It also reached 0.98 on grammar correction and 0.91 on multi-turn conversations. On the technical side, the model offers a 128,000-token context window, enough to process long documents and sustain extended dialogues.
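For readers who want to experiment with the released weights, here is a minimal sketch, assuming the models are distributed through the Hugging Face Hub and can be loaded with the transformers library; the model identifier below is a placeholder, not a name confirmed in this article.

```python
# Minimal sketch of loading an open-weights model and prompting it in Hindi.
# MODEL_ID is a hypothetical placeholder; replace it with the identifier
# Krutrim actually publishes for Krutrim-2.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "krutrim/krutrim-2"  # hypothetical, not confirmed by the article

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    device_map="auto",   # spread the 12B parameters across available devices
    torch_dtype="auto",  # use the dtype stored in the checkpoint
)

# An Indian-language prompt, reflecting the model's stated specialization.
prompt = "भारत की राजधानी क्या है?"  # "What is the capital of India?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```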
“We are still far from global standards, but we have made good progress in a year,” Aggarwal noted. “By opening access to our models, we hope the entire Indian AI community will collaborate with us to build a world-class ecosystem.”
Krutrim has also developed its own evaluation suite, BharatBench, for testing how effectively AI models handle Indian languages, filling a gap left by existing benchmarks, which focus primarily on English and Chinese.
The initiative comes as India seeks to strengthen its position in artificial intelligence, a field dominated by American and Chinese companies. India recently welcomed the progress of the Chinese company DeepSeek and announced plans to host its language models on domestic servers; Krutrim’s cloud division has already begun offering access to DeepSeek’s models hosted in India.