Sarvam AI Launches 24B Parameter Open-Source LLM for Indian Languages and Reasoning Tasks

This release follows Sarvam's selection by the Indian government to build a sovereign LLM under the IndiaAI Mission, marking the first step in strengthening the country's domestic AI capabilities

Bengaluru-based AI startup Sarvam AI has introduced its flagship large language model (LLM), Sarvam-M, a 24-billion-parameter open-weights hybrid model built on Mistral Small. Designed with a focus on Indian languages and advanced reasoning capabilities, Sarvam-M is intended to power applications such as conversational agents, machine translation, and educational tools.
According to Sarvam, the model was refined through a three-step process: Supervised Fine-Tuning (SFT), Reinforcement Learning with Verifiable Rewards (RLVR), and inference optimisation. The SFT stage used diverse, high-quality prompts to train the model in both general dialogue and complex reasoning. RLVR further improved its instruction-following and mathematical capabilities using custom reward engineering and curated datasets. Inference was then optimised with FP8 post-training quantisation and techniques such as lookahead decoding.
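For context on what that last step involves, the sketch below shows the general idea behind FP8 post-training quantisation in PyTorch: a weight tensor is rescaled into the representable range of an 8-bit floating-point format and cast down, roughly halving weight memory compared with 16-bit storage. This is a minimal illustration of the technique under simple per-tensor scaling assumptions, not Sarvam's actual inference stack.

```python
import torch

def fp8_quantize(weight: torch.Tensor):
    """Per-tensor FP8 (e4m3) quantisation: scale weights into the FP8 range, then cast down."""
    fp8_max = torch.finfo(torch.float8_e4m3fn).max          # ~448 for the e4m3 format
    scale = weight.abs().max().clamp(min=1e-12) / fp8_max   # one scale for the whole tensor
    return (weight / scale).to(torch.float8_e4m3fn), scale

def fp8_dequantize(q: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    """Recover an approximate full-precision tensor for computations that need it."""
    return q.to(torch.float32) * scale

if __name__ == "__main__":
    w = torch.randn(4096, 4096)                  # stand-in for one weight matrix
    q, scale = fp8_quantize(w)
    err = (w - fp8_dequantize(q, scale)).abs().mean().item()
    print(f"bytes per element: {w.element_size()} -> {q.element_size()}")
    print(f"mean absolute quantisation error: {err:.6f}")
```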
Sarvam-M has demonstrated strong performance on multilingual and reasoning benchmarks. Relative to the base model, it achieved an 86 per cent gain on a romanised Indian-language version of the GSM-8K math dataset, along with average improvements of 20 per cent on Indian language benchmarks, 21.6 per cent on math, and 17.6 per cent on programming tasks. It outperforms Llama-4 Scout and is on par with models such as Llama-3.3 70B and Gemma 3 27B, though it slightly lags on English benchmarks such as MMLU.
The model is accessible via Sarvam's API and playground, and is available for download on Hugging Face. This release follows Sarvam's selection by the Indian government to build a sovereign LLM under the IndiaAI Mission, marking the first step in strengthening the country's domestic AI capabilities.
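For developers who want to try the open weights, loading the model with the Hugging Face transformers library would look roughly like the sketch below. The repository id sarvamai/sarvam-m and the chat-template call are assumptions and should be checked against Sarvam's Hugging Face page.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "sarvamai/sarvam-m"  # assumed repo id; confirm on Sarvam's Hugging Face page

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto", device_map="auto")

# Example prompt exercising the model's Indian-language focus.
messages = [{"role": "user", "content": "Translate to Hindi: Where is the nearest railway station?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```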