Africa’s first multilingual small language model, InkubaLM, has been compressed by 75% without losing performance – making it more efficient for low-resource environments. The breakthrough came from the Buzuzu-Mavi Challenge, a global competition hosted by Lelapa AI in partnership with Zindi.
The challenge drew 490 participants from 61 countries, who competed to shrink InkubaLM, a model that supports African languages. All of the top winners were African, highlighting the continent’s growing AI expertise.
Cameroon’s Yvan Carré won first place by combining adapter heads, quantization, and knowledge distillation. South Africa’s Stefan Strydom placed second, slimming the model to 40 million parameters using vocabulary trimming and shared embeddings. The AI_Buzz team – Abdourahamane Ide Salifou, Mubarak Muhammad, and Victor Olufemi from Niger and Nigeria – took third, applying model distillation and blended datasets.
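To see why quantization alone can account for large savings, here is a minimal sketch of symmetric int8 post-training quantization. This is purely illustrative – it is not the winners’ actual pipeline, and the function names and matrix size are invented for the example – but it shows the core idea: storing weights as 8-bit integers plus one scale factor cuts storage to a quarter of float32, a 75% reduction.

```python
import numpy as np

def quantize_int8(weights):
    # Symmetric per-tensor quantization: map float32 weights
    # onto the int8 range [-127, 127] using a single scale.
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Approximate reconstruction of the original weights.
    return q.astype(np.float32) * scale

# A toy weight matrix standing in for one model layer.
rng = np.random.default_rng(0)
w = rng.standard_normal((256, 256)).astype(np.float32)

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# int8 storage is one quarter of float32: a 75% size reduction.
print(w.nbytes // q.nbytes)            # 4
print(float(np.abs(w - w_hat).max()))  # small per-weight rounding error
```

In practice a competitive entry would combine this with distillation (training the small model to match a larger teacher’s outputs) and vocabulary trimming, since quantization by itself changes storage format but not parameter count.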
InkubaLM’s lightweight design is a major win for Africa, where only 33% of the population has regular internet access and 70% of users rely on entry-level smartphones. Smaller models can power tools in education, agriculture, translation, and customer service – without constant connectivity.
“This isn’t just technical progress – it’s proof that inclusive, African-built AI is possible,” said Lelapa AI CEO Pelonomi Moiloa.
“These models show how much can be done with less,” added Zindi CEO Celina Lee.
The most promising entries will inform future open-source versions of InkubaLM, with Lelapa AI and Zindi calling for continued collaboration to keep advancing African AI.