
Africa’s First Multilingual SLM Shrinks by 75%—A Triumph for Local AI Expertise


Africa’s first multilingual Small Language Model (SLM), InkubaLM, has just been compressed to a quarter of its original size while maintaining performance, thanks to African AI expertise.

In a continent where internet penetration is only 33% and 70% of people use entry-level devices, lightweight AI is not a luxury but a necessity. Smaller models such as InkubaLM can run on low-cost devices, operate without continuous connectivity, and power real-world applications in translation, education, agriculture, and customer service.

The Buzuzu-Mavi Challenge, hosted by Lelapa AI and Zindi, saw over 490 participants from 61 countries compete to compress InkubaLM while maintaining its multilingual capabilities. All top winners were African, highlighting the continent’s AI innovation potential.

This milestone represents a significant step forward in accessible, efficient AI for low-resource situations across the continent.

“This challenge isn’t simply about technical progress; it reflects our deeper mission at Lelapa AI: to build AI that is inclusive, accessible, and grounded in African realities. The Buzuzu-Mavi Challenge affirms what we’ve always believed—when AI is designed with Africa in mind, it becomes both technically excellent and deeply transformative.

And when African talent is trusted with meaningful challenges, the results are not just outstanding; they’re a glimpse into the future we’re building for and from the continent.” – Pelonomi Moiloa, CEO, Lelapa AI

Meet the Winners:

  1. Yvan Carré from Cameroon compressed InkubaLM using adapter heads, quantization, and knowledge distillation, cutting memory requirements while preserving the model’s capabilities.

  2. Stefan Strydom, hailing from South Africa, was able to reduce the model to just 40 million parameters by trimming vocabulary (removing infrequent words), reducing layers (streamlining the structure), and sharing embeddings (reusing components to save space).

  3. Team AI_Buzz (Abdourahamane Ide Salifou, Mubarak Muhammad, and Victor Olufemi) from Nigeria and Niger built a 177M-parameter student model using dataset blending and knowledge distillation, achieving a substantial size reduction with solid performance.
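Knowledge distillation, used by two of the winning entries, trains a small "student" model to match the softened output distribution of a larger "teacher". The exact training setups of the winners are not public; the sketch below only illustrates the standard distillation loss (temperature-scaled softmax plus KL divergence, following Hinton et al.'s formulation) in plain Python, with made-up logits for illustration.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher T yields a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions.

    Scaled by T^2 so gradient magnitudes stay comparable across
    temperatures, per the standard distillation convention.
    """
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Illustrative logits (not from InkubaLM): a student that matches the
# teacher incurs zero loss; a mismatched one incurs a larger loss.
teacher = [2.0, 0.5, -1.0]
aligned = distillation_loss(teacher, [2.0, 0.5, -1.0])
mismatch = distillation_loss(teacher, [-1.0, 0.5, 2.0])
```

During training this loss is typically mixed with the ordinary cross-entropy on ground-truth labels, letting the student learn both the hard targets and the teacher's richer inter-class similarities.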

“It is a joy and a privilege for us at Zindi to partner with Lelapa AI on the Buzuzu-Mavi Challenge. Seeing the impact that our incredible community of AI builders can have on a truly African problem is inspiring and rewarding in its own right, but even better, these solutions showcase what African innovators can do in the language model space. In a world where the state of the art requires ever larger language models, we’re proud to show the world that more can be done with less.” – Celina Lee, co-founder and CEO, Zindi
