An AI-driven system used to determine how much Kenyans must pay for healthcare has systematically driven up costs for the country’s poorest households, an investigation has found, with the algorithm overestimating the incomes of low-income families while underestimating those of wealthier households.
The system underpins Kenya’s Social Health Authority, launched in October 2024 as a flagship campaign promise of President William Ruto and intended to replace Kenya’s decades-old national insurance scheme. Billed as “accelerating digital transformation,” the program aimed to expand healthcare access to the country’s large informal economy — including day labourers, hawkers, farmers and other non-salaried workers, who together account for 83% of Kenya’s workforce.
“No Kenyan will be left behind,” Ruto told a Kericho stadium during his 2023 presidential campaign. But the rollout has instead sparked protests as healthcare contributions are now calculated using a formula critics describe as flawed and opaque.
Although Ruto has described the system as AI-powered, it does not rely on the recent advances in artificial intelligence that underpin large language models such as ChatGPT. Instead, it uses a predictive machine-learning algorithm that performs means testing to determine contributions for millions of households.
Through months of investigation, reporters at Africa Uncensored, working in collaboration with Lighthouse Reports and the Guardian, obtained details of the system and conducted an audit of how it operates. The findings show that, from the outset, the system was systematically overcharging the poorest Kenyans by overestimating their incomes, while undercharging the wealthiest by underestimating theirs.
Grace Amani, a community volunteer, sees the consequences daily. She visits households across Nairobi to walk residents through dozens of questions on a digital questionnaire — asking about the type of toilet they use, what their roof is made of, whether they own a radio. The form is then submitted to the algorithm, which calculates the household’s annual public health insurance premium. People are often confused, with some fearing they are under investigation.
“People are dying, people are suffering,” Amani said. She has watched families who struggle to feed themselves be charged premiums of between 10% and 20% of their meagre incomes, and seen critically ill people go without treatment because they cannot meet the amount the algorithm assigns them. “Many people have been unable to go to hospital. Will they pay SHA, or pay for food, or pay for the small house they live in?”
David Khaoya, a health economist who advised Kenya’s health ministry, said the system’s structural constraints had forced a choice: it could correctly assess poor households or correctly assess rich ones. The government, he said, opted to prioritize accurately evaluating the wealthy — even at the cost of overcharging the poor. “If you identify a richer person as poor and therefore ask him to pay less, this person will never own up and say, ‘I’m actually supposed to be paying more,’” Khaoya said.
The Kenyan algorithm is built on a long-standing World Bank methodology known as proxy means testing, which estimates household incomes from observable possessions and life circumstances such as housing materials, livestock or family size. Stephen Kidd, a development economist, said the model has been deployed “all over Africa, all over Asia and the Pacific,” often as a condition for governments to receive World Bank loans.
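The mechanics of proxy means testing can be sketched as a simple scoring model: observable indicators are weighted and summed to predict income, and a premium is then set as a share of that prediction. The indicators, weights, and 2.75% rate below are illustrative assumptions for demonstration only; the actual SHA model’s features and coefficients are not public.

```python
import math

# Hypothetical coefficients on a log-income scale, of the kind a survey
# regression might produce. These values are invented for illustration.
WEIGHTS = {
    "intercept": 10.5,        # baseline log annual income (KSh)
    "iron_roof": 0.30,        # iron-sheet roof
    "electricity": 0.45,      # household has electricity
    "owns_radio": 0.15,
    "pit_toilet": -0.25,      # uses a pit latrine
    "household_size": -0.08,  # per additional member
}

def predict_income(household: dict) -> float:
    """Estimate annual income from observable proxies (a proxy means test)."""
    score = WEIGHTS["intercept"]
    for feature, weight in WEIGHTS.items():
        if feature != "intercept":
            score += weight * household.get(feature, 0)
    return math.exp(score)  # back-transform from the log scale

def annual_premium(household: dict, rate: float = 0.0275) -> float:
    """Contribution set as a flat share of *predicted*, not actual, income."""
    return rate * predict_income(household)

# A smallholder farmer who owns a home with electricity can be scored as
# far wealthier than they are, because the proxies never see cash income.
farmer = {"iron_roof": 1, "electricity": 1, "pit_toilet": 1, "household_size": 5}
print(f"Predicted income: KSh {predict_income(farmer):,.0f}")
print(f"Annual premium:   KSh {annual_premium(farmer):,.0f}")
```

The sketch makes the failure mode in the audit concrete: the premium depends entirely on the predicted income, so any household whose possessions overstate its earnings is mechanically overcharged.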
But research has repeatedly shown the systems do not work as intended. Kidd has tested similar schemes in Indonesia and Rwanda, finding error rates of 82% and 90% respectively in identifying intended beneficiaries. Across Africa, Asia and Latin America, proxy means testing algorithms have become popular tools for determining who qualifies for cash transfers, food subsidies and other benefits — but in attempting to classify populations as “poor” or “not poor,” they routinely produce significant misclassifications.
The Africa Uncensored and Lighthouse Reports audit found that Kenya’s Social Health Authority appears to overcharge more than half of poor households, while wealthier households’ incomes are underestimated. For two farmers tested, the algorithm predicted their income at twice the actual figure, based on the fact that they own their home and have electricity. Kidd argues that poverty is a fluid category — and using indicators such as an iron roof or a pit toilet to estimate household wealth is intrinsically imprecise. The opacity of the algorithms also erodes public trust. “It feels like a lottery,” Kidd said. “The lottery is not a great way of building trust.”
The system’s failures appear to have been anticipated. A report by the international data consultancy IDinsight, obtained by reporters and shared with the government before implementation, warned that the system was “inequitable, particularly for low-income households” and that its wealth-determination basis “over-represents middle-income households and has very few data points from poverty pockets.” The report also said the system was “out-of-date with the current socioeconomic condition” in Kenya given the “multiple economic shocks” the country had absorbed.
Despite the warnings, the Social Health Authority was deployed as designed. Of more than 20 million people registered, only 5 million are regularly paying their premiums. Some hospitals are reporting large deficits as promised reimbursements from the authority remain unpaid. In March, former Deputy President Rigathi Gachagua predicted that “SHA will collapse in another six months.”
Brian Lishenga, chair of Kenya’s Rural and Urban Private Hospitals Association, has emerged as one of the system’s most vocal critics. “This is an experiment that has failed,” he said. “It’s a really poor tool for identifying poor households. It’s a great tool for helping the government run away from responsibility. A very great tool for that.”
Note: One of the volunteers in this investigation is identified by a pseudonym in the original source.