More than 1,000 low-paid workers in Kenya have been abruptly dismissed by Sama, a Nairobi-based outsourcing firm that handled content moderation and AI training for Meta, after Meta ended the contract. Activists said the move highlighted how precarious tech jobs in the global south can be.
Reports last month said some Kenyan data annotators were asked to view footage captured by Meta’s Ray‑Ban smart glasses showing people using toilets or having sex. The Oversight Lab, an advocacy group for fair tech deployment in Africa, said that the affected workers—many doing AI training—were given six days’ notice and that it was advising them on legal options.
Earlier mass layoffs of Sama content moderators led to a 2024 civil lawsuit in which 140 former workers alleged they developed severe PTSD, depression and anxiety from repeatedly viewing disturbing online content.
Meta said it had paused work with Sama after the allegations, adding: “Photos and videos are private to users. Humans review AI content to improve product performance, for which we get clear user consent. We’ve also decided to end our work with Sama because they don’t meet our standards.”
Sama said it recognised the impact and was “supporting affected employees with care and respect,” and described itself as “a responsible corporate citizen,” saying teams received living wages, full benefits, wellness resources, medical benefits and on‑site counselling.
The Oversight Lab called the layoffs “devastating and shocking,” warning that current strategies harm young people, hurt the economy and do not advance Kenya’s role in the AI ecosystem. Former Sama worker Kauna Malgwi said the situation “is not confined to one company or contract,” and shows how power in the global AI industry sits with large tech firms while risk flows downward to outsourced workers in the global south, who have the least protection.
Last month a Los Angeles jury found that Meta’s Instagram and Google’s YouTube had deliberately designed addictive social media products that hooked a young user and led to her being harmed.
