A COALITION of lawyers in Ghana and the United Kingdom has launched an investigation into what they describe as ‘dire’ working conditions for content moderators employed to review harmful content for social media giant Meta in Accra, Ghana. The probe could become the latest in a series of labour controversies surrounding the Facebook and Instagram parent company in Africa, AFP reports.
Legal experts from Accra-based Agency Seven Seven and the UK non-profit Foxglove told AFP that workers have been subjected to ‘distressing’ and ‘bloody’ content — including images of sexual assault, child abuse, and murder — without adequate mental health support.
They are also probing claims that workers have been dismissed for attempting to unionise and raise concerns about the psychological toll of their work.
‘What we are talking about here is potential psychological injury,’ said Carla Olympio, founder and managing partner of Agency Seven Seven, who has recently met with affected workers.
Mental health crisis and retaliation claims
‘Everyone is suffering in terms of their mental health — whether that’s post-traumatic stress disorder, insomnia, depression, suicidal thoughts and more,’ Foxglove founder Martha Dark told AFP. She described the situation facing moderators in Ghana as ‘pretty dire’.
The Ghanaian hub is reportedly operated by Majorel, a subsidiary of the Paris-based outsourcing firm Teleperformance. According to the lawyers, approximately 150 moderators work from the Accra facility, reviewing disturbing posts for Meta under contract.
Dark added that moderators are often required to share employer-provided housing and face pressure to view excessive amounts of violent content in order to earn bonuses. A low base salary and unclear performance-based incentives, she said, encourage workers to push beyond safe psychological limits.
One worker, who relocated from East Africa, told The Guardian that the traumatic nature of the job led him to attempt suicide.
Pattern of abuse echoes Kenya case
The Ghana probe mirrors ongoing legal action in Kenya, where Meta’s now-closed Nairobi content moderation hub became the subject of lawsuits over alleged unlawful dismissals and mental health harms. In those cases, Foxglove also represents the affected workers, some of whom say they were fired for trying to unionise.
Dark pointed to international examples, including limits on exposure for child abuse investigators in Ireland, to argue that safer moderation practices are possible — if employers prioritise worker wellbeing and professional psychiatric care.
Meta did not respond to AFP’s request for comment. Teleperformance, the parent company of Majorel, also did not respond to AFP but told The Guardian it had ‘robust people management systems and workplace practices, including a wellbeing programme staffed by fully licensed psychologists.’
Meta similarly told the newspaper it took ‘the support of content reviewers seriously.’
However, the lawyers say that the secrecy surrounding the opening of the Accra moderation centre, together with recurring patterns of harm, raises broader questions about how Big Tech manages outsourced labour on the continent.
‘This is not just an isolated incident,’ said Olympio. ‘This is part of a system where African workers are being exposed to extreme trauma, with little support and zero accountability.’