The Human Cost of Your Clean Feed: Meta's Content Moderators and the PTSD Crisis
Outsourced workers review thousands of violent and disturbing posts daily for a fraction of tech industry wages
Behind the algorithmic curation of your Facebook and Instagram feeds lies an army of human content moderators — predominantly outsourced workers in countries like Kenya, the Philippines, and India — who spend their shifts reviewing some of the most disturbing content imaginable. From graphic violence and child exploitation to terrorist propaganda and self-harm imagery, these workers serve as the first line of defense between harmful content and the billions of people who use Meta's platforms daily.
The working conditions facing these moderators have been extensively documented through lawsuits, investigative journalism, and whistleblower testimony. In a landmark 2020 settlement, Meta (then Facebook) agreed to pay $52 million to current and former content moderators in the United States who developed PTSD and related mental health conditions from their work.
Key Takeaways
- Meta paid $52 million to settle claims from US moderators who developed PTSD from reviewing graphic content
- Outsourced moderators in Kenya were paid as little as $1.50 per hour to review beheadings and child abuse imagery
- Workers review more than 1,000 posts per shift, often with less than 60 seconds per decision, and receive inadequate mental health support