Blood on the Platform: How Meta's Moderation Failures Contributed to Genocide in Myanmar
A UN investigation found Facebook played a "determining role" in atrocities against the Rohingya while Meta's response remained inadequate for years
In 2018, a United Nations fact-finding mission delivered one of the most damning assessments ever leveled against a technology company: Facebook had played a "determining role" in the genocide against the Rohingya Muslim minority in Myanmar. The platform had been used systematically to spread hate speech, incite violence, and coordinate attacks against a vulnerable population. Despite repeated warnings from civil society organizations, Meta failed to moderate this content, and that failure contributed to atrocities that displaced over 700,000 people and brought mass killings, sexual violence, and the destruction of entire communities.
The dynamics in Myanmar were particularly dangerous because Facebook had become effectively synonymous with the internet for most of the country's population.
Key Takeaways
- A UN fact-finding mission found Facebook played a "determining role" in genocide that displaced over 700,000 Rohingya people
- At the peak of the crisis, Facebook employed only a handful of Burmese-speaking moderators for a user base of more than 20 million
- Civil society organizations had warned Facebook about dangerous Myanmar content for years before the company took meaningful action