With over 2 billion users, Facebook has a major moderation problem on its hands.
Whether it's the platform's exploitation by Russian government-backed trolls during the 2016 US presidential election, its use to spread propaganda during the Rohingya genocide in Myanmar, or a shooter livestreaming a mass shooting in New Zealand, Facebook has faced moderation issue after moderation issue over the past few years.
And the company is well aware of the scale of its problem. "One of the most painful lessons I've learned," CEO Mark Zuckerberg wrote in late 2018, "is that when you connect two billion people, you will see all the beauty and ugliness of humanity."
As a result, Facebook is establishing an oversight board that it says will operate outside the company's control and can ultimately overrule Facebook's own content decisions. The company has pledged $130 million to fund the board and get it operational, with plans to launch in 2020.
Here's everything we know so far: