Facebook will implement a new policy, starting in Sri Lanka and later expanding to Myanmar, under which it will remove ‘misinformation’ that could incite violence.
Under the new rules, Facebook said it would create partnerships with local civil society groups to identify misinformation for removal, the New York Times reports.
Facebook has been criticised for allowing misinformation and hate speech to spread on its platform.
The platform’s founder, Mark Zuckerberg, defended the company’s free speech policy with regard to issues such as Holocaust denial but said, “Let me give you an example of where we would take it down. In Myanmar or Sri Lanka, where there’s a history of sectarian violence.”
“We have identified that there is a type of misinformation that is shared in certain countries that can incite underlying tensions and lead to physical harm offline,” said Tessa Lyons, a Facebook product manager. “We have a broader responsibility to not just reduce that type of content but remove it.”
Facebook and WhatsApp were cited as catalysts for anti-Muslim violence that occurred earlier this year in Sri Lanka, although some observers pointed out that pogroms against Tamils and Muslims had been endemic to Sri Lanka long before the advent of social media and viral news.