Facebook internal guides on content moderation revealed in recent leak
An investigation by the Guardian gives a first look at Facebook’s content moderation manuals, covering hate speech, terrorism, pornography, racism, and more. Facebook moderators often have less than 10 seconds to review and act on a reported post, so the risk of inconsistent judgment is very real. But, hey, with more than 6.5 million reported posts a week, those folks are not having a stroll in their 9-to-5. Still, some of the rules seem quite confusing and look set to fuel public debate over how the company manages content.
In one of the leaked documents, Facebook acknowledges that ‘people use violent language to express frustration online’ and feel ‘safe to do so’ on the site. ‘I hope someone kills you’, for example, is regarded by the rules as generic or not credible. However, if a post threatens a specific living person or a location, the moderator is to delete it and follow steps to help track down the poster. The guidelines also provide special protection to ‘vulnerable groups’, such as foreigners, heads of state or candidates for the position, Zionists, and, behold, drug dealers in the Philippines.
Another debatable Facebook practice is allowing people to livestream attempts at self-harm, because the company “doesn’t want to censor or punish people in distress or who are attempting suicide”. The social network cited experts in saying it was “best for these people’s safety to let them livestream, as long as they are engaging with viewers.”
Things that are NOT allowed: