Facebook internal guides on content moderation revealed in recent leak
In one of the leaked documents, Facebook acknowledges that 'people use violent language to express frustration online' and feel 'safe to do so' on the site. A statement like 'I hope someone kills you', for example, is treated by the rules as generic or not credible. However, if a post threatens a specific living person or location, the moderator is instructed to delete it and follow steps to help track down the poster. The guidelines also extend special protection to 'vulnerable groups', such as foreigners, heads of state and candidates for that office, Zionists, and, surprisingly, drug dealers in the Philippines.
Another contested practice is Facebook's policy of allowing people to livestream attempts at self-harm, because the company 'doesn't want to censor or punish people in distress or who are attempting suicide'. Citing expert advice, the company said it was 'best for these people's safety to let them livestream, as long as they are engaging with viewers'.
With a pool of 2 billion content-hungry users, Facebook needs clear guidelines on what content is tolerated and what is not. Its approach to moderating content is presumably grounded in reason and experience. Even so, advocates of free speech, as well as critics concerned with social ethics, may still have something to say about Facebook's leaked manuals.
source: The Guardian