Facebook has updated its Community Standards to help users understand what they can and can’t share on the social network.
The revised guidelines clarify Facebook’s policies on a variety of topics, including bullying, threats of violence, self-harm and hate speech. The company is attempting to strike a balance between blocking offensive content and allowing for freedom of expression on its network.
Facebook prohibits the harassment of other users, threatening sexual violence, promoting self-harm and supporting terrorist organizations. It also removes hate speech that attacks people based on their race, ethnicity, national origin, religious affiliation and sexual orientation.
The company says it attempts to review reported cases in context so as to maintain users’ freedom to share ideas. For example, before deciding to take down content depicting violence or human rights abuse, it will consider whether the material was posted for sadistic pleasure or to raise awareness of such issues.
If Facebook receives a request from a country to remove content because it is illegal there, it will not necessarily take the content down, but may restrict access to it within that country.
The company also says it saw an 11 percent increase in government requests for content restrictions in the second half of 2014 compared with the first half. Requests for account data increased only slightly, to 35,051 from 34,946 — with more requests coming from countries including India.
Facebook adds that it will “continue to push governments around the world to reform their surveillance practices in a way that maintains the safety and security of their people while ensuring their rights and freedoms are protected.”
➤ Explaining Our Community Standards and Approach to Government Requests [Facebook Newsroom]