Facebook updates its community standards to clarify what it will remove

The social network updated its content guidelines to more clearly explain what is and isn’t acceptable on its service, as it balances free speech with removal of offensive content.

Facebook clarified its real name policy, offering something of a response to complaints that the social network discriminated against users who chose not to use their legal names online, or whose names Facebook deemed unacceptable. In a post linking to the updated community standards, the company also specified how it selects content for removal and explained how it responds to government requests for information.

The updated policy reiterates Facebook’s stance against harassment and provides “more guidance on policies related to self-injury, dangerous organizations, bullying and harassment, criminal activity, sexual violence and exploitation, nudity, hate speech, and violence and graphic content.”

Facebook’s definition of hate speech, for example, covers content that directly attacks people based on their race, ethnicity, national origin, religion, sexual orientation, sex, gender or gender identity, or serious disabilities or diseases.

“Sometimes people share content containing someone else’s hate speech for the purpose of raising awareness or educating others about that hate speech,” the revised guidelines explain.

When it comes to public figures, Facebook maintains that it allows “open and critical discussion” of celebrities and people featured in the news, while warning that it will remove “credible threats to public figures, as well as hate speech directed at them – just as we do for private individuals”.

Facebook has also said that it may restrict content in specific geographic areas in accordance with local laws, even if that content doesn’t violate Facebook’s standards. Citing the example of blasphemy, Facebook said, “if a country requests that we remove content because it is illegal in that country, we will not necessarily remove it from Facebook entirely, but may restrict access to it in the country where it is illegal.”
