How Does Facebook Decide What To Delete? Secret Content Removal Policies Revealed By Leak In Germany
Facebook has long stayed mum about how it manages and removes content, but a new report has revealed the social media giant's “secret” content removal policies, Digital Trends reported Tuesday.
Founder Mark Zuckerberg’s social media empire, built on nearly two billion users around the world, faced and settled a lawsuit in Germany last month over how it manages hate speech. Those proceedings led German newspaper Süddeutsche Zeitung to release excerpts of documents given to Facebook staff and third-party content moderators who work with the site.
These new revelations, while offering a far more detailed account of Facebook’s content management protocols than anything the company has shared publicly, remain vague in places, according to Digital Trends. They also arrive at a time when social media’s influence, and its role in spreading misleading and fake news to users during the presidential election, has come under the microscope.
Regarding hate speech, Facebook evidently disallows “verbal attacks” and breaks users down into categories based on sexual orientation, religion, gender, race, ethnicity and others, with subcategories for each.
Certain phrases linked to members of these categories raise red flags, and Facebook then attempts to shut them down. Specifically, a phrase like “[expletive] Muslims” is not allowed, and while the term “migrant” and the statement “migrants are dirty” are permitted, saying “migrants are dirt” doesn’t fly.
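The distinction the leak describes, blocking “migrants are dirt” while permitting “migrants are dirty,” amounts to a phrase-level rule set. The Python sketch below is purely a hypothetical illustration of how such blocklist-style matching might look; the phrase lists, function names and matching logic here are assumptions for illustration, not Facebook’s actual implementation.

```python
# Hypothetical sketch of phrase-level moderation rules (NOT Facebook's real system).
# The blocked/allowed examples are only those quoted from the leaked documents;
# the expletive is left redacted, as in the original report.

BLOCKED_PHRASES = {
    "[expletive] muslims",  # expletive aimed at a protected category: disallowed
    "migrants are dirt",    # calling a group "dirt" is treated as a verbal attack
}

ALLOWED_EXAMPLES = {
    "migrant",              # the term itself is permitted
    "migrants are dirty",   # reportedly treated as opinion rather than attack
}

def is_blocked(post: str) -> bool:
    """Return True if the post contains a phrase on the blocklist."""
    text = post.lower()
    return any(phrase in text for phrase in BLOCKED_PHRASES)

if __name__ == "__main__":
    for example in ["Migrants are dirty", "Migrants are dirt"]:
        verdict = "removed" if is_blocked(example) else "kept"
        print(f"{example!r} -> {verdict}")
```

Even in this toy form, the example shows why moderators describe the rules as hard to apply: a one-word difference flips the outcome.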
Another SZ report, released Thursday, highlighted a Facebook office in Berlin set up to monitor content in response to criticism from Germany and other European countries over how the company handles hate speech. The 600-member staff is poorly paid and confused about the content management policies, the report continued.
“The rules are almost impossible to understand. I’ve said to my team leader: this is crazy!” a member of the Berlin staff told SZ. “The picture is full of blood and brutality, no one should have to see that. But he said: that’s just your opinion. You have to try and think about what Facebook wants. We’re expected to think like machines.”