Facebook Protected Hateful, Far-Right Pages Despite Rule Violations
A new television documentary alleges that Facebook, at least for a time, gave special protection to right-wing pages that clearly violated the site’s rules. The newest episode of the documentary series “Dispatches” on Britain’s Channel 4 found that Facebook kept controversial, rule-breaking pages around for longer than it should have because of their popularity, according to the Guardian.
“Dispatches” sent an undercover reporter to work for CPL, a Dublin-based third-party contractor that Facebook uses for content moderation. The episode, set to air Tuesday night, reveals that Facebook allowed pages attached to violent, far-right entities like the Britain First party and activist Tommy Robinson to remain live despite clear breaches of site standards.
According to a moderator who appears in the episode, Britain First was allowed to keep going simply because its large follower count generated revenue for the site.
The Britain First party, widely described as a fascist political organization, had a record of hostility toward Muslims before it was deregistered by the UK’s Electoral Commission in November. Two of its main leaders were convicted of hate crimes in March, at which point the Facebook pages were finally taken down. Robinson’s story followed a similar arc and ended similarly: he was jailed for contempt of court in May.
However, according to information uncovered by “Dispatches,” their Facebook pages were allowed to stay up because of their popularity. A typical page is removed if five or more of its posts break Facebook’s guidelines, but highly popular pages are held to a different standard.
This arrangement is known as “shielded review,” according to the Guardian. It gives popular pages protected status, meaning decisions about them are made by people who work directly for Facebook rather than by third-party contractors. Even when Robinson or pages affiliated with Britain First repeatedly broke the rules, “shielded review” could keep them afloat.
Facebook responded to Channel 4’s report in a blog post Tuesday. The company acknowledged that the moderation process in the Dublin office where the undercover reporter worked was not appropriate and promised to do better in the future.
“It has been suggested that turning a blind eye to bad content is in our commercial interests. This is not true. Creating a safe environment where people from all over the world can share and connect is core to Facebook’s long-term success. If our services aren’t safe, people won’t share and over time would stop using them. Nor do advertisers want their brands associated with disturbing or problematic content.”