Facebook’s ability to moderate sensitive or disturbing content – like suicide, child abuse and self-harm – was hampered by the COVID-19 pandemic, the company said Tuesday in its latest Community Standards Enforcement Report, which covers the second quarter of 2020.

Facebook said the shortage of human content moderators during quarantines weakened its enforcement against graphic content, including violence and sexual content involving both adults and children.

“With fewer content reviewers, we took action on fewer pieces of content on both Facebook and Instagram for suicide and self-injury, and child nudity and sexual exploitation on Instagram,” the company explained in the report. “Despite these decreases, we prioritized and took action on the most harmful content within these categories. Our focus remains on finding and removing this content while increasing reviewer capacity as quickly and as safely as possible.”

Facebook also reported improvements in the AI systems it uses to moderate certain kinds of content, including hate speech, harassment and terrorism. Despite these strides, the company conceded that human moderators are still needed for a wide variety of content.

Facebook said most of its moderators are now back at work, though only a handful are working in an office setting; the rest are working from home.

“As the COVID-19 pandemic evolves, we’ll continue adapting our content review process and working to improve our technology and bring more reviewers back online,” the report concluded.
