YouTube Moderation Might Be Causing PTSD In Human Workers
The sheer volume of inappropriate or disturbing videos posted to YouTube every day presents a monumental task for the platform’s human moderators. Now, the impact this task may be having on these employees is coming to light thanks to a California lawsuit.
Filed on Monday in a California superior court, the suit alleges that the task of sifting through videos with violent and disturbing content led one former moderator for the Google-owned company to develop depression and other symptoms consistent with PTSD. She is now seeking compensation for her suffering, money to pay for mental health treatment, and for YouTube to establish better programs to address employee trauma.
The suit claims that the unnamed employee frequently saw videos of child abuse and beheadings, including clips depicting such horrific content as “people eating from a smashed open skull, school shootings with dead children, a fox being skinned alive and a person’s head getting run over by a tank,” CNET reports. She says she now suffers from anxiety, panic attacks, and a fear of open spaces where mass shootings might occur. The anxiety allegedly caused by the moderation work has also cost her friendships and left her afraid to have children.
“She has trouble sleeping and when she does sleep, she has horrific nightmares. She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind,” the lawsuit reads.
The suit specifically alleges that YouTube, in violation of California law, failed to create a safe work environment and did not properly address the mental health needs of employees handling such graphic content. It further claims that the company’s moderation team is “chronically understaffed,” forcing each employee to review 100-300 videos a day.
YouTube recently had to rely more heavily on automated moderation when employees were sent home during the COVID-19 outbreak. As a result, thousands of innocuous videos were mistakenly taken down by the algorithms. While YouTube has not confirmed any change in approach, these missteps could lead to a greater reliance on human moderators going forward.
© Copyright IBTimes 2024. All rights reserved.