YouTube is under fire: A new lawsuit claims that it failed to support the content moderators who watch and remove inappropriate or violent videos uploaded to the site.
The lawsuit was filed Monday by a former content moderator who says she had to watch videos of beheadings, shootings, child abuse, and other disturbing content, according to NBC News. As a result, and because YouTube didn’t offer medical support, she experienced nightmares and panic attacks, and found herself unable to stay in crowded areas.
The anonymous plaintiff, who, like other moderators, was employed through a third-party staffing agency, alleges that moderation teams were understaffed to the point that workers often had to exceed the recommended four hours per day of scanning violent videos, NBC reports. That comes out to between 100 and 300 videos per day, with very little room for error.
Meanwhile, the suit claims, the YouTube “Wellness Coaches” available to content moderators only worked limited hours and weren’t actually licensed to offer medical advice, leaving moderators to find and pay for their own mental healthcare.
The lawsuit echoes content moderation suits filed against Facebook, further illustrating tension around how the tech industry moderates user-generated content.
The problem isn’t going away any time soon. YouTube plans to rely more heavily on human moderators after its algorithmic content moderation system made too many errors, the Financial Times reports.
At the same time, it’s tough to prescribe a better strategy for tech giants. Sure, they could hire their content moderators directly, pay them well, and provide them with benefits and mental health support. But artificial intelligence isn’t yet good enough to replace human judgment in policing horrible content — so, for now, someone has to do it.
READ MORE: Former YouTube content moderator describes horrors of the job in new lawsuit [NBC News]
More on content moderation: Facebook Moderators Are Dying at Their Desks