A TikTok moderator has filed a class-action lawsuit against the company, claiming that daily exposure to graphic videos has left her suffering from post-traumatic stress disorder. She is demanding that the social media giant set up a medical fund for its moderators.
Bloomberg reports that a Las Vegas woman named Candie Frazier is suing TikTok and its parent company ByteDance, claiming that her job as a moderator for the hit app has harmed her mental health through the constant screening of violent videos.
As part of her job as a moderator, Frazier is exposed to a stream of graphic videos containing subjects such as child pornography, sexual crimes, beheadings, animal mutilations, and more. In her lawsuit, Frazier adds that she has had to endure “freakish cannibalism, crushed heads, school shootings, suicides, and even a fatal fall from a building, complete with audio.”
Moderators are required to work quickly, watching hundreds of videos during their 12-hour shifts and receiving only one hour-long lunch break and two 15-minute breaks, Frazier claims. Her lawyers explain that “due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video, and simultaneously view three to ten videos at the same time.”
The experience has purportedly left Frazier with trouble sleeping and “when she does sleep, she has horrific nightmares,” reads the complaint. Frazier is asking for compensation covering her personal psychological injuries as well as a court order to ensure that TikTok sets up a medical fund for its 10,000 moderators.
According to Frazier’s lawsuit, the company has failed to implement guidelines that would help its moderators cope with screening violent videos and images, such as limiting moderator shifts to four hours and providing psychological support.
Although the social media giant says it doesn’t comment on ongoing legal proceedings, it tells Bloomberg that it aims “to promote a caring working environment for [its] employees and contractors.”
“Our safety team partners with third-party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally,” a TikTok spokesperson says in a statement.
Image credits: Header photo licensed from Depositphotos