TikTok sued by content moderator over ‘psychological trauma’
The plaintiff said she suffers from post-traumatic stress disorder from all the disturbing videos she had to watch.
TikTok is being sued by a former content moderator over trauma caused by reviewing graphic videos. The complainant claims that TikTok and its parent company, ByteDance, failed to protect her mental health after prolonged exposure to disturbing content.
Psychological Damage
Candie Frazier has filed a proposed class-action lawsuit stating she had to spend long hours reviewing videos containing crushed heads, school shootings, suicides, child pornography, rape and cannibalism, complete with audio.
According to the complaint filed in a federal court in Los Angeles, the social media platform requires moderators to watch hundreds of such videos during a 12-hour shift, with only an hour off for lunch and two 15-minute breaks.
Frazier’s lawyers said in the complaint:
Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video, and simultaneously view three to ten videos at the same time.
The proposed lawsuit also claims that Frazier suffers from post-traumatic stress disorder due to the violent and disturbing content she was made to consume during her time working as a moderator.
Plaintiff has trouble sleeping and when she does sleep, she has horrific nightmares
Response
In September, TikTok announced that 1 billion people were using the app each month. The company has also drawn criticism for some of the content it allows on its platform, especially content targeting children.
To protect its users, the video-sharing platform relies on thousands of in-house and contract content moderators to filter out videos and accounts that break its rules.
A company spokesperson said in a statement:
Our safety team partners with third party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally