TikTok Moderators Claim They Viewed Child Sex Abuse Images in Training

A cache of child sexual abuse photos was allegedly shown to TikTok moderators during training, according to workers who say they were granted insecure access to illegal photos and videos.

Forbes reports on a worker called Nasser who says he was shocked after being shown uncensored, sexually explicit images of children during his training to become a content moderator for TikTok.

Nasser was working for El Paso-based Teleperformance, a third-party company that moderates content for the social media giant, when he was assigned to a project that teaches TikTok’s artificial intelligence (AI) how to spot the very worst pictures and videos that could be posted to the app.

Horrifyingly, Nasser claims to have been shown graphic images and videos of children involved in sexual acts, material that had already been removed from TikTok.

He raised concerns to Forbes, telling the business magazine that, as a father, he didn’t think it was right.

“I don’t think they should use something like that for training,” he says.

Denial

When Forbes gave TikTok a right of reply, the company denied showing employees sexually exploitative content for training purposes.

TikTok spokesperson Jamie Favazza said that the company’s “training materials have strict access controls and do not include visual examples of child sex abuse material.”

However, the company did concede that it works with third-party firms that may have different processes.

A Difficult Issue

Child sex abuse material (CSAM) is posted to social media, and platforms like TikTok cannot stand idly by, meaning content moderators are routinely forced to deal with it.

Images of child abuse and exploitation are illegal in the United States, and there are strict rules for handling them when discovered.

Companies are supposed to report the content to the National Center for Missing and Exploited Children (NCMEC) and then preserve it for 90 days while minimizing the number of people who see it.

The allegations Nasser brought to Forbes go far beyond these limits, with one employee telling the magazine that she contacted the FBI to ask whether Teleperformance was criminally distributing CSAM.


Image credits: Header photo licensed via Depositphotos.