A new report has revealed how child safety investigators are struggling to stop thousands of disturbing artificial intelligence (AI) generated “child sex” images that are being created and shared across the web.
According to a report published by The Washington Post on Monday, the rise of AI technology has sparked a dangerous “explosion” of lifelike images showing child sexual exploitation — causing concern among child safety experts.
The report notes that thousands of AI-generated child-sex images have been found on forums across the dark web. Users are also sharing detailed instructions for how other pedophiles can make their own realistic AI images of children performing sex acts, commonly known as child pornography.
“Children’s images, including the content of known victims, are being repurposed for this really evil output,” Rebecca Portnoff, the director of data science at Thorn, a nonprofit child-safety group, tells The Washington Post.
Since last fall, Thorn has seen month-over-month growth in AI images’ prevalence on the dark web.
Is an Image Real or Fake?
The explosion of such images has the worrying potential to undermine efforts to find victims and combat real abuse, as law enforcement will have to go to extra lengths to investigate whether a photograph is real or fake.
According to the publication, AI-generated child sex images could “confound” the central tracking system built to block such material from the web because it is designed only to catch known images of abuse, rather than detect newly-generated ones.
Law enforcement officials, who work to identify victimized children, may now be forced to spend time determining whether the images are real or AI-generated.
AI tools can also re-victimize any individual whose photographs of past child sex abuse are used to train models to generate fake images.
“Victim identification is already a needle in a haystack problem, where law enforcement is trying to find a child in harm’s way,” Portnoff explains.
“The ease of using these tools is a significant shift, as well as the realism. It just makes everything more of a challenge.”
‘Children Who Don’t Exist’
The images have also fueled a debate on whether they even violate federal child-protection laws as the pictures often depict children who do not actually exist.
According to The Washington Post, Justice Department officials who combat child exploitation say such images are still illegal even if the child depicted is AI-generated.
However, there is no previous case in the U.S. in which a suspect has been charged for creating deepfake child pornography.
In April, a man in Quebec, Canada was sentenced to three years in prison for using AI to generate images of child pornography — the first ruling of its kind in the country.
Image credits: Header photo licensed via Depositphotos.