Internet Watchdog Warns of ‘Nightmare’ Rise of AI Child Sex Abuse Images
An internet watchdog agency warned that the rise of artificial intelligence (AI) generated child sex abuse images online could get even worse — if controls aren’t put on the technology that generates deepfake photos.
According to AP News, the Internet Watch Foundation (IWF) has urged governments and technology providers to act quickly before a deluge of AI-generated images of child sex abuse overwhelms law enforcement investigators and expands the pool of potential victims.
“We’re not talking about the harm it might do,” says Dan Sexton, the IWF’s chief technology officer.
“This is happening right now and it needs to be addressed right now.”
The details come from the IWF’s latest report into the growing problem as the organization tries to raise awareness about the dangers of pedophiles using generative AI systems that can create images from simple text instructions.
The IWF’s latest report reveals that researchers found 3,000 AI-generated images of child abuse shared on a single site in September, with 564 pictures depicting the most serious category of imagery, including rape and sexual torture.
According to the IWF’s new report, criminals are using the technology to create images of celebrities who have been “de-aged” to depict them as children in sexual abuse scenarios. Criminals are also using AI to make existing photos of child actors appear sexual.
The internet watchdog says there is a growing trend where a single image of a known abuse victim is taken and used to generate more images of the victim in different abuse settings.
While reports of AI-generated child sex abuse images are still overshadowed by the number of real abuse images and videos found online, the IWF was shocked at the speed of the development and the potential the technology creates for new kinds of abusive images.
“Our worst nightmares have come true,” says Susie Hargreaves, the IWF’s chief executive.
“Earlier this year, we warned AI imagery could soon become indistinguishable from real pictures of children suffering sexual abuse, and that we could start to see this imagery proliferating in much greater numbers. We have now passed that point.”
Last month, PetaPixel reported on a sleepy town in Spain that was rocked by an AI image scandal in which nude pictures of children were passed around.
Image credits: Header photo licensed via Depositphotos.