Pedophiles Are Using AI Generators to Create Child Abuse Images

Pedophiles are using artificial intelligence (AI) to create indecent images of child abuse. Abusers are reportedly using AI text-to-image generators and deepfake technology to add the faces of real children onto computer-generated bodies.

According to the Daily Mail, a computer programmer was arrested in Spain in December for creating child porn images with AI software. The arrest is reportedly one of the first of its kind.

According to the Spanish National Police, investigators discovered a “huge” stash of images at the man’s home in Valladolid, Spain.

The Spanish National Police say that the computer programmer had collected real images of children online. He then used an AI image generator to alter the children’s photos, placing them in abusive scenarios and creating horrific images. The man was also charged with downloading real indecent images of children.

In a statement, the Spanish National Police say that the AI-generated child porn images were extremely disturbing for investigators.

“The files, which caused a great impact to the researchers because of their extreme harshness, depicted real images of very young girls being raped and using disproportionate organs and sex toys,” a spokesperson for the Spanish National Police says.

The Daily Mail reports that the U.K.’s National Crime Agency (NCA) is also aware of the threat of pedophiles using AI software to create child abuse images.

“The amount of child abuse imagery found online is concerning; every year industry detects and reports an increasing number of illegal images. We constantly review the impact that new technologies can have on the child sexual abuse threat,” a spokesperson for the NCA tells the Daily Mail.

“The NCA works closely with partners across law enforcement and wider government, as well as gaining insights from the private sector, to ensure we have the specialist capabilities to continue to detect and investigate AI-generated child abuse images.”

AI-generated pornography featuring the faces of non-consenting individuals is becoming increasingly common online.

In January, a community of young Twitch livestreamers discovered that their images were on a deepfake porn site. AI had been used to depict them in sex acts without their consent or knowledge.

In the U.S., there is currently no federal legislation protecting people from having their images used without consent in deepfake porn or related technologies.

However, last year the U.K. announced plans to criminalize the sharing of any deepfake porn made without the subject’s consent. Those who violate the law could face jail time.

Image credits: Header photo sourced via Twitter/Policia Nacional.