Spanish Town Rocked by Nude AI Image Scandal


A sleepy town in Spain has been rocked by an AI image scandal that saw nude pictures of children passed around.

Police in Almendralejo, southern Spain, are investigating a case in which 20 girls have come forward as victims of an artificial intelligence (AI) app that takes photos of fully-clothed people and generates images of them without their clothes on.

The girls are aged between 11 and 17. One of the victims’ mothers sounded the alarm and drew attention to the issue.

“One day my daughter came out of school and she said ‘Mom there are photos circulating of me topless’,” says María Blanco Rayo, the mother of a 14-year-old.

“I asked her if she had taken any photos of herself nude, and she said, ‘No, Mom, these are fake photos of girls that are being created a lot right now and there are other girls in my class that this has happened to as well’.”

Parents have now formed a support group and say the images have affected the victims differently, with some of the girls now afraid to leave their houses.

The BBC reports that 11 local boys are under investigation for the creation and dissemination of the images over WhatsApp and Telegram.

New Threats to Women

According to Euronews, the images were made on an AI-powered app called ClothOff which allows users to “Undress anybody, undress girls for free.”

The case highlights the dangers that generative AI imagery poses and there is a fear that this kind of behavior is far more widespread than people realize.

One of the girls’ mothers in Almendralejo told a local television station that her daughter had been blackmailed. A boy allegedly asked her for money on social media and, when she refused, sent her naked AI-generated pictures of herself.

“It may have started as a joke, but the implications are much greater and could have serious consequences for those who made these photos,” says the mayor of Almendralejo.

The parents and victims are also concerned that the pictures could be uploaded to pornographic sites. While the images are not real, the verisimilitude of AI imagery is such that the girls’ distress over them is very real.

Image credits: Header photo licensed via Depositphotos.