Police Report Reveals How Deepfake Nude Photos Took Over a High School

Teen boys used artificial intelligence (AI) to make deepfake nudes of female classmates

A police report has revealed how a deepfake nude scandal took over a high school in the U.S. last year.

In November, it was reported that a teenage boy had taken photos of his female classmates at Issaquah High School in suburban Seattle, Washington, and used AI technology to “undress” them.

The deepfake pornographic images of these underage girls were then circulated around the school — an act that the police have characterized as a possible sex crime against children.

Now, a police report obtained exclusively by 404 Media has revealed exactly how this deepfake nightmare unfolded at the high school and how easily accessible AI tools are being used to harass young girls.

It also reveals the lack of legal clarity surrounding such AI-generated images. The question of how, or whether, to punish the creators of the deepfaked photos has left parents, schools, and law enforcement struggling to deal with the technology.

According to 404 Media, the police report confirms that the images circulated at Issaquah High School were created using web-based “nudify” or “undress” apps, which automatically and instantly alter photos of women to make them appear naked.

The students who used the app to create naked images of other students told police that they discovered it on TikTok. They shared some of the images on Snapchat and over text messages, or showed them to other students at the lunch table at school.

The police report says that at least some of the images that were altered with AI were taken from Instagram. Students were reportedly sharing pornographic images of around seven students and an adult staff member.

404 Media reports that these nudify apps require only a single photo of a person to create what looks like a realistic nude image of them. The apps take a photo of a clothed person and generate an image of what that person might look like with their clothes removed.

The publication says that these apps have also been promoted on Twitter and TikTok, and can reportedly be easily found via Google or Bing searches.

No Charges Were Brought

Despite students telling school staff what was happening, the school did not report the incident to the police.

Instead, 404 Media reports that the Issaquah School District told the suspected victims — and sometimes their parents — that their classmates may have been circulating AI-generated nude photos of them.

The police report indicates that law enforcement ultimately found out about what was happening from three separate parents — not school officials. An officer was surprised that the school did not report the incident to the police directly, as such incidents qualify as “sexual abuse” and school administrators are “mandatory reporters.”

Eventually, after an investigation, police referred the case to a local prosecutor, noting that there was probable cause to charge the student who admitted to making the photos with “Cyber Harassment.”

However, the prosecutor ultimately decided not to bring charges.

Axios reports that Washington state lawmakers are now weighing whether to make it illegal to share deepfake pornographic images — after the technology victimized these high school students.

A bill advancing in Washington’s Legislature with bipartisan support would create a new criminal offense called “distributing fabricated intimate images.” It would also let victims sue those who share deepfake porn.

The full 404 Media report can be read here.
Image credits: Header photo licensed via Depositphotos.