Facebook and Instagram Ran Deepfake Nude Ads of a 16-Year-Old Jenna Ortega


Facebook and Instagram ran ads that featured a blurred deepfake nude image of an underage Jenna Ortega to promote an AI app.

Deepfake pornographic images of a 16-year-old Ortega were used in ads for the Perky AI app — which allows users to create sexually explicit images of anyone using AI technology.

According to a report by NBC News, Meta allowed Perky AI to run 11 ads in February on Facebook, Instagram, and Messenger that featured an edited image that undressed the Wednesday actress.

Ortega is 21 years old today. However, Perky AI created a topless deepfake image of Ortega based on a photo of the actress when she was just 16 years old and starring on Disney Channel’s Stuck in the Middle.

Perky AI — which charges $7.99 a week or $29.99 for 12 weeks — advertised that it could create “NSFW,” meaning “not safe for work,” images based on users’ requests.

The ads showed how users could change Ortega’s outfit in the photo based on text prompts, including “Latex costume,” “Batman underwear” and finally, “No clothes.”

NBC News reports that a description of the app on the Apple App Store describes how users can “enter a prompt to make them look and be dressed as you wish.”

Nonconsensual Deepfake Nudes are a Growing Crisis

After NBC News reached out to Meta, the company suspended the Perky AI app’s page, which has reportedly run more than 260 different ads on Meta’s platforms since September.

“Meta strictly prohibits child nudity, content that sexualizes children, and services offering AI-generated non-consensual nude images,” Meta spokesperson Ryan Daniels says in a statement.

Perky AI has also since been removed from the Apple App Store for violating policies around “overtly sexual or pornographic material.”

The worrying ads featuring Ortega are part of a growing crisis online, where nonconsensual deepfake nude images of underage girls and women are becoming widely available due to AI tools.

According to independent research cited by NBC News from deepfake analyst Genevieve Oh and MyImageMyChoice, an advocacy group for deepfake victims, more nonconsensual sexually explicit deepfake videos were posted online in 2023 than in every other year combined.

Earlier this year, U.S. senators introduced a bill that would criminalize the spread of nonconsensual deepfake porn — after sexually explicit AI-generated photos of Taylor Swift went viral on X (formerly known as Twitter).

Image credits: Header photo licensed via Depositphotos.