FBI Warns of AI-Generated Deepfake Images in Sextortion Schemes


The FBI has issued a warning about the rise of artificial intelligence (AI)-generated deepfakes being used to harass or blackmail people with fake sexually explicit photos or videos of them.

On Monday, the FBI published a statement warning that scammers are increasingly using AI technology to create sexually explicit deepfake photos and videos of people in a bid to extort money from them, a practice known as “sextortion.”

The threat is particularly concerning as scammers often exploit the benign photographs and videos people post on their public social media accounts.

Using deepfake technology, the scammers take the victim’s face from their social media photos to create AI-generated pornography.

“The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content,” the agency says.

“The photos or videos are then publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes.”

‘The Dangers of Posting Public Photos’

The FBI also warned the public about the dangers of posting photos and videos of themselves online.

“Although seemingly innocuous when posted or shared, the images and videos can provide malicious actors an abundant supply of content to exploit for criminal activity,” the FBI states.

“Advancements in content creation technology and accessible personal images online present new opportunities for malicious actors to find and target victims.

“This leaves them vulnerable to embarrassment, harassment, extortion, financial loss, or continued long-term re-victimization.”

As of April, the FBI says it has observed an uptick in sextortion victims reporting the use of AI-generated deepfake images or videos based on their social media content.

The scammers would typically demand payment from victims, threatening to share the photos and clips with family members and social media friends if funds were not received. Alternatively, they would demand that the victims send real sexually explicit images of themselves.

In the U.S., there is currently no federal legislation protecting people against their images being used without consent in deepfake pornography or related technologies.

Deepfake scams are on the rise. Last year, the FBI issued a warning that a rising number of scammers were using deepfake technology to impersonate job candidates during interviews for remote positions.

Image credits: Header photo licensed via Depositphotos.
