Romance Scammers are Using AI Images to Trick Victims


A victim of a romance scammer had $430,000 stolen after AI images and deepfake video technology tricked her into believing she was in a legitimate two-year relationship.

A charity in the U.K. has warned bad actors are using the technology to “manipulate victims into believing that they’re real people.”

How the AI Scam Works

In the case above, the fraudster met the victim on a dating website and eventually proposed to her using an AI image of a man holding up a sign that read “Will you marry me?”

The victim, who is in her fifties, drew upon her pension pot early and sold personal possessions after the criminal convinced her that he was being held hostage and tortured by people he owed money to.

“Aside from the financial aspect, the victims go through a lot of emotional stress because they feel like their boyfriend or girlfriend is in danger,” Lisa Mills, a relationship fraud expert at the charity Victim Support, tells inews.

Mills warns that deep learning synthesis models such as text-to-image generators are becoming “more sophisticated” and that these new AI tools are giving fraudsters a fresh weapon in their toolkit.

Midjourney v5 Unleashed

One of the marquee AI image generators, Midjourney, released its v5 model earlier this month, offering some of the most powerful synthetic imagery available to date.

AI image of Pope Francis in a puffer jacket.

Over the weekend, an AI image of Pope Francis in a white puffer jacket, created with Midjourney v5, fooled many people into believing it was genuine. Pablo Xavier posted the images to a Facebook group called AI Art Universe and then to Reddit, after which they went viral.

“I was just blown away,” he tells Buzzfeed News. “I didn’t want it to blow up like that.”

It comes shortly after fake images of Donald Trump being arrested, also created with Midjourney v5, went viral, offering further proof that anyone can now make fake images capable of deceiving millions of people.

Last year, the FBI warned that deepfake technology could be used to fool employers in remote job interviews.


Image credits: Header photo licensed via Depositphotos.
