Scammers Using AI Images to Profit from Turkey-Syria Earthquake
Scammers are using images of the Turkey-Syria earthquake generated with artificial intelligence (AI) to trick people into donating money.
According to the BBC, security experts have warned that fraudsters are using AI to create emotive images of the Turkey-Syria earthquake and then creating fake appeal pages for survivors.
Online scammers have been sharing these AI-generated images of the disaster on Twitter alongside links to cryptocurrency wallets asking for charitable donations.
The devastating 7.8-magnitude earthquake that struck Turkey and Syria on February 6 has left more than 35,000 people dead and millions more without shelter, food, or water.
However, while claiming to be raising money for survivors of the disaster, the scammers are reportedly putting the donated funds into their own accounts.
The BBC reports that a Twitter account posted a fake appeal eight times in twelve hours, sharing the same “photo” of a firefighter holding a child surrounded by collapsed buildings.
However, the Greek newspaper OEMA later discovered that the image had been created by the AI text-to-image generator Midjourney.
Midjourney creates similar pictures when given the prompt “Image of firefighter in aftermath of an earthquake rescuing young child and wearing helmet with Greek flag.”
Social media users also spotted that the AI-generated firefighter had six fingers on his right hand.
One of the crypto wallet addresses posted by the Twitter account had reportedly been used in scam and spam tweets back in 2018, while the other had been shared on the Russian social media website VK alongside pornographic content.
When the BBC contacted the Twitter account that shared the appeal, the user denied it was a scam.
“My aim is to be able to help people affected by the earthquake if I manage to raise funds,” they said. “Now people are cold in the disaster area, and especially babies do not have food. I can prove this process with receipts.”
The BBC says it has not received any receipts or proof of identity from the Twitter account.
The incident is a sign of the increasing use of AI-generated imagery in online scams. Earlier this week, PetaPixel reported on a deepfake video of podcaster Joe Rogan that fooled TikTok viewers into buying a product he had never discussed.