Deepfake Movies Trick Viewers Into Remembering Films That Don’t Exist

Deepfake videos that reinvent movies with a different actor are realistic enough to trick viewers into falsely remembering a film that never existed.

A team of psychologists, led by Gillian Murphy at University College Cork in Ireland, undertook a study in which they invited 436 people to watch several deepfake videos of fictitious movie remakes starring different actors.

The psychologists wanted to investigate growing concerns about the potential for artificial intelligence (AI) deepfake technology to spread misinformation and distort memories.

In the study, participants, who thought they were being surveyed about recent movie remakes, were presented with four real and two deepfake movie remakes in random order. They were not told which of the clips were deepfakes until later in the survey.

However, in some cases, participants were given text descriptions of the remakes instead of watching the deepfakes.

Among the deepfake movie remakes shown to participants was a clip in which Will Smith was depicted as the lead character Neo in The Matrix — a role that was originally played by Keanu Reeves.

Another movie remake replaced Jack Nicholson and Shelley Duvall in The Shining with Brad Pitt and Angelina Jolie. Further deepfake clips starred Charlize Theron as the lead in Captain Marvel and Chris Pratt in the titular role of Indiana Jones.

Meanwhile, some of the real movies shown to participants included Charlie and the Chocolate Factory, Total Recall, Carrie, and Tomb Raider.

For each of the six movies, participants were asked whether they had seen or heard of the original film as well as the remake.

‘The Fake Remake Was Better Than the Real Film’

The study found that participants readily formed false memories of the deepfake remakes, with approximately 49% of them believing that these movies were genuine.

Many of them also reported that “the fake remake was better than the original film” — 41% in the case of Captain Marvel, 13% for Indiana Jones, 12% for The Matrix, and 9% for The Shining.

But, interestingly, false memory rates were equally high when subjects were shown text descriptions, suggesting that deepfake technology may not be more powerful than other tools at distorting memory.

“While deepfakes are of deep concern for many reasons, such as non-consensual pornography and bullying, the current study suggests they are not uniquely powerful at distorting our memories of the past,” the researchers explain in the study.

“Though deepfakes caused people to form false memories at quite high rates, we achieved the same effects using simple text. In essence, this study shows we don’t need technical advances to distort memory, we can do it very easily and effectively using non-technical means.”

Researchers said that the findings could help inform future design and regulation of deepfake technology.

PetaPixel previously reported on research that revealed how AI-generated deepfake faces look more real than genuine photos.
