Controversial Interview With School Shooting Victim’s AI Avatar Sparks Debate

A teenager who was murdered in a school shooting has been interviewed from beyond the grave thanks to artificial intelligence.
Joaquin Oliver’s parents created an AI-generated avatar that was trained on his social media posts, pictures and videos, as well as other personal details. Oliver was murdered in 2018 during the Parkland high school shooting in Florida when he was just 17 years old.
Journalist Jim Acosta, who until recently worked for CNN, conducted the interview, which many have condemned as “exploitative” and which has ignited a debate over the ethics of creating digital twins of the deceased.
“If the problem that you have is with the AI, then you have the wrong problem. The real problem is that my son was shot eight years ago,” Joaquin’s father Manuel says in an Instagram video published on what would have been his son’s 25th birthday.
Acosta called it a “one of a kind” interview. It began with the journalist asking the avatar what happened to him in Florida and ended with a bizarre exchange about Oliver’s favorite sports teams and movies.
Hany Farid, a professor at the University of California at Berkeley, tells The Washington Post that “this sort of interview style can’t possibly represent what that child wants to say in any reasonable way.”
“There are plenty of opportunities to talk to real victims and have a serious conversation about this epidemic that’s happening in our country without resorting to this sort of stunt,” he adds.
The video is not the first time AI has given voice to the dead. Earlier this year, an AI-generated video avatar of a man who was shot and killed in a 2021 road rage incident was presented in court as an unprecedented form of victim statement. But Acosta’s interview with Oliver marks the first time there has been a back-and-forth exchange between the living and a representation of the deceased.
“The chatbot is both designed to fill in the blanks and designed to please,” Sam Gregory, an AI and human rights expert, tells the BBC. “So, it’s possible it’s going to say things that Joaquin would never have said because it’s an AI hallucinating facts.”