Viral Video of Kamala Harris Rambling Is Revealed as a Deepfake
A viral video featuring Vice President Kamala Harris rambling during a speech has been revealed as a deepfake.
The deepfaked clip of Harris speaking gibberish and riffing on the word “today” has amassed over five million views across social media.
"Tomorrow will be today, tomorrow." pic.twitter.com/saOHmaY5VY
— Ramble_Rants (@ramble_rants) April 29, 2023
And while some critics have previously commented on Harris’s tendency to deliver speeches filled with buzzwords and “word salads,” this widely shared video is completely doctored.
In the clip, Harris appears to say: “Today is today, and yesterday is today yesterday. Tomorrow will be today tomorrow, so live today so the future today will be as the past today, as it is tomorrow.”
The footage originally comes from a speech the vice president gave at a rally for abortion rights at Howard University in Washington, D.C., on April 25.
However, the vice president’s words were replaced with audio that originates from a TikTok user who impersonates Harris on the platform. Meanwhile, Harris’s mouth was manipulated to make it appear as if she were actually saying those words.
Twitter’s and Facebook’s independent fact-checkers later added a note to the video to inform viewers that it was doctored, but not before the deepfaked clip of Harris went viral across social media.
The convincing clip highlights the potential of deepfake technology to spread misinformation. Dr. Dominic Lees, a deepfake researcher and Associate Professor of Filmmaking at the University of Reading in the U.K., told online fact-checker BOOM that the manipulated video of Harris was “amateur work in deepfake by simply creating a voice clone and applying it to the existing video footage and then manipulating the lips and mouth to match the fake speech.”
Legal experts are reportedly conflicted about what steps can be taken against deepfaked videos and photos going forward. While some argue that they are protected by the First Amendment as a creative invention or parody, others contend they could in some cases face claims of defamation or copyright infringement.
PetaPixel previously reported on how a deepfake video of podcaster Joe Rogan fooled TikTok viewers into buying a product he had never discussed.
Image credits: Header photo licensed via Depositphotos.