Fears that advanced video-editing technology would be used for nefarious political purposes were realized Wednesday evening. A manipulated video, known as a “deepfake,” of President Volodymyr Zelenskyy apparently urging Ukrainians to surrender circulated on social media for several hours. The false message originated from hackers who compromised the website of Segodnya, a Ukrainian Russian-language tabloid newspaper.
Zelenskyy quickly released his own video denouncing the misleading media, calling it a “childish provocation.”
“If I can offer someone to lay down their arms, it’s the Russian military. Go home. Because we’re home. We are defending our land, our children, and our families,” he said in the post to his Facebook and Telegram accounts. “We are not going to lay down any weapons until our victory.”
Meta, Facebook and Instagram’s parent company, has identified and removed the video, according to a tweet from its head of security policy, Nathaniel Gleicher. The deepfake can still be seen on Twitter and YouTube, but with clear context provided.
Deepfake is a portmanteau of “deep learning” and “fake.” Deep learning refers to the class of artificial intelligence techniques that makes it possible to edit your face onto your favorite celebrity’s body. In this case, instead of an innocent classic movie scene, the hackers used video from various Zelenskyy press conferences to fabricate a message in which Ukraine’s leader appeared to order his people to stand down.
In this particular instance, the Zelenskyy deepfake has been widely derided as laughable and of poor quality, including by many Ukrainians, who had been warned for weeks to be on the lookout for this very thing. There are several telltale signs to watch for in a deepfake. The subject may not blink in videos made with older software. Technology has improved over time to remedy that flaw, but lip-synching that doesn’t match the audio, poorly rendered hair, skin and teeth, and a lack of head and hand/arm movement are all ways to identify a deepfake. Deepfakes will often be grainy in an attempt to pass off inconsistencies as nothing more than poor video resolution.
Because Zelenskyy’s press conferences are widely available online, several outlets were able to identify the footage from which the deepfake was derived and debunk it. Verifythis.com was quickly on the case and, after analyzing the clip with video forensics and reverse image searching, published an article confirming that the video was fake at 8:22 pm Eastern time.
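Reverse image searching of this kind often relies on perceptual hashing: frames from the suspect video are hashed and compared against frames from known footage, with a small bit-level distance indicating a likely match. Below is a minimal sketch of a difference hash (dHash), one common perceptual-hashing approach, in Python. It is illustrative only: the frame extraction, downscaling, and search infrastructure that a real fact-checking pipeline would use (e.g., decoding video with OpenCV or ffmpeg) are assumed, not shown, and the tiny hand-made “frames” exist purely for demonstration.

```python
# Minimal difference-hash (dHash) sketch for frame matching.
# A real pipeline would decode and downscale video frames first;
# here a "frame" is simply a small 2D list of grayscale values.

def dhash(pixels):
    """Compute a dHash: one bit per pixel, set when a pixel is
    brighter than its right-hand neighbor. `pixels` is assumed to be
    an already-downscaled grayscale grid (resizing is out of scope)."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(h1, h2):
    """Number of differing bits between two hashes; lower = more similar."""
    return bin(h1 ^ h2).count("1")

# Two tiny 2x3 demo frames: a brightened copy keeps the same gradient
# structure (distance 0), while unrelated content drifts apart.
frame_a = [[10, 50, 20], [80, 30, 90]]
frame_b = [[12, 52, 22], [82, 32, 92]]   # same structure, brighter
frame_c = [[90, 10, 70], [5, 60, 1]]     # different structure

print(hamming(dhash(frame_a), dhash(frame_b)))  # 0: likely the same source
print(hamming(dhash(frame_a), dhash(frame_c)))  # 4: clearly different
```

Because the hash encodes relative brightness rather than raw pixel values, it tolerates the compression and re-encoding that a clip picks up as it spreads across social media, which is what makes this family of techniques useful for tracing a deepfake back to its source footage.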
“Zelenskyy’s face was superimposed onto the body in the deepfake,” says Verify author Kelly Jones. “The face we see talking was generated from this still image taken of Zelenskyy at a press conference on March 3. The mouth on the photo was manipulated to appear as if Zelenskyy was actually talking.”
The video’s poor quality notwithstanding, the introduction of this tactic in the Ukraine war and beyond is concerning. The hackers may have intended little more than to harass Ukrainians and make information passed to them more questionable going forward, but that is significant in itself. Furthermore, it could be just the tip of the iceberg as Russia works out the kinks in what could become another tool in its propaganda and disinformation toolbox. Everyone in the Western world should remain vigilant and expect more of this in the future.
UPDATE: Another deepfake video surfaced Wednesday morning, this one of Russian President Vladimir Putin declaring that peace had been reached with Ukraine and that Crimea would be restored as an independent republic within Ukraine.
This video appears to have been meant more as a parody, calling out the Russian hackers for their shoddy editing, than as an attempt at disinformation.
The user who posted it admits in the first comment below: “Learn, katsaps, how to make deepfakes. This is high-quality work, and not the garbage that you made against Zelensky.”
“Katsap” is a common slur for Russians used in Ukraine, Poland and other Slavic countries.
Fact-checkers at Reuters determined that the footage used to create the deepfake came from Putin’s address on the situation in Ukraine on Feb. 21, just prior to the invasion, and was found on the official website of the President of Russia. Russian speakers at Reuters confirmed that the audio does not sound like Putin and does not match the movement of his lips.
Regardless of the intent behind this one video, a tit-for-tat trend appears to be developing on social media that, given time and technological advances, could have real consequences in the conflict.
Read more from Sandboxx News
- Answers to questions US military parents have about war in Ukraine
- Are we too quick to draw parallels between Ukraine and Taiwan?
- War in Ukraine: The West balances the head and the heart going forward
- Russia’s focus on perception is costing them the skies over Ukraine
- Is Russia planning to use chemical weapons in Ukraine? The signs are there.
Feature image: Screen capture from YouTube of comparison generated by verifythis.com.