Photo illustration: Ukrainian President Volodymyr Zelenskyy (MSNBC / Getty Images)

This deepfake of Zelenskyy went viral. Let it be a warning to us all.

Officials spotted a fake video purporting to show Ukraine's president surrendering. We may not be so lucky next time.


A fake video purporting to show Ukrainian President Volodymyr Zelenskyy telling his soldiers to surrender circulated on social media earlier this week before being scrubbed from several platforms.

In a video post on Telegram, the real Zelenskyy denied claims that he had called on Ukrainians to surrender. He said he is still in Ukraine, defending it from Russian invaders.

Nonetheless, the fake video still reached a substantial audience. The Ukrainian TV network Ukraine 24 said hackers placed the video on its website and put Zelenskyy’s fake message in its scrolling news ticker at the bottom of the screen. The message was also spread across Russian-backed social media sites like VK, a site similar to Facebook predominantly used by Russian speakers.

Fortunately, the fake Zelenskyy video was poorly made and easy to spot. But that won’t always be the case.

For years, security professionals around the world have warned about the nefarious use of deepfakes, machine-manipulated video and audio often used to deceive people into believing public figures have said or done things they actually haven’t. 

NBC News published a video in 2018 explaining how the technology is used.

Unfortunately, deepfakes aren’t going away. In fact, the technologies used to create them are being refined and becoming more widely accessible, which has inevitably led to more widespread use. 

For example, the very artificial intelligence technology that allowed someone to make the fake video of Zelenskyy is also being used to improve dubs on foreign-language films.

And the same augmented reality (AR) technology used in some deepfakes to superimpose someone’s talking head on another person’s body has already become widely popular thanks to "face swap" features on apps like Snapchat.

Such face-swap apps are so easy to build that tutorials on how to make them are widely available online.

In this age of rapidly improving technology, deepfakes are going to become increasingly easy to make using our mobile devices. Companies are already experimenting with high-powered technology, with the help of 5G networks, to help consumers place an AR couch in their home to see how it might fit the feng shui or virtually try on a shirt before they order it.

But that same technology could just as easily place your face in surveillance footage at a crime scene you never visited or in a pornographic film you never participated in. Or — similar to the Zelenskyy deepfake — it could make it look like world leaders are making proclamations they’ve never made, with potentially deadly consequences. 

We’ve developed an affinity for these technologies because of the cute, fun and resourceful things they do for us. But as we’ve embraced them, we’ve also ushered in a frightening new era of technological trickery.

We all need to beware.
