Deepfake technology has existed for a little more than half a decade, but the number of victims has exploded recently

There’s a video of me attempting an armed robbery. The victim’s head-mounted GoPro camera recorded it. He was a cyclist, minding his business as he made his way down the street. Out of nowhere, the video shows me riding up on a motorcycle and pulling out a gun, forcing him to stop in his tracks; I demand he gives me his rucksack, but he claims he doesn’t understand what I’m saying.

Again, I shout the order, waving my weapon. That’s when he jumps off his bike and runs. I pursue on foot, catching up with him and thrusting the gun towards him. In the video, you can see the anxiety and fear on my face – but I want that rucksack. Will I shoot him for it? My agitation suggests so, yet thankfully he never finds out; a bystander rushes to his aid, and off I go.

Now, it’s pretty stupid to write about your own failed attempt at an armed robbery – especially one caught on camera, clearly showing your face. But despite this video showing me committing the crime, it never actually happened.

The video is a deepfake – a visual construct generated by artificial intelligence that swaps one face for another at the behest of a human creator. That creator does not need any programming or special effects skills; a deepfaker can be anyone – me, you, the 15-year-old across the street. Or, more worryingly, someone with a vendetta – a jilted ex, a former employee, a rival. Thanks to deepfake technology, these people can now make a video of you doing anything they want – one that could land you in serious trouble.

Deepfake technology has existed for a little more than half a decade, but the number of victims has exploded recently. Hundreds of thousands – maybe even millions – of people, primarily women, have already been deepfaked into pornographic videos without their consent, “their” nude bodies with their real faces performing to the creator’s will.

Whether you realised it or not, you’ve probably seen a deepfake before. If you’ve ever watched those funny face swap videos of celebrities on YouTube – perhaps Tom Cruise as Iron Man, or Will Smith playing Neo in The Matrix – you’ve seen a deepfake. If you didn’t realise, it’s no wonder: the technology is incredible. But that incredible technology enabling those Hollywood face swaps is just as responsible for all that non-consensual fake porn and more, too.

Michael Grothaus: ‘One of my most significant concerns is the reputational harm deepfaking can cause ordinary folk’

You may think: “I’m not famous, and no one is ever going to want to see me act in a porno.” But as my deepfake shows, that’s not the only damaging scenario in which you could be deepfaked. I have it in my power – as do you and everyone reading this – to download free deepfaking software right now and create something seriously damaging. What if I made a video of you kissing a stranger, and threatened to send it to your spouse or children? Wouldn’t you pay me some blackmail money for that not to happen? Especially if your spouse knows you’ve been unfaithful in the past?

Or maybe I know you’re up for that big promotion. A well-timed deepfake of you using homophobic slurs – and deepfake technology can put words into your mouth, using your actual voice – could derail any advancement you might have made. And why stop there? Why not just deepfake you into a crime, so I can really mess up your life?

I had the fortune of knowing a deepfake of me was coming: I commissioned it while researching my latest book, Trust No One: Inside the World of Deepfakes. Others won’t be so lucky. I told the deepfaker I wanted to see myself ‘commit’ a crime, but gave no further details: I wanted to be surprised so I could feel, as much as possible, what victims feel. I provided an existing video of myself from the internet, and off they went.

That last part is important, because if a deepfaker wants to put your face into another video, they need to “train” their deepfake AI on thousands of images of you. This training is how the AI learns to copy your face exactly. But how could someone get thousands of images of you? It’s simple: if you’ve posted any video of yourself on the internet, the deepfaker can just grab the frames from there. After all, every second of video is made up of about 30 individual frames – still photos, in effect. That means for every minute of video you’ve posted of yourself to social media, the deepfaker has around 1,800 photos of your face.
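The arithmetic behind that claim is simple enough to sketch in a few lines of code. This is purely illustrative – the function name and the 30 frames-per-second figure are assumptions (common frame rates range from 24 to 60 fps), not a description of any actual deepfaking tool:

```python
# Rough sketch: how many still frames a posted video hands a deepfaker.
# Assumes a typical frame rate of 30 frames per second; real clips vary
# (24, 25, 50 and 60 fps are all common).

def frames_available(video_seconds: float, fps: int = 30) -> int:
    """Approximate number of individual face images in a clip."""
    return int(video_seconds * fps)

# A single one-minute video posted to social media:
print(frames_available(60))        # 1,800 frames
# Ten minutes of footage spread across several posts:
print(frames_available(10 * 60))   # 18,000 frames
```

Even a modest social media presence, in other words, can supply far more training images than the “thousands” a deepfake model needs.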

The AI then learns how to recreate your face and mask it over one in the video that the deepfaker wants you to appear in – a porno or a robbery, for example. And, exemplifying why deepfake tech is so frighteningly powerful, not only does your synthetic face get masked over the face of the original individual in the final video, but that individual’s expressions also get mapped onto your face.

That’s why deepfakes look so realistic. Your synthetic face adopts the movements, both large and small, of the original face. If the original face was moaning in ecstasy, now your face is. If the original face was trembling in anguish, you’re now doing that too. In my deepfake, the cyclist didn’t understand that I was demanding his rucksack because he spoke only English; the robber in the original video was speaking Spanish – a language I do not know. Still, because of deepfake technology, ‘I’ spoke it flawlessly.

Chris Ume, the man behind a viral deepfake TikTok video featuring Tom Cruise, is launching a start-up called Metaphysic, which can create virtual celebrities

One of the main worries people have over this technology is its geopolitical impact: what might happen if a deepfake was released showing President Biden declaring nuclear war on China? Or one showing a candidate doing something untoward in the run-up to an election, swaying voters?

However, one of my most significant concerns is the reputational harm deepfakes can cause ordinary folk. Social media makes things much worse: in 2021 it is an unforgiving dystopian landscape where nuance is not allowed, where there is no due process, and where chronic outrage is the norm. It’s also full of purveyors of disinformation and trolls eager to exploit such a hellscape. It is a massive, sentient and eternally enraged powder keg, perpetually awaiting the next spark.

A deepfake is that spark.

If someone makes a deepfake of Boris Johnson using inflammatory language, the Prime Minister has the channels and reach to refute it. You and I do not have that luxury. If someone makes a deepfake of us doing the same, and releases it on Twitter at 9am, we could lose our friends and jobs by noon. A false accusation on Twitter will travel around the world a thousand times in the time it takes the truth to travel a mile. And even if the belated truth does manage to make it around the world, it will never reach everyone who last knew you for the “you” in the deepfake. Video doesn’t lie, right? So why follow updates about the event any more?

Yet while deepfake technology can be and is being used for horrible things, the technology itself is not evil. It will be transformative for the entertainment, marketing and even health industries; already, deepfake tech enables some people who have lost their voice to speak again, sounding precisely like themselves.

People are the problem. How do you solve that? There are limited steps you can take, but primarily, stop posting so much of your life on social media. Any video of you (or your spouse, parent, or child) is fodder for deepfake AI. If you can’t stop posting, at least limit the audience for the videos you post.

Other preventative steps are up to intelligence agencies, tech companies, and lawmakers. Some algorithms can detect deepfaked videos, but intelligence agencies and tech companies must continue to advance these methods.

But deepfake tech should never be made illegal. There are too many benefits to the technology. Many existing laws should already cover the nefarious uses of deepfakes – laws against blackmail and extortion, for example. However, it would be worth amending laws covering revenge porn to make sharing non-consensual pornographic deepfakes of another person an offence. Doing so could be a powerful deterrent against the most widespread use of deepfakes today.

The problem is that the malicious uses of deepfakes will only grow as the technology becomes more mainstream. Indeed, it may not be long before someone asks you: “Care to explain that video of you that’s going around?”

Trust No One: Inside the World of Deepfakes by Michael Grothaus (RRP £18.99). Buy now for £16.99 at books.telegraph.co.uk or call 0844 871 1514