Ready To Be “Deepfaked”? 3 Reasons You Should Be Concerned About The Internet’s Creepiest Data Heist


Fraudsters typically line their pockets by forging our signatures, cloning our credit cards, and stealing our personal identities. Yet we'd like to think that folks who know us personally – our family, friends, colleagues, and acquaintances – would catch these counterfeiters out if they brazenly claimed to be us in public. After all, seeing is believing, isn't it? If you don't look like me, you're not me. If you do look like me, the chances are that you are me. Right?

Well…maybe. And that assumption could soon become the subject of some serious confusion.

But how?

Well, imagine if stealing your identity could include stealing your image. And if scammers could then use that image to put words in your mouth and – in some cases – fake your very actions. This isn’t just some outlandish thought experiment, but a foreseeable hazard if we fail to prepare for a surge in the production of “deepfakes”. 

So what is a deepfake?

The Urban Dictionary gives a characteristically unrefined definition:

“A horrific AI-assisted face swapping app which takes someone’s face and places it on someone else’s body. Particularly great if you’re a creep imagining what your favorite celeb-crush looks like naked.”

Though this description captures the funny side of this machine learning software’s most popular use (i.e. celebrity porn), a useful BBC article offers clearer detail on the simple process by which existing footage can be expertly doctored using readily available tools:

“By using machine learning, the editing task has been condensed into three user-friendly steps: Gather a photoset of a person, choose a pornographic video to manipulate, and then just wait. Your computer will do the rest, though it can take more than 40 hours for a short clip. 

The most popular deepfakes feature celebrities, but the process works on anyone as long as you can get enough clear pictures of the person – not a particularly difficult task when people post so many selfies on social media.”

So there we have it: almost anyone can do it, and literally anyone could become the "star" of some fake footage (which, to be clear, need not be pornographic).
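For the technically minded, the underlying trick is widely reported to be an autoencoder with a single shared encoder and one decoder per identity: both decoders learn to rebuild "their" person's face from the same compressed representation, so feeding person A's encoded features through person B's decoder produces B's face wearing A's expression. Here is a deliberately minimal PyTorch sketch of that architecture (the layer sizes are arbitrary, and random tensors stand in for the thousands of aligned face crops a real pipeline needs), offered as an illustration of the idea rather than a working tool:

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compress a 64x64 RGB face crop into an 8x8x128 feature map."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Rebuild a 64x64 RGB face from the shared feature map."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(z)

encoder = Encoder()                          # ONE encoder, shared
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder PER identity

params = (list(encoder.parameters())
          + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.L1Loss()

# Stand-in "datasets": a real pipeline uses many aligned face photos per person.
faces_a = torch.rand(16, 3, 64, 64)
faces_b = torch.rand(16, 3, 64, 64)

for step in range(100):  # real training runs for hours, not 100 steps
    opt.zero_grad()
    # Each decoder learns to reconstruct its own person from the shared code.
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    opt.step()

# The "swap": encode person A's face, decode with person B's decoder.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```

The unsettling part is how little of this is exotic: every component above ships with any mainstream deep learning library, which is exactly why apps could wrap the whole process into "gather photos, pick a video, wait".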

What could possibly go wrong?

Here’s why you should be concerned (if you aren’t already):

  1. Anyone can do it

Just to reiterate, anyone with a will can find a way. You could do it. And so could anyone you know.

The popular and user-friendly FakeApp has already been downloaded hundreds of thousands of times. The app can apparently generate convincing videos from only one or two high-quality clips of the faces users want to fake (Motherboard). This means that if someone can access real footage of you, they can manipulate it.

Eric Goldman, a professor at Santa Clara University School of Law and director of the school's High Tech Law Institute, has cautioned that we "have to prepare for a world where we are routinely exposed to a mix of truthful and fake photos and videos."

But is it okay to manipulate a video if no one is hurt or embarrassed? After all, many deepfakes are produced for the purposes of humor and fun. When does the subject of a joke or a taunt become a victim of something akin to hate speech or slander? Where the deepfake is sexual, is it less harmful than so-called "revenge porn" (given that it isn't the victim's actual body being exposed)? If everyone is doing it, does it even matter anymore?

At the moment, the parameters of the software’s acceptable use are unnervingly loose and ill-defined.

  1. It’s a pretty sticky legal area

That’s right. As preposterous as it may sound, if some miscreant makes a video in which your face is grafted onto the body of someone who is…well…doing anything they want you to, mounting a fightback won’t be easy.

The Verge explains that there is no single law to help you. Defamation? Maybe, but defamation suits are expensive and hard to win, and if the creator is anonymous or overseas, such claims are unhelpful. You can't sue someone for a privacy violation when the intimate details they're exposing are not of your life. Furthermore, pushing to have content removed can even run up against the First Amendment.

Moreover, the third-party websites hosting the videos are in no way liable for the content, nor can they be forced to remove it – unless the copyright owner of the original video asserts an infringement, meaning you'd have to track that person down and enlist their help.

  1. It’s a tool for deception

The person being "deepfaked" isn't necessarily the only victim, or even a victim at all. Check out this video of President Barack Obama. It's just one of many AI-generated fakes portraying a political leader. Its creators warn that in the future we could see similar pseudo-videos used to spread disinformation, panic, and fear in the same way as we've witnessed with the recent "fake news" scandals. This could harm us without a single pixel of our own images being altered.

Moreover, as these tools become more refined and realistic – and we're on that trajectory – they could be used for bribery, for producing false evidence, and for any number of other criminal activities. All with relative ease.

This is perhaps even more worrying in a climate where AI surveillance is promising the world in terms of crime reduction and, thus, validating the collection of enormous amounts of vulnerable video footage.

————————-

So what next? Well, as ever, we need to be developing techniques that can counteract the pernicious effects of AI technology whilst keeping pace with those effects. Do we need an AI that can call out this AI? It sounds ridiculous, but perhaps that's what we will ultimately rely on.
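What might such a watchdog look like? At its simplest, it is just another classifier: a model trained on labelled examples of real and synthetic faces that outputs a probability of forgery. The PyTorch sketch below is purely illustrative (random tensors stand in for a labelled training set, and the tiny network is a placeholder); real detectors train on large forensic datasets such as FaceForensics++ and still struggle to generalise to forgery methods they haven't seen.

```python
import torch
import torch.nn as nn

# A toy real-vs-fake classifier over 64x64 RGB face crops.
detector = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
    nn.Flatten(),
    nn.Linear(64 * 16 * 16, 1),  # one logit: "fakeness" score before sigmoid
)

opt = torch.optim.Adam(detector.parameters(), lr=1e-4)
loss_fn = nn.BCEWithLogitsLoss()

# Stand-in batch: face crops labelled real (0) or fake (1).
frames = torch.rand(32, 3, 64, 64)
labels = torch.randint(0, 2, (32, 1)).float()

for step in range(100):
    opt.zero_grad()
    loss = loss_fn(detector(frames), labels)
    loss.backward()
    opt.step()

# At inference time, flag any frame scored above a chosen threshold.
with torch.no_grad():
    p_fake = torch.sigmoid(detector(frames))
    flagged = p_fake > 0.5
```

Note, though, that the detector and the forger learn from the same toolbox, so every improvement on one side hands ammunition to the other.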

But herein lies another problem: if we're dependent upon an infinite regress of smart technology (AI that holds accountable the AI that holds accountable the AI…and so on), do we drop the reins in a way we will all live to regret? In other words, will these deepfake videos evolve to the point where no human, no matter how smart, could distinguish real footage from fake?

And if that’s where we’re headed, shouldn’t we really be having conversations about if it’s where we want to go?
