Facebook wants you naked…and it’s for your own good


***UPDATE: Contrary to yesterday’s reporting, the BBC has now corrected its article on Facebook’s new “revenge porn” AI to include this rather critical detail:

“Humans rather than algorithms will view the naked images voluntarily sent to Facebook in a scheme being trialled in Australia to combat revenge porn. The BBC understands that members of Facebook’s community operations team will look at the images in order to make a “fingerprint” of them to prevent them being uploaded again.”

So now young victims will have the choice of mass humiliation, or faceless scrutiny…

Even if these human operators can be trusted not to share or copy these images, who is to say that they won’t mock, criticize or lust over these naked bodies from miles away in their office chairs? Or is it a case of “what you don’t know, doesn’t hurt you”?***

Facebook is asking you to be very open-minded, but it’s all for your own good. They need you to trust them. But it’ll be worth it…

That’s right. Facebook is asking for your nude photos so that its AI can help tackle the problem of so-called “revenge porn”, whereby ex-partners share compromising images of former lovers online. All it asks is that you send any vulnerable pictures of yourself, in all your glory, to your own account via Messenger. From there, AI can make and store a “fingerprint” of each image, which in turn helps the network prevent copies being shared (BBC). Note: this is not a digital memory of your body, but of a particular picture. So, if you think many snaps might be vulnerable, you must upload them all.
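For a rough sense of what a “fingerprint” means in practice, here is a minimal sketch of perceptual hashing in Python. It is only an illustration of the general idea, not Facebook’s actual system (which has not been published); the filenames, the threshold, and the use of the Pillow library are assumptions made for the example.

```python
# A minimal sketch of image "fingerprinting" via an average hash.
# Illustrative only: Facebook has not published the algorithm used in its trial.
# Assumes the Pillow library is installed and the filenames below exist.
from PIL import Image


def average_hash(path, size=8):
    """Return a 64-bit perceptual hash of the image at `path`."""
    # Shrink to an 8x8 grayscale thumbnail so minor edits
    # (resizing, re-compression) barely change the fingerprint.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the average.
    bits = "".join("1" if p > avg else "0" for p in pixels)
    return int(bits, 2)


def hamming_distance(h1, h2):
    """Count the bits that differ between two hashes."""
    return bin(h1 ^ h2).count("1")


# Usage: store only the hash of a reported image, then compare new
# uploads against it. A small distance suggests the same picture.
blocked = average_hash("reported_image.jpg")   # hypothetical filename
candidate = average_hash("new_upload.jpg")     # hypothetical filename
if hamming_distance(blocked, candidate) <= 5:
    print("Upload blocked: matches a stored fingerprint.")
```

The point of the sketch is that only the hash needs to be kept, not the photograph itself, which is the basis of Facebook’s claim that the original image is not stored.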

“Yowzers!”, I hear you cry. Indeed, but not so fast. There are a couple of points worth making here.

Firstly, Facebook is clear that it will not store the image itself, only the “fingerprint” (which its AI can check against to block identical images being posted). This should minimize security fears. Secondly, in Australia, where the current trial is taking place, revenge porn is becoming a real issue (as it is worldwide). Studies have shown that as many as one in five women aged between 18 and 45 may have suffered from “image-based abuse”. Though it’s easy to chuckle at the idea of naughty snaps circulating the internet, the sobering truth is that it is ruining lives, and in some cases causing suicidal thoughts. Being able to act before the fact could help many take back control.

It is important to note that this is still a trial and – of course – this particular AI can only prevent images being posted on Facebook. Offenders will still be able to upload photographs to other sites. What’s more, though this technology will rescue some would-be victims from their humiliating fate, it’s unlikely many scared and embarrassed women (and this does affect mostly women…) will feel comfortable with the process.

It will be fascinating to see if this is rolled out across the Facebook estate. If it is, and it works, might it represent a new milestone with regard to our trust in the tech giants? Could it signal a brave new world in which our lives are more closely integrated with – as well as better understood and parented by – large commercial companies than they are by our respective states? Watch this space.
