Revenge-Porn 'Deepfakes' Are Here To Spoil Humanity

AI-generated pornography – known as “deepfakes” – is becoming more convincing, seamless and real. People with rudimentary computing knowledge can now use artificial intelligence to swap the faces of actors in pornographic videos with those of people they know. Welcome to a new, terrifying era of revenge porn.

In January 2018, a new app was released that gave users the ability to swap out faces in a video with a different face obtained from another photo or video – similar to Snapchat’s “face swap” feature. It’s an everyday version of the kind of high-tech computer-generated imagery (CGI) we see in the movies.

You might recognise it from the cameo of a young Princess Leia in the 2016 Star Wars film Rogue One, which used the body of another actor and footage from the first Star Wars film created 39 years earlier.

Now, anyone with a high-powered computer, a graphics processing unit (GPU) and time on their hands can create realistic fake videos – known as “deepfakes” – using artificial intelligence (AI).

Sounds fun, right?

The problem is that these same tools are accessible to those who seek to create non-consensual pornography of friends, work colleagues, classmates, ex-partners and complete strangers – and post it online.

The evolution of deepfakes

In December 2017, Motherboard broke the story of a Reddit user known as “deepfakes”, who used AI to swap the faces of actors in pornographic videos with the faces of well-known celebrities. Another Reddit user then created a desktop application called FakeApp.

It allows anyone – even those without technical skills – to create their own fake videos using Google’s TensorFlow open source machine learning framework.

The technology uses an AI method known as “deep learning”, in which a layered neural network is trained on large amounts of data and learns patterns it can then apply to new inputs. In the case of fake porn, the software is trained on many images of a person’s face and learns to render that face convincingly in place of the actor’s in a pornographic video.
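Reported descriptions of the technique pair a shared encoder with one decoder per identity: to “swap”, an image of person A is encoded and then decoded with person B’s decoder. The toy sketch below is hypothetical and heavily simplified – single linear layers trained on random stand-in vectors rather than a real convolutional network trained on photographs – but it shows the shared-encoder/two-decoder structure and the swap step:

```python
import numpy as np

rng = np.random.default_rng(0)
dim, latent = 16, 4                       # "image" size and bottleneck size

faces_a = rng.normal(size=(100, dim))     # stand-ins for face A crops
faces_b = rng.normal(size=(100, dim))     # stand-ins for face B crops

enc = rng.normal(scale=0.1, size=(dim, latent))    # shared encoder
dec_a = rng.normal(scale=0.1, size=(latent, dim))  # decoder for face A
dec_b = rng.normal(scale=0.1, size=(latent, dim))  # decoder for face B

def step(x, dec, lr=0.01):
    """One gradient-descent step on the reconstruction error of x -> enc -> dec."""
    global enc
    z = x @ enc                  # encode
    out = z @ dec                # decode
    err = out - x
    loss = (err ** 2).mean()
    # gradient-descent updates (constant factors absorbed into the learning rate)
    dec -= lr * (z.T @ err) / len(x)
    enc -= lr * (x.T @ (err @ dec.T)) / len(x)
    return loss

losses = []
for _ in range(200):
    la = step(faces_a, dec_a)    # both faces train the SAME encoder,
    lb = step(faces_b, dec_b)    # but each trains its own decoder
    losses.append(la + lb)

# The "swap": encode a face-A image, decode it with face B's decoder.
swapped = (faces_a[:1] @ enc) @ dec_b
print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Because the encoder is shared, it learns features common to both sets of images, which is what lets one person’s decoder plausibly reconstruct the other person’s pose and expression. Real deepfake tools apply this idea with deep convolutional networks and thousands of aligned face crops.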

Known as “morph” porn, or “parasite porn”, fake sex videos or photographs are not a new phenomenon. But what makes deepfakes a new and concerning problem is that AI-generated pornography looks significantly more convincing and real.

Another form of image-based sexual abuse

Creating, distributing or threatening to distribute fake pornography without the consent of the person whose face appears in the video is a form of “image-based sexual abuse” (IBSA). Also known as “non-consensual pornography” or “revenge porn”, it is an invasion of privacy and a violation of the right to dignity, sexual autonomy and freedom of expression.

In one case of morph porn, an Australian woman’s photos were stolen from her social media accounts, superimposed onto pornographic images and then posted on multiple websites. She described the experience as causing her to feel:

physically sick, disgusted, angry, degraded, dehumanised

Yet responses to this kind of sexual abuse remain inconsistent, and regulation is lacking in Australia and elsewhere.

Recourse under Australian criminal law

South Australia, NSW, Victoria and the ACT have specific criminal offences for image-based sexual abuse with penalties of up to four years imprisonment. South Australia, NSW and the ACT explicitly define an “intimate” or “invasive” image as including images that have been altered or manipulated.

Jurisdictions without specific criminal offences could rely on more general criminal laws, such as the federal telecommunications offence of “using a carriage service to menace, harass or cause offence”, or state and territory offences covering unlawful filming, indecency, stalking, voyeurism or blackmail.

Recourse under Australian civil law

Victims have little recourse under copyright law unless they can prove they are the owner of the image. It is unclear whether that means the owner of the face image or the owner of the original video. They may have better luck under defamation law. Here the plaintiff must prove that the defendant published false and disparaging material that identifies them.

Pursuing civil litigation, however, is time-consuming and costly. It will do little to stop the spread of non-consensual nude or sexual images on the internet. Also, Australian civil and criminal laws will be ineffective if the perpetrator is located overseas, or if the perpetrator is an anonymous content publisher.

Artificial intelligence makes it easier for people to scrape facial imagery from social media accounts and superimpose it into pornographic videos.

Addressing the gap in legislation

The Australian Parliament is currently debating the Enhancing Online Safety (Non-Consensual Sharing of Intimate Images) Bill 2017. This bill, which is yet to become law, seeks to give the Office of the eSafety Commissioner the power to administer a complaints system and impose formal warnings, removal notices or civil penalties on those posting or hosting non-consensual intimate images.

Civil penalties are up to A$105,000 for “end-users” (the individuals posting the images) or A$525,000 for a social media, internet service or hosting service provider.

Importantly, the proposed legislation covers images which have been altered, and so could apply to instances of deepfakes or other kinds of fake porn.

Prevention and response beyond the law

While clear and consistent laws are crucial, online platforms also play an important role in preventing and responding to fake porn. Platforms such as Reddit, Twitter and PornHub have already banned deepfakes. However, at the time of writing, the clips continue to be available on some of these sites, as well as being posted and hosted on other websites.

A key challenge is that it is difficult for online platforms to distinguish between what is fake and what is real, unless victims themselves discover their images are online and contact the site to request those images be removed.

To adequately address the issue of fake porn, it is going to take a combination of better laws, cooperation from online platforms, as well as technical solutions. Like other forms of image-based sexual abuse, support services as well as prevention education are also important.

Comments

You can't put the toothpaste back in the tube. The only way to beat this is for deepfakes to become MORE prolific and well-known in the media. Then if someone makes one of you, people will just respond with "oh, another one".

I know this is a reply to an old comment (yay recycled articles!) but I'm not sure that helps. Punishments need to be consistent across offences and I'd hate to see huge fines/sentences for this when we still struggle to get real, physical offences sentenced.

That's not to say that the victim of a deepfake can't suffer mental anguish or financial harm (like losing their job). I think there needs to be a review of crimes and sentencing as a whole, working out more appropriate sentences for our modern society.
