The Banning of ‘Deepfake’ Videos

The emergence of advanced software that convincingly superimposes female celebrities’ faces onto pornographic videos is a new symptom of male entitlement to women’s bodies

This week Twitter, Pornhub, and Reddit changed their community guidelines to ban the non-consensual face-swapping of celebrities into porn videos. This comes after the emergence of some extremely convincing online videos that appear to show female celebrities such as Katy Perry, Ariana Grande, Jennifer Lawrence, and Natalie Portman performing in pornographic scenes. Pornhub has cited its stance against ‘involuntary porn’ as the reason the videos will be taken down when reported.

These videos have been christened ‘deepfakes’ after the most infamous member of the Reddit forum that produces and shares them. The software ‘FakeApp’ enables anyone with a little know-how to superimpose a face convincingly into a video. One important detail for those of us who aren’t famous is that this technology can be used with anybody’s picture. Any public figure, crush, ex-partner or even child could easily be the subject. This footage also raises questions about how trustworthy videos of events such as terrorist attacks, CCTV recordings and live broadcasts are. Now that videos can be manipulated to star an entirely different person, how far can we trust what we see?

The stills from deepfake videos are bizarre. The technological progression is unnerving and impressive at once: these videos exhibit techniques that have only been perfected in the last year, yet in many cases they are incredibly convincing. One thing is for sure: the emergence of these videos is unquestionably creepy. The thought of your face and identity being used for these ends is in itself some kind of nightmare. The fact that Jennifer Lawrence, who had nude photos published online, now has fake pornographic videos published without her consent leaves a nasty taste in the mouth. She recently spoke about how the 2014 leak affected her: “…it’s terrifying. When my publicist calls me, I’m like, ‘Oh, my God, what is it?’ even when it’s nothing. I’m always waiting to get blindsided again” (American Vogue). It seems that because she and over a hundred other female celebrities were in the public eye, private photos of them were considered fair game. The distress she experienced during this time is clear, and the anxiety about what fresh threat could be around the corner is palpable. It seems that deepfakes are the newest danger to women in both public and private life.

The bottom line is that this technology gives millions access to the most intimate parts of someone against their will. When I asked friends how they felt about the videos, they called them a violation and said they were a step too far in how we consume pop culture. One called them “the modern up-skirt photo”. This development feels like the latest in a long line of exhausting ways in which men wish to degrade and assault women. They already have all the porn videos they could ever wish for, all manner of titillating material at their fingertips. There are no clear statistics on exactly how much porn there is online, but it seems likely there is more than one could ever consume in a lifetime, with more published daily. A new pornographic video is made in the United States every 39 minutes (Webroot). All the viewing time in the world, and you still wouldn’t get through this vast amount of material. So why do they need more? Why do these women’s images need to be bastardised? When will these creeps have enough? This is just another story, thoroughly modern and ultra Black Mirror, in the long tale of men encroaching tirelessly and aggressively on women’s personal space and freedoms.

In this case, technology itself is not corrupting. Deepfakes are not a cause of systemic sexism, but a reflection of what some men will do when they have the means. These people are already misogynistic, and see nothing wrong in manipulating a woman’s image simply because they fancy doing it. They see these women as something to be consumed, with no thought of how violating it must be to see yourself in a porn video made with your image and without your consent. That Twitter, Reddit and Pornhub have moved so swiftly to condemn and try to remove these images is positive, but the threat of these videos is far from over. The problem is rooted far deeper than a piece of technology that emerged in 2017. This is an issue created by the fact that men think they have a God-given right to whatever they want, particularly women’s bodies. The issues of male exploitation of women are old. Advances in technology are simply a new mirror to reflect them.