Deepfake Porn: The Adult Industry Gets A Little Bit Creepier

Heard about the disturbing phenomenon of deepfake porn?

In the current climate, fake news is commonplace and, though some headlines are so ridiculous they are obviously untrue, others are alarmingly plausible. Combined with altered images and video footage (known as deepfakes), it’s becoming harder to discern what is real and what is not. In 2017, pornographic material created with deepfake technology, depicting celebrities and members of the general public performing sexual acts, began to surface on the internet. Known as deepfake porn, the material can be staggeringly well put together and is often hard to distinguish from the real thing. Naturally, this is a hugely controversial issue and has resulted in many major websites, platforms and networks blocking this kind of content.

In this feature, we take a look at exactly what deepfake porn is and why it’s so controversial. We also consider the legal and moral implications of this kind of adult content and find out what the major internet platforms are doing to prevent deepfake porn from being distributed.

What is Deepfake?

Deepfake is the term applied to the method of using AI technology to superimpose one set of images and videos onto a completely different set. The technique is known as human image synthesis and relies on deep-learning models.

Basically, you feed AI software (or an app) footage of a person along with a source video to superimpose them onto. The software can then overlay that person’s face (and body) onto the content, making it look like they have done something that they haven’t. The more footage you can feed it, the more realistic the result – hence the ‘deep learning’ aspect of the technology.

Deepfake uses artificial intelligence to create a new version of a video by superimposing source data. Image via Pixabay.

Deepfake has been around for a while and has even been released in the form of ‘fun’ apps that you can use on your smartphone, including FaceSwap, myFakeApp and DeepFaceLab.

Though the output of many of these low-budget apps is of pretty poor quality, there are more sophisticated versions whose results are undeniably realistic. In fact, such is the power of the algorithms being used that top-of-the-range software can extrapolate what your target subject looks like from quite limited input, so you don’t need to feed it extensive data. However, the more information the software has access to, the more realistic the results.

Examples of deepfake technology can be found across the internet from ‘satirical’ clips of current politicians to Facebook posts where individuals have made it seem as though they are incredibly athletic, performing Olympic feats for example.

Of course, some of this material is simply a bit of fun that is quite clearly produced for a giggle. However, with the application of this technology to adult content, deepfake has taken a rather dark turn…

Deepfake Porn: The Moral Issue

The falsification of images and videos is not a new phenomenon, and video/photo editing software has long been used for it, particularly when it comes to ‘celebrity porn’. You only have to do a little searching on the internet to find doctored photos dating back to the 1990s in which famous people have been depicted in porn photos and videos. As technology has improved and more people have gained access to editing tools and the skills to master them, the volume and quality of this material has increased.

However, it is the realism and format of deepfake porn which is causing so much controversy.

The phenomenon of deepfake porn came to the media’s attention in 2017 when one Reddit user started creating realistic fake pornography by feeding thousands of images into his home computer. The results led to some high-profile adult content hitting the mainstream media headlines. Two of the most notorious involved the star of the latest series of Star Wars films, Daisy Ridley, and the Israeli actress Gal Gadot. The material featuring the fake images of the latter, disturbingly, depicted her performing sexual acts with her step-brother. Whilst staged family porn produced in adult studios is a popular genre online, it is the fact that deepfakes are produced without the consent of the people depicted in them that is alarming.

Scarlett Johansson, another victim of the technology, has been vocal on the subject and was swift to point out the other sinister uses that deepfake porn is put to. Speaking to the Washington Post in 2018, Johansson claimed that whilst celebrities were mostly protected by their fame, others (particularly women) could be left exposed and vulnerable by being depicted in pornographic content without consent. Just like revenge porn, content of this nature, even if it is faked, can be highly damaging and distressing for the individuals concerned.

The implications of deepfake porn are serious, and the technology is being used by some in a similar way to revenge porn. Image via ComFreak/Pixabay.

Of course, there are wider abuses of the technology in which footage of minors and other vulnerable members of society is exploited. The result is a hugely troubling issue and one that is rightly getting the attention of governing bodies, major hosting platforms and the adult industry at large (see below).

Is All Deepfake Porn Bad?

The key to answering this question is the issue that distinguishes the legitimate adult industry from the rogue porn markets: consent.

The fact is that most deepfake porn is produced without the consent of the individual being depicted. However, that does not mean that deepfake technology could not be used to produce interactive scenes, for instance, in which fans could choose the model. There is even talk about offering ‘blank’ porn scenes in which users could upload images of themselves so they could ‘star’ in a porn movie.

Again, the key to this being legitimate is having the consent of the person whose images are being used.

Could the adult industry have a legitimate use for deepfake porn technology? Image via Wikimedia/Image by DrSJS from Pixabay.

Is Deepfake Porn Illegal?

There are no specific laws under which the production and/or distribution of deepfake porn can be prosecuted, but a number of other charges have been brought against people responsible for this kind of material.

In the United States, victims of deepfake porn have pursued charges of cyberstalking, identity theft and revenge porn and, in the UK, producers of this kind of content can be prosecuted for harassment.

Other countries across Europe also do not have specific statutes to cover the offence but do have laws in place under which successful prosecutions can be made.

There are discussions in both the U.S. and the U.K. about making the use of deepfakes (and not just for pornography) a specific criminal offence.

Deepfakes are not covered by any specific regulations anywhere in the world. Image via Pixabay.

What is the Internet Doing to Police Deepfake Porn?

Deepfake pornography originally started to circulate around the web in 2017, notably on the social news aggregation site Reddit. Initially flying under the radar, a subreddit specifically dedicated to deepfakes hosted pornographic material generated to look as though it featured stars like Katy Perry, Emma Watson and Taylor Swift.

Although it had a huge following, Reddit received many complaints, forcing the site to take action and finally shut the subreddit down in early 2018. The rise of deepfakes on the platform also resulted in a change to Reddit’s terms and conditions. The previous rules made some provision to prohibit pornography featuring involuntary participation, but they have now been extended to specifically ban:

‘the dissemination of images or video depicting any person in a state of nudity or engaged in any act of sexual conduct apparently created or posted without their permission, including depictions that have been faked.’

Twitter has also stepped into the ring, threatening to suspend accounts associated with posting deepfake content, as have two other media-sharing platforms, Discord and Gfycat. YouTube, on the other hand, is still hosting tutorial videos that teach you how to create adult deepfake videos ‘starring’ celebrities! This is despite its owner, Google, adding ‘involuntary synthetic pornographic imagery’ to its banned list in September 2018.

The adult industry itself is being asked to take a more active role in the clampdown on deepfake porn by using copyright law.

Given that no specific legal protection is currently offered to victims of deepfake porn, copyright infringement offers the best chance of prosecution. However, because no single image of a celebrity or other person can be identified as the source of the content, it is difficult for the victims themselves to pursue this route. That’s where the adult industry comes in.

Whilst the source images of the ‘star’ of a deepfake porn video can be difficult to determine, the scenes themselves are not. In this way, the studios who own the original content stand far more of a chance of pursuing legal action and having the material taken down.

The problem is that the porn industry already has its own fires to put out when it comes to piracy of their content.

The tube-hosting giant Pornhub has also been quick to respond to complaints about deepfake content on its platform. The company has advised the media that its site has included a ‘flagging’ mechanism since 2015 for users to report banned content, which includes non-consensual porn such as deepfakes. However, many critics argue that Pornhub’s owner, MindGeek, is not going far enough to prevent this kind of material from being uploaded in the first place.

Is Pornhub doing enough to prevent Deepfake porn? Image via Marco Verch/Flickr.

So, what kind of technology can be used to prevent deepfakes from even reaching an audience?

Despite much deepfake porn being extremely realistic, this kind of content is not without its flaws: simulations can be quite ‘choppy’ in places and often contain errors, particularly in lighting and shadows.

It is these flaws that are fuelling the hope that deepfake-detecting algorithms could be developed to distinguish fakes from the real thing. Certainly, if a fake porn video has been generated by an algorithm, then it doesn’t seem impossible that a similar algorithm could be used to spot an offending video.
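To give a rough idea of what that might look like in practice, here is a minimal, hypothetical sketch in Python of how a platform could score an upload frame by frame. Everything in it is an assumption for illustration: the fake_probability helper is invented for this example, and the untrained ResNet-18 stands in for a classifier that would first need to be fine-tuned on labelled real and fake footage.

```python
# Hypothetical sketch of frame-level deepfake scoring, not any platform's
# actual system. The ResNet-18 below is an untrained placeholder; a real
# deployment would load weights fine-tuned on labelled real/fake face data.
import cv2
import torch
import torchvision.transforms as T
from torchvision.models import resnet18

# Binary classifier: output index 0 = real, index 1 = fake (assumed labels).
model = resnet18(num_classes=2)
model.eval()

preprocess = T.Compose([
    T.ToPILImage(),          # convert the RGB numpy frame to a PIL image
    T.Resize((224, 224)),    # resize to the network's expected input size
    T.ToTensor(),            # scale to [0, 1] and reorder to CxHxW
])

def fake_probability(video_path: str, sample_every: int = 30) -> float:
    """Average the classifier's 'fake' score over sampled frames."""
    capture = cv2.VideoCapture(video_path)
    scores, index = [], 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % sample_every == 0:
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            batch = preprocess(rgb).unsqueeze(0)
            with torch.no_grad():
                probs = torch.softmax(model(batch), dim=1)
            scores.append(probs[0, 1].item())  # probability of 'fake'
        index += 1
    capture.release()
    return sum(scores) / len(scores) if scores else 0.0

# Example use: flag an upload for human review if the averaged score is high.
# if fake_probability("upload.mp4") > 0.8: queue_for_review("upload.mp4")
```

A real system would also need face detection, far more careful sampling and a properly trained model, but the shape of the pipeline – sample frames, score them, flag high averages for human review – is the point.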

Despite the inroads being made in the automatic recognition of deepfake porn on the mainstream channels, the fact is that there are dozens of sites out there that are more than happy to host this kind of content. Some, including EroMe, fail to see any issue with the genre and consider it a kind of ‘parody’. A spokesman for the site was quoted in the media during 2018 as saying:

“It’s something new and for the moment we do not see any reason to be against [it]. If the people concerned are of age and the video presented as a fake we take this as a parody. If it hurts someone it’s always possible to send a DMCA (Digital Millennium Copyright Act) request and we remove the content.”

The fact remains that whilst the technology exists and there are channels to distribute it on, there will be people prepared to create this material and to watch it. With further enhancements to its realism, it will no doubt become trickier to distinguish what is real from what is fake and, for all the reasons cited above, this is a worry.

The answer?

Further investment in detection software is just one part of the solution to this creepy issue, but legislation specifically prohibiting the practice would certainly help deter some people from generating, and distributing, it in the first place.

Featured image via Geralt/Pixabay.

Staff Writer
