Deepfakes strip dignity
February 18, 2018
A new artificial intelligence program allows people to digitally impose one person’s face onto another person’s body in a video. The way it is being used could lead to disaster.
This AI face-swapping technology has been available to the public for some time, but it wasn’t until recently that a particular use of it, known as deepfakes, became a significant problem.
A deepfake is made when someone takes images of another person’s face and uses the face-swapping AI to digitally impose that face onto pornography.
Several of these videos popped up online recently in subreddits dedicated to creating deepfakes of famous actresses. Actresses are targeted because producing a realistic result requires many images of a face from different angles, and actresses and models have countless photos of themselves online, making it easy to create the videos.
This puts women who work in industries where they are photographed often at particular risk. Daisy Ridley (“Star Wars: The Last Jedi”), Gal Gadot (“Wonder Woman”) and Sophie Turner (“Game of Thrones”) are just a few of the actresses who have been targeted by deepfakes, according to a report from Esquire.
The creation of deepfakes presents several problems, the biggest of which is the legal issue surrounding their non-consensual nature. The actress whose face is being digitally imposed has not allowed the video’s creator to use her likeness and therefore has not consented to being shown in this manner.
Many people who see these videos might be confused into believing that the actresses in question really are the women in the footage. Parents could forbid their children from seeing the next movie starring a favorite actress because they saw her in a porn video. That could cost studios money, meaning actresses could be out of work simply because of a misunderstanding and someone trying to sully their reputation.
That said, deepfakes pose a threat that goes beyond actresses and models to potentially anyone. We live in a vastly social world, and people post photos of themselves online all the time. Someone could take your photos and create a deepfake video of you, then use it to embarrass you as revenge porn, blackmail you or extort you for money. Deepfakes are not limited to porn, either: someone could swap your face onto the body of a person committing a crime.
Face-swapping done for comedy’s sake, like putting Nicolas Cage into every movie (a novelty act with its own subreddit), is harmless. Yet many are choosing to use it in harmful and devious ways. Something must be done to punish or hinder people who create deepfakes.
Danielle Citron, a law professor at the University of Maryland, told Wired that very little in existing law can prevent the creation of deepfakes or punish those who make them.
“It falls through the cracks because it’s all very betwixt and between. There are all sorts of First Amendment problems because it’s not their real body,” Citron said.
So, for now, deepfakes cannot be stopped through legal action. We must instead rely on bans from the websites where deepfakes are frequently posted, and that approach has already seen some success.
Reddit has taken down the deepfakes subreddit and amended its rules on revenge porn to state that any “involuntary pornography” will be removed from the site.
Twitter has suspended accounts that share media of an “intimate nature” without the consent of the person depicted. Even PornHub, one of the world’s largest porn sites, announced that it is working to remove deepfakes from its platform because they violate its rules on non-consensual content.
Deepfakes will continue to be an issue online, but some companies are taking the right steps to stop them in their tracks and prevent more people from being targeted. The law, however, has yet to catch up.