For years, there was an understanding that if something was on video, it was real. Then technology improved to the point where footage could be manipulated: it could be sped up, cropped, or altered in other ways that made reality a bit unclear. Still, it was usually easy to spot the odd details that threw a video’s authenticity into question. Now there’s a new technology in town that truly blurs the line between what’s real and what’s not: deepfakes. What are they? And what are deepfakes used for?
The history of deepfakes
A “deepfake” is essentially advanced face-swapping. What makes the technology impressive is that a computer does the hard work, which used to be a slow, tedious process a human had to do by hand. Ian Goodfellow and his research team laid the groundwork in 2014 with generative adversarial networks (GANs), an artificial-intelligence technique that can learn to produce convincing synthetic images. The AI examines facial expressions from a database of images and videos, then matches them to a second video. The software is free and available to anyone.
What are deepfakes used for? The first real application of this AI face-swapping came from an anonymous user calling himself “deepfakes.” In 2017, he started swapping the faces of celebrities like Daisy Ridley and Gal Gadot onto porn performers in adult videos, distributing his work on Reddit. As people worked out the bugs, the swapping became nearly seamless. The media got wind of these videos, as did many of the actresses featured. Scarlett Johansson, whose image is frequently used in this type of “deepfake porn,” talked to the Washington Post in 2018 about it, calling the internet a “vast wormhole of darkness that eats itself.”
How do deepfakes work, exactly? As we mentioned, they rely on artificial intelligence, along with open-source software libraries that do the heavy lifting. When Redditor deepfakes makes his videos, he simply uses stock photos, Google image search, and YouTube to gather pictures of the celebrities he’s looking for. With millions of images to learn from, the network can study the original face, find matching images of, say, Gal Gadot, and perform the swap using an algorithm. That algorithm is doing “deep learning,” the same kind of processing used in movies when directors want to change a winter scene to a summer one, or day to night, and so on.
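One common way these tools are structured is as a pair of autoencoders: a single shared encoder learns a general representation of a face, and a separate decoder is trained for each person. The swap happens when a frame of person A is encoded and then decoded with person B’s decoder. Below is a toy sketch of that idea in plain NumPy; the tiny linear layers, the random stand-in “frames,” and all the sizes are illustrative assumptions, not anyone’s actual implementation.

```python
import numpy as np

# Toy, heavily simplified sketch of the shared-encoder / per-identity-decoder
# idea behind deepfake face-swapping. Real systems use deep convolutional
# networks trained on real video frames; here, small linear layers and random
# arrays stand in for both, just to show the structure of the swap.

rng = np.random.default_rng(0)

DIM, LATENT, N = 64, 8, 200             # flattened "frame" size, latent size, frame count
frames = {
    "A": rng.normal(size=(N, DIM)),     # stand-in frames of person A
    "B": rng.normal(size=(N, DIM)),     # stand-in frames of person B
}

encoder = rng.normal(scale=0.1, size=(DIM, LATENT))   # shared across identities
decoders = {p: rng.normal(scale=0.1, size=(LATENT, DIM)) for p in frames}

lr, losses = 1e-3, []
for step in range(500):
    total = 0.0
    for person, X in frames.items():
        D = decoders[person]
        Z = X @ encoder                  # encode into the shared latent space
        recon = Z @ D                    # decode with this person's own decoder
        err = recon - X
        total += float(np.mean(err ** 2))
        # gradient descent on the mean squared reconstruction error
        D -= lr * (Z.T @ err) / N
        encoder -= lr * (X.T @ (err @ D.T)) / N
    losses.append(total)

# The "swap": encode a frame of person A, but decode it with B's decoder.
swapped = (frames["A"][:1] @ encoder) @ decoders["B"]
print(swapped.shape)                     # (1, 64) -- same size as the input frame
```

Because both identities pass through the same encoder, the latent space ends up describing generic facial structure, which is what lets one person’s decoder plausibly “re-render” another person’s expression.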
If the idea that someone can make a video of “Gal Gadot” as a porn actress disturbs you, you’re not alone. People also make this kind of porn with non-celebrities, often with the intent to harass and bully, and it can make a victim’s life a living hell. What can be done legally? Not much right now. Advocates and lawyers are trying to get the courts to recognize “non-consensual porn” as a type of harassment or defamation, but the videos could be considered a form of protected speech. It’s new territory, so it will take some time for the laws to catch up.
Deepfakes have implications beyond porn, forcing people to keep coming up with answers to the question, “What are deepfakes used for?” as well as, “What could deepfakes be used for?” You could make a person appear to say something incendiary, and the fake news could spread as real and legitimate before the truth is revealed. Society needs to realize how easy it is to create fake videos and photos, and the technology to catch these videos needs to improve. Sites that host videos (like Reddit) should also crack down on forgeries and develop better tools to filter what’s real from what’s fake. It may not be easy, but as fast as deepfake technology is moving, it’s necessary.
Artificial intelligence can be used for very innocent purposes, like identifying plants. Here’s an app that does just that.