Deepfake videos draw global concern as real and fake become harder to tell apart

Last week, Facebook’s founder Mark Zuckerberg, seated at a desk, appeared to deliver a dark, sinister speech about the power of his social media firm. The video is framed with a broadcast-style caption that reads, ‘We’re increasing transparency on ads’, making it appear to be part of a news segment. However, it wasn’t Mark Zuckerberg speaking. It was a deepfake video uploaded to Instagram.

For starters, deepfake videos are fake videos made using software that swaps one person’s face with someone else’s. Ever since deepfakes stormed the Internet, several prominent personalities have suffered horrifying situations as their faces have been superimposed on the bodies of porn actors and the footage clipped together into adult films. Several politicians, celebrities and business persons have become victims of this horrendous tool.

Defining fake
Deepfake is a portmanteau of ‘deep learning’ and ‘fake’. It is a technique for human image synthesis based on artificial intelligence. It is used to combine and superimpose existing images and videos onto source images or videos using a machine learning technique known as a generative adversarial network (GAN). The term ‘deepfake’ was coined in 2017. Because of these capabilities, deepfakes have been used to create fake celebrity pornographic videos and revenge porn. Deepfakes can also be used to create fake news and malicious hoaxes.
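For readers curious about the mechanics, the sketch below shows, in broad strokes, how a generative adversarial network pits two neural networks against each other: a generator that fabricates images and a discriminator that tries to tell them from real ones. It is a minimal illustration written against TensorFlow/Keras; the layer sizes, the 64x64 face crops and the training data are placeholder assumptions, not the code of any actual deepfake tool.

```python
# Minimal, illustrative GAN training step (not a working face-swap tool).
# Assumes TensorFlow 2.x / Keras; all sizes are placeholders.
import tensorflow as tf
from tensorflow.keras import layers

LATENT_DIM = 100
IMG_SHAPE = (64, 64, 3)  # hypothetical aligned face crop

# Generator: maps random noise to a fake image.
generator = tf.keras.Sequential([
    layers.Dense(8 * 8 * 128, activation="relu", input_shape=(LATENT_DIM,)),
    layers.Reshape((8, 8, 128)),
    layers.Conv2DTranspose(64, 4, strides=2, padding="same", activation="relu"),
    layers.Conv2DTranspose(32, 4, strides=2, padding="same", activation="relu"),
    layers.Conv2DTranspose(3, 4, strides=2, padding="same", activation="tanh"),
])

# Discriminator: judges whether an image is real or generated.
discriminator = tf.keras.Sequential([
    layers.Conv2D(32, 4, strides=2, padding="same", input_shape=IMG_SHAPE),
    layers.LeakyReLU(0.2),
    layers.Conv2D(64, 4, strides=2, padding="same"),
    layers.LeakyReLU(0.2),
    layers.Flatten(),
    layers.Dense(1),  # real/fake logit
])

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt = tf.keras.optimizers.Adam(1e-4)
d_opt = tf.keras.optimizers.Adam(1e-4)

@tf.function
def train_step(real_images):
    noise = tf.random.normal([tf.shape(real_images)[0], LATENT_DIM])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake_images = generator(noise, training=True)
        real_logits = discriminator(real_images, training=True)
        fake_logits = discriminator(fake_images, training=True)
        # Discriminator learns to label real frames 1 and fakes 0;
        # the generator learns to make the discriminator label fakes 1.
        d_loss = (bce(tf.ones_like(real_logits), real_logits)
                  + bce(tf.zeros_like(fake_logits), fake_logits))
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    return d_loss, g_loss
```

Repeating this step over many batches of real faces pushes the generator towards images the discriminator can no longer distinguish from genuine ones, which is what gives deepfakes their realism.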

The faking tool
In January 2018, a desktop application was launched that allowed users to easily create and share videos with faces swapped. The app uses an artificial neural network, the power of the graphics processor and three to four gigabytes of storage space to generate the fake video. For a detailed result, the program needs a lot of visual material of the person to be inserted: the deep learning algorithm uses the video sequences and images to learn which aspects of the image have to be exchanged. A sketch of the kind of architecture commonly described for such tools follows below.
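The app’s own source code is not published here, but face-swap tools of this kind are commonly described as training a single shared encoder together with one decoder per person, so that a face encoded from one person can be decoded in the other’s likeness. The sketch below illustrates that idea under those assumptions; the layer sizes, image size and training calls are hypothetical, not the launched app’s actual code.

```python
# Illustrative shared-encoder / two-decoder autoencoder sketch for face swapping.
# Assumes TensorFlow 2.x / Keras; sizes and data are placeholders.
import tensorflow as tf
from tensorflow.keras import layers

IMG_SHAPE = (64, 64, 3)   # hypothetical aligned face crop
CODE_DIM = 256

def build_encoder():
    # Shared encoder: compresses any aligned face into a compact code.
    return tf.keras.Sequential([
        layers.Conv2D(64, 4, strides=2, padding="same", activation="relu",
                      input_shape=IMG_SHAPE),
        layers.Conv2D(128, 4, strides=2, padding="same", activation="relu"),
        layers.Flatten(),
        layers.Dense(CODE_DIM),
    ])

def build_decoder():
    # Per-identity decoder: reconstructs a face in one person's likeness.
    return tf.keras.Sequential([
        layers.Dense(16 * 16 * 128, activation="relu", input_shape=(CODE_DIM,)),
        layers.Reshape((16, 16, 128)),
        layers.Conv2DTranspose(64, 4, strides=2, padding="same", activation="relu"),
        layers.Conv2DTranspose(3, 4, strides=2, padding="same", activation="sigmoid"),
    ])

encoder = build_encoder()
decoder_a = build_decoder()   # trained only on person A's faces
decoder_b = build_decoder()   # trained only on person B's faces

autoencoder_a = tf.keras.Sequential([encoder, decoder_a])
autoencoder_b = tf.keras.Sequential([encoder, decoder_b])
autoencoder_a.compile(optimizer="adam", loss="mae")
autoencoder_b.compile(optimizer="adam", loss="mae")

# Training (hypothetical): faces_a and faces_b are large sets of aligned crops.
# autoencoder_a.fit(faces_a, faces_a, epochs=...)
# autoencoder_b.fit(faces_b, faces_b, epochs=...)
#
# Swap at inference time: encode person A's frame, decode with B's decoder,
# so B's likeness appears with A's pose and expression.
# swapped = decoder_b(encoder(frame_of_a))
```

This also explains why the program needs so much footage of the target: the per-person decoder can only reproduce expressions and angles it has actually seen during training.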

The software uses Google’s AI framework TensorFlow, which among other things was previously used for the DeepDream program. Celebrities are the main subjects of such fake videos, but ordinary people also appear in them. In August 2018, researchers at the University of California, Berkeley published a paper introducing a fake-dancing app that can create the impression of masterful dancing ability using AI. There are also open-source alternatives that allow users to create deepfake videos.

Effects on credibility
One effect of deepfakes is that it can no longer be easily distinguished whether content has been manipulated or is genuine. AI researcher Alex Champandard has said everyone should know how fast things can be corrupted today with this technology, and that the problem is not a technical one, but rather one to be solved by trust in information and journalism. The primary pitfall is that humanity could slip into an age in which it can no longer be determined whether a medium’s content corresponds to the truth.

Spot lens
There are ways you can figure out whether you are watching a deepfake video. Watch for signs such as blurring evident in the face but not elsewhere in the video, a change of skin tone near the edge of the face, double chins, double eyebrows or double edges to the face, and whether the face gets blurry when it is partially obscured by a hand or another object.
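One of these cues, a face that looks softer than the rest of the frame, can even be checked roughly in code. The sketch below uses OpenCV’s bundled Haar face detector and compares the variance of the Laplacian (a simple sharpness measure) inside the face box against the whole frame. The file name and the 0.5 threshold are arbitrary assumptions; this is an illustration of the cue, not a reliable deepfake detector.

```python
# Rough heuristic: is the detected face region noticeably blurrier than the frame?
import cv2

def face_blur_ratio(frame):
    """Return face_sharpness / frame_sharpness for the first detected face,
    or None if no face is found. Sharpness = variance of the Laplacian."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    face = gray[y:y + h, x:x + w]
    face_sharpness = cv2.Laplacian(face, cv2.CV_64F).var()
    frame_sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    return face_sharpness / max(frame_sharpness, 1e-6)

# Usage on a video: count frames where the face is suspiciously soft.
cap = cv2.VideoCapture("suspect_clip.mp4")   # hypothetical file name
suspicious, sampled = 0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    sampled += 1
    ratio = face_blur_ratio(frame)
    if ratio is not None and ratio < 0.5:    # assumed threshold
        suspicious += 1
cap.release()
print(f"{suspicious}/{sampled} frames had an unusually blurry face")
```

A high count is only a hint to look closer, since compression, motion blur and shallow focus can produce the same effect in perfectly genuine footage.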

NT Bureau