Digitally imposing a face on someone else is by no means a new technology. The film industry has used it for everything from retouching faces, to making an actor look older or younger, to creating a humanoid chimpanzee. However, this technology required serious hardware to render. It was also time-consuming, relying on motion capture and a team of animators painstakingly adding minor details to make the result as believable as possible. And even then, the results could veer into the uncanny valley.
Introducing Deep Fakes
Over the last year, a new process has been emerging and slowly improving. These are face swaps on steroids. Deep Fakes use artificial neural networks that are trained to recognise faces and recreate them. The neural network then imposes the face and even moves the mouth to match the audio. Some users have used this to place Nicolas Cage in as many films as possible. Seems harmless enough, right?
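For the curious, the core trick behind most early deep-fake tools is a shared encoder paired with one decoder per person: the encoder learns features common to both faces, and the swap happens by decoding person A's encoding with person B's decoder. Below is a toy sketch of that architecture in Python with NumPy; the dimensions, weights, and function names are all illustrative, and no actual training happens here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a flattened 8x8 grayscale "face" and a small latent code.
# Real systems work on much larger aligned face crops.
FACE_DIM, LATENT_DIM = 64, 16

# One shared encoder learns features common to both people;
# each person gets their own decoder (weights here are random, untrained).
W_enc = rng.normal(size=(LATENT_DIM, FACE_DIM)) * 0.1
W_dec_a = rng.normal(size=(FACE_DIM, LATENT_DIM)) * 0.1
W_dec_b = rng.normal(size=(FACE_DIM, LATENT_DIM)) * 0.1

def encode(face):
    # Shared encoder: compress a face into a person-agnostic latent code.
    return np.tanh(W_enc @ face)

def decode(code, W_dec):
    # Person-specific decoder: reconstruct a face from the latent code.
    return W_dec @ code

def swap(face_a):
    # The swap itself: encode person A's face with the shared encoder,
    # then reconstruct it with person B's decoder.
    return decode(encode(face_a), W_dec_b)

face_a = rng.random(FACE_DIM)
fake = swap(face_a)
print(fake.shape)  # same shape as the input face: (64,)
```

In practice, the two decoders are trained to reconstruct their own person's face from the shared code, which is what makes the cross-decoding produce a plausible swap rather than noise.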
A moving actor is obviously going to be more difficult to recreate. It works better if the subject stays in more or less the same position, and if the neural network has plenty of material to train on. Say, a politician during a public address.
Get a good enough impersonator, and in the future you could potentially have anyone on video saying anything.
What can we do about deep fakes?
To be clear: this process is not perfect. In the Obama video, you can see his mouth sometimes moves unnaturally, or doesn't quite match the words. And you can't simply place any face on anyone else. You also still need a decently powerful computer, and the neural network needs to be trained on many images of the target. And on higher-resolution videos especially, the fakes can be even easier to spot.
For the time being, since the technology is still in its infancy, it's mostly just harmless fun. But should Deep Fakes continue to improve, to the point where they are convincing enough, the potential for abuse is dangerous. As redditor derpfakes said, "If I can put together a video in my home […] imagine what major tech firms and governments can do". How do we keep track of what is real and what is fake?
What do you think should be done about deep fakes? Let us know in the comments below.