Starting World War III from Home

by Cornelius and Jacob

Imagine watching the news and seeing Trump declare war on Russia, China and North Korea. With the help of artificial intelligence, that is not as far-fetched as it sounds; you might even believe it, with Trump insulting world leaders on a daily basis. Manipulating videos to make politicians say whatever you want is easy for anyone with a bit of practice and artificial intelligence. The technology is called DeepFake.

Figure 1: Recognising Facial Features

To create DeepFakes, a neural network (AI) uses images of the targeted person to analyse their expressions and facial features by placing landmark points on distinctive facial marks. With enough video footage or pictures, the AI knows how that person's face would look when afraid, happy, or saying “supercalifragilisticexpialidocious”. Snapchat uses a similar program to create fun features like face swap.
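As a toy sketch of the landmark idea (not any real DeepFake pipeline — the landmark layout and thresholds here are invented for illustration), assume a detector has already placed (x, y) points on the mouth; simple distances between those points are the kind of raw signal a network learns expressions from:

```python
import math

# Hypothetical landmark order (assumption for illustration):
# left mouth corner, right mouth corner, upper lip, lower lip
def expression_signature(landmarks):
    """Summarise a mouth from four (x, y) landmark points.

    Returns (mouth_width, mouth_opening) - two crude numbers a
    network could use to tell a smile from a neutral face.
    """
    left, right, upper, lower = landmarks
    mouth_width = math.dist(left, right)
    mouth_opening = math.dist(upper, lower)
    return mouth_width, mouth_opening

# A smiling mouth is wider and more open than a neutral one.
neutral = [(30, 60), (70, 60), (50, 58), (50, 62)]
smiling = [(25, 58), (75, 58), (50, 55), (50, 65)]

w1, o1 = expression_signature(neutral)
w2, o2 = expression_signature(smiling)
print(w2 > w1 and o2 > o1)  # the smile is wider and more open
```

Real systems track dozens of such points per frame; with enough frames, the mapping from landmarks to expressions can be learned and then replayed on another face.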

The film industry especially has valuable applications for DeepFake, which can potentially save millions. Instead of editing frame by frame in Photoshop, DeepFake can be used to bring dead actors back to life or to correct mouth movements while dubbing a movie. In the 2016 movie “Rogue One”, Disney spent millions to recreate Princess Leia, although her actress, Carrie Fisher, had aged 34 years since the original footage.[1] A few years later, fans created comparable footage for free with the help of DeepFake programs.

Despite these benefits, the DeepFake technology can also be abused. DeepFake has its highest demand in the pornography industry: in fact, 96% of all DeepFake videos contain adult content. Female celebrities in particular are targeted, but ordinary women and teenage girls also find themselves made unwillingly famous in pornography. It is impossible to guard one’s privacy; any picture can be used to steal your face and violate your dignity.
Figure 2: Using Deepfake on President Obama

With the media effectively being the fourth pillar of democracy, and video one of its most important tools, the application of DeepFake becomes very dangerous. In early 2019, a Fox News employee used DeepFake to mock President Trump’s appearance during his Oval Office address. Already in 2018, the famous comedian and Obama impersonator Jordan Peele had used DeepFake to make Obama address the danger of DeepFakes in politics. By having Obama insult Trump as a “dipshit” and point out that “our enemies can make it look like anyone can say anything at any point in time”, Jordan Peele tried to raise awareness.

New technologies like “Lyrebird” can even train an AI to imitate a person’s voice by analysing just a few minutes of audio. The AI is able to fake an emotional state, making the target sound concerned, elated or hateful. Only a few parts of speech, such as sounds created by mouth movements or breathing, cannot be imitated yet.

With a few pictures and a few minutes of audio, one can wield the power the targeted person holds. The CEO of a British energy company was called by what sounded like his boss at the German parent company and ordered to transfer 220,000 euros to a Hungarian supplier. Because the CEO believed he recognised his boss’s voice and its typical intonation, he complied and made the payment.[2]

So how do we combat the threat of DeepFakes? Pictures, videos and voice messages could be certified via blockchain. Blockchain would ensure the authenticity of media, but would consume vast amounts of energy: the cryptocurrency Bitcoin, which also uses blockchain technology, consumes 66.7 terawatt-hours per year, as much as the Czech Republic.
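A minimal sketch of the certification idea (the scheme itself is hypothetical, not any deployed system): a media file’s bytes are fingerprinted with a cryptographic hash, and that fingerprint is chained to the previous record, so any later alteration of the file no longer matches what was registered:

```python
import hashlib

def media_fingerprint(media_bytes: bytes) -> str:
    """SHA-256 digest of the raw media bytes."""
    return hashlib.sha256(media_bytes).hexdigest()

def chain_block(prev_hash: str, fingerprint: str) -> str:
    """Link a fingerprint to the previous block, blockchain-style."""
    return hashlib.sha256((prev_hash + fingerprint).encode()).hexdigest()

# Register an original clip, then check a tampered copy against it.
GENESIS = "0" * 64  # placeholder previous hash for the first block
original = b"frame data of the original clip"
tampered = b"frame data of the altered clip"

registered = chain_block(GENESIS, media_fingerprint(original))
print(chain_block(GENESIS, media_fingerprint(original)) == registered)  # True
print(chain_block(GENESIS, media_fingerprint(tampered)) == registered)  # False
```

Hashing itself is cheap; the energy cost mentioned above comes from the proof-of-work consensus that public blockchains like Bitcoin use to make the chain of records tamper-evident without a trusted authority.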

Figure 3: Spotting DeepFake Altered Footage

The alternative would be to fight fire with fire. Microsoft and Facebook recently invested 10 million USD in the development of AI that is trained to spot AI-altered DeepFake videos. Google has released 3,000 DeepFakes to help researchers train their AI.[3] Since governments in particular have an interest in controlling DeepFakes, the Pentagon has also allocated part of its budget to the fight against the misuse of DeepFake. New technologies are on the horizon, just a few steps away from being implemented to fight DeepFakes.

The Wall Street Journal calls the DeepFake fight a “cat and mouse game” with an endless chase. Although it is easy to imagine a darkly painted dystopian future, we believe that the cat is faster and stronger than the mouse, its fangs an inch away from the mouse’s neck. The development of DeepFake detection AI will succeed and return some of the authenticity media once had. Nevertheless, this reasoning is no excuse not to stay vigilant, especially now, with the presidential election coming up in 2020. Think twice before you believe what you see!




5 replies to “Deepfakes”

  1. Wow! You chose an extremely interesting topic and explained it very well and in an easy-to-comprehend way. You used appropriate language and wrote very nicely. Your arguments were very differentiated and your structure was well-chosen. In total, it was an enjoyable and enriching experience reading your post! Congrats!

  2. Well written article!
    #welldone #great #informative #best #youarethegreatest

    OMG ! We didn’t even know such #technologies existed. That’s #scary. In your article you have worked with really good examples that illustrate the #technology well. In the #future, the news channels will have to pay very close #attention to what they publish. Good #journalisticeducationalwork. You also added a lot of #links… which let you dive even deeper into the #topic.

    Thanks for informing us, so we can be careful about what #webelieve. #thanks

  3. We really enjoyed reading your blog entry, it was very interesting. It’s nice that you mentioned advantages and disadvantages of DeepFake using meaningful examples. The title was also very eye-catching, and the video makes the text easier to read.
    However, you could have mentioned in which cases this technology could lead to a dystopia. Also, we don’t agree that the misuse of DeepFake could ever be completely stopped.
    Nevertheless, we liked your metaphor in the end.
    All in all, your blog is very well-written and contains many interesting details 😀
