Deepfakes have long ceased to be mere Internet "pranks." What began as a technical gimmick is now used to put words into the mouths of prominent figures, words they never actually said. The tactic has been observed with growing frequency in Russian propaganda, especially in the context of the war in Ukraine. But Russia is apparently not the only party with an appetite for deepfakes: the U.S. military, too, would like to use them as a tool of digital deception in the future.
Call for deepfakes from US special forces
In the war in Ukraine, Russia is increasingly using deepfakes for domestic propaganda, and it presents this approach publicly with evident pride. According to a November 2022 report in the Süddeutsche Zeitung, Vladimir Putin had a deepfake video of German Chancellor Olaf Scholz played for him at a technology fair. In it, Scholz delivered a speech whose words never actually passed the chancellor's lips; instead, sophisticated technology made a lifelike Olaf Scholz say whatever a programmer, or an AI, had put into his mouth. Such a fake can pay off on several levels.
On the one hand, it can boost morale at home. On the other, it can be used to spread fake video messages. That is what happened in the summer of 2022, when Franziska Giffey (SPD), then the incumbent mayor of Berlin, received a fake video call from someone posing as Vitali Klitschko. Responsibility for the video conference, which was staged with the help of deepfake technology, was later claimed by a comedy duo from Russia. The evident effectiveness of such forgeries, however, has drawn more than just criticism. The United States Special Operations Command (SOCOM) sees the technology as something of a wonder weapon of the present and future. In the information war in particular, the manipulation of video, images, and even audio recordings could serve as a powerful weapon.
A fatal step in the wrong direction?
The deepfake attacks mounted on the Russian side show that Vladimir Putin, and with him the Russian military, apparently place great stock in the technology. To avoid falling behind, SOCOM is now demanding that more money be invested in corresponding manipulation techniques. A catalog of these demands has been published by the magazine The Intercept. According to the report, the goal of the deepfakes is to influence a country's press outlets and social media. The media response to the deepfakes published in connection with the war in Ukraine makes it clear that this is easily possible. So far, however, it has usually become apparent quite quickly that the videos were fakes. If SOCOM has its way, the manipulation of image and sound will be taken to a whole new level.
Apparently, SOCOM does not even want to shy away from hacking attacks on regional smart TVs and other Internet-connected devices. Just imagine a platform like YouTube suddenly being flooded with deepfakes. Not everyone, however, is enthusiastic about these plans for the future, least of all in the USA itself, where the government has always emphasized the need to work on countermeasures. Opposing voices can also be heard from the machine learning industry. In The Intercept's report, for example, AI expert Chris Meserole stressed that in the area of disinformation one should "not fight fire with fire": manipulating audio and image files would destroy "the very foundation of democracy."