The team at Vgency has analyzed the recent deepfake of Ukrainian president Volodymyr Zelensky. The deepfake itself lacks professionalism and is far from convincing. It is alarming, though, that the video went live on a Ukrainian news channel after hackers compromised the TV station.
While the hacker group did a poor job on the deepfake, they successfully found their way into the TV station's IT infrastructure and took over the live broadcast. Such a cyberattack, combined with a more realistic-looking deepfake, could have a devastating impact and would be far harder for officials to debunk afterwards.
Vgency makes the following conclusions exclusively available to registered users.
The video composite consists of four main layers:
- A static background graphic shows logos of the «Office of the President of Ukraine».
- The upper body of a person was recorded in front of a blue screen. A green screen is unlikely because of the green t-shirt. Artifacts from the keying process are noticeable on the left and right edges of the arms. The dark, wobbly shadow on the right arm (left arm for the viewer) indicates the use of a blue screen, while the blocky edges on the left arm (right arm for the viewer) are typical of video compression that degraded the keying process.
- The deepfake head and face of a digitally replicated president Volodymyr Zelensky appear on top of the recorded person. This digital counterfeit is the actual deepfake. The head of the originally recorded person was completely removed and the deepfake version superimposed instead. The transition from the real person's neck to the deepfake head is poorly executed.
- Another superimposed static picture shows the upper part of the podium with the coat of arms of Ukraine.
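The four layers above can be sketched with the standard "over" compositing operator. The colors and mattes below are placeholders to show the stacking order, not values measured from the video:

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Standard 'over' operator: composite a foreground layer onto a background."""
    a = fg_alpha[..., None]  # broadcast the matte over the RGB channels
    return fg_rgb * a + bg_rgb * (1.0 - a)

H, W = 4, 4  # tiny frame for illustration
background = np.full((H, W, 3), 0.2)            # layer 1: static background graphic
body       = np.full((H, W, 3), 0.5)            # layer 2: keyed blue-screen recording
body_a     = np.ones((H, W))                    # matte produced by the keying process
head       = np.full((H, W, 3), 0.8)            # layer 3: superimposed deepfake head
head_a     = np.zeros((H, W)); head_a[:2] = 1.0 # matte covering the head region
podium     = np.full((H, W, 3), 0.9)            # layer 4: static podium graphic
podium_a   = np.zeros((H, W)); podium_a[3:] = 1.0

# Stack the layers bottom-up, exactly as listed in the analysis
frame = over(body, body_a, background)
frame = over(head, head_a, frame)
frame = over(podium, podium_a, frame)
```

A rough keying matte like `body_a` is exactly where the edge artifacts described above originate: compression noise in the source footage makes the matte blocky, and those blocks survive into the final composite.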
The hacker group didn't even try to create a synthetic deepfake version of the voice. The voice sounds human, but not like president Zelensky's actual voice. A deepfake voice generator would have produced different sound characteristics. The voice in the deepfake video was self-recorded using the cheapest equipment. The imitated voice is one of many reasons why this deepfake failed. We already explained in our previous article why voice deepfakes are more challenging but can also be more dangerous.
Securing Official Communication
The first line of defense against deepfakes is to increase complexity in all official audiovisual communication. Increasing complexity includes methods such as the following:
- High-quality cameras and microphones to allow 4K recording, production and archiving, as well as surround audio mastering.
- Use at least two or three camera angles, because it is easy to create a single deepfake from the front in a typical medium close-up. Complexity increases tremendously if one or two additional cameras are used, e.g. one from the side as a medium close-up and one more as a full shot or medium wide shot.
- Increasing the amount of detail in video scenes makes deepfakes harder to produce, especially in combination with more camera angles. Add patterns to the background similar to security features in banknotes. Add complex decoration like plants, and elements that cause reflections such as chrome, glass and mirrors.
- Only produce raw, real video. Don't use green/blue screens to insert artificial backgrounds. Be cautious about using video effects and computer graphics.
- Use more body language. Deepfakes focus strongly on the face and rarely get the voice right. Train your facial expressions and gestures.
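As a toy illustration of the banknote-style background patterns suggested above, a fine interference pattern can be generated procedurally. The function and its parameters are hypothetical sketches, not a vetted security design:

```python
import numpy as np

def guilloche_pattern(h, w, waves=8, freq=40.0):
    """Generate a fine interference pattern, loosely inspired by banknote
    guilloche designs, as a grayscale image in [0, 1]. Illustrative only:
    a real anti-tampering background would be designed by specialists."""
    y, x = np.mgrid[0:h, 0:w] / max(h, w)  # normalized pixel coordinates
    img = np.zeros((h, w))
    for k in range(waves):
        # Sum sinusoids at evenly rotated orientations to create moiré detail
        angle = np.pi * k / waves
        phase = x * np.cos(angle) + y * np.sin(angle)
        img += np.sin(2 * np.pi * freq * phase + k)
    img = (img - img.min()) / (img.max() - img.min())  # normalize to [0, 1]
    return img

pattern = guilloche_pattern(256, 256)
```

The idea is that such high-frequency detail is cheap to add to a real studio set but expensive for a deepfake pipeline to reproduce consistently across frames and camera angles.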
There is more that can be done to secure official communication. Technology doesn't rest. Below you can watch a professional deepfake of the Prime Minister of the Netherlands created by a professional deepfake artist. The basic concept is the same as in the Zelensky deepfake: an actor in front of a green screen who can (really) imitate the voice. With the right set of talents, astonishing results of the highest quality are possible. Fortunately, those talents weren't available for the Zelensky deepfake.