What Are Deep Fakes, And How Can We Detect Them?

The internet has proven, time and time again, to be an endless source of beneficial content, tools and communication advances for all of humanity. Many experts consider it the most revolutionary invention of the last 30 years, and one that will keep transforming societies around the world for a long time to come.

Unfortunately, despite its benefits, new technologies are also emerging that exploit the internet's enormous reach to misinform and cause harm. This is how the infamous “Deep Fakes” were born: tools that had a deplorable purpose from the start and whose techniques quickly evolved to cause even more damage than they did initially.

Broadly speaking, “Deep Fakes” are videos in which, through a preprogrammed application or more sophisticated media-manipulation software, a user superimposes someone else’s face onto footage that is not originally theirs. These tools can even emulate a person’s voice and expressions, producing fake videos in which a public figure appears to say or do something that never actually happened.

Alarming as this is, the ease with which these videos can be produced and the rapid advances in the technology raise two questions: To what extent can these videos be used to misinform or harm? And is it possible to tell them apart from real ones?

Here’s The Detail

There are many examples of how this technology can be scaled up to create false content that goes viral and causes terrible damage. One of the most emblematic, created for demonstration purposes, came from the Chinese news agency Xinhua, which presented a video of a well-known news anchor from that country whose image had been completely recreated with digital tools.

By recreating his facial expressions, voice and even body gestures, the agency demonstrated the potential these tools have to completely misinform the public.

But how do they do it? It is now disturbingly easy to produce a “Deep Fake” with applications available in even the most visited app stores. With the software in hand, all a user needs is the original footage they want to manipulate in order to produce their own “Deep Fakes”.

The underlying process is more complex, however. A type of Artificial Intelligence called “Deep Learning” (from which the phenomenon takes its name) processes thousands of frames from each video to learn the similarities between the faces to be swapped or emulated, producing the effect known as “face-swapping”. A professional-grade “Deep Fake” therefore requires processing power not found in ordinary computers. Unfortunately, that does not mean it is inaccessible.
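
The article does not name a specific tool, so the following is only a minimal sketch, assuming PyTorch and illustrative layer sizes, of the shared-encoder, two-decoder idea behind classic face-swapping: one encoder learns features common to both faces, each identity gets its own decoder, and the “swap” comes from decoding one person’s frames with the other person’s decoder.

```python
# Illustrative sketch of the shared-encoder / two-decoder architecture behind
# classic "Deep Fake" face-swapping. Sizes and names are assumptions for
# demonstration, not any specific tool's implementation.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Rebuilds a face crop from the latent code; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 64, 16, 16)
        return self.net(h)

# One shared encoder, two identity-specific decoders.
encoder = Encoder()
decoder_a = Decoder()   # trained to reconstruct person A's faces
decoder_b = Decoder()   # trained to reconstruct person B's faces

# Training (not shown) reconstructs A's frames with decoder_a and B's with
# decoder_b. The swap happens at inference time: encode a frame of person A,
# but decode it with person B's decoder, so B's face appears with A's pose.
frame_of_a = torch.rand(1, 3, 64, 64)        # stand-in for a real face crop
swapped = decoder_b(encoder(frame_of_a))     # B's identity, A's expression
print(swapped.shape)                         # torch.Size([1, 3, 64, 64])
```

The point of the sketch is the final lines: once trained, the network is simply asked to reconstruct one person’s frame with the other person’s decoder, which is why so many source frames of both faces are needed.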

Because of how easy they are to use, public figures fear being caught up in fake video scandals. However, The Economist, which carried out an extensive analysis of the possible effects of this technology’s advance, pointed out that the danger is not limited to famous people: the potential to scam, perpetuate cyberbullying and stir up personal conflict means that misuse can affect anyone.

Is there a way to detect them? Beyond the common sense and cybersecurity education we all need to develop, such as learning to identify reliable news sites, there is a small detail that gives us an advantage when spotting these deceptive videos: the subject’s eyes. More specifically, a person imitated in a “Deep Fake” often never blinks, or blinks very infrequently. This is because even the most famous figures are rarely photographed with their eyes closed, which makes it difficult for digital tools to mimic natural blinking.
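
To show how this check can be automated, here is a hedged sketch, using only NumPy and assuming eye landmarks have already been extracted for each frame (for example with a facial-landmark library), of the “eye aspect ratio” blink-rate heuristic. The threshold and blink-rate floor are assumptions chosen for illustration.

```python
# Blink-rate heuristic sketch: compute the "eye aspect ratio" (EAR) from six
# eye landmarks per frame and flag clips whose blink rate is implausibly low.
# Landmark extraction is assumed to happen elsewhere; values are stand-ins.
import numpy as np

EAR_THRESHOLD = 0.21      # below this the eye is treated as closed (assumption)
MIN_BLINKS_PER_MIN = 5    # people typically blink far more often; a lenient floor

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: (6, 2) array of landmarks ordered p1..p6 around one eye."""
    vertical_1 = np.linalg.norm(eye[1] - eye[5])
    vertical_2 = np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

def count_blinks(ear_per_frame, threshold=EAR_THRESHOLD):
    """Count closed -> open transitions in a sequence of per-frame EAR values."""
    blinks, closed = 0, False
    for ear in ear_per_frame:
        if ear < threshold:
            closed = True
        elif closed:          # eye re-opened after being closed: one blink
            blinks += 1
            closed = False
    return blinks

def looks_suspicious(ear_per_frame, fps=30.0):
    """Flag a clip whose blink rate falls below a plausible human minimum."""
    minutes = len(ear_per_frame) / fps / 60.0
    return count_blinks(ear_per_frame) / max(minutes, 1e-6) < MIN_BLINKS_PER_MIN

# Illustrative usage with synthetic EAR values for a 10-second clip at 30 fps.
fake_like = [0.30] * 300                                # eyes never close
real_like = [0.30] * 140 + [0.15] * 5 + [0.30] * 155    # one blink in 10 s
print(looks_suspicious(fake_like))   # True  -> worth a closer look
print(looks_suspicious(real_like))   # False
```

In real footage the per-frame landmarks would come from a face-tracking step; the synthetic values above merely show that a clip with no blinks at all is flagged while a clip with a plausible blink rate is not.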

Unfortunately, these minor flaws are not enough to dismiss Deep Fakes as an ongoing danger. Now more than ever, the work of journalists in combating this type of disinformation is essential, as is formal research into programs that can act as antidotes to these dangerous tools.

What can ordinary users or businesses who fear being targeted by a scam using these videos do? The most important thing is to stay informed and to apply common sense and digital education against the cybercriminals who may use these tools.
