
Deepfake AI Video Technologies


For every Black Mirror fan out there, the latest season hit home more than any other before. But not because the creators of the popular futuristic series outdid themselves in gruesomeness. Rather, because it shows a very real possibility.

 

In case you are up to date, spoiler alert! And for those who don’t follow the series, let us explain. The latest Black Mirror episode, “Rachel, Jack and Ashley Too”, shows how artificial intelligence is used to create a digital version of a fictional super-star, Ashley. More precisely, the AI gathers every video and audio recording of the pop star and builds an identical virtual personality. The idea behind the process is to continue the legacy of the beloved artist, who is now in a coma.

 

Fiction or reality?

It’s a little scary and a little dramatic at the same time, isn’t it? Nevertheless, this is very likely the reality we are heading towards. Reports show a rise in such AI-generated fake videos of real people. What makes AI-produced fake videos so problematic is that they look so real they become indistinguishable from reality. You see, manually edited videos are just adjustments to lighting, speed and so on. AI can create a whole new scenario from completely unrelated footage.

 

You see, this technology makes the widely discussed topic of fake news seem almost insignificant: while fake news can easily be debunked, videos are much harder to disprove. In fact, deepfake videos, as they are called, show an incredible potential for impacting both our private and public lives.

 

What is a deepfake video?

A deepfake video is a video edited entirely by artificial intelligence, which substitutes a real person with someone else, putting that person into fake situations they were never part of. To explain it crudely, it is like “fast photoshopping” a video, with AI doing the heavy lifting. This technology cross-references and analyses photos and videos of individuals and basically ‘face-swaps’ them, much like in the Black Mirror example.

 

Ok, it’s actually a little more complicated than that, as the technology also takes into account body language, eye movements, etc. Either way, it creates a very realistic edited video of any person in any given situation. Moreover, deepfake videos are becoming so realistic that it’s almost impossible to differentiate them from real footage.
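For the curious, the classic deepfake approach behind this face-swapping trains one shared encoder (which learns a generic “face code”) plus a separate decoder per identity. Swapping then means: encode person A’s face, decode it with person B’s decoder. The following minimal, untrained numpy sketch illustrates only that wiring — all names, sizes and weights are our own illustrative assumptions, not any real tool’s API, and a real system would train these weights on thousands of photos:

```python
import numpy as np

rng = np.random.default_rng(0)

FACE_DIM = 64 * 64   # a flattened 64x64 grayscale face crop (assumed size)
LATENT_DIM = 128     # the shared latent "face code" (assumed size)

# One shared encoder, plus one decoder per identity (random, untrained).
W_enc = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01
W_dec_a = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.01
W_dec_b = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.01

def encode(face):
    """Map a flattened face crop to the shared latent code."""
    return np.tanh(face @ W_enc)

def decode(latent, w_dec):
    """Reconstruct a face from the latent code with a given decoder."""
    return np.tanh(latent @ w_dec)

def face_swap(face_of_a):
    """The core deepfake trick: encode A's face, decode with B's decoder."""
    return decode(encode(face_of_a), W_dec_b)

frame_of_a = rng.standard_normal(FACE_DIM)
swapped = face_swap(frame_of_a)
print(swapped.shape)  # (4096,) -- same size as the input face crop
```

Because the encoder is shared between both identities, it is forced to capture pose, expression and lighting rather than identity — which is exactly why decoding with the other person’s decoder produces a convincing swap once trained.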

 

Is deepfake bad?

We have two famous examples of how this technology was used in bad ways. Facebook’s CEO Mark Zuckerberg is the first, and popular Hollywood actress Scarlett Johansson is the second. In Mark Zuckerberg’s case, the video shows him threatening world domination using data. This, apparently, was an artistic experiment, and it was revealed as fake right away. Scarlett Johansson, on the other hand, was not so … lucky. Her deepfake video was actually an adult video.

 

 

As terrifying as these examples are, the problem has not reached epidemic proportions yet. At least no one has been physically hurt because of a deepfake video so far. But imagine what happens when politicians and world leaders start getting faked on world television.

 

 

Fighting back

The good news is that we already see progress being made in detecting deepfake videos. For one, powerhouses such as the U.S. Defense Department are working hard on it. Also, from past experience, we are already skeptical of the news shared over the internet. We are aware of doctored images of UFOs or mythical monsters, and of videos taken out of context. Deepfakes are similar, but it’s important that we keep building public awareness on this topic, so people think twice before taking such a video as fact.

 

Deepfake as a positive thing

We have long explained that technologies are only bad when people use them with malevolent intent. As long as we use the technology for the right reasons, a lot of good can come from it.

 

Deepfake in awareness

One example is a new malaria awareness campaign that portrays David Beckham speaking in several different languages. Of course, the footballer has many talents, but speaking all these languages isn’t one of them. This extra ability is added with AI. Unlike traditional dubbing, “deepfake” dubbing syncs the speaker’s words with their lip movements, making it seem real.

Deepfake in education

Deepfake brings historical photos and art to life. This helps people, especially children, feel more engaged with historical figures. While it may seem slightly creepy that we can animate people from the past, or people who never existed, we must not overlook the educational benefit of bringing ‘history alive’.

 

Deepfake in mental health

It might seem counter-intuitive to indulge the delusions of people who suffer from mental illnesses. But some new approaches to treating illnesses like dementia focus on alleviating anxiety by playing into the patient’s reality, as is the case in the new dementia villages. Providing digital versions of former friends and loved ones helps these patients feel calmer and less alone. In other words, they get to experience a world that is normal for them.

 

Conclusion 

Deepfake is not bad if used properly. However, it is critical to focus on preventing abuses of the technology, as well as promoting its positive uses. The first step is to raise awareness of the negatives and make it easier for people to recognize a fake when they see one. The second is to talk about the positive projects this technology is used in, and dispel the fears of those who don’t understand it yet. Hopefully, soon enough this amazing technology will no longer be called “deepfake” but maybe “VideoShop” instead.

 

In the end, Deepfake is just one of many new and emerging technologies today that we here at Wiredelta aim to keep you updated with. So follow us on social media or visit us daily for more exciting articles!
