TRANSCRIPT

It’s Throw Forward Thursday again, and my name is Graeme Codrington. This is our weekly look at the future. Today we’re not throwing forward that far, because we’re talking about deep fakes. You all know what Photoshop is, right? You know that we now have the ability to completely manufacture a photograph: we can take an existing photograph and adjust it, we can put people into a scene who weren’t there to start with, we can change something completely or we can literally create a photorealistic scene that never happened.

Well, deep fakes take that technology and apply it to video. Of course, Hollywood has always been able to do this. Think of a movie like Forrest Gump, where Tom Hanks was inserted into a whole series of major historical moments as if he had actually been there. And we understand the concept of using a green screen and CGI to create, in a sense, a fake movie. We do it for entertainment purposes.

But this has always required millions of dollars’ worth of technology and in-depth investment in Hollywood studios to make it happen. Deep fakes promise the ability for us to do that very simply, at almost no cost. I’ve been impacted by this personally. About a year or so ago, somebody used a photograph of me as a whistle-blower. They had audio from somebody, and they needed to put a video together. It’s not very good, actually, but I suppose it indicates where things can go now.

You can even get an app that will put you into a video very, very simply. Just click and drag, and off you go. As algorithms, artificial intelligence and computer processing power improve dramatically, literally month by month, we’re going to see more and more deep fakes, and they’re going to get better and better. Probably the best example of what a deep fake actually looks like is a wonderful one that took Spider-Man and Iron Man, Tom Holland and Robert Downey Jr., and inserted them into some famous scenes from Back to the Future.

You can do a Google search for Back to the Future with Tom Holland and Robert Downey Jr. and see the remarkable use of deep fake technology to fake not just somebody’s face but also their voice, and to insert them into movies they were never in. Now, for entertainment purposes, this is all fun and good. Here’s where Throw Forward Thursday comes in. Sometime in the near future, a day or two before an election, a day or two before the most important moment of a politician’s life, their election to parliament, a deep fake video will be released.

Now, I say deep fake video, but we’re not going to know that it’s a deep fake. It’s not going to look as if it has been faked, but it’s going to be this politician saying something horrendous, something totally inflammatory, something that causes the news to go crazy: how could this person say that? They are going to come out and say, “But it’s a fake, I never said that.” But the damage will already have been done. We can throw forward to a time when the same thing happens to a famous CEO: a day or two before an earnings announcement, a quarterly result or an election to the board of directors, something will be released that causes the share price to collapse, or perhaps the other way, causes the valuation of the company to be boosted extraordinarily. But it is a fake, and that person will come out and say, “I didn’t say that.” These are deep fakes, and they are going to have real-world implications in the very near future.

So how do we respond? I think there are three ways. The first is that we have to be as sceptical about video as we are, sadly, about photographs. I think we’ve learned over the last few years to say, “That could be a fake photograph, that could be Photoshopped. How do we verify that that photograph is real?”

We’re going to have to bring that same scepticism to video. At the moment, if we see something captured on video, we say, well, that definitely happened. We’re going to have to be a little more sceptical about that in the future. The second thing you can do, and this is a little tip from me to you, is for organisations: if you’re in a public organisation where there is the potential for deep fake videos to be produced of you or somebody senior in your business or your political party, I would actually record that person right now. Do it today. Record them making statements they would never typically make. They can be a little bit of fun. So, if they are well known for supporting a certain football team, for example, I’m very well known for supporting Manchester United, we would record a video of me being absolutely effusive in praise of Liverpool Football Club. Anybody who knows anything about football knows there’s no way.

A Manchester United supporter would never say anything good about Liverpool under any circumstances. So, record me saying that, and keep the video in a filing cabinet somewhere. Then, if a deep fake is released of me, release that video as well. In other words, create even more confusion in the system. Does this sound like craziness to you? I don’t think it is. I have worked with the Institute of Risk Management in South Africa and with people who think about this, and they think this is a pretty good idea, because although it will create confusion in the system, it will allow you to control the narrative, to say, “You see, there are fake videos around,” and so dilute the impact.

I’m actually not joking about this. It is a very serious suggestion: if you understand what deep fakes could do to you, this is a pre-emptive way to protect yourself against the damage they can cause. The third thing we can all do as a society is get somebody to invent authentication software. Mark Shuttleworth is maybe not well known outside South Africa, but as South Africans we know his name very well: our first tech billionaire, and the first South African in space, because he bought his way onto a spacecraft after he had made his money.

How did he make his money? That little padlock that appears on every secure transaction on the internet, the technology I think is called SSL. He built the algorithm that allows us to create secure transactions. In other words, to verify a purchaser: to verify that the person paying and the person buying are who they say they are.

I don’t know exactly how it works. A few people do, but you probably don’t. But it works, and it made the internet a safer place. We need an algorithm like that for photographs and for videos: some verification system that says this photograph is not Photoshopped; this video is real, it actually happened at this location, at this time, recorded by this person, verified by that agency. We know we can’t rely on somebody just saying, “This is real.”
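To make the idea of such a verification system a little more concrete: the same cryptographic building blocks behind that SSL padlock can, in principle, be applied to media files. The sketch below is purely illustrative, not any real media-authentication product, and the agency key is a made-up placeholder; real systems would use asymmetric keys and certificates rather than a shared secret. It shows the core principle: an agency publishes a signature over a file’s exact bytes, and any later edit to the file, however small, breaks the check.

```python
import hashlib
import hmac

# Hypothetical signing key held by a verifying agency (illustration only;
# a real system would use public-key signatures and certificates, as TLS does).
AGENCY_KEY = b"agency-secret-key"

def fingerprint(media_bytes: bytes) -> str:
    """Content fingerprint: any change to the file changes this hash."""
    return hashlib.sha256(media_bytes).hexdigest()

def sign(media_bytes: bytes) -> str:
    """The agency attests to a file by publishing an HMAC over its bytes."""
    return hmac.new(AGENCY_KEY, media_bytes, hashlib.sha256).hexdigest()

def verify(media_bytes: bytes, signature: str) -> bool:
    """Check that the file is byte-for-byte the one the agency attested to."""
    return hmac.compare_digest(sign(media_bytes), signature)

original = b"...video bytes as captured..."
tag = sign(original)

print(verify(original, tag))                       # the unmodified file checks out
print(verify(b"...doctored video bytes...", tag))  # any edit breaks the check
```

The hard part, of course, is not the mathematics but the trust infrastructure around it: who holds the keys, who attests to the time, place and recorder, and how viewers check signatures without effort, which is exactly the problem the padlock solved for online payments.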

There needs to be verification. In the past, we relied on news agencies to do that for us: the AP, Reuters and so on. I think some of those news agencies we can still rely on. But some of them, like Fox News and 018 and others, have become propaganda networks. Russia Today is another example. They have become so partisan, and this is verifiable fact, so, you know, they can’t sue me for saying it, that there are verified instances of them airing fake information, whether they created that information themselves or were simply happy to be caught up in the propaganda of the moment.

Well, that’s for them to answer. But sadly, we can’t trust all of the news systems. And this is where we have to throw forward to sometime in the future when, I hope and believe, we will have moved beyond where we are at the moment, having to be sceptical and not being sure what is visually real in our world, into a world where we do have that verification and can once again have confidence that what we’re seeing in front of us actually happened.

For entertainment purposes, deep fakes are remarkable and brilliant. Have fun with the apps that let you put your face into various things; for entertainment, great. I’m not saying we should never have deep fakes, they’re lots of fun. But it’s not fun if it happens to you maliciously, and it’s not fun if somebody claims that a fake is real, factual news. Throw forward deep fakes: another moment from the future with our team at TomorrowToday Global. Stay tuned.

If you are watching on YouTube, make sure you subscribe to our YouTube channel and click the bell to get notifications. If you’re listening to the podcast, the audio-only version, you might want to go and check out YouTube, because I’ve put a whole lot of graphics behind me that might be interesting for you to see. Otherwise, make sure you subscribe to the podcast and share it with your friends. Of course, let’s go into the future every Thursday together and find out more about what the future holds for us.

It’s a bit scary sometimes, often exciting, always interesting. I’ll see you next Thursday.

Graeme Codrington is an internationally recognized futurist, specializing in the future of work. He helps organizations understand the forces that will shape our lives over the next ten years, and how we can respond in order to confidently stay ahead of change. Chat to us about booking Graeme to help you Re-Imagine and upgrade your thinking to identify the emerging opportunities in your industry.

For the past two decades, Graeme has worked with some of the world’s most recognized brands, travelling to over 80 countries in total and speaking to around 100,000 people every year. He is the author of 5 best-selling books, and is on faculty at 5 top global business schools.

TomorrowToday Global