Protecting organisations from deep fakes

How are criminals using deep fake technology and how can organisations protect themselves from deep fakes?

"In the past people received an email… but what happens when you receive a phone call and it's the voice of your boss telling you to do something?"

Professor Marco Gercke, the founder of the CyberCrime Research Institute, talks to Jeremy Swinfen Green about how cyber criminals are using deep fake audio to commit crimes

Marco Gercke will be speaking at the inaugural teissBenelux2020 cyber security summit, taking place online from 27 to 29 October 2020. For free registration and more information, click here.

Video transcript:

It's a frightening world out there. Another thing that frightens me is the idea of deepfakes. And that's a really worrying problem for people who are interested in the truth. How are criminals using deepfake technology, and how can organisations, and individuals as well, protect themselves and their reputations against deepfakes?

Well let me jump in very briefly, this frightening part. I'm not frightened about this world out there. I mean there are risks, we definitely have to be aware of this, we have to take it into consideration when we're acting, but I believe we've seen a lot of progress over the years. So I see the opportunity when I think about the internet and computer technology and not necessarily only the risk. It's a component, but we shouldn't blow it out of proportion.

And to answer your question with regard to the deepfakes: well, you mentioned phishing before. In the past you would receive an email saying, hey, I'm your boss, you remember me. Please make a transfer of $100,000 to a certain account, and you respond to this.

Now people have learned they should not trust those emails. But what happens if you receive a phone call, you hear the voice of your boss, and he's telling you, please do it? That's something where people might say, OK, well, now I heard him, it must be real. And we see those deepfakes not only with regard to political activities, where they're putting words in people's mouths that they never said, but we also see criminals exploiting it, for example, with regard to social engineering.

That's interesting. So people are using deepfakes purely with audio and not just video?

Exactly. So there are cases that we've seen already. One case involved, I think, a German company based in the UK, where somebody received a call from one of the executives saying, hey, we need to make a transfer. And he actually-- they even had a meaningful conversation.

And afterwards he said, you know, I spoke to him, I know this guy, and that was him. And they said, no, unfortunately it was not. So we see this with pure audio. We also see, especially when it comes to the political context, that they're using videos, which is a bit more tricky: it's more difficult to manipulate them, and video is sometimes easier to trace back than audio.

OK, thank you.

Copyright Lyonsdown Limited 2021
