At teissLondon2020, information security expert Bridget Kenyon stressed the importance of working with human nature in security training.
People are at the heart of any business, so understanding employees' ways of thinking is crucial for a successful workplace.
‘Human nature is not a problem that can be fixed by rules and regulations. All solutions to the existing problems must be based on how people behave, not on how we think they should behave’.
In particular, when it comes to cyber security, rather than trying to fight against natural human instincts, companies ought to understand the people they’re training in order to overcome the obstacles they face.
Behavioural economics and cyber awareness
Bridget Kenyon, DIS EMEA CISO and Information Security Programmes at Thales, is interested in the way ‘behavioural economics’ interacts with cyber security. The field rests on the assumption that people are not fully rational, a concept known as ‘bounded rationality’.
Kenyon explains that we cannot know everything, nor evaluate every possibility perfectly, but we have evolved to make decisions regardless, based on what is ‘good enough’.
Behavioural economics suggests we have two modes of thinking:
1. The short-cut technique – the quick and easy way to make decisions, without needing much data, accepting a solution which is not perfect
2. The rational approach – the detailed way to think, which requires lots of data, is time-consuming and results in deep analysis of situations
For example, someone in a restaurant deciding what to order might analyse the menu in an immense amount of detail, based on calories, the carbon impact of the products and so on. Arguably though, this is impractical. It takes too long and is not sustainable for our everyday thinking.
But how does this play out in cyberspace? At work, you may receive an email from your manager with a subject line about a meeting, and an attachment. You might ask: is this a pattern I recognise and trust? Often, pattern matching is a substitute for reasoning.
People will take the path of least resistance, which is at the root of behavioural economics. So security awareness and training should be built around this knowledge.
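Kenyon's email example can be sketched as a cheap heuristic check. Everything below (the sender list, the keywords, the function name) is an illustrative assumption, not an actual security control; it simply shows how a quick "mode 1" pattern match accepts a message without any real reasoning:

```python
# Hypothetical sketch of "mode 1" thinking applied to email triage.
# The trusted-sender list and keyword rules are illustrative assumptions only.

KNOWN_SENDERS = {"manager@example.com", "hr@example.com"}

def looks_familiar(sender: str, subject: str) -> bool:
    """Cheap heuristic checks standing in for quick, low-effort judgement."""
    trusted_sender = sender in KNOWN_SENDERS
    routine_subject = any(word in subject.lower()
                          for word in ("meeting", "minutes", "agenda"))
    # A familiar sender plus a routine-looking subject "pattern matches",
    # even though neither check proves the message is genuine.
    return trusted_sender and routine_subject

print(looks_familiar("manager@example.com", "Meeting notes"))  # True: pattern matches
print(looks_familiar("unknown@evil.test", "Meeting notes"))    # False: fails the cheap check
```

The point of the sketch is that the heuristic succeeds on anything that merely *looks* routine, which is exactly why attackers spoof familiar senders and subjects.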
According to Kenyon, 95% of our decisions are irrational. In other words, we use thinking mode 1 the majority of the time. But this is not necessarily a bad thing: it lets us make more decisions easily and conserve our energy.
She explains that we have a series of cognitive biases which shape our decisions daily. These include:
– The halo effect – we are more likely to agree with people we find attractive
– Confirmation bias – we seek evidence to support our decisions in order to reassure ourselves
– Framing – we deliver information in ways which alter how the recipient will react to it
– The curse of knowledge – the assumption that what you know is obvious to others and you don’t need to explain it
How can biases be used to enhance cyber security training?
The curse of knowledge is a particularly important cognitive bias when considering how to improve security awareness and training. Kenyon advises that you assume your material is very complicated. How can you make sure your training method is effective? Test it on non-specialists.
Ultimately, an understanding of human nature should be transferred to cyber security, in order to accept and adapt to human behaviour rather than resisting it. Bridget closed her talk with this memorable and crucial thought: “Technology should wrap around people. People shouldn’t wrap around technology”.
We need to dig deep into human behaviour and start to put people at the centre of cyber security.
Interested in this topic? Hear Bridget discuss it further on the teissPodcast.