
Want to improve your security? Understand the cognitive bias behind decision-making

People like to believe they’re perfectly rational beings. It can be a shock to learn that we’re not. It’s vital that people understand their natural cognitive biases and learn how to overcome them so that critical security controls function as intended.

How sure are you that you make rational decisions based solely on facts and logic? Are you pretty sure? Confident, even? Enough to be assured that you’ll make the optimal decision during a crisis? How confident are you in your ability to interpret, prioritize, and respond to risk? Feel pretty good about that, too? How would you feel if I told you that you were making preventable errors that endanger both you and your organisation?

While that may sound harsh, it’s not meant to be. Quite the contrary: it’s intended to be empowering. The problem is that all humans must deal with cognitive biases and irrational decision-making habits. This doesn’t have anything to do with where you were born, what language you speak, or what schools you attended. It’s about flaws in the standard-issue human brain. All of us must find ways to contend with these issues, and the good news is that we can. The first step is to realize we’re not as rational as we think we are.

This topic came up in a recent staff meeting. We discussed an article on HelpNetSecurity.com that referenced a “newly released” report by Dr Margaret Cunningham, a psychologist and Principal Research Scientist at Austin, Texas-based security vendor Forcepoint. Zeljka Zorz’s summary quoted the original report, including this comment:

‘In cybersecurity, understanding and overcoming security-related perceptual and decision-making biases is critical, as biases impact resource allocation and threat analysis,’ Dr Cunningham explained.

I haven’t met Dr Cunningham yet, but I hope she might get an original series on Netflix or Amazon where she travels the world explaining how to defeat cognitive biases for security awareness purposes.

That line immediately caught people’s attention in our security department, and for good reason. Forcepoint’s social media team was kind enough to provide access to the original report. After reading the full report, Thinking About Thinking: Exploring Bias in Cybersecurity with Insights from Cognitive Science, I strongly agree with Dr Cunningham’s arguments.

At my current employer, we’ve championed addressing cognitive biases as a core element of our security awareness program since its inception. If we can help our colleagues recognize their own biases, then we can start to help them make better decisions.

Consider the Argument from Authority bias. [1] David McRaney discussed this in his 2012 best-seller You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You're Deluding Yourself. People naturally consider others with credentials (like a medical degree or a PhD) to be highly credible. When someone in a position of acknowledged authority makes a statement, a person subject to that authority (even voluntarily) assumes the speaker must know better and conforms to the speaker’s opinion. [2]

This is both a logical bias and an emotional one; humans feel a strong compulsion to obey people who appear to hold intellectual, organisational, or legal authority over them. The presence of a uniform or a specific costume (like a lab coat) can convey authority to a stranger. A specific title or rank can convey authority to a fellow employee. For example, if two different colleagues give you conflicting status updates about a project, are you more likely to believe the CIO or a junior project manager? Odds are, the executive’s version will win out.

This is normal; this is how we all think. The problem comes when social engineers, insider threats, and cybercriminals use this cognitive bias to trick us into doing something that’s not in our best interests, for example by using actual or assumed authority to compel people to ignore or even deliberately violate mandatory security controls. Leveraging this natural bias can be a very effective means of subverting defences.

To address this bias, we introduced a game [3] for last year’s National Cyber Security Awareness Month called “Catch the Intruder.” We have a company-wide rule that all colleagues must always wear their official photo ID badge so that it’s visible while they’re inside our facilities. We’d made this an emphasis item in New User Security Awareness Training for six months prior to ensure that the message reached as many people as possible. Then, during NCSAM, we asked a few senior leaders to wander around the complexes with their badges hidden. We then challenged the rest of the company to overcome their own cognitive biases and (politely!) remind the bosses to wear their ID badges.

The exercise was a success twice over. Some brave colleagues did the right thing and called out the bosses for their non-compliance. Those were exemplary moments. More importantly, anyone who had spotted the badge-less bosses but hadn’t reacted was prompted to consider their own reluctance and anxiety about challenging an authority figure.

This is a completely understandable anxiety, and one we counted on when designing the game. We realized that if people spotted a violation and recognized it for what it was, then they’d likely recall their training on mandatory challenge. Once they considered the possible risk of challenging an authority figure, some people would freeze – caught between their sense of duty and their natural reluctance to antagonize an authority figure.

This wasn’t a failure; it was a practical learning opportunity. As people discussed the game afterwards, it inspired some to realize that their own instincts – their natural reluctance to aggravate someone more powerful than themselves – were preventing a mandatory and necessary security control from functioning. To help keep everyone safe in the office, each person needed to overcome that reluctance and (always politely!) help hold everyone else accountable to standards.

We strongly believe in explaining cognitive biases during awareness and training encounters, and then taking active steps to test these biases in the real world. The more that we can expose our colleagues to their own self-defeating mental processes, the more opportunities we’ll have to arm our colleagues with the coping techniques needed to pre-emptively mitigate those processes.

[1] Also known as an “appeal to authority.”

[2] Consider especially the infamous “Obedience to Authority Figures” social psychology experiments conducted by Stanley Milgram in the 1960s.

[3] In her report, Dr Cunningham refers to this sort of training as “Applied Insight.” See page 11.


Keil Hubert

Keil Hubert is the head of Security Training and Awareness for OCC, the world’s largest equity derivatives clearing organization, headquartered in Chicago, Illinois. Prior to joining OCC, Keil was a U.S. Army medical IT officer, a U.S.A.F. Cyberspace Operations officer, a small businessman, an author, and several different varieties of commercial-sector IT consultant. Keil deconstructed a cybersecurity breach in his presentation at TEISS 2014, and has served as Business Reporter’s resident U.S. ‘blogger since 2012. His books on applied leadership, business culture, and talent management are available on Amazon.com. Keil is based out of Dallas, Texas.
