Security training: why one approach is not going to work

Security training isn’t a one-to-one analogue for technical skills instruction, especially when you’re training non-technical people. Security trainers need to adjust their course designs and teaching methods to account for the probability that some students don’t share a common reference model for how the equipment they use actually works.

I worked with a professional IT trainer (who I will call “Amy”) a few years back who proposed that “security training” was essentially indistinguishable from “application training.” Amy tried to convince me that teaching users how to protect themselves and their equipment was – at its heart – no different from teaching a user how to build tables in Microsoft Word or how to modify an open job order in a company’s bespoke job management system.

In her world, “training” involved a series of actions that must be performed in a specific sequence to bring about a desired result. A “trainer” (she opined) should spend 80 percent of her time mapping and describing the required process steps, and 20 percent watching students follow a script to see where they got confused following written instructions.

I understood where Amy was coming from. I’d reviewed several years’ worth of her projects, and most of them had been “how-to” guides intended for tech support staff. Amy was quite talented at distilling sequences of actions into clear and logical scripts. Lots of “IF THIS, THEN THIS” instructions complete with relevant illustrations and screen-grabs. Her work was solid. It was also completely ill-suited for most security-related training.

To be clear, Amy’s approach was fine for some activities. Since she wrote her products for tech support professionals, her “how-to” guides worked well as technical instructions for people who understood the underlying principles and only needed help with a specific implementation problem. For example, opening a port on a firewall or adjusting a user’s access permissions. Things that counted as “security-related” within the larger category of “tech support tasks.”

Her approach didn’t work for non-technical people. Amy’s entire body of work was tuned for an experienced, technically astute, professional audience. She didn’t define terms or explain underlying system behaviour; she assumed the reader already knew all that and simply needed to perform a discrete function whose purpose, outcome, and necessity had already been decided. That’s fine for IT techs working in an IT support department, but it’s awful for people who need foundation-level security skills training.

I don’t mean new hires fresh out of university. I mean EVERYONE. Just because a user has an advanced degree or an impressive title doesn’t mean they’re exempt from learning core cyberdefence skills.

One afternoon, I took advantage of the long walk from the company café back to our building to argue that Amy was essentially writing aircraft emergency procedures for pilots. [1] Everything that she wrote assumed the reader had mastered certain fundamentals (like flight mechanics) and only needed specific help on-demand to perform a single corrective action at a time based on certain triggering inputs (like an unintended roll or autopilot failure).

By comparison, I offered her the (entirely true) story of one of my previous executives. This fellow – let’s call him “Bob” – was well-educated and technically proficient in his specialty. Unfortunately, he only understood enough about computers to be dangerous to himself and others.

Somehow, early on in his life as a PC user, he’d got it into his head that his PC’s display was in fact “the computer.” He believed fervently that his computer’s RAM, processor, and motherboard were all located inside the monitor housing. He referred to the big plastic box attached to his monitor as “the hard drive.”

Now, Bob wasn’t stupid. He understood that if his “hard drive” was taken away, it would mean he would lose access to all his local files. That’s where they were saved, obviously. If he wanted to keep a file “safe,” though, he’d move the file from wherever he had it in his C:\ drive’s directory structure to his desktop … which was physically located inside the display … because that’s where his desktop always showed up when he turned on his PC. Clever fellow, Our Bob … If a bad guy nicked his “hard drive,” they’d never get the files that he moved to his “computer.”

For the record, I am not saying that Diet Coke explosively exited Amy’s nose when I told her this story. I’m not saying that didn’t happen either. A gentleman never tells ...


The problem, I continued, is that security training is often more complicated than straightforward IT training, for two reasons. First, students must understand what they’re guarding against when learning how to take a preventative or reactive action.

Most people have a difficult time internalizing new tasks when they don’t understand why they’re having to learn them. If there’s a disconnect between the student’s understanding of the operating environment (e.g., “My PC’s desktop physically resides inside my display”) and how it actually works, then our instructions are likely to seem silly. The student likely won’t follow our instructions even if we can get them to master the steps.

Second, students need to understand how to recognize when a security task is needed by noticing clues in the operating environment. If a student doesn’t understand how their equipment works, they won’t easily recognize when their gear is misbehaving. Or, worse, they’ll mistake a vulnerability that needs immediate correction for something benign. Either way, they won’t act.

That’s why (I argued) security training tends to work best when you introduce new topics with an even mix of foundational education (i.e., what’s really happening) and accessible analogies (i.e., what this means). You must help students get their heads wrapped around hard concepts at the same time you’re trying to teach them process steps.

This is no mean feat, especially since every student comes to class with a worldview that’s askew of every other student’s. Getting an entire class synched, grounded, and focused is the critical first half of the battle. You can’t effectively teach process steps until that first bit is sorted.

To her credit, Amy understood my point immediately. A few months later, she proved to be instrumental in helping me teach some professional development classes to our colleagues. She got on board with the idea of “level setting” the entire group in the first half of the course so the second half could proceed smoothly. It made for a satisfying and productive experience for everyone.

Having an ally in the classroom who was already on-board with my program helped sway the other students into giving me the benefit of the doubt. I recommend always having at least one ally integrated with the students when teaching a subject that might be unsettling or intimidating.

I recommend that everyone taking on “security training” as a function consider adopting our approach. That is, plan your courses such that you communicate the “what” and the “why” aspects first before getting on to the “how” part. Trust me, it helps.

There’s usually at least one “Bob” character in every class who means well and is following along yet misses the plot because of an underlying assumption that undermines the logic of your process steps. Get your own Bob(s) properly oriented first before starting the task instruction. It’ll save everyone a great deal of frustration.

Finally, consider going a bit easy on your own Bobs … Just because they fundamentally misunderstand technology doesn’t mean they’re not largely functional in their assigned role.

Most likely, whatever misconception they picked up “back in the day” probably made perfect sense at the time, and may well have been the result of an honest misunderstanding implanted by some previous technical trainer who was in a hurry … or, worse, a trainer who made a joke in class that assumed every student would recognize it as such based on a shared mental model. Sometimes, we’re our own worst enemies.

[1] I understood how those safety procedure guides worked, since my first full-time job in Dallas had been writing them for an aircrew training company. When it comes to analogies, similes, and metaphors, use what you know.


Keil Hubert

Keil Hubert is the head of Security Training and Awareness for OCC, the world’s largest equity derivatives clearing organization, headquartered in Chicago, Illinois. Prior to joining OCC, Keil was a U.S. Army medical IT officer, a U.S.A.F. Cyberspace Operations officer, a small businessman, an author, and several different variations of commercial sector IT consultant. Keil deconstructed a cybersecurity breach in his presentation at TEISS 2014, and has served as Business Reporter’s resident U.S. ‘blogger since 2012. His books on applied leadership, business culture, and talent management are available on Amazon.com. Keil is based out of Dallas, Texas.
