Why security is in denial about awareness

Denial has two meanings. It can refer to the rejection of an allegation or assertion. It can also refer to a psychological defense mechanism in which uncomfortable criticisms are dismissed despite evidence supporting them. How a professional group responds to criticism tells you a lot about its ability to evolve and improve.

In his blog, Schneier on Security, Bruce Schneier states that security awareness is generally a waste of time. Since a majority still thinks that awareness campaigns mean locking people in a room for an hour and putting up a few posters, Schneier is probably right.

At the heart of this debate is a fundamental question: While many would agree that information security awareness techniques need to improve, are we talking about a few tweaks or a complete overhaul? The problem is this: if security awareness is all about changing behavior, why don't security awareness tools and processes look anything like those of other, more mature industries that take behavioral change seriously?

Compared to other industries, the information security awareness approach to behavioral influence is an embarrassingly amateur affair. In fields such as public health and marketing, there are experts who have spent decades studying behavioral influence, testing their assumptions and making systematic improvements to their methods. The approach in these fields has led to a heavy emphasis on audience research. Why did you buy that particular product and not another? What thought processes were you following when you plugged that in? They go beyond the 'what' of behavior and seek to understand the 'why'. In contrast, information security professionals persist with the delusion that they can manage the what without understanding the why.

Many ways exist to systematically understand the why of an audience. Web designers commonly use personas. Safety risk communicators have mental models. Information security folk models have also been proposed. Ira Winkler was quick in his rebuttal to Schneier to dismiss folk models as 'unworkable' and 'not true'. The reality is that people have rules of thumb that they use to make decisions, such as: Is it growling and showing its teeth? Then I'm not going to pat it. Folk models are just a way of encapsulating these decision-making processes.

Generally, people's rules of thumb are adequate. When they go wrong, the information security tendency is to bombard an audience with facts, which is an extraordinarily inefficient approach. Some facts are more important than others, and we need to identify the specific 'fulcrum facts' on which decisions hinge rather than blindly 'teaching the topic.' Often, problem behaviors can be traced to a single mistaken perception. A good example that leads to a whole range of problematic behaviors is the belief that 'hackers don't target small businesses.' Information security professionals have been guilty of 'naïve realism', assuming that our way of looking at problems is the only correct one. Despite our good intentions, our efforts will be hit and miss if we don't understand our audience's view of the world.

The cost of our mistaken approaches to security awareness should not be underestimated. How much has been spent on the password complexity topic alone? This problem could have been solved by system design but instead we've set ourselves the goal of trying to teach every last user. The crazy world of information security is such that Schneier was criticized for pointing this out.

Safety professionals would be shocked at our endemic complacency where high-risk functions with no business benefits exist on our systems with the potential for catastrophic failure. Why do we allow users and administrators to perform unsafe acts such as selecting passwords like 'Password1'? Next time you get on a plane, consider the effort that's been made to systematically design out risk in areas such as pilot training and cockpit ergonomics. If security professionals designed an aircraft cockpit they would include a 'crash plane' button on the dashboard and then spend years training people not to press it.
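The 'design out risk' argument can be made concrete with a sketch: a system that simply refuses known-weak passwords at the point of creation, so there is no unsafe act left to train users out of. The blocklist, minimum length, and rules below are illustrative assumptions for this sketch, not any particular organization's policy.

```python
# Illustrative sketch: enforce password safety by design instead of by training.
# The specific blocklist and thresholds here are assumptions, not a standard.

WEAK_PASSWORDS = {"password1", "letmein", "qwerty123", "welcome1"}

def is_acceptable(password: str) -> bool:
    """Return True only if the password passes basic design-level checks."""
    lowered = password.lower()
    if lowered in WEAK_PASSWORDS:
        return False  # on the known-weak blocklist
    if len(password) < 12:
        return False  # too short to resist guessing
    if lowered.startswith("password"):
        return False  # trivial variation of 'password'
    return True

print(is_acceptable("Password1"))                     # False
print(is_acceptable("correct horse battery staple"))  # True
```

With a check like this in place, 'Password1' is not a behavior to be discouraged; it is an input the system will never accept, which is the cockpit-design point in miniature.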

Is it a good idea to manage human risks? Yes, absolutely. Influencing user security behavior is a very important part of any organization's defense in depth. However, it's about time we dropped the enthusiastic amateur approach. Sure, information security awareness has had its handicaps, not least a mistaken perception that changing behavior is easy. However, until we acknowledge that a better understanding of user behavior is needed, and that it's not efficient to use awareness to cover up poor security design, it's the users who will suffer.

It's likely that, given the mix of specialist skills involved, there's an increasing role for information security awareness marketing agencies with experts in communications and behavioral influence. This is very different from where we are now, where security awareness is widely seen as an IT job requiring no particular communication skills.

Is it true that security awareness has allowed inefficiencies by compensating for bad design? Yes. Is there room to improve mainstream awareness techniques? Absolutely. Should security awareness be performed with a much better understanding of the audience? Definitely. Will you hear most awareness professionals admit it? Apparently not.


Copyright 2018 IDG Communications. ABN 14 001 592 650. All rights reserved. Reproduction in whole or in part in any form or medium without express written permission of IDG Communications is prohibited.