Jan 24, 2018

Today was a glorious summer day in Hawke's Bay - about 30C under clear blue skies, hot sun and plenty of greenery thanks to the odd thunderstorm lately. Not exactly the ideal weather for slaving away in the office.

As I was hootling down the track on the 4x4 farmbike on my way to turn off our water pump this afternoon, I turned to look across our paddock ... and saw Maka ("maarka"), our tame/pet red deer hind, sniffing at a little brown wobbly thing that staggered drunkenly as it struggled to stand on the slope.

The fawn was only an hour or so old. We didn't even know Maka was pregnant, let alone due today, so it was a very pleasant surprise.

Jan 22, 2018

Social engineers exploit their "knowledge" of psychology to manipulate and exploit their victims. So how about we turn the tables - use our knowledge of psychology to counter the social engineers?

That thought popped unexpectedly into my head over the weekend as I was grubbing weeds in the paddock. I've been mulling it over ever since, making hardly any progress to be honest.

One thing that occurs to me is that social engineers are potentially just as vulnerable to manipulation as their victims, although they have the advantage of having consciously and deliberately performed their attacks ... which could in fact be a weak point: if they believe they are in the driving seat, they may not anticipate being driven.

There is some evidence of this: 419ers (advance fee fraudsters), for example, have occasionally been led along the garden path by savvy targets. Scam-baiting became A Thing about a decade ago, relatively amateurish though and risky to boot: the authorities quite rightly warn against vigilantism in general, but there were some creative schemes and hilarious trophies.

A better planned, coordinated and generally more professional approach, applying proper psychology and science rather than just bitterness, retribution and belittling, has some merit as a strategy, particularly if the aim is to fire up workers' imaginations and so make them more aware of, and resistant to, the scammers. Whereas an individual organization or even a group may stand little chance of stamping out the 419ers and other social engineers, they can perhaps tilt the odds in their favor, becoming slightly harder, less attractive targets.

I'm still not sure where I'm going with this. It's one of those little germs of an idea that might sprout and flourish, but more likely will disappear without trace. Perhaps my writing about it here has set YOU thinking about it, and together we can take it forward as a discussion thread. It will at least remind me when I'm checking through the blog posts at some future point, having totally forgotten about it!

Jan 19, 2018

A day or so ago I wrote about organizations being pressured into security awareness for compliance reasons. With some exceptions, compliance is externally imposed and doesn't directly benefit the organization through increased profits - rather it avoids or reduces the losses and costs (including penalties) associated with noncompliance. That is still a financial benefit but with negative, oppressive connotations.

Today I'm moving on to more positive, profitable matters, the business benefits arising from security awareness and training, of which there are several:

You may have spotted an underlying theme, in that most of the benefits of security awareness and training stem from better information risk management. In a sense, awareness is 'just another security tool', but one with a multitude of applications, more Swiss multitool than hammer.

I am fleshing out all those bullet points into a template "Business case for an information risk and security awareness and training program" to be included in February's NoticeBored module. Email me to go on the list for a copy of the finished article. We'd love to help you persuade your management to take out a NoticeBored subscription and take your awareness program to new heights.

Jan 17, 2018

Security awareness may be something you have to do for compliance reasons (mostly to avoid penalties) or something you want to do to gain the benefits, often both.

Today I'll concentrate on the compliance aspects, the most straightforward part, leaving the business case for another day's blogging.

Compliance pressures come at us from all sides!

Laws and regulations: many information-related laws and regs mandate adequate information security, particularly those concerning privacy and governance, plus those applicable to the healthcare, financial services, government, infrastructure/utility and defense industries. Some of them specify awareness and training explicitly, others are more circumspect, typically referring to ensuring compliance without saying precisely how to achieve that.

Contracts and agreements: PCI-DSS is the classic example of a contractual obligation to secure information, specifically cardholder information relating to credit and debit cards. Security awareness is a mandatory requirement of PCI-DSS. Another example is the typical employment or service contract, containing clauses about securing personal and proprietary information and protecting the organization's interests. Yet another is cyber insurance: the policy small-print may include requirements along the lines of 'generally accepted standards and practises of information security', or mention particular laws and standards, or may specify particular controls (such as incident management and breach notification). Many a lawyer's fee results from the nuances in this area! Claiming that an incident occurred because workers were unaware of their security obligations would be a strong case for the prosecution, not the defense.

Corporate strategies, policies and standards: many organizations have formal company rules relating to information risk and security, website privacy policies for instance. If employees don't know and care about them, what is the point in even having them? Despite being an obvious requirement (obvious to us anyway, and now you too!), awareness and training is not universal although the requirement is no different to, say, health and safety, overtime or expenses. Aside from the practical, there are legal aspects to this with obligations on employers to ensure that employees are properly informed about the policies and rules that apply to them, if they are to be enforced (e.g. used to discipline employees for noncompliance).

Published national and international standards: numerous information risk, security, privacy and related standards are available. Compliance is generally discretionary (management decides whether or not to adopt the standards, and how) but some (such as ISO27k and NIST SP800-53) are imposed on some organizations for commercial, legal and regulatory reasons. Technical security standards covering encryption, authentication etc. in products are mandatory in the sense that interoperability and compliance are essential for customers. Standards covering the confidentiality, integrity and availability aspects of financial information, reporting etc. are often imposed through accounting and auditing professional practises, qualifications/certifications or in laws and regulations.

Good practises: the compliance pressure here is less obvious but just as strong. Organizations that do not conform to generally accepted good practises in information security and privacy risk being hauled over the coals by aggrieved parties if there are incidents, given a rough time by the auditors, authorities and owners, shunned by customers, suppliers, partners and potential employees, and criticized by the media. The reputational damage can be severe with long-lasting impacts on business prospects, especially in those industries that are critically dependent on trust and integrity. Furthermore, organizations that go beyond the bare minimum are making an ethical stand that will resonate with others, enhancing their brands.

. . . . o o o o O O O O o o o o . . . .

Are you under pressure to comply? Take a look at the NoticeBored website and get in touch. We'll take the pressure off, setting you up with a world-class security awareness program, quickly and efficiently. Our consultancy, training, support and content services aren't mandatory ... but perhaps they should be. Security awareness is what we do!

Jan 16, 2018

I've been talking about simplifying our awareness content, making the materials more actionable, more direct in style - and here's an example.

Dipping into our stash of awareness content I discovered an awareness briefing on "Data backups" written six and a half years ago. It's not a massive tome, just a single A4 side of information, and the content hasn't aged significantly (although "PDA" is not an acronym we hear much these days!). But the written style needs some adjustment.

The original started out with a summary:

"IT Department makes regular backups of data on the network drives so computer users must either store all their information on the corporate network, or make alternative backup arrangements. Make sure you have good backups before it is too late."

The first sentence is passive, referring to "computer users" in the third person, rather than speaking directly to the reader. I have railed before about the term "end user" being used by IT professionals as a disparaging term with vague connotations of drug addiction - not exactly a flattering way to refer to our work colleagues! The second sentence is much more direct: it's a keeper.

Moving on, the next section headed "Why backups are so important" set the scene by outlining typical situations where computer data might be lost or corrupted, such that the only feasible response is to restore from backups - not a bad little list of incidents (malware, bugs, hackers and physical loss/damage), one we can re-use easily enough. It's a set of bullet points, quite succinct.

The next section gave advice: this took two substantial paragraphs making a big block of text. I've rewritten that to another set of succinct bullet points, more direct and action-oriented.
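Action-oriented backup advice of that kind lends itself to being demonstrated as well as described. Here's a minimal, hypothetical sketch in Python of the 'make alternative backup arrangements and verify them' point - the function name and demo file are my own invention, not taken from the briefing:

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def back_up(source: Path, backup_dir: Path) -> Path:
    """Copy one file into backup_dir, then verify the copy byte-for-byte."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    dest = backup_dir / source.name
    shutil.copy2(source, dest)
    # Compare SHA-256 digests so a truncated or corrupted copy is caught
    # now, not on the day the backup is desperately needed
    src_digest = hashlib.sha256(source.read_bytes()).hexdigest()
    dst_digest = hashlib.sha256(dest.read_bytes()).hexdigest()
    if src_digest != dst_digest:
        raise IOError(f"Backup verification failed for {source}")
    return dest

# Quick demonstration using a throwaway directory
with tempfile.TemporaryDirectory() as tmp:
    work = Path(tmp) / "report.txt"
    work.write_text("quarterly figures")
    copy = back_up(work, Path(tmp) / "backup")
    print(copy.read_text())  # → quarterly figures
```

A real backup arrangement would of course add scheduling, versioning and off-site storage, but the verify-after-copy habit is the awareness message in miniature.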

Finally, we end the piece with our usual bit about where to go for more information or help - it's just boilerplate, replaced now with the current wording.

Comparing the two briefings side by side there is an obvious difference: by reducing the block of words, we've found room to insert a graphic, with the text wrapped neatly around it.

You can't easily tell from these thumbnails but the font is different too, giving it a more 'modern' look, and we've moved the NoticeBored logo to the header: we're hoping customers will swap it for their own security awareness logo anyway.

So that's most of the update process for this document, mostly changing the style, look and feel of it. I'm checking and revising the wording as well to make sure it reflects current thinking, on cloud backups for instance and ransomware. All in all, I feel an hour's work has clearly improved the original, drawing on my 6½ years' additional experience at writing.

So what do you think? Worth the effort or a waste of time? What else could I have done?

Jan 15, 2018

The graphic is about securing data in the cloud, taking us into the realm of cloud computing and Internet security.

At the end of my previous blog item, I mentioned that I'd be looking for situations where tightening security by adding additional controls is not necessarily the best approach, and sure enough here's one.

Putting corporate and personal data into the cloud involves a significant increase in some information risks, compared to keeping everything in-house. Strong encryption of both data comms and storage is a substantial and obvious control - necessary but not sufficient to mitigate the cloud risks entirely. Many other information security controls can be applied to reduce the risks further, but the costs mount with every control added. Extremely risk-averse organizations may take the position that cloud computing is simply too risky, even with strong controls in place, so they partially or wholly avoid it ... which also means forgoing the benefits, including significant business and information security benefits (such as the highly resilient and flexible cloud infrastructure, supporting business continuity plus proactive capacity and performance management).

OK, so that's a situation we might explore for the "Protecting information" awareness module, but it's quite complex as described. We need to find a simpler, more straightforward way to express it - my task for today.

Jan 12, 2018

February's working title "Protecting information" is so vague as to be almost meaningless, yet it is written in an active sense, hinting at the process or practice of protecting information - the things we actually do, or should consider doing at least. We might instead have gone for "Information protection", placing more emphasis on the principles than the practices but, in keeping with yesterday's piece about engaging our reader on an individual basis, the new materials will be relatively simple and pragmatic: I'm thinking checklists and action plans, stuff that the reader can pick up and use directly.

More "Microwave ready meal" than "Michelin chef's secret recipe".

Leafing through our stash of awareness content, we have previously delved into information classification schemes (what they are for, how they are designed and how they typically work): this time around we might skim or ignore the theory to focus on using classification in practice, as a workplace tool - how to do it, basically.

Hmmm, I wonder if I can write a Haynes Manual-style step-by-step classification guide, with pictures?

We've also explored knowledge management and intellectual property rights before - again fairly academic or theoretical concerns. It will take a bit more head-scratching to think of practical applications that people can relate to. Straight-talking advice on 'What to look for in a license' maybe? Maybe not.

Another area we have covered repeatedly is information risk management, a structured approach that underpins the entire domain, including the ISO27k standards. The management aspects remain relevant for our customers' managers but for February I'm tempted to skirt around the conventional information risk and security perspective (identifying and characterising the risks, then applying security controls to mitigate them) to find real-world examples of risk avoidance, risk sharing and/or risk acceptance. So now I'm on the look-out for examples of real-world situations where tightening the controls is not necessarily the best approach ....
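To make those four treatment options concrete, here's a hypothetical sketch of a simple risk register entry recording which treatment was chosen and why - the risks, ratings and rationales are invented for illustration, not drawn from any real register:

```python
from dataclasses import dataclass

# The four standard risk treatment options: mitigating with
# controls is only one of them
TREATMENTS = ("accept", "mitigate", "share", "avoid")

@dataclass
class RiskEntry:
    risk: str
    likelihood: str   # e.g. "low", "medium", "high"
    impact: str
    treatment: str
    rationale: str

    def __post_init__(self):
        if self.treatment not in TREATMENTS:
            raise ValueError(f"Unknown treatment: {self.treatment}")

register = [
    RiskEntry("Cloud provider outage", "low", "high",
              "share", "Covered by the provider's SLA plus cyber insurance"),
    RiskEntry("Legacy app cannot be patched", "medium", "high",
              "avoid", "Decommission and migrate rather than bolt on controls"),
    RiskEntry("Minor website defacement", "medium", "low",
              "accept", "Impact too small to justify further spending"),
]
for entry in register:
    print(f"{entry.risk}: {entry.treatment}")
```

Even a toy register like this makes the point that 'apply more controls' is just one column among several legitimate answers.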

Jan 11, 2018

Over the past couple of months, I've written and published a suite of 'Hinson tips' on another passion of mine: amateur radio. The tips concern a cutting-edge development in digital communications, and how to get the most out of the associated software.

I've had a lot of feedback on the tips, reflecting global interest in the new software and, I guess, the need for more guidance on how to use it. The reason I'm bringing it up here is that my writing style appears to have influenced the nature of the feedback I'm getting from, and my relationship with, the readers.

I honestly wasn't expecting that.

There was already a reasonably comprehensive help file for the program, well-written but in a fairly formal and dry technical style typical of technical manuals (not those ineptly translated from Chinese via Double Dutch!). A constant refrain is that people don't read the help file, just as we don't RTFM (Read The Flamin' Manual!). I suspect part of the reason is that 'fairly formal and dry technical style': despite amateur radio being a technical hobby, many hams are not technically-minded. Some simply enjoy using the radio to talk to people, and why not? It takes all sorts. Digital communications adds another layer of complexity through information theory and mathematics underpinning the protocols we use, and IT is a world of pain for some. To be frank, although I have a passing interest and some knowledge, I'm way out of my depth in some of those areas ... which means I empathise with those who are equally uncomfortable.

There is also an active online support forum, populated by a mix of experts, somewhat experienced users and complete novices. Unfortunately, the forum is suffering a little from the recent influx of people, some of whom are very passionate (which can easily come across as opinionated, strong-willed and direct). Being a global community, a lot of hams don't understand English very well (if at all!), hence the language can be a problem for them, as well as the sometimes hostile reception anyone gets on asking a 'dumb question'. Even attempting to explain things patiently in response to a genuine question or discuss ways to respond to an issue can lead to complaints that there are 'too many messages' and we are 'going off-topic', reflecting general frustration and perhaps a lack of understanding and/or focus.

So, I deliberately chose to write the tips in an accessible, readable, informal style, drawing on, interpreting and re-writing material from the help file and the forum, supplementing it with my own experiences with the new software. It started out as a personal document, something to help me come to terms with the subject and get my act together, so I wrote it initially as pragmatic notes.

I've gradually woven the content into a story - a fairly rational sequence that leads someone along the path leading from the simple stuff aimed at novices through to more advanced stuff for experts. I've used plenty of images to supplement the words, and I've taken the trouble to explain stuff, as straightforwardly as I can given the nature of the topic, rather than just saying "Do this" and "Avoid that" (with the downside of it growing longer by the day!). There's even the odd touch of humour in there, and personal/subjective opinion, guidance and asides mixed in with the more factual/technical content.

Above all, I tried to keep it practical, hands-on, action-oriented and motivational. I want to encourage readers to give it a go, try things out for themselves and hopefully enjoy themselves picking up new tricks along the way.

The feedback I've had has been overwhelmingly positive - even the odd correction or clarification has been expressed gently, couched in supportive, almost apologetic terms. I've noticed that all the responses have themselves been informal, people often telling me stuff on and around the topic as if they were chatting to an old pal rather than a total stranger, which I am to virtually all of them.

Some of that stems from this being a hobby, a shared interest. Hams are generally friendly and open towards other hams. It's a supportive community. The rest I put down to the style of my writing: the tips are written almost as if I am chatting to each reader individually ... hence maybe they feel they know me. We've made a connection.

I've heard much the same comment from famous people, such as Billy Connolly on TV the other evening. He talked about people half-recognising him when he's out in public (on day-release!), often appearing to mistake him for a distant friend. Billy being Billy, I'm sure he sometimes plays along, letting them think he really is just an old acquaintance.

That thought intrigues me - not fame but building trust with the reader. I'm wondering, now, about the security awareness materials I write professionally (and this very blog!). Am I using appropriate styles and formats for the different audience groups? Would adopting a more chatty, informal, friendly style be well received, more inspiring and motivational perhaps? It's a balancing act because the awareness stuff is, after all, work-related, a different context to hobbies and pastimes ... and yet I seldom get any feedback at all. I feel disconnected. Perhaps this is my cue to change.

Exactly what we will bring up, how we will raise and discuss things, the specific awareness messages we will be drawing out and so on is not determined at this point. It will become clear during January as we complete our prep-work and develop the awareness materials.

This morning, in connection with a discussion thread on the ISO27k Forum, I've been contemplating information risk management in a general sense by thinking through a situation, coming up with a specific example that draws out a much broader learning point.

Briefly setting the scene, the thread was started by someone asking whether it is really necessary under ISO/IEC 27001 to have a policy on risk-assessing valuable documents individually. We talked about grouping related assets together (such as 'Contents of cupboard 12') and controls (such as electronic backups) but the original poster circled back to the question of whether the ISO standard itself mandates a policy:

"I understood that I need to classify our assets according to their importancy and risk. But in general, would this cupboard-labeling method work according to ISO 27001 policies? For example, we have a lot of paperform documents in three cupboards and I would sort them all in some way, and make the cupboard lockable and label the cupboard according to the sorting and put the label into my inventory list. Would that violate any ISO 27001 policy?"

So this morning, I wrote this ...

. . . o o o O O O o o o . . .

Here's an important information security control that, as far as I know, isn't explicitly mentioned in ISO27k (yet!).

Do you have a formal archive - a dedicated long-term store for valuable information (physical information assets such as documents, computer data, records, forms ...) that needs to be retained for various reasons (legal compliance, business/commercial, history ...)?

Is all or much of the archived information literally or effectively irreplaceable? Would there be serious ramifications if it were lost or damaged? Isn't that precisely why you have the archive?

OK then, so what happens if, despite all the controls, the archive is physically destroyed in a fire or a flood? What if some crazy over-stressed worker takes to it with a match or an axe? What if "the big one" hits, wiping out the archive facility? What if? Will insurance pay out, and will that be sufficient compensation (given that the archived content was “irreplaceable”)?

So ... shouldn’t you have a backup copy, a duplicate or at least a facsimile of everything in the archive, stored separately, elsewhere? Shouldn't we be thinking and talking about 'a pair of archives' not 'an archive', in exactly the same way that we speak of 'a pair of trousers'?

Shouldn't there be strategies, policies and procedures concerning how the pair of archives are (is!) used, maintained and monitored?

And yes, I am talking about (at least!) doubling the cost of archival, so it makes sense to be even more careful about determining what truly needs to be formally archived versus stuff that can simply be backed up and stored normally. Those strategies and policies are important business and information security tools.
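Monitoring a pair of archives implies regularly confirming that the two copies still match. Here's a minimal sketch, in Python, of one way that check might work - comparing file digests between two archive locations; the function names and the two-site layout are my own assumptions for illustration:

```python
import hashlib
import tempfile
from pathlib import Path

def archive_digests(root: Path) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*") if p.is_file()
    }

def compare_archives(primary: Path, secondary: Path) -> list:
    """Return the relative paths missing from one archive or whose
    content differs between the pair."""
    a, b = archive_digests(primary), archive_digests(secondary)
    return sorted(path for path in set(a) | set(b) if a.get(path) != b.get(path))

# Demonstration with two throwaway 'archive sites'
with tempfile.TemporaryDirectory() as tmp:
    site_a, site_b = Path(tmp) / "a", Path(tmp) / "b"
    for site in (site_a, site_b):
        site.mkdir()
        (site / "deeds.txt").write_text("title deeds")
    # Simulate corruption or tampering at the second site
    (site_b / "deeds.txt").write_text("title deeds (tampered)")
    print(compare_archives(site_a, site_b))  # → ['deeds.txt']
```

An empty result means the pair still agree; anything listed is a discrepancy to investigate before one copy quietly becomes the only copy.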

And, with this in mind, you can probably think of other archive-related risks, scenarios and controls … good on you! You’ve got my point! Good luck if you take it forward.

To me, this is an obvious, straightforward control against a foreseeable risk, because I've identified and thought about the information risks, experienced at least some of the issues (e.g. floppy disks that can only be read on a specific floppy disk drive, due to head misalignment) and I've read about, though luckily not experienced, disasters that have destroyed archives (e.g. an Iron Mountain facility - a warehouse-sized commercial archive - went up in flames in London about a decade ago, and there have been others). I’ve considered all possible forms of risk treatment (accept, mitigate, share and avoid, remember) and thought about possible controls. It’s not hard really, if you start from the point of information risk identification and analysis. If, before you read this piece, you had never even considered it, and nobody in your organization had really thought it through (despite someone, somewhere, deciding that ‘an archive would be a jolly useful thing to have’), then there would have been no impetus to improve your archival controls.

This is just an example to illustrate the value of your ISMS, and your entire approach in this area, being risk-driven. The ISO27k standards, and methods, and books, and courses, and forums, and consultants and Google and so forth can help you enormously with possible risk treatments including controls, but calling on them for assistance won’t happen if you haven’t even identified the risks in the first place.

Our journey starts right here: what are our information assets? What possible harm could befall those assets, or befall us in relation to those assets? What worries us most? And what are we missing?

. . . o o o O O O o o o . . .

The information-risk-driven approach will undoubtedly be a strong theme in February's NoticeBored materials, but exactly how we express and elaborate on it is unclear at this point. We may incorporate the pair-of-archives piece as an illustrative example, perhaps in the form of a case study, something that people can work their way through, thinking about a specific scenario and then drawing out the more general learning points. It might become an aside in the awareness presentations or briefings. We'll see.

But for now we need to think about those 'learning points': what are the awareness objectives? What are we hoping to achieve with the next module? Having engaged with and lapped up the content during February, how will security-aware workers think and behave any differently? Coming up with specific objectives will help us turn the vague 'information protection' title into something we can work with. That's our task for today.

Jan 4, 2018

The Internet of Things and Bring Your Own Device typically involve the use of small, portable, wireless networked computer systems, big on convenience and utility but small on security. Striking the right balance between those and other factors is tricky, especially if people don’t understand or willfully ignore the issues – hence education through security awareness on this topic makes a lot of sense.

From the average employee’s perspective, BYOD is simply a matter of working on their favorite IT devices rather than being lumbered with the clunky corporate stuff provided by most organizations. In practice, there are substantial implications for information risk and security e.g.:

Ownership and control of the BYOD device is distinct from ownership and control of the corporate data and IT services;

The lines between business use and personal life, and data, are blurred;

The organization and workers may have differing, perhaps even conflicting expectations and requirements concerning security and privacy (particularly the workers' private and personal information on their devices);

Granting access to the corporate network, systems, applications and data by assorted devices, most of which are portable and often physically remote, markedly changes the organization’s cyber-risk profile compared to everything being contained on the facilities and wired LANs;

IoT is more than just allowing assorted things to be connected to and accessed through the Internet and/or corporate or home networks. Securing things is distinctly challenging when the devices are technically and physically diverse, often inaccessible with limited storage, processing and other capabilities (cybersecurity in particular). If they are delivering business- or safety-critical functions, the associated risks may be serious or grave.

It strikes me as odd that risks to the critical national infrastructure resulting from the proliferation of IoT things are not higher up the public agendas of various governments. I have the uneasy feeling that maybe the authorities are wary of drawing attention to the issue, except (hopefully!) in private dealings with the utilities plus defense, finance and healthcare industries. Conversely, I could be mistaken in believing that IoT is substantially increasing information risks in industrial situations: perhaps the risks are all fully under control. Perhaps pigs have wings.

Subscribe to the NoticeBored service to boost your security awareness program and catch imaginations with creative content, fresh every month.