As I mentioned in a previous post, use of PGP Desktop to encrypt all laptop disks is compulsory at IBM and is enforced through our end-user computing standards.

The default power management configuration for laptops often just suspends the machine when the lid is closed or when the ‘sleep’ button is pressed. Unless the user selects ‘hibernate’, the encryption key stays in memory and the disks remain unlocked. Standards dictate that laptop configuration should be changed to hibernate in these circumstances, but how many users actually make the necessary changes?
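On Windows (the platform PGP Desktop targets here), the hibernate-instead-of-sleep change could be scripted rather than left to each user. The sketch below uses the built-in `powercfg` tool from an elevated prompt; it is not IBM’s official procedure, and the setting aliases can vary between Windows versions, so treat it as illustrative only.

```shell
# Sketch: force hibernate (not sleep) on lid close and on the sleep
# button, so the encryption key is flushed from RAM. Run elevated.
# Not an official procedure; aliases may vary by Windows version.

# Make sure hibernation is enabled at all
powercfg -hibernate on

# Lid close action: 2 = Hibernate (0 = Do nothing, 1 = Sleep, 3 = Shut down)
powercfg -setacvalueindex SCHEME_CURRENT SUB_BUTTONS LIDACTION 2
powercfg -setdcvalueindex SCHEME_CURRENT SUB_BUTTONS LIDACTION 2

# Sleep button action: also hibernate
powercfg -setacvalueindex SCHEME_CURRENT SUB_BUTTONS SBUTTONACTION 2
powercfg -setdcvalueindex SCHEME_CURRENT SUB_BUTTONS SBUTTONACTION 2

# Apply the modified scheme
powercfg -setactive SCHEME_CURRENT
```

A script like this could be pushed out centrally, which answers the “how many users actually make the necessary changes?” question by taking the choice away from them.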

The comprehensive help documents provided by IBM for configuring the whole disk encryption software step the user through making a ‘rescue disk’ to allow recovery in the event of a lost encryption password. So, how many users take any precautions to protect that?

Going back to the potential attack against whole disk encryption: it relies on the attacker being able to recover the encryption key from memory dumps or hibernation files after the disk has been unlocked. Of course, if the laptop is always left safe (i.e. powered down or at least hibernating) then that attack vector isn’t available. However, how many users leave their laptop unattended and logged in when they believe the environment is ‘safe’? And how many leave their laptop unattended before the hibernation process has completed?

The common thread through all of this is that if users are careless, they can inadvertently cancel out any benefits from technical countermeasures. It’s simple enough to describe the exact behaviour that will prevent this. In public sector security, we call this Security Operating Procedures, or SyOPs for short.

It’s usual to define the IT security risk management process as starting with risk assessment to select the right security controls, followed by incident management to deal with residual risk, invoking crisis management and BCP when required, to recover from the most severe incidents. I strongly believe that SyOP production and security awareness training for end users must form part of the risk management process and must be in place before a service is activated to ensure that the security controls operate as designed and to defend against the sort of attack described here.

As I said in the title, users are the one part of the system that can’t be patched to remove vulnerabilities. It’s vitally important to explain the importance of what we ask them to do and then to reinforce that through adherence to mandatory written instructions, in order to establish the ‘habit’.

Not my normal security-related subject matter, but I had to pull together some highlights (wrong word?) of the appalling events in London over the past few days. The sequence below, taken from Twitter and Flickr and assembled in Storify (http://www.storify.com), shows clearly that the vast majority of people in the UK are sickened by the mindless violence and sheer greed of the criminals who did this. The story also shows (to me at least) that when it comes down to it, the people of the UK, and particularly Londoners, will always rise above attempts to terrorise them and just get on with sorting things out.

Something we can all do to help. Publish the banner on your website or your blog or retweet the post. Let people know, so they can turn out to help with getting things back to normal.

For me, this picture sums up the violence of the whole thing. This morning’s television news showed footage of a 150-year-old, family-run furniture store ablaze. Why? What did that achieve?

But, as bad as things get, people act with kindness and show their appreciation to the police.

And then this morning, I can only echo Professor Brian Cox on Twitter (above). It really does restore your faith in human nature.

People turned out in droves, responding to a spontaneous campaign to clean up the devastation left by the rioters.

… and shame the Devil, as I was often told as a child. Sound advice you’d think, but in the world of IT Security such honesty could cost you your job. I was alerted on Twitter by Kai Wittenburg to the story of Pennsylvania’s CISO Robert Maley. According to the story on Computerworld’s web site, Maley was fired by his employer, apparently after commenting on a security incident during the RSA show. The reason given for his dismissal was that he failed to get the proper approvals before making his comments. The incident in question appears to have been a vulnerability in a scheduling system at the Department of Transport. The Department denies that any hacking or breach was involved in the incident, but details have been handed over to the State Police for investigation. This furore is taking place against a backdrop of cuts of 38% in IT security budgets and 40% in staffing.

Chances are, Maley’s employer does insist on rigid prior approval for this sort of thing. It’s all part of the culture of secrecy around security incidents that’s endemic in large organisations. The immediate effect is to make it more difficult for all of us to get budgets approved for security programmes. Faced with yet another capital expenditure request for an IT security programme, the CEO will say “…but, if this threat is real, why don’t I ever read about it in the Press?” Answer: because far too many organisations follow the lead of the Commonwealth of Pennsylvania and deny everything.

And there’s another consequence of not discussing these incidents – we don’t learn from them. In his book “Managing the Human Factor in Information Security”, David Lacey describes how the aviation industry has systematically and ruthlessly pursued safety through a combination of mandatory incident reporting and thorough investigation of “near misses”. Any major incident is the result of a series of cascading failures. If any one element holds up under pressure, then the disaster is averted. However, there are still a whole load of individual failures to be investigated and lessons to be learned. Next time, you might not be so lucky.

As our World becomes ever more dependent upon on-line systems, so the impact of security incidents will become ever greater. Unless we allow – even encourage – IT security professionals to follow Maley’s example and openly discuss these incidents, how can we ever hope to improve?

From the age of 16, for the next 15 years, I served in the Royal Navy. As in all uniformed military organisations, a vital part of the induction process is learning the etiquette attached to membership. I don’t just mean the rules necessary for large and (at that time) wholly male groups to live and work in extremely close proximity, away from their families for long periods. Nor do I just mean the discipline on which lives can depend in a fighting force. Finally, I don’t just mean the quaint and unique traditions that come from 500 years of history. What I mean is the way in which servicemen (and women) are expected to dress (both in and out of uniform) and to behave (whether on duty or not), particularly when in the view of the general public.

The pressure to conform to these standards (which generally far exceed the norms for society) is immense and is imposed by one’s peers, not through the hierarchy. Having said that though, the lessons a 16 year-old learns from a Gunnery Instructor tend to stay learned for life! A good example is the practice of saluting. Saluting is always a mark of respect to the Monarch. So, we face the mast and salute at morning Colours and at evening Sunset, and we face the ensign and salute as we board the ship or go ashore. And we salute officers, because they hold the Queen’s Commission and that’s what we’re acknowledging, not the individual. To illustrate that point, from their inception in November 1917, the Women’s Royal Naval Service (WRNS) were not formally part of the Royal Navy, having their own rules and organisation. WRNS officers did not hold a commission and thus, Royal Naval personnel were not required to salute them. This all changed on 1 July 1977, when the WRNS became subject to the Naval Discipline Act.

Why am I telling this long-winded story? Well, although I left the Navy nearly 30 years ago, MrsV1951 and I still live in a naval town, so seeing uniformed RN personnel in the town centre is a common occurrence. A few days ago, in search of sanctuary and free wi-fi, I was headed to a local coffee shop and I happened to be following a naval officer, in uniform. Coming in the opposite direction were two naval ratings, also in uniform. They passed without even acknowledging each other’s presence, much less saluting. I was incensed, not just by this, but by the fact that the ratings were wearing their blue denim working uniforms (never, ever worn ashore in my day) and the officer was drinking Cola from a McDonalds cup as he walked! Why was I so annoyed? Maybe I’m just becoming a curmudgeon (I’m certainly old enough to qualify).

And then, today, an article in the Times by Daniel Finkelstein shed some light on my disquiet. Finkelstein was discussing how group identity has an impact on how we behave. This phenomenon has attracted the attention of the Nobel Prize-winning economist George Akerlof. Together with Rachel Kranton, he developed the idea of Identity Economics. The central concept is that we adopt an identity to fit in with our peer group and that preserving that identity is one of our major economic drivers. In their book “Identity Economics: How Our Identities Shape Our Work, Wages, and Well-Being” (to be published next month), they describe how the Armed Forces successfully exploit this behaviour to make service personnel adopt the identity of the service to build team spirit and morale – all the attributes that make every serviceman and woman determined to do their best for their colleagues every time. And they know that their colleagues will do the same – essential in the face of extreme danger (I served much of my time in submarines, where extreme danger was always close by, though rarely due to hostile action). So, maybe that explains my annoyance. What I saw was members of a peer group of which I am (subconsciously?) still a member not obeying what I think are the norms of group behaviour. If Akerlof is right, then I see that (subconsciously?) as a threat to my identity.

So, finally, what’s all this got to do with Identity Management? Well, it seems to me that some of the more perceptive commentators in the security industry, including David Lacey and Bruce Schneier, are saying that the real challenge for security professionals is to address the behaviour of the humans in the system. And, if Akerlof is right, then those humans have a composite identity, where each segment represents a peer group with which they identify and carries with it a set of behavioural norms.

It seems to me that this is reflected in the different behaviour people exhibit in revealing personal information on sites such as Facebook and LinkedIn. They expect to be able to portray an appropriate “face” to their peers in these different environments, without them interacting. And this, allowing a user to control who can see which parts of their identity profile and under what circumstances, is where we’re going to need some technology.