It has been a mantra for so long that it’s a cliché: Humans are the weakest link in the cybersecurity chain. The best technology in the world can’t protect an organization from an employee (and that includes top management) falling for a well-crafted social media or phishing attack.

Of course, the best security awareness training in the world might help strengthen that link, but according to people like Lance Spitzner, training director for the SANS Securing the Human Program, it’s not happening.

“We have invested a huge amount of effort in the past 15-20 years securing one type of operating system (Windows OS) while investing almost nothing in securing the other operating system (Human OS),” he said.

A point he made with this tongue-in-cheek chart:

[Chart courtesy of Lance Spitzner]

So it should come as no surprise that the latest report from the Ponemon Institute, which surveyed 1,000 IT professionals across North America and the UK, finds that a majority – 54% – of those that suffered data breaches said the root cause was “negligent employees.”

In spite of constant calls for better security awareness training, that percentage is up from last year’s 48%. And it could be even worse, since “almost a third of the companies in this research could not determine the root cause (of the breach),” the report said.

The anecdotal evidence, going back years, supports the statistics. At 2012’s DEF CON, Shane MacDougall won the social engineering “capture the flag” contest by getting a Wal-Mart store manager to give him 75 pieces of information over the phone in 20 minutes.

Wired reporter Mat Honan reported in the same year that, “in the space of one hour, my entire digital life was destroyed,” thanks to his own security lapses (no 2FA!) and the “helpfulness” of Amazon and Apple tech support.

The results of that weakest link are also depressingly familiar. Ponemon reports that:

Cyber attacks against small and medium-sized businesses (SMBs) increased from 55% to 61% in the past 12 months.

Ransomware showed a huge spike, from a reported 2% last year to 52% this year, with 79% saying the ransomware got into their systems through phishing/social engineering.

While strong passwords and biometrics are “an essential part of the security defense … 59% of respondents said they do not have visibility into employees’ password practices …”

The average cost of attacks rose, from $879,582 to $1,027,053 for damage or theft of IT assets and infrastructure; and from $955,429 to $1,207,965 for disruption to normal operations.

And the reasons why this is so are also familiar. Among them:

Attackers take advantage of the general tendency of people to want to be helpful.

People are trained to be compliant with authority figures, hence they are more likely to fall for attackers posing as law enforcement, top management or even HR.

Phishing continues to improve. In the case of the finance director mentioned above, the email address looked genuine, and since the real boss had posted pictures on social media of his Greek island getaway, it made sense when the fake boss said he didn’t want to be disturbed because he was on holiday.

All of which should be a signal to company leadership that IT clichés like PEBKAC (Problem Exists Between Keyboard and Chair) or “you can’t patch stupid” are getting in the way.

It’s an attitude that sets people like Spitzner off.

“The reason people continue to be the weakest link is that most organizations continue to fail to invest in them,” he told Naked Security. “If you want your awareness program to really be a success, put an FTE in charge of it. Too many programs have minimal support and maybe 15% of someone’s time.”

And in a post this week on the SANS blog, he said that since people “store, process and transfer information,” they are targets just like operating systems, apps and other computing technology.

His blunt assessment: “we the security community have failed to secure them.”

Doing that, he said, will require “mature awareness programs that focus on key behaviors that people can easily exhibit. We have failed to engage people in their own terms that they can easily understand.”

But there is also some ongoing debate about the best way to do that. At last year’s Black Hat, several presenters argued that making employees hyper-vigilant could create paranoia leading to a “constant state of distrust” and would interfere with “how people actually do their jobs.”

“That would be like saying wearing a seat belt takes away the enjoyment of driving. Or locking your car makes people drive poorly,” Spitzner said. “In the world we live in, security precautions become second nature, and people adapt.”

Maybe the finance manager’s boss should have been fired too: he set up the scam by ignoring security advice to post pictures after returning home rather than while away. What do you want to bet the crooks targeted the company because of the boss’s vacation photos? It doesn’t sound like a coincidence to me.

To me a lot of social media is one great big bragfest. I don’t bother with it except for the few occasions where it is necessary or can be proved to be a bonus to my everyday life. I’m not terribly interested where my boss goes on holiday except out of politeness and a bit of brown-nosing. This isn’t jealousy as I have been known to sail around the Ionians myself and the only thing I immediately envy is that fresh crab you can buy directly from a fisherman in the early morning. Yum. Yes, he should have waited till he got home and I hope he didn’t get burgled because of his post.

Security awareness training CAN be annoying, if you’re just hearing “Don’t choose 123456 for your password” for the thousandth time. Instead, organizations should set up processes expecting there to be scammers. “HR will never ask for your password.” “All large payments must be verified in this way.” “Highly sensitive information must be stored and transferred using only the following process.” “This action must be authorized by two executives.” And so on.
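A process rule like “all large payments must be verified in this way” or “this action must be authorized by two executives” can be enforced in software rather than left to memory. A minimal sketch in Python, where the threshold and the approver model are illustrative assumptions, not any particular product’s policy:

```python
# Sketch of a process-level control: payments above a threshold
# require two distinct approvers, so no single scammed employee
# can push one through. Threshold value is illustrative.

LARGE_PAYMENT_THRESHOLD = 10_000  # currency units; an assumption

def payment_allowed(amount: float, approvers: set) -> bool:
    """Allow small payments with one approver; large ones need two."""
    if amount <= LARGE_PAYMENT_THRESHOLD:
        return len(approvers) >= 1
    return len(approvers) >= 2
```

The point of encoding the rule is that a convincing “CEO on holiday” email can pressure one person, but it cannot conjure a second independent approval.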

Education isn’t just telling people either. It’s testing that they know, and testing that they do what they know.

As much as possible though we have to remove our reliance on training so that we’re only talking about the bits for which there is simply no other solution. Nobody should have to teach you not to use “123456” as your password because no system should ever accept that as a password.
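The idea that no system should accept “123456” can be sketched as a server-side screening check. The denylist and minimum length below are illustrative assumptions; real deployments typically screen against large breached-password corpora:

```python
# Minimal sketch of server-side password screening: reject entries
# that are too short or appear on a denylist of known-bad passwords.
# The list and minimum length are illustrative, not a standard.

COMMON_PASSWORDS = {"123456", "password", "qwerty", "letmein", "administrator"}
MIN_LENGTH = 12  # assumption for illustration

def is_acceptable(password: str) -> bool:
    """Return True only if the password passes basic screening."""
    if len(password) < MIN_LENGTH:
        return False
    if password.lower() in COMMON_PASSWORDS:
        return False
    return True
```

With a check like this at registration and password change, the “don’t choose 123456” lecture becomes unnecessary, which is exactly the commenter’s point.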

Attaboys can go a long way too. When fairly well-crafted emails get forwarded to me by staff to review (usually before opening attachments), and they turn out to be phishing (usually fake PDFs or links that redirect to malware), I thank them and congratulate them on a good job being critical of the emails. Users feel good for contributing to security and keep their attentiveness up. And they see me as helping them rather than blocking everything, lol.

Much as I hate many of my (useless) colleagues in other teams we support, and I’m the “you f*cked up” guy because of my role, I will also help those same awful people to do things better in future, regardless of its impact on me, because it makes them trust me so they’ll only fight me when it matters to them, instead of as a habit. I’ve barely ever met an infosec team that thinks like that.

Well done! Wish I could upvote this more than once. Gets them wanting to do better, and yes, “IT guys are a PITA” is all too common a trope when we’re simply endeavoring to protect them from themselves.

I liken our role to airport security: everyone is annoyed that we’re even here. That is, until *just after* they needed us; then, for a brief instant, EVERYone is fully supportive of having IT and all its cumbersome rules, procedures, and safety precautions.

I also recall seeing someone (here, maybe?) say they periodically brute-forced their own users’ AD credentials and posted any passwords they successfully compromised. Turning positive reinforcement on its head, of course, but “whew, I wasn’t on the list again” can still be a motivator. Haven’t tried that one myself.
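The self-audit the commenter describes, checking your own users’ stored hashes against a list of guessable passwords, can be sketched like this. SHA-256 stands in for whatever hash the directory actually uses (AD stores NT hashes), and the usernames and candidate list are made up for illustration; doing this for real requires explicit authorization:

```python
import hashlib

# Sketch of an in-house password audit: hash a list of guessable
# candidates and flag any account whose stored hash matches one.
# Hash choice, candidates, and accounts are illustrative assumptions.

def sha256_hex(password: str) -> str:
    return hashlib.sha256(password.encode("utf-8")).hexdigest()

CANDIDATES = ["123456", "password", "Winter2024", "letmein"]

def audit(stored_hashes: dict) -> list:
    """Return usernames whose stored hash matches a guessable candidate."""
    candidate_hashes = {sha256_hex(c): c for c in CANDIDATES}
    return [user for user, h in stored_hashes.items()
            if h in candidate_hashes]
```

Whether to publish the resulting list, as the commenter’s acquaintance did, is a separate question; notifying the affected users privately achieves the same corrective effect without the shaming.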

making employees hyper-vigilant could create paranoia leading to a “constant state of distrust” and would interfere with “how people actually do their jobs.”

Emailing a URL, username, and password together in plaintext is still far too common. I don’t mean at account creation; this interaction was forwarded to me two days ago, when my boss and a client needed a hand:

As if this blatant disregard for security weren’t bad enough, the (five-character) username matches the client’s email address: first initial, last name. When I replied to say I’d fixed it, I scrolled down and changed both to “XXXXX,” just to (barely) limit further risk.

Maybe interfering with “how people actually do their jobs” wouldn’t be a completely bad idea.