"The grumpy IT guy," is the title my fellow employees have bestowed upon me. I don't think I'm grumpy. When all is said and done though, what I think does not matter; the perception is real in their eyes. How can it not be? An overwhelming number of my conversations, at work, center on some digital thing that is not working right, and it needs to be fixed right away, and no one has any idea what happened.

I know I'm not the only IT person in this illustrious club. When I grab coffee with colleagues, the conversation inevitably turns to "us versus them" situations, allowing for shared commiseration over lattes.

I may have an explanation for the phenomenon. I met Professor Rose McDermott through her ACM paper, "Emotion and Security." During a conversation, I asked Rose, a well-regarded scholar in psychology and behavioral studies, how to convince users of the importance of IT security and avoid IT-user discord. She did not mince words:

People will listen to a conversation that is vivid, salient, concrete, and emotionally engaging. Abstract, pallid, statistical arguments tend to make people's eyes glaze over.

Now the "not-mincing words":

IT professionals, please don't take this the wrong way, but the skills that make one good at IT (meticulous attention to detail, ability to process large amounts of data simultaneously, detachment) can make it hard to communicate effectively with the technologically illiterate.

It's hard to deny what Rose said. Yet I have faith that once we IT professionals understand what Rose proposes in her paper, we'll be ready for the next IT-user encounter. Maybe I'll even be able to get rid of my "grumpy IT guy" moniker.

Kassner: Thank you, Rose. I'm not sure I would have figured much of this out without your help.

As I understand it, the main idea presented in the paper is that emotions shape our perception of reality — something IT professionals need to understand if they want to truly convince users that doing something is in their best interest. Do you have an example?

McDermott: We can feel secure when we are not, just as we can feel insecure even when we are protected. For example, the TSA invests an enormous amount of time, money, energy, and resources making us take off our shoes and jackets for a screening that is largely performative. It makes people feel secure enough to fly; it does not actually make anyone more secure.

On the other side, we can feel at risk when we are largely safe. The risk of dying in a shark attack is low, but people feel insecure all summer once they see a local shark attack on the evening news. Psychologically, we are affected more by perception than by probability.

Kassner: What you mention reminds me of the term Bruce Schneier coined: "security theater." Beyond that, we must be aware of how users perceive what we are telling them, as that tells us whether we are making any sense.

A bit further in the paper, you drop another bomb — one I'm not sure I comprehend. You say it doesn't matter whether we raise the flag to say a threat is likely or lower the flag to say the threat has passed; people are negatively impacted either way. Why is that?

McDermott: We tend to forget about things over time, or push them to the back of our minds. But when a threat is publicly addressed, even as a reduced possibility, it becomes salient again, reminding us there is still something out there to be afraid of. That makes people more anxious.

Kassner: Wow, this explains why users are nowhere near as relieved as I am when a security issue is resolved.

You talk about two theories: Probability Theory and Prospect Theory. Probability Theory suggests the bigger the threat, the stronger the response. But that's not how we behave; we seem to react more in line with the tenets of Prospect Theory. Could you explain the theory, and what IT professionals should pay attention to?

McDermott: Sure. Prospect Theory describes how people respond to risk. Basically, people are more accepting of risk when confronting losses than when facing gains. Also, people do not respond to probability in a linear fashion.

We give more weight and attention to events that have a low probability of occurring. Think of all the money we have spent preventing terrorism, when fewer people have died from terrorism in the past ten years than have died from domestic gun violence in any single year of that decade.

Conversely, we give less weight and attention to moderate- and high-probability events. For example, few of us eat healthily and exercise daily to reduce the risk of heart disease and cancer.
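This nonlinear response to probability can be sketched with the probability weighting function from Tversky and Kahneman's 1992 formulation of Prospect Theory. The function below is their published form, and the gamma value is their estimate for gains; using it to illustrate McDermott's point is my assumption, not something drawn from her paper:

```python
def weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function.

    Maps an objective probability p to the decision weight people
    actually seem to apply: small probabilities are overweighted,
    moderate and large ones underweighted.
    """
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# A rare threat (1% chance) carries a decision weight several times
# its objective probability:
print(round(weight(0.01), 3))   # -> 0.055, not 0.01

# A common threat (90% chance) feels less certain than it is:
print(round(weight(0.90), 3))   # -> 0.712, not 0.9
```

The shark attack and heart disease examples above fall out directly: a vivid low-probability event gets inflated weight, while a mundane high-probability one gets discounted.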

Kassner: Finally, I have a reason why users are ambivalent about keeping software up to date on their computers.

The paper mentions the following as a good strategy for communicating information about potential threats. A threat warning should:

Be communicated by an expert and trustworthy source.

Be focused on a specific anticipated attack.

Motivate respondents to act.

Provide specific concrete actions individuals should take to counter the threat.

Could IT managers apply these principles when trying to convince employees how important a pressing security issue is?

McDermott: You may not be able to do all those things with every risk, but the more of them you can do, the more likely people are to pay attention and respond appropriately. An IT manager would presumably be an expert and trustworthy source, able to communicate simply, clearly, and repeatedly the potential source and type of attack, along with its cost and risk. That expert also needs to tell employees what they can do to reduce the risk.

Also, tying a vulnerability to things employees hold dear is more likely to gain their attention. For example, merely saying shared data-storage sites are not secure may elicit a yawn. Telling someone that web-based email accounts (one form of online data storage) are easy for hackers to access — and that a hacker might read a few unsavory messages and, heaven forbid, forward them to a spouse — will likely spark an immediate response.

Obviously, my example is extreme, but one need look only as far as former CIA director David Petraeus to see how common such errors are, even among those who should know better; he was, after all, running the CIA.

Final thoughts

What am I taking away from Rose's paper? We all respond to situations based on emotions, logic, and what our five senses are telling us. So I must remember that a user who doesn't follow my advice isn't purposely trying to make me grumble.

I have a question I'd like to ask. After reading this article, do you think those responsible for, and participating in, the domestic surveillance of United States citizens have a different emotional perspective on security than the citizens being surveilled?

A tip of my hat to the ACM for publishing Professor McDermott's paper and allowing me to excerpt quotes from it.