
Consider, for a moment, your last interaction with cybersecurity measures on one of your many smart devices. Perhaps, when you recently transitioned all of your banking to a mobile application, you were offered the opportunity to enable two-factor authentication to better protect your data.

That was the recommended approach – or at least that’s what the programmed instructions told you – but you still decided against it. After all, you have a hard enough time remembering just one of your long list of passwords, and any extra step that might lock intruders out of your information very well could lock you out, too.

And yet, with each passing day, larger and larger pieces of our lives are being stored on computers, on phones, and in the digital cloud that allows us access, but which poses significant challenges to security.

The stakes are high and only getting higher, and new School of Interactive Computing Assistant Professor Sauvik Das says it is time to change the way people think about, adopt, and adapt to cybersecurity measures.

“Our money, our work, our media is already protected by cybersecurity and if there’s a breach, all of that stuff is compromised,” Das said. “But think about where we’re going in the future. We’re going to have these embedded smart things that are controlling access to our front doors, controlling our cars, and, in some cases, there are going to be medically implantable devices that could potentially control our bodies.

“It’s more important than ever that we get security to where it’s not a thing people begrudgingly tolerate, but actively want to do. Right now, we almost don’t even consider the social aspects in cybersecurity, and my work is trying to change that.”

Cybersecurity: It’s a Social Game

Das, who completed his undergraduate degree (B.S. Computer Science) at Georgia Tech in 2011, recently completed his Ph.D. at Carnegie Mellon University, where he pursued research at the intersection of human-computer interaction, data science, and cybersecurity to create novel, socially-compatible security systems that make people actually want to engage with their security.

Das said he wants to understand in what ways and why people adopt or reject cybersecurity. Then, ultimately, he wants to create a suite of social cybersecurity systems that are more observable, cooperative, and stewarded.

One way he pursued this was through a partnership with Facebook, analyzing how one’s friends’ use of security tools affects one’s own use of those tools. The intuition, he said, is that human behavior is largely explainable by the social influences around us.

He looked at three security tools offered by Facebook: login notifications, which alert you whenever there is a suspicious login to your account; login approvals, which is Facebook’s version of two-factor authentication; and trusted contacts, which allows a user to specify three to five friends who can vouch for them if they lose access to their account.

In examining how social influences affected the adoption of the three security features, he found that trusted contacts had the expected effect – if you have more friends who use a particular feature, like trusted contacts, you are more likely to use it yourself.

For the other two, however, social influence had a negative effect until a very high level of exposure.

Why?

“That means if you have some friends who use login approvals and login notifications, you’re actually less likely to use them yourself than if you did not have many friends using those features,” Das said. “Searching through literature and speaking to users, one thing became clear: There is a stigma effect. The person who would use these more stringent security measures is probably perceived as being particularly invested in security, maybe a little paranoid or nutty.

“Early adopters tend to be people others perceive as the ‘tinfoil hat crowd.’ Because early adopters tend to be this type of person, others might perceive it as not the type of thing they want to do.”

So, as Das points out, even if cybersecurity systems are made more usable, the fact that social influence can have a negative impact on adoption still poses a challenge.

Knock, Knock. Who’s There?

In another project at Carnegie Mellon, Das developed “Thumprint,” an authentication system for small local groups – those sharing access to things like a refrigerator or a gaming system.

The idea – a shared secret knock – goes back to the Prohibition era in the United States, when people in the know could enter secret speakeasies to procure illegally sold alcohol in just such a manner.

“They would identify if this patron was part of this club through a knock, and if the incorrect knock was used they wouldn’t let you in or they may tear down the operation because it could be the police,” Das said.

He wanted to see if this would work in the digital world. The knock acts as a shared password, but the system is also able to differentiate between different people who have unique cadences or rhythms to their knock.

“It’s a system that uses machine learning in order to learn groups’ shared knock but still differentiate between the mechanical expression of each individual’s knock,” Das said. “That way, I don’t have to keep a secret but the system still knows which one of us is accessing it.”
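The two-part check Das describes – a shared secret rhythm for the group, plus per-person discrimination within it – can be sketched in code. This is a hypothetical illustration, not the actual Thumprint implementation: it substitutes a simple nearest-template comparison over normalized inter-knock intervals for the machine-learning model the real system uses, and every name and threshold below is an assumption.

```python
# Hypothetical sketch of a Thumprint-style knock authenticator.
# Each knock sequence is reduced to its gap ratios, so the same
# rhythm matches regardless of overall tempo; each group member
# enrolls a few samples, and a new knock is attributed to whichever
# member's template it sits closest to (or rejected entirely).
from statistics import mean


def intervals(timestamps):
    """Convert knock timestamps (seconds) into normalized gap ratios."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    total = sum(gaps)
    return [g / total for g in gaps]


def distance(a, b):
    """Sum of per-gap differences; wrong knock count is an outright reject."""
    if len(a) != len(b):
        return float("inf")
    return sum(abs(x - y) for x, y in zip(a, b))


class KnockAuthenticator:
    def __init__(self, threshold=0.15):
        # threshold is an assumed tolerance, not a value from the paper
        self.templates = {}  # member name -> averaged interval vector
        self.threshold = threshold

    def enroll(self, user, samples):
        """Average several training knocks into one per-member template."""
        vectors = [intervals(s) for s in samples]
        self.templates[user] = [mean(col) for col in zip(*vectors)]

    def identify(self, timestamps):
        """Return the member whose template best matches, or None."""
        v = intervals(timestamps)
        best_user, best_d = None, self.threshold
        for user, template in self.templates.items():
            d = distance(v, template)
            if d < best_d:
                best_user, best_d = user, d
        return best_user


auth = KnockAuthenticator()
auth.enroll("alice", [[0.0, 0.2, 0.4, 1.0], [0.0, 0.21, 0.41, 1.02]])
auth.enroll("bob", [[0.0, 0.5, 0.6, 0.7], [0.0, 0.52, 0.61, 0.72]])
print(auth.identify([0.0, 0.2, 0.4, 1.0]))        # matches alice's rhythm
print(auth.identify([0.0, 1.0, 1.1, 1.2, 1.3]))   # unknown rhythm: None
```

Because every member shares the same secret rhythm, nobody has to keep an individual password, yet the small mechanical differences in how each person expresses the knock are enough for the system to tell members apart – the property Das highlights above.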

Making Security Engaging

These are just a couple of the approaches Das has used in the past, and now that he’s back in the College of Computing – a decision he felt was natural, considering the proximity of his family, his history at Georgia Tech, and the relationship he already had with other stellar faculty members – he’d like to consider how to make security more engaging to its users.

He described it in three terms: Observable, cooperative, and stewarded.

“I want to make systems that make it easy to observe and emulate good behavior so that it propagates through the population,” Das said. “I want to make security systems that are additive. Our strength aggregates in the real world. In the virtual world, it’s the opposite. If I have good security behaviors, but interact with someone with bad behaviors, the easiest way for an attacker to get our information is through that person. So, I want systems that can create strength in numbers. And, I want to examine how we can create systems that allow people to be stewards, to act in benefit of somebody else. That’s the social piece.”

In the cat-and-mouse game of cybersecurity, Das is hoping his research can increase the speed with which society can adjust to the constant bombardment of ever-adaptive attacks.

“The thing about the work I do is that humans don’t change that much,” he said. “Attacks change. Systems change. But, by and large, we can still use fundamental principles of what engages humans. If we can apply that to cybersecurity, that’s going to help in the long run.”