Musings about grad school and personal fulfillment.

NSA’s Best Scientific Cybersecurity Paper Competition, Part 2

One of my papers was recently recognized as an honorable mention for the NSA’s Best Scientific Cybersecurity Paper Award. As an honorable mention author, I was given the opportunity to present my work to an audience of NSA security researchers. In a previous post, I briefly described the award and my paper that got the honorable mention. In this follow-up, I want to talk about the actual recognition ceremony — i.e., my day at the NSA, speaking with actual NSA security researchers and other award winners.

I mentioned this already in the previous post, but it may be worth mentioning again: this post is not meant to be an opinion piece about the NSA. It is simply a descriptive account of my one — rather restricted — day around the NSA.

Learning the news

I first learned that my paper had been selected for an honorable mention when I was in Zurich, just as I was wrapping my internship as a privacy researcher at Google. It was an interesting experience for a couple of reasons:

First, the director of the NSA’s research directorate, Dr. Deborah Frincke, personally calls the winning and honorable mention authors to share the news. But because the nomination process is anonymous and must be done by someone outside the author list, it is not a call one expects. (In my case, I knew the paper had been nominated, but I was not expecting a personal phone call.) I was still using my U.S. number in Europe [1], so I wasn’t too keen on picking up a call from an unknown number.

Second, the news arrived later than it was supposed to. While I knew the paper had been nominated, the initial timeline for the award suggested that authors would be informed around early July if they had won. I got the phone call in late August.

Later, I got an e-mail from my adviser (and co-author) informing the rest of us that our paper had received an honorable mention. Dr. Frincke also left me an enthusiastic voicemail. She’s very nice, by the way. Even after I e-mailed her back to thank her for the news, she insisted that we eventually have a phone call to talk about the work.

Encryption What?

The NSA offered to foot the bill for getting me and one of my co-authors out to DC for the recognition ceremony. This process, of course, required some back and forth so that they could clear us to enter the building and purchase plane tickets. Given that the NSA is a government organization, and an intelligence organization at that, it’s not too surprising that the tech they use is a bit behind the times [2]. For example, I had to fax in some information so they could purchase our plane tickets. Yes, I had to use an actual fax machine.

Some of the other tech issues were less…expected and more humorous. For example, I was not able to send my point of contact at the NSA an encrypted e-mail because of some issue on their end. We had to settle for a phone call so I could relay some of the more sensitive information they requested. I can’t help but find that a bit ironic.

The Ceremony

The day of the ceremony was actually fun. My adviser and I were put up in a hotel and told to meet the NSA reps somewhere in the lobby. I was half expecting people in black suits and shades to come up and escort me out of the hotel, but the actual process was more ad hoc: a couple of NSA reps standing confusedly in the lobby, looking around for people who kinda looked like the pictures we had sent them. My adviser and I spotted them right away after breakfast, but we could see the hesitation in their eyes. It was a good start. We were definitely not dealing with any boogeymen.

National Cryptologic Museum

The first agenda item was a visit to the National Cryptologic Museum. That’s where we met the other winners and honorable mention authors. It was a motley crew, all told. The winning paper had a fairly large author list, with people from Brazil, France, Australia, and the US. The other honorable mention paper was from MIT Lincoln Laboratory. There was also a high school science fair winner and her father (because she couldn’t drive there by herself, adorably). As you can imagine, the age range was quite large. Apart from me and the science fair winner, everyone seemed to be pretty established; we were the only two there without Ph.D.s.

Anyway, we spent around two hours at the museum on an extremely in-depth tour of the history of cryptology and its role in World Wars I and II. I got to see and play with actual Enigma machines. I also got to see, but not play with, an actual Bombe (the device, originally designed by Turing, that was used to break messages produced with an Enigma machine) and replicas of the Japanese “Red” and “Purple” encryption machines. Knowing about these machines intellectually is one thing; seeing them in person is quite another. I was blown away by the magnitude and intricacy of the engineering effort that went into these early encryption and decryption machines. They were truly masterful feats.
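If you’ve never looked at how the Enigma actually worked, the neat part is the reflector: it made the machine self-reciprocal (enciphering a ciphertext with the same settings returns the plaintext), and it also guaranteed that no letter ever enciphered to itself, which is the very weakness the Bombe exploited. Here’s a heavily simplified, single-rotor sketch of that structure. The wirings are the historical rotor I and reflector B, but the fixed per-message offset is my simplification; a real Enigma stepped its rotors on every keypress:

```python
import string

ALPHABET = string.ascii_uppercase

# Historical wirings for rotor I and reflector B, used here purely as
# example permutations for a single-rotor toy machine.
ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"
REFLECTOR = "YRUHQSLDPXNGOKMIEBFZCWVJAT"

def encipher(text, offset=0):
    """Toy Enigma-style cipher: one rotor at a fixed offset plus a reflector."""
    out = []
    for ch in text:
        pos = (ALPHABET.index(ch) + offset) % 26   # enter the rotor at an offset
        pos = ALPHABET.index(ROTOR[pos])           # forward through the rotor
        pos = ALPHABET.index(REFLECTOR[pos])       # bounce off the reflector
        pos = ROTOR.index(ALPHABET[pos])           # back through the rotor
        out.append(ALPHABET[(pos - offset) % 26])
    return "".join(out)

ciphertext = encipher("ATTACKATDAWN", offset=3)
# Because the reflector is an involution with no fixed points, enciphering
# twice restores the plaintext, and no letter ever maps to itself.
assert encipher(ciphertext, offset=3) == "ATTACKATDAWN"
```

Both properties fall out of the reflector being a fixed-point-free involution: the whole path (rotor in, reflector, rotor out) inverts itself when traversed twice, and it can never return the letter you pressed.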

Scattered throughout the museum were crypto-puzzles for kids. Amusingly, the NSA has developed an entire cast of anthropomorphized, crypto-loving animal characters, dubbed the CryptoKids, to get children interested in crypto. Think Teenage Mutant Ninja Turtles, only everyone is Donatello. I saw quite a few kids who were really into it. Kids these days are too smart.

Lunch & “Wicked” Discussions

Following the museum tour, we got a free lunch. Apparently, arranging that lunch was itself a masterful feat because of all the bureaucratic hurdles. So, you know, whoever was able to get that done: thanks!

The one snafu was that they scheduled around two hours for us to eat. That’s crazy. We were done in 30 minutes, which left around an hour and a half to kill before the presentations and panel discussion began.

If you put a bunch of security and privacy researchers who don’t know each other in a room together, you’re not going to get enough small talk to fill a 90-minute void. Conversation was lulling, and the organizers could tell. But one particularly smart organizer figured out a way to get things riled up again. How? It’s actually pretty easy.

If you ever want to get a bunch of academics to go crazy, ask them a “wicked” question. A wicked question is basically a question that has no definite answer but is interesting to contemplate. For example: “Is time travel possible?”, “Are we on track to creating self-aware AI?”, “Can flying cars ever be practical?”, and “What exactly is a p-value?” Academics will take the bait every time. Even if none of us has a strong opinion about the topic before hearing the question, throughout the discussion, every single one of us will develop a strong opinion and argue ad infinitum — nay, nauseam.

This guy asked us two wicked questions. The first: “How can we facilitate tech transfer?” (i.e., how can we get more academic research insights into real products) and the second: “What are some interesting, open research questions in privacy?”

The tech transfer question took up the bulk of the time, and it seemed like few people in the room had ever actually worked in an industry position. Accordingly, the discussion was much more an intellectual exercise than a practical one. Many people were under the impression that industry is too focused on short-term goals, whereas most good research is too broad to be useful in the short term. Others called for better research communication, I think [3]. There were also heated off-topic exchanges among smaller groups of us. For example, one of the other authors asked why we were talking about tech transfer when we should be talking about science transfer, which led the conversation down a random path about whether people can tell if a product has been scientifically tested. I don’t even know what that means, so I have no idea whether “people” can.

The privacy question led people to ask whether there could ever be a “science” of privacy. Someone brought up the idea that while security can offer formal guarantees, privacy may never be able to: there is no universally agreed upon definition of privacy, and thus no way to formally measure whether something “is” or “is not” private. A lot of the discussion nevertheless centered on trying to formalize privacy. I personally do not believe that’s a productive use of time. I think we should recognize that privacy is inherently subjective, and that we should not fight that flexible understanding by forcing a one-size-fits-all definition.

As a corollary to that observation, I think the role of technology in privacy protection is secondary. I have serious doubts that technology alone will ever be able to assure “privacy”. Instead, I think affording people adequate privacy protections is a policy problem. In his book “Secrets and Lies”, Bruce Schneier argues that until we create strong enough (dis)incentives for decision makers (i.e., executives in industry and government) to value people’s privacy as much as they value other things (e.g., new product features), privacy will never be a priority for the people who have any power to assure it.
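To be fair to the formalists, there is at least one widely cited attempt at a formal privacy definition: differential privacy, which sidesteps the definitional debate by bounding how much any one person’s data can change a computation’s output rather than pinning down what “private” means. It didn’t come up at lunch; I include it only to illustrate what a “formal guarantee” can look like. A minimal sketch of its Laplace mechanism (the 2FA query and the numbers are made up for illustration):

```python
import random

def laplace_mechanism(true_answer, sensitivity, epsilon):
    # Noise is scaled to the query's sensitivity (the most any one person
    # can change the answer) divided by the privacy budget epsilon.
    scale = sensitivity / epsilon
    # The difference of two i.i.d. exponentials is Laplace-distributed.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_answer + noise

# A counting query ("how many employees enabled 2FA?") has sensitivity 1:
# adding or removing any one person changes the count by at most 1.
noisy_count = laplace_mechanism(true_answer=1234, sensitivity=1, epsilon=0.5)
```

The guarantee is statistical, not absolute: any single person’s presence or absence changes the probability of any given output by at most a factor of e^epsilon. Which, notably, still doesn’t tell you whether the result feels private to the people in the dataset.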

Anyway, that was lunch. As you can see, if you give a group of academics a solid wicked problem, killing 90 minutes is no problem at all. We could have debated these two questions for 900 minutes. Eventually, however, we had to stop. It was time to be “recognized”.

Presentations

Next up: presentations. One author from each of the winning and honorable mention papers had to give a 15-minute presentation about the recognized paper. Just before that, we were each given a framed certificate along with an NSA challenge coin [4].

I’m not going to talk too much about the actual content of my presentation. If you want to learn more about my paper, please either read the paper or my previous post. Instead, I’m going to talk about my experience giving the presentation.

Basically, giving the presentation was more nerve-racking than I initially expected. Note that I’m not typically very nervous when giving presentations (anymore). As a graduate student, you have to give these sorts of presentations all the time: at lunch meetings, at conferences, at fellowship and award recognition ceremonies like this one. And I had already given some form of this presentation several times before. So in the weeks leading up to the presentation I felt totally calm and unworried. In fact, I hadn’t practiced the talk once since the last time I gave it (nearly a year prior). But the nerves hit in the hours leading up to the talk.

Maybe I finally realized just what I was doing: speaking to an audience of some of the brightest minds in cybersecurity research about work that was being recognized as some of the best in the past year. That’s flattering, of course, but also opens up the work to all kinds of scrutiny. I had to do my best and represent the work well, lest they realize they made a terrible mistake in selecting my paper. I couldn’t treat this opportunity so casually.

Luckily, the presentation went really well [5]. The audience seemed really receptive to the broader idea of applying social psychological insights to improve security. One clear indicator of their engagement was that almost every question in the joint Q&A panel at the end of the three presentations was directed at me about our work. And the questions weren’t the scary “is any of your work even valid?” kind, but the “whoa, that’s really cool; have you considered X as well?” kind. Those are always great to answer. They feel less like an interrogation and more like an intellectual exercise.

The high school science fair winner also gave a quick presentation. She talked about using stylometry to de-anonymize tweets, with an eye toward detecting cyberbullies. It was a neat presentation. Once again, kids these days are too smart.
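For readers unfamiliar with stylometry: the core idea is that people have measurable writing habits (spelling quirks, punctuation, abbreviations), so an “anonymous” text can be compared against writing samples from known authors. I don’t know the details of her method; below is just a generic sketch of the technique using character trigram profiles and cosine similarity, with a made-up toy corpus:

```python
from collections import Counter
from math import sqrt

def char_ngrams(text, n=3):
    # Character n-gram counts are a classic stylometric feature: they pick
    # up habitual spelling, punctuation, and abbreviation patterns.
    text = text.lower()
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine_similarity(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(count * b[gram] for gram, count in a.items() if gram in b)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Toy corpus: writing samples from two "known" authors and one
# unattributed tweet. (Real systems need far more text per author.)
known = {
    "author_a": "ugh cant believe this happened again smh so annoying",
    "author_b": "I simply cannot believe that this has happened once more.",
}
mystery = "ugh so annoying cant believe this smh"

profiles = {name: char_ngrams(text) for name, text in known.items()}
scores = {name: cosine_similarity(char_ngrams(mystery), profile)
          for name, profile in profiles.items()}
best_match = max(scores, key=scores.get)  # "author_a" for this toy corpus
```

Real stylometry systems layer proper classifiers and many more features on top of this, but the attribute-by-stylistic-fingerprint idea is the same, and it’s exactly why “anonymous” tweets are less anonymous than people think.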

Reception

There was a general reception at the end of the presentation session. This was supposed to be our chance to intermingle with a broader set of NSA researchers.

The brief chats I had during this reception were extensions of the sorts of questions I got asked during the Q&A panel. Lots of people seemed to be excited about applying the insights from our paper to increase the widespread adoption of security tools in their own sub-organizations.

Sadly, I couldn’t stick around very long. My adviser and I had a flight to catch in a couple of hours, and we had to drive back to the airport. Before I left, though, I did get to eat some NSA cake:

mmmm…NSA cake

Concluding Remarks

Overall, I am deeply honored and happy to have been recognized as one of the honorable mention authors for this award. My experience at the NSA was short and very restricted, but from what I could tell, the security researchers there were not too different from the ones I had met in industry and academia. They were concerned about the current state of security technology and were generally curious and open-minded about our ideas.

If you have any questions, feel free to leave a comment or tweet at me. I’ll answer what I can.

One final note: If you liked this post and would like to show your support, here are two things you can do:
– Follow me on Twitter @scyrusk; and,
– Consider signing up for my mailing list below. I won’t spam, you can unsubscribe at any time, and you’ll get some content that I don’t post on the blog.


If you do either or both of those things, you’d make me happy. Thanks!

Footnotes

[1] I have a T-mobile plan that gives me unlimited (though slowish) data throughout most of Europe, as well as unlimited SMS messaging. Phone calls are $0.20 a minute.

[2] If this seems weird to you, consider: we do not yet have a way to formally guarantee that an implementation is “secure”. The conventional wisdom is that you build a thing that is supposed to be “secure” and then let the broader community vet it by poking holes in its implementation. The longer the thing has been around, the more holes have been found and patched. Accordingly, newer technology is almost always less thoroughly vetted than older technology.

[3] Clearly this person did not articulate the point for better communication very well.

[4] The certificate was cool, but I later realized that the challenge coin was even cooler. Shortly after receiving it, I heard the 99% Invisible podcast episode about challenge coins and was immediately really excited.

[5] In my experience, presentations always go more smoothly during the real thing than they do in practice runs, and more smoothly than you imagine they will.