Lessons from Lockpicking: Perfect security is a myth; nonfiction is the new security | MIT Center for Civic Media

Becky Hurwitz

Codesign Facilitator and Community Organizer

Becky is the Codesign Facilitator and Community Organizer at the Center. She spends her time with changemakers of many kinds, codesigning tools and methods to leverage media and technology for equitable social change. Prior to joining the Center, she led the SaferMobile project at MobileActive, a program to educate and train activists, journalists, and human rights defenders in mobile phone security. Becky has lived and worked domestically and internationally at the intersections of social justice, technology design and development, and media making. She is particularly dedicated to the demystification of technology and the democratization of technology creation and use. Becky holds a B.S. in Comparative Media Studies from MIT and an M.S. in Information Management and Systems from the UC Berkeley iSchool.

Lessons from Lockpicking: Perfect security is a myth; nonfiction is the new security

"You never pick a lock you don't own or that you haven't been given explicit permission from the owner to pick. Number 2, don't pick locks in use because locks can break when you pick them."
- Schuyler Towne, competitive lock picker and professional security researcher (Schuyler's introduction to lockpicking video)

What is the myth of perfect security?
Schuyler Towne spoke to us in April (live blog post about Schuyler's civic lunch talk: http://civic.mit.edu/blog/mstem/locks-as-social-contract) about the idea of perfect security, which persisted until the early 1700s, when locks were considered unbreakable. Early tests of a lock's strength included leaving a lockpicker alone with it for 30 days, allowing that person any tool or tactic to try to break it. Only locks that survived these kinds of tests went into use, and because of these tests, locks were trusted.

But now any lock can be broken – Schuyler contends that most can be broken within 15 minutes. So locks are a social convention more than an actual mechanical barrier to physical spaces. We lock spaces that we would like to keep private. We choose to respect each other's locked spaces not because we are technically unable to break one another's locks, but as part of a social agreement to respect each other's privacy.

Myths in electronic security
Schuyler talks about physical space and mechanical locks, but we have a similar understanding of privacy and information digitally. Our model of information security and privacy may be founded on the desire for perfect security – keeping our communication and data private to ourselves and the people we allow to access them – but was it ever this way?

Encryption like PGP is hard to break by guessing keys; it's the equivalent of a strong lock. Even with our speediest computers, guessing a key would take 100 million years, according to PGP. That's a while.
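The scale of that claim is easy to check with back-of-the-envelope arithmetic. The sketch below assumes a 128-bit symmetric key and a hypothetical attacker guessing a trillion keys per second – both numbers are illustrative assumptions, not benchmarks – and shows that brute force lands at an absurd number of years:

```python
# Back-of-the-envelope brute-force estimate for a symmetric key.
# KEY_BITS and GUESSES_PER_SECOND are illustrative assumptions.
KEY_BITS = 128
GUESSES_PER_SECOND = 1e12            # assumed: one trillion guesses/sec
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

keyspace = 2 ** KEY_BITS             # total possible keys
avg_guesses = keyspace / 2           # on average, half the keyspace is searched
years = avg_guesses / GUESSES_PER_SECOND / SECONDS_PER_YEAR
print(f"~{years:.1e} years to brute-force on average")
```

Even under these generous assumptions for the attacker, the answer comes out in the quintillions of years – which is why attackers go after the key holder, not the math.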

But it's only as strong as the social practices around it, and as strong as someone's ability not to give up their secret key. Stealing a key, or forcing someone to hand it over, may be a less artful way to break a lock, but it is indeed another way. Attempts in the US to invoke legal rights to keep secret keys secret have been contested (http://en.wikipedia.org/wiki/United_States_v._Boucher).

And most of us don't use encryption like PGP; we use passwords when services require them, and only occasionally for our devices and programs when they don't. Similar to mechanical locks, most electronic lock systems are easy to pick. Our passwords are commonly easy to guess because many of us choose the same ones (http://splashdata.com/splashid/worst-passwords/index.htm) or build them from real words and easily discovered facts about ourselves. There are freely downloadable programs to crack passwords, and if yours is common or based on real words, cracking it would be easy. So the fact that we don't all run around breaking into each other's digital information rests partly on our interest in upholding the social contract of a lock.
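A minimal sketch of why common passwords fall instantly: a dictionary attack simply hashes each entry from a word list and compares it to the stolen hash. The tiny word list and the use of plain SHA-256 here are illustrative assumptions (real crackers use far larger lists and target whatever hash the service actually stored):

```python
# Minimal dictionary-attack sketch: the word list and hash are illustrative.
import hashlib

COMMON_PASSWORDS = ["password", "123456", "qwerty", "letmein", "monkey"]

def crack(target_hash):
    """Hash each common password and compare; return the match, if any."""
    for guess in COMMON_PASSWORDS:
        if hashlib.sha256(guess.encode()).hexdigest() == target_hash:
            return guess
    return None

stolen = hashlib.sha256(b"letmein").hexdigest()
print(crack(stolen))  # prints "letmein"
```

A strong random password never appears in any word list, so this loop never terminates with a match – which is the entire difference between a pickable and an unpickable digital lock.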

But here the metaphor diverges. Unlike property or space to which we own exclusive rights and access, we are not in control of much of our electronic space. An inbox on a service like Gmail or Yahoo is more like a lockbox in a lockbox facility.

Whether it's private or not has more to do with your relationship with the owner of that space and the design of the system than the strength of your key – do you trust the owners of your lockbox not to have a cheat to break into your box? Do you allow them to have a duplicate key? Do you trust that no one will force them to open your box? When you share information using a service, do you trust the carrier? Do you know that your messages will not be intercepted or overheard in some way?

The social contract is less compelling in the case of breaking and entering into electronic spaces because violations are less noticeable. The social contract that we have about not breaking each other's locks and entering private spaces uninvited is partially reinforced by how obvious a break-in would be – if you break the lock on a front door, you have to spend some amount of time at the front door, breaking the lock. If you are accessing electronic information in someone's inbox, no one may ever see you.

Rules around who owns digital information, and the rights to access it, are contested. Which country you are in, which country your data is in, which country the host of your data is in, and your country's surveillance laws are all factors in calculating the probability of “your information” being regarded as your “private” information by a government.

Nonfiction is the new security
So, security experts and advocates work to develop security tools and to teach about their strengths and limitations.

And they ask: how would we live if we had perfect security? What would we feel safe and secure doing? What would we say? How would we act? Because a lock is only strong until someone learns how to pick it, or until a machine is fast enough to break it, and our social contracts are only as strong as our ability to detect that someone is breaking them. It then becomes crucial to advocate for the legality of what we want to be able to do. If you break a social contract and invade someone's privacy only to find that they have behaved legally, all you have done is break the social contract.

[there's an endless list of people and groups that I admire working on this]