Grassroots digital rights organizing has many faces, including that of hands-on hardware hacking in an Ivy League institution. Yale Privacy Lab is a member of the Electronic Frontier Alliance, a network of community and student groups advocating for digital rights in local communities. For Yale Privacy Lab, activism means taking the academic principles behind Internet security and privacy out of the classroom and into the real world, one hacking tutorial or digital self-defense workshop at a time.

Yale Privacy Lab is an initiative of Yale Law School’s Information Society Project—which concerns itself with digital freedom, policy, and regulation—and serves as the project’s practical implementation arm. We interviewed founding member Sean O’Brien and Cyber Fellow and researcher Laurin Weissinger about their work empowering the next generation of digital rights defenders and their advice for those wishing to follow their example.

Yale Privacy Lab has been going strong since 2017. Tell us a bit about your origin story.

Sean O’Brien: Privacy Lab grew out of workshops that we were already doing for the law school. I had been doing them in New Haven for a while for some activist groups and then got involved in the law school after some people put me in touch with the Information Society Project. They were very enthusiastic. We did some things like Software Freedom Day with the Free Software Foundation, and we hooked up with a local makerspace here, MakeHaven, which gives us a nice, fun, technical setting to be doing those kinds of events.

As Yale Privacy Lab continues to evolve and expand, what are some of your latest endeavors?

SO: We do digital self-defense workshops, and we also do some fun things like taking photos of surveillance devices and mapping those around New Haven. Our role internally is to help give advice and be a resource for law students, faculty, scholars, and anyone who is involved in the legal clinics. Our legal clinics at Yale Law School do a lot of high-profile work, so there’s substantial interest in private and secure communications and a real need for them.

Our digital self-defense workshops are similar to the kind of information EFF has in the Surveillance Self-Defense guide, but a little different. Our take on it is very much shaped by the individuals who come in. We try not to do the standard cryptoparty thing, which is to make sure you cover a certain five tools. We’ll cover whatever needs to be covered based on who’s actually at the workshop. [Editor’s note: This approach fits the guidance included in EFF’s Security Education Companion guide.]

Laurin Weissinger: We’re teaching a cybersecurity class here at Yale Law School for JD and LL.M. candidates.

In the class, everyone gets a microcomputer, and we run Kali Linux, a hacking-friendly distribution. It’s all about students actually understanding how computers and software work—for example, how privileges and permissions are used in a Linux or Unix system. We did things like show them how network traffic can be intercepted easily and how unencrypted traffic can be read by third parties. All of this comes down to empowering students.
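The privilege model described above can be illustrated with a short Python sketch. This is only an illustration of the Unix permission bits, not part of the course materials; the file path is a temporary placeholder:

```python
import os
import stat
import tempfile

# Create a scratch file and restrict it to owner read/write only (mode 600),
# the usual setting for private keys and other sensitive material.
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, 0o600)

# Read back the permission bits and check who can access the file.
mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))                  # 0o600
print(bool(mode & stat.S_IRGRP))  # False: group members cannot read it
print(bool(mode & stat.S_IROTH))  # False: neither can anyone else

os.remove(path)
```

The same bits are what `ls -l` renders as `-rw-------`, which makes the abstraction concrete for students who have never thought about who can read their files.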

What’s good about a lot of these technologies—FreedomBox, for example—is that they are free or relatively cheap and enable hands-on learning. If something breaks, it’s not the end of the world. At the same time, they are state-of-the-art privacy-enhancing technology.

Another example is running Tor nodes, which is very interesting for instruction because we can show, for example, that all Tor traffic is encrypted. We can sniff it here on the network to demonstrate how it works, and at the same time we are running relatively cheap hardware the students are familiar with.

How much institutional support does the group receive to run these projects?

SO: We are currently a volunteer-driven initiative, and have been since the start. That means we don’t get any direct funding from the school or any grants, at least at the moment. We do get support for infrastructure—things like printing and event hosting—and those things are obviously not cheap. All of that is coming from the Information Society Project, inside the law school.

We have also benefited from connections made through the Electronic Frontier Alliance. That’s allowed us to reach out to other Internet freedom, anti-censorship, and privacy groups here on the East Coast and elsewhere to get ideas and collaborate. The free and open-source software movement has been huge in that direction as well. The Software Freedom Law Center has sent folks down to do presentations for us, and they have a close connection to the FreedomBox Foundation, which is where we get real-life support for the devices we’re setting up.

Beyond that, we have the great help of the librarians at Yale and Yale Law School, specifically. Early on, we did a bunch of presentations for the law librarians, and they were very concerned, as librarians tend to be, about the privacy of their patrons. So they set up Tor Browser on every single computer they have there, and they encourage patrons to use it. They came up with a training for that and all the documentation they would need for their use cases.

How did you get Yale to agree to the Tor nodes? Do you have advice for other groups on that process?

SO: From my perspective, the first thing that needs to be done is getting people to use Tor Browser. That removes some of the stigma around Tor use in general.

From a technical standpoint, the reason we latched onto FreedomBox is that it’s the easiest way we’ve found to graphically set up a Tor relay that also acts as a bridge, and it configures an onion service for you. It’s basically a five-minute installation once you get the hang of it. We’ve done workshops just recently where we’ll have a bunch of people in the room install this on virtual machines. So if you’re going to have a problem at an institution because they don’t want you to set up physical devices on their network, you can get people at a workshop to do this instead.
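For readers without a FreedomBox, roughly the same setup can be sketched by hand in Tor’s torrc configuration file. This is a minimal sketch only—the nickname, contact address, directory, and ports are placeholders, and a FreedomBox handles these details through its graphical interface:

```
# /etc/tor/torrc — minimal bridge relay with an onion service
ORPort 9001                  # accept relay connections
BridgeRelay 1                # act as a bridge rather than a publicly listed relay
ExitRelay 0                  # never route traffic out to the open Internet
Nickname ExampleBridge       # placeholder name
ContactInfo admin@example.org

# Onion service forwarding port 80 to a local web server on port 8080
HiddenServiceDir /var/lib/tor/example_service/
HiddenServicePort 80 127.0.0.1:8080
```

After restarting Tor, the generated hostname file in HiddenServiceDir contains the service’s .onion address.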

LW: In the cybersecurity class I did speak about the criminal aspects: why criminals would move to the Tor network, and so on. We underline the fact that this is just a rational move by criminals. If you want to, for example, host a forum where illegal stuff can be bought and sold, you would use the most privacy-enhancing technology available, which is Tor. At the same time, there are also a ton of illegitimate websites on what I’ll call the clear web. We know that any technology that exists will be used for illegitimate, criminal, or morally problematic purposes. If criminals are using things like the Tor network, email encryption, secure messaging, etc., it means these technologies appear to offer some level of protection.

Do you have any resources that could help others who are interested in leading these kinds of projects?

SO: The main resource that we’ve been using for our workshops is called Citizen FOSS (free and open-source software). It’s a play on Citizenfour, the handle Edward Snowden used when he was in his operational phase. What we try to do is take the actual software—and sometimes the operational concepts—Ed Snowden used and apply them as often as possible. The guide is very long, and we remix it for workshops on an ad-hoc basis depending on what the participants are interested in.

What do you see as Yale Privacy Lab’s role in the wider community?

SO: From the start, we’ve been very adamant about engaging with the New Haven community. Obviously, if you want to be serious about this kind of anti-surveillance, anti-censorship work, you have to engage with the world physically around you. So we always try to make sure the workshops are available to the community and that the resources are all Creative Commons licensed and available to as many people as possible.

I think reaching out to locals in the area has been a big part of the success. It gives us a grounding outside of what can sometimes be a stuffy Ivy League setting. It’s great that the law school allows us to use their facilities, but we want to have environments as well that are welcoming and creative and have more of a community feel. Philosophically and politically, we aren’t trying to do things that concern Yale only, and we also understand that Yale itself has a role in things like local surveillance.

What advice do you have for groups who want to help their institutions migrate to new software?

SO: We have a very strong dedication to free and open-source software, and we have never suggested the use of software that carries a monetary cost. As everyone knows, the thing about FOSS is that you’re not tying yourself to a technology with some big licensing cost. In our case at Yale—an institution that pays for a lot of software—you’re also not tying yourself to the institutional procurement process behind that.

The thing about free software is not just that it gives you the freedom to remix and modify code; it’s also better from a privacy and security standpoint because it can be audited. We let people know that we care about this thing called licensing, but that we care from a privacy and security standpoint. It’s a basic truism that the availability of the source code—the ability for security experts to read it—makes it hard to hide malware in there. I would say, if you’re interested in growing quickly, getting your stuff out there, and being able to do it without a ton of people looking over your shoulder, use FOSS tools.

Focusing on communication needs is really important. In our case, the law school clinics are always talking to at-risk users, so convincing people at the law school and at these clinics that they need this stuff is not very hard. In some other areas, it might be more basic. It might be, “Do you want to get away from your stalker ex-boyfriend? Do you want to not get all your banking information stolen when you’re working at a cafe?” Those might be bigger selling points.

The other thing is diplomacy. It’s important to understand that others’ concerns are rooted in the institutional norms those workers are used to, rather than reflexively telling them to go screw themselves.

LW: My first tip comes from my cybersecurity perspective: do not run just anything. Run the industry standard, particularly when it comes to crypto. In most cases, that will also be open-source. Rely on projects that you know are open-source, audited, and used in industry, which means errors and bugs are more likely to be found than in random software. It is not just about privacy; it’s also about your security.

Finally, get some support to run your infrastructure in a way that does not break the network of the institution you work with. Make sure you have as little impact on them as possible, and you will be far more likely to get good institutional buy-in.

The most effective digital rights advocacy is tailored to the skill sets of the organizers and responsive to the needs of the communities they serve. Yale Privacy Lab demonstrates how to build a campus community around digital rights activism, and offers an excellent example to emulate elsewhere.