Digital technologies have ushered in an era of unprecedented mass and individualized surveillance. The inherent tensions between unrestricted data retention and personal privacy have prompted some governments to take action. In the EU, the ‘right to be forgotten’ has been enshrined in law, and the General Data Protection Regulation places limits on how organizations can use individuals’ data. In recent years, revelations such as the Edward Snowden disclosures have trained a spotlight on the ways in which governments (including in the EU) surveil their citizens and human rights defenders without warrant or cause.

Inequality is a central feature of surveillance and privacy in the digital age. As privacy becomes a privilege, protected in the global north, data extraction from the global south is likely to accelerate.

Digital surveillance has both political and socio-economic dimensions. Human rights defenders and political activists are frequently targeted for surveillance. Research by the Tactical Technology Collective revealed that states often compromise the online safety and privacy of human rights defenders, both through daily forms of harassment and through extraordinary forms of intervention. This narrows the space for alternative political visions and projects, including the democratization of the internet.

In addition, women, people of non-heteronormative gender identities, and people of color are often at significant risk of online harassment and bullying. The United Nations Report on Cyber Violence against Women and Girls found that 73% of women have been exposed to, or experienced, some form of online violence. These marginalized communities face discrimination offline, and they often find the internet an equally hostile environment for sharing their views. We need to be aware of the distinct privacy concerns of different communities in order to build an internet where all are welcome and where privacy entails not only the right to non-interference but also the right to safety and security for those at risk. The privacy policies of internet corporations can themselves harm diversity and participation online: Facebook’s ‘Real Names’ policy, for example, adversely affected LGBTQI and Native American communities.

The Questions We Care About

How is privacy connected to privilege in terms of geography, class, race, and gender?

Whose privacy does a right to privacy defend?

Whose security do we value?

How do we realize privacy as more than simply non-interference, but as a positive duty to protect?

How do we bridge the knowledge gap between internet users and data policy makers?

How do we build solidarity networks between internet users whose privacy is at stake, and tech projects committed to the defense of our privacy?