This article was written as a guest post for Digital Freedom Fund's blog. It's a reflection on one of the sessions of their brainstorm “Future-Proofing our Digital Rights” in September.

Trapped in a reactive modus

When fighting for the protection of civil rights in the digital domain, it is easy to get trapped in a reactive modus operandi. Digital rights organisations are underfunded and overburdened. At any given time, there is a surplus of legislative proposals delivering yet another attack on freedom of speech or interfering with the right to a private life. And while online information and communication technologies/services increasingly function as intermediaries for all that we do in life, human rights are rarely at the heart of their policies.

Only through speaking with, instead of about, at-risk communities will we be able to really understand the full impact of technology

Breaking free from a toxic trap

Being forced into a reactive mode comes with consequences: it is rare that civil society is able to set its own agenda in advance. As a result, when legislative proposals lack vision and are fragmentary, so is civil society’s response. To break free from this toxic trap, brainstorming sessions such as the Digital Freedom Fund’s workshop “Future-Proofing our Digital Rights” are of vital strategic importance. They allow digital rights activists and experts to step away from their daily realities and start envisioning the future they want and what it will take to get there.

Insufficient attention to the marginalised

A reactive approach also means that many of the organisations focused on general privacy or free speech concerns (such as Bits of Freedom) are unable to pay sufficient attention to issues from the perspective of vulnerable or marginalised groups in society. Their voices are suppressed in an environment that is not interested in their needs. When digital rights organisations are caught up in fighting for the privacy or free speech rights of all internet users, they can easily miss or overlook the perspectives of these particular groups. This becomes even more painful when you consider that these citizens are the first to experience the negative consequences of legislation that does not respect digital rights.

What design allows for all groups of people, regardless of their position in society, to be included?

Imagining a more diverse and inclusive future

One of the sessions at DFF’s workshop asked participants to imagine a more diverse and inclusive future and how technology plays a role in this future. A complex topic with a multitude of nuances, all participants agreed. What does such a future look like? How do we prevent the creation of an environment where people feel free to harass and abuse others, causing them to stay away? How does that digital environment differ from the public space in the real world? What design allows for all groups of people, regardless of their position in society, to be included? Participants not only found it difficult to envision such a future in light of current trends, but also found it hard to identify a working strategy. Should we focus on mitigating the risk of people being targeted, should we help them empower themselves, or both? And how do we educate those who are unaware about the issues faced by minorities and vulnerable groups?

Speaking with, instead of speaking about

A necessary and obvious, yet easy to ignore, approach to the envisioned future is to collaborate with activist groups working on topics such as sexism and racism, as well as those that work on feminist causes. Only through speaking with, instead of about, at-risk communities will digital rights activists be able to really understand how the architecture of our digital environment, company policies and legislative proposals affect the rights of those that are impacted most frequently and most severely. And only through these collaborations will we be able to adequately demonstrate the urgency of the issues that need to be addressed.

Digital rights organisations are underfunded and overburdened

We need to redistribute power

Another outcome of the brainstorm: we need to rethink how, as a society, we deal with hateful speech online. On the one hand, participants felt we need to fight the so-called filter bubbles, in which users are presented only with information that aligns with their existing views and are prevented from being confronted with contrasting ones. On the other hand, some recognised the need to empower users to make their own decisions. Right now, important decisions on what one is allowed to publish and what one will see are in the hands of corporations whose incentives hardly line up with those of their users – and especially not with those of marginalised and vulnerable groups.