Trust is a vital aspect of every friendship, every family, every society. When you and another person trust each other, you’ve worked out that your interests are suitably aligned. You both believe the other will behave in ways that ‘look out’ for the two of you, that serve you both well.

Trust supports our interactions as social animals. We’ve evolved to look for clues that tell us how trustworthy another might be, and to explore ways to test and build that trust without really thinking about it. We end up with:

You trust someone else to do X.

What does trust mean in technological terms?

You trust Technology to do X.

And this, it turns out, is a lot more complicated.

You must trust the organization that makes the technology to do X. Yet the organization is not another person whose social cues you can read for trustworthiness. You never get to meet the organization, or more precisely any person who meaningfully represents it.

There is no mutual agreement. You must accept the organization’s terms and conditions, or you don’t get to use the technology at all. And the organization is not compelled to tell you about its motivations or values, or indeed to live up to those it does communicate.

And your problems don’t stop there.

As information technologies intermingle and interoperate, it’s not always obvious how you might spot behaviours that betray your trust, or trace any you do identify back to a given technology and therefore a given organization.

You end up needing an intermediary expert to indicate the technology’s trustworthiness and the organization’s trustworthiness in developing and sometimes maintaining and operating it. In other words, you now need to trust a third-party expert. The Digital Life Collective will be the expert as soon as we co-operate to make this happen.

For example, we know that when technology is exposed to others’ surveillance and manipulation, our trust in it erodes, and we’re left not knowing what can be trusted. You may have seen this playing out in recent times under the names of fake news, filter bubbles, and automated profiling.

We know that if we cannot trust our tech it’s increasingly difficult to trust ourselves. And if we cannot trust ourselves, we cannot trust each other.

...

The Digital Life Collective researches, develops, funds and supports Tech We Trust: technologies that prioritize our autonomy, privacy and dignity. Our tech, not their tech.

The norm today is Tech We Don’t Trust. Every time we engage with digital technologies, we should ask – what’s actually going on here? – and too often the answer is elusive.

We cannot wait for the invisible hand of the market to help solve this. If anything, it seems to guide many companies in the opposite direction for now. We cannot wait for governments to solve this either. In short, we need to co-operate. And that’s why we exist, why we are incorporated as a co-operative, and why we need you to join us. Find out more at www.diglife.com.