When the idea for postcards was first put forward in Germany in the 19th century, the then kaiser rejected it because of concerns about privacy.

It seemed quite wrong to suggest that people might write about their personal lives on an unsealed piece of card and send it through the public postal system, where every postal worker who handled it would have access to the sender's views about the weather and the intended recipient.

New technologies often have the effect of surprising society. The postcard was no exception. To the astonishment of sceptics, it turned out that the public were quite willing to disclose highly personal information on a simple piece of card, and trust that the postman would not be interested in their affairs.

Compare this scenario to the recent arrival of two technological innovations: Gmail and care.data. Gmail is Google's free email service. When it was launched, the company took the highly original step of getting computers to scan the text of messages, identify key words and phrases, infer the sender's likely interests and display relevant advertisements in their browser. The howls of public protest at this infringement of privacy were so loud that Google backed down and withdrew the scanning system.

Care.data is part of the NHS's plan to bring together data from GP practices, hospitals and other healthcare providers. The aim is to get an overview of how the service is affecting people's health and, in turn, to improve performance across the sector.

Care.data ran into a volley of privacy accusations as experts pointed out that, even when data has been anonymised, a malicious hacker might be able to piece together evidence to identify an individual and thereby discover information about them from their health record.

So will care.data be forced to rethink? Comparing the success of the postcard with the U-turn by Google can elucidate some of the factors likely to affect the outcome.

The first and most obvious issue is the level of benefit compared to the level of risk. Postcards are very convenient – far less hassle than a sealed letter. They are worth the risk. In contrast, Gmail offered little benefit over competing email systems that did not read your messages.

Care.data scores well on the balance of benefits and risks. The benefits of the postcard pale into insignificance compared with the benefits of using health data to discover better treatments and improve our health. Many of the commentators defending care.data stuck to the argument that "the benefits are huge, the risks are small, it's the right thing to do".

But there is another issue here: the degree of trust and control the user has over the system. There is little mystery about how postcards work. The user has a clear grasp of the risks and has complete control over how much information they put on the card.

In theory, the idea of a computer reading your correspondence ought to be less troubling than the idea of your postman reading your correspondence. In reality, the computer is much more disturbing.

Most of us probably don't understand how it might go wrong, or what the risks are – but that is precisely the problem. Because we don't understand the dangers, we cannot get comfortable with them. How much people trust and understand the system is crucial.

On this measure, care.data has some vital work to do. It is important because the technology is an essential part of being able to provide the standards of healthcare that we aspire to. The speed and accuracy with which we can identify what cures people and what is causing them distress is transformed by access to large volumes of machine-readable health records. The prize that care.data offers is far too great to forfeit because of fears about computers and anonymisation.

An essential part of ensuring that people are comfortable with such systems is giving them more control. Allowing people to opt out of the scheme is important. But everyone who opts out weakens the ability of the NHS to manage care effectively. This is the least desirable outcome.

Equally important is giving people greater control over the underlying data: allowing them to see the information, correct it and use it themselves if they wish. That is essential if we are to succeed in encouraging people to allow their data to be used to create knowledge that will benefit all.