What if they don't care? Not because it's "designed" or "rigged" but because privacy is not something they value?

Humans lived in small groups, then villages, pretty much until yesterday: easily 99%+ of our species' ~200k-year history. People knew everything about each other. And then gossiped to make sure nothing went unnoticed.

Contrary to the article, we probably still reveal less than we used to. People would bathe in semi-public places. That's not common outside of vacation spots any more.

Yes, advertisers bank on our nature. Gossip blogs bank on it. But they didn't make us that way.

OK, so the nature of information collection changed but we don't feel it. Some people can rationally appreciate it but not casual Internet users. And most nightmare scenarios are still hypothetical.

Imagine being the subject of public hate because you expressed an unpopular view when you were young.

Many people, even here, are fine with that. On Twitter it's practically a part of regular programming.

I think it's a flaw to bring pre-modern village life into a discussion of the meaning of privacy. For one, in that context what information one shares is apparent, and the impact of that information is more readily understood. This is not the case when large volumes of data are shared with unknowable, powerful parties and used in unmonitored ways, with effects that one cannot predict or control.

Secondly, we do not live in a small community; we live in an incredibly interconnected world. Democracy does not function well when people lose the capacity to moderate, to some degree, access to information about themselves, their thoughts, and their communications, and when other parties may use that information against the individual. It stunts individual expression, and that is pretty fundamental to living in a modern society.

It's incredibly naive to undervalue the need for and importance of privacy. Nightmare scenarios are not hypothetical to a lot of people; they are playing out right now, each and every day.

Someone in a village back then could be pretty sure that someone in another village 500 miles away would not get to hear something they objected to and travel 500 miles to punish them for something the locals might not have any problem with. If a local villager objects, there is a chance of a face-to-face conversation and resolution according to local customs.

I'm not sure, but I think people want privacy more now because they are more concerned about the consequences of faceless people ruining their lives for reasons they don't know or understand, than about actual local privacy.

I live in a UK village, and I want privacy because I am not an American and I don't expect to be subject to US law, which from my POV is a very disturbing and frightening thing. OK, we Brits need to deal with our UK/US relationship at a political level, but even so: I am happy for most of my life to be known to those around me, but I sure as hell don't want some Yank spook having anything on me whatsoever. I assume Americans don't want British spooks having info on them either.

Overall, though, I do think it's new, uncontrolled consequences people are scared of.

Beyond the fact that humans have lived in small groups for almost the entirety of our existence, even in the brief flicker of time where that hasn't been the case, people who could afford to do so have hired domestic workers. In a world with 1.25 billion Facebook users, it's hard to pretend that being unknown is a huge human priority in and of itself.

Generally, the core problem that the more thoughtful people concerned about privacy point to is how certain actors (governments, corporations, doxing mobs) are being empowered at the expense of individuals. If that is one's concern, it might make sense to get to the root of the problem and work towards limiting the power of these actors directly.

Power imbalance is something that social animals designed to live in small groups would seem to care about deeply.

The point is not whether people are fine with it or not; the point is that if they aren't, they should have the choice to control the things they share. You don't always know what will come back and bite you in the ass after a while. In my country, there has been an active, targeted killing campaign against a particular sect of the mainstream religion. We don't know who is involved, but in conditions like this, even information this trivial can be vital for your security.

While it may be true that the villagers knew everything about each other, the king had no way of knowing their innermost secrets. Heck, he probably didn't know their names, or how many of them there were exactly. The situation is different now: the boundary of locality has been dissolved.

I believe Windows Vista tried to do exactly that: annoy the user with security-related screens all the time. This didn't go well, because people were just clicking "yes" without reading, and at the same time everyone was absolutely frustrated with these crappy alerts.

Yes, good point. That would be traumatising to handle. All I am saying is that it's a debate we need to have, and it cannot be left in the dark corners of the modern web. This is just a proposal which IMHO covers all the bases; what we choose to do with it may change according to local geographical conditions and mindsets.

There is no need to have a debate: the current internet infrastructure does not allow privacy. Even Tor and other tools don't let you stay under the radar. And it is not really clear if there are any better ways. At least I haven't seen any research or proposals that would provide users with better privacy online.

Things do get better. Being gay used to be a major crime (and thus had to be hidden). Now gay marriage is becoming legal in much of the western world. I'm optimistic that we can continue building a more open and tolerant society.

Imagine all the lost opportunities, all the worry, all the steps some people take to conceal the truth. It can't go on forever, and it will become increasingly expensive to keep the privacy you had in the past, simply because technology makes transparency cheap and ubiquitous.

Designing a society that relies on the secrecy of certain information is a recipe for disaster. Passwords, credit cards, etc. It won't be long before we simply can't keep any of these secrets, and we will have to switch to a better identification system.

I've had enough of worrying about what people might think if they encounter the truth. I don't want to lie anymore. I don't want to keep and remember secrets anymore. I don't want to watch each of my steps and hide behind 7 proxies when I surf the web.

We're due for a paradigm change toward transparency, and the earlier the better. Privacy and secrecy only lead to deception and inefficiency.

Just to reiterate, I am saying we should change the mindset with which we handle privacy. They can have you in a hundred different ways without you noticing. All I am saying is that we (as users) should make them care enough to be on our side and not collect info without telling us, with the best UX possible.

Many of these discussions happened in the 1970s as people first became aware that large databases (then often called "databanks") were being built to store lots of personal information, and that information from one database could be combined with information from another via a database join. That raised the specter that information originally collected for one purpose could come to be used for a very different purpose.
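The repurposing concern above can be sketched concretely. The example below (using Python's built-in sqlite3; the table and column names are hypothetical illustrations, not from any real system) shows two databases, each collected for a narrow purpose, linked through a shared identifier to yield a combined profile that neither collection was originally meant to support:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Database 1: records collected for payroll purposes only.
cur.execute("CREATE TABLE payroll (ssn TEXT, name TEXT, salary INTEGER)")
cur.execute("INSERT INTO payroll VALUES ('123-45-6789', 'Alice', 52000)")

# Database 2: records collected for health-insurance purposes only.
cur.execute("CREATE TABLE insurance (ssn TEXT, diagnosis TEXT)")
cur.execute("INSERT INTO insurance VALUES ('123-45-6789', 'diabetes')")

# The join links the two through the shared identifier, combining
# information in a way neither original collection anticipated.
cur.execute("""
    SELECT p.name, p.salary, i.diagnosis
    FROM payroll p JOIN insurance i ON p.ssn = i.ssn
""")
print(cur.fetchall())  # [('Alice', 52000, 'diabetes')]
```

The shared identifier (here a national ID number) is what makes the cross-purpose linkage trivial, which is why identifier reuse across databases was itself a focus of those 1970s discussions.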

An important result of those discussions at the time was the Fair Information Practices, which came out of two U.S. government studies on privacy during the 1970s.

These principles include things that are quite similar to what this article proposes, including notice (of what's being collected), choice (about whether it should be collected), and access (to know what others know about you).

The Fair Information Practices formed the basis for European data protection legislation, which has now been implemented in some form everywhere in Europe as a result of the EU Data Protection Directive and other legal instruments. (Of course the Europeans reformulated it and did not directly enact the original U.S. Fair Information Practices into law.) An interesting consequence of that is that most Europeans, at least in theory, have quite extensive rights against information collection that violates these rules (at least by the private sector).

Many Europeans have been able to exercise these rights in practice to challenge data collection and retention by private companies, to see what the companies know about them, or to demand that companies delete information about them. Some of those examples have been mentioned here on Hacker News; the one that I found the most interesting was when Malte Spitz got his cell phone location records from Deutsche Telekom by exercising his right of access under German data protection law.

Anyway, I think these rights are quite similar to what this article is proposing, so I wanted to point out that there is a long history of similar proposals, and that the idea that technology was taking away people's practical right to control over data about them is something that's been a concern for some decades.

By the way, the United States never enacted a comprehensive data protection law, despite being where the Fair Information Practices were first cooked up. They were never given the force of law in a general way, as they were in Europe; here in the U.S. companies can, in general, collect and use data in ways that would be considered "unfair" elsewhere. The main consideration in the U.S. is that companies can't lie in their privacy policies, but there are few substantive restrictions on the private use and disclosure of data, outside of particular regulated sectors (like credit reporting with FCRA, health care with HIPAA, and education with FERPA). There is extremely strong industry opposition to a generally-applicable data protection law here.

Some sore points about data protection where it did get implemented into law:

① European data protection law is leading to some weird and counterintuitive results, recently including the Google v. AEPD/González case where Google was ordered to remove links to old disparaging (but accurate) information about individuals when users search for their names, based on the idea that Google was "processing" personal data about those individuals in an inappropriate way.

② Data protection often has major loopholes for government collection of information. (Government agencies, including police and spy agencies, very often are subject to privacy and data protection laws, but the application of those laws often means just that those agencies are supposed to deliberate about whether they think what they are doing is OK; if so, they can carry on.)

③ As this article and this discussion seem to suggest, notice and consent have become more difficult where companies expect to use large amounts of personal data routinely. The amount of consenting that users would be asked to do and the frequency with which they are asked to do it could become quite annoying and also decrease the likelihood that users will take the time to understand what they are being asked to consent to. (We can see this to some extent with the cookie notices on European web sites, asking users to consent to being tracked by cookies. Contrary to the mainstream view of web developers, I think cookie tracking is a serious privacy risk that users should still worry about in 2014 and that addressing this risk is pretty important. But we can see that the warnings haven't necessarily made most users better-informed or more cautious about cookie tracking, and many users are probably kind of annoyed that every site they use is warning them about cookies.)

As a result of the last point, I heard a Microsoft executive in a speech say that he thought notice and consent were now obsolete and ought to be rethought. (This statement isn't super-shocking to Americans, who might not even have heard about Fair Information Practices in the first place, but it could have been something of a scandal if he had said it in Europe.)

The executive gave the example of the number of different entities that are receiving user information when a user interacts with a major web site, and the number of different privacy policies that would be applicable to these interactions. He suggested that few users would even read the policy of the site that they're trying to visit, let alone the policies of third parties (that might receive user data as a result of embeds or as a result of business partnerships).

I thought that preventing and discouraging some of those data flows was actually a goal of privacy protection. In fact, a lot of privacy software, including software recently developed by my colleagues, is actively trying to stop them, based on the idea that users don't know about them and that they aren't in the user's interest.