Ashlin Lee at Australia's Lifehacker site argues that issues like data retention and government surveillance are just the tip of the iceberg of worries the public should have over Big Data, and he helpfully assembles a linked bibliography of resources for exploring the problems he highlights:

In reality, people are leaving behind a whole "emerging ecosystem of digital traces, fragments and identifiers that are created as a part of digitally-mediated social interactions." In The Coming Crisis of Empirical Sociology, sociologists Mike Savage and Roger Burrows describe this as a growing array of digital traces flowing from "transactional data," born of the routine transactions and interactions of a modern society. Savage and Burrows also note that sociology's core methodology of surveys and interviews is likely to be challenged as the authoritative representation of the social once big data resources are available.

While many promise great social advances from big data, others are highlighting how the private control of these databases is birthing whole new issues of inequality:

Mark Andrejevic, in Big Data, Big Questions | The Big Data Divide, highlights "the asymmetric relationship between those who collect, store, and mine large quantities of data, and those whom data collection targets." This deepens power imbalances, especially since most of the public cannot anticipate how such data will later be used to target and sort people for private corporate goals.

Frank Pasquale in The Black Box Society emphasizes how much social control we are delegating to data-driven systems of algorithms, and the negative social impacts of that delegation.

Our ignorance of these algorithms leaves us vulnerable to a whole range of discrimination and surveillance that we are usually neither aware of nor able to fully anticipate.

Mark Burdon and Paul Harpur lay out how workplace discrimination law is being upended by big data practices that "challenge the very basis of our anti-discrimination and privacy laws," since "it is often impossible to connect discrimination to the inequalities that flow from data analytics...Establishing a link between a protected attribute and a big data discriminatory practice is likely to be evidentially insurmountable."

José van Dijck, in Datafication, dataism and dataveillance, warns that we are entering a dangerous new world in which such discrimination is carried out by algorithms and machines often acting without human oversight.

As people collect and share ever more data about themselves, questions arise about how that data is being handled and what protections people need.

Kate Crawford in When Fitbit Is the Expert Witness details how self-tracking may increasingly play a role in litigation and people may find their personal devices "used against you in court...wearables data could just as easily be used by insurers to deny disability claims, or by prosecutors seeking a rich source of self-incriminating evidence."

Parmy Olson in Wearable Tech Is Plugging Into Health Insurance says such wearables will "play a bigger role in how individual-and-group health insurance costs are decided." The result could be a two-tier system where those with the best health tracking devices get access to lower premiums, but with the "risk that data could leak, and be used by marketers peddling diabetes medication or as extra fodder for insurers seeking to deny coverage."

Laying out all these issues, Lee worries that public focus on the most immediate fears around data, such as the metadata retention issue, could distract the public and policymakers from dealing with the longer-term social problems raised by all these writers and thinkers.