The first half of Carr's response engages an earlier piece by Tarnoff and another by Evgeny Morozov that take for granted the data mining metaphor and deploy it in an argument for public ownership of data.

Carr is chiefly concerned with the mining metaphor and how it shapes our understanding of the problem. If Facebook, Google, etc. are mining our data, that in turn suggests something about our role in the process: it casts the human being as raw material. Carr suggests we consider another metaphor, not very felicitous either, as he notes, that of the factory. We are not raw material, we are producers: we produce data by our actions. Here's the difference:

"The factory metaphor makes clear what the mining metaphor obscures: We work for the Facebooks and Googles of the world, and the work we do is increasingly indistinguishable from the lives we lead. The questions we need to grapple with are political and economic, to be sure. But they are also personal, ethical, and philosophical."

This then leads Carr into a discussion of the Weigel/Tarnoff piece, which is itself a brief against the work of the new tech humanists.

Carr's whole discussion is worth reading, but here are two selections that were especially well put. First:

"But Tarnoff and Weigel’s suggestion is the opposite of the truth when it comes to the broader humanist tradition in technology theory and criticism. It is the thinkers in that tradition — Mumford, Arendt, Ellul, McLuhan, Postman, Turkle, and many others — who have taught us how deeply and subtly technology is entwined with human history, human society, and human behavior, and how our entanglement with technology can produce effects, often unforeseen and sometimes hidden, that may run counter to our interests, however we choose to define those interests.

Though any cultural criticism will entail the expression of values — that’s what gives it bite — the thrust of the humanist critique of technology is not to impose a particular way of life on us but rather to give us the perspective, understanding, and know-how necessary to make our own informed choices about the tools and technologies we use and the way we design and employ them. By helping us to see the force of technology clearly and resist it when necessary, the humanist tradition expands our personal and social agency rather than constricting it."

And:

"Nationalizing collective stores of personal data is an idea worthy of consideration and debate. But it raises a host of hard questions. In shifting ownership and control of exhaustive behavioral data to the government, what kind of abuses do we risk? It seems at least a little disconcerting to see the idea raised at a time when authoritarian movements and regimes are on the rise. If we end up trading a surveillance economy for a surveillance state, we’ve done ourselves no favors.

But let’s assume that our vast data collective is secure, well managed, and put to purely democratic ends. The shift of data ownership from the private to the public sector may well succeed in reducing the economic power of Silicon Valley, but what it would also do is reinforce and indeed institutionalize Silicon Valley’s computationalist ideology, with its foundational, Taylorist belief that, at a personal and collective level, humanity can and should be optimized through better programming. The ethos and incentives of constant surveillance would become even more deeply embedded in our lives, as we take on the roles of both the watched and the watcher. Consumer, track thyself! And, even with such a shift in ownership, we’d still confront the fraught issues of design, manipulation, and agency."

The discontents of humanism (variously understood), the emergence of technopoly (as Neil Postman characterized the present techno-social configuration), and the modern political order are deeply intertwined. Humanism, of course, is a complex and contested term; it can be understood in countless ways. There is more affinity than is usually acknowledged between anti-Humanism, understood as an opposition to a narrow and totalizing understanding of the human, and anti-humanism, as exemplified by the misanthropic visions of the transhumanists and their Silicon Valley acolytes.

If we are incapable of even a humble affirmation of our humanness, then we leave ourselves open to the worst depredations of the technological order and of those who stand to profit most from it.