Acquiring knowledge and deploying it to change the world through technological innovations can inspire great confidence and self-certainty in the person who possesses the knowledge. And yet, the confidence and self-certainty are nearly always misplaced — a product of the knower presuming that his expert knowledge of one aspect of reality applies equally to others. That's one powerful reason why myths about the place of knowledge in human life so often teach lessons about hubris and its dire social, cultural, and political consequences.

Franklin Foer's important new book, World Without Mind: The Existential Threat of Big Tech, is best seen as a modern-day journalistic retelling of one of those old cautionary tales about human folly. Though he doesn't describe his aim in quite this way, Foer sets out to expose the foolishness and arrogance that permeates the culture of Silicon Valley and that, through its wondrous technological innovations, threatens unintentionally to wreak civilizational havoc on us all.

It's undeniable that Silicon Valley's greatest innovators know an awful lot. Google is an incredibly powerful tool for organizing information — one to which no previous generation of human beings could have imagined having easy and free access, let alone devising from scratch, as Larry Page and Sergey Brin managed to do. The same goes for Facebook, which Mark Zuckerberg famously created in his Harvard dorm room and which, in a little more than a decade, has become a global powerhouse, turning him into one of the world's richest men and revolutionizing the way some two billion people around the world consume information and interact with each other.

That's power. That's knowledge.

But knowledge of what?

Mostly of how to program computers and deploy algorithms to sort through, organize, cluster, rank, and order vast quantities of data. In the case of Facebook, Zuckerberg obviously also understood something simple but important about how human beings might enjoy interacting online. That's not nothing. Actually, it's a lot. An enormous amount. But it's not everything — or anything remotely close to what Silicon Valley's greatest innovators think it is.

When it comes to human beings — what motivates them, how they interact socially, to what end they organize politically — figures like Page and Zuckerberg know very little. Almost nothing, in fact. And that ignorance has enormous consequences for us all.

You can see the terrible problems of this hubris in the enormously sweeping ambitions of the titans of technology. Page, for instance, seeks to achieve immortality.

Foer explains how Page absorbed ideas from countercultural guru Stewart Brand, futurist Ray Kurzweil, and others to devise a quasi-eschatological vision for Google as a laboratory for artificial intelligence that might one day make it possible for humanity to transcend human limitations altogether, eliminating scarcity, merging with machines, and finally triumphing over mortality itself. Foer traces the roots of this utopianism back to Descartes' model of human subjectivity, which pictures a spiritual mind encased within and controlling an (in principle, separable) mechanical body. If this is an accurate representation of the mind's relation to its bodily host, then why not seek to develop technology that would make it possible to deposit this mind, like so much software, into a much more durable and infinitely repairable and improvable computer? In the process, these devices would be transformed into what Kurzweil has dubbed "spiritual machines" that could, in principle, enable individuals to live on and preserve their identities forever.

The problem with such utopian visions and extravagant hopes is not that they will outstrip our technological prowess. For all I know, the company that almost instantly gathers and ranks information from billions of websites for roughly 40,000 searches every second will someday, perhaps soon, develop the technical capacity to transfer the contents of a human mind into a computer network.

The problem with such a goal is that in succeeding it will inevitably fail. As anyone who reflects on the issue with any care, depth, and rigor comes to understand, the Cartesian vision of the mind is a fiction, a fairy tale. Our experience of being alive, of being-in-the-world, is thoroughly permeated and shaped by the sensations, needs, desires, and fears that come to us by way of our bodies, just as our opinions of right and wrong, better and worse, noble and base, and just and unjust are formed by rudimentary reflections on our own good, which is always wrapped up with our perception of the good of our physical bodies.

Even if it were possible to transfer our minds — our memories, the content of our thoughts — into a machine, the indelible texture of conscious human experience would be flattened beyond recognition. Without a body and its needs, desires, vulnerabilities, and fear of injury and death, we would no longer experience a world of meaning, gravity, concern, and care — for ourselves or others. Which also means that Page's own relentless drive to innovate technologically — which may well be the single attribute that most distinguishes him as an individual — would vanish without a trace the moment he realized his goal of using technological innovations to achieve immortality.

An immortal Larry Page would no longer be Larry Page.

Zuckerberg's very different effort to overcome human limits displays a similar obliviousness to the character of human experience, in this case political life — and it ends with a similar paradox.

Rather than simply providing Facebook's users with a platform for socializing and sharing photos, Zuckerberg's company has developed intricate algorithms for distributing information in each user's "news feed," turning it into a "personalized newspaper," with the content (including advertisements) precisely calibrated to his or her particular interests, tastes, opinions, and commitments. The idea was to build community and bring people together through the sharing and dissemination of information. The result has been close to the opposite.

As Facebook's algorithms have become more sophisticated, they have gotten better and better at giving users information that resembles information they have previously liked or shared with their friends. That has produced an astonishing degree of reinforcement of pre-existing habits and opinions. If you're a liberal, you're now likely only to see liberal opinions on Facebook. If you're conservative, you'll only see conservative opinions. And if you're inclined to give credence to conspiracy theories, you'll see plenty of those.

And maybe not just if you favor conspiracy theories. As we've learned since the 2016 election, it's possible for outside actors (like foreign intelligence services, for example) to game the system by promoting or sponsoring fake or inflammatory stories that get disseminated and promoted among like-minded or sympathetic segments of the electorate.

Facebook may be the most effective echo chamber ever devised, precisely because there's potentially a personalized chamber for every single person on the planet.

What began with a hope of bringing the country and the world together has in a little over a decade become one of the most potent sources of division in a deeply divided time.

And on it goes, with each company and technology platform producing its own graveyards full of unintended consequences. Facebook disseminates journalism widely but ends up promoting vacuous and sometimes politically pernicious clickbait. Google works to make information (including the content of books) freely available to all but in the process dismantles the infrastructure that was constructed to make it possible for people to write for a living. Twitter gives a megaphone to everyone who opens an account but ends up amplifying the voice of a demagogue-charlatan above everyone else, helping to propel him all the way to the White House.

Foer ends his book on an optimistic note, offering practical suggestions for pushing back against the ideological and technological influence of Silicon Valley on our lives. Most of them are worthwhile. But the lesson I took from the book is that the challenge we face may defy any simple solution. It's a product, after all, of the age-old human temptation toward arrogance or pride — only now inflated by the magnitude of our undeniable technological achievements. How difficult it must be for our techno-visionaries to accept that they know far less than they'd like to believe.