Robert Epstein is Senior Research Psychologist at the American Institute for Behavioral Research and Technology (http://AIBRT.org) and the former editor-in-chief of Psychology Today magazine. He holds a Ph.D. from Harvard University and has published 15 books and more than 200 articles on artificial intelligence and other topics. The views expressed here are solely his own.

When I first started studying psychology as an undergraduate in the early 1970s, a doddering professor of mine told me emphatically that the most important thing a psychologist could study was perception. At the time, I dismissed her as a batty old lady. Over the years, though, I’ve come to believe she wasn’t so batty.

How we see things, and how we interpret what we see, is important – critically so, in fact. Take fear, for example. There is only an approximate relationship between the fear we feel and the actual threats in our environment. Most people would be terrified by the sight of an approaching lion, for example, but an experienced lion tamer might stay perfectly calm, seeing the lion from a different perspective because of special skills and knowledge. Similarly, most people would feel calm at the sight of an approaching Pomeranian (think “Boo” on YouTube), but someone who had been bitten as a child might be frozen with fear.

Perception is everything, and Google is a case in point. About a billion people use Google’s search engine each month to find everything from plastic hangers to plastic surgeons, and, as far as the consumer is concerned, Google is an information company, pure and simple.

But from Google’s perspective – and I don’t mean Google’s PR department, I mean Google’s management – Google is an advertising company. Ninety-seven percent of Google’s revenues, after all, come from advertising.

Because Google tracks every search a consumer makes, Google’s search engine is really just a highly efficient tool for collecting information about consumer behavior – the most efficient and profitable tool for collecting such information ever invented. Over time, Google has gotten even more inventive in devising new ways to collect information about what people buy, believe, like, and dislike: by introducing its own browser and its own mobile operating system, for example, and even, for several years, by having teams of employees drive up and down streets in more than 30 countries extracting personal information from unencrypted Wi-Fi networks while snapping photos for its Street View service.

Early in 2012, Google alarmed privacy advocates and a few members of Congress with its announcement that it was combining consumer data across 60 of its various digital platforms to create dossiers on millions of people so detailed they would have made J. Edgar Hoover himself green with envy. Google’s spin on this – remember, perception is everything – was that it was merely trying to make the user experience “simpler and more understandable.”

There is genius in Google’s methodology, and it all has to do with perception. People who use Google’s products interpret their experience at one level only: Mr. Smith types in “diet pills” and “depression” and “homosexuality,” and Google helps him – free of charge, no less – to find information about those topics. “Thank you, Google!” says Mr. Smith.

That’s all the consumer sees.

But what Google sees is: “Jordan Smith – IP address x, computer ID y, dossier z in our database – is overweight, depressed, and gay and is likely over the next seven days to purchase products and services that will help him lose weight, lift his depression, and make him feel comfortable about his sexual orientation.”

Within hours, many of the generic ads Mr. Smith sees on multiple platforms are now replaced with ads offering him a variety of services related to his most recent searches, but because of the delay between his search activities and the ads, he sees little or no connection between them. Even if he senses the connection, he engages in his next search with no conscious awareness that he is being observed. Instead, his attention is fully occupied by the search activity itself: by the process of formulating search terms, clicking on links, gathering information, and, ultimately, solving the new problem at hand. It’s a cognitively demanding process, requiring considerable attention and producing rapid feedback that further grabs his attention.

Because of the delay and because the search task is so demanding, it completely obscures the stealthy monitoring that’s paying Google’s bills. For all intents and purposes, that monitoring is invisible.
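The two-step loop described above – searches silently folded into a profile, and the profile, not the search itself, later driving the ads – can be caricatured in a few lines of code. This is a deliberately simplified, hypothetical sketch: the category names, ad copy, and matching logic are all invented for illustration and bear no relation to Google’s actual systems.

```python
# Hypothetical illustration of behavioral ad targeting.
# All categories, ads, and matching rules below are invented.

INTEREST_CATEGORIES = {
    "diet pills": "weight_loss",
    "depression": "mental_health",
    "homosexuality": "lgbt_services",
}

ADS_BY_CATEGORY = {
    "weight_loss": "Ad: lose weight fast",
    "mental_health": "Ad: talk to a therapist",
    "lgbt_services": "Ad: local support groups",
}

def build_profile(search_log):
    """Step 1: silently fold raw search terms into inferred interests."""
    return {INTEREST_CATEGORIES[q] for q in search_log if q in INTEREST_CATEGORIES}

def select_ads(profile):
    """Step 2: serve ads from the profile later, so the user sees
    little connection between a search and the ad it triggered."""
    return sorted(ADS_BY_CATEGORY[c] for c in profile)

searches = ["diet pills", "depression", "weather"]
profile = build_profile(searches)
ads = select_ads(profile)
```

The point of the sketch is the indirection: nothing in `select_ads` ever sees the searches themselves, which is exactly why the monitoring stays invisible to Mr. Smith.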

And just as Mr. Smith is only dimly aware that his activities are being monitored, he is also most likely unaware that by setting up any sort of Google account – or even just by using Google’s search engine – he is agreeing to the terms of a 1,682-word contract which in turn incorporates Google’s Privacy Policy. Together, these documents allow Google to store and analyze material he uploads through its services as well as to send him “tailored content” – that is, advertising related to his online activities.

From a business perspective, this method of collecting valuable information is brilliant. It’s a sleight-of-hand routine with a new twist: consumers are grateful for the experience they’re having while their pockets are being picked. Thank you, Google! One could argue, of course – as Google officials have in fact done – that Google isn’t actually cheating anyone; rather, it’s performing a legitimate service by efficiently matching up vendors who are willing to pay with consumers who need the products and services those vendors offer – precisely at the moment those products and services are needed, no less.

Thank you, Google!

But two aspects of this transaction are troubling, one immediate and relatively harmless and the other hypothetical and potentially catastrophic.

The immediate problem is that the transaction is inherently deceitful. The consumer perceives the transaction at one level, Google at another. Google never openly asks for permission to record aspects of the transaction and never openly informs the consumer that aspects of the transaction are being recorded. Compare this to the sensibly regulated world of telephone conversations, where consent to record, stated or implied, is mandatory and where consumers regularly hear messages such as “This conversation is being recorded for quality assurance.”

If Google were required to display such messages every time someone conducted a search (think: “The search you are about to conduct is being monitored and recorded by Google and might be used in the future for advertising or other purposes”), most people would hesitate before conducting searches that might reveal sensitive personal information. Over time, the privacy-preserving proxy industry would probably mushroom in size, which would pose a serious threat to Google’s business model.

The misleading nature of Google’s relationship with consumers is a small issue, however, compared to what could go wrong. As both the New York Times and The Atlantic have reported recently, Google is already providing information about its users to government agencies around the world on a regular basis. What is that information being used for? Not for advertising, presumably.

Hoover knew the potential danger that his dossiers contained, because he himself had used them to coerce. That’s why he made arrangements to have his entire inventory of files destroyed upon his death. Google’s dossiers are in many respects far richer and more detailed than Hoover’s, and, unlike Hoover, Google is constrained only by the demands of the marketplace.

What would happen if Google’s corporate focus became more nefarious than it already is? And what would happen if Google were somehow hacked to its core, or if a disgruntled Google employee sold out to a Chinese conglomerate, or if a very large hard drive were stolen from Google’s headquarters? The larger concern is not about how Google uses its data now but about how millions of detailed dossiers could, in theory at least, be misused in the future to humiliate, manipulate, or coerce. Identity theft would be the least of our worries.

So is Google a benign and helpful information company, or is it a massive advertising agency that spies on consumers and puts our privacy and civil liberties at risk? It’s all a matter of perception.
