Category Archives: The Glass Cage

We’ve been getting a little lesson in what human-factors boffins call “automation complacency” over the last couple of days. Google apparently made some change to the autosuggest algorithm in Gmail over the weekend, and the program started inserting unusual email addresses into the “To” field of messages. As Business Insider explained, “Instead of auto-completing to the most-used contact when people start typing a name into the ‘To’ field, it seems to be prioritizing contacts that they communicate with less frequently.”

Google quickly acknowledged the problem:

We’re aware of an issue with Gmail and auto-complete and are currently investigating. Apologies for any inconvenience.

The glitch led to a flood of misdirected messages, as people pressed Send without bothering to check the computer’s work. “I got a bunch of emails yesterday that were clearly not meant for me,” blogged venture capitalist Fred Wilson on Monday. Gmail users flocked to Twitter to confess to shooting messages to the wrong people. “My mum just got my VP biz dev’s expense report,” tweeted Pingup CEO Mark Slater. “She was not happy.” Wrote CloudFlare founder Matthew Prince, “It’s become pathological.”

The bug may lie in the machine, but the pathology actually lies in the user. Automation complacency happens all the time when computers take over tasks from people. System operators place so much trust in the software that they start to zone out. They assume that the computer will perform flawlessly in all circumstances. When the computer fails or makes a mistake, the error goes unnoticed and uncorrected — until too late.

Researchers Raja Parasuraman and Dietrich Manzey described the phenomenon in a 2010 article in Human Factors:

Automation complacency — operationally defined as poorer detection of system malfunctions under automation compared with under manual control — is typically found under conditions of multiple-task load, when manual tasks compete with the automated task for the operator’s attention. … Experience and practice do not appear to mitigate automation complacency: Skilled pilots and controllers exhibit the effect, and additional task practice in naive operators does not eliminate complacency. It is possible that specific experience in automation failures may reduce the extent of the effect. Automation complacency can be understood in terms of an attention allocation strategy whereby the operator’s manual tasks are attended to at the expense of the automated task, a strategy that may be driven by initial high trust in the automation.

In the worst cases, automation complacency can result in planes crashing on runways, school buses smashing into overpasses, or cruise ships running aground on sandbars. Sending an email to your mom instead of a colleague seems pretty trivial by comparison. But it’s a symptom of the same ailment, an ailment that we’ll be seeing a lot more of as we rush to hand ever more jobs and chores over to computers.

Last week, Wired‘s Cade Metz gave us a peek into the Facebook Behavior Modification Laboratory, which is more popularly known as the Facebook Artificial Intelligence Research (FAIR) Laboratory. Run by Yann LeCun, an NYU data scientist, the lab is developing a digital assistant that will act as your artificial conscience and censor. Perched on your shoulder like one of those cartoon angels, it will whisper tsk tsk into your ear when your online behavior threatens to step beyond the bounds of propriety.

[LeCun] wants to build a kind of Facebook digital assistant that will, say, recognize when you’re uploading an embarrassingly candid photo of your late-night antics. In a virtual way, he explains, this assistant would tap you on the shoulder and say: “Uh, this is being posted publicly. Are you sure you want your boss and your mother to see this?”

The secret to the technology is an AI technique known as machine learning, a statistical modeling tool through which a computer gains a kind of experiential knowledge of the world. In this case, Facebook would, by monitoring your uploaded words and photos, be able to read your moods and intentions. The company would, for instance, be able to “distinguish between your drunken self and your sober self.” That would enable Facebook to “guide you in directions you may not go on your own.” Says LeCun: “Imagine that you had an intelligent digital assistant which would mediate your interaction with your friends.”
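Metz doesn’t describe Facebook’s actual models, which are proprietary and vastly larger than anything sketched here. But the basic mechanism of the “statistical modeling tool” he invokes can be illustrated with a toy naive Bayes classifier: the program counts which words co-occur with which labels in training examples, then scores new posts against those counts. Everything below — the training phrases, the labels, the function names — is invented for illustration and bears no relation to Facebook’s systems.

```python
# Toy sketch of a statistical classifier in the spirit LeCun describes:
# a naive Bayes model "learns" word frequencies from labeled examples,
# then scores a new post. All training data here is made up.
from collections import Counter
import math

TRAIN = [
    ("omg best night everrr cant feel my face", "drunk"),
    ("lol whos car is this anyway", "drunk"),
    ("quarterly report attached for review", "sober"),
    ("meeting moved to 10am tomorrow", "sober"),
]

def train(examples):
    """Tally word counts per label and overall label counts."""
    word_counts = {"drunk": Counter(), "sober": Counter()}
    label_counts = Counter()
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(text.split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    """Pick the label with the highest posterior log-probability."""
    vocab = {w for counts in word_counts.values() for w in counts}
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        # log prior + summed log likelihoods, with add-one smoothing
        score = math.log(label_counts[label] / sum(label_counts.values()))
        total = sum(word_counts[label].values())
        for w in text.split():
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

wc, lc = train(TRAIN)
print(classify("cant feel my face lol", wc, lc))   # "drunk"
print(classify("report for the meeting", wc, lc))  # "sober"
```

The point of the sketch is the asymmetry of power it makes plain: whoever chooses the training labels decides what counts as your “drunken self.”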

Yes, imagine.

“Look Dave, I can see you’re really upset about this. I honestly think you ought to sit down calmly, take a stress pill, and think things over.”

If and when Facebook perfects its behavior modification algorithms, it would be a fairly trivial exercise to expand their application beyond the realm of shitfaced snapshots. That photo you’re about to post of the protest rally you just marched in? That angry comment about the president? That wild thought that just popped into your mind? You know, maybe those wouldn’t go down so well with the boss.

“And as our senses have gone outside us,” Marshall McLuhan wrote in 1962, while contemplating the ramifications of what he termed a universal, digital nervous system, “Big Brother goes inside.”

This post is an installment in Rough Type’s ongoing series “The Realtime Chronicles,” which began here. A full listing of posts can be found here. Also see: Automating the feels.

From Norbert Wiener’s The Human Use of Human Beings, published in 1950:

Let us remember that the automatic machine, whatever we think of any feelings it may have or may not have, is the precise economic equivalent of slave labor. Any labor which competes with slave labor must accept the economic conditions of slave labor. It is perfectly clear that this will produce an unemployment situation, in comparison with which the present recession and even the depression of the thirties will seem a pleasant joke. This depression will ruin many industries — possibly even the industries which have taken advantage of the new potentialities. However, there is nothing in the industrial tradition which forbids an industrialist to make a sure and quick profit, and to get out before the crash touches him personally.

From Robert H. Macmillan’s Automation: Friend or Foe?, published in 1956:

Once upon a time, a Hindu sage was granted by Heaven the ability to create clay men. When he took earth and water and fashioned little men, they lived and served him. But they grew very quickly, and when they were as large as himself, the sage wrote on their foreheads the word DEAD, and they fell to dust. One day he forgot to write the lethal word on the forehead of a full-grown servant, and when he realized his mistake the servant was too tall: his hand could no longer reach the slave’s forehead. This time it was the clay man that killed the sage.

Is there a warning for us today in this ancient fable?

From Kurt Vonnegut’s Player Piano, published in 1952:

The limousine came to a halt by the end of the bridge, where a large work crew was filling a small chuckhole. The crew had opened a lane for an old Plymouth with a broken headlight, which was coming through from the north side of the river. The limousine waited for the Plymouth to get through, and then proceeded.

The Shah turned to stare at the group through the back window, and then spoke at length.

Doctor Halyard smiled and nodded appreciatively, and awaited a translation.

“The Shah,” said Khashdrahr, “he would like, please, to know who owns these slaves we see all the way up from New York City.”

“Not slaves,” said Halyard, chuckling patronizingly. “Citizens, employed by government. They have same rights as other citizens – free speech, freedom of worship, the right to vote. Before the war, they worked in the Ilium Works, controlling machines, but now machines control themselves much better.”

“And any man who cannot support himself by doing a job better than a machine is employed by the government, either in the Army or the Reconstruction and Reclamation Corps.”

“Aha! Khabu bonanza-pak?”

“Eh?”

“He says, ‘Where does the money come from to pay them?’ ” said Khashdrahr.

“Oh. From taxes on the machines, and taxes on personal incomes. Then the Army and the Reconstruction and Reclamation Corps people put their money back into the system for more products for better living.”

“No Kuppo!” said Halyard vehemently. “The government does not own the machines. They simply tax that part of industry’s income that once went into labor, and redistribute it. Industry is privately owned and managed, and co-ordinated — to prevent the waste of competition — by a committee of leaders from private industry, not politicians. By eliminating human error through machinery, and needless competition through organization, we’ve raised the standard of living of the average man immensely.”

Khashdrahr stopped translating and frowned perplexedly. “Please, this average man, there is no equivalent in our language, I’m afraid.”

We have had a hard time thinking clearly about companies like Google and Facebook because we have never before had to deal with companies like Google and Facebook. They are something new in the world, and they don’t fit neatly into our existing legal and cultural templates. Because they operate at such unimaginable magnitude, carrying out millions of informational transactions every second, we’ve tended to think of them as vast, faceless, dispassionate computers — as information-processing machines that exist outside the realm of human intention and control. That’s a misperception, and a dangerous one.

Modern computers and computer networks enable human judgment to be automated, to be exercised on a vast scale and at a breathtaking pace. But it’s still human judgment. Algorithms are constructed by people, and they reflect the interests, biases, and flaws of their makers. As Google’s founders themselves pointed out many years ago, an information aggregator operated for commercial gain will inevitably be compromised and should always be treated with suspicion. That is certainly true of a search engine that mediates our intellectual explorations; it is even more true of a social network that mediates our personal associations and conversations.

Because algorithms impose on us the interests and biases of others, we have not only a right but an obligation to carefully examine and, when appropriate, judiciously regulate those algorithms. We have a right and an obligation to understand how we, and our information, are being manipulated. To ignore that responsibility, or to shirk it because it raises hard problems, is to grant a small group of people — the kind of people who carried out the Facebook and OKCupid experiments — the power to play with us at their whim.

What algorithms want is what the people who write algorithms want. Appreciating that, and grappling with the implications, strikes me as one of the great challenges now lying before us.

I like it when bands name their tours, like Dylan’s Why Do You Look At Me So Strangely Tour in 1992, or They Might Be Giants’ Don’t Tread on the Cut-up Snake World Tour, also in 1992, or Guided by Voices’ Insects of Rock Tour in 1994.* So I’ve decided to give a name to my upcoming book tour. It’s going to be called The Uncaged Tour. (Actually, the full, official title is The Uncaged Tour of the Americas 2014.)

“There will always be change,” wrote Thomas Friedman in his 2012 column “Average Is Over.” “But the one thing we know for sure is that with each advance in globalization and the I.T. revolution, the best jobs will require workers to have more and better education to make themselves above average.”

Economics professor and blogger Tyler Cowen borrowed Friedman’s title for his most recent book, Average Is Over: Powering America Beyond the Age of the Great Stagnation, but his emphasis, in surveying the opportunities opening up in today’s labor scene, is not exactly on more and better education. “I see marketing as the seminal sector for our future economy,” Cowen writes:

We can expect a lot of job growth in personal services, even if those jobs do not rely very directly on computer power. The more that the high earners pull in, the more people will compete to serve them, sometimes for high wages and sometimes for low wages. This will mean maids, chauffeurs, and gardeners for the high earners, but a lot of the service jobs won’t fall under the service category as traditionally construed. They can be thought of as “creating the customer experience.” Have you ever walked into a restaurant and been greeted by a friendly hostess, and noticed she was very attractive? Have you ever had an assistant bring you coffee before a meeting, touching you on the shoulder before leaving the cup? Have you gone to negotiate a major business deal and been greeted by a mass of smiles and offers of future friendship and collaboration? All of those people are working to make you feel better. They are working at marketing.

I would just like to interject here that I am feeling better.

It sounds a little silly, but making high earners feel better in just about every part of their lives will be a major source of job growth in the future. At some point it is hard to sell more physical stuff to high earners, yet there is usually just a bit more room to make them feel better. Better about the world. Better about themselves. Better about what they have achieved.

Welcome to the mendicancy economy.

Cowen uses a happy metaphor to sketch out the contours of interpersonal competition in this new world:

The more that earnings rise at the upper end of the distribution, the more competition there will be for the attention of the high earners and thus the greater the importance of marketing. If you imagine two wealthy billionaire peers sitting down for lunch, their demands for the attention of the other tend to be roughly equal. After all, each always has a billion dollars (or more) to spend and they don’t need to court each other for favors so much. There is a (rough) parity of attention offered and received. Of course, some billionaires are more important than others, or one billionaire may court another for the purpose of becoming a mega-billionaire, but let’s set that aside.

Compare it to one of those same billionaires riding in a limousine, with open windows, through the streets of Calcutta. A lot of beggars will be competing for the attention of that billionaire, and yet probably the billionaire won’t much need the attention of the beggars. The billionaire may feel overwhelmed by all of these demands, and yet each of these beggars will be trying to find some way to break through and capture but a moment of the billionaire’s attention. This in short is what the contemporary world is like, except the billionaire is the broader class of high earners and the beggars are wealthier than in India.

That’s an awesome analogy, really felicitous, but it has one big flaw. What billionaire is going to drive through Calcutta in a limo with the windows open? I’m sorry, but that’s just nuts.

UPDATE (9/6): Cowen offers an even sunnier speculation today: “It is an interesting question how much that will prove to be the equilibrium more generally, namely the genetic superiority of slaves because they can reap more external investment. After all, capital is more productive today than in times past, so evolution might now produce more slaves.”