A new paper reviews how psychology, biology, and neurology are ganging up on economics to prove that, when it comes to making decisions, people are anything but rational


Daniel McFadden is an economist. But his new paper, "The New Science of Pleasure," shows the many ways economics fails to explain how we make decisions -- and what it can learn from psychology, anthropology, biology, and neurology.

The old economic theory of consumers says that "people should relish choice." And we do. Shopping can be fun, democracy is better than its alternatives, and a diverse and fully stocked grocery store ice cream freezer is quite nearly the closest thing to heaven on earth. But other fields of science tell a more complicated story. First, making a choice is physically exhausting -- literally -- so somebody forced to make a number of decisions in a row is likely to get lazy and dumb. (That's one reason stores place candy near the check-out aisle: They suspect your brain is too zonked to resist.) Second, having too many choices can make us less likely to come to a conclusion. In a famous study of the so-called "paradox of choice," the psychologists Mark Lepper and Sheena Iyengar found that customers presented with six jam varieties were more likely to buy one than customers offered a choice of 24.

If you've read the work of Dan Ariely or Daniel Kahneman, you know exactly how far from perfectly rational we are when faced with a decision. Many of our mistakes stem from a central "availability bias." Our brains are computers, and we like to access recently opened files, even though many decisions require a deep body of information that might require some searching. Cheap example: We remember the first, last, and peak moments of certain experiences. So when we make a choice about how to spend a certain amount of time -- say, by going to Six Flags -- we forget that most of the time at an amusement park is spent waiting around doing nothing. Instead, we remember the thrill of the roller coaster. (This has been previously used to explain why people sometimes go back to disappointing old romantic partners, but that might be for another article.)

The third check against the theory of the rational consumer is the fact that we're social animals. We let our friends and family and tribes do our thinking for us. In a fascinating example, McFadden presents a study that shows Korean peasant women within the same village tend to use the same contraception -- even though there is "substantial, persistent diversity across villages." This pattern could not be explained by income, education, or price. Word-of-mouth explained practically all the difference.

In another corner of the ivory tower (or, more likely, across campus in a glassy lab), neurologists are finding that many of the biases behavioral economists perceive in decision-making start in our brains. "Brain studies indicate that organisms seem to be on a hedonic treadmill, quickly habituating to homeostasis," McFadden writes. In other words, perhaps our preference for the status quo isn't just figuratively in our heads, but also literally sculpted by the hand of evolution inside our brains.

A final example to show how other fields of science are ganging up on classical economics: The popular psychological theory of "hyperbolic discounting" says people don't properly evaluate rewards over time. The theory seeks to explain why many groups -- nappers, procrastinators, Congress -- take rewards now and pain later, over and over again. But neurology suggests that it hardly makes sense to speak of "the brain," in the singular, because it's two very different parts of the brain that process choices for now and later. The choice to delay gratification is mostly processed in the frontal system. But studies show that the choice to do something immediately gratifying is processed in a different system, the limbic system, which is more viscerally connected to our behavior, our "reward pathways," and our feelings of pain and pleasure.

And there's much more. To explain it, here's Daniel McFadden himself. The following transcript of our email conversation has been very lightly edited for clarity.

Let me try to sum up your paper for readers, because it covers a lot of ground. Classical economists used to posit that, since consumers are rational, we make decisions to maximize our pleasure, end of story. But your paper reviews all the ways we know that consumers aren't in fact rational but prone to all sorts of biases and habits that pull us from any strictly rational view of the consumer. Is that alright?

This is a good summary, but I think the final message is that neither the physiology of pleasure nor the methods we use to make choices are as simple or as single-minded as the classical economists thought. A lot of behavior is consistent with pursuit of self-interest, but in novel or ambiguous decision-making environments there is a good chance that our habits will fail us and inconsistencies in the way we process information will undo us.

Choices are good. Trade is good. That's the view of neoclassical consumer theory. But it turns out that people don't really like making decisions. We have habits, we like thinking automatically. So sometimes we avoid making choices altogether because it stresses us out. Why is that? And how might, say, a company use that superior understanding of consumer theory to make consumers behave a certain way?

Trade is a contest, with a chance of coming out on the short end. Animals in "fight or flee" situations often find it safer to flee. Similarly, people in situations where trade is possible, or even promising, may find it safer to turn away. It takes trust to trade. McDonald's is successful because it has created a brand people trust -- they know what to expect. Offers like a "30-day free trial," "satisfaction or your money back," or "bring us a better price and we will refund the difference" are merchants' ways of promoting the idea that they can be trusted, and that the risk of an unsatisfactory trade is low.

Real estate agents take advantage of people's discomfort with decision-making. Since buying a house is highly consequential and difficult to reverse, rational people should look at a great many options and think them through very carefully. A good agent will show you a few houses that are expensive and not very nice, and then one at almost the same price and far nicer. Many buyers will respond by stopping their search and jumping on this bargain. Our susceptibility to "bargains" is one of the cognitive devices we use to simplify choice situations, and one that companies are conscious of when they position their products.

One of the observations that most struck me was "economic choices can make us uncomfortable." That seems like a very powerful idea. How might I see it in my life?

If two "rational" people meet and disagree on the probability of an event (e.g., the AFC team wins the Super Bowl, the price of Google stock goes up), then both can gain by wagering on the event. In the real world, however, wagering is the exception, not the rule. On the one hand, you could say that getting someone to bet on an event, pay attention to the outcome, and finally make the payoff, is too much work. But actually, if you ask people why they don't bet often with their friends, they will simply say that it would make them uncomfortable to do so.

I'm a big procrastinator. Why does that fall under the category of "hyperbolic discounting" rather than "rational way to spend a Sunday afternoon"?

Procrastination is a way of avoiding uncomfortable choices. Hyperbolic discounting seems to be related to our subjective perception of time, and to the way the brain parses current and future pleasure-seeking -- waiting for an hour right now is more painful than our perception of waiting for an hour in the future.

Here is an example of how hyperbolic discounting works: You go to your car dealer seeking a model that has a sound system you want. He says it will take 3 days to get that exact model, but you can drive away right now with one that has a better sound system and costs $300 more. Most buyers will choose to pay a little more and take their new car now. However, if the dealer said that no car is available right now, and he can get the model you want in 33 days, but a model costing $300 more with a better sound system in 30 days, most buyers will choose to wait the 33 days and get the exact model they want. This is hyperbolic discounting at work. Rational consumers with consistent intertemporal evaluation should treat the trade "$300 for an attractive but unneeded accessory versus 3 days" the same whether it is executed right now or executed in 30 days.
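McFadden's car-dealer story can be reproduced with the textbook hyperbolic discount function, V / (1 + k·t). The numbers below (rewards of 100 and 110 "utility units," a discount rate k of 0.1 per day) are illustrative assumptions, not figures from the paper; the sketch only shows that under hyperbolic discounting the ranking of a smaller-sooner versus a larger-later option flips as both options recede into the future, while under exponential (classically rational) discounting the ranking never flips.

```python
# Hyperbolic vs. exponential discounting: a preference-reversal sketch.
# All parameter values are illustrative assumptions, not from McFadden.

def hyperbolic(value, delay_days, k=0.1):
    """Present value under hyperbolic discounting: V / (1 + k*t)."""
    return value / (1 + k * delay_days)

def exponential(value, delay_days, daily_factor=0.97):
    """Present value under exponential (time-consistent) discounting."""
    return value * daily_factor ** delay_days

SOONER, LATER = 100, 110  # smaller reward now(ish) vs. larger reward 3 days later

for t in (0, 30):
    s = hyperbolic(SOONER, t)
    l = hyperbolic(LATER, t + 3)
    choice = "sooner" if s > l else "later"
    print(f"delay {t:2d} days: sooner={s:.1f}, later={l:.1f} -> choose {choice}")

# Exponential discounting ranks the pair identically at every delay,
# because (LATER * f**(t+3)) / (SOONER * f**t) does not depend on t.
```

At a delay of 0 days the hyperbolic agent grabs the sooner option; push both options back a month and the same agent waits for the later one, mirroring the 3-days-versus-33-days car buyers.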

I felt like this sentence at the end of your paper was really important -- "Specialized brain circuitry processes experience in ways that are not necessarily consistent with relentless maximization of hedonic experience" -- but I didn't really understand what it meant.

Our brains seem to operate like committees, assigning some tasks to the limbic system, others to the frontal system. The "switchboard" does not seem to achieve complete, consistent communication between different parts of the brain. Pleasure and pain are experienced in the limbic system, but not on one fixed "utility" or "self-interest" scale. Pleasure and pain have distinct neural pathways, and these pathways adapt quickly to homeostasis, with sensation coming from changes rather than levels. Overall, presumably as a product of evolution, our brains are organized well enough to keep us alive, fed, reproducing, and responsive to but not overwhelmed by sensation, but they are not hedonimeters.

You make the case that humans are social animals more than economic machines, which sounds right to me. So do social networks like Facebook and Twitter help us make better choices?

This is complicated. Social networks are sources of information, on what products are available, what their features are, and how your friends like them. If the information is accurate, this should help you make better choices. On the other hand, it also makes it easier for you to follow the crowd rather than engaging in the due diligence of collecting and evaluating your own information and playing it against your own preferences. On net, the information provided by social networks probably improves choices. The downside is that it may make you a lazy decision-maker. There is also the problem that if social networks encourage herd behavior, they increase the risk of panics and stampedes that lead to market bubbles and instability.
