Walk into the buff-brick building in New York’s Greenwich Village, ride an elevator to the seventh floor and you’ll enter a neuroscience lab, where scientists study brain processes linked to biases.

Past a waiting area are two rooms: one filled with caps studded with electrodes and black faux leather chairs facing computer monitors, the other with devices to monitor physiological responses.

Inside these two rooms, David Amodio has spent the past eight years at New York University using electroencephalograms, or EEGs, functional magnetic resonance imaging, or fMRI, and other tools to track brain function, searching for clues to curb discrimination.

Research by Amodio and a cadre of other scientists during the past decade points to a troubling fact: Eliminating bias is far more difficult than once thought.

“Bias is a real problem,” said David Rock, director of the NeuroLeadership Institute, an international research association not affiliated with NYU. “It’s much bigger than we realized, and educating people doesn’t seem to do very much.”

Modern bias isn’t the domain of a few bad apples driven by animosity or hate, experts say. It plagues everyone and infects behavior without people consciously knowing it. As science has shed light on automatic biases, employers have introduced training methods to raise awareness of unconscious attitudes and ways to mitigate them.

“In terms of trying to get under the surface — kind of like popping open the hood of your car to see how it actually works — neuroscience has been able to shed light on more specific mechanisms,” Amodio said. “While many people are interested in developing interventions to reduce bias, in order to do that, it’s critical that people understand the underlying processes that give way to bias.”

Unconscious Bias

Since the 1980s, numerous studies have shown that people can act in biased ways despite explicitly believing that prejudice and discrimination are wrong. This dichotomy comes from what social psychologists call implicit — unconscious — biases, the result of pervasive messages that perpetuate stereotypes.

Implicit bias likely is a reason why progress has been slow across sectors and institutions, said Corinne Moss-Racusin, an assistant professor of psychology at Skidmore College in Saratoga Springs, New York.

For example, corporate boards in America remain overwhelmingly male, and fewer than 6 percent of executives are people of color.

“When we look at the numbers of women and people of color who are represented in all sorts of institutions — from our academic institutions to the political forum to the corporate workplace — diversity has increased,” Moss-Racusin said. “But it has increased very slowly. And there are still some stark disparities in areas where we have not been able to make the sort of improvements we might have thought would be possible.”

What implicit bias suggests: Racism, sexism and homophobia can come from people who do not consider themselves racist, sexist or homophobic.

A 2012 study explored the hiring decisions of biology, chemistry and physics faculty members — professions that pride themselves on objectivity. The professors were given applications identical in every way except for the applicant’s sex. They favored the male applicant, “John,” over the female applicant, “Jennifer”: they rated him as more competent, offered him more mentoring and set him a higher starting salary.

Young and old, male and female faculty members reached similar conclusions.

“We’re all fairly equally exposed to these pervasive messages about who is most fit to do science,” said Moss-Racusin, who led the research. “Those stereotypes are very robust. When we imagine a scientist, we imagine a man.”

The concept of implicit bias hit the mainstream in 1998 when an unconscious-bias assessment went online. Since then, more than 6 million people have taken the Implicit Association Test, the result of a collaboration among psychologists at Harvard University, the University of Virginia and the University of Washington.

Its goal: to create awareness about unconscious biases in self-professed egalitarians.

The test gauges unconscious prejudice by measuring the speed of making associations. For example, the test can measure how quickly someone pairs a white face with a positive term and then compare it with how quickly that person pairs a black face with a positive term.

Neuroscience now supports the idea that people weren’t falsely claiming to believe in equality. Neuroimaging shows that decision-making automatically engages specific brain regions responsible for unconscious processing, including the kinds of associations the Implicit Association Test measures.

Insights about which regions are activated could help scientists understand how biased associations are learned and how to counter them.

The amygdala, an almond-shaped set of neurons deep in the temporal lobe, has emerged as a key region in bias research. It is the part of the brain that reacts to fear and threat. Scientists have found a correlation between amygdala activity and implicit racial bias.

The amygdala isn’t the only part of the brain involved in unconscious bias.

Amodio and his colleagues also have found implicit stereotyping associated with the left temporal and frontal lobes. The left temporal lobe is important for storing general information about people and objects, and Amodio said this seems to be an important place for social stereotypes. The medial frontal cortex is important for forming impressions of others, empathy and various forms of reasoning.

And a 2012 study by Columbia University psychologists G. Elliott Wimmer and Daphna Shohamy found that the hippocampus, which forms links between memories such as dates and facts, also subconsciously steers people toward choosing one option over another.

The implications of widespread automatic biases have begun to change diversity programs within corporate America.

At Chubb Corp., it began with a few phone calls. The property and casualty insurer, which has earned praise for its inclusive culture, began receiving calls from managers. They had completed training about microinequities, the small messages that discount talented employees who are perceived as different.

They said, “‘You haven’t dealt with the elephant in the room,’” recalled Sabrina McCoy, assistant vice president and diversity manager.

The “elephant” was bias. They wondered if unconscious bias could explain why some people weren’t as far along in their careers as expected, McCoy said.

They turned to Mahzarin Banaji, a Harvard University social psychologist who uses fMRI to study social attitudes and beliefs, and who is one of the creators of the Implicit Association Test. Banaji spoke to Chubb’s senior leaders and explained unconscious bias from a scientific point of view.

“It takes the judgment off the table,” McCoy said.

The executives found it so valuable that they had all U.S. managers learn core concepts and go through scenarios such as avoiding unconscious bias when reviewing résumés, said Trevor Gandy, Chubb’s senior vice president and chief diversity officer. The company hopes to cascade training to all employees in the coming year.

For now, Chubb’s metric is how many people have been introduced to the idea of unconscious bias, with the expectation that they apply that awareness.

“We know that providing the training and awareness is certainly the first strong step for us,” Gandy said.

Banaji said her goal is to educate people about basic forms of bias that may hinder their judgment.

“I regard awareness to be a singularly important experience because the problem lies in the lack of awareness,” Banaji said in an email. “When good people discover their blind spots, they are inherently motivated to wish to change. I try to make use of that motive to do good and take it one step further — to ask about the extent to which people are willing to doubt their own intuitions.”

PricewaterhouseCoopers also turned to Harvard’s Banaji. The accounting and professional-services firm hopes to accelerate the diversity of its leadership, said Jennifer Allyn, the firm’s managing director of diversity.

Banaji spoke to the top 100 partners of the firm and challenged them to think about some unconscious assumptions made when they say “so and so is not ready yet,” Allyn said.

“It’s something common that people say,” Allyn said. “It’s also a real phenomenon. An individual might not be ready for a particular role.”

‘Mind Bugs’

Before There Was Bird Brain, There Was Lizard Brain

The term “lizard brain” has become pop-culture shorthand for unconscious habits and other instincts that once served humans well, but today may cause problems in the workplace.

“It’s a pop term,” said David Rock, director of the NeuroLeadership Institute.

And it’s popping up in blogs, books and TED talk videos.

The “lizard brain” can cause problems in the workplace, said Terence Burnham, a business professor at Chapman University and author of the book “Mean Markets and Lizard Brains.”

Take email. Humans have good instincts for working face-to-face, Burnham said. But much of business relies on email. There is no reason to believe that a person will know instinctively how to deal with email. People need to create systems to succeed.

The lizard brain also may let people be fooled by others whom they see only a few times, Burnham said. In ancestral settings, people knew everything about everyone else. Mistaken assumptions based on initial interactions were corrected over time. But in the business world, mistaken assumptions can lead to long-lasting commitments. The lizard brain wasn’t designed for slick salesmen rewarded for posturing and burnishing the right image, he said.

Despite books devoted to taming the lizard brain, the phrase itself rests on a fallacy.

Dr. Paul D. MacLean, a researcher at Yale Medical School in the 1950s, developed a theory that the brain has three layers: the reptilian complex, the limbic system and the neocortex. The reptilian complex includes the basal ganglia, which play a key role in forming habits.

MacLean and other scientists of his era thought that the forebrains of reptiles consisted mostly of basal ganglia, a notion later found to be false. In fact, the reptilian complex predates reptiles, with some parts traced back to jawless fish.

Some contemporary authors even use “lizard brain” to refer to fear responses controlled by the amygdala, which isn’t part of the reptilian complex at all. It’s part of the limbic system.

But the term lives on as shorthand for humans’ misguided instincts.

“Your lizard brain is here to stay,” wrote marketing guru Seth Godin on his blog, “and your job is to figure out how to quiet it and ignore it.”

—Todd Henneman

They considered “mind bugs,” as Banaji calls them, such as prototype bias, the stereotypes attached to particular job functions.

After Banaji’s three-hour session, one senior leader told Allyn that he had prided himself on being able to sum up someone in five minutes.

“That’s my talent as a leader,” the partner told Allyn. “After this session, I’m not going to say that anymore because the things that I’m basing that immediate impression on are all subjective and all open to bias.”

Another partner began printing the list of her entire team before selecting someone for a special assignment, as a way to safeguard against picking someone who may be top of mind simply because that person is similar to her or works nearby.

Tony Greenwald, however, is skeptical that training can eradicate implicit biases.

Greenwald, co-author with Banaji of the 2013 book “Blindspot: Hidden Biases of Good People,” said training can be useful in making executives aware of ways in which they or their organizations might be discriminating, even with no intent to discriminate.

“Blinding” strategies — hiding information such as a candidate’s gender — work well, as do evidence-based guidelines for making judgments, he said.

But he cautions that Project Implicit, the collaboration behind the Implicit Association Test and of which he is part, can’t claim yet to have “satisfactory validated evidence of efficacy” in producing change.

Andrés Tapia, a senior partner in Korn Ferry’s leadership and talent consulting division, said the field still lacks good answers for how to reverse unconscious bias.

Tapia suggests a three-step approach: Learn about unconscious bias, build skills around cultural dexterity, and then use these insights to create a “third culture,” which draws upon the strengths of people’s cultural differences.

“Awareness and the ‘a-ha!’ moment can happen in 60 seconds,” Tapia said. “The work of building the skills to mitigate that is a lot more heavy lifting, but that’s where the payoff is.”

Neuroscience continues to provide clues.

The NeuroLeadership Institute has spent the past 12 months looking at how to mitigate biases by categorizing them neurologically, said Rock, the institute’s CEO.

Some biases result from cognitive laziness: accepting the easiest answer, he said. Picking someone for a team because you frequently work with that person would be an example of cognitive laziness. Rock said he believes that rewarding people for identifying errors in their thinking will motivate them to avoid these biases.

Rock has developed a three-part strategy: accept that individuals and systems are intrinsically biased, label the type of bias likely to happen, and develop a plan to mitigate it. He still is identifying the categories of biases, but he hopes to test the three-part strategy in companies during 2014.

It may be tough to unlearn unconscious biases, said Amodio, an associate professor of psychology and neuroscience. His research into the amygdala suggests that part of implicit bias involves classical fear conditioning, a process in which something neutral elicits fear because we have learned to associate it with something bad.

“That suggests that implicit prejudices are learned quickly and they may be indelible,” Amodio said. “They may be impossible to completely unlearn.”

It may be more effective to find ways to help people override their implicit prejudices rather than try to undo those automatic biases, he said.

Amodio’s research has found that the brain is well-equipped for controlling unwanted biases — if the person detects their presence.

The anterior cingulate cortex, which plays an important role in cognitive control, can detect the activation of implicit attitudes. This region appears to detect conflicts between a person’s overarching goal — such as being egalitarian — and automatic behaviors that conflict with it — such as prejudiced thoughts or intentions.

The anterior cingulate then signals the dorsolateral prefrontal cortex, which is involved in making moral decisions, creating the possibility of overriding implicit biases.

“So although biases are sometimes difficult to detect and override,” Amodio said, “they are by no means inevitable or uncontrollable.”