First things first. Happy Birthday to Why We Reason! Believe it or not, it’s one year old. I want to thank everyone for the visits, tweets, Facebook shares, comments, emails, etc. Seriously: It’s been great.

Second, the wonderful website 3 Quarks Daily, which curates written content from around the web, is currently holding a best-of-the-web science writing contest. I have a piece nominated! Long time readers of Why We Reason might remember it from October 2011. It’s titled “Does Pinker’s ‘Better Angels’ Undermine Religious Morality?” In brief, I look at Steven Pinker’s latest book in context of religious morality.

Finally, as you may know, I’m now blogging full time at BigThink.com. My blog over there is called “Moments of Genius” and it’s about the psychology of creativity. Feel free to visit. Now that I’m blogging full time for BT I’ve essentially stopped writing for Why We Reason, instead using it as a medium to promote my stuff. As a result of this shift, I’ll soon be launching SamMcNerney.com, a website that will curate all of my writing, including old WWR stuff, BT pieces, ScientificAmerican.com articles, CreativityPost.com articles and current material. The website should be up and running soon.

Until then, I’ll see you elsewhere on the web.

Sam

By the time French police arrested him in 1969, Frank Abagnale had posed as a lawyer, doctor, U.S. Bureau of Prisons agent, teaching assistant at Brigham Young University, and pilot, and had “passed $2.5 million worth of meticulously forged checks across 26 countries.” By any standard, Abagnale – who committed most of his crimes as a teenager – was a criminal. In fact, when he was finally captured, 12 countries wanted him on charges of fraud.

At that point everybody had questions for Abagnale. How did he get away with all of it for so long? Why didn’t anyone notice his youthful appearance? And how did he manage to escape authorities, even after they captured him?

Abagnale’s unmatched intelligence, willingness to take a chance and charm created a strange cognitive brew that gave rise to his unusual accomplishments – if that’s the appropriate word. But he was also highly creative. He didn’t just steal and forge; he stole and forged in entirely novel ways. In this light – a somewhat pessimistic light – he was a creative genius.

Was Abagnale a cheater because of his crafty creativity? Consider a study published last year and recently brought to a popular audience in Dan Ariely’s latest book, The (Honest) Truth About Dishonesty. Ariely conducted five experiments for the study. In one, after measuring how creative each participant was, Ariely and his research partner Francesca Gino administered a multiple-choice test with cash rewards that depended on performance – the better people did, the more money they made.

Here’s where things got tricky. Ariely and Gino gave the participants bubble sheets (think the SAT) with instructions to transfer their answers onto them. However, because of a “copyright error,” the bubble sheets already had the correct answers marked in. The error, of course, was a ruse. Ariely and Gino implemented this small ripple to give the participants a chance to cheat without getting caught. The question is: Would they?

The researchers found two things. The first is that many people cheated but only a little bit. This finding is consistent with Ariely’s thesis, which describes how most “honest” people are willing to cheat by “fudging” their results in order to give themselves small gains. Ariely demonstrates this with numerous studies and anecdotes throughout his book. By the end he concludes that cheating is a widespread phenomenon not just limited to a few bad apples.

The second finding confirmed Ariely’s and Gino’s hunch: creative people cheated more. “Those who cheated more on each of the… tasks had on average higher creativity scores compared to noncheaters, but their intelligence scores were not very different.” Why? Ariely thinks it has to do with storytelling. That is, creative types told themselves more convincing and justifying stories:

[T]he difference between creative and less creative individuals comes into play mostly when there is ambiguity in the situation at hand and, with it, more room for justification… Put simply, the link between creativity and dishonesty seems related to the ability to tell ourselves stories about how we are doing the right thing, even when we are not. The more creative we are, the more we are able to come up with good stories that help us justify our selfish interests.

What’s also interesting is the relationship between pathological lying and gray and white matter. Gray matter is a term that describes the neurons that power our thinking. White matter, in contrast, is the wiring that connects different parts of the brain. A study led by Yaling Yang found that – and this is the interesting part – pathological liars had 14 percent less gray matter in their prefrontal cortices, a part of the brain that helps us distinguish right from wrong. One interpretation of this finding is that pathological liars have a difficult time with moral dilemmas because of their lack of gray matter.

However, Yang and her team also found that the pathological liars had 22 to 26 percent more white matter in their prefrontal areas compared to a control group. In Ariely’s words, this means that “pathological liars are likely able to make more connections between different memories and ideas, and this increased connectivity and access to the world of associations stored in their gray matter might be the secret ingredient that makes them natural liars.”

Ariely speculates about the implications:

If we extrapolate these findings to the general population, we might say that higher brain connectivity could make it easier for any of us to lie and at the same time think of ourselves as honorable creatures. After all, more connected brains have more avenues to explore when it comes to interpreting and explaining dubious events – and perhaps this is a crucial element in the rationalization of our dishonest acts.

This doesn’t mean that the more creative you are, the more of a cheater you are – correlation doesn’t equal causation. But Ariely does rightly point out that cheating requires a creative mindset. Such was the case with Abagnale. He didn’t cheat because of his creativity, but his novel brand of thievery wouldn’t have been possible without his wildly creative mind. After all, “facts are for people who lack the imagination to create their own truth.” Abagnale would agree.

In 1981, Arthur Leonard Schawlow won the Nobel Prize in Physics for his contributions to laser spectroscopy. When asked what made the difference between highly creative and less creative scientists, he responded: “The labor of love aspect is important. The most successful scientists often are not the most talented. But they are the ones who are impelled by curiosity. They’ve got to know what the answer is.”

Schawlow was describing what the psychologist Teresa Amabile calls the “Intrinsic Motivation Principle of Creativity,” or the propensity for human creativity to flourish when people are motivated by the personal enjoyment of the work itself. Athletes call it a “love of the game;” artists refer to it as an unrelenting need to express. For academics like Schawlow, it’s the pure joy of discovering something new.

Extrinsic motivation, in contrast, is the daily pressure we feel from outside incentives – grades, salaries, and promotions – put in place to encourage output. Here’s the question: Is creative output the product of intrinsic or extrinsic motivation? Do we need a reason to work? Or is passion enough?

In the 1970s, Mark Lepper, David Greene and Richard Nisbett conducted a classic study involving a group of preschoolers who liked to draw. The researchers separated the kids into three groups. The first was told that if they continued to draw they would receive a big blue ribbon with their name on it (reward condition). The second wasn’t told about the reward but was given a blue ribbon after they finished drawing (unexpected reward condition). The third group was neither promised nor given a ribbon (no reward condition).

They ran the experiment for two weeks and found that the kids in the “no reward” and “unexpected reward” conditions continued to draw as enthusiastically as they had initially. However, their peers in the reward condition showed a drastic reduction in interest. Sadly, they no longer found pleasure in drawing – their intrinsic motivation was destroyed by an extrinsic reward, the blue ribbon.

A study conducted more recently by Teresa Amabile of the Harvard Business School demonstrated similar results. Amabile’s subjects, unlike Lepper et al.’s, were college women. In one experiment Amabile asked the women to make paper collages. She told half of them that graduate art students would judge their collages; the other half were told that researchers “were studying their mood and had no interest in the collages themselves.” A panel of artists evaluated the collages, and Amabile found that those who expected to be judged were significantly less creative. Drawing on this study and other research, Amabile concluded that “the intrinsically motivated state is conducive to creativity, whereas the extrinsically motivated state is detrimental.”

Are all extrinsic motivators creativity killers? Not exactly. This research doesn’t rule out the important role extrinsic motivation plays in the creative process. Consider the following examples, brought to life by Geoff Colvin in his book Talent is Overrated:

Intrinsic motivation may dominate the big picture, but everyone, even the greatest achievers, has responded to extrinsic forces at critical moments. When Watson and Crick were struggling to find the structure of DNA, they worked almost nonstop because they knew they were in a race with other research teams. Alexander Graham Bell worked similarly on the telephone, knowing he was in competition with Elisha Gray, whom he beat to the patent office by just hours. Such people are driven by much more than fascination or joy.

Colvin concludes that extrinsic motivators are good as long as they are directed at delivering constructive feedback. Here’s what he means:

While the mere expectation of being judged [tends] to reduce creativity, personal feedback could actually enhance creativity if it was the right kind… That is, feedback that [helps] a person do what he or she [feels] compelled to do [is] effective. Even the prospect of direct rewards, normally suffocating to creativity, could be helpful if they were the right kinds of rewards… [As such] intrinsic motivation is still best, and extrinsic motivation that’s controlling is still detrimental to creativity, but extrinsic motivators that reinforce intrinsic drives can be highly effective.

Given the connection between motivation, achievement, and creativity, it’s worth asking whether the United States education system does a good job of balancing intrinsic and extrinsic motivators. The short answer is: not really. A new report from the Center on Education Policy outlines new strategies schools are creating to boost student motivation, suggests that many schools still do a poor job of understanding it, and describes the inherent problem of motivating a student who seems steadfast in his or her unwillingness to engage with the material. Here’s a recent Atlantic article on the study:

Schools nationwide are experimenting with initiatives aimed at boosting student motivation, incorporating new programs aimed at piquing their interest or helping them feel more connected to the material they are being taught. In some instances, and not without controversy, schools have resorted to outright bribery, offering students cash and other rewards in exchange for greater effort and achievement….

[But] even the best school, program, and teacher can’t make a dent in improving academic achievement when a student isn’t motivated to learn. There are several elements to motivating students successfully, and as more of these triggers are activated, an initiative becomes more likely to work… if students see a direct connection between what they are learning and their own interests and goals, they are likely to be more motivated. Additionally, how schools are organized, and how teachers teach, are all factors in student motivation.

The article and the CEP study suggest that schools in the United States should make pedagogical adjustments that account for what we know and don’t know about motivation. I think there’s no doubt that this is true.

For one, too much weight is put on extrinsic motivators – grades, tests, final exams, etc. They’re important – no need to throw the baby out with the bathwater – but the ultimate goal of any education system should be to give people the opportunity to find and bring to life that which motivates them intrinsically. Ideally, all students will find, at some point in their educational careers, a domain where their labor is a labor of love, just as Schawlow did.

Like many college students, I took a semester abroad. I spent the first half of my junior year in London taking classes at UCL, exploring the museums, and learning the difference between two pints, two pounds and two pence. After a few lovely months on the “other” side of the pond I returned home feeling cultured. Of course, the difference between London and New York (where I went to school) was small. But the UK nonetheless influenced me to see the world a bit differently.

Such are the benefits of travel. A few weeks or months in a foreign country won’t necessarily transform our lives, but wandering the streets of Helsinki, Harare or Hong Kong leaves a residue on our minds. Returning home, this cultural footprint is hard to ignore and difficult to identify. Something’s different, but what?

Given the importance of traveling abroad, it’s no surprise that psychologists study how these experiences affect our cognition. Do they make us smarter or more open-minded? Does learning a foreign language boost IQ? Is it a good idea to live outside of your native country for a while? Consider a study conducted by Lile Jia and his colleagues at Indiana University.

In one experiment the team of psychologists asked participants to list as many different modes of transportation as possible. They explained that the task was created by either Indiana University students studying in Greece (distant condition) or by Indiana University students studying in Indiana (near condition). This small ripple turned out to have large effects: participants in the distant condition generated more modes of transportation and were more original with their ideas.

The second experiment demonstrated similar results. The team asked participants to solve three insight problems. Here’s an example of one:

A prisoner was attempting to escape from a tower. He found a rope in his cell that was only half long enough to permit him to reach the ground safely. He divided the rope in half, tied the two parts together, and escaped. How could he have done this?

Like the first experiment, Jia and his team told participants that the questions came from either a research institute “around 2,000 miles away” or one in Indiana “2 miles away.” (In a control condition they did not mention a location.) Again, the researchers found that participants in the distant condition generated more solutions than participants in the other two conditions.

A ScientificAmerican.com article on Jia’s study summarizes the results this way:

This pair of studies suggests that even minimal cues of psychological distance can make us more creative. Although the geographical origin of the various tasks was completely irrelevant – it shouldn’t have mattered where the questions came from – simply telling subjects that they came from somewhere far away led to more creative thoughts.

In Imagine, Jonah Lehrer parallels this research with a 2009 study out of the Kellogg School of Management and INSEAD. The researchers “reported that students who lived abroad for an extended period were significantly more likely to solve a difficult creativity problem than students who had never lived outside of their birth country.” Lehrer concludes that, “the experience of another culture endows the traveler with a valuable open-mindedness, making it easier for him or her to realize that a single thing can have multiple meanings.”

It’s unclear if this finding is causal or correlative – students who go abroad might be endowed with an open and creative mindset in the first place – but the point remains: diverse experiences are good for creativity because they influence us to look at problems from multiple points of view.

This brings me to a brand-new study out of Tel Aviv University’s School of Psychological Sciences conducted by professor Nira Liberman and a team of her students. They wanted to see if “expansive thinking” improves the creative output of 6-to-9-year-olds.

Their experiment was straightforward. The researchers gave the kids a series of photographs displaying nearby objects (a pencil on a desk) and distant objects (a picture of the Milky Way galaxy). Here’s the important part: half of the kids started with the nearby objects and progressed to more distant ones (expansive mindset); the other half saw the photos in reverse order (contractive mindset).

Next, the kids tackled several creativity tests in which they were given an object and asked to name as many different uses for it as possible. The tasks were designed to test “outside the box” thinking. For example, if the object was a paper clip, an unimaginative response would be to hold paper together. More creative answers, on the other hand, would be “a bookmark” or “Christmas tree decorations.”
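Researchers typically score alternative-uses tasks like this along two dimensions: fluency (how many uses a child names) and originality (how rare each use is across the sample). The post doesn’t detail Liberman’s actual scoring scheme, so the sketch below is only an illustration of that general logic, with hypothetical responses:

```python
from collections import Counter

def score_responses(all_responses):
    """Score an alternative-uses task.

    all_responses: one list of named uses per child.
    Returns a (fluency, originality) pair per child, where fluency
    counts the uses given and originality rewards rarer answers.
    """
    # Pool every answer to see how common each use is in the sample.
    counts = Counter(use for responses in all_responses for use in responses)
    total = len(all_responses)
    scores = []
    for responses in all_responses:
        fluency = len(responses)
        # A use named by fewer children earns more originality credit.
        originality = sum(1 - counts[use] / total for use in responses)
        scores.append((fluency, round(originality, 2)))
    return scores

data = [
    ["hold paper"],                             # one common answer
    ["hold paper", "bookmark"],                 # common plus less common
    ["bookmark", "christmas tree decoration"],  # two rarer answers
]
print(score_responses(data))  # [(1, 0.33), (2, 0.67), (2, 1.0)]
```

The third child ties the second on fluency but wins on originality, which matches the intuition that “Christmas tree decorations” is a more creative answer than “hold paper.”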

Liberman found that kids in the expansive mindset scored significantly better on all measures of creativity. They came up with a greater number of uses and more creative uses for the objects. Why? According to Liberman, “spatial distance, as opposed to spatial proximity, was clearly shown to enhance creative performance…. [and] psychological distance can help to foster creativity because it encourages us to think abstractly.”

Two important findings come out of Liberman’s research. The first is that creativity can be taught. David Kelley makes this point precisely in a recent TED talk. Drawing upon personal experience and years of research, Kelley puts it this way:

Don’t let people divide the world into the creative and the non-creative like it’s some God given thing…. People [should] realize they are naturally creative and… these people should let their ideas fly. They should achieve… self-efficacy, [meaning they] should do what they set out to do… And reach a place of creative confidence.

The second point brings me back to London. One way to kill creativity and abstract thinking – two cognitive attributes vital in the 21st century economy – is to maintain a “here and now” perspective. London steered me away from this mindset; it influenced me to adopt a more open-minded perspective.

To be sure, my leisurely strolls through the British Museum didn’t make me smarter, and by no means was I “culturally transformed” upon hearing that ‘soccer’ was actually ‘football’. But it’s remarkable what you can learn by sitting in an English pub for a few hours. For starters, pints are two pounds, not two pence.

Here’s a forthcoming article for the Huffington Post religion blog that I’ve written with Rabbi Geoff Mitelman, a friend and fellow cognitive science enthusiast. We discuss atheism and the psychology of belief. Check out his blog, Sinai and Synapses.

Rabbi Geoffrey Mitelman: It’s inherently challenging for believers and atheists to have productive conversations. Discussing topics such as belief and nonbelief, the potential irrationality of religion, or the limits of scientific knowledge is difficult since each side often ends up more firmly entrenched in their own worldview.

But one bright person interested in broadening the conversation is Sam McNerney, a science writer who focuses on cognitive science and an atheist interested in religion from a psychological point of view.

I found Sam through his writing on ScientificAmerican.com, and started reading his blog Why We Reason and his posts on BigThink.com. We discovered that even though we approached religion from different perspectives, we had great respect for each other.

So as two people with different religious outlooks we wondered: what can we learn from each other?

Sam McNerney: There are many things we can learn. Let’s take one: the role of authority.

A recent New York Times article points out that secular liberal atheists tend to conflate authority, loyalty and sanctity with racism, sexism and homophobia. It’s not difficult to see why. Societies suffer when authority figures, motivated by sacred values and religious beliefs, forbid their citizens from challenging the status quo. But some degree of respect for authority, and for the principles authorities uphold, is necessary if societies are to maintain order and justice and function properly. The primatologist Frans de Waal explains it this way: “Without agreement on rank and a certain respect for authority there can be no great sensitivity to social rules, as anyone who has tried to teach simple house rules to a cat will agree.” (Haidt, 106)

Ironically, atheists’ steadfast allegiance to rationality, secular thinking and the importance of open-mindedness blinds them to important religious values including respect for authority. As a result, atheists tend to confuse authority with exploitation and evil and undervalue the vital role authority plays in a healthy society.

Geoff: You accurately bring up one aspect of why organized religion can be so complicated: it is intertwined with power. And I’m glad you note that authority and power are not inherently bad when it comes to religion. In fact, as you also say, a certain degree of authority is necessary.

To me, the real problem arises when religion adds another element into the mix: certainty. It’s a toxic combination to have religious authorities with the power to influence others claiming to “know” with 100% certainty that they’re right and everyone else is wrong.

One thing I learned from several atheists is the importance of skepticism and doubt. Indeed, while certainty leads to arrogance, uncertainty leads to humility. We open up the conversation and value diverse experiences when we approach the world with a perspective of “I’m not sure” or “I could be wrong.”

Recently, astrophysicist Adam Frank wrote a beautiful piece on NPR’s blog 13.7 about how valuable uncertainty can be:

Dig around in most of the world’s great religious traditions and you find people finding their sense of grace by embracing uncertainty rather than trying to bury it in codified dogmas…

Though I am an atheist, some of the wisest people I have met are those whose spiritual lives (some explicitly religious, some not) have forced them to continually confront uncertainty. This daily act has made them patient and forgiving, generous and inclusive. Likewise, the atheists I have met who most embody the ideals of free inquiry seem to best understand the limitations of every perspective, including their own. They encounter the ever shifting ground of their lives with humor, good will and compassion.

Certainty can be seductive, but it hurts our ability to engage with others in constructive ways. Thus when religious people talk about God, belief or faith, we have to approach the conversation with a little humility and recognize that we don’t have a monopoly on the truth. In the words of Rabbi Brad Hirschfield, we need to realize that another person doesn’t have to be wrong for us to be right.

This doesn’t mean believers and atheists will agree on the role of religion in society, the validity of a particular belief system, or even the very existence of God. In fact, believers and atheists will almost certainly continue to vehemently disagree about these questions. But we have to remember that not all disagreements are bad. Some arguments are quite beneficial because they help us gain a deeper understanding of reality, encourage clearer thinking, and broaden people’s perspectives.

The Rabbis even draw a distinction between two different kinds of arguments. Arguments they call “for the sake of Heaven” will always be valuable, while arguments that are only for self-aggrandizement will never be productive (Avot 5:20). So I’m not interested in arguments that devolve into mocking, ridicule, name-calling or one-upmanship. But I’d gladly participate in any discussion if we are arguing about how we make ourselves and this world better, and would actively strive to involve whoever wants to be part of that endeavor, regardless of what they may or may not believe.

Sam: You are right to point out that both atheists and believers, under the illusion of certainty, smother potentially productive dialogue with disrespectful rhetoric. What’s alarming is that atheism in the United States is now more than non-belief. It’s an intense and widely shared sentiment that belief in God is not only false, but also ridiculous. Pointing out how irrational religion can be is entertaining for too many.

There’s no doubt that religious beliefs can have negative behavioral consequences, so atheists are right to criticize religion on many epistemological claims. But I’ve learned from believers and from my background in cognitive psychology that faith-based beliefs are not necessarily irrational.

Consider a clever study recently conducted by Kevin Rounding of Queen’s University in Ontario that demonstrates how religion helps increase self-control. In two experiments participants (many of whom identified as atheists) were primed with a religious mindset – they unscrambled short sentences containing words such as “God,” “divine” and “Bible.” Compared to a control group, they were able to drink more sour juice and were more willing to accept $6 in a week instead of $5 immediately. Similar lines of research show that religious people are less likely to develop unhealthy habits like drinking, taking drugs, smoking and engaging in risky sex.
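The $6-in-a-week-versus-$5-now choice is a standard delay-discounting measure of self-control: waiting is the patient choice whenever the discounted value of the later reward still beats the immediate one. As a quick illustrative calculation (only the $5 and $6 figures come from the study above; the discount factors are hypothetical):

```python
def prefers_delayed(now, later, weekly_discount):
    """Return True if the discounted later reward beats the immediate one."""
    return later * weekly_discount > now

# Accepting $6 in a week over $5 now implies a weekly discount
# factor of at least 5/6, i.e. about 0.833.
threshold = 5 / 6
print(round(threshold, 3))           # 0.833
print(prefers_delayed(5, 6, 0.9))    # True: a patient chooser waits
print(prefers_delayed(5, 6, 0.8))    # False: an impatient chooser takes $5 now
```

On this reading, priming that nudges people toward the delayed $6 is nudging their effective discount factor above that 5/6 threshold.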

Studies also suggest that religious and spiritual people, especially those living in the developing world, are happier and live longer, on average, than non-believers. Religious people also tend to feel more connected to something beyond themselves, a sentiment that contributes significantly to well-being.

It’s unclear if these findings are correlative or causal – it’s likely that many of the benefits that come from believing in God arise not from beliefs per se but from strong social ties that religious communities do such a good job of fostering. Whatever the case, this research should make atheists pause before they dismiss all religious beliefs as irrational or ridiculous.

Geoff: It’s interesting — that actually leads to another area where atheists have pushed believers in important ways, namely, to focus less on the beliefs themselves, and more on how those beliefs manifest themselves in actions. And to paraphrase Steven Pinker, the actions that religious people need to focus on are less about “saving souls,” and more about “improving lives.”

For much of human history the goal of religion was to get people to believe a certain ideology or join a certain community. “Being religious” was a value in and of itself, and was often simply a given, but today, we live in a world where people are free to choose what they believe in. So now, the goal of religion should be to help people find more fulfillment in their own lives and to help people make a positive impact on others’ lives.

It’s important to note that people certainly do not need religion to act morally or find fulfillment. But as Jonathan Haidt writes in his new book The Righteous Mind, religion can certainly make it easier.

Haidt argues that our mind is like a rider who sits atop an elephant, suggesting that our moral deliberations (the rider) are post-hoc rationalizations of our moral intuitions (the elephant). The key to his metaphor is that intuition comes first (and is much more powerful) and strategic reasoning comes afterward.

We need our rider because it allows us to think critically. But our elephant is also important because it motivates us to connect with others who share a moral vision. Ultimately, if we are striving to build communities and strengthen our morals, we cannot rely exclusively on either the rider or the elephant; we need both. As Haidt explains:

If you live in a religious community, you are enmeshed in a set of norms, institutions and relationships that work primarily on the elephant to influence your behavior. But if you are an atheist living in a looser community with a less binding moral matrix, you might have to rely somewhat more on an internal moral compass, read by the rider. That might sound appealing to rationalists, but it is also a recipe for…a society that no longer has a shared moral order. [And w]e evolved to live, trade and trust within shared moral matrices. (Haidt, 269)

Since religion is a human construct, with its “norms, institutions and relationships,” it can be used in a variety of different ways. It can obviously be used to shut down critical thinking and oppress others. But as you mention, religion has positive effects on well-being, and religious beliefs correlate with a sense of fulfillment. Perhaps the job of religion, then, should be giving us a common language, rituals, and communities that reinforce and strengthen our ability to become better human beings and find joy and meaning in our lives.

Ultimately, we don’t have to agree with someone in order to learn from them. As Ben Zoma, a 2nd century Jewish sage, reminds us: “Who is wise? The person who learns from all people.” (Avot 4:1) When we are willing to open ourselves up to others, we open ourselves up to new ideas and different perspectives.

Indeed, I have come to believe that our purpose as human beings – whether we identify as a believer, an atheist, or anything in between – is to better ourselves and our world. And any source of knowledge that leads us to that goal is worth pursuing.

Several years ago University of California at Davis professor Dean Simonton conducted a study that examined more than three hundred creative geniuses born between 1450 and 1850. The list included the thinkers Leibniz and Descartes, the scientists Newton and Copernicus, and the artists da Vinci and Rembrandt. He compared the relationship between their education and their eminence, a metric he determined from an array of criteria. When he plotted the data he found an inverted U, sparking the following conclusion: “The most eminent creators were those who had received a moderate amount of education, equal to about the middle of college.”

Simonton’s research highlights a commonly held notion: too much familiarity can be detrimental to creativity. The problem, Simonton hypothesizes, is that creativity benefits from an outsider’s mindset. “Too much experience…” on the other hand, “may restrict creativity because you know so well how things should be done that you are unable to escape to come up with new ideas.” It seems reasonable, then, to suggest that, “if you want a creative solution to a problem, you’d better find someone who knows a little about the situation but not too much.”

Consider the clever website InnoCentive.com. The premise is simple: ‘Seekers’ post problems for ‘Solvers’ to tackle. The problems range from “Recovery of Bacillus Spore from Swabs” to “Blueprints for a Medical Transportation Device for Combat Rescue.” They are usually posted by large corporations, so the rewards can be lucrative – sometimes millions of dollars.

Two things are remarkable about InnoCentive, each brought to light by a study conducted by researchers at Harvard Business School. The first is that it works; about 33 percent of the problems are solved on time. The second is that solvers tend to solve problems at the fringe of their expertise. If a biochemistry problem only attracted biochemists, it tended to remain unsolved. But if the same problem was tackled by, say, a molecular biologist or an organic chemist, the chances were greater that it would be solved. Outside thinking was vital.

Think about the failures of expertise, as Geoff Colvin, author of Talent is Overrated, does: “Why didn’t Western Union invent the telephone? Why didn’t U.S. Steel invent the minimill? Why didn’t IBM invent the personal computer? Over and over, the organizations that knew all there was to know about a technology or an industry failed to make the creative breakthrough that would transform the business.”

Is too much expertise killing creativity?

Well, not exactly. Colvin goes on to remind readers that the greatest innovators of any field share a few characteristics in common: years of intensive preparation and technical competence. Great innovations, he says, are roses that bloom after long and careful cultivation.

He considers James Watson’s and Francis Crick’s discovery of the structure of DNA. Colvin cites the research of Robert Weisberg, who showed that several other distinguished scientists were trying to solve the same problem at the same time. Colvin argues that, “if we presume that too much familiarity with a problem is a disadvantage, then we would expect to find that Watson and Crick came at this one unburdened by the excessive data that clouded the thinking of the other researchers. But in reality, the story was just the opposite.”

The larger point is that creative breakthroughs require about 10,000 hours or ten years of deliberate practice within a given field:

The most eminent creators are consistently those who have immersed themselves utterly in their chosen field, have devoted their lives to it, amassed tremendous knowledge of it, and continually pushed themselves to the front of it. Zero evidence supports the conclusion that too much knowledge might be a hindrance in creative achievement.

And what about the success of InnoCentive? What’s important is not to be an outsider, but to have an outsider’s mindset. People at the fringe of their expertise solved problems on InnoCentive, but they were still solving problems within their general field of expertise. Indeed, innovation occurs at the boundary of disciplines, but you’ll never hear about a novelist winning a Nobel Prize in physics.

As for Simonton’s study, it’s important to remember that during the period in which his subjects lived – 1450 to 1850 – many fundamental principles of the scientific method were still unknown. It was still possible – especially in the first half of that 400-year stretch – for someone to be an expert in multiple disciplines. Moreover, a high-level degree in, say, 1650 didn’t confer much specialized expertise.

Today’s landscape is much different – all the low-hanging fruit is gone. A breakthrough in any field requires extensive preparation in that field; yet even experts don’t know everything about their field. So it’s important to maintain a skeptical point of view and think like an outsider. But when it comes to creative breakthroughs, the more familiarity the better.


For most of human history, creativity was something that came from the muses; it was about flashes of insight from another world. Today we know that creativity is something that happens in the brain; many psychologists and neuroscientists are working to identify the cognitive mechanisms and processes active during the creative process. However, the public still believes that creativity is a “gift” applicable across many fields, even though research shows that creativity is improvable, contingent on upbringing and societal circumstances, and domain-specific. Particularly interesting are studies from the last few years suggesting that subtle cues in our physical environment significantly influence creative output.

Consider a study published in Science by Juliet Zhu and a team of researchers from the University of British Columbia. The psychologists recruited six hundred subjects and tasked them with several basic cognitive tests that required either an analytic approach or a more creative mindset. The key part of the experiment was that the tests were conducted on computer screens with red, blue, or neutral backgrounds. Did the color of the screen matter?

The differences were noticeable. Computer screens with a red background boosted performance on analytical tasks such as memory retrieval and proofreading. Blue computer screens, on the other hand, improved performance on creative tasks such as coming up with uses for a brick and brainstorming. Why? Zhu argues that red unconsciously motivates us to think more deliberately and analytically because it’s associated with things such as stop signs, emergency vehicles and danger. In contrast, blue is associated with the sky, the ocean, peace and tranquility – things that encourage a more free-flowing and exploratory mindset.

This brings me to a brand-new study by Zhu and her colleagues Ravi Mehta and Amar Cheema, released in the Journal of Consumer Research. The psychologists were interested in how various levels of sound affect creativity. In one experiment they assigned 65 undergrads items from the Remote Associates Test (RAT). In the RAT, participants are given stimulus words (shelf, read, end) and asked to determine a related target word (book).
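The structure of a RAT item can be sketched in a few lines of code: a target word is “correct” if it combines with every stimulus word to form a common compound or phrase. This is only an illustrative toy – the mini-dictionary and the `is_rat_solution` helper below are hypothetical, standing in for the human judgment the real test relies on:

```python
# Hypothetical mini-dictionary of accepted compounds/phrases; the real
# test draws on the full range of common English word combinations.
COMPOUNDS = {"bookshelf", "bookend", "read book"}

def is_rat_solution(stimuli, target):
    """True if the target forms a known compound/phrase with every stimulus."""
    def pairs(stimulus, word):
        # Try the four plausible combinations of the two words.
        candidates = (
            word + stimulus, stimulus + word,
            word + " " + stimulus, stimulus + " " + word,
        )
        return any(c in COMPOUNDS for c in candidates)
    return all(pairs(s, target) for s in stimuli)

print(is_rat_solution(["shelf", "read", "end"], "book"))  # True
print(is_rat_solution(["shelf", "read", "end"], "lamp"))  # False
```

What makes the task a good probe of creativity is visible in the check itself: a single answer has to satisfy several distant constraints at once, which rewards associative rather than step-by-step thinking.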

There were four conditions: low-noise (50 dB), moderate-noise (70 dB), high-noise (85 dB) and no-noise (the ambient sound of the room in which participants completed the tasks). For the three noise conditions the researchers blended sounds from a cafeteria, roadside traffic and distant construction to create an ambient soundscape typical of consumption contexts such as a shopping mall or grocery store. The participants listened to the noise, played from speakers in the room, while they solved the RAT items. The psychologists found “a significant main effect of noise level on RAT performance such that respondents in the moderate-noise condition generated more correct answers than those in the low-noise, high-noise, or control conditions.”

The study involved four other experiments. As predicted by Zhu and her team, each demonstrated similar results. Here’s what the authors conclude:

We find that increasing levels of noise induce distraction, leading to a higher construal level. That is, both moderate and high noise levels lead to more abstract processing as compared to a low noise level. This higher construal level then induces greater creativity in the moderate-noise condition; however, the very high level of distraction induced by the high-noise condition, although it prompts a higher construal level, also causes reduced information processing, thus impairing creativity. In other words, while a moderate level of noise produces just enough distraction to induce disfluency, leading to higher creativity, a very high level of noise induces too much distraction so as to actually reduce the amount of processing, leading to lower creativity.

Zhu et al. also note that further research is needed to determine exactly what their findings mean. They expressed particular interest in how different types of noise might affect creativity.

The larger point of this research is that there are simple things we can do to boost our creativity: blue rooms and a moderate amount of ambient noise, for instance. It also suggests that creativity doesn’t arrive from the muses; it’s a skill carried out by parts of the brain that are influenced by the physical environment. Moreover, creativity is improvable; it’s not reserved for a select few.

Perhaps one advantage creative people have, then, is an ability to find environments that maximize their output. Maybe. At any rate, it’s worth keeping these studies in mind the next time you need to supplement your creative output.


Shakespeare was a ruthless thief. Some of his first plays – the three parts of Henry VI – were so similar to Christopher Marlowe’s Tamburlaine the Great that many eighteenth-century scholars believed Marlowe wrote them. By today’s standards, the Henry VI plays were a copyright lawsuit waiting to happen. But paraphrasing and borrowing characters and plot lines was common practice in Elizabethan England; everybody stole from everybody. Shakespeare, as it were, was simply the best thief; he was constantly mining the work of his contemporaries.

Ironically, the more Shakespeare copied and imitated the more he started thinking on his own terms. The aesthetically rich atmosphere of Elizabethan England would always influence him – no man is an island – but by the time Shakespeare penned Hamlet (some fifteen years after the Henry VI plays) he had found his original voice.


For the British social psychologist Liam Hudson, IQ as a measure of achievement is a lot like height in the NBA: past a certain threshold, it doesn’t matter. In the NBA, that threshold is about 6 feet. Barring a few exceptions, nearly every player in the NBA Hall of Fame is at least 6 feet tall. For IQ, the threshold is around 120. Geniuses like Isaac Newton and Blaise Pascal sported IQs well over 150, but Richard Feynman, winner of the 1965 Nobel Prize in Physics and widely considered one of the greatest physicists of all time, did just fine at 120. A high IQ correlates with things like good health, high salary and high academic achievement, but it guarantees little in terms of personal success. Indeed, scores of people with an IQ above 120 go on to accomplish next to nothing.

Malcolm Gladwell brings this research to light in his 2008 book Outliers. In addition, he examines the relationship between Nobel Prize winners and the colleges they graduated from. We tend to think that a Nobel Prize requires a diploma from an elite university, but a degree from an elite institution matters less than one might think. Gladwell looked at winners of the Nobel Prize in Medicine and Chemistry and found that they represented a wide range of colleges, including Hope, Hunter, Holy Cross and Antioch. He concludes that a good education, not an elite education, is sufficient.

Yet, would knowing this change anything? Would anybody turn down a few free IQ points? Would someone turn down Harvard for the University of Florida? No. Despite what the data suggests, we’re still obsessed with being the smartest and getting into the best school.

This is why education in the United States is thought of as an ascent: we progress upwards from 1st grade; a 4.0 is better than a 3.0 which is better than a 2.0; the valedictorian is at the top of his class; we move up in class rankings but not down.

The drive to be at the apex of society is a good attribute in the competitive world. But creativity isn’t linear; it’s about thinking laterally. It’s about reaching across domains to bring two seemingly unrelated ideas together to create one original idea. It’s about what exists at the periphery. It’s about, as Steve Jobs said, “connecting things.”

Just think about the traits creative people possess. As New York Times columnist David Brooks explains:

[They] don’t follow the crowds; they seek out the blank spots on the map. Creative people wander through faraway and forgotten traditions and then integrate marginal perspectives back to the mainstream. Instead of being fastest around the tracks everybody knows, creative people move adaptively through wildernesses nobody knows.

Now think about the competitive environment that confronts the most fortunate people today and how it undermines those mind-sets. First, students have to jump through ever-more demanding, preassigned academic hoops. Instead of developing a passion for one subject, they’re rewarded for becoming professional students, getting great grades across all subjects, regardless of their intrinsic interests. Instead of wandering across strange domains, they have to prudentially apportion their time, making productive use of each hour.

Is competition trumping creativity?

Not exactly. Creative output is the product of its social and intellectual environment. There’s a reason ancient Athens was home to the best philosophers, Elizabethan England had the best playwrights, Silicon Valley harbors the world’s top tech entrepreneurs, and North Korea doesn’t produce the same number of Nobel laureates as the United States. Certain circumstances favor creative expression.

Capitalism is one of those. When people are given the chance to pursue self-interest, they usually do. They also tend to cluster: poets hang out with poets; fashion designers hang out with fashion designers; cognitive science professors hang out with cognitive science professors. The byproduct of our natural tendency to seek out like-minded people is improvement; being surrounded by the best makes us better. This is why, in a larger sense, competition is good for creativity.

Brooks would agree. However, his worry (and mine) is that on the path to improvement we’re ignoring the side roads. That is, competition should be thought of as a means to a niche – a place people go to create a creative monopoly. In today’s hyper competitive world, however, competition is a means to an end; we’re competitive just for the sake of it, and what’s lost is a willingness to be creative just for the sake of it.

This brings me back to Outliers. The lesson from the research Gladwell highlights is that, after a certain point in one’s academic career, it’s important to focus less on grades, test scores and acceptance letters, and more on thinking laterally and seeking out original ideas. Indeed, viewing society only through a myopically competitive lens is a real creativity killer.