Posts tagged ‘Psychology’

By the time French police arrested him in 1969, Frank Abagnale had posed as a lawyer, doctor, U.S. Bureau of Prisons agent, teaching assistant at Brigham Young University, and pilot, and had “passed $2.5 million worth of meticulously forged checks across 26 countries.” By any standard, Abagnale – who committed most of his crimes as a teenager – was a criminal. In fact, when he was finally captured, 12 countries wanted him on charges of fraud.

At that point everybody had questions for Abagnale. How did he get away with all of it for so long? Why didn’t anyone notice his youthful appearance? And how did he manage to escape authorities, even after they captured him?

Abagnale’s unmatched intelligence, willingness to take a chance and charm created a strange cognitive brew that gave rise to his unusual accomplishments – if that’s the appropriate word. But he was also highly creative. He didn’t just steal and forge; he stole and forged in entirely novel ways. In this light – a somewhat pessimistic light – he was a creative genius.

Was Abagnale a cheater because of his crafty creativity? Consider a study published last year and recently brought to a popular audience by Dan Ariely’s latest book, The (Honest) Truth About Dishonesty. Ariely conducted five experiments for the study. In one, after measuring how creative each participant was, Ariely and his research partner Francesca Gino administered a multiple-choice test with cash rewards that depended on performance – the better people did, the more money they made.

Here’s where things got tricky. Ariely and Gino gave the participants bubble sheets (think the SAT) with instructions to transfer their answers onto them. However, because of a “copyright error,” the bubble sheets already had the correct answers marked in. The error, of course, was a ruse. Ariely and Gino introduced this small wrinkle to give the participants a chance to cheat without getting caught. The question was: Would they?

The researchers found two things. The first is that many people cheated, but only a little bit. This finding is consistent with Ariely’s thesis, which holds that most “honest” people are willing to cheat by “fudging” their results in order to give themselves small gains. Ariely demonstrates this with numerous studies and anecdotes throughout his book. By the end he concludes that cheating is a widespread phenomenon, not one limited to a few bad apples.

The second finding confirmed Ariely and Gino’s hunch: creative people cheated more. “Those who cheated more on each of the… tasks had on average higher creativity scores compared to noncheaters, but their intelligence scores were not very different.” Why? Ariely thinks it has to do with storytelling. That is, creative types told themselves more convincing and self-justifying stories:

[T]he difference between creative and less creative individuals comes into play mostly when there is ambiguity in the situation at hand and, with it, more room for justification… Put simply, the link between creativity and dishonesty seems related to the ability to tell ourselves stories about how we are doing the right thing, even when we are not. The more creative we are, the more we are able to come up with good stories that help us justify our selfish interests.

What’s also interesting is the relationship between pathological lying and gray and white matter. Gray matter is a term that describes the neurons that power our thinking. White matter, in contrast, is the wiring that connects different regions of the brain. A study led by Yaling Yang found that – and this is the interesting part – pathological liars had 14 percent less gray matter in their prefrontal cortices, a part of the brain that helps us distinguish right from wrong. One interpretation of this finding is that pathological liars have a difficult time with moral dilemmas because of their lack of gray matter.

However, Yang and her team also found that the pathological liars had 22 to 26 percent more white matter in their prefrontal areas compared to a control group. In Ariely’s words, this means that “pathological liars are likely able to make more connections between different memories and ideas, and this increased connectivity and access to the world of associations stored in their gray matter might be the secret ingredient that makes them natural liars.”

Ariely speculates on the implications:

If we extrapolate these findings to the general population, we might say that higher brain connectivity could make it easier for any of us to lie and at the same time think of ourselves as honorable creatures. After all, more connected brains have more avenues to explore when it comes to interpreting and explaining dubious events – and perhaps this is a crucial element in the rationalization of our dishonest acts.

This doesn’t mean that the more creative you are the more of a cheater you are – correlation doesn’t equal causation. But Ariely does rightly point out that cheating requires a creative mindset. Such was the case with Abagnale. He didn’t cheat because of his creativity, but his novel brand of thievery wouldn’t have been possible without his wildly creative mind. After all, “facts are for people who lack imagination to create their own truth.” Abagnale would agree.


Several years ago, University of California, Davis professor Dean Simonton conducted a study that examined more than three hundred creative geniuses born between 1450 and 1850. The list included thinkers Leibniz and Descartes, scientists Newton and Copernicus, and artists da Vinci and Rembrandt. He compared the relationship between their education and their eminence, a metric he derived from an array of criteria. When he plotted the data he found an inverted U, prompting the following conclusion: “The most eminent creators were those who had received a moderate amount of education, equal to about the middle of college.”

Simonton’s research highlights a commonly held notion: too much familiarity can be detrimental to creativity. The problem, Simonton hypothesizes, is that creativity benefits from an outsider’s mindset. “Too much experience…” on the other hand, “may restrict creativity because you know so well how things should be done that you are unable to escape to come up with new ideas.” It seems reasonable, then, to suggest that, “if you want a creative solution to a problem, you’d better find someone who knows a little about the situation but not too much.”

Consider the clever website InnoCentive.com. The premise is simple: ‘Seekers’ post problems for ‘Solvers.’ The problems range from “Recovery of Bacillus Spore from Swabs” to “Blueprints for a Medical Transportation Device for Combat Rescue.” They are usually posted by large corporations, so the rewards can be lucrative – sometimes millions of dollars.

There are two things remarkable about InnoCentive, each brought to light by a study conducted by researchers at Harvard Business School. The first is that it works: about 33 percent of the problems are solved on time. The second is that solvers tend to solve problems at the fringe of their expertise. If a biochemistry problem attracted only biochemists, it tended to remain unsolved. But if the same problem was tackled by, say, a molecular biologist or an organic chemist, the chances were greater that it would be solved. Outside thinking was vital.

Think about the failures of expertise, as Geoff Colvin, author of Talent Is Overrated, does: “Why didn’t Western Union invent the telephone? Why didn’t U.S. Steel invent the minimill? Why didn’t IBM invent the personal computer? Over and over, the organizations that knew all there was to know about a technology or an industry failed to make the creative breakthrough that would transform the business.”

Is too much expertise killing creativity?

Well, not exactly. Colvin goes on to remind readers that the greatest innovators of any field share a few characteristics: years of intensive preparation and technical competence. Great innovations, he says, are roses that bloom after long and careful cultivation.

He considers James Watson and Francis Crick’s discovery of the structure of DNA. Colvin cites the research of Robert Weisberg, who showed that several other distinguished scientists were trying to solve the same problem at the same time. Colvin argues that, “if we presume that too much familiarity with a problem is a disadvantage, then we would expect to find that Watson and Crick came at this one unburdened by the excessive data that clouded the thinking of the other researchers. But in reality, the story was just the opposite.”

The larger point is that creative breakthroughs require about 10,000 hours or ten years of deliberate practice within a given field:

The most eminent creators are consistently those who have immersed themselves utterly in their chosen field, have devoted their lives to it, amassed tremendous knowledge of it, and continually pushed themselves to the front of it. Zero evidence supports the conclusion that too much knowledge might be a hindrance in creative achievement.

And what about the success of InnoCentive? What’s important is not to be an outsider, but to have an outsider’s mindset. People at the fringe of their expertise solved problems on InnoCentive, but they were still solving problems within their general field of expertise. Indeed, innovation occurs at the boundary of disciplines, but you’ll never hear about a novelist winning a Nobel Prize in physics.

As for Simonton’s study, it’s important to remember that during the period in which his subjects lived – 1450 to 1850 – many fundamental principles of the scientific method were still unknown. It was still possible – especially in the first half of that 400-year stretch – for someone to be an expert in multiple disciplines. Moreover, a high-level degree in, say, 1650 didn’t confer much.

Today’s landscape is much different – all the low-hanging fruit is gone. A breakthrough in any field requires extensive preparation in that field, and even then experts don’t know everything about their field. So it’s important to maintain a skeptical point of view and think like an outsider. But when it comes to creative breakthroughs, the more familiarity the better.


For most of human history creativity was something that came from the muses; it was about flashes of insight from another world. Today we know that creativity is something that happens in the brain; many psychologists and neuroscientists are working to identify the cognitive mechanisms and processes active during the creative process. However, the public still believes that creativity is a “gift” applicable across many fields, even though research shows that creativity is improvable, contingent on upbringing and societal circumstances, and domain-specific. Particularly interesting are studies from the last few years suggesting that subtle cues in our physical environment significantly influence creative output.

Consider a study published in Science by Juliet Zhu and a team of researchers from the University of British Columbia. The psychologists recruited six hundred subjects and tasked them with several basic cognitive tests that required either an analytic approach or a more creative mindset. The key part of the experiment was that the tests were conducted on computer screens with red, blue, or neutral backgrounds. Did the color of the screen matter?

The differences were noticeable. Computer screens with a red background boosted performance on analytical tasks including memory retrieval and proofreading. Blue computer screens, on the other hand, improved performance on creative tasks such as coming up with uses for a brick and brainstorming. Why? Zhu argues that red unconsciously motivates us to think more deliberately and analytically because it’s associated with things such as stop signs, emergency vehicles and danger. In contrast, blue is associated with the sky, ocean and peace and tranquility – things that influence a more free-flowing and exploratory mindset.

This brings me to a brand-new study by Zhu and her colleagues Ravi Mehta and Amar Cheema, released in the Journal of Consumer Research. The psychologists were interested in how various levels of sound affect creativity. In one experiment they assigned 65 undergrads Remote Associates Test (RAT) items. In the RAT, participants are given stimulus words (shelf, read, end) and asked to determine a related target word (book).

There were four conditions: low noise (50 dB), moderate noise (70 dB), high noise (85 dB) and no noise (the ambient sound of the room in which the participants completed the tasks). For the three noise conditions the researchers blended sounds from a cafeteria, roadside traffic and distant construction to create ambient noise typical of consumption contexts such as a shopping mall or grocery store. The participants listened to the noise, played over speakers in the room, while they solved the RAT items. The psychologists found “a significant main effect of noise level on RAT performance such that respondents in the moderate-noise condition generated more correct answers than those in the low-noise, high-noise, or control conditions.”

The study involved four other experiments. As predicted by Zhu and her team, each demonstrated similar results. Here’s what the authors conclude:

We find that increasing levels of noise induce distraction, leading to a higher construal level. That is, both moderate and high noise levels lead to more abstract processing as compared to a low noise level. This higher construal level then induces greater creativity in the moderate-noise condition; however, the very high level of distraction induced by the high-noise condition, although it prompts a higher construal level, also causes reduced information processing, thus impairing creativity. In other words, while a moderate level of noise produces just enough distraction to induce disfluency, leading to higher creativity, a very high level of noise induces too much distraction so as to actually reduce the amount of processing, leading to lower creativity.

Zhu et al. also note that further research is needed to determine exactly what their findings mean; they expressed particular interest in how different types of noise might affect creativity.

The larger point of this research is that there are simple things we can do to boost our creativity: blue rooms and a moderate amount of ambient noise, for instance. It also suggests that creativity doesn’t come to us from the muses; it’s a skill carried out by certain parts of the brain and influenced by certain aspects of the physical environment. Moreover, creativity is improvable; it’s not reserved for a chosen few.

Perhaps one advantage creative people have, then, is an ability to find environments that maximize their output. Maybe. At any rate, it’s worth keeping these studies in mind the next time you need to supplement your creative output.


Sigmund Freud postulated that dreaming is a reflection of the unleashed id; it represents one’s deep sexual fantasies and frustrations implanted during childhood. But what happens when we fall asleep is usually much less dramatic; we dream about the problems of everyday life. Now scientists understand dreaming as an integral part of the creative process – it’s not just about the problems of everyday life, it’s about solving them.

In 2004, the neuroscientists Ullrich Wagner and Jan Born published a paper in Nature that examined the relationship between sleep and problem solving. In one experiment, they tasked participants with transforming a long list of number strings. The task required participants to apply a set of algorithms that would scare off all but a handful of math geeks. However, the researchers built in an elegant shortcut that made the task easier. How many people, Wagner and Born asked, would catch it?


Starting today I will be a blogger for BigThink.com. For those of you who are not familiar, Big Think is a wonderful website with great content. Here’s what they’re all about:

In our digital age, we’re drowning in information. The web offers us infinite data points—news stories, tweets, wikis, status updates, etc—but very little to connect the dots or illuminate the larger patterns linking them together. Here at Big Think, we believe that success in the future is about knowing the ideas that allow you to manage and master this universe of information. Therefore, we aim to help you move above and beyond random information, toward real knowledge, offering big ideas from fields outside your own that you can apply toward the questions and challenges in your own life.

My blog is called Moments of Genius. Here’s a quick summary of what it will be about.

Everybody has their own pet theory about how to generate ideas and be productive: some chug caffeine, others relax; some work in groups, others work alone; some work at night, others in the morning. This blog draws from recent findings in cognitive science to inform and answer these questions and others like them. It’s for the creative professional, the businessperson or the artist who seeks to create new ideas and work efficiently. It’s about translating findings in psychology and neuroscience so we can be more productive, make better decisions, be more creative, collaborate efficiently and solve problems effectively.

My first post went up today. It’s an expansion of a previous Why We Reason post on childhood and creativity. Here’s the gist:

The Monster Engine is one of the best ideas I’ve come across. It’s a book, demonstration, lecture and gallery exhibition created by Dave Devries. The premise is simple: children draw pictures of monsters and Devries paints them realistically. According to the website, the idea was born in 1998 when Devries took an interest in his niece’s doodles. As a comic addict, Devries wondered if he could use color, texture and shading to bring his niece’s drawings to life.

But Devries had a larger goal: he wanted to always see things as a child does. Why? In many ways, children flourish where adults fail. Children are more creative and are natural inventors. Their worldview is incomplete and demands discovery. They prosper because they embrace their ignorance instead of ignoring it. And they are willing to explore, investigate and put their ideas to the test because they are willing to fail. Unlike adults, they don’t care how other people perceive or evaluate their ideas, and they’re unconcerned with the impossible or what doesn’t work.

So what does this mean for Why We Reason? In short, Why We Reason will remain for the time being. I still have a few WWR posts in the works and they need to see the light of day. However, some changes will be made in the near future. In the meantime, I encourage my readers to bookmark, tweet, share, etc., my posts on Big Think.


By many accounts, Bjorn Borg is one of the greatest tennis players of all time. The former world no. 1 won 11 Grand Slam titles between 1974 and 1981. Most remarkably, he won 82 percent of all the professional matches he played. He had skills.

But that’s not all he had. Like many athletes, he had superstitions. To prepare for Wimbledon, Borg grew a beard and wore the same Fila shirt during his matches. It worked, too: he holds a career record of 51-4 at Wimbledon, along with the five consecutive singles titles he won in the second half of the 1970s. Borg’s “lucky beard,” as the Swedes termed it, has become a staple in other sports. Today, NFL, NBA and NHL players sport the “playoff beard” in search of a competitive edge.

Superstitions are, by many accounts, irrational and scientifically backwards. However, empirical evidence suggests that this might not be entirely true. A few years ago social psychologist Lysann Damisch teamed up with Barbara Stoberock and Thomas Mussweiler to measure what effects, if any, superstitions had in sports.

In one experiment, the social scientists tested the “lucky ball” myth by having two groups of participants attempt ten golf putts from a distance of 100cm. Like good psychologists, they told one group of participants that the ball they were about to use “turned out to be lucky” (superstition-activated condition). In contrast, they told the second group they were using a ball that everyone used (control condition). The researchers found that participants in the superstition-activated condition drained 35 percent more putts than participants in the control condition.

In a related study conducted last year, undergraduate Charles Lee of the University of Virginia joined with Sally Linkenauger to see whether superstitious beliefs about equipment affect performance. They recruited 41 undergraduates with backgrounds in golf. Similar to Damisch’s team, Lee and Linkenauger told half of the students they were using a really nice putter, and the other half that the putter had previously been owned by British Open champion and PGA Tour player Ben Curtis, who was known to be an expert putter. (Importantly, all of the undergraduates knew who Curtis was.) Their findings were telling: students who putted with “Curtis’s” putter sank, on average, one and a half more putts.

What accounts for these findings? The basis for superstitious beliefs is sheer fantasy, but their effects can be real and consequential. For example, a 2010 paper by Travis Ng of Hong Kong University found that superstitions surrounding ‘8’ and ‘4’ in Cantonese – 8 is considered lucky because it rhymes with prosper and prosperity whereas 4 is unlucky because it rhymes with die or death – affected the economics of license plates. Here’s the BPS Research Digest:

Controlling for visual factors that affect price (for example, plates with fewer digits are more sought-after) Ng’s team found that an ordinary 4-digit plate with one extra lucky ‘8’ was sold 63.5 per cent higher on average. An extra unlucky ‘4’ by contrast diminished the average 4-digit plate value by 11 per cent. These effects aren’t trivial. Replacing the ‘7’ in a standard 4-digit plate with an ‘8’ would boost its value by roughly $400.

So why do we believe in superstitions in the first place? Some cases are clearer than others. In terms of athletic performance, evidence suggests that a superstitious belief in certain objects (Curtis’s putter) and habits (Borg’s beard) gives us confidence, which in turn improves performance. In the study involving Ben Curtis’s putter, it’s the club’s history that’s relevant. For the same reason people would like to wear a sweater knitted by Mother Teresa or use Einstein’s pencil, we believe that equipment a legend used will give us an advantage on the playing field. In the case of Borg’s beard, the habit provides structure and security to an otherwise disorganized or nervous pre-Wimbledon routine.

It’s also hypothesized that superstitions arise from our natural tendency to seek evidence of intentionality in the world. We want reasons for things, and we want those reasons to have an author (e.g., God, destiny, karma, the force). We hate randomness. Many religious beliefs come about from teleological reasoning along these lines. And like superstitions in sports, there are real consequences. This is what research from anthropologist Richard Sosis suggests. As a recent New York Times article reports:

[Sosis] found that in Israel during the second intifada in the early 2000s, 36 percent of secular women in the town of Tzfat recited psalms in response to the violence. Compared with those who did not recite psalms, he found, those women benefited from reduced anxiety: they felt more comfortable entering crowds, going shopping and riding buses — a result, he concluded, of their increased sense of control.

All of this research encourages the idea that superstitious beliefs might not be entirely irrational. Although there is no empirical evidence that superstitions are true in and of themselves, their behavioral consequences tell a different story.

There are downsides, of course, to fantastical thinking – athletes often become overly obsessed with pregame rituals, and many religious beliefs have led to less-than-ideal outcomes. But superstitions are essential. For better or for worse, they are a natural component of our cognition.


For Dylan, “Like a Rolling Stone” began as “a long piece of vomit” – at least, that’s what he told two reporters back in 1965. As the story goes, Dylan, who was at the tail end of a grueling tour that took his pre-electric act across the United States and into Europe, decided to quit music and move to a small cabin in upstate New York to rethink his creative direction. He was sick of answering the same questions over and over again. He was sick of singing the same songs over and over again. He wanted to liberate his mind.

This is why “Like a Rolling Stone” began as a twenty-page ramble. It was, as Dylan described it, a regurgitation of dissatisfactions and curiosities. What came next was Dylan’s true talent. Like a wood sculptor, he whittled away at his rough draft. He cherry-picked the good parts and threw away the bad. He dissected his words to try to understand what his message was. Eventually, Dylan headed to the studio with a clearer vision, and today “Like a Rolling Stone” stands as one of the very best.

What’s interesting is how Dylan approached the writing process. The song started as a splattering of ideas. Dylan wasn’t even trying to write a song; initially, he didn’t care about verses or choruses. He compared the writing process to vomiting because he was trying to bring an idea that infected his thinking from the inside to the outside of his body.

His strategy isn’t unique. In fact, it resembles the approach of many other artists throughout history. For example, in the Fall 1975 issue of The Paris Review, the Pulitzer Prize winner and Nobel laureate John Steinbeck gave this piece of advice about writing: “Write freely and as rapidly as possible and throw the whole thing on paper. Never correct or rewrite until the whole thing is down. Rewrite in process is usually found to be an excuse for not going on. It also interferes with flow and rhythm which can only come from a kind of unconscious association with the material.” As the saying goes, perfection is achieved not when there is nothing left to add, but when there is nothing left to take away.

This principle doesn’t just show itself in art. Economies, too, advance through continuous cycles of innovation and wealth creation followed by stale ideas and bankruptcies. The Austrian economist Joseph Schumpeter popularized the term “creative destruction” to describe the simultaneous accumulation and annihilation of wealth under capitalism. As Schumpeter saw it, for every successful entrepreneur there were dozens of failures. But this was a good thing; capitalism was to be understood as an evolutionary process in which good ideas prevail over bad ones.

With these thoughts in mind, consider a study released this month by Simone Ritter of Radboud University in the Netherlands, with help from Rick B. van Baaren and Ap Dijksterhuis. For the first experiment, the scientists recruited 112 university students and gave them two minutes to come up with creative ideas for relatively harmless problems (e.g., improving the experience of waiting in line at a supermarket). The subjects were divided into two groups: the first went straight to work, while the second first performed an unrelated task for two minutes to distract the conscious mind.

The psychologists’ first finding wasn’t too eye-opening: both groups – conscious and distracted – generated the same number of ideas. But the second finding was more intriguing. Here’s Jonah Lehrer describing the results:

After writing down as many ideas as they could think of, both groups were asked to choose which of their ideas were the most creative. Although there was no difference in idea generation, giving the unconscious a few minutes now proved to be a big advantage, as those who had been distracted were much better at identifying their best ideas. (An independent panel of experts scored all of the ideas.) While those in the conscious condition only picked their most innovative concepts about 20 percent of the time — they confused their genius with their mediocrity — those who had been distracted located their best ideas about 55 percent of the time. In other words, they were twice as good at figuring out which concepts deserved more attention.

When it comes to writing an essay for college, pitching a business plan or creating a work of art, we are hard-wired to believe that our output is above average. As a result, we are blind to what needs improvement. It’s not just that we can’t see the holes and errors; we don’t think they exist. What’s interesting about Ritter’s findings is that they give us a strategy for overcoming this overconfidence. The lesson from her research is that in order to recognize our imperfections we must step back and be dilettantes. In other words: get distracted, and don’t marry the first draft.

And this brings me back to Dylan’s vomit and Steinbeck’s advice. The reason we should “never correct or rewrite until the whole thing is down” is that we initially don’t know which of our ideas are worthwhile. It’s only after we get everything down that we are able to recognize what works and what doesn’t. This is the lesson from Ritter’s research: we need to give the unconscious mind time to mull it over so it can convince the conscious mind to make adjustments. Or, as Nietzsche said in Human, All Too Human: “The imagination of the good artist or thinker produces continuously good, mediocre or bad things, but his judgment, trained and sharpened to a fine point, rejects, selects, connects…. All great artists and thinkers are great workers, indefatigable not only in inventing, but also in rejecting, sifting, transforming, ordering.”


Do opposites attract? Pop culture thinks so. Movies like Pretty Woman and The Notebook suggest that couples with virtually nothing in common are destined for each other. Psychological studies paint a different picture. When people have a choice, they seek people who are just like them. Psychologists call this the similarity-attraction effect (SAE) and it shows itself across many cultures.

The SAE is especially pronounced between romantic couples. For example, in the early 1990s the Chicago Sex Survey collected data to find out where and how Americans met their partners. It found that “people search for – or, in any case, find – partners they resemble and partners who are of comparable ‘quality’… the great majority of marriages exhibit homogamy on virtually all measured traits, ranging from age to education to ethnicity.”

The same is true of our friends. This is what a recent paper by Angela Bahns, Kate Pickett and Christian Crandall at Wellesley College and the University of Kansas demonstrates. The researchers were interested in how the social diversity of a college influenced social relationships: Did more socially diverse schools lead to more diverse relationships?

To find out, they compared the relationships of students at a large state university (the University of Kansas) with those at four small colleges in Kansas. They did this by asking students about their demographic information, behaviors and beliefs (opinions on birth control and underage drinking, for instance). They found that “greater human diversity within an environment leads to less personal diversity.” The students at the University of Kansas, in other words, tended to form more homogeneous social groups than their peers at smaller schools. Ironically, this means that the more opportunities we have to pursue diverse relationships, the more we gravitate toward like-minded people.

This can be a problem. Several studies conducted over the last decade illustrate the importance of intellectual diversity. An analysis of Stanford Business School graduates found that “entrepreneurs with more ‘entropic’ and ‘diverse’ social networks scored three times higher on a metric of innovation, suggesting that the ability to access ‘non-redundant information from peers’ is a crucial source of new ideas.” Similarly, Brian Uzzi and Jarrett Spiro found that the most successful Broadway musicals combined new blood with industry veterans; too much familiarity or novelty within the staff was a killer of quality content.

In the context of marriage, the SAE is a good thing. Marriages usually succeed when two like-minded people are involved; similarity of personality traits is a good predictor of marital stability and happiness. In fact, it's especially unlikely for people with dissimilar personalities to be attracted to each other. It's not merely that opposites don't attract: they often repel.

If opposites don't attract romantically, why are we so prone to believe that they do? For one thing, we humans love romantic stories. From Romeo and Juliet to EVE and WALL-E to Katniss and Peeta, we can't help but fantasize about pairs of star-crossed lovers. Unfortunately, because stories sacrifice reality for more passionate and heart-wrenching plots, our perception of romantic relationships is heavily distorted. Not everything has a happy ending.

In brief, then, romantic relationships thrive on similarity. The opposite is true for your social and professional circles: when it comes to generating ideas, being creative or entrepreneurial, intellectually diverse social circles are key.

It’s difficult to make objective predictions about our future self. No matter how hard we try, we’re always influenced by the present. In one study, for example, researchers phoned people around the country and asked them how satisfied they were with their lives. They found that “when people who lived in cities that happened to be having nice weather that day imagined their lives, they reported that their lives were relatively happy; but when people who lived in cities that happened to be having bad weather that day imagined their lives, they reported that their lives were relatively unhappy.”

Similarly, a few years ago researchers went to a local gym and asked people who had just finished working out if food or water would be more important if they were lost in the woods. Like good social scientists, they asked the same question to people who were just about to work out. They found that 92 percent of the folks who just finished working out said that water would be more important; only 61 percent of people who were about to work out made the same prediction.

Physical states are difficult to transcend, and they often cause us to project our feelings onto everyone else. If I’m cold, you must be too. If I like the food, you should too. We are excellent self-projectors (or maybe that’s just me). Sometimes there are more consequential downsides to this uniquely human ability. And this brings me to a new study led by Ed O’Brien out of the University of Michigan recently published in Psychological Science. (via Maia Szalavitz at Time.com)

The researchers braved the cold for the first experiment. They approached subjects at a bus stop in January (sometimes the temperature was as low as -14 degrees F) and asked them to read a short story about a hiker who was taking a break from campaigning when he got lost in the woods without adequate food, water and clothing. For half of the subjects the lost hiker was a left leaning and pro-gay rights Democrat; the other half read about a right-wing Republican. Next, the researchers asked the subjects their political views and which feeling was most unpleasant for the stranded hiker – being thirsty, hungry or cold. (For female participants, the hiker was described as female; for men, the hiker was male.) While these chilly interviews were being conducted O’Brien and his team ran the same study in a cozy library. Did the two groups show different answers?

The first thing O'Brien found was consistent with the gym study: 94 percent of the people waiting for the bus said the cold was the most unpleasant feeling for the hiker, compared to only 57 percent of the library dwellers. Here's where things got interesting: "If participants disagreed with the hiker's politics… their own personal physical state had no bearing on their response: people chose the cold in equal numbers, regardless of where they were interviewed." In other words, we don't show as much empathy toward people who don't share our political beliefs.

Their findings are disheartening given the current political climate in the United States. If we cannot empathize with someone who doesn’t share our political views, how are we supposed to engage in rational discourse with them? In order to work out our differences, it seems like we need to first recognize that we are the same deep down.

The larger problem is that compassion, empathy and moral sentiments toward other people bind and blind. As one author says, "we all get sucked into tribal moral communities, circling around something sacred and then sharing post-hoc arguments about why we are so right and they are so wrong. We think the other side is blind to truth, reason, science, and common sense, but in fact everyone goes blind when talking about their sacred objects."

How do we break out of our political matrices? Here's one idea: let's take the red pill and realize that we can't all be right, while remembering that we all have something to contribute. This is something the Asian religions got right. Yin and Yang aren't enemies because, like night and day, both are necessary for the functioning of the world. In Hinduism, Vishnu the preserver (who stands for conservative principles) and Shiva the destroyer (who stands for liberal principles), two of the high gods, cooperate to preserve the universe. It's a cliché worth repeating: let's work together to get along.

Growing up has its benefits. As we age, our intellect sharpens and our willpower strengthens. We come to control our thoughts and desires; we identify goals and hone our skills.

However, growing up comes at a cost: we lose our natural desire to discover and invent; we become more self-conscious and less willing to fail. A study conducted between 1959 and 1964 involving 350 children found that around 4th grade our tendency to daydream and wonder declines sharply. In other words, Picasso was right: “Every child is an artist. The problem is how to remain an artist once we grow up.”

Age doesn't necessarily drain our creative juices – creative geniuses like Steve Jobs and Steven Spielberg somehow managed to maintain a sense of wonderment through their adult years – but when we make the leap from elementary school to middle school, our worldview becomes more realistic and cynical. The question is: what did Jobs and Spielberg do differently? How do we maintain our naiveté?

A study conducted several years ago by Darya Zabelina and Michael Robinson of North Dakota State University gives us a simple remedy. The psychologists divided a large group of undergraduates into two groups. The first group was given the following prompt:

You are 7 years old. School is canceled, and you have the entire day to yourself. What would you do? Where would you go? Who would you see?

The second group was given the same prompt minus the first sentence. This means they didn't imagine themselves as seven-year-olds – they remained in their adult mindset.

Next, the psychologists asked their subjects to take ten minutes to write a response. Afterwards the subjects were given various tests of creativity, such as inventing alternative uses for an old tire or completing incomplete sketches (along with other tasks from the Torrance test of creativity). Zabelina and Robinson found that "individuals [in] the mindset condition involving childlike thinking… exhibited higher levels of creative originality than did those in the control condition." This effect was especially pronounced among subjects who identified themselves as "introverts."

What happens to our innate creativity as we age? Zabelina and Robinson discuss a few reasons. The first is that regions of the frontal cortex – a part of the brain responsible for rule-based behavior – are not fully developed until our teenage years. This means that when we are young our thoughts are free-flowing and without inhibitions; curiosity, not logic and reason, guides our intellectual musings. The second is that current educational practices discourage creativity. As famed TED speaker Ken Robinson said: "the whole system of public education around the world is a protracted process of university entrance. And the consequence is that many highly talented, brilliant, creative people think they're not, because the thing they were good at at school wasn't valued, or was actually stigmatized."

No matter the reasons, the authors stress, adults can still tap into their more imaginative younger selves. The useful cognitive tools that come with adulthood tempt us to keep our imagination from wondering about the impossible, but as so many intellectuals and inventors have remarked throughout history, challenging what's possible is a necessary starting point. As Jobs said, "the people who are crazy enough to think they can change the world are the ones who do."

To be sure, it's often beneficial to approach life with an adult mindset – you probably don't want to get too creative with your taxes – but when it comes to using your imagination, thinking of yourself as a child facilitates more original thinking.