WHAT SCIENCE AND SUPER-ACHIEVERS TEACH US ABOUT HUMAN POTENTIAL

The author

David Shenk is the national bestselling author of five previous books, including The Forgetting ("remarkable" - Los Angeles Times), Data Smog ("indispensable" - New York Times), and The Immortal Game ("superb" - Wall Street Journal). He is a correspondent for TheAtlantic.com, and has contributed to National Geographic, Slate, The New York Times, Gourmet, Harper's, The New Yorker, NPR, and PBS.

December 04, 2013

I was invited today onto public radio's The Takeaway with the great John Hockenberry to discuss a new Nature Neuroscience study from Emory University demonstrating that memories can be biologically inherited, via epigenetics, from parent to child and even to grandchild. For those of you already familiar with fourteen years of epigenetic studies, this is mostly just further proof that epigenetic inheritance is very real. For those who have not yet heard of epigenetics, or genetic expression, I promise that this will blow your mind. The 20th century was about genetics. The 21st is going to be about epigenetics.

August 05, 2009

I appreciate the intense reactions to this blog so far, and respect the lingering skepticism. (Some of the nastiness I could do without, but it wouldn't be the Internet without some tasty pot-shots). I certainly didn't expect to win over the entire crowd with a handful of short overview pieces containing little evidence and no depth. I get that smart Atlantic readers are going to scrutinize this stuff.

After three years of discussing it among friends, I also understand that this is an issue that often provokes a visceral response. We all have strong opinions about how we became who we are. We need to have these opinions--it's a part of forming an identity. After a century of genetic dogma, terms like "innate" and "gifted" are baked right into our language and thinking. I don't mean to suggest that we all believe that genes control everything. Instead, most of us believe in "nature" followed by "nurture": genes dispense various design instructions as our body is formed in utero, priming us with a certain level of intellectual, creative, and athletic potential; following this, environmental influences develop that potential to some extent or another. This is what we know to be true, and it makes perfect sense.

Well, it turns out not to work that way. But no one familiar with the new science of development expects these old beliefs to simply wash away in a few weeks or months just because a few smart-ass writers come along and say they know better.

As we try to present this stuff, there are a hundred sand traps of understandable confusion. When I argue that "innate" doesn't really exist, it may seem like I'm making the blank slate argument -- which I'm not. When I argue that talent is a process, it may seem like I'm arguing that anyone can do anything, which I'm not. When I argue that we can't really see our individual potential until we've applied extraordinary resources over many years, it may seem like I'm arguing that genetic differences don't matter -- which I'm not. When I criticize The Bell Curve, it may look like I'm an agent of the left pushing a liberal egalitarian agenda, which I'm not.*

What I am pushing is the consideration of a whole new paradigm. In doing so, I am of course just a conduit.

While the evidence assembled by a team of University of Iowa researchers is quite complex, their renewed argument is simple: "nature vs. nurture" doesn't adequately explain how we become who we are. That notion needs to be replaced.

Lead author John Spencer:

The nature-nurture debate has a pervasive influence on our lives, affecting the framework of research in child development, biology, neuroscience, personality and dozens of other fields. People have tried for centuries to shift the debate one way or the other, and it's just been a pendulum swinging back and forth. We're taking the radical position that the smarter thing is to just say 'neither' -- to throw out the debate as it has been historically framed and embrace the alternative perspective provided by developmental systems theory.**

"Developmental systems theory" is a vague mouthful, and the scientists behind these observations readily admit that they haven't yet found the most compelling new language to present their ideas to the public. But the basic idea, as I've written in previous posts, is that genes are not static; they are dynamic. Genes interact with the environment to form traits. The more closely scientists look at claims of so-called "hard-wired" behavior and abilities, the more they turn up evidence that actions and talents are formed in conjunction with the culture around them.

John Spencer again:

Researchers sometimes claim we're hard-wired for things, but when you peel through the layers of the experiments, the details matter and suddenly the evidence doesn't seem so compelling...When people say there's an innate constraint, they're making suppositions about what came before the behavior in question. Instead of acknowledging that at 12 months a lot of development has already happened and we don't exactly know what came before this particular behavior, researchers take the easy way out and conclude that there must be inborn constraints. That's the predicament scientists have gotten themselves into.

Imprinting is one of many examples reviewed by the Iowa researchers. In 1935, Viennese zoologist Konrad Lorenz famously discovered that newborn chicks whose eggs were incubated in isolation would still correctly pick the call of their mother over that of another animal. It seemed the perfect little proof of innate ability.

But in 1997, Gilbert Gottlieb discovered the flaw in that assumption. It turned out that when fetal chicks were deprived of the ability to make vocal sounds inside their own eggs -- that is, the ability to teach themselves what their species sounded like -- they were unable to pick the correct maternal sound from various animals.

Another famously "innate" quality is dead reckoning, the ability of fish, birds, and mammals (including humans) to establish their current location based on past locations and movement history. How could young geese know how to fly home from 100 meters without trial and error? With no apparent answer to the mystery, the word "innate" was again used as a catch-all explanation. Then it became clear that mother geese train their goslings' navigational skills through daily walks.

How could baby chicks find their way back to a mother without clear sight of her? It turned out that they simply reversed the directions they had taken when getting lost.

One by one, the Iowa researchers show, scientists have declared basic abilities to be explainable only by hard-wiring, only to have closer inspection and better tools later reveal a slow learning process. The consistent refrain: abilities form in conjunction with development, community, and context. Genes matter, but actual results require genetic expression in conjunction with the environment.

(One big problem with this new paradigm, explains John Spencer, "is that it's much more complicated to explain why the evidence is on shaky ground, and often the one-liner wins out over the 10-minute explanation.")

The Iowa paper also delves deeply into claims of human language innateness, including what is known as "shape bias." "Shape bias," the authors write, "simplifies the word learning situation and thereby aids vocabulary development, but it is not innate. Rather, it is the emergent product of a step-by-step cascade."

What does all of this have to do with Einstein's genius or your piano playing? Developmental systems theory tells us that, while genetic differences do matter, they cannot, on their own, determine what we become. From there, the whole idea of innate talent falls apart.

As this blog continues, you'll meet more of the scientists who are documenting and shaping these ideas. One of the things I'd like to do is bring them together as a community and give their umbrella notion a more accessible name. "Developmental genetics" is one possibility. "Environmental genetics" is another.

Suggestions are welcome.

_____________

[Thanks to Mark Blumberg, one of the University of Iowa authors and editor-in-chief of Behavioral Neuroscience.]

_____________

Notes

* I am guilty of being a liberal on most issues, and there are elements of this new paradigm that gel nicely with a liberal sensibility; but there are also some very uncomfortable moral implications to come to terms with. Every writer has biases to be sure, but self-respecting journalists don't ignore or cherry-pick information because they like its political ramifications. I didn't write Data Smog because I wanted to bring down the Internet; I didn't offer some sanguine views on new surveillance technologies because I desire a police state, and I haven't been picking and choosing genetics and intelligence studies to prop up the Obama administration.

** These John Spencer quotes are taken from a University of Iowa press release about the journal article.

July 28, 2009

"[Some] assert that an individual's intelligence is a fixed quantity which cannot be increased. We must protest and react against this brutal pessimism."

- Alfred Binet, inventor of the original IQ test, 1909

Last week, I argued that our 21st century understanding of genetics invalidates the idea of fixed, innate abilities. Genes influence everything but determine almost nothing on their own.

What, then, is IQ? Conventional wisdom says that IQ scores reveal our native intelligence. According to this view, IQ tests are different from school grades, different from SAT scores, different from any other test you will ever take, because they somehow reveal the core, innate abilities of each person's brain: your clock speed, your RAM, your absolute limit.

That's what Stanford psychologist Lewis Terman wanted us to believe when he introduced the American version of the IQ test in 1916. (This was quite the opposite intention of the test's original co-inventor, Alfred Binet. But that's a history lesson we'll return to another time.)

What Terman had actually come up with was a deceptively simple system for ranking academic progress. His Stanford-Binet tests measured many different skills, and then scored the results so that the median was always 100. If you had an IQ score of 100, it simply meant that half of the test-takers your age had done better and half had done worse.

These tests were impressively stable, which meant that, over time, most people ended up in roughly the same place in the pack. If you had tested in the 60th percentile at age 10, chances were pretty good that you'd test close to the 60th percentile at age 12 and age 14.

But did this stability prove that the tests revealed innate intelligence?

Far from it. The reality is that students performing at the top of the class in 4th grade tend to be the same students performing at the top of the class in 12th grade, due to many factors that tend to remain stable in students' lives: family, lifestyle, resources, etc.

Being branded with a low IQ at a young age, in other words, is like being born poor. Due to family circumstances and the mechanisms of society, most people born poor will remain poor throughout their lives. But that doesn't mean anyone is *innately* poor or destined to be poor; there is always potential for any poor person to become rich.

The happy reality is that IQ scores:

A) measure developed skills, not native intelligence.

B) can change dramatically.

C) don't say anything about a person's intellectual limits.

More details below.

Coming next in this blog: Should kids know their own IQs?

____________________

AN IQ FAQ

What is IQ?

IQ (short for "intelligence quotient") is a score derived from a collection of tests which rank academic achievement within a particular age group.

What do IQ tests measure?

IQ tests measure current academic abilities -- not any sort of fixed, innate intelligence. More specifically, the best-known IQ battery, "Stanford-Binet 5," measures Fluid Reasoning, Knowledge, Quantitative Reasoning, Visual-Spatial Processing, and Working Memory. Collectively, these skills are known as "symbolic logic." Among other things, IQ tests do not measure creativity;[i] they do not measure "practical intelligence" (otherwise known as "street smarts");[ii] and they do not measure what some psychologists call "emotional intelligence."

Harvard's Howard Gardner:

"The tasks featured in the IQ test are decidedly microscopic, are often unrelated to one another, and . . . are remote, in many cases, from everyday life. They rely heavily upon language and upon a person's skill in defining words, in knowing facts about the world, in finding connections (and differences) among verbal concepts . . . . Moreover, the intelligence test reveals little about an individual's potential for further growth."[iii]

Tufts' Robert Sternberg:

IQ problems tend to be "clearly defined, come with all the information needed to solve them, have only a single right answer, which can be reached by only a single method, [and are] disembodied from ordinary experience . . . . Practical problems, in contrast, tend to require problem recognition and formulation . . . require information seeking, have various acceptable solutions, be embedded in and require prior everyday experience, and require motivation and personal involvement."[iv]

How are IQ scores determined?

Raw individual test scores are converted so that they map onto a bell curve representing the entire population of same-age students. The average score is always 100.

- An IQ score of 100 means that 50% of the people in your age group scored better, and 50% scored worse.

- An IQ score of 85 means that 84.13% of the people in your age group scored better, and 15.87% scored worse.

- An IQ score of 130 means that 2.28% of the people in your age group scored better, and 97.72% scored worse.
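The arithmetic behind these percentiles is just the normal distribution: an IQ score is a z-score rescaled to a mean of 100 and a standard deviation of 15 (the SD used by most modern tests, assumed here; some older Stanford-Binet versions used 16). A minimal Python sketch, with illustrative function names of my own:

```python
from statistics import NormalDist

# Assumed norming: mean 100, standard deviation 15.
iq_dist = NormalDist(mu=100, sigma=15)

def percentile_below(iq):
    """Fraction of same-age test-takers expected to score below this IQ."""
    return iq_dist.cdf(iq)

def iq_from_raw(raw, raw_mean, raw_sd):
    """Convert a raw test score to an IQ score via the z-score transform."""
    z = (raw - raw_mean) / raw_sd
    return 100 + 15 * z

for iq in (85, 100, 130):
    below = percentile_below(iq) * 100
    print(f"IQ {iq}: {below:.2f}% scored worse, {100 - below:.2f}% scored better")
# IQ 85:  15.87% scored worse, 84.13% scored better
# IQ 100: 50.00% scored worse, 50.00% scored better
# IQ 130: 97.72% scored worse,  2.28% scored better
```

Note that the conversion is purely relative: an IQ of 130 just means a raw score two standard deviations above that year's same-age average, which is why the percentile figures above fall directly out of the bell curve.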

If IQ scores can change over time, why do most people's IQ scores stay reasonably stable?

What any individual can achieve with the right combination of assets and gumption is entirely different from what most people actually do achieve. Most people settle into a particular academic standing early in life and do not substantially deviate from that standing. That's the inertia of life and human circumstance.

So IQ scores don't imply any sort of fixed or innate intelligence?

Quite the contrary. We know that the abilities IQ measures are skills, and we know that people can earn these skills. "Intelligence," Robert Sternberg has declared, "represents a set of competencies in development." There is plenty of evidence, for example, that schooling raises overall academic intelligence.[vii] There is also evidence that most human beings are not reaching their cognitive or academic potential.[viii] Better schools and higher standards can raise the level of learning for nearly all students.

Don't genes limit our intelligence? Isn't intelligence "heritable"?

No, and no. Very sloppy science and journalism have led us to believe that what scientists call "heritability" (derived from twin studies) is the same thing as what we ordinary folk call "heredity." In fact, they are not even remotely the same thing. Genes certainly do have an impact on intelligence, and everyone has their own theoretical limits, but every indication is that most of us don't come close to our true intellectual potential. More on this here.

FOOTNOTES

[i] IQ scores do not identify the most successful and creative artists or scientists:

July 03, 2009

A large number of websites and even quite a few books will tell you that Wolfgang Amadeus Mozart's IQ was 165. They'll also reveal that Benjamin Franklin's IQ was 160, Charles Dickens' was 180, Isaac Newton's was 190, and Blaise Pascal's was 195.

There's only one small problem with this data: the IQ test was invented in the early 20th century -- long after all of these people were dead and buried.

Here's how this lunacy came about: The IQ test was first invented in France by Alfred Binet in the early 20th century as a way to measure academic skills and pick out the students who were not learning as fast as they could and should. It was not designed to separate innately smart people from less innately smart people. Binet, in fact, did not believe intelligence was innate. He saw intelligence not as a thing, but as a process of acquiring certain thinking skills. (He turned out to be quite correct.)

Then along came Lewis Terman, a Stanford psychologist in the early 20th century who preferred Francis Galton's idea of intelligence: a certain innate quality that each person is born with. Terman reinvented the IQ test and sold it to American intellectuals and policymakers as a way to separate the intellectual wheat from the idiotic chaff. Terman also began an epic study on geniuses entitled Genetic Studies of Genius.

Mind you, he had no proof that intelligence was gene-based. (We still don't have any such proof, contrary to what you might read elsewhere. See my post on heritability.)

Terman was well-funded and well-staffed. In 1926, he assigned one of his protégés, Catharine Cox, to somehow adapt their new IQ test to estimate the IQs of 301 well-known historical figures.

Here's the rub: Even in Terman's context, this made no sense. It was pure intellectual foolishness, even if you completely accepted his argument that IQ detected innate intelligence. That's not just because none of these people actually took an IQ test, but also because IQ tests only measure people's academic skills against other people their same age. The score is not a raw tally of right vs. wrong answers, but a weighted score comparing every test-taker's performance with every other same-age test-taker's performance in that particular year. 100 is always the median. A score of 100 means that 50% of the same-age students scored higher than you, and 50% scored lower.

So how could anyone possibly hope to go back in time, look at the work of dead people, and deduce their IQ score? It was impossible.

In her report[i], Cox acknowledged: "The correction attempted in the present report is a crude approximation . . ."

Cox and Terman assigned a score of 200 to their hero Francis Galton. That would make him one of the great geniuses of all time.

Thanks partly to this study, IQ has become one of the great myths of our time. It's going to take us another century to replace it with a more sensible understanding of intelligence.

__________________

[i] "The Early Mental Traits of Three Hundred Geniuses," by Catharine M. Cox, from Genetic Studies of Genius, edited by Lewis M. Terman. Stanford University Press, 1926.

June 17, 2009

Very nice story in the NYTimes today by Benedict Carey about the vexing complexity of gene-environment interaction. The story opens with this powerful news:

"One of the most celebrated findings in modern psychiatry — that a single gene helps determine one’s risk of depression in response to a divorce, a lost job or another serious reversal — has not held up to scientific scrutiny, researchers reported Tuesday."

The original 2003 study had revealed what many were calling a "gene for depression." People with a particular variant of a gene involved in the regulation of neurochemicals seemed much more susceptible to depression when thrust into certain depressing life situations.

But it's not that simple. In the vast majority of cases, genes don't dictate a trait or a specific response. They interact constantly with other genes and every other facet of a person's ongoing life. In school, we were taught that genes contain instructions on what each of us will be like. That was the view 100 years ago. Now we understand that life is the consequence of constant gene-environment interaction.

More from the Times story:

"The authors reanalyzed the data and found 'no evidence of an association between the serotonin gene and the risk of depression,' no matter what people’s life experience was, Dr. Merikangas said.

"By contrast, she said, a major stressful event, like divorce, in itself raised the risk of depression by 40 percent."

As a general rule, don't listen to anyone telling you that there's a "gene for" this or that. Even if there's a Ph.D. or M.D. at the end of the name, it's an old and misleading way of discussing genetics.

Thankfully, it's not just the science that's improving. Reporting on genetics has also been getting demonstrably better. Today's piece is a nice example, as is this extraordinary piece by Carl Zimmer from last November.

May 19, 2009

Over the next few months, I want to pay individual tribute to some of the great scientists who are helping us to understand genetics, talent and intelligence in a whole new way.

In our Data Smog world of endless info and hyper-complexity, we all rely on trusted sources more than ever. It's a chain of trust: journalists rely on experts they trust most; citizens rely on journalists they trust most.

For the book I just finished, I'd estimate that I spent about 1/3 of my research time trying to identify trusted scientists. It's an exhaustive process. One doesn't want to simply latch on to those with a shared ideology. Rather, you carefully wade through a sea of scientific research, develop a sense of the different approaches, and slowly intuit which few minds seem to have the best handle on what's out there.

Something very funny happened to me about two years ago as I was developing my short list of favorites. One of them turned out to be a neighbor.

Meet Professor Massimo Pigliucci.

His book bio said he was at the University of Tennessee, but by the time we got in touch, he happened to live a few hundred feet away from me in Brooklyn. This bit of luck ended up making my book stronger in a number of ways.

Massimo has for many years been a working (and teaching) biologist but has lately taken on the burdens of a public philosopher. He's helped me sort through the intricacies of gene expression and heritability, and helps others think through all sorts of other interesting matters in his blog "Rationally Speaking."

March 07, 2007

The prodigious savant Daniel Tammet was just profiled on 60 Minutes, sparking a provocative email from my brainy and combative step-uncle Stan; he wants to know how savant syndrome fits into, or conflicts with, my developing understanding of talent.

Tammet is as rare as it gets: there are only 50 or so prodigious (truly exceptional) savants out there. But there are thousands more savants who are highly-impressive in one way or another, and as a group they can offer us enormous insight into the workings of the brain and the nature of intelligence.

The lessons are surprising. At first blush, one might assume that savants are proof that biology trumps effort: everyone's brain has a slightly different circuitry and will perform accordingly; savants are at one extreme end of the spectrum, with very strange wiring that confers amazing ability.

The truth is a lot more interesting. Here's a Savant FAQ, informed by the work of Darold Treffert, one of the world's leading savant authorities.

What is savant syndrome?

Savant syndrome is the presence of unusual intellectual and/or artistic abilities in otherwise impaired individuals. It is seen in an estimated 1 in 10 persons with autism, and in roughly 1 in 1,000 persons with other mental impairments, including developmental disability, mental retardation, and other central nervous system injuries or diseases. Savantism occurs in many more males than females -- a 6:1 ratio.

Is it always present from birth?

No, and that turns out to have very important implications. Says Treffert: "Savant syndrome can be congenital, or it can be acquired following brain injury or disease later in infancy, childhood, or adult life. Recent reports of savant-type abilities emerging in previously healthy elderly persons with fronto-temporal dementia are particularly intriguing."

What are the specific abilities displayed by savants?

As a rule, they are right-hemisphere skills: music, art, math, spatial dexterity and calendar calculation -- what Treffert calls an "intriguingly narrow range of special abilities" made possible by a spectacular deployment of mechanical, or concrete (also called "implicit") memory.

What's the underlying cause?

No current theory can account for all the cases of savant syndrome, but the most prominent theory that plausibly covers most cases is an injury to the left part of the brain (in the womb, infancy, childhood or adulthood) which sparks a dramatic compensation by the right brain.

--- Treffert elaborates: "Some savants, because of prenatal, perinatal or postnatal central nervous system damage, from a variety of genetic, injury or disease processes have substituted right brain capacity in a compensatory manner for left brain dysfunction and limitation. Simultaneously, because of those same injurious factors, these savants have come to rely on more primitive cortico-striatal (procedural or habit) memory rather than higher level cortico-limbic (semantic or declarative) memory. This combination of right brain skills coupled with procedural memory produces the constellation of abilities and traits that is savant syndrome."

But how can a brain injury give someone exceptional abilities?

We know from centuries of medical history, including the emergence of various medical oddities over the years, that certain components in every brain are equipped with incredible technical capabilities -- capabilities normally suppressed by other components so that the brain can do its main job, which is to balance out function and help a person lead a normal life. For example, in my book The Forgetting, I discuss the famous Russian patient S., who literally remembered every detail he came across in his entire life. He could recite verbatim conversations or random number lists decades after the fact. Sounds cool, but this was actually a huge liability -- remembering every detail makes it impossible to form intelligent summaries of details, which is the basis of all intelligent thought and communication. The ability to forget -- to get rid of sensory detail -- turns out to be just as important in the brain as the ability to form new memories.

Similarly, savants become unhinged from the usual cerebral checks and balances. Treffert explains: "'Weak central coherence' theory (WCC) [is the ability/disability of] focusing on details rather than the whole....Not being distracted by more global patterns, the savant can focus on a single item or skill and perfect it." (He cites Frith & Happe, 1994).

Like a car spinning around and around because its steering wheel is stuck in the right-turn position, savants' severe brain injuries push them to focus all their time and energy away from the wide burden of social function and into one or more very narrow skills.

What are the lessons for normal functioning brains?

1. Savants don't have amazing abilities -- they acquire them.

Savant brain injuries, whether in the womb or much later on, don't instantly endow people with amazing powers -- rather, they set loose normally restricted brain mechanisms, allowing that person to hyper-focus on a certain skill set in a way that normally functioning minds cannot. Through their disability, they are able to develop amazing skill. As Daniel Coyle recently wrote in the NYT: "Savants' true expertise, the research suggests, is in their ability to practice obsessively, even when it doesn't look as if they're practicing."

2. We can acquire them too.

Although it's a far more cumbersome process, anyone with a normally functioning brain can also develop advanced -- and even extraordinary -- skills. Ericsson, Dweck et al have shown some paths to get there, and Treffert argues that we may be able to develop further training methods based on what we're learning from savant brains. "Does some Rain Man ability -- savant-like skill and capacity -- exist in each of us?" posits Treffert. "Probably so. [The] more primitive memory circuitry, and right brain capacity, both still exist in each of us. However because of their inherent, utilitarian usefulness we have generally come to rely more heavily on left (dominant) hemisphere functions such as language, logical & sequential thinking, for example, than on right (non-dominant) hemisphere skills. Likewise in our day to day functioning we have come to generally use and depend upon semantic or declarative memory much more than using our more primitive, and less facile, procedural or habit memory capabilities. The question becomes then, is it possible to tap and use those still existent, but less frequently used, capacities and circuits, with some of their savant-like characteristics, in those of us more wedded to left brain capacity and higher level memory? . . . I am convinced there is."

March 05, 2007

This has been a terrific couple of weeks for anyone wanting to better understand talent -- several smart magazine and newspaper pieces have zeroed in on new, critical data. Daniel Coyle has a solid piece in yesterday's NYTimes Sports Magazine that nicely combines Anders Ericsson's work on "deliberate practice" with some very recent findings about myelin, the fatty insulation around nerve fibers that makes electrical nerve signals more efficient (Ishibashi et al, 2006; Fields, 2006).

Here's the connection:

It is now very well established that persons of great skill in any field have spent many years carefully honing their technique (this includes savants, who, by nature of their disability, are able to focus obsessively and persistently on math or music or art, effectively tuning out distractions). Why does high-level skill take so much time and steady effort to develop? It turns out that this slow, patient persistence is exactly what myelin needs to become a thicker and more efficient insulator. You can't rush that process. "In neurology, myelin is being seen as an epiphany," NIH's Douglas Fields told Coyle. "This is a new dimension that may help us understand a great deal about how the brain works, especially about how we gain skills."

Coyle also looks at the current epicenters of great sports training -- the Spartak tennis center in Russia, golfers in South Korea, baseball players in the Dominican Republic and Venezuela. The common thread, he observes, is an obsessive focus on technique. Each of these places is an incubator for deliberate practice. Harnessing the competitive drive comes later (at Spartak, students aren't allowed to compete in tournaments for at least three years).

***

Are some people born with more efficient myelin-boosters than others? Maybe so. Maybe, on top of the years and years of persistent development of technique, Anna Kournikova and Tiger Woods and Niccolò Paganini also got lucky in the genetic lottery. But to anyone following the last few years of research, genetic differences seem less and less relevant. Here's why:

1. No one has actually found these much-vaunted genetic differences relating to skill and talent. Maybe they're connected to intelligence, maybe persistence -- but we haven't actually found them yet. Meanwhile, Ericsson, Fields, Dweck, et al have exhaustively documented various external influences.

2. Regardless of what differences we're born with, evidence suggests that:

- most people do not come remotely close to achieving their genetic potential (Ericsson, Ceci)

- high-level achievement is simply impossible without hard work and persistence (Ericsson et al)

3. We know from Carol Dweck's definitive research that no one benefits from a mindset that relies on their "natural" abilities. Students encouraged to rely on their natural gifts stagnate, as do poor-performing students told that they are limited by some disability. Conversely, students of every caliber perform better when they are encouraged to equate hard work with results.

February 28, 2007

Maia Szalavitz has an interesting piece in yesterday's Washington Post about the national mania for diagnosing kids:

"Increasing numbers of children are given increasingly specific labels,
ranging from psychiatric and neurological diagnoses such as Asperger's
and attention-deficit disorder to educational descriptors including
"gifted" and "learning disabled."

The main problem is that these labels tend to overwhelm parent, child and teacher with a fixed and false set of expectations. She cites Stanford's Carol Dweck, author Alissa Quart and psychiatrist Bruce Perry, all of whom insist that abilities are not fixed.

"Recent research in
neuroscience bolsters the idea that people can and do change. Says
Perry: 'The brain is like a muscle: The areas that are used grow and
improve while those which aren't, don't.'"

Kids diagnosed with a disability need to understand that there are no fixed limits on what they can achieve. "It's incumbent on parents," says Dweck, "to explain that 'Well, you may be wired a little differently; this might make it more difficult for you; you might have to work harder and use different strategies,' as opposed to 'This means you can't learn.'"

And at the other end of the spectrum, kids labeled as "gifted" need to understand that success will only come with effort and a willingness to take risks. "Children who believe in permanent traits like fixed intelligence," Dweck explains, "are actually vulnerable because when something goes wrong they think they don't deserve the label anymore."

February 22, 2007

Another important facet of Po Bronson's recent article: it touches on the nature and nurture of persistence. Is an individual's level of persistence hard-wired and immutable, or can it be increased or decreased?

This is critical because, as we know anecdotally and as has been demonstrated by researchers (Renzulli, 1978, and many other studies), persistence is an essential component of greatness. Exceptional skill may look effortless -- the spectacular putt or pirouette -- but getting there takes relentless dedication, years of practice and humility. "It's not that I'm so smart," Einstein once said. "It's just that I stay with problems longer."

Where does persistence come from, and can it be acquired?

Psychologist Ellen Winner argues that persistence -- what she calls the "rage to master" -- "must have an inborn, biological component" (Von Károlyi & Winner, p. 379), and that exceptional performers are "intrinsically motivated to acquire skill" in the areas in which they are innately gifted because they find it easier to learn those skills (Winner, 1996, p. 274).

Anders Ericsson argues against this second notion. Having spent years studying what he calls "deliberate practice" -- the slow, methodical process of getting better -- he points out that there's nothing easy or fun about it. It is, he says, "associated with frequent failures and frustrations and is not the most inherently enjoyable or 'fun' activity available." His research shows that "aspiring individuals typically prefer [the harder, slower work] to playful interactions with friends."

So what makes some people spend so much energy on the harder, slower practice instead of spending less energy on easier, more thrilling, but less skill-building play activities? Are such people simply born with that work-hard impulse?

Maybe some are -- Matt Ridley's Nature via Nurture reviews some evidence of how genes help shape personality. But there's also some emerging evidence for persistence being something we can develop. Bronson's piece cites Robert Cloninger, at Washington University in St. Louis, who not only zeroed in on the persistence circuitry in the brain (Gusnard, Cloninger et al, 1993), but also trained mice and rats to develop persistence. "The key is intermittent reinforcement," explains Cloninger. "A person who grows up getting too frequent rewards will not have persistence, because they'll quit when the rewards disappear." In other words, yes, according to Cloninger, the animal mind can actually be trained to reward itself for slow and steady progress rather than the more thrilling instant gratification.

If we can marry this neurobiology with some psychology and real-world understanding -- such as Carol Dweck's work on motivating students to work harder -- we may actually get closer to a real recipe for greatness that could be useful to any parent, teacher or coach.