Correcting Creativity: The Struggle for Eminence

By the time he put the finishing touches on The Rite of Spring in November 1912 at the Châtelard Hotel in Clarens, Switzerland, Stravinsky had spent three years studying Russian pagan rituals and Lithuanian folk songs and crafting the dissonant sacre chord, in which an F-flat major triad is layered over an E-flat major triad with an added minor seventh. The rehearsal process wasn’t easy either. Stravinsky fired the German pianist, and the orchestra and performers had only a few opportunities to practice at the Théâtre des Champs-Élysées in Paris, where the Rite debuted in May 1913. But the Russian-born composer pulled it off, and his composition now stands as a 20th-century masterpiece.

Stravinsky is one of seven eminent creators of the 20th century profiled by Harvard professor Howard Gardner in his 1993 book Creating Minds. The others are Pablo Picasso, Sigmund Freud, T.S. Eliot, Martha Graham, Mahatma Gandhi and Albert Einstein. One can debate the list, but Gardner’s foremost conclusion is uncontroversial: creative breakthroughs in any domain require strenuous work and a willingness to challenge the establishment.

The psychology of creativity, both the empirical research and the popular literature for lay audiences, misses this. It reduces creativity to warm showers and blue rooms, forgetting that the life of the eminent creator is not soothing; it is a struggle, a grossly uneven wrestling match with the muses.

For Gardner, eminent creators are locked into a Faustian bargain, in which fulfilling their vision comes at the cost of an otherwise fulfilling personal life:

...the creators were so caught up in the pursuit of their work mission that they sacrificed all, especially the possibility of a rounded personal existence… unless this bargain has been compulsively adhered to, the talent may be compromised or even irretrievably lost. And, indeed, at times when the bargain is relaxed, there may well be negative consequences for the individual’s creative output.

And so it was for Nietzsche, who believed that creativity is not about solving puzzles, divergent thinking or making remote associations but destroying old systems of thought and breaking from the status quo. Doing so doesn’t require a relaxed state of mind or a few shots of alcohol but enough courage to break out of the herd mentality. As Nietzsche states in various ways throughout his writings, few possess the strength to accomplish this, and those who do are usually rejected at first only to resurface later in time as truly original thinkers.

This is the story of Gardner’s subjects. From Stravinsky’s dissonance to Eliot’s obscure prose to Einstein’s treatment of time and space, each new idea succeeded by standing in contrast to the standards that governed its domain. The process was not pleasant. As Gardner makes clear, his subjects triumphed only after experiencing prolonged isolation from their fields. They wandered into uncharted waters, working mostly on their own, while everyone else—the herd—remained either unable or unwilling to see norms as the problem.

In contrast, creativity in the 21st century has become a buzzword for pseudo-intellectuals, pseudo-entrepreneurs, and pseudo-artists who like to label themselves as “creative types” even though their level of commitment to their craft and their willingness to break from the herd is flimsy. TED and similar knowledge-hungry websites might be making it worse by attracting these individuals who, if it weren’t for the easy-to-digest science, wouldn’t care about their elusive creative genius. Or perhaps it is the Internet in general, where someone publishes an article about the “secrets to creativity” every hour.

What’s important is that for decades researchers like Gardner and Dean Keith Simonton have distinguished between “little c” and “big C” creativity, in which the latter more closely aligns with what Nietzsche had in mind. We should return to the uppercased version of the word. Generating a good idea isn’t a matter of reading a “top-ten ways to boost your creativity” article. Nor is it cherry-picking from the latest cognitive psychology research. As Nietzsche described in The Gay Science, it is like delivering and nurturing a child. “We must constantly give birth to our thoughts out of our pain and maternally endow them with all we have of blood, heart, fire, pleasure, passion, agony, conscience, fate, and disaster.”

Of course, it is difficult to describe “big C” creativity without sounding like a clichéd commencement speaker. For one, it is nearly impossible to convey the “embrace failure” message without coming across as banal. Indeed, failure is inevitable and important. But pointing out that “mistakes are simply the portals of discovery” is hindsight babble that loses touch with the reality that failure is horrible, even nauseating, and that most creative projects never see the light of day.

Second, 10,000 hours of deliberate practice is not something you “put in.” Think, for a moment, about what such a grueling practice regimen entails. It’s not just practicing for six hours every day for nearly five years; it’s a state of bondage in which creators are at the mercy of their domains, where moments of total exasperation, in which they teeter on the edge of defeat and fear disapproval, far outweigh moments of insight and joyful productivity.

This is the paradox of researching creativity and writing about it. On one hand we should strive to accurately capture the life of eminent creators and their breakthroughs as well as cognitive strategies that contribute to “little c” creativity. At the same time, it is difficult to accomplish this without drawing trite conclusions. For instance, though writing about the Rite provides good insights, I fear it paints an unrealistic picture of creativity in the same way Nora Ephron created an unrealistic account of love. As the late philosopher and art critic Denis Dutton once wrote of Stravinsky’s masterpiece, it’s the creative-type’s enduring tale: what once was so outrageous, so unintelligible, that it could cause a riot, came eventually, through knowledge and familiarity, to be accepted as a masterpiece. In other words, we cannot help but squeeze stories of eminent creation into crisp commoditized narratives with heart-warming proverbs.

So my concern is twofold. First, the cognitive science of creativity and the public’s obsession with it promote a fringe version of creativity, and second, we are reducing eminent creations and their creators to cliché stories. Both distort “big C” creativity, which is what matters for genuine innovation and original thinking.

Moving forward, let’s remember that creativity is a struggle and that even though studying creative geniuses helps us understand what Nietzsche held as the greatest expression of the human spirit, being creative is about paving your own path, one that leads you away from the herd so you can draw your own conclusions.

• Image of Stravinsky by Picasso, in public domain, via Wikimedia Commons, by Bibliothèque nationale de France.

The views expressed are those of the author(s) and are not necessarily those of Scientific American.

ABOUT THE AUTHOR(S)

Samuel McNerney

Sam McNerney graduated from the greatest school on Earth, Hamilton College, where he earned a bachelor’s in Philosophy. After reading too much Descartes and Nietzsche, he realized that his true passion is reading and writing about cognitive science. Now he is working as a science journalist writing about philosophy, psychology, and neuroscience. He has a column at CreativityPost.com and a blog at BigThink.com called “Moments of Genius.” He spends his free time listening to Lady Gaga, dreaming about writing bestsellers, and tweeting @SamMcNerney.

Scientific American is part of Springer Nature, which owns or has commercial relations with thousands of scientific publications (many of them can be found at www.springernature.com/us). Scientific American maintains a strict policy of editorial independence in reporting developments in science to our readers.