Scientific Discovery, Innovation, Creativity and How We’re Killing It

The black hole visualization above might be more than just a depiction of something in the outer regions of space; it could be an apt metaphor for what is taking place in our research institutes and universities as far as young scientists are concerned. As a young scientist and innovator(?) (I’ll come back to that term later), this is a deeply personal issue, so keep that in mind as you read on. I am also an educator, responsible for training a new group of scientists, health practitioners, and social innovators in public health — our future discovery agents — and it is in this latter role that I am most upset and passionate about the issue befalling scientific research: a systematic strangling of opportunities for young people.

Jonah Lehrer recently explored this topic in his column in the Wall Street Journal and on his science blog (“The Frontal Cortex”), and I’m very glad he did. Lehrer points to the widening gap between those who have funding and those who do not (the rich getting richer) and how this trend is hurting researchers at the very beginning of their careers — the time they are most likely to make breakthrough discoveries.

In 1980, the largest share of grants from the National Institutes of Health (NIH) went to scientists in their late 30s. By 2006 the curve had been shifted sharply to the right, with the highest proportion of grants going to scientists in their late 40s. This shift came largely at the expense of America’s youngest scientists. In 1980, researchers between the ages of 31 and 33 received nearly 10% of all grants; by 2006 they accounted for approximately 1%. And the trend shows no signs of abating: In 2007, the most recent year available, there were more grants to 70-year-old researchers than there were to researchers under the age of 30.

My personal experience in Canada is that this pattern is not much different. Grant opportunities are stagnating: the pie remains roughly the same size while the number of people wanting to eat from it grows. And to compound the problem, more researchers are staying longer in the field. As Lehrer notes, if you’re 70 or older you’re more likely to get an NIH grant than someone under the age of 30. What does that say to young scholars?

In order to be more competitive in grant competitions and on the job market, students are taking on postdocs and spending longer in training — paying more into their education and deferring potential employment income — in the hope that the investment will pay off. Recent National Science Foundation data show this trend:

New doctorate recipients are increasingly likely to take postdocs, and that is evident in the 2006 SDR data: among all SEH doctorate recipients, 38% had held a postdoc at some point in their careers (table 1). More recent cohorts were more likely than earlier ones to have held a postdoc: 45% of those earning the doctorate within the last 5 years compared with 31% of those who earned the doctorate more than 25 years ago.

If the hoped-for payoff is a stable research position and the ability to start up their own research group, then they are likely to be disappointed, as Lehrer notes:

The age distribution of NIH grants has significant implications for American science. It has become much harder for young scientists to establish their own labs. According to the latest survey from the National Science Foundation, only 26% of scientists hold a tenure-track academic position within six years of receiving their Ph.D.

Jason Hoyt discusses this further, illustrates the trends in grant funding, and asks whether, given this trend, we have too many PhDs in the first place. Good question. But perhaps a better question is whether we’re killing off innovation, discovery, and creativity by stifling the ability of people in their most creative years to do the work at all. And are we discouraging the next generation of young scientific leaders by making life for early-career researchers so difficult that talented, creative people self-select out of the applicant pool for graduate school and faculty posts?

The issue of tenure mentioned above is not a moot point. To those outside the academy, the concept of tenure surely seems anachronistic in uncertain economic times, and I certainly appreciate that there will be little sympathy for non-tenured scientists and professors among the public. However, it is worth pointing out that the science and innovation done in research has a time horizon unlike that of other jobs. A grant takes months to prepare and months to adjudicate (a recent grant I applied for had a deadline of October 15th, and the decision will not be rendered until April). The research itself then takes anywhere from one to five years (if you’re lucky enough to have funding last more than a year or two), and publishing the findings takes at least another year. All of this assumes you get funded on the first attempt and your manuscripts are accepted the first time around; neither is a reasonable assumption. Yet grants and publications are the primary measures used to assess success. And if people think researchers get paid too much, consider what 14 years of post-secondary education and training gets you, according to Hoyt:

With a PhD, a postdoc can expect to start, at most, US $42K a year in academia and $52K in industry.

I made $36.5K as a postdoc, and as an assistant professor at a major research university, I make less on an hourly basis than all but my part-time data entry clerk (this includes my graduate student research assistants), considering the number of hours I have to work each week to get everything accomplished. My reason for doing this work isn’t money, but at some point, for most young researchers carrying a mountain of student loan debt from a decade and a half of accumulated education and opportunity costs, it has to be.

The tenure gap also points to another shadow in the system: the idea that scientists raise their own salary. Scientists and professors are spending an ever-increasing amount of time writing grants to pay themselves so that there is someone to do the research. In the United States, mechanisms exist to put salaries into grants; in Canada, however, this is relatively rare. The assumption is that universities and research institutes pay salaries while funders pay for research, which simply doesn’t hold true. Imagine having most of your creative talent spending a third of their time applying for funding to support them in… writing more grants to support them in writing more grants. When does the innovation happen? And when are scientists supposed to be doing all of that knowledge translation work they’re increasingly expected to do?

Another problem for young researchers (tied to the tenure or long-term contract issue) is that the value of an innovation is only really seen in hindsight. Any profession based on innovation therefore requires methods of promotion, retention, and acknowledgment that fit this horizon, even if imperfectly — particularly in the basic sciences, where the value of an innovation may not be obvious for many years. How can you reasonably judge someone’s contribution over a short period of time?

Here we have a system that espouses the language of innovation without any mechanism to support it — or worse, with an entrenched pattern of behaviour designed to prohibit it. How much sense does that make?