Opinion: Researching the Researchers

The biomedical research community is due for some self-reflection.

By David Rubenson and Paul Salvaterra | November 25, 2013

Biomedical researchers are grumbling a lot these days. The worries span funding levels at the National Institutes of Health (NIH), the peer review process, academic promotion policies, the effectiveness of conferences, waste caused by scientific error, regulatory burdens, and so on. However, the grumbling won’t amount to much unless there is a systematic way to formulate, analyze, implement, and monitor reforms to the systems and institutions that make conducting research possible. To do this, the community should develop a new academic tradition of analyzing the biomedical research enterprise. A 21st century ability to apply research data to medical advances will require a 21st century understanding of how to organize biomedical research.

The core impediment to the adoption of this approach is that biomedical research is rarely treated as a product of organizational structure, culture, and incentives. Many scientists see “curiosity” or other lofty ideals as the primary drivers of the research process. They view administration as simply the cost of doing business, failing to recognize that it actually influences (for good and bad) the goals and directions of research. The result is the absence of a tradition for measuring and analyzing organizational performance.

The NIH system for grant funding is a prime example of a process that has come in for strong criticism. Many argue that scientists tailor proposals to win grants rather than to describe the most innovative, boldest, or best approaches for solving society’s medical problems. Some argue that review committees are tainted by conflicts of interest, group-think, and inadequate preparation, among other things. But coming up with something better isn’t so simple.

It’s easy to grumble and equally easy to throw up one’s hands in frustration. Whether it’s the NIH granting process, the effectiveness of scientific communications, or academic promotion policies, the biomedical research enterprise is too large, complex, and tradition-bound to easily identify and implement superior metrics and incentives, particularly ones that can also keep pace with rapidly changing technology and scientific opportunities.

Still, there are constructive ideas. The San Francisco Declaration on Research Assessment is an example of a well-publicized and thoughtful document, which has gathered widespread support. But how do we actually implement that document’s call to “encourage a shift toward assessment based on the scientific content of an article rather than publication metrics,” or to “consider the value and impact of all research outputs (including datasets and software) in addition to research publications”? Not only do these recommendations require serious analysis before implementation, they will require continuous monitoring, reevaluation, and adjustment. Without sustained scholarship, thoughtful ideas have little chance to be woven into the fabric of our biomedical research system.

There have also been lost opportunities. The brainstorming sessions that identified the Huntington’s disease-associated gene in the 1980s, the different ways institutions structure internal requests for research funds, Stand Up 2 Cancer Dream Teams, the unique incentives in the Howard Hughes Medical Institute system, or recent efforts in data sharing all constitute organizational “experiments.” These should have been evaluated for effectiveness and the potential for broader application.

Change requires the development of a research culture for studying the research process itself. Only then can thoughtful suggestions be moved into well-defined and effective policy recommendations. This is hardly radical. There is a well-established academic tradition for doing this in health care delivery and education. The nation’s business schools concern themselves with financial, organizational, and cultural incentives in corporate and non-profit organizations. Government agencies employ numerous think tanks to evaluate long-term policies. There is, however, virtually no scholarly tradition for analyzing the biomedical research process. “One off” studies by the Institute of Medicine or the occasional ad hoc committees cannot substitute for a sustained program of research and analysis.

To encourage the establishment of this tradition, we propose the creation of a national biomedical research policy institute. It should be loosely connected to the NIH because it would need access to the agency’s data and policy makers. But it must be governed by an independent board and be free to conduct independent analysis, unencumbered by existing NIH practices and policies. Such an institute must have a broad multi-disciplinary expertise, combining biomedical researchers with specialists in economics, management sciences, social sciences, health care and even patient advocates. It should strive to develop an ethic of unbiased and transparent research and seek to become a central meeting place for discussing the future of biomedical research. Such an institute could develop the data to improve our understanding of how the system is performing and use that data to develop effective arguments about the value of biomedical research to society.

The stakes are enormous. Biomedical research represents a significant financial investment, and it carries the hopes and fears of the afflicted. Long-range health care costs will certainly be influenced by the policy and strategic choices made in biomedical research. The enterprise is too important, large, and complex to be governed casually and with little awareness of the factors shaping it.

David Rubenson is the associate director for administration and strategic planning at the Stanford Cancer Institute. Paul Salvaterra is a professor of neuroscience at the Beckman Research Institute of the City of Hope.

Comments

The self-reflection is very important, but the examination needs to start at a very deep core, the academic institution, and go all the way to the top at the NIH. This is one gigantic onion of supply and demand, both for people and money, grants and translational outcomes. It is so incredibly uncoordinated, with so many independent vested interests often at tremendous odds with one another, that it's a wonder it even works at the mediocre level it does.

Hopefully, such an institute would also be able to consider the influence of larger economic systems (capitalism, socialism, etc.) and particular governments on policy, the NIH, researcher mindset, and citizen expectation. This would be a deeper discussion than the U.S.'s constant pitting of Republicans versus Democrats!

But, for example, how are the fruits of the research of biomedical researchers ever going to be accessible to all people in a country without universal healthcare? As a researcher, do you like the idea that the majority of your fellow citizens might never get a chance to enjoy the benefit of what you may discover?

In many cases, it doesn't seem like such research is even based on the consumer, or "fellow citizens." Rather, it's projected that notoriety, even through inapplicable progress, is often the motivation behind such "breakthroughs."

It seems relatively easy to understand where the disconnect is in the equation. Though, as an outsider, it's easy to gloss over the small day-to-day factors that build up a tradition so ingrained, or to simply abide by it.

Everything from career choices by individuals, to abiding by the status quo (which as this article states, is a very large thing) to inadequacies in infrastructure leave us with a whole slew of problems that are only being addressed on the face level.

I'm afraid all we can do right now is swallow it. And as someone in the field, that's a scary thought.

Largely in agreement with NRA, and sadly these things are harder to recognize once you have a vested interest in them. Likewise, catching sight of these things doesn't necessarily lead to a change.

This plays directly into the weird relationship between researchers and the journalists covering their work. There's a huge gap between what people are reading and what's actually going on. Brings up some interesting points I read by a researcher/CEO named Kevin Xu in his column.

RE: Creating a National Biomedical Research Policy Institute (USBRPI)!?

As one of the US-trained and experienced (now retired) biomedical cancer research pioneers (1970s-80s), I fully appreciate and concur with the above proposal for creating a USBRPI, whose time, I think, has finally come in the second decade of the 21st century and beyond!

On further analysis, I think we should make it a USBRPAI -- a National Biomedical Research Policy & Accountability Institute!

Interesting article. Talking about gaps between reality and the ideal: came across this blog post, "We don't need no education" (http://njoyruminates.blogspot.in/), a disturbing commentary on emerging trends in research and publishing and the deplorable lack of vigilance (or complicity?) on the part of policymakers.