MIRA supports investigators’ overall research programs through a single, unified grant rather than individual project grants. The goals include increasing investigators’ funding stability, ability to take on ambitious challenges and approach problems creatively, and flexibility to follow important new research directions as opportunities arise.

Awards will provide all of an investigator's laboratory support from NIGMS for research related to the Institute's mission. [Editor's note: Awards will be for 5 years, similar to the current average length of an NIGMS R01 award to new investigators.]

As stated in earlier posts, this first MIRA competition is an experiment and is intentionally limited to a small group of eligible applicants. If this pilot is successful, we plan to issue future funding opportunity announcements covering additional groups of investigators.

The goal of this FOA is to test the MIRA concept under well-controlled conditions with a small group of investigators. We're initially targeting established investigators who have received two or more R01-equivalent awards or a single award of $400,000 or more in direct costs from NIGMS in Fiscal Year 2013 or 2014, and who have at least one grant expected to end in Fiscal Year 2016 or 2017. We think that this approach will help these investigators transition smoothly from their current grants to MIRA support. In the future, we plan to issue MIRA FOAs for additional groups of investigators, and, if the pilot is successful, we will open the program to any investigator working on research questions related to the mission of NIGMS.

If you’re eligible for this FOA and on the fence about applying, consider that MIRA awards:

Will be for 5 years instead of the current NIGMS average of 4 years,

Will continue support for other research currently funded by NIGMS without requiring a separate renewal application, and

Will provide flexibility to pursue new ideas and opportunities as they arise.

The doubling of the NIH budget between 1998 and 2003 affected nearly every part of the biomedical research enterprise. The strategies we use to support research, the manner in which scientists conduct research, the ways in which researchers are evaluated and rewarded, and the organization of research institutions were all influenced by the large, sustained increases in funding during the doubling period.

Although the budget doubling ended more than a decade ago, the biomedical research enterprise has not re-equilibrated to function optimally under the current circumstances. As others have pointed out (e.g., Ioannidis, 2011; Vale, 2012; Bourne, 2013; Alberts et al., 2014), the old models for supporting, evaluating, rewarding and organizing research are not well suited to today's realities. Talented and productive investigators at all levels are struggling to keep their labs open (see Figure 1 below, Figure 3 in my previous post on factors affecting success rates and Figure 3 in Sally Rockey's 2012 post on application numbers). Trainees are apprehensive about pursuing careers in research (Polka and Krukenberg, 2014). Study sections are discouraged by the fact that most of the excellent applications they review won't be funded and by the difficulty of trying to prioritize among them. And the nation's academic institutions and funding agencies struggle to find new financial models to continue to support research and graduate education. If we do not retool the system to become more efficient and sustainable, we will be doing a disservice to the country by depriving it of scientific advances that would have led to improvements in health and prosperity.

Re-optimizing the biomedical research enterprise will require significant changes in every part of the system. For example, despite prescient, early warnings from Bruce Alberts (1985) about the dangers of confusing the number of grants and the size of one's research group with success, large labs and big budgets have come to be viewed by many researchers and institutions as key indicators of scientific achievement. However, when basic research labs grow too big, a number of inefficiencies result. Much of the problem is one of bandwidth: One person can effectively supervise, mentor and train a limited number of people. Furthermore, the larger a lab gets, the more time the principal investigator must devote to writing grants and performing administrative tasks, further reducing the time available for actually doing science.

Although certain kinds of research projects—particularly those with an applied outcome, such as clinical trials—can require large teams, a 2010 analysis by NIGMS and a number of subsequent studies of other funding systems (Fortin and Currie, 2013; Gallo et al., 2014) have shown that, on average, large budgets do not give us the best returns on our investments in basic science. In addition, because it is impossible to know in advance where the next breakthroughs will arise, having a broad and diverse research portfolio should maximize the number of important discoveries that emerge from the science we support (Lauer, 2014).

These and other lines of evidence indicate that funding smaller, more efficient research groups will increase the net impact of fundamental biomedical research: valuable scientific output per taxpayer dollar invested. But to achieve this increase, we must all be willing to share the responsibility and focus on efficiency as much as we have always focused on efficacy. In the current zero-sum funding environment, the tradeoffs are stark: If one investigator gets a third R01, it means that another productive scientist loses his only grant or a promising new investigator can’t get her lab off the ground. Which outcome should we choose?

My main motivation for writing this post is to ask the biomedical research community to think carefully about these issues. Researchers should ask: Can I do my work more efficiently? What size does my lab need to be? How much funding do I really need? How do I define success? What can I do to help the research enterprise thrive?

Academic institutions should ask: How should we evaluate, reward and support researchers? What changes can we make to enhance the efficiency and sustainability of the research enterprise?

And journals, professional societies and private funding organizations should examine the roles they can play in helping to rewire the unproductive incentive systems that encourage researchers to focus on getting more funding than they actually need.

We at NIGMS are working hard to find ways to address the challenges currently facing fundamental biomedical research. As just one example, our MIRA program aims to create a more efficient, stable, flexible and productive research funding mechanism. If it is successful, the program could become the Institute’s primary means of funding individual investigators and could help transform how we support fundamental biomedical research. But reshaping the system will require everyone involved to share the responsibility. We owe it to the next generation of researchers and to the American public.

Figure 1. The number of NIGMS principal investigators (PIs) without NIH R01 funding has increased over time. All NIGMS PIs are shown by the purple Xs (left axis). NIGMS PIs who were funded in each fiscal year are represented by the orange circles (left axis). PIs who had no NIH funding in a given fiscal year but had funding from NIGMS within the previous 8 years and were still actively applying for funding within the previous 4 years are shown by the green triangles (left axis); these unfunded PIs have made up an increasingly large percentage of all NIGMS PIs over the past decade (blue squares; right axis). Definitions: “PI” includes both contact PIs and PIs on multi-PI awards. This analysis includes only R01, R37 and R29 (“R01 equivalent”) grants and PIs. Other kinds of NIH grant support are not counted. An “NIGMS PI” is defined as a current or former NIGMS R01 PI who was either funded by NIGMS in the fiscal year shown or who was not NIH-funded in the fiscal year shown but was funded by NIGMS within the previous 8 years and applied for NIGMS funding within the previous 4 years. The latter criterion indicates that these PIs were still seeking funding for a substantial period of time after termination of their last NIH grant. Note that PIs who had lost NIGMS support but had active R01 support from another NIH institute or center are not counted as “NIGMS PIs” because they were still funded in that fiscal year. Also not counted as “NIGMS PIs” are inactive PIs, defined as PIs who were funded by NIGMS in the previous 8 years but who did not apply for NIGMS funding in the previous 4 years. Data analysis was performed by Lisa Dunbar and Jim Deatherage.

As described in the blog post announcing the RFI, the Maximizing Investigators’ Research Award (MIRA) program would provide a single award in support of all of the projects in an investigator’s lab that are relevant to the NIGMS mission. A MIRA would be longer and larger than the current average NIGMS R01 award.

We received more than 290 responses through the official RFI comment site. We heard from individual investigators as well as several scientific organizations. Most of the responses were positive, and both established and early stage investigators indicated that they were very likely to apply.

The respondents identified the most valuable aspects of the proposed program as:

Increased flexibility to follow new research directions as opportunities and ideas arise,

Savings of time and effort currently spent on writing and reviewing applications, and

Enhanced stability of research support.

However, some responses expressed concerns, which we are taking into consideration. Although the program is intended to optimize the distribution of NIGMS resources, some respondents worried that it could lead to funds becoming concentrated in fewer labs at the most elite institutions. This concern was in part a reflection of the phased implementation plan, which would focus initially on investigators with more than one NIGMS grant. Respondents urged NIGMS to broaden the eligibility criteria as quickly as possible following the initial pilot phase. Other concerns related to peer review and program evaluation.

For more about the RFI results, including a breakdown of responses by question, watch my presentation, which begins at 2:18 on the archived videocast.

The Advisory Council discussed the MIRA proposal and then approved plans to proceed with developing the program. We plan to issue a funding opportunity announcement in early 2015, with the first awards being made in Fiscal Year 2016. We intend to evaluate the MIRA program and, if it is successful, to broaden it.