What I Learned from a Year Spent Studying How to Get Policymakers to Use Evidence

For the past year I was a senior research analyst at Northwestern University's Global Poverty Research Lab, working on a study of evidence-based policy. Specifically, our goal was to work on a question often on researchers' minds: how can I get my ideas acted upon?

To do this, I dug through a number of bodies of evidence on how science influences policy. One area I looked at is what is called "implementation science" in medicine, which studies how to get doctors, nurses, and hospital administrators to adopt evidence-based practice. Another was a series of papers by social scientist Carol Weiss and her students on how policymakers in government agencies claim to use evidence. There is also a small literature on how to implement evidence-based policy in public schools, and a little work on policymaker numeracy. I've included a bibliography below that should be helpful for anyone interested in this topic.

Most of my year was spent delving into attempts to scale up specific policies, so this literature review is not as extensive as it could be. Still, while there are no knock-down bold conclusions, the research on evidence-based policy does offer a few lessons that I think anyone trying to get others to act based on evidence could use. I think this has broad applicability to people working to help others effectively, such as:

—Anyone working to promote evidence-based policy
—Researchers and those working at research organizations who are trying to get others to listen to them or trying to figure out what research to do
—Managers in nonprofits looking to promote the use of evidence by employees
—Advocates promoting more rational behavior (e.g. giving to effective charities or considering others' interests)

Here is what I learned:

1) Happily, evidence does seem to affect policy, but in a diffuse and indirect way.
The aforementioned researcher Carol Weiss finds that large majorities (65%-89%) of policymakers report being influenced by research in their work, and roughly half report being strongly influenced (Weiss 1980; Weiss 1977). It's rare that policymakers pick up a study and implement an intervention directly. Instead, officials gradually work evidence into their worldviews as part of a gradual process of what Weiss calls "enlightenment" (Weiss 1995). Evidence also influences policy in more political but potentially still benign ways: by justifying existing policies, warning of problems, suggesting new policies, or making policymakers appear self-critical (Weiss 1995; Weiss 1979; Weiss 1977).

2) There are a few methods that seem to successfully promote evidence-based policy in the health care, education, and government settings where they have been tested. The top interventions are:

2a) Education: Workshops, courses, mentorship, and review processes change decision makers' behavior with regard to science in a few studies (Coburn et al. 2009; Matias 2017; Forman-Hoffman et al. 2017; Chinman et al. 2017; Hodder et al. 2017).

2b) Organizational structural changes: If an organization has evidence built into its structure, such as having a research division and hotline, encouraging and reviewing employees based on their engagement with research, and providing funding based on explicit evidence, this seems to improve the use of evidence in the organization (Coburn and Turner 2011; Coburn 2003; Coburn et al. 2009; Weiss 1980; Weiss 1995; Wilson et al. 2017; Salbach et al. 2017; Forman-Hoffman et al. 2017; Chinman et al. 2017; Hodder et al. 2017).

A few other methods for promoting research-backed policies seem promising based on somewhat less evidence:

A cheeky example of an awareness campaign. (Source: https://coyotegulch.blog/2017/04/04/march-for-science-to-defend-evidence-based-policy-csucollegian/)

3) Interestingly, a few methods that policymakers and researchers often promote for encouraging evidence-based practice do not have much support in the literature. The first is building collaboration between policymakers and researchers, and the second is creating more research in line with policymakers' needs. One of the highest-quality write-ups on evidence-based policy, Langer et al. (2016), finds that collaboration only works if it is deliberately structured to build policymakers' and researchers' skills. When it comes to making research more practical for policymakers, it seems that when policymakers and researchers work together to come up with research that is more relevant to policy, it has little impact. This may be because, as noted in point (1), research seems to influence policy in important but indirect ways, so making it more direct may not help much.

4) There is surprisingly and disappointingly little research on policymakers' cognition and judgment in general. The best research, familiar to the effective altruism community, comes from Philip Tetlock (1985; 1994; 2005; 2010; 2014; 2016) and Barbara Mellers (2015); it gives little information on how decision-makers respond to scientific evidence, but it suggests that they are not very accurate at making predictions in general. Other research indicates that extremists are particularly prone to overconfidence and oversimplification, and conservatives somewhat more prone to these errors than liberals (Ortoleva and Snowberg 2015; Blomberg and Harrington 2000; Kahan 2017; Tetlock 1984; Tetlock 2000). Otherwise, a little research suggests that policymakers are susceptible to the same cognitive biases that affect everyone else, particularly loss aversion, which may make them irrationally unwilling to end ineffective programs or start proven but novel ones (Levy 2003; McDermott 2004). On the whole, little psychological research studies how policymakers react to new information.

Overall, this literature offers some broad classes of strategies that have worked in some contexts and that can be refined and selected based on intuition and experience. At the very least, I think those promoting reason and evidence can take heart that research and science do seem to matter, even if it's hard to see.

Chinman, Matthew, et al. "Using Getting To Outcomes to facilitate the use of an evidence-based practice in VA homeless programs: a cluster-randomized trial of an implementation support strategy." Implementation Science 12.1 (2017): 34.

Choi, Bernard CK, et al. "Bridging the gap between science and policy: an international survey of scientists and policy makers in China and Canada." Implementation Science 11.1 (2016): 16.

Coburn, Cynthia E. "The role of nonsystem actors in the relationship between policy and practice: The case of reading instruction in California." Educational Evaluation and Policy Analysis 27.1 (2005): 23-52.

Coburn, Cynthia E., William R. Penuel, and Kimberly E. Geil. "Practice Partnerships: A Strategy for Leveraging Research for Educational Improvement in School Districts." William T. Grant Foundation (2013).

Coburn, Cynthia E., Judith Touré, and Mika Yamashita. "Evidence, interpretation, and persuasion: Instructional decision making at the district central office." Teachers College Record 111.4 (2009): 1115-1161.

Fishman, Barry J., et al. "Design-based implementation research: An emerging model for transforming the relationship of research and practice." National Society for the Study of Education 112.2 (2013): 136-156.

Langer, Laurenz, Janice Tripney, and David Gough. "The Science of Using Science: Researching the Use of Research Evidence in Decision-Making." EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London (2016).

Weiss, Carol H. "The haphazard connection: social science and public policy." International Journal of Educational Research 23.2 (1995): 137-150.

Weiss, Carol H. "The many meanings of research utilization." Public Administration Review 39.5 (1979): 426-431.

Wilson, Paul M., et al. "Does access to a demand-led evidence briefing service improve uptake and use of research evidence by health service commissioners? A controlled before and after study." Implementation Science 12.1 (2017): 20.

Comments

Interesting. Would you mind giving some concrete examples of what it looks like for well-done education interventions to affect policy? For education I don't have a great sense of whether the proposed mechanism is "directly inform decision-makers", "inform their advisors", "make something common knowledge within a field", or something else. Also, for structural changes, do you have any sense of how likely Goodhart-type effects are, or what can be done to avoid them?

The mechanism for education is more direct training of policymakers, e.g. through adult ed courses, seminars, etc. Making something common knowledge within a field is something I would have classified as raising awareness rather than education. For Goodhart-type effects, my sense is we are so far from any focus on evidence-based policy that if people overoptimized for it, that would probably be a better equilibrium.


I am a PhD student in economics at Stanford University and a National Science Foundation Graduate Research Fellow. I am interested in global priorities research—research on the most effective ways to do good with limited resources—and a Global Priorities Fellow with the Forethought Foundation. I am an advocate and a follower of the effective altruism movement (www.effective-altruism.com). I was previously a Senior Research Analyst at the Global Poverty Research Lab at Northwestern University's Buffett Institute, where I studied the implementation of evidence-based policies in education and criminal justice. I am also the chair of the Animal Advocacy Research Fund Oversight Committee, which distributes roughly $300,000 annually to fund research on effective advocacy for animals.
Follow me on Twitter: https://twitter.com/zdgroff.