Leveraging Data and Evidence to Drive Decision Making at USAID

A Haitian health care worker makes 10 to 12 home visits each day, using a tablet to track critical data. [USAID]

Over the past five years, USAID has made significant progress in using evaluations and open data to effectively drive budget, policy, and management decisions.

Last month, our progress in the areas of evaluation and open data was recognized by both non-governmental and governmental stakeholders: both the Federal Invest in What Works Index, released by the bipartisan nonprofit organization Results for America, and the Office of Management and Budget’s Project Open Data Dashboard rated USAID highly.

Using Evaluations for Evidence-Based Decision Making

In Results for America’s annual Federal Invest in What Works Index -- which describes how key U.S. government agencies and departments use data and evidence to drive budget, policy, and management decisions -- USAID ranked second among rated U.S. government agencies. The agency performed particularly well in criteria related to use of data, resources dedicated to evaluation, and innovation.

Since the release of USAID’s Evaluation Policy in 2011, the agency has made an ambitious commitment to invest in evaluation practices that value independent judgment, high-quality methods, and evidence-based findings to determine what is and is not working in USAID programs. These evaluations help explain why a program is succeeding or failing, and can provide evidence and recommendations for how best to improve performance.

As a result of this new policy, the number of independent evaluations has increased from an annual average of about 130 to an annual average of about 230. In addition, independent studies have found that the quality and use of USAID evaluations have also improved.

One independent study on use of evaluations at USAID found that 93 percent of evaluations have been used in some capacity, most frequently in project design and implementation as well as strategy and policy formulation.

At the country level, 59 percent of approved Country Development Cooperation Strategies referenced findings from USAID evaluations, and 71 percent of respondents reported that evaluations had been used to design and/or modify a USAID project or activity.

These evaluations have proven invaluable. For example, after USAID shared its evaluation findings with the Government of Ethiopia, the government made HIV testing for highly vulnerable children a priority and revised its National Guidelines for Comprehensive HIV Prevention in 2014.

In Mozambique, findings from a USAID impact evaluation of an education program led to the Government of Mozambique’s request to expand the program from 180 schools to an additional 538 schools -- improving literacy for 109,021 more students.

Making Data Open and Accessible

We are also working hard to meet federal open data requirements, which contribute to the goal of ensuring that government is transparent, accountable, participatory, and collaborative. We believe it is important to make information resources accessible, discoverable, and usable by the public.

USAID’s open data policy provides a framework for systematically collecting agency-funded data in a central repository, documenting the data to make it easy to locate and use, and making the data available to the general public, while ensuring rigorous protections for privacy and security.

Under this policy last year, the agency began regularly releasing data about USAID-funded programs to the public for the first time in the organization’s history. Thanks to these efforts, OMB recognized USAID’s open data policy as a “model of best practices.”

The CIO Council, the principal interagency forum on federal agency practices for IT management, has also featured USAID’s forward-leaning open data practices in a case study on innovation. It cites USAID’s progress in including “data submission requirements into its awards,” which has created a steady stream of data to USAID, “which in turn is released to the public.”

We recognize that open data is a powerful tool for collaboration. For example, through a partnership with the National Geospatial Intelligence Agency, USAID’s GeoCenter takes advantage of open data in the form of high-resolution satellite imagery to improve the effectiveness and efficiency of USAID’s development programs.

The satellite imagery serves as a basis for creating new geospatial data in unmapped parts of the world, enhancing USAID’s ability to improve health outcomes, strengthen food security programming and monitor land use change.

Moving Forward

The agency has made significant progress in establishing a strong framework for open data and evaluation practices, but our work is not yet done.

USAID will continue to build on its evaluation practices, strengthening the capacity of staff and partners to better integrate evaluative thinking throughout the planning and management of development programs, and expanding tools and partnerships for evaluation.

Additionally, we will continue to refine our transparency policies and practices, emphasizing a responsible approach to data management that balances our commitment to openness with our commitment to mitigating risk in the vulnerable communities we serve.

We will continue to invest in a more robust Development Data Library, designed to provide the general public with timely access to high-quality, USAID-funded data. We remain committed to ensuring that our transparency efforts also include feedback loops, such as the input we are currently seeking on obtaining informed consent in an era of increased data transparency.

Ultimately, we will continue to promote an organizational culture at USAID that emphasizes learning and adapting and that encourages staff to seek out evidence and data to inform decision-making.

About the authors: Negar Akhavi is Acting Director of the Office of Learning, Evaluation and Research in USAID’s Bureau for Policy, Planning and Learning. Brandon Pustejovsky is USAID’s Chief Data Officer.

For more information:

To find more examples of how evaluations have contributed to better development outcomes, read the Evaluation Policy’s five-year report.