Increased investment in research has helped the Department of Social and Health Services in Washington state better understand the links between untreated substance use disorders and other human services outcomes, such as housing stability, homelessness, and avoidable emergency room costs. Since the 1990s, the department has built its capacity to gather data across a range of services to better analyze the impact of specific programs.

The department’s Research and Data Analysis (RDA) division now measures a range of outcomes by gathering and matching administrative data, such as information collected by government agencies to determine eligibility for programs, manage cases, and make payments. In the case of untreated substance use disorders, analysis of the matched data demonstrated that lack of access to appropriate services increased both avoidable public expenditures and negative social outcomes.

Recognizing the value of linking multiple sources of existing data to perform these analyses, RDA developed an integrated system that imports and matches client information from over 20 different data systems. This effort provides a comprehensive, cross-agency view of client experiences and service information, such as the type, length, and costs of services received, residential and employment history, involvement in the criminal justice system, and demographic information. RDA then uses the data to regularly analyze the state’s social and health services, including evaluations of program impact.
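The kind of cross-system linkage described above can be illustrated with a minimal sketch. This is not RDA's actual process or schema; the match key and field names below are invented for illustration, and a production system would use far more robust identity resolution.

```python
# Hedged sketch of deterministic record linkage across agency extracts.
# The match key (SSN plus date of birth) and all field names are
# illustrative assumptions, not RDA's actual design.

def match_key(record):
    """Build a deterministic link key from identifying fields."""
    return (record["ssn"].strip(), record["dob"])

def link_records(*extracts):
    """Merge per-agency extracts into one client-centric view.

    Each extract is a (source_name, records) pair; records sharing a
    match key are collected under a single client entry.
    """
    clients = {}
    for source_name, records in extracts:
        for rec in records:
            client = clients.setdefault(match_key(rec), {"sources": {}})
            client["sources"][source_name] = rec
    return clients

# Toy extracts from two hypothetical systems.
medicaid = [{"ssn": "123-45-6789", "dob": "1980-01-01", "cost": 5400}]
housing = [{"ssn": "123-45-6789", "dob": "1980-01-01", "status": "stable"}]

linked = link_records(("medicaid", medicaid), ("housing", housing))
```

After linking, each client entry carries a cross-agency view of services, which is the property the analyses described here depend on.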

For instance, RDA used the integrated system to help assess the impact on Medicaid costs of the state’s “Roads to Community Living” program, which helps those with long-term care needs transition from institutional to community-based care. The system allowed the division to create a comparison group that closely mirrored the program’s treatment group so analysts could better isolate the impact of this service, a process that would have been difficult without the extensive client data. The study found state and federal Medicaid savings of $21.5 million for the treatment group compared with nonparticipants over a two-year follow-up period.
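The comparison-group construction described above can be sketched as a simple nearest-neighbor match: for each program participant, pick the non-participant whose pre-program characteristics look most similar. The covariates and matching rule below are illustrative assumptions; the actual study would have used a more sophisticated quasi-experimental design.

```python
# Hedged sketch of 1:1 nearest-neighbor matching to build a comparison
# group. Covariates (age, prior costs) are invented for illustration.

def distance(a, b):
    """Euclidean distance over shared numeric covariates."""
    return sum((a[k] - b[k]) ** 2 for k in a) ** 0.5

def match_comparison_group(treated, candidates):
    """Greedily pair each treated person with the closest unused candidate."""
    pool = list(candidates)
    matches = []
    for person in treated:
        best = min(pool, key=lambda c: distance(person["covs"], c["covs"]))
        pool.remove(best)  # match without replacement
        matches.append((person, best))
    return matches

treated = [{"id": 1, "covs": {"age": 72, "prior_cost": 40.0}}]
candidates = [
    {"id": 10, "covs": {"age": 45, "prior_cost": 5.0}},
    {"id": 11, "covs": {"age": 70, "prior_cost": 38.0}},
]
pairs = match_comparison_group(treated, candidates)
```

The richer the linked administrative data, the more covariates are available for matching, which is why the integrated system made this comparison feasible.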

To maintain the integrated system, RDA works with each agency that agrees to share data to extract and match client records. For most systems, this matching and linking occurs monthly to keep the database up to date. Data-sharing agreements ensure that each office retains ownership of its data and establish requirements for data security, privacy, and protection of personal information. The agreements also clarify how the information can be used, such as for evaluations. RDA staff provide ongoing quality control to identify duplicate records and data anomalies, and to ensure information is accurately linked and organized.
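A quality-control pass of the sort described above might flag duplicate records and implausible values in each monthly extract. The checks, thresholds, and field names here are illustrative assumptions, not RDA's actual rules.

```python
# Hedged sketch of a monthly quality-control pass: flag duplicate client
# IDs and records with impossible values. Fields and thresholds are
# invented for illustration.
from collections import Counter

def quality_check(records):
    """Return duplicate client IDs and records with out-of-range values."""
    counts = Counter(r["client_id"] for r in records)
    duplicates = [cid for cid, n in counts.items() if n > 1]
    anomalies = [
        r for r in records
        if r["service_days"] < 0 or r["service_days"] > 31
    ]
    return duplicates, anomalies

extract = [
    {"client_id": "A1", "service_days": 12},
    {"client_id": "A1", "service_days": 12},  # duplicate row
    {"client_id": "B2", "service_days": -3},  # impossible duration
]
dupes, odd = quality_check(extract)
```

Flagged records would then be reviewed or corrected before the extract is linked into the integrated database.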

This system has helped to ensure:

Timely and cost-efficient evaluations. Access to a linked repository of administrative data saves RDA time and money, compared with gathering similar information from scratch for each new study. Such access also reduces the burden on program managers and participants who would have to provide data previously shared in other contexts. “We can complete a quasi-experimental evaluation, assuming there’s no new data to collect, in a relatively short time and at a fraction of the cost of [contracting out the] evaluation,” said David Mancuso, director of RDA.

Stronger comparison groups. A key challenge to performing comparison group studies is selection bias—the risk that program participants are systematically different from the comparison group in ways that affect results. Linked administrative data can provide extensive information about individuals that can help mitigate those differences. That enables researchers to match the participant and comparison groups more closely and results in a more rigorous evaluation.

The ability to measure multiple relevant outcomes. Linked client data allows researchers to assess numerous outcomes and their relationship to one another. The department’s interest in the effects of untreated substance use disorder on other human services outcomes could be measured only by linking administrative data from multiple departments: employment information, statewide justice data, Medicaid information, and other social service data.

RDA leaders attribute their success in developing and using these integrated data systems to:

Building teams of information technology professionals and analysts with complementary skill sets. IT staff perform complex data integration and transformation functions, while analysts with strong quantitative social science backgrounds, such as economists, sociologists, psychologists, and demographers, typically guide the analyses.

Being a good steward of partner agency trust. Organizations such as RDA must maintain a commitment to analytic integrity, be willing to work collaboratively with partner agencies to achieve delivery system improvements, and engage in timely review of sensitive results before public release.

Building relationships between analysts and agency fiscal, clinical, policy, and IT system subject-matter experts. Maintaining connections between research and IT staff helps to more effectively address changes to individual systems and build an understanding of what the data mean, especially since definitions vary across IT systems. Budget and policy changes also can alter what services are delivered and how they are reflected in IT systems, making it critical for all staff to understand these changes.

RDA’s integrated system provides policymakers with a more complete picture of how residents use Washington’s public services and helps researchers more easily assess whether these programs improve human services outcomes.

In 2014, the Pew-MacArthur Results First Initiative identified five key components of evidence-based policymaking: program assessment, budget development, implementation oversight, outcome monitoring, and targeted evaluation. Implementing one or more of these components can help states and counties use the Results First evidence-based policymaking framework in ways that yield meaningful changes for their communities.