Institute of Education Sciences

Are you celebrating National STEM Day this November 8th by learning more about how to improve student achievement in Science, Technology, Engineering, and Mathematics (STEM)? If so, the Institute of Education Sciences’ (IES’s) What Works Clearinghouse has great resources for educators who want information about the latest evidence-based practices in supporting learners of all ages.

Don’t worry, we won’t leave science out! Encouraging Girls in Math and Science includes five evidence-based recommendations that both classroom teachers and other school personnel can use to encourage girls to choose career paths in math- and science-related fields. A handy 20-point checklist provides suggestions for how those recommendations can be incorporated into daily practice, such as “[teaching] students that working hard to learn new knowledge leads to improved performance” and “[connecting] mathematics and science activities to careers in ways that do not reinforce existing gender stereotypes of those careers.”

Looking for specific curricula or programs for encouraging success in STEM? If so, check out the What Works Clearinghouse’s Intervention Reports in Math and Science. Intervention reports are summaries of findings from high-quality research on a given educational program, practice, or policy. There are currently more than 200 intervention reports that include at least one math- or science-related outcome. (And nearly 600 in total!)

Maybe you just want to see the research we’ve reviewed? You can! The What Works Clearinghouse’s Reviews of Individual Studies Database includes nearly 11,000 citations across a wide range of topics, including STEM. Type in your preferred search term and you’re off—from algebra to zoology, we’ve got you covered!

We hope you’ll visit us on November 8th and learn more about evidence-based practices in STEM education. And with practice guides, intervention reports, and individual studies spanning topics from Early Childhood to Postsecondary education and everything in between, we hope you’ll come back whenever you are looking for high-quality research to answer the question, “What works in education?”

by Herbert Turner, Ph.D., President and Principal Scientist, ANALYTICA, Inc.

The WWC Evidence Standards (hereafter, the Standards) provide a detailed description of the criteria used by the WWC to review studies. The Standards were first developed in 2002 by leading methodological researchers using initial concepts from the Study Design and Implementation Assessment Device (DIAD), an instrument for assessing the correspondence between the methodological characteristics and implementation of social science research and using this research to draw inferences about causal relationships (Boruch, 1997; Valentine and Cooper, 2008). During the past 16 years, the Standards have gone through four iterations of improvement, to keep pace with advances in methodological practice, and have been through rigorous peer review. The most recent of these is now codified in the WWC Standards Handbook 4.0 (hereafter, the Handbook).

Across the different versions of the Handbook, the methodological characteristics of an internally valid study, one designed to support causal inferences about the effect of an intervention on an outcome, have stood the test of time. These characteristics can be summarized as follows: A strong design starts with how the study groups are formed. It continues with use of reliable and valid measures of outcomes, has low attrition if a randomized controlled trial (RCT), shows baseline equivalence (in the analysis sample) if a quasi-experimental design (QED), and has no confounds.

These elements are the critical components of any strong research design – and are the cornerstones of all versions of the WWC’s standards. That fact, along with the transparent description of their logical underpinning, is what motivated me to use Standards 4.0 (for Group Designs) as the organizing framework for understanding study validity in a graduate-level Program Evaluation II course I taught at Boston College’s Lynch School of Education.

In spring 2017, nine Master’s and four doctoral students participated in this semester-long course. The primary goal was to teach students how to organize their thinking and logically derive internal validity criteria using Standards 4.0—augmented with additional readings from the methodological literature. Students used the Standards (along with the supplemental readings) to design, implement, analyze, and report impact evaluations to determine what interventions work, harm, or have no discernible effect (Mosteller and Boruch, 2002). The Standards Handbook 4.0, along with online course modules, was an excellent resource to augment the lectures and provide Lynch School students with hands-on learning.

At the end of the course, students were offered the choice to complete the WWC Certification Exam for Group Design or take the instructor-developed final exam. All thirteen students chose to complete the WWC Certification Exam. Approximately half of the students became certified. Many emailed me personally to express their appreciation for (1) the opportunity to learn a systematic approach to organizing their thinking about assessing the validity of causal inference using data generated by RCTs and QEDs, and (2) the chance to develop design skills that can be used in other graduate courses and beyond. The WWC Evidence Standards and related online resources are a valuable, accessible, and free resource that has been rigorously vetted for close to two decades. The Standards have few equals as a resource to help students think systematically, logically, and clearly about designing (and evaluating) a valid research study to make causal inferences about what interventions work in education and related fields.

In December 2016, the What Works Clearinghouse made a version of its online training publicly available through the WWC Website. This enabled everyone to access the Version 3.0 Group Design Standards reviewer training to learn about the standards and methods that the WWC uses. While this was a great step to increase access to WWC resources, users still had to go through the 1½-day, in-person training to become a WWC certified reviewer.

To continue our efforts to promote access and transparency and make our resources available to everyone, the WWC has now moved all of its group design training online. Now everyone will have access to the same training and certification tests. This certification is available free of charge and is open to all users. It is our hope that this effort will increase the number of certified reviewers and help increase general awareness about the WWC.

Why did the WWC make these resources publicly available? As part of IES’s effort to increase access to high-quality education research, we wanted to make it easier for researchers to use our standards. Opening up training opportunities and offering the training online was a way to achieve this goal while using limited taxpayer resources most efficiently.

The online training consists of nine modules. These videos feature an experienced WWC instructor and use the same materials that we used in our in-person courses, but adapted to Version 4.0 of the Group Design Standards. After completing the modules, users will have the opportunity to download a certificate of completion, take the online certification test, or go through the full certification exam.

Becoming a fully certified reviewer requires taking a multiple-choice online certification test and then using the new Online SRG application to conduct a full review with the same tools that the WWC team uses. The WWC team will then grade your exam to make sure you fully understand how to apply the Standards before certifying you to review for the Clearinghouse.

Not interested in becoming a certified reviewer? Online training still has several benefits. Educators can embed our videos in their course websites and use our training materials in their curricula. Researchers can use our Online SRG tool with their publications to determine a preliminary rating and understand what factors could cause their study to get the highest rating. They could also use the tool when conducting a systematic evidence review.

Have ideas for new resources we could make available? Email your ideas and suggestions to Contact.WWC@ed.gov!

For the What Works Clearinghouse (WWC), standards and procedures are at the foundation of the WWC’s work to provide scientific evidence for what works in education. They guide how studies are selected for review, what elements of an effectiveness study are examined, and how systematic reviews are conducted. The WWC’s standards and procedures are designed to be rigorous and reflective of best practices in research and statistics, while also being aspirational to help point the field of education effectiveness research toward an ever-higher quality of study design and analysis.

To keep pace with new advances in methodological research and provide necessary clarifications for both education researchers and decision makers, the WWC regularly updates its procedures and standards and shares them with the field. We recently released Version 4.0 of the Procedures and Standards Handbooks, which describes the five steps of the WWC’s systematic review process.

For this newest version, we have divided information into two separate documents (see graphic below). The Procedures Handbook describes how the WWC decides which studies to review and how it reports on study findings. The Standards Handbook describes how the WWC rates the evidence from studies.

The new Standards Handbook includes several improvements, including updated and overhauled standards for cluster-level assignment of students; a new approach for reviewing studies that have some missing baseline or outcome data; and revised standards for regression discontinuity designs. The new Procedures Handbook includes a revised discussion of how the WWC defines a study. All of the changes are summarized on the WWC website (PDF).

Making the Revisions

These updates were developed in a careful, collaborative manner that included experts in the field, external peer review, and input from the public.

Staff from the Institute of Education Sciences oversaw the process with the WWC’s Statistical, Technical, and Analysis Team (STAT), a panel of highly experienced researchers who revise and develop the WWC standards. In addition, the WWC sought and received input from experts on specific research topics, including regression discontinuity designs, cluster-level assignment, missing data, and complier average causal effects. Based on this information, drafts of the standards and procedures handbooks were developed.

Version 4.0 of the Handbooks was released on October 26. This update focused on a few key areas of the standards, and updated and clarified some procedures. However, the WWC strives for continuous improvement, and as the field of education research continues to evolve and improve, we expect that new techniques and new tools will be incorporated into future versions of the Handbooks.

Your thoughts, ideas, and suggestions are welcome and can be submitted through the WWC help desk.

Almost a decade ago, the What Works Clearinghouse (WWC) released Dropout Prevention, one of its first practice guides, which offered six recommendations for keeping students in school, based on the findings from high-quality research and best practices in the field. Since its release in August 2008, the practice guide has been downloaded thousands of times by practitioners across the country. In fact, the guide was downloaded more than 1,500 times in 2016 alone.

However, over the past decade, the research and knowledge base has grown in the area of dropout prevention, which is why the WWC decided to update this guide to reflect the latest evidence and information about keeping students in school and on track toward graduation.

The updated guide differs from the original in several ways. First, it reflects improvements in practices related to monitoring at-risk students, including advances in using early warning indicators to identify students at risk of dropping out. Second, it covers an additional nine years of research that were not a part of the previous guide. In fact, 15 of the 25 studies used to support the recommendations in this updated guide were published after the first guide was released. In addition, studies from the previous guide were reviewed again against current, more rigorous WWC evidence standards.

Preventing Dropout in Secondary Schools offers four evidence-based recommendations that can be used by schools and districts:

Monitor the progress of all students, and proactively intervene when students show early signs of attendance, behavior, or academic problems;

Provide intensive, individualized support to students who have fallen off track and face significant challenges to success;

Engage students by offering curricula and programs that connect schoolwork with college and career success and that improve students’ capacity to manage challenges in and out of school; and

For schools with many at-risk students, create small, personalized communities to facilitate monitoring and support.

Each of these recommendations includes specific strategies for implementation and examples of how this work is being done around the country (see one such example in the image to the right).

Like all of our practice guides, the recommendations were developed with a panel of educators, academics, and experts who brought a wealth of knowledge and experience to the process. Two of the panelists on this updated guide were also involved in the development of the first dropout prevention guide—Russell Rumberger, from the University of California, Santa Barbara, and Mark Dynarski, of Pemberton Research LLC. The other panelists for the new guide were Howard (Sandy) Addis, of the National Dropout Prevention Center and Network; Elaine Allensworth, from the University of Chicago; Robert Balfanz, from The Johns Hopkins University; and Debra Duardo, superintendent of the Los Angeles County Office of Education.