Two Roads to Reform: Comparing the Research on Vouchers and Class-Size Reduction

Findings on Cleveland and Milwaukee Voucher Plans

Over the past few years, other research and analyses of voucher programs have failed to buttress the case being made by voucher supporters. Last fall, the U.S. General Accounting Office reviewed state evaluations and found little or no difference between the academic achievement of voucher students and public school students in Cleveland and Milwaukee, the two major urban school systems with publicly funded voucher programs.10

Indiana University researcher Kim Metcalf, who has spent several years studying the Cleveland program, released a report last year comparing groups of voucher students and public school students from the time they entered first grade through the end of second grade. While voucher students had higher total test scores entering first grade, this advantage quickly began to erode. The report revealed that over this two-year period, public school students demonstrated greater average learning gains in language, reading, and math than the voucher students.11

Voucher supporters have cited isolated data from last year's Indiana University report, claiming that Metcalf's research proves that vouchers boost academic performance. Yet Metcalf himself wrote that the analysis of student test results from voucher schools and public schools "presented no clear or consistent pattern tha[t] can be attributable to [voucher] program participation." Echoing this view, the Ohio Department of Education summed up the study in distinctly lukewarm terms, noting that voucher students "perform at a similar academic level as public school students."12

The Milwaukee voucher program has received only one comprehensive state evaluation, conducted in 1995 by a University of Wisconsin-Madison team led by Professor John Witte.13 Reviewing the voucher program's first five years, Witte found no appreciable academic gains in reading and math from vouchers.14 He also observed that the attrition rates for voucher students were high, especially in the first two years.15 Using Witte's data, a research team led by Peterson employed different assumptions and statistical techniques, claiming that there was a statistically significant gain for voucher students in the third and fourth years of the program.16 But this finding was disputed by many in the research community, who argued that by the third year the control and experimental groups were not comparable. The annual attrition rate (about 30 percent), consisting primarily of students doing poorly in the voucher program, ensured that those students who remained were an academically superior subset, not a random sample.17 Other aspects of the methodology used by the Peterson team to re-analyze the Milwaukee data have been criticized, including the team's reliance, in some cases, on tiny samples: in one instance, a sample of just 26 students.18 The Peterson team's re-analysis was described by Witte as a "confusing, tortured effort," and even the pro-voucher Wall Street Journal wrote that Peterson was "loose with his claims."19

Since the 1995 state evaluation, voucher supporters have shown no enthusiasm for new efforts to examine the program's impact on student achievement. In fact, after the lackluster results of this evaluation were released, Wisconsin legislators eliminated provisions calling for future academic evaluations of the program.20 Since then, the Legislature has provided only for a single audit by the state's Legislative Audit Bureau in the year 2000. This audit observed: "Some hopes for the program-most notably, that it would increase participating students' academic achievement-cannot be documented, largely because uniform testing is not required in participating schools."21

Some voucher supporters have cited research by Princeton University's Cecilia Rouse that reported math gains for Milwaukee voucher students.22 Yet the findings Rouse cited were only for the subgroup of students who remained in the voucher program over a four-year period. As noted earlier, student attrition matters here: Witte found that "voucher students who left the [Milwaukee] program for various reasons had lower test scores than those who continued to participate [emphasis in original]."23 A full and accurate assessment of voucher schools must therefore consider not simply those students who use a voucher and remain in the voucher school, but all students who entered the voucher program. In simple terms, students who do well in voucher schools are more likely to stay; those doing poorly are more likely to leave or drop out. Additionally, Rouse found that "the [voucher] effects on the reading scores are as often negative as positive and are nearly always statistically indistinguishable from zero."24