Editorial: Making the MCAS work

Friday, Sep 26, 2008


A decade or more ago, the annual release of the MCAS test results prompted hand-wringing and outrage. Newspapers rushed to rank the results, as if the blizzard of numbers could, by some simple formula, be converted into a single list, with the "best" schools on top and the "worst" on the bottom. Homeowners and real estate brokers fretted about the test scores' impact on home values. Politicians, parents and educators complained about the tyranny of standardized testing.

That was then. There was no such outpouring of attention when the latest round of district MCAS scores was released this week, and that's a good thing. Tests aren't supposed to determine real estate values. They are supposed to give teachers, administrators, students and parents a tool to determine whether students are learning the minimal skills they need to advance in school and succeed once they graduate.

It's the individual scores that matter most, and even then, MCAS scores mostly flag a student who needs more attention. Properly used, aggregate scores can tell something about the quality of a school or even the performance of a teacher. But the tests only identify problems; they don't fix them.

The MCAS tests are a low standard by some measures. The high-stakes tests - meaning you have to pass them to graduate - are given in the 10th grade and measure 10th grade knowledge and skills. State college professors have noticed no improvement in their freshmen's readiness for college-level work since the MCAS tests were initiated.

But they are a high standard when compared to the tests other states use to keep score under the federal No Child Left Behind Act. The state's NCLB goal is for all Massachusetts children to score proficient on the MCAS tests by the year 2014, which is a little like the fictional Lake Wobegon, where all the children are above average.

NCLB requires that all schools, and all subsets of students, be measured to determine "Adequate Yearly Progress" toward that goal. By that reckoning, half the state's public schools are now coming up short. That doesn't mean the schools are failures, and it certainly doesn't reflect on every student or teacher. It can mean, however, that some small part of the curriculum isn't being taught effectively to some group of students. Educators need to look at the details, not the overall scores, to fix the problem.

Most schools, teachers and students learned to live with the MCAS years ago, but a small number of politicians, academics and activists are still caught up in a stale debate over the value of standards and testing. The MCAS tests are no more a perfect measure of a student's worth than any other test, whether it's the SAT or the spelling test given every Friday in the third grade. Schools and teachers have always used "alternative assessment" tools to measure the whole child. Isn't that what a report card is?

The MCAS tests shouldn't be the center of education policy, and we don't believe they are. Educators, teachers, students and parents should take seriously what the numbers say about where they can improve. Those who care about education policy should stop arguing about the MCAS and concentrate on where the schools go from here.