Wednesday, June 20, 2007

The Massachusetts Board of Higher Education recently released preliminary findings from a new study linking K-12 and post-secondary student data. Peter Schworm of the Boston Globe has highlighted the apparent correlation between scores on the Massachusetts Comprehensive Assessment System (MCAS) and indicators of college success, specifically credits earned and GPA: students rated Advanced on the MCAS in high school earned more credits with a higher GPA than those rated Proficient, who in turn performed better in college than those rated Needs Improvement.

Some said the findings were predictable and showed only that brighter, harder-working students were more likely to succeed at college, not that the tests were improving education.

While these findings should by no means be surprising, neither were they guaranteed: if the MCAS were not a good measure of students' academic capabilities, there would be little correlation between MCAS score and college performance. Because we expect the MCAS to be a good measure of student achievement, the correlation does not surprise us; nevertheless, the finding (assuming its statistical significance, which the preliminary report did not present) is a testament, in some degree, to the success of the MCAS as an achievement metric.
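To make the shape of the reported comparison concrete, here is a toy sketch of the kind of summary the study describes: mean college GPA grouped by MCAS performance level. The numbers are entirely hypothetical, invented for illustration; they are not data from the Board's report.

```python
from statistics import mean

# Hypothetical (MCAS level, college GPA) pairs -- illustrative only,
# not real study data.
records = [
    ("Advanced", 3.6), ("Advanced", 3.4), ("Advanced", 3.8),
    ("Proficient", 3.1), ("Proficient", 2.9), ("Proficient", 3.2),
    ("Needs Improvement", 2.4), ("Needs Improvement", 2.7),
]

def mean_gpa_by_level(pairs):
    """Group GPAs by MCAS level and return each level's mean GPA."""
    by_level = {}
    for level, gpa in pairs:
        by_level.setdefault(level, []).append(gpa)
    return {level: mean(gpas) for level, gpas in by_level.items()}

summary = mean_gpa_by_level(records)

# The pattern the Globe article reports:
# Advanced > Proficient > Needs Improvement.
assert summary["Advanced"] > summary["Proficient"] > summary["Needs Improvement"]
```

Of course, such a summary only shows the ordering of group means; whether the differences are statistically significant would require the tests the preliminary report omitted.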

Unfortunately, the second half of the quotation above demonstrates a fundamental misunderstanding of the function of assessments as metrics. A test, as a measurement--a snapshot of ability in time--simply cannot improve education. Rather, improving education depends wholly on what one does with test results.

Consider an analogy from medicine. When a new patient enters the hospital, the admitting doctor might ask for a blood sample to run a few tests. Very few people would think the doctor was doing anything useful if he simply continued to perform blood tests, hoping the results would improve. Instead, the patient would expect the doctor to interpret the test results and prescribe some form of treatment. Then, after the treatment had been applied, it would be reasonable to run a follow-up test to verify that the treatment had the desired effect.

Just as a doctor interprets blood test results to assess a patient's health before prescribing a treatment, the power of academic assessments to improve education lies in schools' and teachers' using test results to inform instructional decisions and to target interventions where students need them most. A test of arithmetic won't help Johnny learn mathematics, but it can reveal to his teacher that Johnny is solid in addition but weak in multiplication. His teacher then knows that Johnny needs more practice with multiplication and can skip the extra addition lessons. Because tests have this potential to enable more effective instruction, states ought to increase the timeliness and detail of the reports produced from their standardized assessments, maximizing the benefit of the assessment to students and their teachers.
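The logic of turning a test result into an instructional signal can be sketched in a few lines. The skill names, scores, and the 0.7 mastery threshold below are all illustrative assumptions, not anything drawn from the MCAS or its reports.

```python
def skills_needing_practice(scores, threshold=0.7):
    """Return, sorted, the skills whose fraction-correct score falls
    below the mastery threshold -- the skills to target in instruction."""
    return sorted(skill for skill, score in scores.items() if score < threshold)

# Johnny's hypothetical item-level results, aggregated by skill.
johnny = {"addition": 0.95, "subtraction": 0.85, "multiplication": 0.40}

# The teacher can skip extra addition lessons and target multiplication.
print(skills_needing_practice(johnny))  # ['multiplication']
```

The point of the sketch is that the diagnostic value lives in the reporting, not the test itself: the more finely and quickly results are broken out by skill, the more directly a teacher can act on them.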

Since a high-quality assessment measures how well students have mastered the material set forth in the established curriculum, it seems perfectly reasonable to expect students to pass such an assessment before they are granted a certificate stating they have met the curricular requirements. Oddly, not everyone agrees:

"I don't think anyone is surprised that students who do better on the MCAS exam do better in college," said Representative Carl M. Sciortino Jr., a Medford Democrat who has filed legislation to stop denying students diplomas based solely on MCAS scores. "That means nothing in terms of producing better-prepared graduates overall."

To return to the medical analogy, no decent doctor would give a patient a clean bill of health until blood tests revealed acceptable results. Similarly, why would our schools grant a diploma to a student who, at the end of twelfth grade, has not demonstrated the ability to perform, for example, mathematics at an eighth-grade level? Doing so would only devalue the high school diploma and do students a disservice by sending them into post-graduate life unprepared.

Now, some might argue that the MCAS does not function as an accurate metric of students' mastery of the knowledge and skills in the state curricular frameworks. If that is indeed the case, however, the appropriate response is not to abandon the MCAS but to reform it. If we value the quality of our educational system, we cannot fail to measure its effectiveness in producing the desired learning outcomes. Likewise, if we value our high school diplomas, students must be required to demonstrate their academic achievement on a standardized, valid, and reliable indicator of their learning.


About Me

I'm a grad student in Educational Statistics at the University of Illinois, Urbana-Champaign. My overall career aim is to help policy makers unlock information about educational programs to improve the quality of, and access to, education for the poor.