Jersey Jazzman: The Real Story of New York's Schools Can't Be Told In Test Scores

Test score releases always generate plenty of heat no matter where you go. Politicians and advocates and stakeholders all furiously send out their press releases right after the scores are posted, confident the "latest data proves" their particular agenda is "getting results," even though "we have a lot of work to do," and [INSERT CLICHE HERE].

But New York, in particular, seems to love debating the meaning of its scores. Maybe it's because there is a big education research community in place in and around the city. Maybe it's because parent advocates are particularly well organized, teachers unions are particularly vocal, and wealthy interests are particularly engaged in education issues (golly, I wonder why...). Maybe it's because the debates over mayoral control and charter school expansion are actually battles in larger political wars, especially the interminable power struggles between Albany and NYC.

Whatever the reason, this latest score release by NYSED is once again being presented with little to no context, and little to no understanding of how tests work. Granted, I'm the first to admit I'm barely a few feet above sea level on the climb to the right of Mount Stupid:

But I know enough about standardized tests to know that it's pointless to compare proficiency rates from year to year on completely different tests. But don't take my word for it; ask the guy who literally wrote the book on testing:

Meanwhile, the researchers we spoke with said any comparisons between this year’s test and last year’s are inherently flawed. The state acknowledged Friday that the tests are not directly comparable because this year’s tests were shorter, for instance, and untimed.

“The state said in their release that the tests are not comparable,” said Dan Koretz, a professor of education at Harvard. “That’s sort of the end of the story. They’re not comparable.” [emphasis mine]