New teacher evaluation system is all flaws: A veteran educator on the trouble with value-added data

On Thursday, the New York State Education Department and the state teachers union came to an agreement on revising the teacher evaluation process to include students’ standardized test scores.

Earlier in the week, on Tuesday, New York’s highest court ruled that New York City’s Department of Education could publish the ratings of 12,000 teachers, which are also based on standardized test scores.

Both developments are part of a push to use gains made by students on tests (so-called “value-added data”) to determine which teachers to promote, fire or simply keep on the job. This is in contrast to the more traditional, principal-led evaluations that critics have long charged are too subjective.

Those critics have won, but I doubt our schools will benefit. I doubt, also, that this new system will last.

As a veteran teacher, I cringe when I hear politicians say that current evaluations do not include student learning as a factor. When my principal observes my class, he will witness student learning in many forms. For example, he might see me call on a student who has been struggling.

Perhaps that student will not be able to answer the question perfectly, but he will be able to do it far better than he would have at the beginning of the period. My principal is a veteran and knows learning when he sees it. I trust his judgment.

The new evaluations hinge on a flawed notion of student progress. This will lead to their downfall.

One reasonable way to determine whether, say, a fifth-grader has progressed would be to administer, at the beginning of the year, a pretest identical to the test she will take at the end of the year, then compare the two results.

But this is not what happens. Instead, students take the fourth-grade test at the end of fourth grade and the fifth-grade test at the end of fifth grade. Since the fourth-grade test is easier, scores often go down on the fifth-grade test. To see if this decrease on the harder test still qualifies as “progress,” evaluators compare the fifth-grade scores of all the students in the state who got the same score on the fourth-grade test. From this, they attempt to calculate student gains.

If it seems confusing, well, it is. And this is an oversimplified explanation of how value-added data works.

Even some of the experts who created value-added measures admit they are not very reliable. Error rates of over 30% mean that an effective teacher could be deemed ineffective, and vice versa. That is why, as a safeguard, New York State law provides that value-added data cannot count for more than 40% of a teacher’s overall evaluation — the other 60% is still based on principal observations.

Forty percent is still way too high, but things just got worse. According to the state Education Department, under the new system, “Teachers rated ineffective on student performance based on objective assessments must be rated ineffective overall.”

In other words, if a teacher gets a low enough student performance score, he will get an ineffective evaluation regardless of how well he does on the other 60% of the evaluation.

In essence, then, the student performance score is not weighted at a maximum of 40%, as required by state law, but at 100% for certain teachers. This loophole makes New York, for all practical purposes, the only state in the country that ignores the myriad experts who say that evaluation should be based on multiple measures, not solely on test scores.

This is also why the last seven New York State Teachers of the Year have written a letter opposing these evaluations. These are the teachers who would, if the measures were accurate, have the most to gain (at least financially) from such a system. But they oppose it because no one should be judged by such flawed measures.

As disappointed as I am that inaccurate ratings are going to be published and that similar ratings will be used to unfairly fire or reward teachers, there is one fact that makes me optimistic: New Yorkers are a demanding bunch.

When value-added gets the opportunity to take center stage, all eyes will be on it, much as they are on every new pitching prospect touted by the Yankees.

I predict that soon enough, under the sort of intense scrutiny that comes with the implementation of a major data system like this, New Yorkers will have definitive proof of how flimsy a statistical tool this really is — and how much of a disservice the new rating system does to the men and women who educate our children.

Rubinstein is a two-time recipient of the Math For America master teacher fellowship.

