Altering teacher evaluations

Many districts look to tweak new system that emphasizes testing to measure student progress

By Kristen V. Brown

Published 7:29 pm, Tuesday, December 24, 2013

Photo: SKIP DICKSTEIN

TIMES UNION STAFF PHOTO BY SKIP DICKSTEIN - Lonnie Palmer, interim school superintendent of the Troy Schools spoke to the Times Union this morning in his offices in Troy, New York June 27, 2007 about his plans for retirement.

In hindsight, Berne-Knox-Westerlo Superintendent Lonnie Palmer will tell you, the way his district rolled out its new teacher evaluation system last year didn't make that much sense.

In most classes, students received tests at the beginning of the year and then at the end to measure academic growth. But in some cases, the chosen tests were not the right ones to accurately indicate a student's progress. In others, using a test at all seemed counterintuitive.

"At the end of the year, you end up with a fair number of teachers labeled ineffective or developing," he said. "Some of those teachers can legitimately say that's not correct."

Palmer's district is one of many across the state now rethinking their teacher evaluation plans since implementing them last year, including nixing some of the testing associated with the plans.

In the past year, more than 20 percent of the state's 700 school districts have received state approval for changes to their plan, according to the State Education Department. This school year alone, 42 districts have gotten the green light for changes, and over 100 more are now making changes ahead of a March 1 deadline.

The state's new teacher evaluation protocols, dubbed the Annual Professional Performance Review, require that 40 percent of a teacher's annual ratings be based on student performance, with the other 60 percent rooted in more subjective measures such as principal observations.

Half the portion based on student growth depends on measures set at the local level — and that 20 percent is what school districts like Palmer's are interested in.

In Berne-Knox-Westerlo last year, the district either purchased a third-party test or developed its own test for each subject and grade level, testing students at the beginning and end of the year. But in retrospect, Palmer said, it didn't make sense to pre-test students in subjects, such as algebra, that they had never taken before.

"Am I really going to want to know what the kids don't know?" he said. "That's going to make you look bad."

Why not instead look at how they performed in other math subjects at the end of the previous year to compare growth?

As part of the state's rules, districts may use historical data — for example, how well a student did on their previous year's final math exam — to set student benchmarks at the beginning of the year in place of pre-assessments.

Districts may also use state test scores to establish benchmarks as long as the scores are analyzed in a different way than the state's measurements, such as using the schoolwide performance of students to evaluate certain teachers.

State tests may be used to replace end-of-the-year testing as well.

The new evaluations were put in place as part of the state's bid to win a $700 million grant under the federal government's Race to the Top program. In a rush to implement the new evaluation system in time to comply with grant requirements, many districts turned to additional testing rather than alternative measures of evaluation.

"I think it was compelling in terms of time," admitted Green Island Union Free School District Superintendent Michael Mugits. His district bought third-party tests for several grade levels and subjects, though in some cases, such as eighth-grade science, the school is already using state exams in calculating teacher evaluation scores.

"As part of the rapid rollout, we wanted to meet targets for Race to the Top. That meant rolling out things sooner than we were ready," Palmer said.

He said that with the benefit of hindsight, his district is also changing the targets it selected to judge how well students are doing.

"Our APPR targets were not well selected," he said. "Some teachers had kids that passed the state test and did poor on the APPR. Others had the reverse."

Many of the changes to evaluation plans thus far approved by the state include reducing or eliminating pre-assessments and relying more heavily on schoolwide measures of assessment rather than double testing in grades and subjects where there are state exams.

"The fact that school districts and local teachers unions are going back and tweaking their plans is not a surprise," said Carl Korn, a spokesman for New York State United Teachers, the state's largest teachers union. "It's evidence that the last school year should have been a pilot to begin with."

NYSUT argued in court for some local control over student progress in the evaluations, leading to a compromise with Gov. Andrew Cuomo, who wanted state control over how student growth was measured.

Carrie Remis, director of the education advocacy group the Parent Power Project, said that not enough emphasis has been placed on the control districts have over how much testing students must take.

"What we need to do is educate the public about where state mandates end and where local decisions begin," she said.

Back in Berne-Knox-Westerlo, Palmer is still deep in conversation with his teachers about how to best measure progress in each individual class. He says they've already made some good decisions: In physics, for one, they'll use a student's previous year's chemistry and math performance to establish a benchmark for their academic performance in physics.

In the past, the evaluations have been a "fear factor" for his teachers. Next year, he hopes, they will instead be a valuable diagnostic tool.

"It turned out we picked badly last year," he said. "Now we're having some great conversations about what are our local goals."