"A child's learning is the function more of the characteristics of his classmates than those of the teacher." James Coleman, 1972

Thursday, November 14, 2013

The Potential Value of Teacher Transfer Data

Maybe I’m naïve, but I found it hopeful when economist Dan Goldhaber told the conservative American Enterprise Institute that value-added models work at the elementary level, at least in comparison with other ways of evaluating teachers. He cited evidence, however, that value-added might not work quite so well at the high school level, and he concluded that less emphasis would have been placed on the value-added of individual teachers if research had focused on high schools rather than elementary schools. I am far from convinced that value-added evaluations make sense even in the early years, but I have to believe that most reformers will see the folly of test-driven evaluations in middle and high school, and that they will back off from the single silliest policy idea of this age of reform.

It is in this mindset, which is probably Pollyanna-ish, that I look into the details of Mathematica’s “Transfer Incentives for High-Performing Teachers,” by Steven Glazerman et al. It asks whether a large financial incentive (up to $20,000 paid out over two years), known as the Teacher Transfer Initiative (TTI), could encourage high-performing teachers to transfer to low-performing schools, whether they would succeed in raising the achievement of their new students, and whether the top teachers would remain after the payments end.

The study included the obligatory cost-benefit analysis of the TTI in comparison to reducing class size; it could save $13,000 or so per grade per year. As explained before, the TTI had to search through 29 teachers with the highest value-added before finding one who would take the gamble. Given the difficulty of finding 81 top teachers who would transfer, the potential for scaling up the TTI is so minuscule that it would take years (or decades) to recoup the $11.6 million the Mathematica study cost.

The study includes some illustrative details, however. Surveys showed transfers encountered more misbehavior in the high-poverty schools. The sample’s middle schools faced significantly bigger challenges in terms of poverty and special education populations. And, apparently, the teachers who agreed to transfer to the more challenging schools were relatively less motivated by money, raising the possibility that they were relatively more committed to turning around poor schools.

Moreover, all ten districts had forty or more elementary schools. Six were county-wide districts that included urban and non-urban schools. How could the archetypal metropolitan districts, where suburban flight caused extreme concentrations of generational poverty and trauma, hope to find enough teachers who would leave their more affluent districts and place their fate in the hands of under-the-gun urban districts? (Since eight of the ten districts were located in Right to Work states, where unions are weaker, transferring teachers would be likely to relearn that “no good deed goes unpunished.”)

Above all, there was great variety in outcomes among the districts and among the grades. Apparently, the study further confirms the conventional wisdom that changes in elementary school instruction can have far greater effects. I read the data as confirming the common-sense conclusion that teacher effects are more likely to be washed out by the mayhem of high-poverty middle schools.

Even the TNTP, I would hope, would see the Teacher Transfer Initiative as a wake-up call. They should stop seeking the cheap and easy quick fix of deputizing teachers as the agents to overcome the legacies of poverty. We should invest in the socio-emotional supports necessary for creating school cultures in which good teachers in high-poverty schools can raise student performance the way that comparable teachers in low-poverty schools do.

It is also time to reconsider the value-added evaluations that are likely to produce an exodus of talent from poor schools, especially challenging secondary schools.

The real potential of TTI data could be realized if a new study compares value-added outcomes. Did the value-added of high-performing teachers who remained in their old high-poverty schools go up after incentives were offered? How much?

More importantly, what was the value-added of TTI teachers in their old high-performing schools and in their new ones? How many of those teachers, who previously had high value-added but who did not receive a bonus, would also be in danger of termination because the conditions in their new school made it impossible to meet value-added targets?

Given the big differences in outcomes among districts and grades, what does the value-added database say about the disparities between teachers who may not be damaged much by value-added evaluations and those who are more likely, through no fault of their own, to have their careers destroyed because they chose to commit to the tough schools? I’m not a mathematician, but I’ve been to a county fair or two. It looks to me like the data implies that some schools, but not all, need to set the expectation that the last good teacher driven out of the schools where it is harder to raise test scores should agree to turn the lights out when they leave.