Examining NYC DOE’s Only Egg Basket

by Gary Rubinstein

When the leaders of the largest school district in the country decide to put all of their proverbial eggs in one basket, that basket had better be strong. In the case of New York City, this basket is the June 2010 ‘research’ report by MDRC entitled ‘Transforming the High School Experience: How New York City’s New Small Schools Are Boosting Student Achievement and Graduation Rates’ and the January 2012 follow-up report ‘Sustained Positive Effects on Graduation Rates Produced by New York City’s Small Public High Schools of Choice.’

This is the paper that is always cited by Bloomberg, Walcott, and Suransky when they proclaim that the ends justify the means when it comes to shutting down schools. In his recent New York Daily News editorial, ‘Close bad schools, save their students,’ Walcott wrote:

A study by the independent education research group MDRC confirmed how well our new schools are working. Among other things, the study found that they “markedly improved graduation rates for a large population of low-income, disadvantaged students of color.”

When these reports were released, many of the flaws in their conclusions were analyzed. But since the DOE had no other evidence that its reforms were working, it continued to cite this study at every opportunity. Since the other excellent efforts to demonstrate the flaws in these reports have not caught on enough, I’ve decided to reinforce them by analyzing the papers myself, using the most recent data about the schools involved in the study.

The study attempts to measure the effect of shutting down 23 large high schools and opening 216 small high schools. As the big high schools were closed, their students were scattered around the district. The new small schools got new crops of ninth graders, so it seems difficult to compare the new schools to the old schools since they have different kids. For this study, the researchers came up with an interesting approach. Since students apply to the different schools, and many of the schools require a lottery to see who gets in, they set out to compare the achievement and graduation rates of the students who entered the lottery and were admitted to 105 of the small schools to those of the students who entered the lottery but ‘lost’ and went to regular schools.

The big finding was that the graduation rate for the students who won the lottery was 6.8% higher than that for students who had entered the lottery but didn’t get into one of those 105 schools.

As an isolated statistic, this sounds moderately successful. Certainly for the 6.8% who graduated and might not have otherwise, it is significant. As these small schools serve about 400 students each, they have a combined enrollment of about 40,000, so 6.8% is over 2,700 students. But as I examined these two papers carefully, I determined that there are many additional factors that call the success of this program into question. In this post, I’ll highlight the most significant ones.
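The arithmetic behind that estimate can be checked in a couple of lines. Note that both inputs are the post’s own approximations (105 schools, ‘about 400 students each’), so the totals are rough:

```python
# Rough sanity check of the enrollment and graduation figures above.
# Both inputs are approximations from the post, not exact DOE counts.
schools = 105
students_per_school = 400  # "about 400 students each"

total_enrollment = schools * students_per_school
extra_graduates = total_enrollment * 0.068  # the 6.8% effect size

print(total_enrollment)        # 42000 (the post rounds to 40,000)
print(round(extra_graduates))  # 2856, i.e. "over 2,700 students"
```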

1. Who wrote the paper?

The paper was written by MDRC, which is “a nonprofit, nonpartisan research organization” according to the Jan 2012 update. It was funded by Gates, which is interesting, but not surprising. Gates funds a lot of research, and little of it contradicts the corporate reform theories.

What I learned, though, and what I haven’t seen reported elsewhere, is that while the June 2010 paper was written by three authors (Howard S. Bloom, Saskia Levy Thompson, and Rebecca Unterman), only two of them wrote the Jan 2012 update. Missing from the update is Saskia Levy Thompson. Why the omission? Well, she couldn’t work on it because two months after the first paper, in August 2010, she was hired as a top DOE executive making $174,410. And, no, I’m not implying that she created skewed research to land a job, but it is still something ‘interesting’ considering that this paper remains the only piece of ‘proof’ that closing down schools is an effective reform strategy.

2. Do the 105 schools serve the ‘same kids’ as the schools they replaced?

The schools that had been shut down, according to the report, had graduation rates around 45%. The new schools had a 68.7% graduation rate. If it were the ‘same kids,’ this would be quite an accomplishment. Well, they don’t try to claim that it is the same kids, which is why they boast not a 20% increase but just a 6.8% one. This is because the ‘control group,’ the students who lost the lottery and had to go to a regular school, had a graduation rate of 61.9%. In other words, the kids who entered the lottery were ‘better’ than the kids who didn’t.

This is revealed completely in table 2.3 on page 31 of the June 2010 paper.

The most dramatic line is the one about special education. While 14% of NYC 9th graders are Special Education students, the percentage among students who entered the lottery was actually 15.5%. But somehow only 6.7% of the students at the 105 small schools were Special Education. How can this be? Statistically, it is nearly impossible that only 6.7% would win the lottery if the pool had 15.5%. The reason is not that so few happened to win the lottery; rather, most of the Special Education students who ‘won’ the lottery were not able to attend those schools, since the schools, being small, could not offer the accommodations those students were entitled to. This statistic alone should invalidate any conclusions drawn in the study. Of course, when you have far fewer Special Education students, you also have fewer severe behavior issues, which tend to take a lot of time and energy to address.
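To put a number on ‘nearly impossible’: under a fair lottery, the special-education share of the winners should cluster tightly around the 15.5% share of the applicant pool. A quick normal-approximation sketch shows how far 6.7% sits from chance. The 10,000-winner cohort size here is a hypothetical round number for illustration, not a figure from the report:

```python
import math

# How unlikely is a 6.7% special-ed share among lottery winners drawn
# from a pool that is 15.5% special-ed? Normal approximation to the
# binomial distribution.
p_pool = 0.155      # special-ed share of lottery entrants (from the report)
p_winners = 0.067   # special-ed share at the 105 small schools
n = 10_000          # assumed number of lottery winners (hypothetical)

standard_error = math.sqrt(p_pool * (1 - p_pool) / n)
z = (p_winners - p_pool) / standard_error
print(f"z = {z:.1f}")  # roughly -24: dozens of standard errors below chance
```

Any reasonable cohort size in the thousands gives a z-score so extreme that a fair lottery is ruled out, which supports the point that the gap came from special-education students being turned away after winning.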

In several other categories we see that the 8th grade lottery winners were ‘better’ than the lottery losers.

The net result of all this manipulation of the entering students could surely account for the 6.8% increase in graduation rate.

3. What is the ‘expected’ graduation rate increase based on peer effects?

The fact that the graduation rate of the students who went to these 105 schools was 6.8% higher than the graduation rate of the students who lost the lottery and went to regular schools can easily be explained by peer effects. If you separate out the more motivated students, they will do a little better than if you mix those students in with less motivated students. This is something that everyone already knows. It is not, though, something that can be the basis of a policy change. If they really wanted to take this experiment to an extreme, they would separate out all the more motivated students and have schools just for them. The other schools would get much worse, since they would not have enough motivated students to set examples for the others. I would have expected the graduation rate to increase by more than 6.8%, so I don’t see this experiment as much of a success.

4. What about some of the other results, not often quoted?

On page 53 of the June 2010 report, they have this table comparing different levels of graduation rate and also achievement results on certain tests.

Notice that the control group (students who lost the lottery) actually had a higher percent of Advanced Regents diplomas and also had a higher percent of students getting over a 75 on the Math A Regents. Looking at this table, it is safe to say that the results of this experiment are, at best, mixed.

5. What level of achievement have these small schools actually accomplished?

Looking over the 2010-2011 Comprehensive Information Reports (CIR) for these schools, I was struck by how poor their achievement was. In a school that is producing many ‘college ready’ students, we should see a good number of students taking some of the more advanced Regents. These would include Chemistry, Physics, and higher math. There are three different math Regents: Integrated Algebra, Geometry, and Algebra 2 / Trigonometry. Advanced 8th graders often take the Integrated Algebra Regents, even in a low-performing middle school. ‘Average’ 9th graders, and 10th graders who are behind, could take that test too. I took a random school from the list, Validus Preparatory Academy, to see what sorts of Regents they took. Only 42% of 190 students passed Integrated Algebra. This is a test that, I know from grading it, requires getting only about 30% of the points correct to be scaled up to a passing 65. Only 45 students took Geometry, of which only 16% passed. Only 5 students took Algebra 2 / Trigonometry, of which just 1 student passed. One student took Chemistry, though there was not a score for that student, and zero students took Physics. The combined SAT average for this school was 1062 out of 2400. This is only a little better than you get for writing your name on the paper.

As Validus Preparatory Academy was chosen simply because it was the last school on the list alphabetically, I decided to also look at the ‘best’ school, according to the 2010-2011 city progress report. The ‘It Takes A Village’ school scored at the 99.4th percentile on the city progress report. In a school ranked that high, you should expect a lot of kids taking Algebra 2, since that is really an 11th grade course, while 12th graders would be taking precalculus or even calculus. We should also see many students passing Chemistry and Physics.

It Takes A Village had 23 students take Algebra 2 (19 passed), 31 for Chemistry (19 passed), and 40 for Physics (25 passed).

The fourth ‘best’ school, The Urban Assembly School for Media Studies, had 9 students take Algebra 2 (5 passed), no Chemistry, and no Physics.

Some other schools had higher percentages than these, but, in general, the academic achievement and rigor at these schools was very thin. For the five best schools, the average SAT score was 1135, below the city average of 1222. For AP exams, they averaged 17% passing vs. 30% for the city. And since the demographics of these schools gave them a ‘peer index’ of 2.27, which is above the city average of 2.25, they can’t even use their demographics as an ‘excuse.’

6. Have any of these 105 schools since been shut down?

As of the most recent school closure announcements, seven of the 105 schools have been closed. That 7% is not much different from the percentage of the city’s roughly 1,100 schools that have been shut down.

The seven schools are International Arts Business School, Gateway School for Environmental Research and Tech, Manhattan Theater Lab High School, Global Enterprise High School, Performance Conservatory High School, Urban Assembly Academy for History and Citizenship, and The School for Community Research and Learning.

Conclusion

Shutting down schools and reopening new ones is likely to create an illusory bump in some statistics, in this case the 6.8% improvement in graduation rate. A brand new school whose students are all first-time freshmen will be free of the distraction of repeater freshmen. In that way, I’m not surprised that there were minor increases. But these increases are just a result of this dynamic, not of getting a crop of better teachers. In time, these schools will likely begin to suffer the same problems that brought down the schools they replaced.

This is not a scalable solution, and it seems to be doing much more harm than good.

As always, I really encourage professional journalists to dig deeper into this than I am able to do in my limited spare time. There is certainly a Pulitzer Prize in journalism waiting for the reporter who takes down the corrupt corporate reform movement.

9 Responses

pfh64

Mr. Rubinstein, I wish you the best of luck in your challenge to the various reporters, but I would not hold my breath waiting. Even if there were a reporter doing this, the Bloomberg lackeys on the editorial board or in the publisher’s office would squash it faster than a kid runs out the door on the last day of school.

Thank you, Gary, very nice report. I see with my own eyes what you write about, as I am sent week to week to a different Manhattan high school. Many have the chance to select their students, and it really determines what kind of learning environment exists in the school. I also hope journalists really look into how Bloomberg destroys schools and reveal his real motives. He knows what he is doing is a facade.

Do you happen to have the data for EBC High School for Public Service handy? Its former principal and former Associate Director of the Office of New Schools, Victor Capellan, is running Central Falls HS and, not surprisingly, promoting the same profile: in this case, declining test scores and rising graduation rates.

Small schools put way more pressure on teachers to pass kids than large schools do. Also, in the Times article that reported on the study, they said the small schools did better on the English Regents, but not the math. As far as I know, you can’t scrub a math exam, so this piece of data is a huge red flag.