Tuesday, January 26, 2016

ADDING: I've mentioned this before: hyperlocals are becoming some of the best sources for local reporting out there. RedBankGreen.com was the source for the quotes below, but I also read some earlier reports to get some context. They do a really nice job.

This is from a dataset I made (with Bruce's guidance) of charter schools and their affiliated charter management organizations (CMOs). We hear a lot these days about KIPP or Uncommon or Success Academies; however, we hear much less about Academica or Charter Schools USA or White Hat. What do we really know about these schools?

Further: what about that "OTHER" category? Who runs these schools? How do they perform? How do they affect their local districts?

One goal I have over the next year at this blog is to spend some more time looking at these lesser-known charter schools -- the ones who, in reality, are the backbone of the charter sector. Let me start here in New Jersey with a story that doesn't involve a high profile charter leader like Eva Moskowitz or a high profile CMO like KIPP; however, it's a story that I believe is quite instructive...

* * *

Red Bank might be best known as the hometown of the great Count Basie. Like many small towns in New Jersey, it runs its own K-8 school district; high school students attend a larger "regional" high school that is fed by two other K-8 districts, Little Silver and Shrewsbury. The three districts are all quite small; combined, their enrollment is smaller than that of many other K-12 districts or regional high schools in the area.

This map from the National Center for Education Statistics shows the Red Bank Regional High School's total area, and the three smaller K-8 districts within it. You might wonder why the three districts don't consolidate; just the other day, NJ Senate President (and probable gubernatorial candidate) Steve Sweeney argued he'd like to do away with K-8 districts altogether. The estimates as to how much money would be saved are probably too high, but in this case it would still make a lot of sense.

The reality, however, is that these three K-8 districts are actually quite different:

Here are the free-lunch eligible rates for the three K-8 districts, and the regional high school. Red Bank students are far more likely to qualify for free lunch, a measure of economic disadvantage. Last year, Shrewsbury had one student who qualified for free lunch. It's safe to guess most of the high school's FL students came from Red Bank.

But I've also included another school: Red Bank Charter School. Its FL population is higher than Little Silver's or Shrewsbury's, but only a fraction of the FL population in Red Bank. What's going on?

With the first flakes of an anticipated blizzard falling outside, a hearing on a proposed enrollment expansion by the Red Bank Charter School was predictably one-sided Friday night.

As expected, charter school Principal Meredith Pennotti was a no-show, as were the school’s trustees, but not because of the weather. They issued a statement earlier in the day saying they were staying away because the panel that called the hurry-up session should take more time in order to conduct “an in-depth analysis without outside pressure.”

Less expected was district Superintendent Jared Rumage’s strongly worded attack of charter school data, which he said obscured its role in making Red Bank “the most segregated school system in New Jersey.”

That's a very strong claim. I'm not about to take it on, but I do think it's worth looking more closely at how the charter school's proposed expansion might affect Red Bank's future:

The charter school proposal calls for an enrollment increase to 400 students over three years beginning in September. Supporters of the non-charter borough schools contend the expansion would “devastate” the district, draining it of already-insufficient funding, a claim that charter school officials and their allies disputed at a closed-door meeting Wednesday night.

As a row of chairs reserved for charter school officials sat conspicuously empty, a standing-room crowd gathered in the middle school auditorium heard Rumage revisit familiar themes, claiming that the expansion plan filed with the state Department of Education on December 1 relies on outdated perceptions about the district.

Continuing a battle of statistics that’s been waged for the past eight weeks, Rumage countered assertions made at a closed meeting Wednesday, where charter school parents were told the expansion would have no adverse impact on the district, and would in fact bolster the district coffers.

This is, of course, the standard play by charter schools these days when confronted with the fiscal damage they do to their hosting districts: claim that they are actually helping, not hurting, their hosts. Julia Sass Rubin*, however, did a study of how Red Bank CS funding affects the local schools. What she points out -- and what seems to have been lost on the charter's spokespeople in their own presentation to their parents -- is that the charter gets less funding per pupil largely because it enrolls a different student population than the public district schools.

This is one of the great, untold secrets of NJ charter school funding: the amounts are weighted by the types of students you enroll. If a charter school takes a student who qualifies for free lunch, or is Limited English Proficient, or has a special education need, the charter gets more money than if it took a student who was not in those categories. That's only fair, as we know students who are at-risk or have a particular educational need cost more to educate.
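To make the mechanics concrete, here's a toy sketch of weighted per-pupil funding in Python. The base amount and the weights below are invented for illustration; they are not the actual figures from New Jersey's funding formula.

```python
# Hypothetical sketch of weighted per-pupil funding.
# BASE and WEIGHTS are illustrative assumptions, not NJ's actual formula.
BASE = 11000.0          # assumed base per-pupil amount
WEIGHTS = {             # assumed extra funding as a fraction of the base
    "at_risk": 0.47,    # e.g., free-lunch eligible
    "lep": 0.50,        # limited English proficient
    "sped": 0.30,       # special education (simplified; real formulas vary)
}

def weighted_amount(categories):
    """Per-pupil funding for a student in the given need categories."""
    return BASE * (1 + sum(WEIGHTS[c] for c in categories))

# A school enrolling higher-need students generates more aid per pupil:
print(weighted_amount([]))                  # general-education student
print(weighted_amount(["at_risk"]))         # free-lunch eligible
print(weighted_amount(["at_risk", "lep"]))  # free lunch + LEP
```

The point of the weights is exactly the one above: a charter enrolling fewer at-risk, LEP, or special education students will, by design, receive less per pupil.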

Here are the special education classification rates for all schools in the Red Bank Regional HS area. Red Bank CS has, by far, the lowest classification rate of any district in the region. Of course they are going to get less funding; they don't need it as much as their host district, because their students aren't as expensive to educate. Further, by enrolling fewer special education students, they are concentrating those students in the Red Bank Borough Public Schools. Is this a good thing?

But that's not the only form of segregation that's happening:

Red Bank Borough has few white students in its public district; the charter school has far more. But look at the high school and the other two feeders: they have even more white students proportionally than the charter school. Yes, the charter is creating segregation -- but that's hardly the entire story.

In addition, there's one more very curious thing about this situation. There are, in fact, other areas in New Jersey with K-8 districts that feed into regional high schools, and those K-8 districts, like here, can have very different student populations. StateAidGuy points out a particularly interesting case in Manchester Regional High School: many students who attend K-8 school in North Haledon, a more affluent town than its other neighboring feeders, don't go on to the regional high school. The unstated reason is that parents in that town do not want their children attending high school with children from less-affluent districts; Jeff also notes the racial component to that situation.

But that's not the case for Red Bank Regional High School; in fact, the school attracts more students than those who graduate from its feeders!

These are the sizes of different student cohorts when they are in Grade 8 in the feeders, and then Grade 9 in the high school. The high school actually attracts more students from the area: it has popular vocational academies that can enroll students from other districts, and an extensive International Baccalaureate program.

So the notion that the largely white and more affluent families in Shrewsbury and Little Silver would be scared off by a three-district consolidation with Red Bank doesn't seem to have a lot of evidence to support it. The students already come together in the high school, and that appears to be working out well (at least as far as we can learn from the numbers).

Furthermore, the three towns are within a small geographic area, about 4 miles across. A centrally located school, particularly for the younger children, wouldn't be any further than a couple of miles away for families. It would be quite feasible to implement a "Princeton Plan" for the area; for example, all K-2 students would attend one school, 3-5 another, and 6-8 another.

But the Red Bank Charter School appears to be moving the area away from desegregation. If the expansion goes through, it's likely to make any chance at consolidation go away, because the Red Bank district is likely to become more segregated.

Again, the effects of consolidation on the budgets of the schools would probably be modest -- but the effects on desegregation could be enormous. New Jersey has highly segregated schools; this would be a real chance to undo some of that. But expanding a charter that serves a fundamentally different student population is almost certain to make segregation in the Red Bank region more calcified.

And for what? In their application for expansion, Red Bank CS boasts about its higher proficiency rates than Red Bank Boro. But it's not hard to boost your test scores when you enroll fewer special need students and fewer students in economic disadvantage. What if you take into account the different student populations?

What I've done below is to create a simple linear regression model that predicts mean scale scores. The sample is all schools in Monmouth County, NJ. The model uses free and reduced-price lunch (FRPL) as an independent variable in Grade 5. I add special education percentages (which weren't statistically significant in the Grade 5 model) for Grade 8.

What I'm basically doing here is looking at all the schools in the county and, based on their scores and their students, creating a model that predicts where we would expect the school's average test score to be given its student population. Some schools will "beat prediction": they'll score higher than we expect. Some will score lower than prediction.
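For readers who want to see the mechanics, here's a minimal sketch of this kind of model in Python. The FRPL rates and scale scores below are made up for illustration; the actual model was fit to the real Monmouth County data.

```python
import numpy as np

# Hypothetical illustration: percent FRPL and mean scale score for a
# handful of schools (invented numbers, not the actual county data).
frpl = np.array([5.0, 12.0, 30.0, 55.0, 80.0, 90.0])
score = np.array([760.0, 755.0, 748.0, 738.0, 732.0, 735.0])

# Fit: score = b0 + b1 * frpl (ordinary least squares)
b1, b0 = np.polyfit(frpl, score, 1)
predicted = b0 + b1 * frpl
residual = score - predicted  # positive residual = "beats prediction"

# R^2: share of score variance explained by FRPL alone
ss_res = np.sum(residual ** 2)
ss_tot = np.sum((score - score.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

for f, r in zip(frpl, residual):
    status = "beats prediction" if r > 0 else "below prediction"
    print(f"FRPL {f:5.1f}%: residual {r:+6.2f} ({status})")
```

The residual is the whole game: a high-FRPL school with a large positive residual is doing better than its student population would predict, which is a far fairer comparison than raw proficiency rates.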

Let me be very clear on this: I would never suggest this is a comprehensive way to judge a school's effectiveness. I'm only saying that if you're going to make a case that a school should be allowed to expand based on its test scores, this is a far more valid approach than simply putting out numbers that are heavily influenced by student population characteristics.

Let's start with Grade 5 English Language Arts (ELA).

That's Red Bank Middle School in the upper right. About 76 percent of the variation in Grade 5 ELA scale scores in Monmouth County can be statistically explained by the percentage of FRPL students enrolled in each school. Red Bank Middle has one of the highest FRPL rates in the county, yet it does exceptionally well in getting test scores above where we'd predict they'd be based on its student population.

What about Grade 8?

For those with sharp eyes: I changed the x-axis to FL instead of FRPL (the model still uses FRPL). The charter does somewhat better than Red Bank Middle School; however, the public district school in Red Bank still beats prediction.

I always say this when I do these: absent any other information, I have no doubt that Red Bank CS is full of hard-working students and dedicated teachers; they should all be proud of their accomplishments. But it's clear that it's very hard to make the case that Red Bank CS is far and away superior to Red Bank Middle.

The Red Bank region has a chance to do something extraordinary: create a fully-integrated school district that serves all children well. I don't for a second believe that will be at all easy; we have plenty of research on the tracking practices, based on race and other factors, of schools that are integrated in name only.

But why turn down the chance to at least attempt something nearly everyone agrees is desirable in the name of "choice"? Especially when the "choice" is going to have a negative effect on the hosting school's finances? And when there's little evidence the "choice" is bringing a lot of extra value to its students to begin with?

Who knows -- maybe there's some way to have Red Bank CS be part of this. Maybe it can provide some form of "choice" to all students in the region. But not like this; all an expansion will do in this case is make it even harder to desegregate the area's schools. This is exactly the opposite of NJDOE Commissioner Hespe's mandate; can he honestly say there are benefits from expanding Red Bank CS that are worth it?

I wish I could say that what's happening in Red Bank is an isolated incident; it's not. Let's stay out in the NJ 'burbs for our next stop...

The next year, that same cohort of students, who were now in Grade 4, showed substantially different results on the same test; the Montclair average was now substantially higher than TEAM/KIPP's.

I don't point this out to suggest either that Montclair's schools are superior, or that TEAM/KIPP's schools are inferior. Without adequately controlling for at least the observed variations in each district's populations (and acknowledging that there are likely many unobserved variations), any comparison between the two systems is utterly pointless.

My point here is that facile, a-contextual, cherry-picked factoids like these are completely meaningless, and that people who bring them up time and again show themselves to be fatuous.

- The latest official figure for TEAM/KIPP's post-secondary (college) enrollment rate is 82 percent. I think this is very good and TEAM/KIPP should be proud of their work; however, once again, it is pointless to say that TEAM/KIPP is getting far superior results to those of the district schools unless and until you account for the differences, both reported and unreported, in the student populations. Further, simply citing one year's post-secondary enrollment rate, which has not even been confirmed by official sources, is at best incomplete and at worst just plain old lazy.

- Dale Russakoff's book, The Prize, does not make the claim that TEAM/KIPP spends $400 per student on custodians while the Newark Public Schools spends $1,200 per student. As I wrote in my brief on Russakoff's (mis-)use of data, here is the relevant passage from the book:

“Christie had not funded the full formula since taking office, citing the state fiscal crisis, but the allocation was still equivalent to about $20,000 per student. Less than half of this, though, reached district schools to pay teachers, social workers, counselors, classroom aides, secretaries, and administrators – the people who actually delivered education to children. For example, the district calculated that it spent $1,200 a year per student on Avon’s janitorial services; BRICK founder Dominique Lee researched the cost on the private market and found it was close to $400 per student.” (p.135)

First of all, there is nothing in here about TEAM/KIPP. Second, the claim of $1,200 per year at BRICK, an NPS school, is unsourced. My review of NPS data calls into question the veracity of the claim; NPS documents showed spending of about $225 per pupil on custodial salaries (see my brief for the data source). Finally, there is no documentation of how Lee calculated her figure, or what the "private market" means.

I think I've been more than fair to Russakoff, but I also think it's simply unacceptable for "facts" like these to work their way into the mainstream media. She has actually misquoted her own book in interviews. It's important to be clear and rigorous with this stuff; I have found Russakoff's use of data in The Prize to be neither. Sorry to be blunt, but enough's enough.

- The notion that Newark's charters have less bureaucratic bloat than NPS schools is contradicted by state data.

Newark spends more on classroom instruction per pupil than most Newark charters, including TEAM/KIPP.

This is reflected in the large number of these support personnel per student at NPS compared to most charters.

While TEAM/KIPP has equivalent numbers of social workers per student compared to NPS, the district also has many more psychologists, school counselors, and nurses per student.

NPS has lower administration costs per pupil than any Newark charter school.

NPS's administrative salary costs are among the lowest in the city.

Despite having a crumbling infrastructure, NPS plant costs are not inordinately high compared to the charters.

Russakoff has claimed that only half the money spent by NPS makes it "into the classroom." Yet she never explains what that means, she never explains her methodology for arriving at the figure, and she never fully sources the figure. In the face of all this contradicting evidence that comes directly from the state, Russakoff and the people who quote her have an obligation to explain the apparent contradiction here. Is the state data wrong? If so, how do we know?

You can't just fling data around without explaining how it was created, where you got it, and how it should be interpreted in the proper context.

I'm tired of hectoring people who clearly don't give a damn about their own reputations. But I'm not going to stop pointing out when claims are made about schools that have no proper context, are cherry-picked, are poorly sourced, or are just plain wrong. What I have above are the facts. You can check them out yourself. If I'm wrong, I'll correct them.

But if I'm right...

You can't argue with people who repeatedly bury their heads in the sand. All you can do is point out the facts to those who are willing to listen.

ADDING: This is very, very frustrating to me. In an otherwise excellent conversation about Newark and its schools, Owen Davis, who I admire greatly, uses Russakoff's book as a source to make the case the charters have less bureaucratic bloat than NPS:

OD: Of course the district should undergo the “forensic audit” that Russakoff suggests. More money should be going to the children in the classrooms, especially when that means more social workers, counselors, teachers assistants, etc. But it has to be understood w/in the context of a depressed local economy where middle class jobs are scarce.

The charter schools in Newark aren’t weighed down by that economic drag, and Russakoff shows how kids and teachers benefit from leaner bureaucracies and more agile administrators. There’s no question that kids are better off when their schools can provide them with more, faster. But the existence of charter schools doesn’t answer the question of wider economic impacts when the district shrinks. [emphasis mine]

Again: Russakoff's tale is contradicted by official state data. Further, she has absolutely not made the case that her sources are better than the state's own reporting.

This has got to stop. We are telling the wrong story, and it's going to lead us to the wrong conclusions.

Sunday, January 24, 2016

1) Lately, the posts are longer and much more likely to be filled with statistical stuff.

2) I've had it with arguing with reformy hacks.

This is probably something I should have said as part of a New Years post, but what the hell... I had a lot of time to think today while pushing the snow blower around, and it's become increasingly clear that I want to change direction in 2016:

As I said before: the reformy side really has nothing. If the best response you have to charter skepticism -- which is not, by the way, the same as saying there is no place for choice or charter schools in our education system; it's actually saying that the claims of vastly superior results in the charter sector are largely nonsense -- is to make thinly veiled accusations of racism, you're really running on fumes.

If the best response you have to the legitimate concerns of parents who, among other actions, opt their children out of standardized tests is to say that they are merely coddling their kids, you really have nothing to contribute to the conversation about America's schools.

If you spend your days beating up teachers unions while ignoring the serious problem of inadequate and inequitable funding for our schools, you're not someone I want to waste my time on.

So this blog is, I hope, entering a new phase. Or maybe it's more accurate to say I'm going to try to spend more time writing things I myself would like to read: evidence-based, rigorous, serious discussions about American education, using publicly available data and other forms of evidence to fight off the tired, ignorant platitudes that have come to dominate the conversations about this nation's schools.

Reformsters: if you want to "swarm" me while I do this, go ahead. At this point, I really couldn't care less. I am not going to waste my time debating you on your facile meandering. If you want me to engage, step up; otherwise, you're just not worth it.

Let's start by spending the next week or two looking at the New Jersey suburbs, and why almost everything you've heard about school "choice" is probably wrong. Stand by...

Wednesday, January 20, 2016

A little background on what you're about to read: In the spring of last year, nj.com posted a story about the PARCC exam -- the new, semi-national standardized test that has been a large source of controversy -- and how it would affect teacher evaluations in the state. I happened to notice a really great comment from "Rutgers Professor" just below the article. The scholar in question is Gerald A. Goldin. I don't know him personally, but I had certainly heard about him: he is the definition of a scholar, distinguished in both his field and the teaching of his field.

It bothered me, frankly, that someone as knowledgeable as Goldin, who had written a genuine essay within his comment, wasn't featured more prominently in this post. Since I'm at Rutgers myself, I contacted him to ask if I could publish what he wrote. I didn't hear back from him until later in the fall; he was away, and then I was away, and you know how that goes. Dr. Goldin, however, was very gracious and agreed to let me reprint what he wrote. I only apologize that I haven't done so until now.

What you're about to read is important; Gerald Goldin's opinion on how PARCC will be used matters. I know the state has dropped the percentage of SGP used in a teacher's total evaluation to 10 percent, but even that's too much for a method that is fundamentally invalid. I'm honored to host this essay on my blog. Thanks, Dr. Goldin, for this contribution.

* * *

An 8th thing to know: Junk statistics

I read with interest the on-line article (March 16, 2015), “7 things to know about PARCC’s effect on teacher evaluations” at www.nj.com/education/.

As a mathematical scientist with knowledge of modeling, of statistics, and of mathematics education research, I am persuaded that what we see here could fairly be termed "junk statistics" -- numbers without meaning or significance, dressing up the evaluation process with the illusion of rigor in a way that can only serve to deceive the public.

Most New Jersey parents and other residents do not have the level of technical mathematical understanding that would enable them to see through such a pseudoscientific numbers game. It is not especially reassuring that only 10% of the evaluation of teachers will be based on such numbers this year, 20% next year, or that a teacher can only be fired based on two years' data. Pseudoscience deserves no weight whatsoever in educational policy. It is immensely troubling that things have reached this point in New Jersey.

I have not examined the specific plans for using PARCC data directly, but am basing this note on the information in the article. Some of the more detailed reasons for my opinion are provided in a separate comment.

In short, I think the 8th thing to know about PARCC’s effect on teacher evaluation is that the public is being conned by junk statistics. The adverse effects on our children’s education are immediate. This planned misuse of test results influences both teachers and children.

Why the reportedly planned use of PARCC test statistics is “junk science”:

First, of course, we have the “scale error” of measurement in each of the two tests (PARCC and NJ-ASK). Second, we have random error of measurement in each of the two tests, including the effects of all the uncontrollable variables on each student’s performance on any given day, resulting in inattention, misreading of a question, “careless mistakes,” etc. Third, we have any systematic error of measurement – possibly understating or overstating student competency – that may be present in the test instruments, may be different in the two instruments, and may vary across the test scales.

The magnitude of each of these sources of error is about doubled when the difference of two independently-obtained scores is taken, as it is in calculating the gain score. In addition, since two different test instruments are being used in the calculation, taking the difference of the scores requires some derived scale not specified in the article, which can introduce additional error. These sources of error mean that each student's individual gain score has a wide "error bar" as a measure of whatever it is that each test is designed to measure.
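[Editor's note: here's a quick Monte Carlo simulation of how independent measurement error in two test scores compounds in the gain score. Every number below is invented for illustration; the error magnitude SIGMA is an assumption, not a PARCC figure.]

```python
import random

random.seed(0)

# Assume each test score carries independent random error with standard
# deviation SIGMA (an assumption, not an actual PARCC statistic).
SIGMA = 10.0
TRUE_YEAR1, TRUE_YEAR2 = 200.0, 215.0   # hypothetical "true" scores
N = 100_000

def noisy(true_score):
    return random.gauss(true_score, SIGMA)

# Simulate many observed gain scores for the same underlying student:
gains = [noisy(TRUE_YEAR2) - noisy(TRUE_YEAR1) for _ in range(N)]
mean_gain = sum(gains) / N
sd_gain = (sum((g - mean_gain) ** 2 for g in gains) / N) ** 0.5

print(f"true gain: {TRUE_YEAR2 - TRUE_YEAR1}")
print(f"observed gain SD: {sd_gain:.1f} (vs. {SIGMA:.1f} per test)")
# The variance of a difference of two independent scores is the sum of the
# two variances -- doubled, if they are equal -- so each student's gain
# score carries a wider error bar than either test score alone.
```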

Fourth, we have “threshold effects” – some students are advanced well beyond the content intended to be measured by each test, while others are far behind in their knowledge of that content. The threshold effects contribute to contaminating the data with scores that are not applicable at all. Note that while the scores of such students may be extremely high or low, their difference from one year to the next may not be extreme at all. Thus they can contribute importantly in calculating a median (see below).

A fifth effect results from students who did not take one of the two tests. Their gain scores cannot be calculated, and consequently some fraction of each teacher’s class will be omitted from the data. This may or may not occur randomly, and in any case it contributes to the questionability of the results.

Sixth is the fact that many variables other than the teacher influence test performance – parents’ level of education, socioeconomic variables, effects of prior schooling, community of residence, and so forth. Sophisticated statistical methods sometimes used to “factor out” such effects (so-called “value added modeling”) introduce so much additional randomness that no teacher’s class comes close in size to being a statistically significant sample. But without the use of such methods, one cannot properly attribute “academic growth” or its absence to the teacher.

According to the description in the article, the student gain scores are then converted to a percentile scale ranging from 0 to 100, by comparison with other students having "similar academic histories." It is not clear to me whether this means simply comparison with all those having taken both tests at the same grade level, or also means possibly stratifying with respect to other, socioeconomic variables (such as district factor groupings) in calculating the percentiles. Then the median of these percentile scores is found across the teacher’s class. Finally, the median percentile of gain scores is converted to a scale of 1-4; it is not specified whether one merely divides by 25, or some other method is used.

However, a seventh objection is that test scores, and consequently gain scores, are typically distributed according to a bell-shaped curve (that is, approximately a normal distribution). Percentile scores, on the other hand, form a level distribution (that is, they are uniformly distributed from 0 to 99). This artificially magnifies the scale toward the center of the bell-shaped distribution, and diminishes it at the tails. Small absolute differences in gain scores near the mean gain score result in important percentile differences, while large absolute differences in gain scores near the extremes result in small percentile differences.
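[Editor's note: the compression effect is easy to see with a toy calculation. The gain-score distribution below is hypothetical.]

```python
from statistics import NormalDist

# Hypothetical bell-shaped gain-score distribution (invented parameters).
nd = NormalDist(mu=0, sigma=10)

def percentile(gain):
    """Convert a gain score to its percentile under the distribution."""
    return 100 * nd.cdf(gain)

# The same 5-point gain-score difference moves percentiles very differently
# near the center of the bell curve versus out in the tail:
center_jump = percentile(2.5) - percentile(-2.5)   # near the mean
tail_jump = percentile(30) - percentile(25)        # far out in the tail
print(f"center: {center_jump:.1f} percentile points")
print(f"tail:   {tail_jump:.1f} percentile points")
```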

There are more complications. The distribution of performance on one or both tests may be skewed (this is called skewness), so that it is not a symmetrical bell-shaped curve. How wide the distribution of scores is (the “sample standard deviation”) is very important, but does not seem to have been taken into account explicitly. Sometimes this is done in establishing the scales for reporting scores, in which case one thereby introduces an additional source of random error into the derived score, particularly when distributions are skewed.

Eighth, and perhaps most tellingly, the median score as a measure of central tendency is entirely insensitive to the distribution of scores above and below it. A teacher of 25 students with a median “academic growth” score of 40 might have as many as 12 students with academic growth scores over 90, or not a single student with an academic growth score above 45. To use the same statistic in both cases is patently absurd.
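[Editor's note: a toy version of the two classes described above, with invented scores.]

```python
from statistics import median

# Two hypothetical classes of 25 "academic growth" scores each.
# Both have median 40, but wildly different distributions above it.
class_a = [10] * 12 + [40] + [95] * 12      # 12 students above 90
class_b = [35] * 12 + [40] + [44] * 12      # no student above 45

# The median summarizes these two classes identically:
print(median(class_a), median(class_b))     # same median
print(max(class_a), max(class_b))           # very different classes
```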

These comments do not address the validity of the tests, which some others have criticized. They pertain to the statistics of interpreting the results.

The teacher evaluation scores that will be derived from the PARCC test will tell us nothing whatsoever about teaching quality. But their use tells us a lot about the quality of the educational policies being pursued in New Jersey and, more generally, the United States.

Saturday, January 16, 2016

@jerseyjazzman@The74 Few data points? So when NJ releases parcc and Alexander scores right, will look for a retraction.

Um, no. As you will read below, unless NJDOE changes how it does business, we'll never see what Alexander's score on the PARCC was in the official state data, because North Star's results are published in the aggregate: we only get results for all of North Star's schools together, and not the individual schools in their system. I actually wonder how North Star was able to separate Alexander from its other schools; does NJDOE have school-level data for charter chains? If yes, why don't they release those figures publicly? And release the special education and student demographic and suspension rates and all the other breakdowns by school as well? More and better data -- please.

When I want some shallow charter school cheerleading, I go straight to Campbell Brown's The 74 Million. With the possible exception of Education Post, you'll not find a more reliable source of credulous, unconditional charter love; take, for example, this piece by the reliably reformy Richard Whitmire:

But a steady drip of recent data points to a very different story line: Not only did the reforms of traditional Newark Public Schools produce some real benefits, but the relatively small portion of the gift invested in Newark charter schools paid off big. Real big.

Whitmire gives three citations to back up his claim that Newark's schools produced "real gains." Of course, all are from -- surprise! -- The 74: one from a spokesman for KIPP, the largest charter chain in Newark; one from current State Superintendent Chris Cerf, who, when he was the state education commissioner, was former State Superintendent Cami Anderson's biggest supporter; and one from Anderson herself.

All trot out data points to make their case; unfortunately, none account for what quantitative researchers often call secular effects: changes that affect the entire system being studied. In other words, if Newark's test scores rose as part of a statewide rise in scores, that really doesn't make the case that Newark's reforms had any direct, causal effect.
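For the record, here's the kind of adjustment I mean, in a toy Python sketch. Every figure below is invented for illustration; the point is the arithmetic, not the numbers.

```python
# Sketch: naive gains vs. gains net of a statewide ("secular") trend.
# All figures are invented for illustration.
newark = {"2013": 200.0, "2015": 212.0}   # hypothetical mean scores
state = {"2013": 201.0, "2015": 213.5}    # hypothetical statewide means

newark_gain = newark["2015"] - newark["2013"]   # raw gain
secular_gain = state["2015"] - state["2013"]    # statewide gain
net_gain = newark_gain - secular_gain           # gain net of the trend

print(f"raw Newark gain: {newark_gain}")
print(f"statewide gain:  {secular_gain}")
print(f"net of trend:    {net_gain}")
# A raw 12-point jump looks impressive until the statewide rise is removed.
```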

Bruce Baker and I looked at Newark's test scores -- both charters and NPS -- over the period of "reform" in the city's schools. We found no evidence that Newark has seen any positive changes that couldn't be explained by overall, statewide trends (I'll have a similar analysis of graduation rates out soon).

So when Whitmire says NPS reforms produced "some real benefits," understand he is making a claim that is supported only by the weakest of evidence, and is based on data points from directly interested parties. Does anyone think that's enough to make a suggestion like this?

The gains are so striking, in fact, that they raise a key question: Why didn’t the Newark reforms emphasize charters from the beginning? If you look across the Hudson River where former New York City schools chancellor Joel Klein produced striking gains by pulling in the region’s top charters with offers of $1 a year rentals to use existing buildings, it’s reasonable to ask (with the admitted benefit of hindsight): Why didn’t Newark do the same?

First of all, as Leonie Haimson among others has shown, the Klein era was hardly an undeniable success. In addition, it's worth pointing out that there is a good argument to be made that NPS would have been better off not selling off properties like 18th Avenue School to private owners. Not only did NPS arguably not get fair market value for the property; as (again) Bruce Baker and Gary Miron point out, the taxpayers are paying to buy a property they already own and turn it over to private ownership.

But let's get to Whitmire's central claim: that the evidence shows us that Newark would have been better off had it turned more of its buildings over to charters. Do we have enough proof to say that's a reasonable argument? Whitmire hangs his hat on the "success" of one -- yes, one -- charter school conversion:

The latest and most dramatic example comes from test results released to The Seventy Four by the charter network Uncommon Schools for its North Star Academy Alexander Street School: Based on the tough new PARCC tests, in just a single year Uncommon was able to erase years of education malpractice. [emphasis mine]

OK, hold on -- Richard, you're going to base your case on data that is not publicly available, cannot be verified, and comes straight from an interested party? That's your evidence?

I've complained about this before: time and again, charter operators are relying on their own, proprietary data to make claims about their "success" -- and "journalists" like Whitmire just swallow them whole. I suppose it's to be expected that charters would do this; public school districts, to be fair, will also crow about their proficiency rates before all the data is made available to the public. But you would think Whitmire would prefer to wait until he could confirm the data...

Because you can't make the claim that test scores show any school is "succeeding" without accounting for differences in its student populations.

Whitmire thinks it's enough to simply show the state average proficiency rates for "non-economically disadvantaged" students and compare them to North Star/Alexander's. But that's wholly inadequate: what about differences in the populations of special education and Limited English Proficient students? What about the differences in the types of learning disabilities? What about the differences between free lunch-eligible and reduced price lunch-eligible students, which I've shown can significantly affect test scores?

When I testified last year before the NJ Legislature on Newark's "reforms," I showed this graph:

The methodology was (again) developed by Bruce Baker to control for differences in student characteristics, spending, class sizes, and other school variations, and then compare adjusted growth percentiles. Let's be fair: North Star is one of the more "efficient" schools in Newark. But several other schools, including NPS schools, are equally or more efficient. Further, many Newark charters are relatively inefficient. Where is there any proof a mass conversion of NPS schools to charters will substantively change student outcomes for the better?

But we're not done yet, folks. Because Whitmire doesn't even tell us the most important fact we need to know to fully evaluate the conversion of Alexander School into a charter:

Alexander was a school so depressing and so low performing that former Newark Superintendent Cami Anderson told me she used to cry when she visited. Uncommon agreed to assume responsibility for the building and its K-4 students, almost all of whom returned. [emphasis mine]

Yes, Uncommon took the K-4 students; but they weren't the only grades at Alexander. This memo comes straight from NPS:

Alexander Street: North Star Academy will operate grades K-4 beginning fall 2014. Alexander families with K-4th grade students must submit a One Newark Enrolls application. Families will have preference to North Star Alexander Elementary School or their other top-choice schools through One Newark Enrolls. Alexander families with 5th through 8th graders must submit a One Newark Enrolls application and will have preference to their top-ranked district or charter schools. [emphasis mine]

When North Star took over Alexander, they cut the Grade 5 through Grade 8 students loose. I ask all of you who thunder that we cannot tolerate any delays in charter school conversions: if it was so important for Alexander's students to be enrolled in a charter like North Star, why didn't Cami Anderson force the charter to take all of the students enrolled there?

Dropping the four middle school grades to focus on younger students is a luxury NPS district schools can't afford. And there's good reason to believe that it matters:

Again, from my testimony before the Joint Committee on the Public Schools. Here's how the Class of 2018 shrank during their Grade 5 through Grade 8 years at North Star -- all while their scale scores on the NJASK went up. Did Whitmire ever bother to ask about this?

Unless and until the state-run NPS district releases publicly available and vetted data, we will never know how many students actually stayed when Alexander converted. We will never know if the proportion of free lunch-eligible or special education students changed, because NJDOE only requires charters to report student demographics aggregated across their entire network.

And, perhaps most importantly: unless and until someone actually goes into North Star and carefully vets the school's practices, we will never really know if there are unobserved differences between North Star students and NPS students.

I get the sense sometimes that people think charter schools are a recent phenomenon. But North Star has been around since 1997; it's hardly new. According to many charter advocates, students and families learn from each other how charters differ not only from the public schools, but from other charters. I agree; I think the word most certainly has gone out about what kind of school North Star is.

North Star's certificated staff is one of the least experienced faculties in the city. Yes, I do think many parents know these facts, and I do think they can influence their choices. I think many parents hear that North Star is a "no excuses" charter with fewer highly experienced educators than NPS, and they think: "Is this school going to be a good fit for my child?" And then they act accordingly.

I am always amazed at charter advocates who can't follow through this rather simple line of reasoning -- one based on premises they all accept. If parents are going to "choose" their child's school, why would we be at all surprised that the children who enroll in a school like North Star differ from those who enroll in an NPS school -- even in ways that do not show up in state data?

Why would we think the children in "no excuses" charters are just like the children who are in other schools? Again, a "choice" system is predicated on families being informed consumers, able to access more information than the poorly-constructed measures of school effectiveness put out by clearly biased governmental entities.

If we believe this, then of course the students at North Star will take to a "no excuses" model of schooling better than the student population as a whole. Of course many students at North Star will leave after their families discover the school isn't the right "fit." I honestly do not understand why any charter advocate would ever try to argue otherwise.

Again, let me be fair: North Star is a school that generally outperforms on test-based measures given its student population. Good for them; they should be proud of that. But there is no evidence the "successes" of North Star can be significantly scaled up. North Star may work for the students who stay there, but there is no evidence it will work for all students; in my opinion, there's no evidence it will work for many of Newark's students, particularly given Uncommon's rather questionable pedagogical methods.

In addition: we are seeing more and more evidence that Newark's school funding system must change, or charter school proliferation will become an increasingly destructive force for the many students who, by choice, remain enrolled in NPS schools. That cannot be allowed to continue; it's completely unfair and it will rip the city apart. Full funding of all Newark schools is a necessary precondition for the expansion of charters.

These issues, however, seem to be lost on Richard Whitmire. Like any good ideologue, he substitutes spin and hype for facts and reason:

The awkward bottom line is that Newark traditional schools can’t compete with the top charters — a fact Anderson acknowledges — for really simple reasons: The charters can recruit promising talent and then lavish them with extensive training. Plus, the charters can take advantage of their slim headquarters staff to push more resources to the classroom. [emphasis mine]

Slim headquarters staff? Here, again, are Bruce Baker and Gary Miron:

These excess costs can be difficult to track since education management organizations do not report relevant, detailed, comprehensive expenditures in the same format or with comparable documentation as public districts or the charter schools themselves. One example comes from the IRS 990 form for the Uncommon Schools network, which operates North Star Academy in Newark, NJ. It reported 2012 compensation for its systemwide CEO approaching $270,000, for its CFO at $207,000, for its Senior Director of Real Estate at $130,000, and for the Newark managing director at $213,900; in addition, the network maintains school-level administrative staffs. These EMO salaries, presumably subsidized by management fees, are not accounted for in state professional staffing reports or in the schools' own expense reports, and they may not be fully accounted for by the management fees listed in financial reports, where they are typically shown as central administrative contracted service expenses. Administrative costs for the academy are also borne by the Newark Public School district, which has a separate districtwide administration. Thus, it is no easy task to determine exactly what the various administrative expenses for the North Star Academy actually total. [emphasis mine]

Newark parents should not have to settle for "voting with their feet"; like suburban parents, they should be able to vote with their vote.

If the Newark community wants to have schools with differing discipline codes, so be it. But parents should not have to enroll their children in schools that abrogate their rights simply to get them into schools that aren't crumbling and dangerous.

I'll be the first to agree that Kirp's argument was missing some key context. But Whitmire's propaganda here -- yes, let's call it what it is -- is, in my opinion, even worse. A few carefully selected data points that can't be verified and that come from an interested party do not add up to evidence for the radical remaking of Newark's schools that Richard Whitmire desires.

It's fine for charter schools to be proud of their work. But ideological charter cheerleading does not help anyone. Enough, please.

Gimme a "C"!

ADDING: Even the staff of Uncommon Schools agrees that finding the right "fit" is extremely important when choosing a school; just ask one of their college counselors:

Can you address the best way for students to research colleges -- resources, criteria, or do's and don'ts?

Visit in person whenever possible! Be intentional about your visits: use a checklist and take notes on your thoughts and impressions. Don’t fixate on “name brands” or college rankings - keep an open mind to what may be a good fit for you and your family. I like Collegeboard.org, Naviance and College-insight.org as online resources. Make a folder for every college you are researching to stay organized. [emphasis mine]

Well, if that's true for college, isn't it just as true for K-12 schools? Isn't that the entire point of the school "choice" movement: that some schools aren't always a good "fit" for everyone?

And if that's true, why should we believe North Star's methods will work for all children?

As always: Bruce Baker is my advisor in the PhD program at Rutgers GSE.