Today’s much-anticipated release of the Maine Department of Education’s “School Grades” has provoked the similarly anticipated, and likely intended, negative response from educators, who called the grading system “flawed” in public and who-knows-what in private. The Maine Education Association has certainly won this round on Twitter and in the newspaper online comments sections, with an overwhelming number of responses criticizing the grading methodology. Stephen Bowen himself seems to be resigned to expressing an awkward optimism about the whole thing:

I commend him on resisting the urge to use an exclamation point on that second sentence. I’m intrigued by his application of the Sheryl Sandberg “lean in” concept to parenting. In general, I don’t hold Stephen Bowen entirely blameless for this mysterious attempt to randomly shame a bunch of schools, but one sees the hand of the Big Man quite clearly in this whole enterprise.

Again, the Maine Education Association has been quite successful so far in steering the conversation to the fact that the DoE school grading system bears a strong, linear relationship to average district poverty as measured by the percentage of students qualifying for free or reduced lunch. One presumes that Bowen and Governor LePage were not prepared to pre-emptively address this extremely valid concern because the whole school grading idea is essentially another route to shoehorning more charter schools (and preferably virtual, Floridian charter schools) into the state, as this appears to be pretty much the Governor’s entire raison d’être with regard to intervention in Maine’s public school system. Otherwise, one can only assume that the always-significant relationship between income and school performance would have been something they chose to address.

I have not yet studied the entire methodology behind the school grading system, although others are raising issues about such problems as cut-off points for the letter grades, the significance of the number of students tested, and the imposition of a bell curve on the grading results. However, for the sake of argument let’s assume that the grades are a straightforward assessment of average standardized test scores and improvements over previous years on those test scores. (Yes, I know – large grain of salt.) What would those results tell us?

As the MEA and Mike Tipping point out, there is a very strong relationship between levels of poverty and average district ME DoE “school grades.” The averages in these areas are quite telling: the share of district students receiving free or reduced school lunch rises by roughly 10 percentage points for every step down in average district “school grade.” However, the averages are not identical to each district’s score. I used the DoE’s much improved new data interface to download the percentages of students receiving free or reduced lunch and looked more closely at how these variables relate to one another.

As you can see, although the general trend is quite clear, there are a number of schools which do not fit exactly along the line. The black line in the center of the scatterplot describes the predicted relationship between poverty and “school grade.” While the observations tend to cluster near the line, they are not all right on it. The distance between the dot – the actual observation – and the line – the predicted value – is known as the observation’s residual.

People enjoy looking at residuals because you can generally get lots of interesting information from thinking about what could explain why an observation deviates from its predicted score. In this case, we have a strong bivariate relationship between the ME DoE’s definition of quality and district wealth. The R² of the bivariate relationship is .32, which means that about a third of the variance in ME DoE school grades is explained by the share of district students receiving free or reduced school lunch. However, that means that there are other factors explaining the ME DoE school grades as well, unless it’s all just random “noise,” or inexplicable random variation.

Unfortunately, because poverty is such an important factor in predicting outcomes, you aren’t going to get anywhere in finding these additional explanations for school quality (per the DoE’s measurement of it, at least) unless you control for district wealth. Here’s where the residuals come in. By looking at the residuals – the distance from the line to the observed score for each district – you can see where districts are overperforming or underperforming relative to where we would expect them to be based on their average poverty levels.
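The mechanics here are simple enough to sketch in a few lines. The following is a minimal illustration – with entirely hypothetical district names and numbers, not the real ME DoE data – of fitting the ordinary-least-squares line of grade points (A = 4 down to F = 0) on poverty rate and ranking districts by residual:

```python
# Sketch: fit a bivariate OLS line of "school grade" (A=4 ... F=0) on district
# poverty rate, then rank districts by residual. All districts and figures
# below are made up for illustration, not the actual ME DoE data.

districts = {
    # name: (pct free/reduced lunch, grade points)
    "District A": (20.0, 3.0),
    "District B": (35.0, 3.0),
    "District C": (50.0, 2.0),
    "District D": (65.0, 0.0),
    "District E": (80.0, 1.0),
}

xs = [v[0] for v in districts.values()]
ys = [v[1] for v in districts.values()]
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Ordinary least squares: slope and intercept of the prediction line.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

# R^2: the share of the variance in grades explained by poverty.
ss_tot = sum((y - mean_y) ** 2 for y in ys)
ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
r_squared = 1 - ss_res / ss_tot

# Residual = observed grade minus predicted grade; a positive residual means
# the district scores above what its poverty level predicts.
residuals = {name: y - (intercept + slope * x)
             for name, (x, y) in districts.items()}

for name, r in sorted(residuals.items(), key=lambda kv: -kv[1]):
    print(f"{name}: residual {r:+.2f}")
```

Districts at the top of that printout are the overperformers; those at the bottom are scoring below what their poverty rate alone would predict.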

This is an interesting list. If we look at “school grade”/poverty residuals – that is, “school grade” performance controlling for poverty, we find that the following school districts are overperforming relative to how their average level of poverty predicts they should perform:

To read that table: the average Edgecomb or Whiting school is performing two grades above where it is predicted to perform, based on those districts’ average poverty level. The closer a residual is to zero, the more completely that district’s performance on the “school grades” is predicted by district income.

Now, lots of districts also fell below where their income predicted they should score. That’s just the hard truth of OLS regression, folks.

It personally gives me little joy to show you those districts, particularly as my son’s school is among them (although not one of the more extreme cases). It is interesting to note, however, that the bivariate relationship is in fact not perfectly linear, as demonstrated by looking at the moving averages. Districts with fewer than 15% of students receiving free or reduced lunch, on average, outperform their expectations based on the average relationship between poverty and “school grade.” Above that threshold, average performance is more accurately predicted by poverty level.

(Sadly, Excel does not allow me to show a smoothed version of this measure. But you get the idea.)
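Excel aside, a smoothed version of this sort of measure is easy to produce by hand. Here is a minimal centered moving-average sketch, again using made-up poverty/grade pairs rather than the actual DoE numbers:

```python
# Sketch: a centered moving average of grade points over districts sorted by
# poverty rate, to eyeball where the poverty/grade relationship bends.
# All numbers are hypothetical illustrations, not the real DoE data.

# (poverty %, grade points) pairs, already sorted by poverty rate.
data = [(5, 4.0), (10, 3.5), (15, 3.5), (25, 2.5), (35, 2.0),
        (45, 2.0), (55, 1.5), (65, 1.0), (75, 1.0), (85, 0.5)]

def moving_average(values, window=3):
    """Centered moving average; the ends use whatever neighbors exist."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        chunk = values[lo:hi]
        out.append(sum(chunk) / len(chunk))
    return out

grades = [g for _, g in data]
smoothed = moving_average(grades, window=3)
for (pov, _), s in zip(data, smoothed):
    print(f"{pov:>3}% poverty: smoothed grade {s:.2f}")
```

Plotting the smoothed series against poverty rate would show the same bend described above: a flatter stretch at the low-poverty end, then a steadier decline.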

Another interesting thing about the group of schools getting “school grades” below their predicted score is the number of state academies in the group. (For those not in the know, state academies were Maine’s original “charter school” – we were doing charter schools before charter schools were hip. They are public-private schools which create their own employment rules and receive town money, in addition to attracting full-freight-paying non-local students.) Only Erskine Academy added “school grade” value beyond what was predicted through income level, while six academies did not. It very well could be that there’s something important and systematic about the students who are sent to state academies, but it could also be a sign of what we should expect to see from the charter schools the next time these “school grades” are given.

**ETA: Thanks to Jonathan Pratt for bringing to my attention the fact that the academies had not in previous years been required by the state to participate in standardized testing in the same way. Because they had an exception around testing, they had lower participation – yet under the ME DoE “school grades” scheme they were summarily dropped one letter grade below what their raw score would otherwise have given them. Why the ME DoE decided to treat the academies in this way, yet altogether exclude other schools – like charters, or very small schools – from their grading system is really a total mystery. I will recalculate the regression based on the raw scores later today.**

Unless this “school grade” business was actually a one-off experiment intended to provide justification for adding more charter schools. Which would be simply shocking. Consider me shocked — shocked! — in advance.

If it isn’t entirely a farce, however, I would expect to see Commissioner Bowen speaking reasonably about the significance of poverty and the “school grade” system — pointing out both what poverty explains, as well as trying to see what it doesn’t.

6 Responses to What can we learn from the ME DoE School Grades?

Emily – Do the two graphs created by the MEA show direct correlations or not? I understand that the more poverty (free/reduced lunch) there is in a school, the more *likely* it was to get a lower grade, but did that actually happen? In other words, if School A had a 90% F/RL rate, did it actually receive a D or was it just more likely to receive a D? Thanks.

The MEA graphs reported average school poverty rate by grade group. They demonstrated a relationship between poverty and “school grades” by arranging those averages on a chart next to one another, showing a clear linear relationship between school grades in general and levels of poverty – the higher the grade, the lower, on average, the level of poverty.

However, those were averages and didn’t predict where every single school would be. Like you’re saying, they basically said that if a school had a high poverty rate, it was very likely to have received a low grade. What’s most interesting, though – at least to me – are the exceptions. These are places which are more distant from the average. If a school with a 90% poverty rate is getting a “C” under the ME DoE school grade system, that’s better than we’d expect. What’s going on with those places?

It should be – most of it is public, though I’m not sure about the “improvement of bottom 25%” measures for the elementary and middle schools or the individual school graduation rates for high schools. Worth taking a look!

Many thanks for your fine work! I run End 68 Hours of Hunger, and our Vision is to end childhood hunger in America one school at a time! Your statistics show that while the correlation may not be one for one, there is clearly a relationship between poverty and grades, and while individual schools may be doing better, the fact remains that we have to feed these children! If you feel so inclined, it would be awesome to check the ones outperforming the average on a school by school basis to see if there is another mechanism by which these children might be being fed that isn’t accounted for! Keep up the great work!