Students at Hollenbeck Middle School in Boyle Heights. (Courtesy: Hollenbeck MS website)

My wife and I have two boys, ages 3 and 1. We are just a few years away from sending them to kindergarten, and like all parents, we want them to go to a “good school.” We want our children to learn and grow academically and socially so they can live rewarding and secure lives and contribute to society. It’s practically universal: parents seek out “good schools” even at considerable personal cost, because they want the best for their children.

There’s nothing wrong with that. But the way we talk about and measure what constitutes a “good school” is deeply problematic. California standardized test scores were recently released, and the conversation immediately focused on the percentage of students who met or exceeded state standards, or more commonly, percent proficient. School districts like Beverly Hills and San Marino were extolled for their high test scores, and districts like Los Angeles and Pasadena were shown to be comparatively lacking. Families, in turn, act on this information, choosing schools where higher percentages of students are proficient.

The problem with this is that proficiency numbers don’t tell you how much students are learning in their time at school. At some schools, the majority of students enter already reading at or above grade level. They are already on track for college and have many advantages to keep them there. If their school facilitates even a minimal level of learning, they will continue to score well on standardized tests and boost their school’s ratings.

But at many other schools, the majority of students enter far behind grade level. They may be learning English for the first time or coming from a school that did not prepare them well. Even when a school whose students start behind does a stellar job of teaching, its test scores will likely still reflect the gaps those students arrived with, leaving it labeled a “bad school.” This does a tremendous disservice to the educators doing heroic work to serve the students who most need our support. More importantly, it does a disservice to the students themselves.

Take for example Hollenbeck Middle School in the Boyle Heights neighborhood of Los Angeles. At Hollenbeck, 30 percent of students met standards in math this past year. This is still below the state average for middle schools (37 percent), and by some accounts, Hollenbeck is a struggling school. Those who look more closely may see that Hollenbeck’s scores are above-average for schools serving mostly low-income students and that scores have improved significantly over the past several years. But few outside the school staff and parents of Hollenbeck students seem to be asking how much students at Hollenbeck have learned over the course of the year. Where do students start and where do they finish?

When we examine the student-level data for Hollenbeck eighth-graders, we can see remarkable learning gains over their two years at Hollenbeck. Test scores are divided into four achievement bands, ranging from “exceeded standards” down to “not met standards.” Only 14 percent of last year’s eighth-grade Hollenbeck students met or exceeded standards as sixth-graders. Given that baseline, we would expect to see about 14 percent meeting or exceeding standards as eighth-graders. But by eighth grade, 30 percent of Hollenbeck students met this benchmark. Overall, 43 percent of students moved up at least one achievement band in math over their two years (52 percent maintained their band and only 5 percent declined). Viewed this way, the data show that Hollenbeck is doing a remarkable job of educating its students and improving their prospects for high school, college, and career. Surely these are better indicators of a “good school,” because they tell us how much students are learning.
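The growth calculation described above can be sketched in a few lines of code. Everything below is hypothetical: the student records are synthetic, not Hollenbeck’s actual data, and the snippet only illustrates how matched student-level scores from two years yield both the before/after proficiency rates and the improved/maintained/declined breakdown.

```python
# Illustrative sketch with made-up data: summarize growth from matched
# student-level records, as the article describes.

# California's four achievement bands, ordered low to high.
BANDS = ["not met", "nearly met", "met", "exceeded"]

# Hypothetical matched records: (student_id, 6th-grade band, 8th-grade band).
students = [
    ("s01", "not met",    "nearly met"),
    ("s02", "nearly met", "met"),
    ("s03", "not met",    "not met"),
    ("s04", "met",        "met"),
    ("s05", "nearly met", "nearly met"),
    ("s06", "not met",    "met"),
    ("s07", "nearly met", "exceeded"),
    ("s08", "met",        "nearly met"),
    ("s09", "not met",    "nearly met"),
    ("s10", "nearly met", "met"),
]

def pct_proficient(bands):
    """Share of students in the 'met' or 'exceeded' bands."""
    return sum(b in ("met", "exceeded") for b in bands) / len(bands)

def growth_summary(records):
    """Fraction of students who moved up a band, stayed, or declined."""
    up = same = down = 0
    for _, start, end in records:
        delta = BANDS.index(end) - BANDS.index(start)
        if delta > 0:
            up += 1
        elif delta < 0:
            down += 1
        else:
            same += 1
    n = len(records)
    return up / n, same / n, down / n

start_prof = pct_proficient([s[1] for s in students])
end_prof = pct_proficient([s[2] for s in students])
up, same, down = growth_summary(students)
print(f"Proficient in 6th grade: {start_prof:.0%}, in 8th grade: {end_prof:.0%}")
print(f"Improved a band: {up:.0%}, maintained: {same:.0%}, declined: {down:.0%}")
```

The only real complexity, as the next paragraph notes, is the matching itself: each student’s earlier score has to be linked to their later score, often across different schools, before a summary like this can be computed.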

So why don’t we use better metrics to track how much schools contribute to learning? One reason is that it can be complicated. It requires analysis of student-level data and matching scores from multiple years across multiple schools. Other states that use student growth measures to assess school quality have sometimes struggled to communicate the process in ways that families and everyday citizens understand.

Another, more cynical explanation might be that we don’t really care about how much students are learning, and that our system is designed to segregate middle-class, educated families from their less privileged peers. It may be that we want our students to go to schools where students are already succeeding, so our children can benefit from the peer effects. I hope this is not true, because if it is, it undermines the whole vision of what public education is meant to be — a collective public effort to educate our children.

There are efforts underway to change how we evaluate schools. The California Department of Education is considering adopting a new growth metric to be included in its dashboard. Greatschools.org, a popular website that shares school quality information, has recently added an academic progress indicator, which attempts to show year-to-year student growth, though it lacks the student-level data to do this well.

But to truly shift our focus from absolute proficiency scores to real student growth and learning, we need to change the conversation. When we talk about what makes a good school, we need to ask, “How much are students learning?” Where do students start when they begin their education at that school, and where do they end up when they finish? If we do that, we can focus on what really matters and celebrate the schools that are truly providing a good education.