HALL MONITOR — A Signature Euphemism; a Daft Solution

September 1, 2017

By Jay Bullock

I’m so lucky! My school is a “signature school” in the Milwaukee Public Schools this year!

In 20 years of teaching I’ve seen my share of reform efforts and euphemisms. But “signature school,” as a way of indicating schools that failed to meet expectations on the last state report card, may be the craziest.

The designation entitles us to some bonus resources, like a “hotline” for administrators to call in emergencies (not, sadly, a Commissioner Gordon-style red phone) and “resources gathered to counter inequitable patterns,” whatever that means.

During a full week of professional development before students returned, teachers in “signature schools” were presented with a hefty list of “Classroom Set-up Expectations.”

These elicited actual laughs from my colleagues. A classroom library with a carpeted area, so students can sit at our feet? Where does that fit among the 40 desks for my sophomores? Posted weekly lesson plans? Come on, I have to adjust my lessons on the fly almost every day!

Let us not forget the in-class “cool down space,” complete with noise-canceling headphones, lavender-scented pillows, and “a small trampoline.” I am not making this up.

What’s not funny is mandatory posting of achievement, attendance, and discipline data on a “data dashboard,” updated hourly and prominently displayed outside the door of every “signature school” classroom.

This dashboard is clearly designed so central office personnel can see at a glance whether a classroom and its teacher are failing, because getting to know us and our kids by investing real time among us simply takes too long.

The shame (guilt, stigma — pick your noun) associated with bad data on our dashboards is somehow supposed to motivate teachers and students to do better.

Here’s the thing: we have pretty clear evidence that data walls don’t work.

They originated with the University of Chicago’s David Kerbow, who saw data visualization as a way for teachers and administrators to identify problems early. Private data walls in the office or staff lounge provide school adults with big-picture insight and prompt good discussion about what has worked, what hasn’t, and what to try next. They should be a tool for informing next steps, not for judging students or staff.

Importantly, there was never any intention to have “data walls” in view of students or the public. But why should that deter education reformers?

Despite the experiences of places like Holyoke, Mass., which saw probably the most famous uproar over them in 2014, worthless public “data walls” have steadily spread among low-performing schools and districts nationwide.

Yet we do know what does work. Let’s set aside policing-style classroom set-ups and shaming teachers and students, and instead focus on research-based solutions for “signature schools.”

So what works?

Achievement

Our must-post data comes from the district’s “universal screener” test, STAR. A screening test is not a test of student achievement; it is, as the label suggests, used for early identification of students who need remediation and intervention.

STAR covers only math and literacy, and only in some grade levels. It is not aligned to district curriculum, and it is given just three times a year. My sophomores took the STAR test on August 28 and will not test again until January. Of what value is that August score to anyone visiting my class in, say, November? What use is STAR data posted outside of, say, an art class, ever?

No reputable researcher or organization anywhere recommends using screener data this way, including state and national Response to Intervention (RtI) groups.

Better achievement happens when teachers track and celebrate individual student growth over time on specific key skills, which can’t be reflected in a single number. Such growth should be monitored constantly, not checked a few times a year.

As noted by the Achievement Network, a national nonprofit that partners with schools to boost academics, “This is not just about looking at the numbers, but looking at student work that illuminates specific needs of students.” No data dashboard can do that.

Attendance

Evidence is overwhelming that attendance improves when schools make personal connections to students and families, including through dedicated mentors. Some MPS high schools benefit from City Year, an AmeriCorps-funded program that places recent college grads in the role of mentor and interventionist for ninth-grade students only.

This is a start, but not enough, especially as City Year interventions miss the vast majority of MPS students and don’t quite go far enough with those they do reach.

According to a guide for schools from Hanover Research, mentors should do more than make a few calls home and see students at school. They should “meet with parents and occasionally participate in home visits for students with attendance or behavior issues.” Mentors should “monitor student progress and work alongside families and communities to improve attendance.”

Behavior

We must post how long it has been since we wrote a discipline referral, like the signs in factories that read, “This plant has worked x days without an accidental injury.”

There is research to suggest that such workplace signs indeed help minimize injury, but only after extensive safety training and the building of a shared sense of community responsibility among workers.

Posting referral data may well work when students have a shared sense of responsibility for each other. Simply posting it won’t do the difficult work of creating such a community.

MPS has taken some baby steps with Restorative Practices and trauma-sensitive training. But how does creating tension, competition, and division through these artificial, meaningless “data dashboards” build a caring, connected community?

Real change requires complicated and undoubtedly expensive work. A “data dashboard” is easy and cheap, but utterly useless to anyone except those who want to make snap judgments about students and their teachers.