Standards-Based Reforms

Nationally and in Ohio, we press for the full suite of standards-based reforms across the academic curriculum and throughout the K–12 system, including (but not limited to) careful implementation of the Common Core standards (CCSS) for English language arts (ELA) and mathematics as well as rigorous, aligned state assessments and forceful accountability mechanisms at every level.

Some say the world will end in fire. Some say in ice. But if you’re pressed for time and want to end all intelligent life quickly, nothing beats a task force.

In New York last week, a task force chosen by Governor Andrew Cuomo issued its report on Common Core. In a model of stunning governmental efficiency, the group managed to “listen” to 2,100 New York students, teachers, parents, and various other stakeholders. They then retreated to their chambers to write, edit, and publish a fifty-one-page report a mere ten weeks after they were impaneled. But clearly that was time enough for these solons to learn and thoughtfully consider what the Empire State needs: to adopt “new, locally driven New York State standards in a transparent and open process.” The report has twenty recommendations on how to bring this about.

It should be noted (speaking of governmental efficiency) that God himself was content with a mere ten modest suggestions to govern all known human activity. Cuomo’s task force has double that number—just for Common Core in a single state. But God acted alone. On a task force, every voice must be heard, every grievance aired. And they were, in all their...

In its 2015 state policy analysis, the National Association of Charter School Authorizers (NACSA) found that fourteen states (including Ohio) saw positive charter policy changes since its inaugural report last year. These wide-ranging improvements demonstrate the value of sizing up a state’s legal framework, diagnosing its structural problems, comparing it to peers, and using that information to press policymakers for change. In other words, rankings like this—and other seemingly wonky law and policy reviews—may actually pave the way for real improvements.

NACSA analyzed and ranked every state with a charter law (forty-three, plus the District of Columbia) against eight policy recommendations meant to ensure a baseline of authorizer quality and charter school accountability: 1) Can schools select from at least two authorizers? 2) Does the state require authorizers to meet endorsed standards (like NACSA’s)? 3) Does the state evaluate its authorizers? 4) Do poor authorizers face sanctions? 5) Do authorizers publish annual performance reports on schools? 6) Is every charter bound by a contract that outlines performance expectations? 7) Are there strong non-renewal standards, and can authorizers effectively close poor performers? 8) Does the state have an automatic closure law on the books?

We’ve seen a lot of hand-wringing over math achievement in this country. Our students continue to underperform against their peers in other countries, lighting a fire under educators and politicians to push new STEM (science, technology, engineering, and math) programming in schools. While these panicked efforts have admirable intentions, they are mostly barking up the wrong tree. Kids spend vastly more time outside school than in it—four or five times as many waking hours—and one-on-one attention during that time is a major unpulled lever for generating change. Sadly, the large majority of our population misses out on that opportunity completely.

It begins with parents, who are their children’s first teachers. Kids respond to the in-person presence of their parents more strongly than to anyone else. Patricia Kuhl at the University of Washington’s Institute for Learning and Brain Sciences (I-LABS) has shown that when a baby sees someone touch her mother’s hand, the same region of the baby’s brain lights up as when someone touches the baby’s hand. But when the baby instead watches a video of the person touching her mom’s hand, those regions don’t light up. Nor do they light up when a stranger’s hand is touched. Moreover, babies respond more strongly...

This study examines whether information supplied about a student’s ability helps inform that student’s decision to enroll in Advanced Placement classes. Specifically, the information “signal” is the “AP Potential” message on the student’s PSAT Results Report, as written by the College Board. Students who score at or above a certain cut point on the PSAT get a message that says, “Congratulations! Your score shows you have potential for success in at least one AP course!” while the others get a message that says, “Be sure to talk to your counselor about how to increase your preparedness.”

The sample comprised roughly five hundred sophomores at Oakland Technical High School who took the PSAT in 2013. The intervention was as follows: Right before and right after they received their PSAT results, which included one of the AP Potential messages above, students were given a survey that asked them (1) how they perceived their academic abilities and their plans relative to attending college; (2) the number of AP courses they planned to take; (3) whether they would take the SAT; (4) the probability that they’d pass the exit exam; and (5) the probability that they’d graduate high school.

Aided by a highly misleading New York Times article, the anti-Common Core crowd is pushing the narrative that Massachusetts’s recent testing decision (to use a blend of PARCC and its own assessment rather than go with PARCC alone) spells the end for the common standards effort. AEI’s Rick Hess and Jenn Hatfield called it a “bruising blow.” Bill Evers and Ze’ev Wurman described a testing system in “disarray.” Cato’s Neal McCluskey tweeted that Common Core is getting “crushed.”

First, let’s deal with Massachusetts, where the state board of education has decided to use a hybrid of PARCC and the Bay State’s own MCAS. In what must surely be a first, Commissioner Mitch Chester and Common Core opponent (and one-time senior associate commissioner) Sandra Stotsky concur: This move is no repudiation of PARCC. As Chester wrote in a letter to the Times, “Neither my recommendation to the Massachusetts...

When is a test not a test? Sure, there’s an easy answer—“When it doesn’t send opt-out parents running for their torches and pitchforks”—but that’s not what we’re looking for. Give up? It’s when the test is a “locally driven performance assessment.” An article in Education Week explains the rise of these specially designed student tasks in eight New Hampshire school districts, which have been granted authority by the Education Department to employ them as alternatives to standardized tests. The districts will work with the state and one another to develop Performance Assessment of Competency Education (PACE), a series of individual and group queries that allow students to exhibit mastery over a subject without filling in bubbles. The challenges (which include the design of a forty-five-thousand-cubic-foot water tower to show proficiency in geometry) sound stimulating, and the Granite State’s record in competency-based education is extensive. It’s not hard to see why such an option would be attractive to state and local officials, especially when testing has become roughly as popular there as a leaf-peeping tax. What remains to be seen is whether this approach to assessment captures the same vital data as traditional measures.

A small storm has blown up around the fact that certain math items on the 2015 National Assessment of Educational Progress (NAEP) do not align with what fourth and eighth graders are actually being taught in a few states—mainly places attempting to implement the Common Core State Standards within their schools’ curricula.

NAEP is only administered in grades four, eight, and twelve. So the specific issue is whether the fourth graders who sat for NAEP this spring had a reasonable opportunity to learn the skills, algorithms, techniques—broadly speaking, “the content”—on that test. If their state standards had moved some portion of what used to be fourth-grade math to the fifth or sixth grade, or replaced it with something else entirely, their state’s NAEP scores would likely be lower.

This kind of misalignment is blamed for some of the math declines that NAEP recently reported. Department officials in Maryland, for example, examined the NAEP math sub-scores and determined that many Maryland fourth graders are no longer being taught some of those things before they take the test.

We are left to wonder: Should NAEP frameworks and assessments be updated to reflect what’s in...

Last week, in the wake of President Obama’s pledge to reduce the amount of time students spend taking tests, my colleagues Robert Pondiscio and Michael Petrilli weighed in with dueling stances on the current state of testing and accountability in America’s schools. Both made valid points, but neither got it exactly right, so let me add a few points to the conversation.

Like Robert, I don’t see how we can improve our schools if we don’t know how they’re doing, which means we need the data we get from standardized tests. But I also believe that—because we’re obligated to intervene when kids aren’t getting the education they deserve—some tests must inevitably be “high-stakes.” The only real alternative to this is an unregulated market, which experience suggests is a bad idea.

Must this logic condemn our children to eternal test-preparation purgatory? I hope not, but I confess to some degree of doubt. The challenge is creating an accountability system that doesn’t inadvertently encourage gaming or bad teaching. Yet some recent policy shifts seem to have moved us further away from that kind of system.

As Mike noted, the problem of over-testing has been exacerbated in recent years by the...

A new study by the NAEP Validity Studies Panel analyzes the alignment of the assessment’s 2015 math items (the actual test questions) for grades four and eight to the math Common Core State Standards (CCSS).

To do so, the panel enlisted as reviewers eighteen mathematicians, teachers, math educators, and supervisors who have familiarity with Common Core. This group classified all 150 items in the 2015 NAEP math pool for each grade as either matching a CCSS standard or not.

The reviewers determined that the Common Core and NAEP were reasonably aligned at both grade levels—not surprising, since CCSS writers had the NAEP frameworks at their disposal. Further, NAEP is by design broader than the CCSS and is supposed to maintain a degree of independence relative to the “current fashions in instruction and curriculum.”

Panelists found that 79 percent of NAEP items were matched to content that appears in the CCSS at or below grade four. The overall alignment of NAEP items to CCSS standards at or below grade eight is even closer: 87 percent.

There is, however, variation in matches across content areas. In fourth grade, the least aligned content area was data analysis, statistics,...

OK, everyone, back away from the ledge. With the release of NAEP data this week, the predictable deluge of commentary is well underway—mainly of the gnashing-of-teeth, rending-of-garments variety. NAEP may be the nation’s report card, but it is also the nation’s Rorschach test. Perception is in the eye of the beholder, and many see darkness and misery: “A Decade of Academic Progress Halts,” says the Los Angeles Times. “Student Scores in Reading and Math Drop,” says U.S. News & World Report.

One of the frequent criticisms of NAEP punditry is “misNAEPery”—the sin of attributing score fluctuations to particular policies, for example. One particularly virulent form of this fallacy—the failure to account for demographic changes in states over time—has become slightly less tenable this week, courtesy of an illuminating analysis by Matthew Chingos of the Urban Institute.

Not every state is the same. States with higher concentrations of black and Hispanic children, low-income families, and English language learners (ELLs) have a harder time rising to the top because they have more students mired at the bottom. But when you adjust for these demographic realities, a different NAEP emerges. There’s Massachusetts, still sitting pretty atop the tables. But Texas and...