KRUGMAN (4/19/13): At the beginning of 2010, two Harvard economists, Carmen Reinhart and Kenneth Rogoff, circulated a paper, ''Growth in a Time of Debt,'' that purported to identify a critical ''threshold,'' a tipping point, for government indebtedness. Once debt exceeds 90 percent of gross domestic product, they claimed, economic growth drops off sharply.

Ms. Reinhart and Mr. Rogoff had credibility thanks to a widely admired earlier book on the history of financial crises, and their timing was impeccable. The paper came out just after Greece went into crisis and played right into the desire of many officials to ''pivot'' from stimulus to austerity. As a result, the paper instantly became famous; it was, and is, surely the most influential economic analysis of recent years.

In fact, Reinhart-Rogoff quickly achieved almost sacred status among self-proclaimed guardians of fiscal responsibility; their tipping-point claim was treated not as a disputed hypothesis but as unquestioned fact. For example, a Washington Post editorial earlier this year warned against any relaxation on the deficit front, because we are ''dangerously near the 90 percent mark that economists regard as a threat to sustainable economic growth.'' Notice the phrasing: ''economists,'' not ''some economists,'' let alone ''some economists, vigorously disputed by other economists with equally good credentials,'' which was the reality.

For the truth is that Reinhart-Rogoff faced substantial criticism from the start, and the controversy grew over time.

As Krugman notes, that paper faced “substantial criticism from the start.” He explains how the matter finally got resolved:

KRUGMAN: Finally, Ms. Reinhart and Mr. Rogoff allowed researchers at the University of Massachusetts to look at their original spreadsheet—and the mystery of the irreproducible results was solved. First, they omitted some data; second, they used unusual and highly questionable statistical procedures; and finally, yes, they made an Excel coding error. Correct these oddities and errors, and you get what other researchers have found: some correlation between high debt and slow growth, with no indication of which is causing which, but no sign at all of that 90 percent ''threshold.''

In fact, those “researchers” were three UMass graduate students. Three years after the paper appeared, it was “finally” left to three graduate students to fact-check this ungodly mess!

Other economics professors had given their bungling colleagues a pass. The professors had failed us again!
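For what it's worth, the spreadsheet slip Krugman mentions was, by published accounts, an AVERAGE formula that stopped several rows short, silently dropping a handful of countries. A toy sketch shows how easily that kind of omission moves an average (the growth figures below are invented for illustration; they are not the Reinhart-Rogoff data):

```python
# Invented per-country growth figures, standing in for a spreadsheet column.
growth_by_country = {
    "A": 2.4, "B": 0.3, "C": -1.1, "D": 3.0, "E": 1.2,
    "F": 2.6, "G": 0.8,
}

values = list(growth_by_country.values())

# The intended calculation: average over every country.
full_mean = sum(values) / len(values)

# The slip: the formula's range stops two rows early, omitting F and G.
truncated_mean = sum(values[:5]) / 5

print(round(full_mean, 2), round(truncated_mean, 2))
```

Nothing about the omission announces itself; both numbers look like perfectly ordinary averages, which is why the error survived until someone re-ran the arithmetic from the raw spreadsheet.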

Given this paper's importance, why didn’t other professors fact-check Reinhart and Rogoff’s work? We haven’t seen anyone ask that question; we haven’t seen any explanations. That said, our intellectual elites sat back and allowed this ludicrous gong-show to roll. If it weren’t for those three lowly graduate students, we still wouldn’t know about the famous professors’ bungles.

In a rational world, this would seem like a very strange set of events. But you don’t live in that rational world. As we have persistently told you, you don’t live in a world with functioning intellectual elites.

However strange this story may seem, this is the way our culture typically functions. Consider what happened in 2006 in The Case of the High Passing Rates.

In February 2006, a news report topped the front page of the hard-copy Washington Post.

The report was written by Jay Mathews, a high-ranking education writer whose work, tone and approach we typically like. But Mathews got thoroughly taken this day, along with the usual gang of education professors and “educational experts.”

Mathews got taken by a scam affecting the whole state of Virginia. He hadn’t noticed the scam before he published his upbeat front-page report. Neither had a high-ranking “educational expert” who was up to his ears in the mess.

After we exposed the scam—a scam affecting the whole state of Virginia—the Washington Post didn’t report the fact that the scam had occurred. Education writers and “educational experts” all agreed not to discuss it.

This is the way our culture typically works! Let’s recall the basics of what happened in this case:

Mathews was reporting the fabulous progress shown by Maury Elementary, a low-income public school in Alexandria, Virginia. What made Maury Elementary “a study in pride, progress”?

According to Mathews, the school had achieved fantastic passing rates on the 2005 Virginia state tests. “Perhaps the best news was Maury's jump in [reading] scores among third- and fifth-graders,” Mathews wrote. “The percentage of children passing the test shot up from just over 50 percent to 92 percent.”

Over the previous three decades, we had learned not to accept such pleasing stories at face value. But Mathews and his editors hadn’t learned this particular lesson. Mathews’ report ran across the top of the Post’s front page, accompanied by photographs of smiling students and parents.

When we fact-checked this latest tale, we uncovered a statewide scam. The head of the Virginia state school board acknowledged to us, on the record, that this statewide scam had occurred.

In a nutshell, this is what had happened:

At that time, Virginia elementary schools administered annual statewide tests in third and fifth grades only. There was no fourth-grade testing. This permitted the scam.

The state of Virginia had adopted a reporting system which was in effect a scam. This is the way it worked:

Imagine an elementary school, Jones Elementary, with 100 kids in each grade. In Year 1, they test their 100 third-graders. Only 50 “pass.”

For that first year, Jones Elementary would report a 50 percent passing rate. The scam would begin the next year, producing the illusion of progress.

In Year 2, Jones Elementary would test its 100 new third-graders. But the school would also give the third-grade test to the 50 kids who failed it the year before. These kids would now be in fourth grade. But once again, they would take the third grade test.

Uh-oh! Jones Elementary now has 100 third-graders—but it has 150 students taking the third-grade test. Let’s imagine that 60 of the new third-graders passed the test in Year 2—along with 40 of the fourth-graders who were once again given the third-grade test.

You’ll think we’re joking, but what follows is true: According to the state’s official protocols, Jones Elementary would report an enrollment of 100 third-grade students—and it would report that 100 students had passed the third-grade test! On the basis of those two (accurate) numbers, the school would then report a 100 percent passing rate.

Let's repeat: In fact, only 60 percent of the school's third-graders would have passed the test. But under the state’s official procedures, Jones Elementary would have reported a passing rate of 100 percent!
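The arithmetic of the protocol can be sketched in a few lines. The numbers are the hypothetical Jones Elementary figures from the example above, not real Virginia data:

```python
def reported_passing_rate(enrolled_third_graders, total_passes):
    """Virginia's protocol, as described above: divide ALL passes of the
    third-grade test (including retakes by fourth-graders) by the number
    of enrolled third-graders, capping the result at 100 percent."""
    rate = 100 * total_passes / enrolled_third_graders
    return min(rate, 100)

# Year 2 at the hypothetical Jones Elementary:
enrolled = 100       # new third-graders
passes = 60 + 40     # 60 third-graders plus 40 fourth-grade retakers

print(reported_passing_rate(enrolled, passes))  # reported rate: 100
print(100 * 60 / enrolled)                      # true third-grade rate: 60
```

Because retakers inflate the numerator while the denominator stays fixed, any school with enough repeat test-takers drifts toward (and past) 100 percent, which is why the state needed its own cap protocol.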

We know—you think that must be wrong. You think there is no earthly way a major state’s education department could have adopted a gong-show procedure like that.

We understand why you would think that—but in thinking that, you would be wrong. The state of Virginia had adopted the absurd protocols we have described. Those protocols were driving up passing rates all over the state, especially at lower-performing schools like Maury Elementary, the school which sat atop the Post’s front page, hailed for its “pride, progress.”

How absurd were the state’s protocols? The state even had a protocol telling schools what to do if their passing rates exceeded 100 percent. (Such a school should report its passing rate as 100 percent.) We know—you think that can’t be true! But it was true. We found that absurd protocol in open public documents.

(Nothing was being hidden. It's just that no one was looking, not even our various “experts.”)

With this in mind, how ludicrous was the Washington Post’s report about Maury Elementary? In fact, only 26 percent of Maury’s third-graders had passed the third-grade reading test. This was the second-lowest passing rate in the whole state of Virginia! And yes, you read that correctly:

The school with the state’s second-lowest passing rate sat atop the Washington Post’s front page, hailed for its “pride, progress.” In fact, the high passing rates hailed by the Post had been ginned up out of whole cloth. Those passing rates derived from the absurd protocols we have described.

Why are we able to tell you these things? Because, in that memorable case, we played the role of the UMass graduate students. This is what happened:

All the way back in the 1970s, we had learned that you can’t accept miracle claims about public schools at face value. Because we had learned that decades before, we didn’t automatically accept that upbeat story about Maury Elementary.

Like the UMass graduate students, we decided to check the data. As you will of course understand, no one else in the whole country did.

When we checked the full set of public data for Maury, the contradictions were quickly apparent. (These data were right there on-line. Anyone could have checked them.) We soon found the absurd protocols under which this whole mess had occurred.

To his credit, the chairman of the Virginia state board acknowledged that a mess had occurred. He said he hadn’t understood the problem with those procedures, and we believe him. He was a lawyer providing citizen review, not an education specialist.

We believed the chairman of the board when he said he hadn’t known. But there is no way that education professionals in the state’s department wouldn’t have understood.

In fact, one of our leading “educational experts” sat on the Virginia state board at that time. This gong-show occurred right under his nose. After we revealed this ridiculous gong-show, he never discussed or acknowledged the statewide scam on his highly lauded education blog.

Our system works like that! Careerist hustlers bury the scams. They maintain official stories, proving true to their various guilds.

As far as we know, none of your nation’s education professors ever got around to discussing this remarkable statewide scam. Our “educational experts” were silent. So was the Washington Post.

The state of Virginia is part of the Washington Post’s local beat. But so what? The Post never reported that this remarkable statewide scam had occurred. Virginia readers were never informed about the scam their state had conducted. Parents were never told that their schools’ passing rates might be inflated. State officials were never asked to explain how this scam had occurred.

In that case, we ourselves were cast in the role of the UMass graduate students. It was left to us to check the facts because the education professors, the education writers and the educational experts all agreed not to.

But then, what else is new?

In the more recent case involving Reinhart and Rogoff, it was left to three graduate students to check the facts and uncover the scam. Your economics professors all took a pass. The professoriate failed you again.

In fairness to the nation’s professors, many of them have a good excuse for their failure to serve. They couldn’t check Reinhart and Rogoff—they were in the south of France! But all your clammy liberal bloggers have let the obvious question pass, as they so typically do in such cases:

Why was it left to three graduate students to fact-check that very important paper—“the most influential economic analysis of recent years”? Maybe there’s an answer to that. So far, we haven’t seen the question asked.

Why was it left to three graduate students? Here’s our provisional answer: At the top of our intellectual pig-piles, this latest failure is the norm. In large part, our intellectual elites stopped functioning a long time ago!

At the top of our journalistic and academic pig-piles, our society no longer works.

Tomorrow: What Ezra said

Friday: The famous professors clear their throats in our greatest (and dumbest) newspaper

9 comments:

The most obvious excuse for the failure of professors to fact-check Rogoff and Reinhart is that they circumvented the peer-review process and did not make their data (i.e., the Excel spreadsheet) public. It's not a very good excuse, because such actions should have raised more protest from the academic community than they did, especially given the paper's importance.

That said, some economists (Dean Baker, Paul Krugman, et al.) did criticize R&R's paper but were either ignored or ridiculed in the press.

"It's not a very good excuse because such actions should have raised more protest from the academic community than it did"

Exactly. It strikes me that if R-R didn't have to publish their data, they could have skewed it to say anything they wanted it to. And as long as R-R told establishment elites what they wanted to hear, their paper would have been celebrated. Is this really the way academe works, whereby theories are supported by secret evidence?

On Reinhart & Rogoff, part of the problem of calling them on their shit was the fact that they refused to make their data/Excel spreadsheets available for review. For whatever reason (maybe comfort that nothing would come of handing the data to a lowly grad student), they decided to turn them over to that guy, who ended up savaging them. Also, it was one grad student doing a class project who, upon finding his "shocking results," immediately found two partners in full-blown professors.

Also, both Krugman and Dean Baker were vocally skeptical of the paper's results well before the grad student busted it up, and Dean Baker (who should be getting way more credit right now) was extremely vocal about the fact that Reinhart and Rogoff needed to release their data for review.

I agree that the paper was taken way too seriously and that shows a failing of both the "academy" and of journalism; however, the story you paint isn't exactly true. There were plenty of reasons to be skeptical of that paper way before it was officially declared a scam in the court of public opinion (thanks to the work of that grad student).

>>>>>Also, it was one grad student doing a class project, who, upon finding his "shocking results" immediately found two partners in full blown professors. <<<<<

Right, this post got a little carried away with the "three graduate students" claim. A graduate student was the lead researcher, however:

>>>>>AMHERST, Mass. —Thomas Herndon, the doctoral student whose paper just upended the global austerity debate, should be finishing his economics coursework and focusing on his thesis. Instead, he is wrapping up an interview with the BBC and fielding calls from the Colbert Report.

Last week, Mr. Herndon and two of his professors at the University of Massachusetts in Amherst published research challenging the findings of Harvard University’s Carmen Reinhart and Kenneth Rogoff that nations with a great deal of public debt inevitably face slow growth or economic contraction....

While doing research last fall for an econometrics term paper, Mr. Herndon found a spreadsheet error in the Reinhart-Rogoff work that led to different figures. He and his two professors, Michael Ash and Robert Pollin, honed the findings into the paper...<<<<<

Somerby's work on that Virginia test is among his best. A school where about three quarters of its 3rd graders flunked the test was given a rating of 90-odd percent. And was thus featured in the WaPo as a great success.

The school officials had to have known that the 90-odd percent figure was bogus. They may have taken some pride that a bunch of kids who had flunked one year had progressed to passing the next. But letting the media believe---and trumpet---the 90-odd percent figure was just asking for trouble.

Except that trouble didn't happen. Somerby figured it out. The WaPo didn't print a retraction. Many of the WaPo's daily readers were sending their children to VA public schools, and reading about their local school's scores in the paper. But the WaPo didn't start explaining that the numbers were bogus. There was no outrage.

Reportedly VA did change the way it computes these numbers. Somerby deserves a big thank you.