sorry, you can’t replicate economics either

Is Economics Research Replicable? Sixty Published Papers from Thirteen Journals Say “Usually Not”, by Andrew C. Chang and Phillip Li, Finance and Economics Discussion Series 2015-083. Washington: Board of Governors of the Federal Reserve System.

Abstract: We attempt to replicate 67 papers published in 13 well-regarded economics journals using author-provided replication files that include both data and code. Some journals in our sample require data and code replication files, and other journals do not require such files. Aside from 6 papers that use confidential data, we obtain data and code replication files for 29 of 35 papers (83%) that are required to provide such files as a condition of publication, compared to 11 of 26 papers (42%) that are not required to provide data and code replication files. We successfully replicate the key qualitative result of 22 of 67 papers (33%) without contacting the authors. Excluding the 6 papers that use confidential data and the 2 papers that use software we do not possess, we replicate 29 of 59 papers (49%) with assistance from the authors. Because we are able to replicate less than half of the papers in our sample even with help from the authors, we assert that economics research is usually not replicable. We conclude with recommendations on improving replication of economics research.

I’m just a layman, but this reminds me of Kenneth Gergen’s Toward Transformation in Social Knowledge. I’m wondering if there’s a more up-to-date version of the history of this phenomenon. Now, I’m sure not all the failures in replication were due to the contingencies of a socially constructed reality (I mean this to survive Christian Smith’s criticisms of social construction as omnicompetent explanatory paradigm), but perhaps some are?

During a decade as head of global cancer research at Amgen, C. Glenn Begley identified 53 “landmark” publications — papers in top journals, from reputable labs — for his team to reproduce. Begley sought to double-check the findings before trying to build on them for drug development.

Result: 47 of the 53 could not be replicated. He described his findings in a commentary piece published on Wednesday in the journal Nature.

My wife is doing a postdoc in biochemistry and biophysics, and one of the observations we have both made is that even that kind of science is getting more brittle; in one case, the particular water supply in a lab made their experiment succeed, where it failed in other places. Troubleshooting that was fun. So the kinds of contingencies which torpedo human sciences research seem to be popping up in the hard sciences as well.

It should be noted that this is a very weak version of replication: get a similar qualitative result using the same data and code. Next one would ask if the results point in the same direction using an expanded or different observational data set; then you’d want to know how robust the results were to alternative but reasonable specifications. This is just basic metrics. Normally, grad students in econ are assigned to replicate a well-known empirical paper. Success is rare. I personally find the results reported here encouraging. They say that requiring submission of data and code files along with a submitted paper is working to improve research quality.