RICHARD ANDERSON: Replication and the Zen of Home Repair

This summer is the first since my retirement from government that I find myself without academic obligations here or abroad. Instead, I am focused on starting to rehab a tattered house that I recently purchased jointly with one of my children.

Surprising parallels exist between repairing a house and pursuing scientific research, at least for persons with an active imagination. First, it is important to understand the basic structure of the problem: removing the wrong wall might bring down the house. Pursuit of an uninteresting hypothesis might doom many months of research to becoming a permanent resident in your file cabinet. Both are tragedies.

There also is the issue of “what was done before.” Is it important to discern the architect’s original location for a window or a door? Is it important to discern precisely how the investigator in a previous study specified his regression in EViews? Surprisingly, the answer to both is yes. Approximate guesses are not adequate. Cutting through framing that supports a hidden beam can lead to poor results, as can guessing at a previous investigator’s exact specification.

Opening the door on an older house and opening a new academic study are quite similar in a challenging way: neither typically comes with adequate documentation. In a house, you open the door to adventure: no document reveals the modifications and flaws; there is candy and danger for you to discover. Your mind’s vision of the completed project is its advertisement. Similarly, as Bruce McCullough phrases it, a published article is but an advertisement for the underlying research. What data, precisely, were used? If a regression was used, how was it specified and what options (or defaults) were used in its estimation? What statistical package was used? If a hypothesis test was used, precisely how was the test statistic calculated?
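The point about defaults is easy to demonstrate. The following sketch, using entirely made-up numbers, computes the same one-sample t-statistic under two variance conventions that different packages adopt as their silent default (dividing the sum of squares by n − 1 versus by n). A replicator who does not know which convention the original software used cannot reproduce the published number.

```python
# Hypothetical data: two plausible software "defaults" for the variance
# denominator yield different t-statistics for identical data -- exactly
# the ambiguity a replicator faces when a paper omits such details.
import math

data = [2.1, 2.5, 1.9, 2.8, 2.3, 2.6]  # made-up sample
h0_mean = 2.0                          # made-up null hypothesis
n = len(data)
mean = sum(data) / n
ss = sum((x - mean) ** 2 for x in data)

# Convention A: sample variance, denominator n - 1
t_sample = (mean - h0_mean) / math.sqrt((ss / (n - 1)) / n)

# Convention B: population variance, denominator n
t_pop = (mean - h0_mean) / math.sqrt((ss / n) / n)

print(t_sample, t_pop)  # the two "replications" of the same test disagree
```

With only six observations the two statistics differ in the first decimal place; the gap shrinks with sample size but never vanishes, and similar silent defaults (standard-error types, convergence tolerances, missing-value handling) lurk in every package.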

Danger stalks both restored houses and scientific research. An incorrectly modified house can risk human life (or at least the value of the property). An incorrect scientific study risks a poorly designed public policy, or creating a “bandwagon” that leads others in pursuit of flawed results.

Fortunately, the remedy in economic research (both empirical and DSGE-style simulation studies) is easier than in old houses: the profession should expect authors to furnish code and data as part of the output of their research. It is an enduring mystery that professional economists – and the persons who pay their salaries – see no value in such transparency. An old house cannot clearly reveal its history and current flaws; most are sold “as is” for that reason. How much longer will published economic research similarly be sold “as is” to its consumers?