Research

Current Projects:

Methodology of science and expertise

Papers: Expertise and Institutional Design in Economics (manuscript); The Role of Experts in the Methodology of Economics (submitted); Epistemology of expertise – a view from the trenches (submitted).

Description: Pluralist stances in the methodology of science recognize subjective expert judgment as one among many sources of evidence (e.g. models, statistical information, history, etc.). As a source of evidence, subjective expert judgment requires a dedicated methodology, e.g. for assessing the boundaries of relevant expertise and for choosing methods to elicit and aggregate expert judgments. This project examines the use of subjective expert judgment in economics, with particular attention to economic committees. I evaluate a number of “principles of expertise” and analyze the case study of the Monetary Policy Committee. This project formed the basis of a DFG-funded workshop I recently organized: Experts and Consensus in Economics and the Social Sciences.

Some research questions:

To what extent can (and should) subjective expert judgment be applied to problems in the social sciences (e.g. in risk analysis, forecasting, problem exploration), rather than other methods such as analytical modeling, actuarial rules, etc.?

Can we detect the relevant biases in subjective expert judgment and find methods to “de-bias” experts?

Can we identify which parts of a given scientific consensus (e.g. anthropogenic global warming, monetary policies in a given country, etc.) are based on subjective judgment and which are based on other sources of evidence? In other words, can we “meter” the subjective component of a given scientific consensus?
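The aggregation step mentioned in the project description can be made concrete with a linear opinion pool, one standard method from the elicitation literature for combining expert probability judgments. This is only an illustrative sketch; the weights and probabilities below are hypothetical, not drawn from any actual committee.

```python
def linear_opinion_pool(probabilities, weights=None):
    """Aggregate expert probability judgments as a weighted average.

    With no weights given, every expert counts equally.
    """
    if weights is None:
        weights = [1.0 / len(probabilities)] * len(probabilities)
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * p for w, p in zip(weights, probabilities))

# Three hypothetical committee members' probabilities for some event,
# pooled with equal weights:
pooled = linear_opinion_pool([0.6, 0.7, 0.5])
```

Unequal weights can encode a judgment about relative expertise, e.g. `linear_opinion_pool([0.6, 0.7], [0.75, 0.25])` trusts the first expert three times as much as the second; how such weights should be set is itself one of the methodological questions at stake.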

Cognitive biases

“While every one well knows himself to be fallible, few think it necessary to take any precautions against their own fallibility, or admit the supposition that any opinion of which they feel very certain, may be one of the examples of the error to which they acknowledge themselves to be liable.” J. S. Mill

Papers: Local and global confidence reports, an analysis of the overconfidence effect (in preparation).

Description: Biases are a well-known source of inaccuracy in science and of misguided policy making in society. While plenty of research has been devoted to detecting biases, much about them remains to be understood. In this project we focus on overconfidence. There is considerable debate about when and how overconfidence creeps into an expert’s judgment: some results show strong overconfidence effects, while others show good calibration instead. In a recent experiment, we detect two very different types of overconfidence, local and global, and test their relation to each other and to accuracy scores.

Some research questions:

Is overconfidence a single phenomenon, or are there different kinds of overconfidence?

Can we detect the effects of different kinds of overconfidence?

Can we devise a strategy to “de-bias” experts and reduce the overconfidence effect?
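One simple way to make the overconfidence effect operational: compare an expert’s average stated confidence on a set of binary questions to their actual hit rate. The sketch below uses made-up numbers and is not the design of the experiment mentioned above, just the standard calibration idea.

```python
def overconfidence(confidences, correct):
    """Mean stated confidence minus observed accuracy.

    Positive values indicate overconfidence; negative values,
    underconfidence; zero, perfect calibration on average.
    """
    assert len(confidences) == len(correct)
    mean_confidence = sum(confidences) / len(confidences)
    hit_rate = sum(correct) / len(correct)
    return mean_confidence - hit_rate

# Hypothetical data: confidence stated for each answer (0..1),
# and whether the answer turned out to be correct (1) or not (0).
confs = [0.9, 0.8, 0.95, 0.7, 0.85]
right = [1, 0, 1, 1, 0]
bias = overconfidence(confs, right)  # positive: overconfident
```

A single aggregate score like this conflates the “local” (per-item) and “global” (whole-set) confidence reports the project distinguishes, which is precisely why separating the two types matters.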

Epistemic disagreement

Description: What should rational epistemic agents do in a situation of peer disagreement? The literature on epistemic disagreement has provided several arguments supporting the claim that a conciliatory approach is the answer. In other words, when disagreeing with an epistemic peer, a rational person should change her beliefs and move her opinions closer to the other agent’s opinion. Stronger still is the view that it is impossible for two rational epistemic peers to disagree after having exchanged their evidence on the subject of disagreement. Both weak and strong conciliatory stances assume, implicitly or explicitly, that disagreement constitutes evidence on which to update one’s beliefs. The core of this research project is to understand whether updating in the light of disagreement is rational at all, and if so, which method for updating can best be defended.

Some research questions:

Are conciliatory positions rational?

Which forms of updating on disagreement are rational, and which are not?

Is disagreement with someone on a given subject evidence for doubting your own beliefs on that subject?
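The conciliatory update discussed above can be stated as a one-line rule: on learning that a peer holds credence q where you hold p, move toward q, with the “equal weight” view taking the midpoint. The weighted form below is a common generalization; both the rule and the numbers are illustrative assumptions, not a claim about which update (if any) is rational.

```python
def conciliate(own, peer, peer_weight=0.5):
    """Update one's credence toward a peer's credence.

    peer_weight=0.5 is the equal-weight view (split the difference);
    peer_weight=0.0 is the steadfast view (no update at all).
    """
    return (1 - peer_weight) * own + peer_weight * peer

# Equal weight: credences 0.8 and 0.4 meet at the midpoint.
updated = conciliate(0.8, 0.4)
```

Framing the options as a single `peer_weight` parameter makes the project’s question sharp: is there a principled, non-arbitrary value for that weight, or is the very idea of updating on disagreement mistaken?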

Economic Methodology

Papers: Physics envy, but for the right reasons (draft).

Description: The classic demarcation problem tries to establish what counts as science and what does not. Starting from analyses of the demarcation problem, it is sometimes argued that economics is not a science. I find it more useful to set aside the demarcation question as posed above and instead to understand the sciences comparatively. We can investigate what is characteristic of economic systems in comparison to, for instance, physical or biological ones. We can investigate what is peculiar about the method of economics, for example when it comes to modeling or the measurement of economic quantities. In this project I try to clarify some of these issues in order to argue that we should still do economics regardless of whether it counts as a science according to some demarcation criterion, and to show how best to approach a methodology of economics in its practical implications.

Some research questions:

What kind of discipline is economics: a science, or something else?

How does economics relate to other sciences, for example the natural sciences?

Should we still do economics, regardless of whether it is a science or not, and how?