
Abstract

Fisher is the single most important figure in 20th century
statistics. This talk examines his influence on modern statistical thinking,
trying to predict how Fisherian we can expect the 21st century to be.
Fisher's philosophy is characterized as a series of shrewd compromises
between the Bayesian and frequentist viewpoints, augmented by some unique
characteristics that are particularly useful in applied problems. Several
current research topics are examined with an eye toward Fisherian influence, or
the lack of it, and what this portends for future statistical developments.
Based on the 1996 Fisher lecture, the article closely follows the text of that
talk.

BOX, J. F. 1978. R. A. Fisher: The Life of a Scientist. Wiley, New York. This is both a personal and an intellectual biography by Fisher's daughter, a scientist in her own right and also a historian of science, containing some unforgettable vignettes of precocious mathematical genius mixed with a difficulty in ordinary human interaction. The sparrow quote in Section 4 is put in context on page 130.

FISHER, R. A. 1925. Theory of statistical estimation. Proc. Cambridge Philos. Soc. 22 700–725. Reprinted in the Mather collection, and also in the 1950 Wiley Fisher collection Contributions to Mathematical Statistics. This is my choice for the most important single paper in statistical theory. A competitor might be Fisher's 1922 Philosophical Society paper, but as Fisher himself points out in the Wiley collection, the 1925 paper is more compact and businesslike than was possible in 1922, and more sophisticated as well.

EFRON, B. 1995. The statistical century. Royal Statistical Society News 22(5) 1–2. This is mostly about the postwar boom in statistical methodology and uses a different statistical triangle than Figure 8.

FISHER, R. A. 1934. Two new properties of mathematical likelihood. Proc. Roy. Soc. Ser. A 144 285–307. Concerns two situations when fully efficient estimation is possible in finite samples: one-parameter exponential families, where the MLE is a sufficient statistic, and location-scale families, where there are exhaustive ancillary statistics. Reprinted in the Mather and the Wiley collections. Section 4.

EFRON, B. 1978. Controversies in the foundations of statistics. Amer. Math. Monthly 85 231–246. The Bayes-frequentist-Fisherian argument in terms of what kinds of averages the statistician should take. Includes Fisher's famous circle example of ancillarity.

EFRON, B. 1982. Maximum likelihood and decision theory. Ann. Statist. 10 340–356. Examines five questions concerning maximum likelihood estimation: What kind of theory is it? How is it used in practice? How does it look from a frequentist decision-theory point of view? What are its principal virtues and defects? What improvements have been suggested by decision theory?

CIFARELLI, D. and REGAZZINI, E. 1996. De Finetti's contribution to probability and statistics. Statist. Sci. 11 253–282. The second half of the quote in my Section 4, their Section 3.2.2, goes on to criticize the Neyman-Pearson school. De Finetti is less kind to Fisher in the discussion following Savage's 1976 article.

REID, N. 1995. The roles of conditioning in inference. Statist. Sci. 10 138–157. This is a survey of the p* formula, what I called the magic formula following Ghosh's terminology, and many other topics in conditional inference; see also the discussion following the companion article on pages 173–199, in particular McCullagh's commentary. Gives an extensive bibliography.

EFRON, B. 1993. Bayes and likelihood calculations from confidence intervals. Biometrika 80 3–26. Shows how approximate confidence intervals can be used to get good approximate confidence densities, even in complicated problems with a great many nuisance parameters.

λ, so that density and likelihood have the form f_{θ,λ}(x) and L_x(θ, λ); the amplified formulas would then appear as

, for example. The confidence density approach gives sensible results in such situations. My 1993 Biometrika paper argues, à la Kass, that this is a way of using frequentist methods to aid Bayesian calculations.

4. Professor Hinkley notes the continued vitality of Tukey-style data analysis. In its purest form this line of work is statistics without probability theory (see, e.g., Mosteller and Tukey's 1977 book ``Data Analysis and Regression''), and as such I could not place it anywhere in the statistical triangle of Section 11. This is my picture's fault of course, not Tukey's. Problem-driven areas like neural networks often begin with a healthy burst of pure data analysis before settling down to an accommodation with statistical theory.

5. I am grateful to Professor Fraser for presenting a more intelligible version of the magic formula. This was the spot in the talk where ``avoiding technicalities'' almost avoided coherency. ``Trick'' is a positive word in my vocabulary, reflecting a Caltech education, and I only wish I could think of some more Fisher-level tricks. Fisherian statistics was