A new approach to analyze scientific methods as patterns of state transitions is proposed and exemplified by the two most important, general methods: induction and deduction. Though only `local' states of science are considered in this paper, including hypotheses, data, approximation and degree of fit, the approach can easily be extended to more comprehensive kinds of states. Two `pure' forms of induction are distinguished, enumerative and hypothesis construction induction. A combination of these two forms is proposed to yield a more adequate picture of induction. While the pure forms of induction are clearly distinct from the deductive pattern, the pattern of the combined form of induction is very similar to the latter. The present account of scientific methods not only points out the differences between different methods but – in contrast to usual discussions of methodology – also clarifies what they have in common.
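
To make the state-transition picture concrete, here is a hypothetical toy rendering in Python. This is my illustration, not the paper's own formalism: a `local' state bundles data, a hypothesis, and a degree of fit, and enumerative induction is modelled as one transition that adds a generalising hypothesis to a data-only state.

```python
# Hypothetical toy rendering, not the paper's formalism: a `local' state
# of science and enumerative induction as a single state transition.
from dataclasses import dataclass, replace
from typing import Optional, Tuple

@dataclass(frozen=True)
class LocalState:
    data: Tuple[str, ...]                  # observed instances
    hypothesis: Optional[str] = None       # current generalisation, if any
    degree_of_fit: Optional[float] = None  # fraction of data the hypothesis covers

def enumerative_induction(state: LocalState, predicate: str) -> LocalState:
    # Transition: generalise from instances sharing `predicate` to a hypothesis.
    fit = sum(predicate in d for d in state.data) / len(state.data)
    return replace(state, hypothesis=f"All As are {predicate}", degree_of_fit=fit)

s0 = LocalState(data=("raven 1 is black", "raven 2 is black", "raven 3 is black"))
s1 = enumerative_induction(s0, "black")
print(s1.hypothesis, s1.degree_of_fit)  # All As are black 1.0
```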

Chaos-related obstructions to predictability have been used to challenge accounts of theory validation based on the agreement between theoretical predictions and experimental data. These challenges are incomplete in two respects: they do not show that chaotic regimes are unpredictable in principle and, as a result, that there is something conceptually wrong with idealized expectations of correct predictions from acceptable theories, and they do not explore whether chaos-induced predictive failures of deterministic models can be remedied by stochastic modeling. In this paper we appeal to an asymptotic analysis of state space trajectories and their numerical approximations to show that chaotic regimes are deterministically unpredictable even with unbounded resources. Additionally, we explain why stochastic models of chaotic systems, while predictively successful in some cases, are in general predictively as limited as deterministic ones. We conclude by suggesting that the way in which scientists deal with such principled obstructions to predictability calls for a more comprehensive approach to theory validation, on which experimental testing is augmented by a multifaceted mathematical analysis of theoretical models, capable of identifying chaos-related predictive failures as due to principled limitations which the world itself imposes on any less-than-omniscient epistemic access to some natural systems.
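
The kind of principled predictive failure at issue can be shown with a minimal sketch (mine, not the paper's asymptotic analysis): in the chaotic logistic map, two trajectories whose initial conditions differ by less than any realistic measurement precision diverge exponentially, so finite-precision prediction degrades no matter how much computing power is available.

```python
# Illustrative sketch (not from the paper): sensitive dependence on
# initial conditions in the logistic map x_{n+1} = r * x_n * (1 - x_n).
# A tiny initial measurement error is amplified until the forecast
# carries no information about the true trajectory.

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

true_state = 0.4
measured   = 0.4 + 1e-10  # error far below any realistic instrument precision

truth = logistic_trajectory(true_state)
model = logistic_trajectory(measured)

for n in (0, 10, 20, 30, 40, 50):
    print(f"n={n:2d}  error={abs(truth[n] - model[n]):.3e}")
# For r = 4 the error grows roughly like 2**n (Lyapunov exponent ln 2),
# so after ~35 steps the two trajectories are effectively uncorrelated.
```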

Many scientists believe that there is a uniform, interdisciplinary method for the practice of good science. The paradigmatic examples, however, are drawn from classical experimental science. Insofar as historical hypotheses cannot be tested in controlled laboratory settings, historical research is sometimes said to be inferior to experimental research. Using examples from diverse historical disciplines, this paper demonstrates that such claims are misguided. First, the reputed superiority of experimental research is based upon accounts of scientific methodology (Baconian inductivism or falsificationism) that are deeply flawed, both logically and as accounts of the actual practices of scientists. Second, although there are fundamental differences in methodology between experimental scientists and historical scientists, they are keyed to a pervasive feature of nature, a time asymmetry of causation. As a consequence, the claim that historical science is methodologically inferior to experimental science cannot be sustained.

We present a model of the distribution of labour in science. Such models tend to rely on the mechanism of the invisible hand (e.g. Hull 1988, Goldman & Shaked 1991 and Kitcher 1990). Our analysis starts from the necessity of standards in distributed processes and the possibility of multiple standards in science. Invisible hand models turn out to have only limited scope because they are restricted to describing the atypical single-standard case. Our model is a generalisation of these models to J standards; single-standard models such as Kitcher (1990) are a limiting case. We introduce and formalise this model, demonstrate its dynamics and conclude that the conclusions commonly derived from invisible hand models about the distribution of labour in science are not robust against changes in the number of standards.
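
For orientation, here is a minimal, hypothetical sketch of the single-standard, invisible-hand setup that such models generalise. The payoff curves are invented for illustration and are not the authors' formalisation: under one shared standard of success, the community-optimal division of labour maximises the probability that at least one research programme succeeds.

```python
# Hypothetical sketch in the spirit of Kitcher-style single-standard models
# (not the authors' J-standards generalisation): N scientists split between
# two research programmes; each programme's success probability grows with
# its workforce, with diminishing returns.

def success_prob(workers, ceiling, halfway):
    # Saturating curve toward `ceiling`; `halfway` workers yield ceiling/2.
    return ceiling * workers / (workers + halfway)

def community_return(n1, n_total, c1=0.8, c2=0.6, k=5):
    n2 = n_total - n1
    p1 = success_prob(n1, c1, k)
    p2 = success_prob(n2, c2, k)
    return 1.0 - (1.0 - p1) * (1.0 - p2)  # P(at least one programme succeeds)

N = 20
best = max(range(N + 1), key=lambda n1: community_return(n1, N))
print(f"optimal split: {best} vs {N - best} scientists")
# With a single standard there is one such optimum; the paper's point is
# that with J standards this tidy single-optimum picture breaks down.
```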

This classic work in the philosophy of physical science is an incisive and readable account of the scientific method. Pierre Duhem was one of the great figures in French science, a devoted teacher, and a distinguished scholar of the history and philosophy of science. This book represents his most mature thought on a wide range of topics.

There is currently no viable alternative to the Bayesian analysis of scientific inference, yet the available versions of Bayesianism fail to do justice to several aspects of the testing and confirmation of scientific hypotheses. Bayes or Bust? provides the first balanced treatment of the complex set of issues involved in this nagging conundrum in the philosophy of science. Both Bayesians and anti-Bayesians will find a wealth of new insights on topics ranging from Bayes's original paper to contemporary formal learning theory. In a paper published posthumously in 1763, the Reverend Thomas Bayes made a seminal contribution to the understanding of "analogical or inductive reasoning." Building on his insights, modern Bayesians have developed an account of scientific inference that has attracted numerous champions as well as numerous detractors. Earman argues that Bayesianism provides the best hope for a comprehensive and unified account of scientific inference, yet the presently available versions of Bayesianism fail to do justice to several aspects of the testing and confirming of scientific theories and hypotheses. By focusing on the need for a resolution to this impasse, Earman sharpens the issues on which a resolution turns. John Earman is Professor of History and Philosophy of Science at the University of Pittsburgh.
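
The engine of the Bayesian account Earman assesses is Bayes's theorem itself; a minimal sketch with hypothetical numbers shows how repeated successful predictions raise a hypothesis's probability.

```python
# Minimal illustration (hypothetical numbers) of Bayesian confirmation:
# P(H | E) = P(E | H) * P(H) / P(E), with P(E) by the law of total probability.

def posterior(prior_h, likelihood_h, likelihood_not_h):
    evidence = likelihood_h * prior_h + likelihood_not_h * (1.0 - prior_h)
    return likelihood_h * prior_h / evidence

# A hypothesis starts with a modest prior; it predicts the data strongly
# (P(E|H) = 0.9) while its rivals make the data unlikely (P(E|~H) = 0.3).
p = 0.2
for trial in range(1, 4):
    p = posterior(p, likelihood_h=0.9, likelihood_not_h=0.3)
    print(f"after observation {trial}: P(H | E) = {p:.3f}")
# Output climbs 0.429 -> 0.692 -> 0.871: each successful prediction
# confirms H, the core dynamic of Bayesian theory testing.
```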

The results, conclusions and claims of science are often taken to be reliable because they arise from the use of a distinctive method. Yet today, there is widespread skepticism as to whether we can validly talk of method in modern science. This outstanding survey explains how this controversy has developed since the 17th century, and explores its philosophical basis.

What is it to be scientific? Is there such a thing as scientific method? And if so, how might such methods be justified? Robert Nola and Howard Sankey seek to provide answers to these fundamental questions in their exploration of the major recent theories of scientific method. Although for many scientists their understanding of method is something they just “pick up” in the course of being trained, Nola and Sankey argue that it is possible to be explicit about what this tacit understanding of method is, rather than leave it as some unfathomable mystery. They robustly defend the idea that there is such a thing as scientific method and show how this might be legitimated. The book begins with the question of what methodology might mean and explores the notions of values, rules and principles, before investigating how methodologists have sought to show that our scientific methods are rational. Part 2 of the book sets out some principles of inductive method and examines its alternatives including abduction, IBE, and hypothetico-deductivism. Part 3 introduces probabilistic modes of reasoning, particularly Bayesianism in its various guises, and shows how it is able to give an account of many of the values and rules of method. Part 4 considers the ideas of philosophers who have proposed distinctive theories of method such as Popper, Lakatos, Kuhn and Feyerabend, and Part 5 continues this theme by considering philosophers who have proposed “naturalised” theories of method such as Quine, Laudan and Rescher. The book offers readers a comprehensive introduction to the idea of scientific method and a wide-ranging discussion of how historians of science, philosophers of science and scientists have grappled with the question over the last fifty years.

Some think that issues to do with scientific method are last century's stale debate; Popper was an advocate of methodology, but Kuhn, Feyerabend, and others are alleged to have brought the debate about its status to an end. The papers in this volume show that issues in methodology are still very much alive. Some of the papers reinvestigate issues in the debate over methodology, while others set out new ways in which the debate has developed in the last decade. The book will be of interest to philosophers and scientists alike in the reassessment it provides of earlier debates about method and current directions of research.

" Vivid . . . immense clarity . . . the product of a brilliant and extremely forceful intellect." — Journal of the Royal Naval Scientific Service "Still a sheer joy to read." — Mathematical Gazette "Should be read by any student, teacher or researcher in mathematics." — Mathematics Teacher The originator of algebraic topology and of the theory of analytic functions of several complex variables, Henri Poincare (1854–1912) excelled at explaining the complexities of scientific and mathematical ideas to lay (...) readers. Science and Method, written in 1908, has been appreciated by a wide audience of nonprofessionals and translated into many languages. It defines the basic methodology and psychology of scientific discovery, particularly in regard to mathematics and mathematical physics. Drawing on examples from many fields, it explains how scientists analyze and choose their working facts, and it explores the nature of experimentation, theory, and the mind. 1914 edition. Translated by Francis Maitland. (shrink)

In a career spanning sixty years, Sir Karl Popper has made some of the most important contributions to the twentieth century discussion of science and rationality. The Myth of the Framework is a new collection of some of Popper's most important material on this subject. Sir Karl discusses such issues as the aims of science, the role that it plays in our civilization, the moral responsibility of the scientist, the structure of history, and the perennial choice between reason and revolution. In doing so, he attacks intellectual fashions (like positivism) that exaggerate what science and rationality have done, as well as intellectual fashions (like relativism) that denigrate what science and rationality can do. Scientific knowledge, according to Popper, is one of the most rational and creative of human achievements, but it is also inherently fallible and subject to revision. In place of intellectual fashions, Popper offers his own critical rationalism - a view that he regards both as a theory of knowledge and as an attitude towards human life, human morals and democracy. Published in cooperation with the Central European University.

Described by the philosopher A.J. Ayer as a work of 'great originality and power', this book revolutionized contemporary thinking on science and knowledge. Ideas such as the now legendary doctrine of 'falsificationism' electrified the scientific community, influencing even working scientists, as well as post-war philosophy. This astonishing work ranks alongside The Open Society and Its Enemies as one of Popper's most enduring books and contains insights and arguments that demand to be read to this day.

By reference to Maxwell's kinetic theory, one feature of hypothetico-deductivism is defended. A scientist need make no inference to a hypothesis when he first proposes it. He may have no reason at all for thinking it is true. Yet it may be worth considering. In developing his kinetic theory there were central assumptions Maxwell made (for example, that molecules are spherical, that they exert contact forces, and that their motion is linear) that he had no reason to believe true. In this paper I develop a position that explains why they were worth considering, and that rejects the retroductive position that a hypothesis is worth considering when, if true, it would explain the observed data.

The two principal models of design in methodological circles in architecture—analysis/synthesis and conjecture/analysis—have their roots in philosophy of science, in different conceptions of scientific method. This paper explores the philosophical origins of these models and the reasons for rejecting analysis/synthesis in favour of conjecture/analysis, the latter being derived from Karl Popper’s view of scientific method. I discuss a fundamental problem with Popper’s view, however, and indicate a framework for conjecture/analysis to avoid this problem.

Knowledge of residual perturbations in Uranus's orbit led to Neptune's discovery in 1846 rather than the refutation of Newton's law of gravitation. Karl Popper asserts that this case is untypical of science and that the law was at least prima facie falsified. I argue that these assertions are the product of a false, a priori methodological position, 'Weak Popperian Falsificationism' (WPF), and that on the evidence the law was not, and was not considered, prima facie false. Many of Popper's commentators presuppose WPF and their views on this case and its implications for scientific rationality and method are similarly unwarranted or defective.

The first part of this paper reveals a conflict between the core principles of deterministic causation and the standard method of difference, which is widely seen (and used) as a correct method of causally analyzing deterministic structures. We show that applying the method of difference to deterministic structures can give rise to causal inferences that contradict the principles of deterministic causation. The second part then locates the source of this conflict in an inference rule implemented in the method of difference according to which factors that can make a difference to investigated effects relative to one particular test setup are to be identified as causes, provided the causal background of the corresponding setup is homogeneous. The paper ends by modifying the method of difference in a way that renders it compatible with the principles of deterministic causation.
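
A minimal sketch (my illustration, not the authors' formal apparatus) of the inference rule at issue: a factor is identified as a cause when two test situations agree on every other factor (a homogeneous background) but differ on that factor and on the effect.

```python
# Illustrative sketch (not the authors' formalism): a Mill-style method of
# difference. A factor is flagged as a cause of the effect when two test
# situations share a homogeneous background but differ on the factor and
# on the effect.

def difference_test(situation_a, situation_b, factor, effect):
    """Each situation is a dict mapping factor names to booleans."""
    background_same = all(
        situation_a[f] == situation_b[f]
        for f in situation_a
        if f not in (factor, effect)
    )
    factor_differs = situation_a[factor] != situation_b[factor]
    effect_differs = situation_a[effect] != situation_b[effect]
    return background_same and factor_differs and effect_differs

run_a = {"A": True,  "B": True, "E": True}
run_b = {"A": False, "B": True, "E": False}
print(difference_test(run_a, run_b, factor="A", effect="E"))  # True: A flagged as cause
```

The paper's point is that, applied to certain deterministic structures, this rule licenses causal inferences that conflict with the principles of deterministic causation; the sketch encodes only the rule itself, not the authors' proposed modification.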

This book, an exploration of the work of Leibnitz, is Ortega’s most systematic contribution to philosophy. Ortega begins with a detailed definition of a principle and with an examination of the specific principles formulated by Leibnitz. He goes on to examine Leibnitz’s complex and mercurial attitudes towards principles and discusses the effects of these attitudes on his philosophy and on his contributions to mathematics and logic.

This paper investigates whether there is a discrepancy between the stated and actual aims in biomechanical research, particularly with respect to hypothesis testing. We present an analysis of one hundred papers recently published in The Journal of Experimental Biology and Journal of Biomechanics, and examine the prevalence of papers which (a) have hypothesis testing as a stated aim, (b) contain hypothesis testing claims that appear to be purely presentational (i.e. which seem not to have influenced the actual study), and (c) have exploration as a stated aim. We found that whereas no papers had exploration as a stated aim, 58% of papers had hypothesis testing as a stated aim. At a bare minimum, we had strong suspicions that presentational hypotheses were present in 31% of the papers in this latter group.

Throughout more than two millennia philosophers overwhelmingly adhered to ideal standards of scientific rationality going back ultimately to Aristotle’s Analytica posteriora. These standards were progressively shaped by and adapted to new scientific needs and tendencies. Nevertheless, a core of conditions capturing the fundamentals of what a proper science should look like remained remarkably constant all along. Call this cluster of conditions the Classical Model of Science. In this paper we will do two things. First of all, we will propose a general and systematized account of the Classical Model of Science. Secondly, we will offer an analysis of the philosophical significance of this model at different historical junctures by giving an overview of the connections it has had with a number of important topics. The latter include the analytic-synthetic distinction, the axiomatic method, the hierarchical order of sciences and the status of logic as a science. Our claim is that particularly fruitful insights are gained by seeing themes such as these against the background of the Classical Model of Science. In an appendix we deal with the historiographical background of this model by considering the systematizations of Aristotle’s theory of science offered by Heinrich Scholz, and in his footsteps by Evert W. Beth.

The concordance of results that are “robust” across multiple scientific modalities is widely considered to play a critical role in the epistemology of science. But what should we make of those cases where such multimodal evidence is discordant? Jacob Stegenga has recently argued that robustness is “worse than useless” in these cases, suggesting that “different kinds of evidence cannot be combined in a coherent way.” In this article I respond to this critique and illustrate the critical methodological role that robustness plays as an aim of scientific inquiry.

I believe that the long-neglected ideas on science and scientific method of Charles Sanders Peirce and Josiah Royce can illuminate some of the current attacks on science that have surfaced: misconduct and fraud in science and anti-scientism or the "new cynicism." In addition, Royce and Peirce offer insights relevant to the ferment in contemporary philosophy of science around the various forms of pluralism advocated by a number of philosophers (see Kellert, Longino, and Waters). "Pluralism" is the view that "plurality in science possibly represents an ineliminable character of scientific inquiry and knowledge (about at least some phenomena) . . . and that analysis of metascientific concepts (like theory . . .

Humans are closely coupled with their environments. They rely on being `embedded' to help coordinate the use of their internal cognitive resources with external tools and resources. Consequently, everyday cognition, even cognition in the absence of others, may be viewed as partially distributed. As cognitive scientists our job is to discover and explain the principles governing this distribution: principles of coordination, externalization, and interaction. As designers our job is to use these principles, especially if they can be converted to metrics, in order to invent and evaluate candidate designs. After discussing a few principles of interaction and embedding I discuss the usefulness of a range of metrics derived from economics, computational complexity, and psychology.

The aim of this article is to discuss the nature of disagreement in scientific ontologies in the light of case studies from biology and cognitive science. I argue that disagreements in scientific ontologies are usually not about purely factual issues but involve both verbal and normative aspects. Furthermore, I try to show that this partly non-factual character of disagreement in scientific ontologies does not lead to a radical deflationism but is compatible with a “normative ontological realism.” Finally, I argue that the case studies from the empirical sciences challenge contemporary metaontological accounts that insist on exactly one true way of “carving nature at its joints.”