Falsificationism dominated twentieth-century philosophy of science and seemed to have eclipsed all forms of inductivism. Yet recent debates have revived a specific form of eliminative inductivism, the basic ideas of which go back to F. Bacon and J.S. Mill. These modern endorsements of eliminative inductivism claim to show that progressive problem solving is possible using induction, rather than falsification, as a method of justification. But this common ground between falsificationism and eliminative inductivism has not led to a detailed investigation into the relationship, if any, which may exist between these two methodologies. This paper reviews several versions of eliminative inductivism, establishes a natural relation between eliminative inductivism and falsificationism, which derives from the distinction between models and theories, and carries out this investigation against a case study of the construction of atom models. The result of the investigation is that falsificationism is a form of eliminative inductivism in the limit of certain constraints.

Philip Kitcher has developed a sort of inductivist-reliabilist justification for scientific realism. After distinguishing his argument from a well-known abductivist one (the "no-miracles" argument), I will argue that Kitcher's proposal cannot adequately meet the antirealist challenge. First, it begs the question against the antirealists; second, it can hardly support a plausible, piecemeal scientific realism. I will explore an alternative inductivist approach that exploits correlations between theoretical properties and empirical success. On my view, its prospects for avoiding the aforementioned shortcomings are better than those of Kitcher's standpoint. I dare say, however, that an inductivist strategy alone cannot satisfy the demands of scientific realism since, in the end, an abductive move may well be mandatory for grounding it.

Under what circumstances, if any, are we warranted to assert that a theory is true or at least approximately true? Scientific realists answer that such assertions are warranted only for those theories that enjoy explanatory and predictive success. A number of challenges to this answer have emerged, chief among them the argument from pessimistic meta-induction. According to this challenge, the history of science supplies ample evidence against realism in the form of successful theories that are now considered false. The main realist reaction to this challenge questions the legitimacy of the pessimistic meta-inductive inference. Advocates of this approach argue that upon closer scrutiny the historical record can be reconciled with scientific realism. When a successful theory is abandoned, not all of its components are discarded but only those that are inessential or idle for the theory's success. Their abandonment is thus inconsequential for the realist. So long as the essential components survive into the new theory there is no cause for alarm. More precisely, an outdated theory T which enjoyed some measure of success must, according to the realist, be: (i) partially true precisely because some of its theoretical claims are…

One of the first to criticize the verifiability theory of meaning embraced by the logical empiricists, Reichenbach ties the significance of scientific statements to their predictive character, which provides the condition for their testability. While identifying prediction as the task of scientific knowledge, Reichenbach assigns induction a pivotal role, and regards the theory of knowledge as a theory of prediction based on induction. Reichenbach's inductivism is grounded in the frequency notion of probability, of which he puts forward a more flexible version than that of Richard von Mises. Unlike von Mises, Reichenbach attempts to account for single-case probabilities, and entertains a restricted notion of randomness, more suitable for practical purposes. Moreover, Reichenbach developed a theory of induction, absent from von Mises's perspective, and argued for the justification of induction. This article outlines the main traits of Reichenbach's inductivism, with special reference to his book Experience and Prediction.

This article suggests a 'best alternative' justification of induction (in the sense of Reichenbach) which is based on meta-induction. The meta-inductivist applies the principle of induction to all competing prediction methods which are accessible to her. It is demonstrated, and illustrated by computer simulations, that there exist meta-inductivist prediction strategies whose success is approximately optimal among all accessible prediction methods in arbitrary possible worlds, and which dominate the success of every noninductive prediction strategy. The proposed justification of meta-induction is mathematically analytical. It implies, however, an a posteriori justification of object-induction based on the experiences in our world.

Inductivism is understood as the explication of the degree of confirmation as conditional logical probability. Inductivism is not recommendable in the form of Carnap's λ-system, but it is tenable in the form of Bayesianism. Objections directed at it are either irrelevant or can be taken into account within Bayesianism.

I. I set out my view that all inference is essentially deductive and pinpoint what I take to be the major shortcomings of the induction rule.
II. The import of data depends on the probability model of the experiment, a dependence ignored by the induction rule. Inductivists admit that background knowledge must be taken into account but never spell out how this is to be done. As I see it, that is the problem of induction.
III. The induction rule, far from providing a method of discovery, does not even serve to detect pattern. Knowing that there is uniformity in the universe is no help to discovering laws. A critique of Reichenbach's justification of the straight rule is constructed along these lines.
IV. The induction rule, by itself, cannot account for the varying rates at which confidence in a hypothesis mounts with data. The mathematical analysis of this salient feature of inductive reasoning requires prior probabilities. We also argue, against orthodox statisticians, that prior probabilities make a substantive contribution to the objectivity of inductive methods, viz. to the design of experiments and the selection of decision rules.
V. Carnap's general criticisms of various estimation rules, like the straight rule and the 'impervious rule', are seen to be misguided when the prior densities to which they correspond are taken into account.
VI. Analysis of Hempel's definition of confirmation qua formalization of the enumerative (naive) conception of instancehood. We show that, from the standpoint of the quantitative measure P(H/E):P(H) for the degree to which E confirms H, Hempel's classificatory concept yields correct results only for sampling at large from a finite population with a two-way classification all of whose compositions are equally probable. We extend the analysis to Goodman's paradox, finding cases in which grue-like hypotheses do receive as much confirmation as their opposite numbers. We argue, moreover, the irrelevancy of entrenchment, and maintain that Goodman's paradox is no more than a straightforward counter-example to the enumerative conception of instancehood embodied in Hempel's definition.
VII. We rebut the objection that prior probabilities, qua inputs of Bayesian analysis, can only be obtained by enumerative induction (insofar as they are objective). The divergence in the prior densities of two rational agents is less a function of subjectivity, we maintain, than of vagueness.
VIII. Our concluding remarks stress that, for Bayesians, there is no problem of induction in the usual sense.

It is generally accepted that Popper's degree of corroboration, though "inductivist" in a very general and weak sense, is not inductivist in a strong sense, i.e. when by 'inductivism' we mean the thesis that the right measure of evidential support has a probabilistic character. The aim of this paper is to challenge this common view by arguing that Popper can be regarded as an inductivist, not only in the weak broad sense but also in a narrower, probabilistic sense. In section 2, I begin by briefly characterizing the relevant notion of inductivism that is at stake here; I then present and discuss the main Popperian argument against it and show that, in the only reading in which the argument is formally valid, it is restricted to cases of predicted evidence, and that even so restricted it is materially unsound. In section 3, I analyze the desiderata that, according to Popper, any acceptable measure of evidential support must satisfy, clean away their ad hoc components, and show that all the remaining desiderata are satisfied by inductivist-in-the-strict-sense measures. In section 4, I demonstrate that two of these desiderata, accepted by Popper, imply that in cases of predicted evidence any measure that satisfies them is qualitatively indistinguishable from conditional probability. Finally, I argue that this amounts to a kind of strong inductivism that conflicts with Popper's anti-inductivist argument and declarations, and that this conflict does not depend on the incremental versus non-incremental distinction for evidential-support measures, making Popper's position inconsistent on any reading.

Keywords: Popper; inductivism; confirmation; corroboration.

This paper seeks to show that Achinstein's recent attempt to establish that both parties to the wave-particle debate in nineteenth-century optics were Bayesian conditionalizers forces us to ignore several of the key conceptual issues in that controversy, not least the role of the vera causa principle and, more important still, the role of positive evidence in securing acceptance for the wave theory of light.

In his The Poverty of Historicism, K.R. Popper, and before him F. Kaufmann, distinguishes two broad classes of epistemological and methodological positions held in the social sciences: antinaturalistic positions and pronaturalistic positions. These positions are distinguished with respect to their attitude regarding the applicability of the methods of the natural sciences, or rather what the representatives of the anti- and pronaturalistic positions assume to be the method of the natural sciences. According to Popper and Kaufmann, the representatives of antinaturalistic positions hold that the methods of the natural sciences cannot be applied in the social sciences, whereas the representatives of the pronaturalistic positions hold the opposite view. This of course raises the question of what views the representatives of the anti- and pronaturalistic positions embrace with respect to the methods of the natural sciences. It is interesting to note, however, that Popper in The Poverty of Historicism indicates those views only briefly, the reason presumably being the special tasks and aims he set himself in that analysis. These are primarily to show the disastrous political and social consequences of a false social science methodology and the invalidity of the ideas of historical necessity and of laws of historical development. But it is well known that Popper analysed and discussed the methods and the philosophy of science with great rigour and acumen in his other books and articles, and that he triggered a philosophical revolution in that field. Before Popper the prevailing…

Summary: After supposedly cleansing science of induction, Karl Popper claims to offer a purely noninductivist theory of science. In critically evaluating this theory, I focus on its allegedly noninductive character. First, I defend and expand Wesley Salmon's charge that Popper's dismissal of induction renders science useless for practical purposes: without induction, practitioners have no grounds for believing that a predicted event will actually take place. Second, despite Popper's claims to the contrary, his theory of science is shown to rest on induction. In particular, the function he attributes to background knowledge in testing a scientific hypothesis requires induction.

It is often noted that when someone with a tertiary degree in a scientific field promotes an anti-science-establishment, antiscience, or pseudoscience agenda, they are very often an engineer, dentist, surgeon or medical practitioner. While this does not mean that all members of these professions or disciplines are antiscience, of course, the higher frequency of pseudoscience among them is indicative of what I call the "deductivist mindset" regarding science itself. Opposing this is the "inductivist mindset", a view that has been deprecated among philosophers since Popper. Roughly, the deductivist mindset tends to see problems as questions that can be resolved by deduction from known theory or principle, while the inductivist sees problems as questions to be resolved by discovery. These form cognitive poles, which nobody ever purely instantiates, but a cognitive tendency toward deductivism may explain why some people find results that conflict with prior theoretical commitments, whether scientific or not, unacceptable. The deductivist tends to be a cognitive conservative, whereas the inductivist tends to be a cognitive progressive, and the conservative mindset more often leads to a ressentiment about modernism, and hence about certain scientific results, or so I shall argue in this chapter.

I set up two axiomatic theories of inductive support within the framework of Kolmogorovian probability theory. I call these theories 'Popperian theories of inductive support' because I think that their specific axioms express the core meaning of the term 'inductive support' as used by Popper (and, presumably, by many others, including some inductivists). As is to be expected from Popperian theories of inductive support, the main theorem of each of them is an anti-induction theorem, the stronger of them saying, in fact, that the relation of inductive support is identical with the empty relation. It seems to me that an axiomatic treatment of the idea(s) of inductive support within orthodox probability theory could be worthwhile for at least three reasons. Firstly, an axiomatic treatment demands that the builder of a theory of inductive support state clearly, in the form of specific axioms, what he means by 'inductive support'. Perhaps the discussion of the new anti-induction proofs of Karl Popper and David Miller would have been more fruitful if they had given an explicit definition of what inductive support is or should be. Secondly, an axiomatic treatment of the idea(s) of inductive support within Kolmogorovian probability theory might be accommodating to those philosophers who do not completely trust Popperian probability theory for having theorems which orthodox Kolmogorovian probability theory lacks; a transparent derivation of anti-induction theorems within a Kolmogorovian frame might bring additional persuasive power to the original anti-induction proofs of Popper and Miller, developed within the framework of Popperian probability theory. Thirdly, one of the main advantages of the axiomatic method is that it facilitates criticism of its products: the axiomatic theories. On the one hand, it is much easier than usual to check whether those statements which have been distinguished as theorems really are theorems of the theory under examination. On the other hand, after we have convinced ourselves that these statements are indeed theorems, we can take a critical look at the axioms, especially if we have a negative attitude towards one of the theorems. Since anti-induction theorems are not popular at all, the adequacy of some of the axioms they are derived from will certainly be doubted. If doubt should lead to a search for alternative axioms, sheer negative attitudes might develop into constructive criticism and even lead to new discoveries.

I proceed as follows. In section 1, I start with a small but sufficiently strong axiomatic theory of deductive dependence, closely following Popper and Miller (1987). In section 2, I extend that starting theory to an elementary Kolmogorovian theory of unconditional probability, which I extend, in section 3, to an elementary Kolmogorovian theory of conditional probability, which in its turn gets extended, in section 4, to a standard theory of probabilistic dependence, which also gets extended, in section 5, to a standard theory of probabilistic support, the main theorem of which will be a theorem about the incompatibility of probabilistic support and deductive independence. In section 6, I extend the theory of probabilistic support to a weak Popperian theory of inductive support, which I extend, in section 7, to a strong Popperian theory of inductive support. In section 8, I reconsider Popper's anti-inductivist theses in the light of the anti-induction theorems. I conclude the paper with a short discussion of possible objections to our anti-induction theorems, paying special attention to the topic of deductive relevance, which has so far been neglected in the discussion of the anti-induction proofs of Popper and Miller.

This work is in two parts. The main aim of part 1 is a systematic examination of deductive, probabilistic, inductive and purely inductive dependence relations within the framework of Kolmogorov probability semantics. The main aim of part 2 is a systematic comparison of (in all) 20 different relations of probabilistic (in)dependence within the framework of Popper probability semantics (for Kolmogorov probability semantics does not allow such a comparison). Added to this comparison is an examination of (in all) 15 purely inductive dependence relations.

Part 1 leads in an axiomatic step-by-step development from the elementary classical truth value semantics of a sentential-logical language, called 'L' (chapter 1), to the elementary Kolmogorov probability semantics of L (chapter 2), which is then extended to four axiomatic semantical theories of dependence relations between the formulae of L. First the elementary Kolmogorov probability semantics of L is extended to a theory, called 'Kdd', of the relations of deductive dependence and deductive independence between formulae of L (chapter 3). Then Kdd is extended to a theory, called 'Kpd1', of the degree to which formulae of L probabilistically depend on each other in regard to a given probability distribution on the set of all formulae of L (chapter 4). Kpd1, in its turn, gets extended to a theory, called 'Kpd2', of the relations of probabilistic dependence and independence, relativized to unary Kolmogorov probability functions defined on L (chapter 5). Then Kpd2 is extended to a theory, called 'Kid', of the relations of inductive dependence and inductive independence, again relativized to unary Kolmogorov probability functions defined on L (chapter 6). Finally, Kid is extended to a theory, called 'Kpid', of the relations of purely inductive positive and negative dependence, relativized to unary Kolmogorov probability functions defined on L (chapter 7).

Chapter 1, which deals with the familiar notions of truth value functions, tautologies, consequence relations and relations of logical opposition, is naturally the shortest chapter of part 1. In chapter 2, the elementary classical semantics of L is extended to the elementary Kolmogorov probability semantics of L, i.e. to an axiomatic theory of unary and of binary Kolmogorov probability functions defined on the set of formulae of L. Because of the elementary character of this theory, chapter 2 is also rather short. Chapter 3 introduces the first theory of dependence relations, to wit Kdd, the theory of deductive (in)dependence between formulae of L. I follow here the well-known idea of Popper and Miller, who have used it in a famous discussion on the nature of probabilistic support for their arguments that probabilistic support is deductive, not inductive. I develop Kdd in the form of about 100 theorems, making ample use of the fact that deductive independence is nothing but subcontrary opposition, and close with a remark on the fundamental difference between deductive and logical dependence, two relations the ideas of which are all too easily mixed up. Chapters 4 and 5 deal extensively with the traditional ideas of probabilistic (in)dependence, applied to formulae rather than to events. As always, I proceed axiomatically in a step-by-step process under systematic viewpoints and obtain about 300 theorems in this way. In the formulation of the theorems, I took special care to state clearly and expressly so-called tacit assumptions, especially those concerning the probability values of the formulae said to be dependent on each other. These assumptions are usually missing in the literature, due either to economy of writing or to sloppiness of thinking. Presumably, both chapters contain little that is new, their value lying more in the systematic grouping and organic development of the theorems than in their newness.

In chapter 6, I extend the axiomatic theory of probabilistic (in)dependence elaborated in chapter 5 to an axiomatic theory of inductive (in)dependence by requiring of the relation of inductive (in)dependence that it be probabilistic (in)dependence, but not also logical implication or logical opposition. I point out the differences between probabilistic and inductive (in)dependence by means of some 60 theorems and close my examination of inductive (in)dependence by considering its relationship to the notion of support in the philosophy of science. Finally, in chapter 7, the last of part 1, I take the step from inductive dependence to what I call 'purely inductive dependence' by combining the idea of inductive dependence with that of deductive independence in a way which is suggested by writings of Popper and Miller. I arrive at two noteworthy theorems. Firstly, there is indeed no purely inductive support. But secondly, and perhaps amazingly, countersupport is purely inductive.

Whereas the probabilistic framework of part 1 of the present work is Kolmogorov probability semantics, the framework of part 2 is Popper probability semantics, which is not only worth examining as a fascinating alternative to orthodox Kolmogorov probability semantics, but also allows us to examine dependence relations more deeply than Kolmogorov probability semantics does. Part 2 leads, again in an axiomatic step-by-step development, from the basic Popper probability semantics of L, called 'Pb' (chapter 8), via a probabilistic theory of logical attributes, called 'Ps' (chapter 9), to four axiomatic semantical theories of dependence relations between the formulae of L. First, Ps is extended to a theory, called 'Pdd', of the relations of deductive dependence and deductive independence between formulae of L (chapter 10). Then Pdd is extended to a theory, called 'Ppd', of (in all) 20 relations of probabilistic (in)dependence, relativized to binary Popper probability functions defined on L (chapter 11). Ppd, in its turn, is extended to a theory, called 'Pid', of (in all) 10 relations of inductive dependence, again relativized to binary Popper probability functions defined on L (first part of chapter 12). Finally, Pid is extended to a theory, called 'Ppid', of (in all) 15 relations of purely inductive positive and negative dependence, relativized to binary Popper probability functions defined on L (second part of chapter 12).

Chapter 8, the first chapter of part 2 of the present work, is entirely preparatory. It introduces the axioms and about 180 theorems (150 of them together with their proofs) of basic Popper probability semantics in order to set this kind of semantics under way. Then, in chapter 9, basic Popper probability semantics is extended to a probabilistic theory of logical properties of and relations between the formulae of L. Although I think that the way I did this extension is of some interest in itself, the main task of chapter 9 is again a preparatory one: to yield the indispensable lemmata (about 90 in number) for the theorems concerning probabilistic dependence relations in chapter 11 and inductive dependence relations in chapter 12. Chapter 10 brings the extension of Ps to the theory Pdd of deductive (in)dependence. Only half a dozen theorems are noted here for later use in the Pdd-extensions Ppd and Ppid. In view of the over 100 theorems already gained on this topic in the Kolmogorovian framework (cf. chapter 3), a similarly extensive elaboration of Pdd would have been superfluous.

Chapter 11 is the most important one of part 2. It consists of a systematic comparison of 20 probabilistic (in)dependence concepts by means of about 230 theorems, obtained within the axiomatic theory Ppd, which is built up as an extension of Pdd. The main points of comparison were: differences in logical strength; reflexivity and symmetry; and behaviour under the condition that the probability values of the formulae in question are extreme. It turned out that each of the examined concepts violates a strong and straightforward version of the intuitive requirement that probabilistic dependence should go with logical dependence. Whereas the corresponding chapter 5 in part 1 of the present work may not have led to new theorems, chapter 11 yields dozens of them in the process of comparing concepts of dependence and independence which had, as far as I know, never before been treated in a single theoretical framework. With Popper probability semantics, this framework has become available, and here I have simply made full use of it. In chapter 12, I extend the theory Ppd of probabilistic (in)dependence to the theories Pid and Ppid of inductive and purely inductive dependence, in a way very similar to that in which I extended the theory Kpd2 to the theories Kid and Kpid in chapters 6 and 7. The first main result of Kpid (roughly: there is no purely inductive support) could be repeated for four of the five purely inductive positive dependence relations considered in chapter 12, whereas the second main result of Kpid (roughly: there is purely inductive countersupport) could be repeated for each of the five examined purely inductive negative dependence relations. Chapter 12 closes with a brief recapitulation and critical discussion of the main results.

Edgar Allan Poe's standing as a literary figure who drew on (and sometimes dabbled in) the scientific debates of his time makes him an intriguing character for any exploration of the historical interrelationship between science, literature and philosophy. His sprawling 'prose-poem' Eureka (1848), in particular, has sometimes been scrutinized for anticipations of later scientific developments. By contrast, the present paper argues that it should be understood as a contribution to the raging debates about scientific methodology at the time. This methodological interest, which is echoed in Poe's 'tales of ratiocination', gives rise to a proposed new mode of broadly abductive inference, which Poe attributes to the hybrid figure of the 'poet-mathematician'. Without creative imagination and intuition, science would necessarily remain incomplete, even by its own standards. This concern with imaginative (abductive) inference ties in nicely with his coherentism, which grants pride of place to the twin virtues of Simplicity and Consistency, which must constrain imagination lest it degenerate into mere fancy.

Is there a universal set of rules for discovering and testing scientific hypotheses? Since the birth of modern science, philosophers, scientists, and other thinkers have wrestled with this fundamental question of scientific practice. Efforts to devise rigorous methods for obtaining scientific knowledge include the twenty-one rules Descartes proposed in his Rules for the Direction of the Mind and the four rules of reasoning that begin the third book of Newton's Principia, and continue today in debates over the very possibility of such rules. Bringing together key primary sources spanning almost four centuries, Science Rules introduces readers to scientific methods that have played a prominent role in the history of scientific practice. Editor Peter Achinstein includes works by scientists and philosophers of science to offer a new perspective on the nature of scientific reasoning. For each of the methods discussed, he presents the original formulation of the method, selections written by a proponent of the method together with an application to a particular scientific example, and a critical analysis of the method that draws on historical and contemporary sources. The methods included in this volume are: Cartesian rationalism, with an application to Descartes' laws of motion; Newton's inductivism and the law of gravity; two versions of hypothetico-deductivism (those of William Whewell and Karl Popper) and the nineteenth-century wave theory of light; and Paul Feyerabend's principle of proliferation and Thomas Kuhn's views on scientific values, both of which deny that there are universal rules of method, with an application to Galileo's tower argument. Included also are a famous nineteenth-century debate about scientific reasoning between the hypothetico-deductivist William Whewell and the inductivist John Stuart Mill, and an account of the realism-antirealism dispute about unobservables in science, with a consideration of Perrin's argument for the existence of molecules in the early twentieth century.

The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper presents several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question of why such conceptions nevertheless currently prevail in many areas of psychology. The paper argues that these conceptions are rooted in four different aspects of our common-sense understanding of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input–output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to turn into a largely atheoretical and data-driven endeavor, while at the same time enhancing the prospects for widespread public appreciation of their empirical findings.

This paper seeks to defend the following conclusions. The program advanced by Carnap and other necessarians for probability logic has little to recommend it except for one important point: credal probability judgments ought to be adapted to changes in evidence or states of full belief in a principled manner, in conformity with the inquirer's confirmational commitments, except when the inquirer has good reason to modify his or her confirmational commitment. Probability logic ought to spell out the constraints on rationally coherent confirmational commitments. In the case where credal judgments are numerically determinate, confirmational commitments correspond to Carnap's credibility functions, mathematically represented by so-called confirmation functions. Serious investigation of the conditions under which confirmational commitments should be changed ought to be a prime target for critical reflection. The necessarians were mistaken in thinking that confirmational commitments are immune to legitimate modification altogether, but their personalist or subjectivist critics went too far in suggesting that we might dispense with confirmational commitments. There is room for serious reflection on the conditions under which changes in confirmational commitments may be brought under critical control. Undertaking such reflection need not become embroiled in the anti-inductivism that has characterized the work of Popper, Carnap and Jeffrey and narrowed the focus of students of logical and methodological issues pertaining to inquiry.

Legitimating the use of metaphysics in scientific research constituted a far-reaching methodological revolution, invalidating the inductivist demand that science be guided by empirical information alone. Thus, science became tentative. The revolution was established when pioneering historians of science, Max Jammer among them, exhibited the working of metaphysics in scientific research. This raises many problems, since most metaphysical ideas are poor compared with scientific ones. Yet taking science to be the effort to explain facts in a comprehensive manner makes some metaphysics unavoidable, and presents the better metaphysics as the possible frameworks within which older scientific theories may be reinterpreted and improved and newer ones may be developed.

In this paper, I discuss how Newton’s inductive argument of the Principia can be defended against criticisms levelled against it by Duhem, Popper and myself. I argue that Duhem’s and Popper’s criticisms can be countered, but mine cannot. It requires that we reconsider not just Newton’s inductive argument in the Principia, but also the nature of science more generally. The methods of science, whether conceived along inductivist or hypothetico-deductivist lines, make implicit metaphysical presuppositions which rigour requires we make explicit within science, so that they can be critically assessed, and alternatives developed and assessed, in the hope that they can be improved. Despite claiming to derive his law of gravitation by induction from phenomena without recourse to hypotheses, Newton nevertheless acknowledges in the Principia that his rules of reasoning make metaphysical presuppositions. To this extent, Newton has a more enlightened view of scientific method than most 20th- and 21st-century scientists, historians and philosophers of science.

Francis Bacon (1561–1626) wrote that good scientists are not like ants (mindlessly gathering data) or spiders (spinning empty theories). Instead, they are like bees, transforming nature into a nourishing product. This essay examines Bacon's "middle way" by elucidating the means he proposes to turn experience and insight into understanding. The human intellect relies on "machines" to extend perceptual limits, check impulsive imaginations, and reveal nature's latent causal structure, or "forms." This constructivist interpretation is not intended to supplant inductivist or experimentalist interpretations, but is designed to explicate Bacon's account of science as a collaborative project with several interdependent methodological goals.

As climate policy decisions are decisions under uncertainty, being based on a range of future climate change scenarios, it becomes a crucial question how to set up this scenario range. Failing to comply with the precautionary principle, the scenario methodology widely used in the Third Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) seems to violate international environmental law, in particular a provision of the United Nations Framework Convention on Climate Change. Placing climate policy advice on a sound methodological basis would imply that climate simulations based on complex climate models have, in stark contrast to their current hegemony, hardly any epistemic role to play in climate scenario analysis at all. Their main function might actually consist in ‘foreseeing future ozone-holes’. In order to argue for these theses, I first explain the plurality of climate models used in climate science by the failure to avoid the problem of underdetermination. As a consequence, climate simulation results have to be interpreted as modal sentences, stating what is possibly true of our climate system. This indicates that climate policy decisions are decisions under uncertainty. Two general methodological principles which may guide the construction of the scenario range are formulated and contrasted with each other: modal inductivism and modal falsificationism. I argue that modal inductivism, the methodology implicitly underlying the third IPCC report, is severely flawed. Modal falsificationism, the sound alternative, would in turn require an overhaul of IPCC practice.

An approach to inference to the best explanation integrating a Popperian conception of natural laws together with a modified Hempelian account of explanation, on the one hand, and Hacking's law of likelihood (in its nomic guise), on the other, which provides a robust abductivist model of science that appears to overcome the obstacles that confront its inductivist, deductivist, and hypothetico-deductivist alternatives. This philosophy of science clarifies and illuminates some fundamental aspects of ontology and epistemology, especially concerning the relations between frequencies and propensities. Among the most important elements of this conception is the central role of degrees of nomic expectability in explanation, prediction, and inference, for which this investigation provides a theoretical defense.

There’s a tendency to suppose that a naturalist is automatically, by virtue of her naturalism, committed to some particular view of logic. These days, for example, the classical Quinean picture is sometimes taken to be the naturalistic standard: logic lies at the center of the web of belief; remote from sense experience, but widely confirmed by its role in all our successful theorizing; a posteriori like the rest, but also the most resistant to change, given the principle of minimum mutilation; and thus apparently, or even practically, a priori. But others, at other times, have held that other views of logic followed directly from naturalism, say psychologism, or simple inductivism, or some form of linguistic conventionalism. The trouble is that ‘naturalism’ means something different in each case, or that it comes encumbered with various inessential add-ons (like holism).

The place of induction in the framing and testing of scientific hypotheses is investigated. The meaning of 'induction' is first equated with generalization on the basis of case examination. Two kinds of induction are then distinguished: the inference of generals from particulars (first-degree induction), and the generalization of generalizations (second-degree induction). Induction is claimed to play a role in the framing of modest empirical generalizations and in the extension of every sort of generalization, but not in the invention of high-level hypotheses containing theoretical predicates. It is maintained, on the other hand, that induction by enumeration is essential in the empirical test of the lowest-level consequences of scientific theories, since it occurs in the drawing of "conclusions" from the examination of empirical evidence. But it is also held that the empirical test is insufficient, and must be supplemented with theorification, or the expansion of isolated hypotheses into theories. Refutation is not viewed as a substitute for confirmation but as its complement, since the very notion of an unfavorable case is meaningful only in connection with the concept of a positive instance. Although the existence of an inductive method is disclaimed, it is maintained that the various patterns of plausible reasoning (inductive inference included) are worth investigating. It is concluded that scientific research follows neither the advice of inductivism nor the injunction of deductivism, but takes a middle course in which induction is instrumental both heuristically and methodologically, although the overall pattern of research is hypothetico-deductive.

Many scientists believe that there is a uniform, interdisciplinary method for the practice of good science. The paradigmatic examples, however, are drawn from classical experimental science. Insofar as historical hypotheses cannot be tested in controlled laboratory settings, historical research is sometimes said to be inferior to experimental research. Using examples from diverse historical disciplines, this paper demonstrates that such claims are misguided. First, the reputed superiority of experimental research is based upon accounts of scientific methodology (Baconian inductivism or falsificationism) that are deeply flawed, both logically and as accounts of the actual practices of scientists. Second, although there are fundamental differences in methodology between experimental scientists and historical scientists, they are keyed to a pervasive feature of nature, a time asymmetry of causation. As a consequence, the claim that historical science is methodologically inferior to experimental science cannot be sustained.

The non-justificationist deductivism (or critical rationalism) of Karl Popper constitutes the only approach to human knowledge, including, of course, the natural and social sciences, that is capable of overcoming all the failings, and the plain contradictions, of the traditional doctrine of inductivism and of its modern incarnation, Bayesianism.

The justification of induction is of central significance for cross-cultural social epistemology. Different ‘epistemological cultures’ differ not only in their beliefs, but also in their belief-forming methods and evaluation standards. For an objective comparison of different methods and standards, one needs (meta-)induction over past successes. A notorious obstacle to justifying induction lies in the fact that the success of object-inductive prediction methods (i.e., methods applied at the level of events) can be shown neither to be universally reliable (Hume's insight) nor to be universally optimal. My proposal towards a solution of the problem of induction is meta-induction. The meta-inductivist applies the principle of induction to all competing prediction methods that are accessible to her. By means of mathematical analysis and computer simulations of prediction games I show that there exist meta-inductive prediction strategies whose success is universally optimal among all accessible prediction strategies, modulo a small short-run loss. The proposed justification of meta-induction is mathematically analytical. It implies, however, an a posteriori justification of object-induction based on the experiences in our world. In the final section I draw conclusions about the significance of meta-induction for the social spread of knowledge and the cultural evolution of cognition, and I relate my results to other simulation results which utilize meta-inductive learning mechanisms.
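The prediction-game setup described in this abstract can be illustrated with a short simulation. The sketch below is an illustrative reconstruction, not the author's actual simulation code: a meta-inductivist forecasts a success-weighted average of the competing predictors' forecasts, putting weight only on predictors whose cumulative success exceeds her own (an attractivity-style weighting; the function name, the absolute-value loss, and the fallback to a plain average are all assumptions of this sketch).

```python
def prediction_game(events, predictors, eps=1e-9):
    """Run a prediction game: events are values in [0, 1], each predictor
    is a function from the event history so far to a forecast in [0, 1].

    The meta-inductivist predicts a success-weighted average of the
    predictors' forecasts, weighting each predictor by how far its
    cumulative success exceeds the meta-inductivist's own ("attractivity").
    Per-round success is scored as 1 - |forecast - event|.

    Returns (meta average success, list of per-predictor average successes).
    """
    n = len(predictors)
    scores = [0.0] * n          # cumulative success of each predictor
    meta_score = 0.0            # cumulative success of the meta-inductivist
    history = []
    for e in events:
        forecasts = [p(history) for p in predictors]
        # attractivity weights: surplus of each predictor over the meta player
        weights = [max(s - meta_score, 0.0) for s in scores]
        total = sum(weights)
        if total < eps:
            # nobody is ahead of the meta player yet: plain average
            meta_forecast = sum(forecasts) / n
        else:
            meta_forecast = sum(w * f for w, f in zip(weights, forecasts)) / total
        meta_score += 1.0 - abs(meta_forecast - e)
        for i, f in enumerate(forecasts):
            scores[i] += 1.0 - abs(f - e)
        history.append(e)
    t = len(events)
    return meta_score / t, [s / t for s in scores]
```

Run on a mostly regular binary sequence with a few simple object-level predictors, the meta-inductivist's average success approaches that of the best accessible predictor up to a small short-run loss, which is the qualitative behavior the abstract's optimality claim describes.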