I aim to show that one way of testing the mettle of a theory of scientific explanation is to inquire what that theory entails about the status of brute facts. Here I consider the nature of brute facts, and survey several contemporary accounts of explanation vis-à-vis this subject. One problem with these accounts is that they seem to entail that brute facts represent a gap in scientific understanding. I argue that brute facts are non-mysterious and indeed are even explainable by the lights of Salmon's ontic conception of explanation. The plausibility of various models of explanation, I suggest, depends to some extent on the tendency of their proponents to focus on certain examples of explananda; I ponder brute facts qua explananda here as a way of helping us to recognize this dependency.

We call Frege's theorem his discovery that, in the context of second-order logic, Hume's principle implies the infinity of the natural numbers. Hume's principle states that the number of Fs = the number of Gs if, and only if, F ≈ G, where 'F ≈ G' (the Fs and the Gs are in one-to-one correspondence) has its usual, second-order, explicit definition. We discuss whether this theorem can be marshalled in support of a possibly revised formulation of Frege's logicism.
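
For readers who want Hume's principle spelled out symbolically, one standard second-order rendering (our notation, not necessarily the authors') is

$$\#F = \#G \;\leftrightarrow\; F \approx G, \qquad F \approx G \;\equiv\; \exists R\,[\,\forall x\,(Fx \rightarrow \exists! y\,(Gy \wedge Rxy)) \wedge \forall y\,(Gy \rightarrow \exists! x\,(Fx \wedge Rxy))\,],$$

i.e., some relation R correlates the Fs one-to-one with the Gs. Frege's theorem is then the result that second-order logic plus this single principle suffices to derive the axioms of arithmetic, and hence the infinity of the natural numbers.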

Galileo's Dialogue (1632) can be read from the viewpoints of methodological judgment and critical reasoning; methodological judgment means the avoidance of one-sidedness and extremes; and critical reasoning means reasoning aimed at the analysis and evaluation of arguments. Classic sources for these readings are Thomas Salusbury (1661) and the Port-Royal logicians (1662). This focus does not deny the book's scientific, historical, rhetorical, and aesthetic dimensions; it is critical of excessively rhetorical readings; and it suggests solutions to the problems of hermeneutical pluralism, interpretation versus evaluation, and theory versus practice. The book's methodological judgment and critical reasoning can be shown to correspond to Galileo's self-reflections.

A large body of research in computational vision science stems from the pioneering work of David Marr. Recently, Patricia Kitcher and others have criticized this work as depending upon optimizing assumptions, assumptions which are held to be inappropriate for evolved cognitive mechanisms just as anti-adaptationists (e.g., Lewontin and Gould) have argued they are inappropriate for other evolved physiological mechanisms. The paper discusses the criticism and suggests that it is, in part, misdirected. It is further suggested that the criticism leads to interesting questions about how one formulates constraints--across "levels of organization" and disciplinary boundaries--on one's models of complex systems, such as human vision.

This paper serves to introduce the papers from the symposium by the same title, by describing the sort of work done in philosophy of psychology conceived as a branch of the philosophy of science, distinguishing it from other discussions of psychology in philosophy, and criticizing the claims to set limits on scientific psychology in the largely psychologically uninformed literatures concerning "folk psychology" and "wide" and "narrow" content. Philosophy of psychology as philosophy of science takes seriously and analyzes the explanatory structures, conceptual problems, and evidentiary practices of extant scientific psychology.

This paper is intended to be an introductory survey of subjects related to the problems dealt with in the three other papers in this symposium on quantum field theory. A brief history of quantum electrodynamics is given and some of the objections to it are stated. A brief history of quantum field theories from the 1970s to the present is then provided. Finally, a sketch of some of the philosophical work that has been done on quantum field theories is presented. The object of the paper is to explain why philosophers of physics have tended to neglect quantum field theories and to point out several of the conceptual issues raised by quantum field theories that call out for further analysis.

One of the most widely debated and influential implications of the "demise" of positivism was the realization, now a commonplace, that philosophy of science must be firmly grounded in an understanding of the history of science, and/or of contemporary scientific practice. While the nature of this alliance is still a matter of uneasy negotiation, the principle that philosophical analysis must engage "real" science has transformed philosophical practice in innumerable ways. This short paper is the introduction to a symposium presented at the 1994 PSA Biennial meetings that focused attention on recent developments at the interface between various disciplinary science studies fields. It brought together two philosophers who explore the implications of sociological and historical contextualization for philosophical studies of science, Brian Baigrie and Joseph Rouse; and a sociologist and historian, Andy Pickering and Betty Smocovitis, whose work raises philosophical questions about the sciences and about science studies. Each argued for ways of reconceptualizing our subject domains, our purposes, and our conventional strategies of inquiry that promise much richer understanding of the sciences, but necessarily challenge discipline-specific traditions of science studies quite profoundly. If there is a common theme to be discerned in these discussions it is that, in the spirit of Rouse's recommendations, science studies should be understood to be an essentially open-ended and dynamic enterprise, like the sciences they study.

Do predictions of novel facts provide stronger evidence for a theory than explanations of old ones? Sometimes yes, sometimes no. Which obtains has nothing to do with whether the evidence is predicted or explained, but only with the selection procedure used to generate the evidence. This is demonstrated by reference to a series of hypothetical drug cases and to Heinrich Hertz's 1883 cathode ray experiments.

Several diagrams and tables from review articles during the Ox-Phos Controversy serve as an occasion to assess the nature of competition in models of theory choice in science. Many models follow "Super-Bowl" principles of polar, either-or, winner-take-all competition. A significant alternative highlighted by this episode, however, is the differentiation of domains. Incommensurability and the partial divergence of overlapping domains serve both as signals and context for shifting frameworks of competition. Appropriate strategies may thus help researchers diagnose the status of competition and shape their research accordingly.

In realistic situations where a macroscopic system interacts with an external environment, decoherence of the quantum state, as derived in the decoherence approach, is only approximate. We argue that this can still give rise to facts, provided that, during the decoherence process, states that remain close to eigenvectors of the pointer position and of the record observable, respectively, are correlated. We show in a model that this is always the case.

The new inter-disciplinary eclecticism championed by many philosophers of science has generated a heterogeneous family of science studies projects. Philosophers who favor an inter-disciplinary approach face many problems if they are to successfully forge a hybrid science studies that does not violate their integrity as philosophers; in particular, they must isolate an intellectual space in which traditional agendas, such as the concern for the clarification of concepts, can hold court. In this paper, I outline what I regard as a new brand of HPS, one that is deeply rooted in the history of the exact sciences. The virtue of this New HPS, I will submit, is that it furnishes philosophers of science with a fresh perspective from which to carry on philosophy's classic normative mission.

Recently we have learned how experiment can have a life of its own. However, experiment remains epistemologically disadvantaged, on the assumption that scientific knowledge must have a theoretical/propositional form. To begin to redress this situation, I discuss three ways in which instruments carry meaning: 1. Scientific instruments can carry tremendous loads of meaning through association, analogy and metaphor. 2. Instrumental models of complicated phenomena work representationally in much the same way as theories. 3. Instruments which create new phenomena establish a new field of material possibilities. I suggest that scientists employ a "visual/physical/material logic," analogous to propositional logic, which establishes relations between different material forms.

I advance a decision principle called the "weak dominance principle" (WDP), based on the interval notion of probability, to deal with the Ellsberg-type paradox (ETP). Given ETP, I explain three things: (i) why WDP is a better principle than many others, e.g., Kyburg's principle and Gardenfors and Sahlin's principle; (ii) why one should not, contrary to many principles, expect a unique solution in ETP; and (iii) what the relationship is between WDP and the principles mentioned above. I also prove that WDP induces a strict partial ordering on the intervals to which it is applied.
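
As a purely illustrative aside (our own toy calculation, not the paper's formal WDP), the three-color Ellsberg urn shows how interval-valued probabilities leave the classic bets incomparable, which is one intuitive route to seeing why a unique solution need not be expected:

# Hypothetical sketch: interval-valued expected payoffs in the three-color
# Ellsberg urn (30 red balls; 60 black or yellow balls in unknown proportion).
def interval_expectation(payoffs):
    """Lowest and highest expected payoff as the black/yellow split varies."""
    lo, hi = float("inf"), float("-inf")
    for black in range(61):                      # 0 to 60 black balls
        p = {"red": 30 / 90, "black": black / 90, "yellow": (60 - black) / 90}
        ev = sum(payoffs[c] * p[c] for c in p)
        lo, hi = min(lo, ev), max(hi, ev)
    return lo, hi

bets = {
    "I   (win on red)":             {"red": 1, "black": 0, "yellow": 0},
    "II  (win on black)":           {"red": 0, "black": 1, "yellow": 0},
    "III (win on red or yellow)":   {"red": 1, "black": 0, "yellow": 1},
    "IV  (win on black or yellow)": {"red": 0, "black": 1, "yellow": 1},
}
for name, payoffs in bets.items():
    lo, hi = interval_expectation(payoffs)
    print(f"{name}: expected payoff in [{lo:.2f}, {hi:.2f}]")

Bets I and II (and likewise III and IV) have overlapping expectation intervals, so no ranking by interval dominance singles out a unique choice.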

The question whether research techniques are producing artifacts or data is often a crucial one for scientists. The potential for artifacts results from the fact that generating data often requires numerous procedures that are brutal, poorly understood, and very sensitive to details of the procedure. Through a case-study of the introduction of electron microscopy as a tool for studying cells, I examine how scientists judge whether new techniques are introducing artifacts. Three factors seem to be most salient in their judgments: determinateness of the results, consilience of different procedures, and ability of the results to fit into emerging theories.

This paper presents an analysis of the forms of response that scientists make when confronted with anomalous data. We postulate that there are seven ways in which an individual who currently holds a theory can respond to anomalous data: (1) ignore the data; (2) reject the data; (3) exclude the data from the domain of the current theory; (4) hold the data in abeyance; (5) reinterpret the data; (6) make peripheral changes to the current theory; or (7) change the theory. We analyze psychological experiments and cases from the history of science to support this proposal. Implications for the philosophy of science are discussed.

John Earman and John Norton have argued that substantivalism leads to a radical form of indeterminism within local spacetime theories. I compare their argument to more traditional arguments typical in the Relationist/Substantivalist dispute and show that they all fail for the same reason. All these arguments ascribe to the substantivalist a particular way of talking about possibility. I argue that the substantivalist is not committed to the modal claims required for the arguments to have any force, and show that this naturally leads to an alteration in the way determinism is characterized for local spacetime theories.

The thesis of this paper is that philosophers are often too hasty in rejecting justifications because the argument that yields the justification is circular. Circularity is distinguished from vicious circularity and several examples are examined in which a proposed justification is circular in a precise sense, but not viciously circular. These include an observational procedure which could yield a velocity in excess of the velocity of light even though the impossibility of such velocities is assumed at a key step in analyzing the data, and an argument that uses a specific argument form to show that that form is invalid.

The thesis that scientists give greater weight to novel predictions than to explanations of known facts is tested against historical cases in physical science. Several theories were accepted after successful novel predictions but there is little evidence that extra credit was given for novelty. Other theories were rejected despite, or accepted without, making successful novel predictions. No examples were found of theories that were accepted primarily because of successful novel predictions and would not have been accepted if those facts had been previously known.

The aim of cognitive neuropsychology is to articulate the functional architecture underlying normal cognition, on the basis of cognitive performance data involving brain-damaged subjects. Glymour formulates a discovery problem for cognitive neuropsychology, in the sense of formal learning theory, concerning the existence of a reliable methodology, and argues that the problem is insoluble: granted certain apparently plausible assumptions about the form of neuropsychological theories and the nature of the available evidence, a reliable methodology does not exist! I argue for a reformulation of the discovery problem in terms of an alternative characterization of relevant evidence in neuropsychology.

Descartes' conception of the mind as a private entity, separable (in various ways) from the body and the world around it, has come under increasingly vigorous attack in recent years. A new and very different sort of expansion of the scope of psychology has recently been advanced by John Haugeland, who argues quite ingeniously that the Cartesian divisions between mind, body, and world are psychologically otiose. I demur, citing several traditional individuative criteria that are immune to Haugeland's case.

A realist causal model of quantum cosmology (QC) is developed. By applying the de Broglie-Bohm interpretation of quantum mechanics to QC, we resolve the notorious 'problem of time' in QC, and derive exact equations of motion for cosmological dynamical variables. Due to this success, it is argued that if the situation in QC is used as a yardstick by which other interpretations are measured, the de Broglie-Bohm theory seems uniquely fit as an interpretation of quantum mechanics.

Pluralism is usually opposed to realism. This paper argues that the two come naturally into conflict only given a third assumption: imperialism, i.e., the doctrine that some one, or some handful, of our favourite theories are universal. This paper attempts to show why that assumption is implausible, even in the case of fundamental theories in physics. It argues first that physics theories are true only in their models: for the most part the successes of a theory are confined to situations that resemble the models. Second it argues specifically for the possibility of peaceful co-existence between quantum and classical physics.

A central theme in the foundational debates in the early twentieth century in response to the paradoxes was to invoke the notion of the indefinite extensibility of certain concepts, e.g., definability and class. Dummett has recently revived the notion, as the real lesson of the paradoxes and the source of Frege's error in Basic Law V of the Grundgesetze. The paper traces the historical and conceptual evolution of the concept and criticizes Dummett's argument that the proper lesson of the paradoxes is that set theory is a theory of indefinitely extensible domains.
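
For reference (in modern notation of our own, not the paper's), Basic Law V is the principle governing extensions,

$$\hat{x}F(x) = \hat{x}G(x) \;\leftrightarrow\; \forall x\,(F(x) \leftrightarrow G(x)),$$

which, unlike Hume's principle, is inconsistent in full second-order logic: Russell's paradox follows from it, and this is the "error" of Frege's that the abstract refers to.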

Rasmussen (1993) argues that, because electron microscopists did not use robustness and would not have been warranted in using it as a criterion for the reality or the artifactuality of mesosomes, the bacterial mesosome serves as a test case for robustness that it fails. I respond by arguing that a more complete reading of the research literature on the mesosome shows that ultimately the more robust body of data did not support the mesosome and that electron microscopists used and were warranted in using robustness as a criterion for the artifactuality of mesosomes.

In the philosophy of science, we are asked to assess critically, and on their intrinsic merits, various proposals for a consistent interpretation of quantum mechanics, including resolutions of the measurement problem and accounts of the long-range Bell correlations. In this paper I suggest that the terms of debate may have been so severely and unduly constrained by the reigning orthodoxy that we labor unproductively with an unhelpful vocabulary and set of definitions and distinctions. I present an alternative conceptual framework, free of many of the standard conundrums.

Molecular biologists use different kinds of reasoning strategies for different tasks, such as hypothesis formation, experimental design, and anomaly resolution. More specifically, the reasoning strategies discussed in this paper may be characterized as (1) abstraction-instantiation, in which an abstract skeletal model is instantiated to produce an experimental system; (2) the systematic scan, in which alternative hypotheses are systematically generated; and (3) modular anomaly resolution, in which components of a model are stated explicitly and methodically changed to generate alternatives for resolving an anomaly. This work grew out of close observation, over a period of six months, of an actively functioning molecular genetics laboratory.

I review the modal interpretation of quantum mechanics, some versions of which rely on the biorthonormal decomposition of a statevector to determine which properties are physically possessed. Some have suggested that these versions fail in the case of inaccurate measurements, i.e., when one takes tails of the wavefunction into account. I show that these versions of the modal interpretation are satisfactory in such cases. I further suggest that a more general result is possible, namely, that these versions of the modal interpretation never encounter the sort of trouble that has been claimed to arise in the case of inaccurate measurement.

Most discussion of the unity of science has concerned what might be called vertical relations between theories: the reducibility of biology to chemistry, or chemistry to physics, and so on. In this paper I shall be concerned rather with horizontal relations, that is to say, with theories of different kinds that deal with objects at the same structural level. Whereas the former, vertical, conception of unity through reduction has come under a good deal of criticism recently (see, e.g., Dupré 1993), horizontal unity has generally been conceded to be an important goal. The most pressing questions about horizontal unification arise in the study of human behavior. Numerous sciences, including psychology, economics, anthropology, sociology, and parts of biology, attempt to provide explanations of human behavior. It is possible that some of these sciences may be able to coexist peacefully or even cooperatively. However things do not always go so smoothly, and at least two approaches to human behavior, those deriving from economics and evolutionary biology, often involve clearly imperialist tendencies. Devotees of these approaches are inclined to claim that they are in possession not just of one useful perspective on human behavior, but of the key that will open doors to the understanding of ever wider areas of human behavior. In this paper I shall consider some areas in which economic and evolutionary imperialists are currently staking claims. It is of particular interest to look at situations where the two imperialist programs are staking the same claim, but limitations of space force me to focus here mainly on economics, and my remarks on evolutionary imperialism will be cursory. As well as some specific insights into the particular strategies of these scientific programs, I hope that my discussion will throw some more general light on the limits of such general theoretical strategies and, thereby, I shall suggest some motivations for adhering to a horizontal pluralism of science that matches the vertical pluralism advocated by anti-reductionists.

Decoherence results from the dissipative interaction between a quantum system and its environment. As the system and environment become entangled, the reduced density operator describing the system "decoheres" into a mixture (with the interference terms damped out). This formal result prompts some to exclaim that the measurement problem is solved. I will scrutinize this claim by examining how modal and relative-state interpretations can use decoherence. Although decoherence cannot rescue these interpretations from general metaphysical difficulties, decoherence may help these interpretations to pick out a preferred basis. I will explore whether decoherence solves nagging technical problems associated with selecting a preferred basis.
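
Schematically, the formal result mentioned here can be stated as follows (standard textbook notation, not anything specific to this paper). If system and environment become entangled as

$$|\Psi\rangle = \sum_i c_i\,|s_i\rangle \otimes |E_i\rangle,$$

then the reduced density operator of the system is

$$\rho_S = \mathrm{Tr}_E\,|\Psi\rangle\langle\Psi| = \sum_{i,j} c_i c_j^{*}\,\langle E_j|E_i\rangle\,|s_i\rangle\langle s_j| \;\longrightarrow\; \sum_i |c_i|^2\,|s_i\rangle\langle s_i|$$

as the environment states become effectively orthogonal ($\langle E_j|E_i\rangle \to \delta_{ij}$), so the interference terms are damped out. Whether this amounts to solving the measurement problem, or merely to helping select a preferred basis, is exactly what the paper goes on to scrutinize.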

Hacking and other entity realists suggest a strategy to build scientific realism on a stronger foundation than inference to the best explanation. They argue that if beliefs in the existence of theoretical entities are derived from experimentation rather than theories, they can escape the antirealist's criticism and provide a stronger ground for realism. In this paper, an outline and a critique of entity realism are presented. It will be argued that entity realism cannot stand as a separate position from classical realism. Thus, entity realism cannot avoid the problems facing classical realism.

Several authors have argued for taxonomic pluralism in biology: the position that there is a plurality of equally legitimate classifications of the organic world. Others have objected that such pluralism boils down to a position of 'anything goes.' This paper offers a response to the 'anything goes' objection by showing how one can be a discerning pluralist. In particular, methodological standards for choosing taxonomic projects are derived using Laudan's normative naturalism. This paper also sheds light on why taxonomic pluralism occurs in biology as well as illustrates the usefulness of normative naturalism.

In his first philosophy book, Science and Hypothesis, Poincaré provides a picture in which the different sciences are arranged in a hierarchy. Arithmetic is the most general of all the sciences because it is presupposed by all the others. Next comes mathematical magnitude, or the analysis of the continuum, which presupposes arithmetic; and so on. Poincaré's basic view was that experiment in science depends on fixing other concepts first. More generally, certain concepts must be fixed before others: hence the hierarchy. This paper attempts to dissolve some potential problems regarding Poincaré's hierarchy. One is an apparent epistemological circularity in the hierarchy. A more serious problem regarding the epistemology of analysis is also addressed.

One important point that has emerged from recent work on the history and philosophy of experiment is that technology plays an integral role in experiment, and therefore in science. Technology determines what experimenters can measure and how well it can be measured. The importance of technology, along with several new questions that its use raises, has been made quite clear in the papers presented in this session.

I aim to recover some of the original cultural significance that was attached to the realism-instrumentalism debate (RID) when it was hotly contested by professional scientists in the decades before World War I. Focusing on the highly visible Mach-Planck exchange of 1908-13, I show that arguments about the nature of scientific progress were used to justify alternative visions of science education. Among the many issues revealed in the exchange are realist worries that instrumentalism would subordinate science entirely to human interests, as well as instrumentalist worries that realism could become the basis of a science-based religion. I conclude by addressing some issues relating to RID that are now occluded because of Planck's triumph over Mach.

This address focuses on those of us engaged in viewing science, particularly philosophers and sociologists of science. I begin with a historical perspective on the philosophy of science, focusing on the historical contingencies which have shaped its development since the 1930s. I then turn my gaze to the more recent history of the sociology of science. For both disciplines I hold up to view the reflexive problem of the status of that discipline's claims from its own perspective. I conclude with a realist vision of science which rejects asymmetric notions, such as rationality, in favor of a naturalistic, perspectival realism.

In this paper I offer a criticism of Carnap's inductive logic which also applies to other formal methods of inductive inference. Criticisms of Carnap's views have typically centered upon the justification of his particular choice of inductive method. I argue that the real problem is not that there is an agreed upon method for which no justification can be found, but that different methods are justified in different circumstances.

Neyman-Pearson methods in statistics distinguish between Type I and Type II errors. Through rigid control of Type I error, the "null" hypothesis typically receives the benefit of the doubt. I compare philosophers' interpretations of this feature of Neyman-Pearson tests with interpretations given in statistics textbooks. The pragmatic view of the tests advocated by Neyman, largely rejected by philosophers, lives on in many textbooks. Birnbaum thought the pragmatic view has a useful "heuristic" role in understanding testing. I suggest that it may have the opposite effect.
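
By way of a generic illustration (our own toy example, not the paper's): in a one-sided Neyman-Pearson test on a Normal mean, fixing the Type I error rate determines the rejection cutoff, and the Type II error rate is whatever then falls out at a given alternative.

# Toy Neyman-Pearson test: H0: mu = 0 vs H1: mu = mu1 > 0, known sigma.
# Rigid control of the Type I error rate alpha fixes the cutoff; the
# Type II error rate beta depends on the alternative considered.
from scipy.stats import norm

def np_test_errors(mu1, sigma=1.0, n=25, alpha=0.05):
    se = sigma / n ** 0.5
    cutoff = norm.ppf(1 - alpha, loc=0.0, scale=se)   # reject H0 if xbar > cutoff
    beta = norm.cdf(cutoff, loc=mu1, scale=se)        # P(fail to reject | H1 true)
    return cutoff, beta

cutoff, beta = np_test_errors(mu1=0.5)
print(f"cutoff = {cutoff:.3f}, Type II error beta = {beta:.3f}, power = {1 - beta:.3f}")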

This article blends psychological and social factors in the explanation of science, and defends the compatibility of a psychosocial picture with an epistemic picture. It examines three variants of the 'political' approach to interpersonal persuasion advocated by Latour and others. In each case an 'epistemic' or mixed account is more promising and empirically better supported. Psychological research on motivated reasoning shows the epistemic limits of interest-driven belief. Against social constructivism, the paper defends the viability of a truth-based standard, and reports how truth-possession can be promoted even when scientific work is motivated by political/professional considerations.

Many philosophers of psychology fail to appreciate the constructivist process of science as well as its pragmatic aspects. A well-developed philosophy of science helps to clear many conceptual confusions. However, ridding ourselves of popular complaints only opens more sophisticated worries regarding how we generalize specific events and how we use those generalizations to build physical systems and abstract models. These questions can still be answered, though, by realizing that science is largely a social enterprise, and that how and what we explain depends a great deal upon who is asking the question of whom and when.

The objectivity of Bayesian induction relies on the ability of evidence to produce a convergence to agreement among agents who initially disagree about the plausibilities of hypotheses. I will describe three sorts of Bayesian convergence. The first reduces the objectivity of inductions about simple "occurrent events" to the objectivity of posterior probabilities for theoretical hypotheses. The second reveals that evidence will generally induce convergence to agreement among agents on the posterior probabilities of theories only if the convergence is to 0 or 1. The third establishes conditions under which evidence will very probably compel posterior probabilities of theories to converge to 0 or 1.
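
A minimal numerical sketch of the washing-out-of-priors phenomenon (a standard Beta-Bernoulli example of ours, not the paper's own models): agents with very different priors over a coin's bias end up with nearly identical posteriors after enough shared evidence.

# Two Bayesian agents with different Beta priors over a coin's bias
# update on the same data; their posterior means converge.
import random

random.seed(0)
true_bias = 0.7
priors = {"agent A": (1, 9), "agent B": (9, 1)}   # Beta(a, b) priors

data = [random.random() < true_bias for _ in range(1000)]
heads, tails = sum(data), len(data) - sum(data)

for name, (a, b) in priors.items():
    post_a, post_b = a + heads, b + tails
    print(f"{name}: posterior mean = {post_a / (post_a + post_b):.3f}")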

We report two issues concerning diverging sets of Bayesian (conditional) probabilities, i.e., divergence of "posteriors," that can result with increasing evidence. Consider a set P of probabilities typically, but not always, based on a set of Bayesian "priors." Fix E, an event of interest, and X, a random variable to be observed. With respect to P, when the set of conditional probabilities for E, given X, strictly contains the set of unconditional probabilities for E, for each possible outcome X = x, call this phenomenon dilation of the set of probabilities (Seidenfeld and Wasserman 1993). Thus, dilation contrasts with the asymptotic merging of posterior probabilities reported by Savage (1954) and by Blackwell and Dubins (1962). (1) In a wide variety of models for robust Bayesian inference the extent to which X dilates E is related to a model-specific index of how far key elements of P are from a distribution that makes X and E independent. (2) At a fixed confidence level, $1-\alpha$, classical interval estimates $A_n$ for, e.g., a Normal mean $\theta$ have length $O(n^{-1/2})$ (for sample size n). Of course, the confidence level correctly reports the (prior) probability that $\theta \in A_n$, $P(\theta \in A_n) = 1-\alpha$, independent of the prior for $\theta$. However, as shown by Pericchi and Walley (1991), if an $\varepsilon$-contamination class is used for the prior on the parameter $\theta$, there is asymptotic (posterior) dilation for the $A_n$, given the data. If, however, the intervals $A'_n$ are chosen with length $O(\sqrt{\log(n)/n})$, then there is no asymptotic dilation. We discuss the asymptotic rates of dilation for classical and Bayesian interval estimates and relate these to Bayesian hypothesis testing.
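
A small worked example of dilation (ours, adapted from the standard literature rather than taken from this abstract): let A and B be events with precise marginals $P(A)=P(B)=1/2$ but with the dependence between them left completely open, so that $P(A \cap B)$ may be anything in $[0, 1/2]$. Unconditionally the set of probabilities for A is the single point $\{1/2\}$, yet

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)} \in [0,1] \quad\text{and}\quad P(A \mid B^{c}) = \frac{P(A) - P(A \cap B)}{P(B^{c})} \in [0,1],$$

so whichever outcome of B is observed, the set of conditional probabilities for A strictly contains the unconditional one: conditioning makes the assessment less, not more, precise.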

One of the traditional problems of philosophy is the nature of the connection between perceptual experience and empirical knowledge. That there is an intimate connection between the two is rarely doubted. Three case studies of visual deficits due to brain damage are used to motivate the claim that perceptual experience is neither necessary nor sufficient for perceptual knowledge. Acceptance of this claim leaves a mystery as to the epistemic role, if any, of perceptual experience. It is argued that one function of perceptual experience is to provide information about the sources of beliefs, both as to which perceptual modality is involved and as to the source within a given modality. This information is useful in assessing the reliability of perceptual beliefs.

A true Turing machine requires an infinitely long paper tape. Thus a TM can be housed in the infinite world of Newtonian spacetime, but not necessarily in our world, because our world, at least according to our best spacetime theory, general relativity, may be finite. All the same, one can argue for the "existence" of a TM on the basis that there is no such housing problem in some other relativistic worlds that are similar to our world. But curiously enough, and this is the main point of this paper, some of these close worlds have a special spacetime structure that allows TMs to perform certain Turing unsolvable tasks. For example, in one kind of spacetime a TM can be used to solve first-order predicate logic and the halting problem. And in a more complicated spacetime, TMs can be used to decide arithmetic. These new computers serve to show that Church's thesis is a thoroughly contingent claim. Moreover, since these new computers share the fundamental properties of a TM in ordinary operation, a computability theory based on these non-Turing computers is no less worthy of investigation than orthodox computability theory. Some ideas about this new mathematical theory are given.

I criticize a certain view of the 'quanta' of quantum mechanics that sees them as fundamentally non-atomistic and fundamentally significant for our understanding of quantum fields. In particular, I have in mind work by Redhead and Teller (1991, 1992) and Teller (1990). I prove that classical particles do not have the rather strong flavour of identity often associated with them; permuting positions and momenta does not produce distinct states. I show that even the label-free excitation formalism is compatible with a mild form of atomism. Finally, I summarise some of the principal objections to an 'oscillator' interpretation of quantum fields.

In this paper, we argue for the centrality of countable additivity to realist claims about the convergence of science to the truth. In particular, we show how classical sceptical arguments can be revived when countable additivity is dropped.

There are a number of controversies surrounding the Human Genome Project. Some criticisms are based on the contention that the full human sequence will be scientifically worthless; others stem from short-term worries about the social impact of genetic testing and the release of genetic information about individuals. I argue that, properly understood, the HGP is a valuable scientific project with a misleading name, that the moral issues surrounding the short-term difficulties are relatively straightforward but that there are problems of practical politics in implementing the obvious solutions. Finally, I suggest that the HGP serves as the occasion for raising deeper philosophical questions about our commitment to improve the quality of human lives.

Hacking has maintained that in experiments phenomena are created, not discovered, and that scientific entities are tools for doing. These claims undermine the distinction between the natural and the artificial: phenomena and scientific entities become artifacts. Hacking's view raises the question whether the distinction between the natural and the artificial has to be given up. The paper argues 1) that phenomena are created, but in a sense that does not undermine the distinction between the natural and the artificial, 2) that scientific entities are used as tools instead of being tools, and 3) that Hacking's view on experiments may be reconciled with the traditional view provided the concept of nature be reinterpreted.

This paper is part of a larger project defending the foundations of microeconomics against recent criticisms by philosophers. Here, we undermine one source of these criticisms, arising from philosophers' disappointment with the performance of microeconomic tools, in particular game theory, when these are applied to normative decision theory. Hollis and Sugden have recently articulated such disappointment in a sophisticated way, and have argued on the basis of it that the economic conception of rationality is inadequate. We argue, however, that their claim rests upon a misunderstanding of the concept of a game as it is used in microeconomics.

In Bayes or Bust?, John Earman attempts to express in Bayesian terms a sense of "projectibility" in which it is logically impossible for "All emeralds are green" and "All emeralds are grue" simultaneously to be projectible. I argue that Earman overlooks an important sense in which these two hypotheses cannot both be projectible. This sense is important because it allows projectibility to be connected to lawlikeness, as Goodman intended. Whether this connection suggests a way to resolve Goodman's famous riddle remains unsettled, awaiting an account of lawlikeness. I explore one line of thought that might prove illuminating.
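
For readers without Goodman's predicate to hand, the standard definition (our paraphrase; the abstract does not restate it) is that an object is grue just in case it is examined before some fixed time t and is green, or is not so examined and is blue:

$$\text{grue}(x) \;\equiv\; \big(\text{examined-before-}t(x) \wedge \text{green}(x)\big) \vee \big(\neg\,\text{examined-before-}t(x) \wedge \text{blue}(x)\big).$$

All emeralds observed so far thus fit both "All emeralds are green" and "All emeralds are grue," which is what makes their joint projectibility the issue.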

During the Middle Ages and Renaissance, it was commonly believed that Aristotle's biological studies reflected his theory of demonstrative science quite well. By contrast, most commentators in the twentieth century have taken it that this is not the case. This is largely the result of preconceptions about what a natural science modelled after the proposals of Aristotle's Posterior Analytics would look like. I argue that these modern preconceptions are incorrect, and that, while the Analytics leaves a variety of issues unanswered that a practicing biologist must have answers to (hence Parts of Animals I), Aristotle's biological practice conforms to the Analytics model. It is further argued that establishing this claim requires reading philosophically through entire biological treatises--that is, one will miss the logical structure by following the usual practice of 'sampling' these treatises rather than reading them systematically.

An important theme to have emerged from the new experimentalist movement is that much of actual scientific practice deals not with appraising full-blown theories but with the manifold local tasks required to arrive at data, distinguish fact from artifact, and estimate backgrounds. Still, no program for working out a philosophy of experiment based on this recognition has been demarcated. I suggest why the new experimentalism has come up short, and propose a remedy appealing to the practice of standard error statistics. I illustrate a portion of my proposal using Galison's experimental narrative on neutral currents.

If classics of science were to be defined as works that mark scientific revolutions, in the sense of sharp shifts in research tradition, then none of the three works discussed in our symposium quite qualifies. I briefly indicate the fate of each. While impressed by his argument, I express some reservations about Lennox's claim to have dissolved the "problem of demonstration" for Aristotle's De Partibus Animalium. I question Finocchiaro's challenging assertion that in structuring the Dialogo as he did, Galileo "operated within the restrictions" laid on him. Finally, I argue that the legacy of Newton's Opticks was in crucial respects a divided one for the generations that followed.

Some very persuasive arguments have been put forward in recent years in support of the disunity of science. Despite this, one is forced to acknowledge that unification, especially the practice of unifying theories, remains a crucial aspect of scientific practice. I explore specific aspects of this tension by examining the nature of theory unification and how it is achieved in the case of the electroweak theory. I claim that because the process of unifying theories is largely dependent on particular kinds of mathematical structures it is possible to have a theory that displays a degree of unity at the level of theoretical structure without an accompanying ontological unity or reduction. As a result, unity and disunity can coexist not only within science but within the same theory.

We consider Helen Longino's proposal that "ontological heterogeneity", "complexity of relationship", and "the non-disappearance of gender" are criteria for good science and cannot be separated into cognitive and social virtues. Using a research program in neuroendocrinology investigating a hormonal basis for sex-differentiated lateralization as a case study, the authors disagree concerning whether the first two criteria can be construed as criteria for good science. Concerning the non-disappearance of gender criterion, we argue that its appropriateness is context specific, and that its cognitive and social formulations are separable and should be construed as such.

Newborn screening for the genetic disease phenylketonuria (PKU) is generally considered the greatest success story of applied human genetics. Even those generally skeptical of the value of genetic testing often comment enthusiastically on this program. In fact, PKU screening has been plagued with serious problems since its inception in the early 1960s. This essay describes some of these difficulties and asks what lessons they hold for other screening programs. It also argues that realism in our assessment of such programs requires that we pay greater attention to the concrete experience of families. How screening should work in theory is of less importance than how it does work in practice.

Studies of science are usually addressed in a representational idiom which takes it for granted that the defining characteristic of science is its production of representations of nature. Here I advocate the move to a performative idiom which thematises the agency of machines and human beings. This move leads to a temporally emergent and posthumanist analysis of scientific culture and practice, and promises an antidisciplinary synthesis of the science-studies disciplines, spanning an impure sociology of science, a displacement of the traditional philosophical problematics of realism and incommensurability, and a historiography of science centered on performative intertwinings of science, technology and society.

When is it more rational to think for oneself, and when to defer to the relevant expert? Expertise is either closed-system oriented or layperson oriented. The first sort is concerned primarily with controlling and manipulating a discipline's defining set of variables as a closed or relatively closed system. The second sort is simply in the business of "advising" clients. I argue that when expert claims are of the first sort, the layperson must defer to the experts; but when experts either extrapolate from their closed systems, or if they are of the second sort, then the layperson should think for herself.

I argue against the assumption that the influence of non-cognitive values must lead to bad science and against the methodological norm that seems to some philosophers to follow from it, viz. that a good philosophy of science should analyze the morally and politically neutral production of good science. Against these, I argue for the assumption that non-cognitive values are compatible with good science and for the metaphilosophical norm that a good philosophy of science should allow us to see whether and how non-cognitive values influence good science. In pursuit of one of its scandalous goals, viz. determining whether and when gender politics influence good scientific work, feminist philosophy of science is well served by this methodological norm.

Popper's conception of methodology and its relationship to epistemology is examined, and found wanting. Popper argues that positivist criteria of demarcation fail because they are attempts to discover a difference in the natures of empirical science and metaphysics. His alternative to naturalism is that a plausible criterion of demarcation is a proposal for an agreement, or convention. But this conventionalism about methodology is misplaced. Methodological rules are conventions, but which methodological rules are followed by scientists is not itself a matter of convention. This casts doubt upon the status of Popper's famous criterion of demarcation.

Research on the oxidation of alloys supports the claim that natural scientists can and do use ideal type concepts when confronted with analytical or computational intractability. In opposition to those who collapse ideal types into 'standard' theoretical concepts, I argue that ideal types possess a unique structure, function and axiology. In phenomenologically complex situations, scientists use these features to articulate experiment with theory generally and in particular to discover new boundary conditions. This conceptual articulation is achieved using models rather than objective perceptual attributes alone. The analysis supports a claim of local rather than global identities of methodology.

The status of the vacuum in relativistic quantum field theory is examined. A sharp distinction arises between the global vacuum and the local vacuum. The concept of local number density is critically assessed. The global vacuum state implies fluctuations for all local observables. Correlations between such fluctuations in space-like separated regions of space-time are discussed and the existence of correlations which are maximal in a certain sense is remarked on, independently of how far apart those regions may be. The analogy with the mirror-image correlations in the singlet state of two spin-1/2 particles is explained. The connection between these maximal correlations and the well-known violation of the Bell inequality in the vacuum state is discussed, together with the way in which the existence of these correlations might be exploited in developing a vacuum version of the Einstein-Podolsky-Rosen argument. The recent relativistic formulation of the Einstein-Podolsky-Rosen argument by Ghirardi and Grassi is critically assessed with particular reference to the vacuum case.

We analyse aspects of the Big Bang program in modern cosmology, with special focus on the strategies employed by its adherents both in defending the theory against anomalous data and in dismissing rival accounts. We illustrate this by critically examining four aspects of Big Bang cosmology: the interpretation of the cosmic red-shift, the explanation of the cosmic background radiation, the inflation hypothesis and the search for dark matter. We conclude that the Big Bang's dominance of contemporary cosmology is not justified by the degree of experimental support it receives relative to rival theories.

Optimization models treat natural selection as a process tending to produce maximal adaptedness to the environment, measured on some "criterion scale" defining the optimal phenotype. These models are descriptively adequate if they describe the outcomes of evolutionary processes. They are dynamically adequate if the variables which describe the outcomes also are responsible for those evolutionary outcomes. Optimality models can be descriptively adequate, but dynamically unrealistic. Relying on cases from evolutionary ecology, I provide reasons to question the dynamic adequacy of optimality models, and offer reasons for distinguishing, at least at a theoretical level, between satisficing and optimizing.

A new model of scientific explanation is proposed: the covering theory model. Its goal is understanding. One chooses the appropriate scientific theory and a model within it. From these follows the functioning of the explanandum, i.e. the way in which the model portrays it on one particular cognitive level. It requires an ontology and knowledge of the causal processes, probabilities, or potentialities (propensities) according to which it functions. This knowledge yields understanding. Explanations across cognitive levels demand pluralistic ontologies. An explanation is believed or only accepted depending on the credibility of the theory and the idealizations in the model.

Marr's theory of vision is often said to exemplify wide psychology. The claim rests primarily on Marr's appeal to a high-level theory of computational functions. I agree that Marr's theory embodies an exemplary form of wide psychology; what is exemplary about it is the appeal to perceptual tasks. But I argue that the result of invoking task considerations is that we should not adhere to Marr's own conception of proper explanatory width. There is no one conception of width that has a privileged place in explanation.

Specific methodological limitations of traditional sex differences research are uncovered by feminist psychologists who argue for a shift toward a theoretical appropriation of gender that reveals its significance as a site of ongoing situated social regulation. I argue that such a shift has important implications for studies on gender and cognition, and that such studies have the potential to significantly expand our understanding of the contextual and situated nature of both social and "non-social" cognition.

By developing an elaborate allegory, this paper attempts to show that the advertised aim of the Human Genome Project, to sequence the entire 3 billion base pair primary sequence of the nucleic acid molecules that constitute the human genome, does not make scientific sense. This raises the questions of what the real aim of the project could be, and why the molecular biological community has chosen to offer the primary sequence as the objective to be funded, when identifying functionally important sub-regions of the genetic material is both a far more useful and an independently attainable aim.

The success of chemistry is directly credited to the capacity of instruments to provide human contact with the structures of physical reality. Empiricist philosophers have given scant attention to instruments as a separate topic of inquiry on the grounds that the reliability of instruments is reducible to the epistemology of common sense experience. I argue that the reliability of many modern instruments is based on their design as analogical replication of natural systems. Scientists designed absorption spectrometers as artificial technological replicas of familiar physical systems. Such designs are generated by analogical projections of theoretical insights from known physical systems to unknown terrain. Instrumentation enables scientists to extend theoretical understanding to previously hidden domains. After exploring this analogical function of instruments, the nature of instrumental data is discussed, followed by an explicit rejection of both skepticism and naive realism. In the end I argue for an experimental realism which lacks any theory-neutral access to the fundamental analogies of nature.

The paper introduces cultural studies of science as an alternative to the "legitimation project" in philosophy and sociology of science. The legitimation project stems from the belief that the epistemic standing and cultural authority of the sciences need general justification, and that such justification (or its impossibility) arises from the nature or characteristic aim of the sciences. The paper considers three central themes of cultural studies apart from its rejection of these commitments to the legitimation project: first, focus upon the sciences as ongoing and dynamic practices; second, a deflationary and non-representationalist approach to understanding scientific knowledge; and third, foregrounding questions about the significance of scientific practices, statements, and the objects they engage, and how that significance changes within ongoing practices.

It is shown that, for technical reasons, the additivity of variance criterion employed by Lloyd (1988) to define a unit of selection is, in almost all models of selection, inconsistent with the possibility that genes are sometimes not the unit of selection. A case when the latter view is particularly attractive is that of heterosis, and the additivity criterion is inadequate in even such an extreme case. The connection between that criterion and the so-called "fundamental theorem of natural selection" is briefly explored. Skepticism is expressed about the value of measures such as variance in efforts to resolve any of the disputes about the "units of selection."
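
To illustrate the kind of difficulty at issue (our own textbook-style calculation, not Lloyd's or the author's formalism), a single locus with heterozygote advantage can show zero additive variance at equilibrium even though selection is acting there:

# One-locus variance decomposition under Hardy-Weinberg proportions.
# With heterozygote advantage (overdominance) the additive variance V_A
# vanishes at the polymorphic equilibrium while the dominance (non-additive)
# variance V_D does not -- a stock illustration of why additivity-based
# criteria sit awkwardly with heterosis.

def variance_components(p, w_AA, w_Aa, w_aa):
    q = 1 - p
    a = (w_AA - w_aa) / 2            # half the homozygote difference
    d = w_Aa - (w_AA + w_aa) / 2     # dominance deviation
    alpha = a + d * (q - p)          # average effect of allele substitution
    V_A = 2 * p * q * alpha ** 2     # additive genetic variance
    V_D = (2 * p * q * d) ** 2       # dominance variance
    return V_A, V_D

# Symmetric overdominance: the polymorphic equilibrium is at p = 0.5.
V_A, V_D = variance_components(p=0.5, w_AA=1.0, w_Aa=1.2, w_aa=1.0)
print(f"V_A = {V_A:.4f}, V_D = {V_D:.4f}")   # V_A = 0, V_D = 0.01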

Debates over the significance of the particle concept, and the problem of locality (how do we represent localized phenomena?), appear to presuppose that particles and observed phenomena are things rather than events. Well-known theorems (Hegerfeldt, Reeh-Schlieder), and a recent variant of Hegerfeldt's theorem due to David Malament, present a problem of locality only given the tacit appeal to the concept of thing, in fact an individual, in a sense contrary to particle indistinguishability. There is no difficulty with the particle concept per se, but it is a global construction more than one step removed from events actually observed, which are represented by local integrals over self-adjoint field densities.

Differing views on reduction are briefly reviewed and a suggestion is made for a working definition of 'approximate reduction'. Ab initio studies in quantum chemistry are then considered, including the issues of convergence and error bounds. This includes an examination of the classic studies on CH2 and the recent work on the Si2C molecule. I conclude that chemistry has not even been approximately reduced.

This article examines how a molecular "solution" to an important biological problem (how is antibody diversity generated?) was obtained in the 1970s. After the primarily biological clonal selection theory (CST) was accepted by 1967, immunologists developed several different contrasting theories to complete the CST. To choose among these theories, immunology had to turn to the new molecular biology, first to nucleic acid hybridization and then to recombinant DNA technology. The research programs of Tonegawa and Leder that led to the "solution" are discussed, and some of their strategies and heuristics are broadly characterized: (1) to what extent does the new recombinant DNA technology provide what the scientists claim is "direct evidence," what does that term mean, and what are the implications of that claim for biological "realism"; and (2) is this episode one of reduction, partial reduction, or explanatory extension, and what do these terms mean in the context of a successful molecular "solution" to a biological problem.

Using the example of Newton's Opticks, the author develops the concept of 'classic' as applied to landmark works in the history of the sciences. A discussion of themes drawn from H.-G. Gadamer and T. Kuhn is followed by an introduction of the notions of the texture and contexture of scientific works, conceived as the result of an author's weaving together foreground and background concerns. These notions assist in understanding how certain works can exercise a continuing appeal to both specialists and nonspecialists. The essay concludes with reflections on the pedagogical purpose of using classic scientific texts in university education.

Since the founding of psychophysics in the latter half of the nineteenth century, controversy has raged over the subject matter of psychophysical laws. Originally, Fechner characterized psychophysics as the science describing the relation between physical magnitudes and the sensations these magnitudes produce in us. Today many psychophysicists would deny that sensation is or could be a topic of psychophysical investigation. I consider Savage's (1970) influential objections to the possibility of such an investigation and argue that they depend upon (i) holding psychophysics to higher standards than those to which we hold other sciences; and (ii) misrepresenting Fechner's stated goals for psychophysics.

Some desiderata for scientific confirmation are formulated in the light of a tentative scientific world view. Bayesian confirmation theories generically satisfy most of these desiderata, but one of them, "the strategy of ascent," fits best in a tempered personalist version of Bayesianism. There are both empirical and rational components, dialectically combined, in tempered personalism. The question of explanation vs. prediction is treated in a Bayesian manner, and it is found that both operations are susceptible to characteristic systematic errors. If these are eliminated, however, then explanation and prediction provide equally good evidential support for hypotheses.

Many ecologists have dismissed alleged ecological laws as tautological or trivial. This essay investigates the epistemological status of one prominent such "law," the population-growth thesis, and argues for four claims: (1) Once interpreted, the thesis cannot be denied the status of empirical law on the grounds that it is always and everywhere untestable. (2) Contrary to Peters' (1991) claim, some interpretations of the thesis have significant heuristic power. (3) One can use the reasoning of Brandon (1990), Lloyd (1987), and Sober (1984) to show that some interpretations of the thesis are not a priori. (4) Even if the thesis is a priori, it has explanatory power as a "schematic law."
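
One standard reading of the thesis (our gloss; the abstract does not spell it out) is the Malthusian growth law: in the absence of limiting factors a population of size N grows as

$$\frac{dN}{dt} = rN, \qquad N(t) = N_0\,e^{rt},$$

with the logistic equation $dN/dt = rN(1 - N/K)$ as the familiar density-limited refinement. The dispute is over whether statements of this kind are testable empirical laws or near-tautologies.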

This paper consists of two parts: the first is a brief historical summary of relevant discussions to date involving members of the panel; the second part is a discussion of the new contextualism within science studies, the consequent move towards the cultural study of scientific knowledge, and what this means for intellectual/cultural historians of science in terms of specific procedures. Thus, my role on this panel, as I understand it, will be to play the sociologically and philosophically minded historian to the sociologically and historically minded philosophers as all of us attempt to adapt cross-disciplinary procedures to our specific disciplinary needs.

Social scientists regularly make use of multivariate models to describe complex social phenomena. It is argued that this approach is useful for modelling the variety of cognitive and social factors contributing to scientific change, and superior to the integrated models of scientific change currently available. It is also argued that care needs to be taken in drawing normative conclusions: cognitive factors are not intrinsically more "rational" than social factors, nor is it likely that social factors, by some "invisible hand of reason," generally work to produce scientific success. A multivariate model of the biasing factors within a scientific community at particular times is developed. This model, which is an example of work in social epistemology, yields normative conclusions.

Faraday's view of the mutual relation of speculative theories and laws of nature implies that there should be a procedure leading from speculative considerations to a system of facts and laws in which theories no longer play any role. To determine the degree to which Faraday's claims correspond to his practice, the way in which he arrives at an explanation of Arago's effect is analyzed. The thesis is proposed that he does indeed have a procedure for setting theories aside; it is intimately connected with certain methodological guidelines of his experimentation.

I characterize and then complicate Solomon, Thagard, and Goldman's framing of the issue of integrating cognitive and social factors in explaining science. I sketch a radically different framing which distributes the mind beyond the brain, embodies it, and has that mind-body-person become, as s/he always is, an agent acting in a society. I also find problems in Solomon's construal of multivariate statistics, Thagard's analogies for multivariate analysis, and Goldman's faith in the capacity of the community of users of scientific method to home in on true beliefs.

Cognitive and social explanations of science should be complementary rather than competing. Mind, society, and nature interact in complex ways to produce the growth of scientific knowledge. The recent development and wide acceptance of the theory that ulcers are caused by bacteria illustrates the interaction of psychological, sociological, and natural factors. Mind-nature interactions are evident in the use of instruments and experiments. Mind-society interactions are evident in collaborative research and the flow of information among researchers. Finally, nature-society interactions are evident in the role of granting agencies in determining the availability of instruments and the funding of experiments.

There is a common (although not universal) claim among historians and philosophers that Copernican theory predicted the phases of Venus. This claim ignores a prominent feature of the writings of, among others, Copernicus, Galileo, and Kepler: the possibility that Venus might be self-illuminating or translucent. I propose that such oversimplifications of the history of science emerge from "psychological predictivism," the tendency to infer from "E is good evidence for H" to "H predicts E." If this explanation is correct, then in cases where the evidence is less blatant, the history of science (and the philosophies of science that rely on it) has probably been seriously distorted in a predictivist direction.

Inferential statistical tests, such as analysis of variance, t-tests, chi-square, and Wilcoxon signed-ranks tests, now constitute a principal class of methods for the testing of scientific hypotheses. In this paper I will consider the role of one statistical concept (statistical power) and two statistical principles or assumptions (homogeneity of variance and the independence of random error) in the reliable application of selected statistical methods. I defend a tacit but widely deployed naturalistic principle of explanation (E): Philosophers should not treat as inexplicable or basic those correlational facts that scientists themselves do not treat as irreducible. In light of (E), I contend that the conformity of epistemically reliable statistical tests to these concepts and assumptions entails at least the following modest or austere realist commitment: (C) The populations under study have a stable theoretical or unobserved structure that metaphysically grounds the observed values; the objects therefore have a fixed value independent of our efforts to measure them. (C) provides the best explanation for the correlation between the joint use of statistical assumptions and statistical tests, on the one hand, and methodological success on the other.
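To make the assumptions concrete (the data are simulated and the snippet is only a sketch of the kind of test the paper discusses, not an analysis from it), one can probe homogeneity of variance before running a two-sample t-test and choose the test variant accordingly:

```python
import numpy as np
from scipy import stats

# Simulated samples from two hypothetical populations.
rng = np.random.default_rng(1)
a = rng.normal(loc=0.0, scale=1.0, size=40)
b = rng.normal(loc=0.5, scale=1.0, size=40)

# Levene's test probes the homogeneity-of-variance assumption.
_, p_var = stats.levene(a, b)
equal_var = p_var > 0.05

# Use the pooled-variance t-test only if the assumption looks tenable;
# otherwise fall back to Welch's unequal-variance t-test.
t_stat, p_val = stats.ttest_ind(a, b, equal_var=equal_var)
print(t_stat, p_val)
```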

Two recently proposed quantum experiments are analyzed. The first allows one to find an object without "touching" it. The second allows one to teleport quantum states while transmitting only a very small amount of information. It is shown that in the standard approach these experiments are in conflict with the intuitive notions of causality and locality. It is argued that the situation is less paradoxical in the framework of the many-worlds interpretation of quantum theory.

In a traditional view of science we come to fully believe the main accepted theories. Some of the hypotheses "possible for all that science tells us" seem more likely than others: enter probability as grading the possibilities left open. Probabilism contends with this tradition. Richard Jeffrey told us never to resolve doubt but only to quantify it, and to give maximal probability only to tautologies. Despite severe difficulties, I shall argue that the traditional view is reconcilable with probabilism. I will propose a single unified account with conditional personal probability as basic, allowing for full belief in empirical theories, with our probabilities grading the possibilities left open.

People often make trait judgments about themselves and others. Social perception researchers have attempted to study the accuracy of such judgments. Such studies raise the philosophical/conceptual question of what it means to say that a person's judgment is accurate. Two attempts have recently been made to taxonomize current research in terms of the notion of accuracy which has been adopted. My aim in this paper is twofold: first, to argue that the proposed philosophical taxonomies are problematic and, hence, should be abandoned, and second, to recommend adoption of an alternative "minimalist" notion of accuracy.

I argue that Fodor's analysis of ceteris paribus laws fails to underwrite his appeal to such laws in his sufficient conditions for representation. It also renders his appeal to ceteris paribus laws impotent against the major problem for his theory of representation. Finally, Fodor's analysis fails to provide useful solutions to the traditional problems associated with a thoroughgoing understanding of ceteris paribus clauses. The analysis, therefore, fails to bolster Fodor's position that special science laws are of necessity ceteris paribus laws and that one must recognize them as scientifically legitimate.

Fresnel's theory of light (a) was impressively predictively successful, yet (b) was based on an "entity" (the elastic-solid ether) that we now "know" does not exist. Does this case "confute" scientific realism, as Laudan suggested? Previous attempts (by Hardin and Rosenberg and by Kitcher) to defuse the episode's anti-realist impact are examined. The strongest form of realism compatible with this case of theory-rejection is in fact structural realism. This view was developed by Poincaré, who also provided reasons to think that it is the only realist view of theories that really makes sense.