Objections to reliabilist theories of knowledge and justification have looked insuperable. Reliability is a property of the process of belief formation. But the generality problem apparently makes the specification of any such process ambiguous. The externalism of reliability theories clashes with strongly internalist intuitions. The reliability property does not appear closed under truth-preserving inference, whereas closure principles have strong intuitive appeal. And epistemic paradoxes, like the preface and the lottery, seem unavoidable if knowledge or justification depends on the frequency with which a process generates true beliefs. The present theory has the conceptual resources to meet these challenges. It requires that a justificatory belief-formation process be intentionally applied. It distinguishes the justification of beliefs from that of the believer. And it avoids a frequency interpretation of reliability by introducing a notion of the normalcy of conditions under which processes are intentionally used.

Pursuant to criticism, this paper revisits the relation between the theses of empirical equivalence and evidential underdetermination. I argue against some antirealist strategies for fixing the empirical commitments of underdetermined theories.

Leplin attempts to reinstate the common sense idea that theoretical knowledge is achievable, indeed that its achievement is part of the means to progress in empirical knowledge. He sketches the genesis of the skeptical position, then introduces his argument for Minimalist Scientific Realism -- the requirement that novel predictions be explained, and the claim that only realism about scientific theories can explain the importance of novel prediction.

This paper criticizes the attempt to found the epistemological doctrine that all theories are evidentially underdetermined on the thesis that all theories have empirically equivalent rivals. The criticisms focus on the role of auxiliary hypotheses in prediction. It is argued, in particular, that if auxiliaries are underdetermined, then the thesis of empirical equivalence is undecidable. The inference from empirical equivalence to the underdetermination of total theories would seem to survive the criticisms, because total theories do not require auxiliaries to yield observational consequences. It is shown that, nevertheless, underdetermination cannot be established for total theories.

Some recent theories in theoretical physics are not subject to epistemic evaluation by empiricist standards of evidential warrant. The advantage of these theories is not pragmatic but explanationist; they fail to yield testable consequences that distinguish them from earlier theories. But this is essentially a technological limitation, rather than a theoretical defect. There is an explanation, itself confirmed by empiricist standards, of the unconfirmability of these theories. This paper considers what epistemic stance is proper in this situation, and explores the prospects for justifiable change to an explanationist methodology capable of warranting theories that transcend the range of our experience.

The fact that the goals and methods of science, as well as its empirical conclusions, are subject to change is shown to allow at once for: (a) the objectivity of warrant for knowledge claims; (b) the absence of a priori standards from epistemology; (c) the normative character of epistemology; and (d) the rationality of axiological innovation. In particular, Laudan's attempt to make axiological constraints undercut epistemic realism is confuted.

This paper defends the Causal Theory of Reference against the recent criticism that it imposes a priori constraints on the aims and practices of science. The metaphysical essentialism of this theory is shown to be compatible with the requirements of naturalistic epistemology. The theory is nevertheless unable to forestall the problem of incommensurability for scientific terms, because it misrepresents the conditions under which their reference is fixed. The resources of the Causal Theory of Reference and of the traditional cluster or "network" theory of meaning for handling problems of commensurability are compared, and an alternative approach is recommended.

In response to recent recognition of the complexities of scientific change, discussion of the objectivity and the rationality of science has focused on criteria of theory choice. This paper addresses instead the rationality of scientific decisions at the level of ongoing research. It argues that whether or not a realist view of theories is compatible with the historical discontinuities of scientific change, certain realist assumptions are crucial to the rationality of research. The researcher must presume that questions about the existence and the properties of at least some of the "unobservable" entities he theorizes about or experiments on are answerable on the basis of his work. The rationale of research cannot be understood solely in terms of the desiderata of instrumental utility or the empirical adequacy of theories.

A realist interpretation of successful science is defended against a historical induction to the ultimate failure of current science from the failure of theories which once excelled by current standards. The defense requires (1) restrictions on the forms of success which realism, by its own lights, must explain, (2) referential stability through theory changes where the rejected theory achieves such success, and (3) degrees of truth for scientific statements.

Recent discussion of the problem of the conclusive falsification of scientific hypotheses has generally regarded the Duhemian Thesis (D-Thesis) as both true and interesting [10] but has dismissed the claim that disconfirmed hypotheses can be retained in explanations of the disconfirming evidence as either trivial [3] or unargued [12]. This paper rejects these positions. First, the status, in the argument for the D-Thesis, of the claim that auxiliary assumptions are necessary for the derivation of evidential propositions from hypotheses is examined. It is concluded that depending on this status, the D-Thesis is either trivially true or unargued. Then the retention of contextually disconfirmed hypotheses is discussed. It is found that the use of such hypotheses in explanations of the disconfirming evidence is mediated by principles of scientific methodology. A new thesis is presented connecting the problem of conclusive falsification with changes in methodology. The argument for this thesis introduces a new analysis of methodological principles as inductive generalizations from scientific practice. Finally, the contribution of this analysis to the understanding of scientific change is described.