Empiricism

New Dictionary of the History of Ideas
COPYRIGHT 2005 The Gale Group, Inc.

EMPIRICISM.

Empiricism is a family of theories of knowledge (epistemology) claiming that all knowledge about the extant universe is based on experience, primarily on perception via the five senses. Some empiricists add introspection, a moral sense, or a special sensitivity to religious or aesthetic experience. Strong empiricists claim that all knowledge whatever derives from experience. They must show how empiricism can handle apparently a priori knowledge, including logic, mathematics, and ordinary truths such as "Bachelors are unmarried males." Empiricism also provides an account of mind, language, and learning. The traditional contrast of empiricism is with rationalism and nativism, the view that we do possess a priori knowledge, either furnished by reason alone or innate. Empiricists tend to
perceptualize the mind and its operations, while rationalists tend to intellectualize it. With its down-to-earth emphasis on concrete experience and clarity, empiricism has flourished in Anglophone countries, whereas the more speculative rationalist and Kantian ideas have flourished on the Continent. This is one aspect of the divide between Continental philosophy and Anglo-American, "analytic," and "linguistic" philosophies.

In the twenty-first century nearly everyone is an empiricist in the everyday sense of taking experience seriously as a basis for knowledge claims about the natural world and human behavior, but most philosophers reject traditional, doctrinaire empiricism—the view that human sense experience provides a special connection of the knowing mind to the world and thus provides a foundation on which knowledge can build, step by step.

A Thumbnail History

In ancient times Aristotle was an empiricist relative to Plato's other-worldly rationalism. Modern empiricism began around 1600 with Francis Bacon (1561–1626), who promoted a new, experimental philosophy combining experience and reason, and with Galileo Galilei (1564–1642), who united experimental observation with a Platonic mathematical framework. Thomas Hobbes (1588–1679) further enriched early empiricist thinking, but the "big three" British empiricists were John Locke (1632–1704), George Berkeley (1685–1753), and David Hume (1711–1776). Locke first systematically expounded modern empiricism (see below). He was followed in the eighteenth century by Berkeley, notorious for his subjective idealism, the radical empiricist view that there are no material objects, that everything can be analyzed into minds and their ideas. Hume then took the further step of denying that there is even a substantial mind or ego. We introspect only a bundle of passing impressions. The mind is governed by natural laws of association, analogous to Isaac Newton's (1642–1727) law of gravitation, without needing an executive overseer. Hume also denied that inductive inference can be justified by logical argument, but he defended a wider conception of rationality (or at least sensible action) based on our natural impulses to believe and act. As he wrote in A Treatise of Human Nature (1739–1740) and its popularization, An Enquiry Concerning Human Understanding (1748), passion or custom, not reason, is "the great guide of human life."

The leading nineteenth-century empiricist was John Stuart Mill (1806–1873), who developed a full-fledged phenomenalism. Mill held that simple induction by enumeration (ravens 1, 2, 3, … n are black; therefore all ravens are black) is sufficient to support both science and mathematics: even the principles of logic and mathematics are very general empirical laws. Twentieth-century empiricists such as Bertrand Russell (1872–1970), George Edward Moore (1873–1958), and Alfred Jules Ayer (1910–1989) denied this, as did the logical empiricists of the Vienna Circle, who contended that the laws of logic and mathematics are both a priori and analytically true, that is, true by virtue of logical form and our linguistic conventions, hence completely empty of empirical content. By contrast, many twentieth-century thinkers, following Willard Van Orman Quine (1908–2000), have returned to a more naturalistic pragmatism.

The most damaging criticisms of British empiricism were leveled first by Immanuel Kant (1724–1804) and by the German and British idealists who followed him, then by Ludwig Wittgenstein (1889–1951), Gilbert Ryle (1900–1976), Quine, Wilfrid Sellars (1912–1989), Thomas Kuhn (1922–1996), and other twentieth-century figures, who attacked the entire Cartesian-Lockean conception of mind, experience, and language. Shocked by Hume's apparent skepticism about causality and Newtonian science, Kant synthesized rationalism and empiricism, while critically transcending both. The human mind itself furnishes the conceptual and rational apparatus necessary to organize our experience, as he argued in his Critique of Pure Reason (1781). "Concepts without percepts are empty; percepts without concepts are blind." Kant retained a vestige of the rationalist idea that we possess a special sort of intuition that enables us to make substantive, "synthetic" claims about the world that are nonetheless known a priori. He held that Euclidean geometry and the basic principles of Newtonian mechanics are such "synthetic a priori" truths. Post-Kantian empiricists deny this.

Foundational Empiricism

The traditional empiricists and rationalists were foundationists in epistemology. Foundationism postulates a base set of propositions that play a distinctive epistemic role plus a superstructure (comprising the bulk of our knowledge) appropriately related to the base. The empiricists and rationalists added the constraints that the basic statements must be certain and self-justifying (self-evident to reason for the rationalists and evident to the senses for the empiricists) and that the relation of base to superstructure be one of logical inference: deductive and perhaps inductive logic must suffice to generate the superstructure from the base. The justification is one-way or "linear" in the sense that the various layers of superstructure depend only on lower layers and, hence, ultimately on the base for their justification. Euclidean geometry provides the intellectual model. In this case the inferences are strictly deductive.

Given such a Euclidean geometry–inspired model, one wants the largest possible superstructure from the narrowest and most certain possible base. Two main problems stand in the way: the base problem (whether the base itself can be adequately justified) and the superstructure problem (whether the inferential resources are sufficient to support the desired superstructure on the base). From the beginning, empiricists have addressed the second problem by restricting the superstructure to claims within reach of observation and experiment and by developing the resources of logic, probability theory, and statistical inference. The British empiricists did not fully recognize the seriousness of the first problem.

Within this foundationist framework, Locke established the overall structure of a specifically empiricist theory in An Essay Concerning Human Understanding (1690), one of the founding works of the Enlightenment:

All simple ideas come from experience. There are no innate ideas. Contrary to nativists such as René Descartes (1596–1650), the mind is a tabula rasa—that is, a blank slate—at birth.

Ideas of solidity, movement, number, and so forth, resemble features of the real world (primary qualities), whereas sensations of color, sound, taste, and so forth, do not resemble the physical powers (secondary qualities) in objects that produce these sensations in the mind. They are mind-dependent.

Thus knowledge, which is the intellectual recognition of the agreement or disagreement of ideas, cannot go beyond the limits of experience. (Locke's is an "idea empiricism," but knowledge requires an operation of mind in addition to the presence of ideas.)

Neither can meaningful language transcend experience, since the meaning of a word is an idea in the mind. Having the appropriate idea in mind is what distinguishes a person's from a parrot's uttering, "I want a cracker."

We each learn our native language by attaching public noises or marks (words) to ideas. We can then communicate our ideas to others by making the appropriate noises or marks.

Thought is a connected sequence of ideas.

The immediate objects of perception and thought are ideas in the mind, which in turn represent external things and situations (doctrine of representative perception, two-object theory of cognition).

All existing things are concrete and particular.

Empiricists immediately encountered the superstructure problem. Locke recognized that most meaningful words are general and many are abstract (rather than proper names of concrete objects, e.g., canine versus Lassie), so how do we get the corresponding ideas (meanings) from experience, which furnishes only particular ideas? From an image of a particular triangle, said Locke, we can abstract from its being equilateral, isosceles, or scalene, and thus construct a general idea of a triangle that is "all of these and none of these at once." Berkeley and Hume improved on this unsatisfactory solution, but to this day empiricist abstraction accounts face serious difficulties. Hume added to the superstructure problem by denying the adequacy of reason alone to produce, from particular experiences, either (a) moral judgments, about what one ought to do, or (b) inductive conclusions, such as "All ravens are black" and Newton's laws. The former is his point that one cannot deduce "ought" from "is" or value judgments from objective facts, and the latter is the aforementioned problem of induction. Meanwhile, Berkeley had challenged Locke's empirical base by rejecting his distinction between primary and secondary qualities.

The Appearance-Reality Distinction

The two problems resurrect the old difficulty of bridging the gap between appearance and reality. Seventeenth-century advocates of the new science joined Plato in sharply distinguishing the world of everyday experience from underlying reality. Empiricists, with their limited resources, have tended to stick close to the experiential surface of the world by variously narrowing the gap between appearance and reality, denying the existence of an underlying reality altogether, adopting the skeptical position that we simply cannot know it, or rejecting all talk of a reality beyond experience as "metaphysical" and hence meaningless. In short, they have wavered over commitment to the reality of unobservable entities and processes.

Locke denied that we can know the real essences of things. Our classifications are not natural but artificial—conventions made for human convenience. Hume and the later positivists, with their verifiability theory of meaning, ruled out metaphysics as meaningless. Ernst Mach (1838–1916), the Viennese physicist and positivist, denied the existence of atoms and developed a phenomenalistic account of the world. Berkeley had denied the existence of matter with his principle, "To be is to be perceived or to perceive" (Principles of Human Knowledge, 1710). Only minds and ideas exist. Does the cat then go out of existence when it disappears behind the sofa? No, because God (the biggest mind) still perceives it, replied Berkeley. Mill later used a logico-linguistic device to remove the need for God and thus obtain a full-fledged phenomenalism. In An Examination of Sir William Hamilton's Philosophy (1865), he attempted to reduce physical objects to "permanent possibilities of sensation," expressible by (impossibly) long series of statements about what a person would experience or would have experienced in such-and-such a situation. Russell, using the new symbolic logic to the same end, attempted to reduce mind itself to a logical construction out of experiences. He took the same line for the postulated theoretical entities of physics: "Wherever possible, logical constructions out of known objects are to be substituted for inferred entities" ("The Relation of Sense-Data to Physics," in Mysticism and Logic, 1917). This was a halfway house between realism and instrumentalism or fictionalism. If electrons are logical constructions out of actual and possible laboratory operations and the resulting observations, then they are not real entities of underlying reality; but neither are they complete fictions. Rather, electron talk is a convenient, economical façon de parler.

The Twentieth Century and Beyond

Twentieth-century thinkers abandoned or at least transformed British empiricism for its failure to solve the base and superstructure problems. These developments include: (1) The linguistic turn. Linguistic philosophers speak about terms in a language rather than, vaguely, about ideas in the mind. They also employ the full power of symbolic logic or the subtle devices of ordinary language to address the twin problems of relating subjective experience to basic statements and basic statements to superstructure. (2) The holistic turn. This is a further shift from the atomism of individual ideas or terms to
whole statements, representing completed thoughts, and even to entire languages and conceptual frameworks. In "Two Dogmas of Empiricism" (1951), Quine argued that neither individual terms nor even full statements (not even basic observation statements) can be directly correlated with experience. Moreover, the data of experience logically underdetermine our theoretical claims. (3) Rejection of the analytic-synthetic distinction, also by Quine. We cannot factor theories into purely empirical and purely analytic components, only the first of which is vulnerable. "No statement is immune to revision," not even the statements of mathematics—for example, it is now known that Euclidean geometry is not the only conceivable geometry and that it is not even true of physical space. Quine's work called into question not only the concept of analytic statement but also that of analysis as a philosophical method, for no one has provided an adequate analysis of analysis! (4) Rejection of the scheme versus content distinction by Donald Davidson, who proclaimed this the third and last dogma of empiricism. (5) Rejection of the correspondence theory of truth and of (6) the linear-foundational model of justification. These doctrines give way either to a weaker, nonlinear and fallibilist foundationism or to a coherence theory of justification based on the idea of a mutually supporting network of claims and practices. For some, pragmatic problem-solving supplants truth as a goal of research. (7) Anti-Kantian Kantianism. Despite the rejection of Kantian intuition and synthetic a priori claims, logical empiricists Hans Reichenbach (1891–1953) and Rudolf Carnap (1891–1970) and historian Kuhn in different ways defended the need for larger structures, at least temporarily immune to serious revision, in order to make sense of the history of science as well as individual cognition.
These structures are not mere hypotheses up for testing, alongside the others, for they are constitutive of experience and of normal scientific practice, in a quasi-Kantian way. To reject them would be to throw out the baby with the bath water. (8) Rejection by Karl Popper (1902–1994) and the positivists of the traditional identification of empiricism with inductivism, the view that we must gather and classify facts prior to theorizing. They developed a sophisticated, hypothetico-deductive model of scientific research, which was in turn subjected to severe criticism.

(9) Rejection of the imagist tradition that treats cognitive states or contents as little pictures before consciousness, and of (10) "the myth of the given," by Sellars and others, the idea that subjective experience provides a special, direct, infallible, nonnatural connection of knowing mind to known world. These difficulties highlight the problem of the empirical base. Insofar as our experiential claims are certain they are not about physical reality (because we have had to retreat into the certainty of our subjective sense data of the moment), and insofar as they are about reality, they are not certain (because they are now subject to override by other observers or even by widely accepted theories). The price of relevance is fallibility. Thus accepting a basic statement is a social decision. All conceptual thinking, including perception, is mediated by language (a further phase of the linguistic turn). There is no prelinguistic cognitive (conceptual) awareness. There is no thought, no fully human perception or scientific observation, prior to language. Roughly, "language games" (Wittgenstein's term) take over the role played by Kant's categories. All inquiry is thus fallible and mediated by language and by participation in an appropriate community of inquirers. The isolated Cartesian inquirer is a myth. The result is (11) the failure of phenomenalism and sense datum theories of perception and, more generally, (12) rejection of the whole Cartesian-Lockean conception of cognition and language. This conception is based on a Cartesian dualism of mind and body and, specifically, upon the privacy, immediacy, and alleged epistemological privilege of one's current mental contents. Philosophical and psychological behaviorism provided strong arguments against the Cartesian conception even for those thinkers, such as Sellars, who went beyond behaviorism.

(13) The failure of attempts to define knowledge precisely as justified true belief, which inspired (14) externalism versus internalism in epistemology. Internalism is the Cartesian-Lockean view that a person's knowledge claims must be justified in terms of the beliefs to which that person has access. The most popular form of externalism is reliabilism. According to process reliabilists, knowledge or justification consists of true beliefs formed by a reliable process whether or not the believer has sufficient Cartesian access to that process to justify it internally. Virtue epistemology, analogous to virtue ethics, is a variant of this idea: reliable beliefs are those formed by an intellectually virtuous process. (15) Recognition of the importance of tacit versus explicit knowledge (knowledge-how vs. knowledge-that) and of embodied knowledge, for example, skilled practices that we cannot fully articulate. (16) The feminist introduction of gender variables into epistemology. (17) Competing attempts to naturalize and socialize epistemology. Increasingly, empiricist philosophers work in the cognitive sciences, although few share Quine's view that epistemology will simply become a branch of psychology. Meanwhile, sociologists of knowledge regard their sociological approach as more fundamental than psychological studies of cognition. (18) The postmodern critique of empiricism. Postmodernists, including Richard Rorty and radical feminists and sociologists, regard empiricism, epistemology in general, and, indeed, the entire Enlightenment project to replace a tradition-bound life with modern life based on empirical science as a "modern" enterprise whose time is past. It is a mistake, they say, to abstract from sociohistorical contexts with their specific power and gender relations to seek the "one true account" of the world, as if there were a determinate world out there waiting for us to provide a correct description in its own language. 
Rather, say the critics, the world and our modes of inquiry are all socially constructed, as is empiricism itself. It is now time to deconstruct it. These controversial oppositions have generated "the science wars."

Although philosophical thinkers have abandoned both traditional rationalism and empiricism and although Quine, Davidson, and others have rejected the "dogmas" of empiricism and hence empiricism itself as a technical philosophical doctrine, there is a wider sense in which empiricism wins. For everyone is an empiricist in regarding observation and experience as crucial to justifying claims about the world, while almost no one believes that such claims can be defended purely a priori or on the basis of some kind of nonempirical intuition.
However, this is no longer an empiricist epistemology in the old sense, for gone is the idea that epistemology commands special resources that can provide external or transcendental justification for any enterprise. The sciences, for example, can only justify their claims internally, by applying further scientific tests and by their own fruits.


Empiricism

Europe, 1450 to 1789: Encyclopedia of the Early Modern World
COPYRIGHT 2004 The Gale Group Inc.

EMPIRICISM

EMPIRICISM. In broad terms, empiricism is the view that experience is the most important or even the only source of knowledge or sound belief. The term itself is of nineteenth-century origin, but the history of empiricism can be traced at least as far back as the ancient Greek philosopher Epicurus (341–270 B.C.E.). With the emergence of Christian civilization, however, belief in the cognitive importance of the senses was no more encouraged than was the pursuit of their pleasures. The Greek philosopher who seemed most consistent with religious belief was Plato, who thought that we needed to escape from the senses in order to achieve true knowledge or, for that matter, happiness. Though it was Aristotle who became "the Philosopher" in the medieval universities and monastic institutions, the empiricist strands in Aristotle's thought were not taken up in any systematic way. One of the best known empiricist maxims, Nihil est in intellectu quod non fuerit prius in sensu ('There is nothing in the mind that was not previously in the senses') seems to have been first stated by the great medieval Aristotelian and theologian Thomas Aquinas (1224/1225–1274). But empiricism formed no part of his enterprise of reconciling revealed religion with an Aristotelian philosophy.

At the beginning of the early modern period empiricism was not generally regarded as an intellectually defensible position. The word empiric, indeed, was used as a term of abuse, one that referred particularly to quack doctors who rejected the medical orthodoxies of their day, preferring remedies that they claimed worked in experience. While it was acknowledged that everyone has to rely, to some extent, on their sense experiences, many philosophers believed that humans have a faculty of reason that enables them to avoid the errors of the senses. Well into the early modern period the prevalent theories of knowledge and the sciences were ones that have appropriately been called "rationalist" to reflect their stress on reason and abstract argument.

These "rationalist" philosophers were sometimes important figures in the history of the mathematical sciences. This was true of the French philosopher René Descartes (1596–1650), whose view that the essence of matter consists of its geometrical properties was highly influential in the late seventeenth and early eighteenth centuries. The German philosopher Gottfried Wilhelm Leibniz (1646–1716), co-inventor of the differential calculus, could also be counted among the "rationalists." Leibniz accepted that animals learn from experience but thought that the "simple empiric" was no better than they were, insofar as he did not use his reason. For Leibniz, as for many "rationalist" philosophers, reason was the "divine spark" in humankind that set it apart from the rest of creation as capable of knowing the truths not only of mathematics but of morality and religion.

Empiricism was not an organized philosophical point of view at the beginning of the early modern period. It seems remarkable indeed that it developed at all, given the religiously motivated bias against it and the intellectual contempt in which it was held. Yet not only did it develop, but by the eighteenth century it had become and was to remain the most widely accepted philosophy of the sciences.

FRANCIS BACON AND HIS INFLUENCE

The first early modern defender of what would now be called an organized "empiricism" was the English statesman and philosopher Francis Bacon (1561–1626). Bacon maintained that the true philosopher should be neither an empiricist nor a rationalist. The empiricist, he complained, is like an ant that collects much of value but does not put it into a coherent system. The rationalist, on the other hand, is like a spider that spins wonderful constructions from within itself but whose thoughts do not connect with external reality. The true philosopher, Bacon wrote, should be like the bee that both collects much of value and puts it into an organized system.

What Bacon proposed were empirical methods of "induction," the process of arguing from a collection of instances of a phenomenon to a general conclusion. In his Novum Organum of 1620, Bacon already went beyond the method Leibniz was to dismiss as that of the "simple empiric," who notices resemblances between sequences of events (for instance, thunder repeatedly followed by rain) and arrives at a general conclusion on that basis (for instance, that thunder causes rain). Bacon stressed the importance of observing differences as well as similarities between sequences of events.

Bacon's view of science was in many ways ahead of his time, for his critical empiricism was combined with the view that knowledge would gradually increase
and that its pursuit should be cooperative and free of sectarianism. His ideas were taken up by some of the founders of the Royal Society in England, such as Robert Boyle (1627–1691), who is sometimes called "the founder of modern chemistry," and Robert Hooke (1635–1703). Indeed the very aims of the Royal Society as articulated by its first secretary, Henry Oldenburg, sound highly Baconian, especially in their opposition to mere speculation and commitment to exact observations and experiments. The achievements of the great English physicist Isaac Newton (1642–1727) added to the prestige not only of the Royal Society but also of the new "experimental philosophy" with which he was associated.

Bacon had an immense influence on the self-perception of British scientists well into the nineteenth century, and he was also held in wide esteem elsewhere in Europe, for instance by the editors of the Encyclopédie (1751–1765). In his Discours préliminaire (1751; Preliminary discourse) to the Encyclopédie, Jean Le Rond d'Alembert (1717–1783), an editor and leading contributor of scientific articles, referred to Bacon as the virtual founder of an experimental natural philosophy, and the Encyclopédie as a whole followed Bacon's tripartite scheme of knowledge.

Empiricism was revived, to some extent independently, by Bacon's younger French contemporary, Pierre Gassendi (1592–1655). Like Bacon, Gassendi was dissatisfied with the philosophical systems of his day, but he sought to avoid the extreme skepticism to which others were driven. Gassendi was inspired to a constructive philosophy by his study of Epicurus, whose philosophy he modified to cut out the points of conflict with Christianity (Gassendi was a priest). Gassendi insisted that our knowledge of the world comes only from experience, and he put forward a form of atomism as a hypothesis for explaining the world. This atomism was taken up by Robert Boyle, among others, and it was important in the development of seventeenth-century science.

Gassendi's empiricism also influenced the English philosopher John Locke (1632–1704). In his Essay Concerning Human Understanding (1690), Locke provided a sustained defense of the empiricist principle that all our ideas come from experience. Prior to Locke it was widely assumed that humans were born with an innate knowledge of certain principles, for instance of right and wrong. His critique of such innate principles was particularly valued as a corrective to the kind of dogmatism that had tended to prevail in moral and religious matters.

The empiricism of Locke was criticized from two different quarters, from followers who thought he had not gone far enough and from critics who thought he had gone too far. To some of his followers the Essay, although it seemed to point in the right direction, was not empirical enough. Locke had included a "rationalist" defense of moral truths and of the existence of God, for instance, claiming for them the kind of knowledge reserved for mathematics. He also, against empiricist principles, allowed that the mind was capable of forming abstract general ideas. To some of his empiricist successors this seemed to reinstate some of the metaphysical abstractions Locke's method and principles had managed to exclude. The Irish freethinker John Toland (1670–1722), for instance, attacked those mathematicians who turned to metaphysics in proposing such concepts as absolute space and time. For Toland the concept of a soul as an immaterial substance was another such untenable abstraction. Toland's radical interpretation of Locke brought out the natural association of empiricism with materialism. Locke sought to dissociate himself from Toland, but he was not entirely able to do so.

Locke was by some measures the most influential philosopher of the eighteenth century, at any rate in Britain and France. There was some controversy between those who supported an empiricism like Locke's and those who favored the more rationalist philosophies of Leibniz or the French priest Nicolas Malebranche (1638–1715). But for many the decision was not whether to be for or against Locke, but whether to support a more radical or a more conservative interpretation of his empiricism.

The more radical reading of Locke became very influential in France, where skepticism and materialism were attractive to a number of intellectuals or philosophes, as they were called. These included Voltaire (1694–1778), who was noted for his hostility to the ecclesiastical establishment and for his slogan Écrasez l'infâme! ('Crush the infamous thing!'). In his Lettres philosophiques (1734; Letters on the English) Voltaire praised the new experimental method of Bacon, Locke, and Newton. This English trio was also adulated by many of those involved in the Encyclopédie project. The chief editor, Denis Diderot (1713–1784), was a freethinking empiricist and materialist.

Most British philosophers who followed Locke sought to interpret or modify his philosophy so that it would be compatible with religious belief. This was true of the Irish clergyman George Berkeley (1685–1753), who argued, in effect, that a more consistent empiricism than Locke's would undermine materialism. Berkeley argued that there were no "abstract general ideas," as Locke had allowed, but that the ideas we have are always particular. The concept of "matter" was a scholastic abstraction that was not needed in order to make sense of our experience. Berkeley's conclusion that the only substances in the world were God and spirits like ourselves was generally thought to be unbelievable. His analysis of the mathematical sciences foreshadows the "instrumentalism" common in twentieth-century philosophies of physics. He allowed abstractions like "force" and "gravity" into theoretical formulae that were useful for making predictions, even though he did not think it should be supposed that anything answering to these abstractions exists in reality.

Berkeley's philosophy of the mathematical sciences was hardly acknowledged in the eighteenth century. This is surprising in view of the complaint, commonly made against empiricism, that it fails to do justice to the mathematical sciences. On an empiricist account, mathematical truths are only truths about the necessary relations between our ideas and not substantial truths about the world. Empiricism seemed for this reason an unsuccessful philosophy. The great German philosopher Immanuel Kant (1724–1804) accepted that our ideas arise in experience and that most of our knowledge is based on our senses. In his Kritik der reinen Vernunft (1781; Critique of pure reason), however, he argued that the truths of arithmetic and geometry were both necessary and substantial truths about the world, although empiricism cannot strictly allow them. Kant left a highly influential legacy of criticism of empiricism to subsequent philosophy.

THE EXTREME EMPIRICISM OF DAVID HUME

The empiricist philosopher to whom Kant was responding in his first Critique was the Scottish skeptic David Hume (1711–1776). Hume is generally regarded as the most thoroughgoing defender of empiricism and critic of abstract metaphysics of the early modern period. He accepted Berkeley's argument that we have no reason to believe in "material substances" that exist independently of our senses. But similar arguments, he thought, also brought into question the spiritual substances to which Berkeley gave pride of place. All we actually experience, according to Hume, are fleeting impressions. We are not strictly aware of the self. Hume's empiricism thus led him even further than Berkeley had gone from a commonsense position, though he sought to save the situation by arguing that we are bound to hold beliefs that are not strictly warranted by experience.

Hume claimed that he was extending the same experimental method to the sciences of human nature that Newton had shown to be so fruitful in natural philosophy. There is some dispute about how to interpret his deeply probing arguments. On the one hand, his empiricism seemed to lead him to undermine the fundamental principles of scientific inquiry. For instance, it is fundamental to empirical science to be able to assume that the future will be like the past—that we learn things from experience (such as that food nourishes us) and thus gain knowledge of the future or at least very strong grounds for belief about it. But what is the rational basis for such an assumption? An empiricist has to say that it is based on experience. But this simply begs the question. For it does not follow that, just because past experience has been a good guide to the future, it will continue to be reliable. Thus a rigorous empiricism, far from underpinning a scientific philosophy, appears to actually undermine it. Put another way, a rigorous empiricism appears to lead to skepticism. And this was an important part of Hume's legacy. At the same time Hume himself offered a way of avoiding a skeptical conclusion, maintaining that we are so constituted that we are bound to expect the future to be like the past. He even suggested, though perhaps not seriously, that nature was guiding us to the truth.

During the early modern period empiricism, despite the difficulties it entailed, gradually became the dominant theory of scientific rationality. The increased status of empirical science meant that philosophers began to frame their arguments in new ways. For instance, philosophers in the seventeenth century did not generally base their arguments for the existence of God or the immortality of the soul on experience. This was partly because they wished their conclusions to be demonstrated and not merely accepted as hypotheses. In the eighteenth century it became commonplace to accept that the existence of God was at best probable. The arguments for it were based on experience—in particular the experience of order in the universe, from which it was widely thought to be possible to infer the existence of an intelligent designer. These empirical arguments were increasingly favored by theologians. Hume himself took them seriously and examined them critically in his Dialogues concerning Natural Religion (1779). He suggested, however, that other, less obvious but equally plausible hypotheses besides that of an intelligent creator could be advanced to explain the evidence of order.

A common commitment to empiricism did not lead everyone to the same conclusions, but it did settle the terms of debate, at least for many. One of the most widely read works of fiction of the eighteenth century was Voltaire's Candide (1759), whose hero perseveres in his "optimistic" belief that God has created the best of all possible worlds despite all the terrible misfortunes that befall him and those around him. In the book, Candide has been taught some theoretical basis (which he has forgotten) for his optimism by the German rationalist Pangloss. To those whose sympathies were on the side of Pangloss and who believed in a perfect providence, Candide would have been regarded as in very poor taste. It succeeded as a satire partly because the sympathies of enough readers were on the side of the author with regard to the existence, as an empirical fact, of massive unjustifiable evil in the world.

Voltaire. Letters Concerning the English Nation. Edited by Nicholas Cronk. Oxford and New York, 1994. Translation of part of his Lettres philosophiques.

Secondary Sources

Brown, Stuart, ed. British Philosophy and the Age of Enlightenment. Routledge History of Philosophy, vol. 5. London and New York, 1996. Provides chapters on most of the empiricist philosophers of the early modern period.

Cottingham, John. Rationalism. Edited by Justin Wintle. London, 1984. Puts empiricism in the context of the rationalist philosophers, their criticisms, and alternatives.

Garrett, Don, and Edward Barbanell, eds. Encyclopedia of Empiricism. Westport, Conn., 1997. The definitive reference work on this topic.

Stuart Brown


Empiricism

Empiricism can be traced back to the dictum attributed to Aristotle, “there is nothing in the intellect that is not first in the senses,” although Aristotle himself is not usually regarded as an empiricist in the modern sense. The theoretical foundations of modern philosophical empiricism are found in the works of John Locke, George Berkeley, and David Hume, and later in those of the American philosopher William James. These philosophers inquired into the limits and scope of the human mind and argued that experience is the primary source of all knowledge. Empiricism is thus a theory of knowledge that highlights the importance of experience. The term experience can be defined narrowly, in terms of the senses, or expanded to include all forms of consciousness.

Locke’s project in his Essay Concerning Human Understanding (1690) was “to enquire into the origin, certainty, and extent of human knowledge” (Locke 1975, p. 43). Locke argued that knowledge is restricted to ideas generated by objects one experiences through the senses (ideas of sensation) or by reflection upon the mind’s operations on those ideas (ideas of reflection). Knowledge and human understanding in general (including moral notions such as justice) thus originate in experience, which involves two logical levels, sensation and reflection. Each person’s mind can be thought of as initially a blank tablet (tabula rasa), first written upon by the sensations of experience (ideas of sensation); the mind’s subsequent operations on those sensations yield the ideas of reflection, the second level of experience.

Berkeley argued in both the Principles (1710) and the Dialogues (1713) against the real existence of matter, a position summed up in his dictum “to be is to be perceived” (or to perceive). Objects can never be understood independently of our ideas of them since, for Berkeley, the object and the sensation are the same thing. Berkeley maintained that there are only ideas and minds, the locations where ideas occur; a thing is thus understood as the sum of its perceived qualities. Although for Berkeley it is impossible to think of anything except as related to a mind, both Berkeley and Locke believed that all knowledge about the existence of things and the reality of matter depends on sensory experience.

In his Enquiry Concerning Human Understanding (1748), Hume claimed that the human senses allow people to perceive, and that these perceptions (made up of impressions and ideas) are the contents of the mind. The original perception, according to Hume, is an impression, and an idea is only a copy of an impression; the difference between the two is their vividness, for when one reflects upon impressions one has ideas of them. Hume’s work does not ground impressions in a material world, arguing instead that impressions are internal subjective states that do not provide evidence of an external reality.

In his metaphysics, James wrote in a tradition that focuses on the process of consciousness based in experience—a “process metaphysics.” For James, human thought develops continuously out of interpretations of the experiences themselves. Human consciousness thus consists of experienced relations (a “stream of thought”), which are themselves experienced (affectively and effectively), as one both transforms and is transformed by these experiences. Indeed, James’s radical empiricism is pluralistic in that it allows for different points of view, different “givennesses” of reality. Because James allowed for individual perspectives on experience, it follows that one’s epistemology is itself informed by one’s experiences. Absolute unity of reality, for James, is “ever not quite,” as “fact” is based on experience and the multiple experiences of experience itself. There is thus no objective truth for James: truth is experientially cognized at the level of subjective, individual perception.

The empiricist tradition runs counter to rationalist philosophy, which holds that knowledge can be derived through the exercise of reason alone. All of the aforementioned philosophers wrote in opposition to the rationalist view, represented most notably by the French mathematician and philosopher René Descartes, that humans enter the world with innate ideas built into the mind itself. Instead, these philosophers argued that persons must rely on experience to ground their knowledge claims.

Within the social sciences, empiricism describes research methods that depend on the collection of facts and observations, some of which require verification, counting, and measuring. The use of empirical methods, however, does not by itself make a researcher a philosophical empiricist. There are many forms of empirical research.

Auguste Comte, a sociologist and philosopher, held that knowledge of the world arises from observation, and conceived of positivism as a method of study based on the strict use of the scientific method. He asserted that authentic knowledge is scientific knowledge: objective, predictive, and logically structured. Logical positivism (or logical empiricism) combines positivism with a verifiability criterion of meaningfulness. For logical positivists, all knowledge should be based on logical inference, justification, and verifiability through experience or observation. Meaningful statements fall into two categories for the logical positivist: a priori analytic statements (necessary truths knowable prior to experience; for example, all circles are round) and a posteriori synthetic statements (contingent claims verified by sensory experience; for example, it is raining outside). Quantitative methodology is a kind of scientific empiricism and refers to the compilation and analysis of numerical data, which for the social scientist is empirical in nature since it can be tested (validated or falsified) by empirical observation. Quantitative methodology is also positivistic, since it relies on systematic observation and experiment, and can be thought of as the scientific approach to the study of sociocultural life.

Nonetheless, although social scientists do not ask underlying metaphysical questions about the actual existence of objects, they are indeed concerned with the experience of social objects and phenomena. For example, Émile Durkheim, often regarded as the first professor of sociology, enshrined this idea in The Rules of Sociological Method (1895; English trans. 1938) with his conceptualization of a “social fact,” which is as objective as facts are in the natural sciences.

For Thomas Kuhn, empirical methods are capable of elucidating and resolving problems within a paradigm during periods of “normal science.” Kuhn showed, however, that such science reflects the practitioner’s theoretical commitment to a specific paradigm and is not the reflection of any paradigm-independent truth-claims to knowledge.

Social constructivism is a philosophical theory of knowledge that holds that knowledge is contingent upon social experience, context, convention, and human perception. Some examples of socially constructed knowledge are gender (feminine and masculine), sexuality, and racial categories. On this theory, knowledge need not reflect any external “transcendent” metaphysical reality; it is based instead on a socially constructed reality rather than an ontological one. The notion of experience remains important for a constructivist, however, as experiences between and among individuals differ within and across contexts, thereby allowing for different “realities,” some of which are grounded in oppression (for example, of women, minorities, and homosexuals).

Empirical methods have been used to study race, gender, sexuality, and religion, among a plethora of other social phenomena such as crime, deviance, attitudes, and beliefs.

With regard to race, much social-scientific research has addressed migration, connections with class and skin color, social surveys of self-image and self-regard among ethnic minorities, and the measurement of prejudice in terms of scales of social and ethnic “distance.” Additional quantitative studies concerning race have focused on social inequality, institutional racism, patterns of interaction and segregation, genocide, social standing, poverty, and assimilation of dominant culture patterns.

Gender has been studied in the social sciences through the analysis of images of women in media and culture. These empirical studies of symbols and images range from studies of archaeological statues of goddesses to contemporary studies of how women are portrayed in film or advertisements. Discrepancies in gender stratification and sexism can be analyzed from a quantitative approach, as can the important issue of violence against women. Empirical studies of gender also inform analyses of family relations, employment patterns, distribution of wealth, education trends, and politics.

Using empirical methods to study sexuality, social scientists focus on topics such as sexual orientation, contraception, prostitution, gender identity, and attraction. Additional research can also be found on teen pregnancy, fertility, pornography, activist movements, sexual violence, sex education, and queer studies. One of the most influential works in this area is Michel Foucault’s The History of Sexuality (1976–1984).

Religion has also been analyzed empirically in terms of socioeconomic status, the family, marriage patterns, social class, family violence, cohabitation, political affiliation, church attendance, opinions about religious matters, as well as feelings, beliefs, and behaviors pertaining to religion as measured by social surveys. This is especially evident in the work of Rodney Stark, but began as early as 1904 in Max Weber’s seminal work The Protestant Ethic and the Spirit of Capitalism.

Louis Althusser critiqued empiricism as a methodological stance, arguing against the empiricist conception of knowledge: theoretical discourse is a “production,” which makes empiricism itself ideological and dogmatic, and therefore not scientific. According to Althusser, the “facts” of theoretical discourse are tied to theoretical practice, making knowledge itself a form of discourse.


Empiricism

Encyclopedia of Science and Religion
COPYRIGHT 2003 The Gale Group Inc.

Empiricism

The term empiricism describes a philosophical position emphasizing that all concepts and knowledge are derived from and justified by experience. Empiricists disagree on the nature of experience, including whether it is individual or social and whether sense experience is to be emphasized. Empiricism often is associated with other positions, including nominalism, naturalism, materialism, atheism, secularism, humanism, behaviorism, and emotivism.

Empiricism usually contrasts with views that truths can be derived from tradition, Scripture, revelation, intuition, or reason. Empiricists often have a special attitude toward mathematics, acknowledging its role in understanding the world yet denying that it gives direct truths about the world apart from experience. In the last third of the twentieth century, Anglo-American discussion has tended to contrast empiricism with holism or coherentism.

Classic empiricism

Despite earlier roots, empiricism really began with the seventeenth- and eighteenth-century British philosophers John Locke (1632–1704), George Berkeley (1685–1753), and David Hume (1711–1776). Locke rejected the existence of innate ideas, including innate truths of religion and morals, and held that the mind is a "blank slate" at birth. All of one's ideas are derived, either directly or indirectly, from either sensation (the source of one's knowledge of external objects) or reflection (the source of one's knowledge of one's mental processes). Berkeley, holding that perception requires a perceiver, developed a theory that required individual minds and God as perceivers of the world. Hume pushed empiricism in a skeptical direction, questioning beliefs in causation, self, and God.

Early in the twentieth century, the Vienna Circle of logical positivists made a major impact on philosophy in England and the United States. They used empiricism as a criterion for meaning, holding that the only meaningful propositions are either tautologies (including mathematical statements), which tell nothing about the world, or else statements that are empirically verifiable. Logical positivism ran into two problems: it was difficult to state the principle of verification precisely, and it had a self-contradiction at its heart, because the criterion of meaning is itself neither a tautology nor empirically verifiable. Thus the criterion of meaning seems to be meaningless. The later holism of the American philosopher W. V. O. Quine (1908–2000) also challenged the positivist distinction between tautologies and empirical statements, pointing out that meanings may vary so much between contexts that the dichotomy is hard to maintain.

American empiricism

In the United States, William James (1842–1910) and John Dewey (1859–1952) developed an empiricism (called radical empiricism by James) that challenged some of the assumptions of British empiricism, especially the commitment to the existence of separate sensations. James held instead that people experience complexes of sensations in a matrix of relations. They are thus not left with a choice between Hume's world of separate pieces and idealism's non-empirical containers of those pieces (mind, God). Values, the worth of things, can be perceived; values are therefore not the subjective and arbitrary additions to empirical facts that most empiricists (and modern culture generally) take them to be. Dewey's subject-object transactionalism likewise avoids the subject-object dichotomy. This more "generous empiricism" has influenced such thinkers as Henry Nelson Wieman, Bernard Meland, William Dean, Nancy Frankenberry, and Jerome A. Stone. Quine later held that since empirical propositions are embedded in a network of commonsense or scientific theories, no statement can be verified in isolation; confirmation or disconfirmation always affects a range of theories.

That vast conglomeration of ideas typically labeled postmodern has also impacted empiricism. A common theme of postmodernism is that there is no theory-free observation, that theories are not completely determined by data, and consequently that science is merely one of the many stories that people can tell each other. A major task confronting people who value science is how to honor the insights of postmodernism, including the tentativeness of verification and the hegemonic motive of the Enlightenment grand narrative of progress toward rationality, while at the same time articulating the ways in which scientific procedures have a relative and tentative yet significant value. A number of thinkers work towards this, including Richard Bernstein, Frederick Ferré, Susan Haack, J. Wentzel van Huyssteen, Lynn Hankinson Nelson, and Robert Neville.

It has been asked whether human gender influences empirical procedures, either through biological or cultural factors. Sandra Harding, Helen Longino, Evelyn Fox Keller, Lynn Hankinson Nelson, and others have been pursuing this question from differing perspectives.

Cross-cultural perspectives

Viewed cross-culturally, all cultures seem to have pursued empirical methods in developing their various technologies, sometimes in combination with nonempirical approaches. However, only the Western philosophical tradition seems to have developed the exclusiveness of empiricism as a theoretical option. In South Asia the Carvakas, Nyaya-Vaisesikas, and early Buddhists might be classified as empiricists. In China, Korea, and Japan the principle of "the investigation of things" occasionally took an empiricist direction, although not with the exclusiveness of European empiricism. "The investigation of things" usually included an investigation of the worth of things. One might speak of the empiricism of Mozi, Xunzi, Wang Fuzhi, Yan Yuan, Dai Zhen, and others of the "Investigations Based on Evidence" movement, and of the Korean Yi Yulgok.

Empiricism in the science-religion dialogue

As for science-religion issues, the topic of empiricism relates to virtually every question. For example, ideas on God, the soul, heaven, or reincarnation will be greatly influenced by a person's stance toward empiricism. That stance will also affect a person's ideas on the questions of the worth of tradition, revelation, scripture, or reason in religion and ethics. Related questions are whether the divine or the sacred as a quality of natural processes can be appreciated or responded to, as some "religious naturalists" hold, and whether such awareness is a complement to or an extension of a more strict empirical method. Another approach is to ask whether religious ideas can be vetoed by empirical procedures, whether they must be strictly based on or may be more loosely informed by them, or whether science and religion are such distinct orientations that neither can interfere with the other. Writers such as Douglas Clyde Macintosh and Henry Nelson Wieman have attempted to treat theology as an empirical study. The success of this depends on how one conceives God and also empirical method.


empiricism

empiricism In sociology, the term empiricism is often used, loosely, to describe an orientation to research which emphasizes the collection of facts and observations, at the expense of conceptual reflection and theoretical enquiry. More rigorously, empiricism is the name given to a philosophical tradition which, in its modern form, developed in the context of the scientific revolution of the seventeenth century. Though by no means all of the early empiricists were advocates of the new science, empiricism subsequently developed in close symbiotic association with modern science. In sociology, empiricism has been widely adopted as a philosophical approach by those who advocate methodological naturalism: the development of sociology as a scientific discipline.

In its early forms (in the work of John Locke, David Hume, and others) empiricism was primarily an epistemology: a theory of the nature, scope, and limits of human knowledge. As such, it included a theory of the mind and its workings which has subsequently been displaced by the development of cognitive psychology. What remains of empiricism as a philosophical theory is primarily the thesis that substantive human knowledge is limited to what may be tested (confirmed or validated) by empirical observation. What may be known a priori, or independently of all experience, is restricted to analytical statements—for example, statements that offer definitions of technical concepts, or, as Hume put it, which state ‘relations of ideas’. Empiricism defended the privileged status of science as the only form of human enquiry in which knowledge-claims were based upon, or were permanently open to, testing in terms of empirical observation and experiment. Theology and speculative metaphysics, by contrast, made bogus claims to knowledge on the basis of faith, intuition, or ‘pure’ reason.

Though empiricists are keen to demonstrate their opposition to metaphysics, it may be argued that empiricism itself carries an implicit metaphysics: namely, that the ultimate (knowable) realities are the fleeting sensory impressions (or ‘sense-data’) against which all genuine knowledge-claims are to be tested. The most radical forms of empiricism, then, are liable to be sceptical about the knowability not only of the objects of scientific knowledge, but also of the things and beings of common-sense experience. Thus, the distinctive twentieth-century form of empiricism, the logical empiricism or positivism of the Vienna Circle, followed upon the deep uncertainties of the turn-of-the-century revolution in physical science. In general, empiricists have raised the standard of empirical testability as a means of defending science and combating the claims of, first, metaphysics and theology, and, more recently, pseudo-sciences such as Marxism and psychoanalysis. Their difficulty has been to do so in a way which does not rule out all, or most, genuine science by the same criterion.


Empiricism

Gale Encyclopedia of Psychology
COPYRIGHT 2001 The Gale Group Inc.

A type of research based on direct observation.

Psychologists prefer to learn about behavior through direct observation or experience. This approach reflects what is called empiricism. Psychologists are well-known for creating experiments, conducting interviews and using surveys, and carrying out case studies. The common feature of these approaches is that psychologists wait until observations are made before they draw any conclusions about the behaviors they are interested in.

Scientists often maintain that empiricism fosters healthy skepticism. By this they mean that they will not regard something as being true until they have made the observations themselves. Such an approach means that science can be self-correcting in the sense that when erroneous conclusions are drawn, others can test the original ideas to see if they are correct.

Empiricism is one of the hallmarks of any scientific endeavor. Other disciplines employ different approaches to gaining knowledge. For example, many philosophers use the a priori method rather than the empirical method. In the a priori method, one uses strictly rational, logical arguments to derive knowledge. Geometric proofs are an example of the use of the a priori method.

In everyday life, people accept ideas as being true or false based on authority or on intuition. In many cases, people hold beliefs because individuals who are experts have made pronouncements on some topic. For example, in religious matters, many people rely on the advice and guidance of their religious leaders in deciding on the correct way to lead their lives. Further, we often believe things because they seem intuitively obvious. Relying on authority and intuition may be very useful in some aspects of our lives, like those involving questions of morality.

Scientists prefer the empirical method in their work, however, because the topics of science lend themselves to observation and measurement. When something cannot be observed or measured, scientists are likely to conclude that it is outside the realm of science, even though it may be vitally important in some other realm.

Further Reading

Carruthers, Peter. Human Knowledge and Human Nature: A New Introduction to an Ancient Debate. Oxford, Eng.: Oxford University Press, 1992.

empiricism

The Columbia Encyclopedia, 6th ed.

Copyright The Columbia University Press

empiricism (ĕmpĭr´ĭsĭzəm) [Gr.,=experience], philosophical doctrine that all knowledge is derived from experience. For most empiricists, experience includes inner experience—reflection upon the mind and its operations—as well as sense perception. This position is opposed to rationalism in that it denies the existence of innate ideas. According to the empiricist, all ideas are derived from experience; therefore, knowledge of the physical world can be nothing more than a generalization from particular instances and can never reach more than a high degree of probability. Most empiricists recognize the existence of at least some a priori truths, e.g., those of mathematics and logic. John Stuart Mill was the first to treat even these as generalizations from experience. Empiricism has been the dominant but not the only tradition in British philosophy. Among its other leading advocates were John Locke, George Berkeley, and David Hume. See also logical positivism.
