In this book the author gives a broad overview of different areas of research in nonmonotonic reasoning, and presents some new results and ideas based on his research. The guiding principles are: clarification of the different research activities in the area, which have sometimes been undertaken independently of each other; and appreciation of the fact that these research activities often represent different means to the same ends, namely sound theoretical foundations and efficient computation. The book begins with a discussion of the various types of nonmonotonic reasoning, their applications and their logics. Theorem proving techniques for these logics are also described. The following chapters deal with formulations of nonmonotonic inheritance, and nonmonotonic reasoning based on nonmonotonic rules. The final chapter discusses the achievements in the field in the light of the Yale shooting example. The book will be welcomed by researchers in theoretical computer science and artificial intelligence.

This article begins with an introduction to defeasible (nonmonotonic) reasoning and a brief description of a computer program, EVID, which can perform such reasoning. I then explain, and illustrate with examples, how this program can be applied in computational representations of ordinary dialogic argumentation. The program represents the beliefs and doubts of the dialoguers, and uses these propositional attitudes, which can include commonsense defeasible inference rules, to infer various changing conclusions as a dialogue progresses. It is proposed that computational representations of this kind are a useful tool in the analysis of dialogic argumentation, and, in particular, demonstrate the important role of defeasible reasoning in everyday arguments using commonsense reasoning.

This book by one of the world's foremost philosophers in the fields of epistemology and logic offers an account of suppositional reasoning relevant to practical deliberation, explanation, prediction and hypothesis testing. Suppositions made 'for the sake of argument' sometimes conflict with our beliefs, and when they do, some beliefs are rejected and others retained. Thanks to such belief contravention, adding content to a supposition can undermine conclusions reached without it. Subversion can also arise because suppositional reasoning is ampliative. These two types of nonmonotonic logic are the focus of this book. A detailed comparison of nonmonotonicity appropriate to both belief contravening and ampliative suppositional reasoning reveals important differences that have been overlooked.

In this paper, we present an approach to commonsense causal explanation of stories that can be used for automatically determining the liable party in legal case descriptions. The approach is based on a core ontology for law that takes a commonsense perspective. Aside from our thesis that in the legal domain many terms still have a strong commonsense flavour, the descriptions of events in legal cases, as e.g. presented at judicial trials, are cast in commonsense terms as well. We present design principles for representing commonsense causation, and describe a process-based approach to automatic identification of causal relations in stories, which are described in terms of the core ontology. The resulting causal explanation forms a necessary condition for determining the liability and responsibility of agents that play a role in the case. We describe the basic architecture and working of the demonstrator we are constructing to test the validity of our process-oriented view on commonsense causation. This view holds that causal relations are in fact abstractions constructed on the basis of our commonsense understanding of physical and mental processes.

Normality judgements are frequently used in everyday communication as well as in biological and social science. Moreover, they have become increasingly relevant to formal logic as part of defeasible reasoning. This paper distinguishes different kinds of normality statements. It is argued that normality laws like “Birds can normally fly” should be understood essentially in a statistical way. The argument has basically two parts: firstly, a statistical semantic core is mandatory for a descriptive reading of normality in order to explain the logical features of normality laws. Secondly, a statistical justification of normality statements can be derived from game-theoretic considerations if the normality law is understood as a communication convention.
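The statistical reading of a normality law can be illustrated with a small sketch. The sample data, function name, and 0.5 threshold below are illustrative assumptions, not the paper's proposal:

```python
# Toy statistical reading of "Birds can normally fly": the law holds when
# more than a threshold proportion of sampled birds can fly.
birds = [("sparrow", True), ("eagle", True), ("penguin", False), ("robin", True)]

def normally_flies(sample, threshold=0.5):
    # proportion of individuals in the sample satisfying the predicate
    flying = sum(1 for _, can_fly in sample if can_fly)
    return flying / len(sample) > threshold

print(normally_flies(birds))  # 3 of 4 sampled birds fly, so the law holds
```

On this reading the law tolerates exceptions (the penguin) without being falsified, which matches the defeasible behaviour of normality statements described above.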

The success of set theory as a foundation for mathematics inspires its use in artificial intelligence, particularly in commonsense reasoning. In this survey, we briefly review classical set theory from an AI perspective, and then consider alternative set theories. Desirable properties of a possible commonsense set theory are investigated, treating different aspects like cumulative hierarchy, self-reference, cardinality, etc. Assorted examples from the ground-breaking research on the subject are also given.

This is an expository article about the solution to the frame problem proposed in 1980 by Raymond Reiter. For years, his “frame default” remained untested and suspect. But developments in some seemingly unrelated areas of computer science—logic programming and satisfiability solvers—eventually exonerated the frame default and turned it into a basis for important applications.
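The intuition behind the frame default, that a fluent keeps its value unless an action explicitly changes it, can be sketched in a few lines. The fluents and actions here are invented for illustration and are not Reiter's own formalism:

```python
# Inertia sketch: a state is a dict of fluents; an action overrides only the
# fluents it explicitly affects, and everything else persists by default.
effects = {
    "toggle_light": {"light_on": lambda v: not v},
    "open_door": {"door_open": lambda v: True},
}

def step(state, action):
    new_state = dict(state)                        # the "frame default": copy everything forward
    for fluent, update in effects.get(action, {}).items():
        new_state[fluent] = update(state[fluent])  # explicit effects override inertia
    return new_state

s0 = {"light_on": False, "door_open": False}
s1 = step(s0, "toggle_light")
print(s1)  # light_on flips; door_open persists unchanged
```

The copy-then-override pattern is the procedural analogue of the default "what is not stated to change, stays the same", which logic programming and satisfiability solvers later made precise and efficient.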

This paper investigates an alternative set theory (due to Peter Aczel) called Hyperset Theory. Aczel uses a graphical representation for sets and thereby allows the representation of non-well-founded sets. A program, called HYPERSOLVER, which can solve systems of equations defined in terms of sets in the universe of this new theory is presented. This may be a useful tool for commonsense reasoning.
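The core computation behind such a solver can be sketched as a naive bisimulation check on flat systems of set equations. The equation system and function name below are illustrative assumptions, not HYPERSOLVER's actual interface:

```python
# Flat system of hyperset equations: each variable equals a set of variables.
# x = {x} and y = {y} both describe the non-well-founded set Omega = {Omega};
# since x and y then denote the same set, z = {x, y} = {Omega} = Omega too.
eqs = {"x": {"x"}, "y": {"y"}, "z": {"x", "y"}, "w": set()}

def bisimilar_pairs(eqs):
    # start from the full relation and refine until it is a bisimulation
    same = {(a, b) for a in eqs for b in eqs}
    changed = True
    while changed:
        changed = False
        for a, b in list(same):
            matched = (
                all(any((m, n) in same for n in eqs[b]) for m in eqs[a])
                and all(any((m, n) in same for m in eqs[a]) for n in eqs[b])
            )
            if not matched:
                same.discard((a, b))
                changed = True
    return same

rel = bisimilar_pairs(eqs)
print(("x", "y") in rel, ("x", "w") in rel)  # x ~ y, but neither equals the empty set w
```

Under Aczel's anti-foundation axiom, bisimilar variables denote the same set, so this refinement identifies which equations share a solution.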

We present an axiomatization of a problem in commonsense reasoning, characterizing the proper procedure for cracking an egg and transferring its contents to a bowl. The axiomatization is mid-sized, larger than toy problems such as the Yale Shooting Problem or the Suitcase Problem, but much smaller than the comprehensive axiomatizations associated with CYC and HPKB. This size of axiomatization permits the development of non-trivial, reusable core theories of commonsense reasoning, acts as a testbed for existing theories of commonsense reasoning, and encourages the discovery of new problems in commonsense reasoning. We present portions of core theories of containment, falling, and pouring, integrated into Shanahan's circumscriptive event calculus, and show how these can serve as the basis of an axiomatization that partly characterizes egg cracking. We discuss several commonsense reasoning problems encountered during this research, such as the Initial Specification Problem (a relative of the frame problem that occurs in theories in which fluents can trigger actions), and the Unobtainable State Problem (the problem of determining whether or not a theorem stating that one cannot get from one state to another is meaningful).

What is this thing called ‘Commonsense Psychology’? The first matter to settle is what the issue is here. By ‘commonsense psychology,’ I mean primarily the systems of describing, explaining and predicting human thought and action in terms of beliefs, desires, hopes, fears, expectations, intentions and other so-called propositional attitudes. Although commonsense psychology encompasses more than propositional attitudes--e.g., emotions, traits and abilities are also within its purview--belief-desire reasoning forms the core of commonsense psychology. Commonsense psychology is what we use to explain intentional action as ordinarily described--e.g., Jack went to the store because he wanted some ice cream. Commonsense psychology also is used to explain mental states--e.g., Jill feared that she would be late because she thought that the meeting began at 4:00. Commonsense psychology is the province of everyone; we all use it all the time.

Despite overwhelming evidence suggesting that quantifier scope is a phenomenon that must be treated at the pragmatic level, most computational treatments of scope ambiguities have thus far been a collection of syntactically motivated preference rules. This might be in part due to the prevailing wisdom that a commonsense inferencing strategy would require the storage of and reasoning with a vast amount of background knowledge. In this paper we hope to demonstrate that the challenge in developing a commonsense inferencing strategy is in the discovery of the relevant commonsense data and in a proper formulation of the inferencing strategy itself, and that a massive amount of background knowledge is not always required. In particular, we present a very effective procedure for resolving quantifier scope ambiguities at the pragmatic level using simple quantitative data that is readily available in most database environments.

This essay explores the universal cognitive bases of biological taxonomy and taxonomic inference using cross-cultural experimental work with urbanized Americans and forest-dwelling Maya Indians. A universal, essentialist appreciation of generic species appears as the causal foundation for the taxonomic arrangement of biodiversity, and for inference about the distribution of causally-related properties that underlie biodiversity. Universal folkbiological taxonomy is domain-specific: its structure does not spontaneously or invariably arise in other cognitive domains, like substances, artifacts or persons. It is plausibly an innately-determined evolutionary adaptation to relevant and recurrent aspects of ancestral hominid environments, such as the need to recognize, locate, react to, and profit from many ambient species. Folkbiological concepts are special players in cultural evolution, whose native stability attaches to more variable and difficult-to-learn representational forms, thus enhancing the latter's prospects for regularity and recurrence in transmission within and across cultures. This includes knowledge that cumulatively enriches (folk expertise), overrides (religious belief) or otherwise transcends (science) the commonsense ontology prescribed by folkbiology. Finally, the studies summarized here indicate that results gathered from “standard populations” in regard to biological categorization and reasoning more often than not fail to generalize in straightforward ways to humanity at large. This suggests the need for much more serious attention to cross-cultural research on basic cognitive processes.

A naturalistic theory of modal intuitions and modal reasoning inspired by Hintikka's theorizing should start from the principle that advanced modal reasoning has its roots in commonsense intuitions. It is proposed that the naturalist can rely on the assumption of uniformity: the same set of basic principles is used in reasoning about actual and counterfactual dependencies - modal cognition is conservative. In the most primitive cases the difference between a model of an actual situation and of a merely possible one lies in its functional and indicational roles, not in its internal make-up. This conjecture enables one to derive important aspects of modal reasoning from the non-modal one. In the final section of the paper a simplified account of such derivation is proposed, drawn partly from connectionist literature.

Description logics refer to a family of formalisms concentrated around concepts, roles and individuals. They belong to the most frequently used knowledge representation formalisms and provide a logical basis to a variety of well known paradigms. The main reasoning tasks considered in the area of description logics are those reducible to subsumption. On the other hand, any knowledge representation system should be equipped with a more advanced reasoning machinery. Therefore in the current paper we take a step toward integrating description logics with second-order reasoning. One of the important motivations behind introducing a second-order formalism follows from the fact that many forms of commonsense and nonmonotonic reasoning used in AI can be modelled within second-order logic. To achieve our goal we first extend description logics with the possibility to quantify over concepts. Since one of the main criticisms against the use of second-order formalisms is their complexity, we next propose second-order quantifier elimination techniques applicable to a large class of description logic formulas. Finally we show applications of the techniques, in particular in reasoning with circumscribed concepts and approximated terminological formulas.

In this paper, the foundations for setting up a knowledge industry are laid. Firstly, it is established that this industry constitutes the only way of making use of the huge amounts of knowledge produced as a result of the introduction of the Science-Technology binomial in postindustrial society. Then, the elements which will lead to such an industry are defined, that is, the resources and means. Under the ‘Means’ section, special emphasis is placed on the processes involved, in other words, inference methods and commonsense reasoning. Finally, it is concluded that the establishment of this industry, called mindfacturing because of the raw material that it processes and uses, is, more than possible, desirable, provided that the precautions outlined in the epilogue are taken.

This is a book on morality, rationality, and the interconnections between the two. In it, I defend a version of consequentialism that both comports with our commonsense moral intuitions and shares with other consequentialist theories the same compelling teleological conception of practical reasons.

Cognitive science is beginning to make a contribution to the science-and-religion dialogue by its claims about the nature of both scientific and religious knowledge and the practices such knowledge informs. Of particular importance is the distinction between folk knowledge and abstract theoretical knowledge leading to a distinction between folk science and folk religion on the one hand and the reflective, theoretical, abstract form of thought that characterizes both advanced scientific thought and sophisticated theological reasoning on the other. Both folk science and folk religion emerge from commonsense reasoning about the world, a form of reasoning bequeathed to us by the processes of natural selection. Suggestions are made about what scientists and theologians can do if they accept these claims.

The frame problem is a problem in artificial intelligence that a number of philosophers have claimed has philosophical relevance. The structure of this paper is as follows: (1) An account of the frame problem is given; (2) The frame problem is distinguished from related problems; (3) The main strategies for dealing with the frame problem are outlined; (4) A difference between commonsense reasoning and prediction using a scientific theory is argued for; (5) Some implications for the..

In recent years there has been a growing consensus that ordinary reasoning does not conform to the laws of classical logic, but is rather nonmonotonic in the sense that conclusions previously drawn may well be removed upon acquiring further information. Even so, rational belief formation has up to now been modelled as conforming to some fundamental principles that are classically valid. The counterexample described in this paper suggests that a number of the most cherished of these principles should not be regarded as valid for commonsense reasoning. An explanation of this puzzling failure is given, arguing that a problem in the theory of rational choice transfers to the realm of belief formation.

From a philosophical standpoint, the work presented here is based on van Fraassen [26]. The bulk of that paper is organized around a series of arguments against the assumption, built into standard deontic logic, that moral dilemmas are impossible; and van Fraassen only briefly sketches his alternative approach. His paper ends with the conclusion that “the problem of possibly irresolvable moral conflict reveals serious flaws in the philosophical and semantic foundations of ‘orthodox’ deontic logic, but also suggests a rich set of new problems and methods for such logic.” My goal has been to suggest that some of these methods might be found in current research on nonmonotonic reasoning, and that some of the problems may have been confronted there as well. I have shown that nonmonotonic logics provide a natural framework for reasoning about moral dilemmas, perhaps even more useful than the ordinary modal framework, and that the issues surrounding the treatment of exceptional information within these logics run parallel to some of the problems posed by conditional oughts. However, there is also another way in which deontic logic might benefit from a connection to nonmonotonic reasoning. A familiar criticism among ethicists of work in deontic logic is that it is too abstract, and too far removed from the kind of problems confronted by real agents in moral deliberation. It must be said that similar criticisms of abstraction and irrelevance are often lodged against work in nonmonotonic reasoning by more practically minded researchers in artificial intelligence; but here, at least, the criticisms are taken seriously. Nonmonotonic logic aims at a qualitative account of commonsense reasoning, which can be used to relate planning and action to defeasible goals and beliefs; and at least some of the theories developed in this area have been tested in realistic situations. By linking the subject of deontic logic to this research, it may be possible also to relate the idealized study of moral reasoning typical of the field to a more robust treatment of practical deliberation.

Ask nearly any analytic philosopher of mind how we understand intentional actions performed for reasons and you are bound to be told that we do so by deploying mental concepts, such as beliefs and desires, in systematic ways. This way of making sense of actions is known as commonsense or folk psychology (or CSP or FP for short). There have been many interesting debates about CSP over the years. These have focused on questions including: How fundamental and universal is this practice? Which species engage in it? What mechanisms underwrite the competence? How is the ability acquired? And, what exactly is its status (e.g. is it a kind of theory or simulative ability? If it's a theory, is it a good theory, etc.)? Philosophers divide in their responses to such questions, but practically all of them agree that CSP is at least a prominent and important part of our everyday understanding and that it grounds at least some very important social practices.

Second-order quantifier elimination in the context of classical logic emerged as a powerful technique in many applications, including the correspondence theory, relational databases, deductive and knowledge databases, knowledge representation, commonsense reasoning and approximate reasoning. In the current paper we first generalize the result of Nonnengart and Szałas [17] by allowing second-order variables to appear within higher-order contexts. Then we focus on a semantical analysis of conditionals, using the introduced technique and Gabbay’s semantics provided in [10] and substantially using a third-order accessibility relation. The analysis is done via finding correspondences between axioms involving conditionals and properties of the underlying third-order relation.

Despite the venerable place that "justice" occupies in social scientific theory and research, little effort has been made to see how members of society themselves define and use the concept when confronted with determining "what has happened" in some social arena, theorizing about why it happened, and deciding what should ensue. We take an ethnomethodological approach to justice, attempting to recover it as a feature of practical activity or a "phenomenon of order." Our analysis involves an actual videotaped jury deliberation. In his classic study of decision making by juries, Garfinkel observed that jurors changed their reliance on commonsense reasoning very little, even though they were instructed to adhere to official and legal criteria for guilt. The vacillation between commonsense reasoning and using official criteria creates a tension; in our data this tension is manifested as the choice between adhering to law and procedural rules and providing "justice." By articulating this tension as a puzzle, several of the jurors prepare the way for using "justice," and then use this concept in formal ways which, along with other discursive patterns and strategies, constitute the deliberation as a structured, concerted activity. We show four stages in the use of the term justice as it is embedded in jurors' practical reasoning.

Montague’s framework for semantic interpretation has always been less well adapted to the interpretation of words than of syntactic constructions. In the late 1970s, David Dowty addressed this problem, concentrating on the interpretation of tense, aspect, inchoatives, and causatives in an extension of Montague’s Intensional Logic. In this paper I will try to revive this project, conceiving it as part of a larger task aiming at the interpretation of derivational morphology. I will try to identify some obstacles arising in Dowty’s approach, and will suggest an alternative approach that, while it does not provide a global interpretation of causality, seems to work well with a wide range of the causal constructions that are important in word formation. I try to relate these ideas to some themes in contemporary philosophy and in the formalization of commonsense reasoning.

This essay describes a general approach to building perturbation-tolerant autonomous systems, based on the conviction that artificial agents should be able to notice when something is amiss, assess the anomaly, and guide a solution into place. This basic strategy of self-guided learning is termed the metacognitive loop; it involves the system monitoring, reasoning about, and, when necessary, altering its own decision-making components. This paper (a) argues that equipping agents with a metacognitive loop can help to overcome the brittleness problem, (b) details the metacognitive loop and its relation to our ongoing work on time-sensitive commonsense reasoning, (c) describes specific, implemented systems whose perturbation tolerance was improved by adding a metacognitive loop, and (d) outlines both short-term and long-term research agendas.

In this paper we present an embedding of abstract argumentation systems into the framework of Barwise and Seligman's logic of information flow. We show that, taking P.M. Dung's characterization of argument systems, a local logic over states of a deliberation may be constructed. In this structure, the key feature of non-monotonicity of commonsense reasoning obtains as the transition from one local logic to another, due to a change in certain background conditions. Each of Dung's extensions of argument systems leads to a corresponding ordering of background conditions. The relations among extensions become relations among partial orderings of background conditions. This introduces a conceptual innovation in Barwise and Seligman's representation of commonsense reasoning.

Artificial Intelligence (AI) has long dealt with the issue of finding a suitable formalization for commonsense reasoning. Defeasible argumentation has proven to be a successful approach in many respects, proving to be a confluence point for many alternative logical frameworks. Different formalisms have been developed, most of them sharing the common notions of argument and warrant. In defeasible argumentation, an argument is a tentative (defeasible) proof for reaching a conclusion. An argument is warranted when it ultimately prevails over other conflicting arguments. In this context, defeasible consequence relationships for modelling argument and warrant as well as their logical properties have gained particular attention. This article analyzes two non-monotonic inference operators Carg and Cwar intended for modelling argument construction and dialectical analysis (warrant), respectively. As a basis for such analysis we will use the LDSar framework, a unifying approach to computational models of argument using Labelled Deductive Systems (LDS). In the context of this logical framework, we show how labels can be used to represent arguments as well as argument trees, facilitating the definition and study of non-monotonic inference operators, whose associated logical properties are studied and contrasted. We contend that this analysis provides useful comparison criteria that can be extended and applied to other argumentation frameworks.

Any division between scientific practice and a metalevel of the methods and goals of science is largely a false dichotomy. Since a priori, foundationist or logicist approaches to normative principles have proven unequal to the task of representing actual scientific practice, methodologies of science must be abstracted from episodes in the history of science. Of course, it is possible that such characteristics could prove universal and constant across various eras. But, case studies show that they are not in anything beyond the strictures applied to everyday, commonsense reasoning (e.g., a requirement of noncontradiction in a deductive argument). Hence, even if some presently-on-offer methodology or description of past scientific practice were adequate, it need not remain so for current (‘frontier’) areas of science. For this reason, it is important to examine recent episodes in, say, high-energy physics. Results from case studies of several episodes in that field are used to argue that successful practice leads scientists to countenance essential changes in the methodological framework at the levels of the criteria employed in judging theories (i.e., what counts for an explanation and what are canons of rationality) and of the goals of science. *Partial support for this research was provided by the History and Philosophy of Science Program of the National Science Foundation under grants Nos. SES-8606472 and SES-8705469. A preliminary version of this paper was given at an HPS seminar at King's College, London University in May 1988. Helpful comments and useful criticisms were made by several colleagues, especially Ernan McMullin, Heinz Post and Simon Saunders (none of whom are to be held responsible for or necessarily even in agreement with the views expressed here.).

“Toy worlds” involving actions, such as the blocks world and the Missionaries and Cannibals puzzle, are often used by researchers in the areas of commonsense reasoning and planning to illustrate and test their ideas. We would like to create a database of general-purpose knowledge about actions that encodes common features of many action domains of this kind, in the same way as abstract algebra and topology represent common features of specific number systems. This paper is a report on the first stage of this project—the design of an action description language in which this database will be written. The new language is an extension of the action language C+. Its main distinctive feature is the possibility of referring to other action descriptions in the definition of a new action domain.

“Toy worlds” involving actions, such as the Blocks World and the Monkey and Bananas domain, are often used by researchers in the areas of commonsense reasoning and planning to illustrate and test their ideas. Many of the axioms found in descriptions of these toy worlds are expressions of general-purpose knowledge, though they are often cast in a form only useful for solving one specific problem and are not faithful representations of general facts that can be used in other domains. Instead of using such domain-specific axioms for each problem, we are building a general-purpose library of action descriptions which can be referred to in descriptions of many action domains. The library is being written in the modular action description language MAD, an extension of the action language C+. In this paper we present an initial version of some of our library modules, along with a new formalization of the Monkey and Bananas domain that uses the library. Most of the axioms in this formalization come from the library, with only a few domain-specific axioms needed.

Logical models of argument formalize commonsense reasoning while taking process and computation seriously. This survey discusses the main ideas which characterize different logical models of argument. It presents the formal features of a few main approaches to the modeling of argumentation. We trace the evolution of argumentation from the mid-1980s, when argument systems emerged as an alternative to nonmonotonic formalisms based on classical logic, to the present, as argument is embedded in different complex systems for real-world applications, and allows more formal work to be done in different areas, such as AI & Law, case-based reasoning and negotiation among intelligent agents.

ABSTRACT Certain problems in commonsense reasoning lend themselves to the use of non-standard formalisms which we call active logics. Among these are problems of object misidentification. In this paper we describe some technical issues connected with automated inference in active logics, using particular object misidentification problems as illustrations. Control of exponential growth of inferences is a key issue. To control this growth, attention is paid to a limited version of an inference rule for negative introspection. We also present some descriptive statistics for comparison with earlier active-logic approaches.

The position of Polish informatics, in research as well as in teaching, has its roots in the achievements of Polish mathematicians of the Warsaw School and logicians of the Lvov-Warsaw School. Jan Łukasiewicz is considered in the world of computer science to be the most famous Polish logician. The parenthesis-free notation invented by him is known as PN (Polish Notation) and RPN (Reverse Polish Notation). Łukasiewicz created many-valued logic as a separate subject. The idea of multi-valuedness is applied in hardware design (many-valued or fuzzy switching, analog computers). The many-valued approach to vague notions and commonsense reasoning is used in expert systems, databases and knowledge-based systems. Stanisław Jaśkowski's system of natural deduction is the basis of systems of automatic deduction and theorem proving. He also created a system of paraconsistent logic. Such logics are used in AI. Kazimierz Ajdukiewicz, with his categorial grammar, participated in the development of formal grammars, a field significant for programming languages. Andrzej Grzegorczyk made an important contribution to the development of the theory of recursiveness.

We discuss circumscription, a logical formalization of non-monotonic reasoning introduced by John McCarthy and Vladimir Lifschitz. The first section presents the assumptions of logic-based artificial intelligence, the problem of non-monotonicity in commonsense reasoning, and an informal formulation of circumscription. In section two, a formal definition of circumscription is given. The idea of circumscription is discussed from the syntactic and semantic points of view. Theoretical investigations are supplemented with examples. In section three, methods of computing circumscription are discussed. Section four contains an exemplary circumscription-based formalization of a simple non-monotonic reasoning. Finally, we comment on the role of logic in artificial intelligence and provide some information about implementations of circumscription.
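The effect of circumscription can be illustrated by brute-force enumeration of models while minimizing an abnormality predicate. The two-object bird theory below is a standard textbook-style example, invented here for illustration rather than taken from the paper:

```python
from itertools import product

# Theory over objects {tweety, opus}: every object is a bird;
# bird(x) & ~ab(x) -> flies(x); and it is given that opus does not fly.
objects = ["tweety", "opus"]

def is_model(ab, flies):
    for x in objects:
        if x not in ab and x not in flies:  # violates bird(x) & ~ab(x) -> flies(x)
            return False
    return "opus" not in flies              # given fact: ~flies(opus)

# Enumerate every interpretation of 'ab' and 'flies' and keep the models.
models = []
for ab_bits in product([0, 1], repeat=len(objects)):
    for fly_bits in product([0, 1], repeat=len(objects)):
        ab = {o for o, bit in zip(objects, ab_bits) if bit}
        flies = {o for o, bit in zip(objects, fly_bits) if bit}
        if is_model(ab, flies):
            models.append((ab, flies))

# Circumscribing 'ab' keeps only models whose ab-extension is subset-minimal.
minimal = [m for m in models if not any(other[0] < m[0] for other in models)]
print(minimal)  # only opus is abnormal, so tweety flies by default
```

Only the fact that opus does not fly forces an abnormality; in the unique minimal model tweety flies, which is the non-monotonic default conclusion circumscription is designed to license.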

Decision theory explains weakness of will as the result of a conflict of incentives between different transient agents. In this framework, self-control can only be achieved by the I-now altering the incentives or choice-sets of future selves. There is no role for an extended agency over time. However, it is possible to extend game theory to allow multiple levels of agency. At the inter-personal level, theories of team reasoning allow teams, as well as individuals, to be agents. I apply team reasoning at the intra-personal level, taking the self as a team of transient agents over time. This allows agents to ask not just 'What should I-now do?', but also 'What should I, the person over time, do?', which may enable agents to achieve self-control. The resulting account is Aristotelian in flavour, as it involves reasoning schemata and perception, and it is compatible with some of the psychological findings about self-control.

Regress arguments have convinced many that reasoning cannot require beliefs about what follows from what. In this paper I argue that this is a mistake. Regress arguments rest on dubious (although deeply entrenched) assumptions about the nature of reasoning — most prominently, the assumption that believing p by reasoning is simply a matter of having a belief in p with the right causal ancestry. I propose an alternative account, according to which beliefs about what follows from what play a constitutive role in reasoning.

When and why does it matter whether we can give an explicit justification for what we believe? This paper examines these questions in the light of recent empirical work on the social functions served by our capacity to reason, in particular, Mercier and Sperber’s argumentative theory of reasoning.

Is conceptual relativity a genuine phenomenon? If so, how is it properly understood? And if it does occur, does it undermine metaphysical realism? These are the questions we propose to address. We will argue that conceptual relativity is indeed a genuine phenomenon, albeit an extremely puzzling one. We will offer an account of it. And we will argue that it is entirely compatible with metaphysical realism. Metaphysical realism is the view that there is a world of objects and properties that is independent of our thought and discourse (including our schemes of concepts) about such a world. Hilary Putnam, a former proponent of metaphysical realism, later gave it up largely because of the alleged phenomenon to which he himself gave the label 'conceptual relativity'. One of the key ideas of conceptual relativity is that certain concepts—including such fundamental concepts as object, entity, and existence—have a multiplicity of different and incompatible uses (Putnam 1987, p. 19; 1988, pp. 110–14). According to Putnam, once we recognize the phenomenon of conceptual relativity we must reject metaphysical realism: "The suggestion . . . is that what is (by commonsense standards) the same situation can be described in many different ways, depending on how we use the words. The situation does not itself legislate how words like 'object,' 'entity,' and 'exist' must be used. What is wrong with the notion of objects existing 'independently' of conceptual schemes is that there are no standards for the use of even the logical notions apart from conceptual choices." (Putnam 1988, p. 114) Putnam's intriguing reasoning in this passage is difficult to evaluate directly, because conceptual relativity is philosophically perplexing and in general is not well understood. In this paper we propose a construal of conceptual relativity that clarifies it considerably and explains how it is possible despite its initial air of paradox.
We then draw upon this construal to explain why, contrary to Putnam and others, conceptual relativity does not conflict with metaphysical realism, but in fact comports well with it.

Can some films be genuine thought experiments that challenge our commonsense intuitions? Certain filmic narratives and their mise-en-scène details reveal rigorous reasoning and counterintuitive outcomes on philosophical issues, such as skepticism or personal identity. But this philosophical façade may hide a mundane concern for entertainment. Unfamiliar narratives drive spectator entertainment, and every novel cinematic situation could easily be explained as part of a process that lacks motives of philosophical elucidation.

The paper inverts the above objection, and proposes that when the main cinematic character resists spectator engagement (a crucial source of cinematic entertainment), emotionally challenged spectators also question their commonsensical beliefs about his/her actions, and detect a conceptually novel situation as such.

A case study is Mike Leigh's film Happy-Go-Lucky (2008), in which the main female character presents an unrelenting but eccentric version of 'feel good' happiness. Spectators gradually detect that the previously unexamined, commonsensical version of subjective happiness comes at the price of individual eccentricity, and that the choice of a subjective theory of happiness leads to consequences hitherto unacknowledged.

Elaborating on the notions that humans possess different modalities of decision-making and that these are often influenced by moral considerations, we conducted an experimental investigation of the Trolley Problem. We presented the participants with two standard scenarios ('lever' and 'stranger') either in the usual or in reversed order. We observe that responses to the lever scenario, which result from (moral) reasoning, are affected by our manipulation; whereas responses to the stranger scenario, triggered by moral emotions, are unaffected. Furthermore, when asked to express general moral opinions on the themes of the Trolley Problem, about half of the participants reveal some inconsistency with the responses they had previously given.

Since at least the 1960s, deontic logicians and ethicists have worried about whether there can be normative systems that allow conflicting obligations. Surprisingly, however, little direct attention has been paid to questions about how we may reason with conflicting obligations. In this paper, I present a problem for making sense of reasoning with conflicting obligations and argue that no deontic logic can solve this problem. I then develop an account of reasoning based on the popular idea in ethics that reasons explain obligations, and show that it solves this problem.

Much research in the last two decades has demonstrated that human responses deviate from the performance deemed normative according to various models of decision making and rational judgment (e.g., the basic axioms of utility theory). This gap between the normative and the descriptive can be interpreted as indicating systematic irrationalities in human cognition. However, four alternative interpretations preserve the assumption that human behavior and cognition are largely rational. These posit that the gap is due to (1) performance errors, (2) computational limitations, (3) the wrong norm being applied by the experimenter, and (4) a different construal of the task by the subject. In the debates about the viability of these alternative explanations, attention has been focused too narrowly on the modal response. In a series of experiments involving most of the classic tasks in the heuristics and biases literature, we have examined the implications of individual differences in performance for each of the four explanations of the normative/descriptive gap. Performance errors are a minor factor in the gap; computational limitations underlie non-normative responding on several tasks, particularly those that involve some type of cognitive decontextualization. Unexpected patterns of covariance can suggest when the wrong norm is being applied to a task or when an alternative construal of the task should be considered appropriate. Key Words: biases; descriptive models; heuristics; individual differences; normative models; rationality; reasoning.

In his work on following a rule Wittgenstein discerned principles of interpretation that apply to commonsense psychology and psychoanalysis. We can use these to assess the cogency of psychoanalytic reasoning.