We maximally extend the quantum‐mechanical results of Muller and Saunders (2008) establishing the ‘weak discernibility’ of an arbitrary number of similar fermions in finite‐dimensional Hilbert spaces. This confutes the currently dominant view that (A) the quantum‐mechanical description of similar particles conflicts with Leibniz’s Principle of the Identity of Indiscernibles (PII); and that (B) the only way to save PII is by adopting some heavy metaphysical notion such as Scotusian haecceitas or Adamsian primitive thisness. We take sides with Muller and Saunders (2008) against this currently dominant view, which has been expounded and defended by many.

*Received July 2008; revised May 2009. †To contact the authors, please write to: F. A. Muller, Faculty of Philosophy, Erasmus University Rotterdam, Burg. Oudlaan 50, H5–16, 3062 PA Rotterdam, The Netherlands; e‐mail: f.a.muller@fwb.eur.nl, and Institute for the History and Foundations of Science, Utrecht University, Budapestlaan 6, IGG–3.08, 3584 CD Utrecht, The Netherlands; e‐mail: f.a.muller@uu.nl. M. P. Seevinck, Institute for the History and Foundations of Science, Utrecht University, Budapestlaan 6, IGG–3.08, 3584 CD Utrecht, The Netherlands; e‐mail: m.p.seevinck@uu.nl.

This paper follows Part I of our essay on case-intensional first-order logic (CIFOL; Belnap and Müller (2013)). We introduce a framework of branching histories to take account of indeterminism. Our system BH-CIFOL adds structure to the cases, which in Part I formed just a set: a case in BH-CIFOL is a moment/history pair, specifying both an element of a partial ordering of moments and one of the total courses of events (extending all the way into the future) that that moment is part of. This framework allows us to define the familiar Ockhamist temporal/modal connectives, most notably for past, future, and settledness. The novelty of our framework becomes visible in our discussion of substances in branching histories, i.e., in its first-order part. That discussion shows how the basic idea of tracing an individual thing from case to case via an absolute property is applicable in a branching histories framework. We stress the importance of keeping apart extensionality and moment-definiteness, and give a formal account of how the specification of natural sortals and natural qualities turns out to be a coordination task in BH-CIFOL. We also provide a detailed answer to Lewis’s well-known argument against branching histories, exposing the fallacy in that argument.

Counter to the popular impression that Adam Smith was a champion of selfishness and greed, Jerry Muller shows that the Inquiry into the Nature and Causes of the Wealth of Nations maintained that markets served to promote the well-being of ...

In this original work of psychoanalytic theory, John Muller explores the formative power of signs and their impact on the mind, the body and subjectivity, giving special attention to the work of the French psychoanalyst Jacques Lacan and the American philosopher Charles Sanders Peirce. Muller explores how Lacan's way of understanding experience through three dimensions—the real, the imaginary and the symbolic—can be useful both for thinking about cultural phenomena and for understanding the complexities involved in treating psychotic patients. Muller develops Lacan's perspective gradually, presenting it as a distinctive approach to data from a variety of sources, such as cognitive, social and developmental psychology, literature, history, art, and psychoanalytic treatment. The book's first four chapters present Muller's reading of selected data from child development research, psychology and linguistics, approximating a semiotic model of "normal" development. The following three chapters examine in a Lacanian framework the structural basis of psychotic stages as indicative of massive semiotic failure in development. The final chapters on human narcissism suggest reasons that "normal" development may be impossible.

The Digital Dictionary of Buddhism [DDB] (http://buddhism-dict.net/ddb), now on the Web for more than 15 years, has become a primary reference work for the field of Buddhist Studies. Containing over 53,000 entries, it is subscribed to by more than 30 university libraries (http://www.buddhism-dict.net/ddb/subscribing_libraries.html), and supported by the contributions of over 70 specialists, many of them recognized leaders in the field. It can perhaps be described as an example of the type of web resource that has reached a degree of status and sustainability such that it has been able to grow and thrive as a collaboratively developed online reference—despite having little funding or the support of a major organization or team of programmers—in an age where such resources are so readily washed away by the combination of Wikipedia and Google. Thus, the field of Buddhist Studies has its own reliable, scholarly-edited, fully documented and responsible resource that has developed a center of gravity sufficient for it to continue to grow as the resource that specialists turn to first without hesitation, and to which they may contribute knowing that they will be clearly credited, and that what they write will not be deleted or changed in the following moment by, for example, a junior high school student. Recently, the technical advisor to the DDB, Michael Beddow, has completed a full overhaul of the supporting structure of the DDB and CJKV-E dictionaries, which will include a broad range of enhanced functions, both internal to the dictionary and in terms of interoperation with other lexicons and web corpora. This presentation will start off with a demonstration of the most advanced functions of the DDB, to be followed by a brief overview of its technical framework (P5-influenced XML, delivered through XSL and Perl). We will then outline the key factors of the management of the DDB that we believe have most directly contributed to its success.

Are we perhaps in the "matrix", or anyway, victims of perfect and permanent computer simulation? No. The most convincing—and shortest—version of Putnam's argument against the possibility of our eternal envatment is due to Crispin Wright (1994). It avoids most of the misunderstandings that have been elicited by Putnam's original presentation of the argument in "Reason, Truth and History" (1981). But it is still open to the charge of question-begging. True enough, the premisses of the argument (disquotation and externalism) can be formulated and defended without presupposing external objects whose existence appears doubtful in the light of the very skeptical scenario which Putnam wants to repudiate. However, the argument is only valid if we add an extra premiss as to the existence of some external objects. In order to avoid circularity, we should run the argument with external objects which must exist even if we are brains in a vat, e.g. with computers rather than with trees. As long as the skeptic is engaged in a discussion of the brain-in-a-vat scenario, she should neither deny the existence of computers nor the existence of causal relations; for if she does, she is in fact denying that we are brains in a vat.

Alan Musgrave, Michael Friedman, Jeffrey Foss, and Richard Creath raised different objections against the Distinction between observables and unobservables when drawn within the confines of Bas C. van Fraassen's Constructive Empiricism (CE), to the effect that the Distinction cannot be drawn there coherently. Van Fraassen has only responded to Musgrave but Musgrave claimed not to understand van Fraassen's succinct response. I argue that van Fraassen's response is not enough. What remains in the end is an unsolved problem which CE cannot afford to leave unsolved, or so I argue; I then strengthen Musgrave's criticism and indicate that an extension of the epistemic policy of CE is mandatory to solve the problem. I also argue that Friedman's and Foss' objection against the Distinction in CE misses the mark on closer inspection. An objection due to Creath does hit the mark but can be taken care of without too much ado. All these objections seem alive and kicking until the present day; I try (and hope) to put them all to rest.

When it became uncool to speak of beauty with respect to pieces of art, physicists started claiming that their results are beautiful. They say, for example, that a theory's beauty speaks in favour of its truth, and that they strive to perform beautiful experiments. What does that mean? The notion cannot be defined. (It cannot be defined in the arts either.) Therefore, I elucidate it with examples of optical experimentation. Desaguliers' white synthesis, for example, is more beautiful than Newton's, and the many colourful syntheses done by Viennese painter Ingo Nussbaumer exemplify even greater beauty. Here are some criteria (which, of course, do not implement a decision procedure concerning beauty in experiments): cleanliness, simplicity, intellectual clarity, symmetry. Similar criteria are relevant to our aesthetic judgements about some pieces of music. So we can assume that our notion of beauty concerning art is related to the one concerning scientific experiments.

Alvin Plantinga has famously argued that metaphysical naturalism is self-defeating, and cannot be rationally accepted. I distinguish between two different ways of understanding this argument, which I call the "probabilistic inference conception", and the "process characteristic conception". I argue that the former is what critics of the argument usually presuppose, whereas most critical responses fail when one assumes the latter conception. To illustrate this, I examine three standard objections to Plantinga's evolutionary argument against naturalism: the Perspiration Objection, the Tu Quoque Objection, and the "Why Can't the Naturalist Just Add a Little Something?" Objection. I show that Plantinga's own responses to these objections fail, and propose counterexamples to his first two principles of defeat. I then go on to construct more adequate responses to these objections, using the distinctions I develop in the first part of the paper.

Since the validity of Bell's inequalities implies the existence of joint probabilities for non-commuting observables, there is no universal consensus as to what the violation of these inequalities signifies. While the majority view is that the violation teaches us an important lesson about the possibility of explanations, if not about metaphysical issues, there is also a minimalist position claiming that the violation is to be expected from simple facts about probability theory. This minimalist position is backed by theorems due to A. Fine and I. Pitowsky. Our paper shows that the minimalist position cannot be sustained. To this end, we give a formally rigorous interpretation of joint probabilities in the combined modal and spatiotemporal framework of ‘stochastic outcomes in branching space-time’ (SOBST) (Kowalski and Placek, 1999; Placek, 2000). We show in this framework that the claim that there can be no joint probabilities for non-commuting observables is incorrect. The lesson from Fine's theorem is not that Bell's inequalities will be violated anyhow, but that an adequate model for the Bell/Aspect experiment must not define global joint probabilities. Thus we investigate the class of stochastic hidden variable models, which prima facie do not define such joint probabilities. The reason why these models fail supports the majority view: Bell's inequalities are not just a mathematical artifact.

We discuss at some length evidence from cognitive science suggesting that representations of objects based on spatiotemporal information and featural information retrieved bottom-up from a visual scene precede representations of objects that include conceptual information. We argue that a distinction can be drawn between representations with conceptual and nonconceptual content. The distinction is based on perceptual mechanisms that retrieve information in conceptually unmediated ways. The representational contents of the states induced by these mechanisms that are available to a type of awareness called phenomenal awareness constitute the phenomenal content of experience. The phenomenal content of perception contains the existence of objects as separate things that persist in time, spatiotemporal information, and information regarding relative spatial relations, motion, surface properties, shape, size, orientation, color, and their functional properties.

This paper investigates the prospects of Rodney Brooks’ proposal for AI without representation. It turns out that the supposedly characteristic features of “new AI” (embodiment, situatedness, absence of reasoning, and absence of representation) are all present in conventional systems: “new AI” is just like old AI. Brooks’ proposal boils down to the architectural rejection of central control in intelligent agents—which, however, turns out to be crucial. Some more recent cognitive science suggests that we might do well to dispose of the image of intelligent agents as central representation processors. If this paradigm shift is achieved, Brooks’ proposal for cognition without representation appears promising for full-blown intelligent agents—though not for conscious agents.

By focussing on the intentional character of observation in science, we argue that Constructive Empiricism—B.C. van Fraassen’s much debated and explored view of science—is inconsistent. We then argue there are at least two ways out of our Inconsistency Argument, one of which is easier to square with Constructive Empiricism than the other.

Can we conceive of a mind without body? Does, for example, the idea of the soul's immortality make sense? Certain versions of materialism answer these questions in the negative; I shall try to prove that these versions of materialism cannot be right. They fail because they cannot account for the mental vocabulary from the language of brains in the vat. Envatted expressions such as "I think", "I believe", etc., do not have to be reinterpreted when we translate them to our language; they are semantically stable. By contrast, physical expressions from the vat language are semantically unstable; due to Putnam's externalism they cannot be transported to our language without change. This contrast opens the way to a new understanding of what the immortality of the soul might be like: a brain in a vat (and its mental life) might survive what the brain calls "my physical body's death".

In the context of discussions about the nature of ‘identical particles’ and the status of Leibniz’s Principle of the Identity of Indiscernibles in Quantum Mechanics, a novel kind of physical discernibility has recently been proposed, which we call witness-discernibility. We inquire into how witness-discernibility relates to known kinds of discernibility. Our conclusion will be that for a wide variety of cases, including the intended quantum-mechanical ones, witness-discernibility collapses extensionally to absolute discernibility, that is, to discernibility by properties.

The paper argues that the reference of perceptual demonstratives is fixed in a causal nondescriptive way through the nonconceptual content of perception. That content consists first in spatiotemporal information establishing the existence of a separate persistent object retrieved from a visual scene by the perceptual object segmentation processes that open an object-file for that object. Nonconceptual content also consists in other transducible information, that is, information that is retrieved directly in a bottom-up way from the scene (motion, shape, etc.). The nonconceptual content of the mental states induced when one uses a perceptual demonstrative constitutes the mode of presentation of the perceptual demonstrative that individuates but does not identify the object of perceptual awareness and allows reference to it. On that account, perceptual demonstratives put us in a de re relationship with objects in the world through the non-conceptual information retrieved directly from the objects in the environment.

Pacifists and their opponents disagree not only about moral questions, but most often about factual questions as well. For example, they came to divergent descriptions of the crisis in Kosovo. According to my reconstruction of pacifism, this is not a surprise, because the pacifist, legitimately, looks at the facts in the light of her system of values. Her opponent, in turn, looks at the facts in the light of alternative systems of values, and the quarrel between the two parties about supposedly descriptive matters does not come to an end, as there is no objective reality about the war in question that could settle the issue. If I am right, the pacifist's value-laden way of looking at reality can be reconstructed as obedience to three epistemic imperatives. First, the Epistemic Imperative concerning Human Nature ("Resist demonizing the other side; always try to understand the case from their point of view"). Second, the Epistemic Imperative concerning Non-Violent Alternatives ("Always search for non-violent alternatives to projected military action"). Third, the Epistemic Imperative concerning Uncontrolled Escalation ("Sharpen your attention to uncontrolled, irreversible side effects of military action, particularly to the danger of escalation to another world war"). Objective reality does not decide how far one should go in following these imperatives. Rather, the decision about this is ours—similarly to the scientist who decides to search for a common deep structure behind the chaos of the manifold. So the pacifist's epistemic imperatives can be compared to Kant's regulative principles that are necessary for guiding the scientific scrutiny of reality.

This paper investigates the view that digital hypercomputing is a good reason for rejection or re-interpretation of the Church-Turing thesis. After suggesting that such re-interpretation is historically problematic and often involves attack on a straw man (the ‘maximality thesis’), it discusses proposals for digital hypercomputing with Zeno-machines, i.e. computing machines that compute an infinite number of computing steps in finite time, thus performing supertasks. It argues that effective computing with Zeno-machines falls into a dilemma: either they are specified such that they do not have output states, or they are specified such that they do have output states, but involve contradiction. Repairs through non-effective methods or special rules for semi-decidable problems are sought, but not found. The paper concludes that hypercomputing supertasks are impossible in the actual world and thus no reason for rejection of the Church-Turing thesis in its traditional interpretation.

One of the reasons provided for the shift away from an ontology for physical reality of material objects & properties towards one of physical structures & relations (Ontological Structural Realism: OntSR) is that the quantum-mechanical description of composite physical systems of similar elementary particles entails they are indiscernible. As material objects, they 'wither away', and when they wither away, structures emerge in their stead. We inquire into the question whether recent results establishing the weak discernibility of elementary particles pose a threat for this quantum-mechanical reason for OntSR, because precisely their newly discovered discernibility prevents them from 'withering away'. We argue there is a straightforward manner to consider the recent results as a reason in favour of OntSR rather than against it.

This is a review of the book Cultivating Original Enlightenment: Wŏnhyo's Exposition of the Vajrasamādhi-Sūtra, by Robert E. Buswell, Jr., published by the University of Hawaii Press (2008). This volume, the first to be published in the Collected Works of Wŏnhyo series, contains the translation of a single text by Wŏnhyo, the Kŭmgang Sammaegyŏng Non.

The author endeavours to show two things: first, that Schrödinger's (and Eckart's) demonstration in March (September) 1926 of the equivalence of matrix mechanics, as created by Heisenberg, Born, Jordan and Dirac in 1925, and wave mechanics, as created by Schrödinger in 1926, is not foolproof; and second, that it could not have been foolproof, because at the time matrix mechanics and wave mechanics were neither mathematically nor empirically equivalent. That they were is the Equivalence Myth. In order to make the theories equivalent and to prove this, one has to leave the historical scene of 1926 and wait until 1932, when von Neumann finished his magisterial edifice. During the period 1926–1932 the original families of mathematical structures of matrix mechanics and of wave mechanics were stretched, parts were chopped off and novel structures were added. To Procrustean places we go, where we can demonstrate the mathematical, empirical and ontological equivalence of ‘the final versions of’ matrix mechanics and wave mechanics.

The present paper claims to be a comprehensive analysis of one of the pivotal papers in the history of quantum mechanics: Schrödinger's equivalence paper. Since the analysis is performed from the perspective of Suppes' structural view (‘semantic view’) of physical theories, the present paper can be regarded not only as a morsel of the internal history of quantum mechanics, but also as a morsel of applied philosophy of science. The paper is self-contained and presupposes only basic knowledge of quantum mechanics. For reasons of length, the paper is published in two parts; Part I appeared in the previous issue of this journal. Section 1 contains, besides an introduction, also the paper's five claims and a preview of the arguments supporting these claims; so Part I, Section 1 may serve as a summary of the paper for those readers who are not interested in the detailed arguments.

I have argued that to say qualia are epiphenomenal is to say a world without qualia would be physically identical to a world with qualia. Dan Cavedon-Taylor has offered an alternative interpretation of the commitments of qualia epiphenomenalism, according to which qualia cause beliefs and those beliefs can and do cause changes to the physical world. I argue that neither of these options works for the qualia epiphenomenalist, and thus that the theory faces far more serious difficulties than has previously been recognized.

In this paper we describe some first steps for bringing the framework of branching space-times to bear on quantum information theory. Our main application is quantum error correction. It is shown that branching space-times offers a new perspective on quantum error correction: as a supplement to the orthodox slogan, "fight entanglement with entanglement", we offer the new slogan, "fight indeterminism with indeterminism".

We defend the view that belief is a psychological category against a recent attempt to recast it as a normative one. Tamar Gendler has argued that to properly understand how beliefs function in the regulation and production of action, we need to contrast beliefs with a class of psychological states and processes she calls “aliefs.” We agree with Gendler that affective states as well as habits and instincts deserve more attention than they receive in the contemporary philosophical psychology literature. But we argue that it is a serious error to align beliefs with the norm of rationality, while building a contrasting category whose members are characterized primarily by their failure to measure up to that normative standard, since these latter states cannot constitute a distinct psychological category. First, we demonstrate that Gendler draws unwarranted conclusions about the existence of aliefs from belief-discordant cases. Next, we argue that the concept of alief is insufficiently clear. Aliefs cannot be distinguished from other types of states, such as beliefs. Also, when grouping many states under the category of aliefs, Gendler overlooks important differences between phenomena that are clearly distinct, such as habits and instincts. Aliefs simply do not constitute a legitimate psychological category.

In the six decades since the publication of Julian Huxley's Evolution: The Modern Synthesis, spectacular empirical advances in the biological sciences have been accompanied by equally significant developments within the core theoretical framework of the discipline. As a result, evolutionary theory today includes concepts and even entire new fields that were not part of the foundational structure of the Modern Synthesis. In this volume, sixteen leading evolutionary biologists and philosophers of science survey the conceptual changes that have emerged since Huxley's landmark publication, not only in such traditional domains of evolutionary biology as quantitative genetics and paleontology but also in such new fields of research as genomics and EvoDevo. Most of the contributors to Evolution—The Extended Synthesis accept many of the tenets of the classical framework but want to relax some of its assumptions and introduce significant conceptual augmentations of the basic Modern Synthesis structure—just as the architects of the Modern Synthesis themselves expanded and modulated previous versions of Darwinism. This continuing revision of a theoretical edifice the foundations of which were laid in the middle of the nineteenth century—the reexamination of old ideas, proposals of new ones, and the synthesis of the most suitable—shows us how science works, and how scientists have painstakingly built a solid set of explanations for what Darwin called the "grandeur" of life.

This paper, accessible for a general philosophical audience having only some fleeting acquaintance with set-theory and category-theory, concerns the philosophy of mathematics, specifically the bearing of category-theory on the foundations of mathematics. We argue for six claims. (I) A founding theory for category-theory based on the primitive concept of a set or a class is worthwhile to pursue. (II) The extant set-theoretical founding theories for category-theory are conceptually flawed. (III) The conceptual distinction between a set and a class can be seen to be formally codified in Ackermann's axiomatisation of set-theory. (IV) A slight but significant deductive extension of Ackermann's theory of sets and classes founds Cantorian set-theory as well as category-theory, and therefore can pass as a founding theory of the whole of mathematics. (V) The extended theory does not suffer from the conceptual flaws of the extant set-theoretical founding theories. (VI) The extended theory is not only conceptually but also logically superior to the competing set-theories, because its consistency can be proved on the basis of weaker assumptions than the consistency of the competition.

The term body integrity identity disorder (BIID) describes the extremely rare phenomenon of persons who desire the amputation of one or more healthy limbs or who desire a paralysis. Some of these persons mutilate themselves; others ask surgeons for an amputation or for the transection of their spinal cord. Psychologists and physicians explain this phenomenon in quite different ways; but a successful psychotherapeutic or pharmaceutical therapy is not known. Lobbies of persons suffering from BIID explain the desire for amputation in analogy to the desire of transsexuals for surgical sex reassignment. Medical ethicists discuss the controversy about elective amputations of healthy limbs: on the one hand the principle of autonomy is used to deduce the right for body modifications; on the other hand the autonomy of BIID patients is doubted. Neurological results suggest that BIID is a brain disorder producing a disruption of the body image, for which parallels are known from stroke patients. If BIID were a neuropsychological disturbance, which includes a lack of insight into the illness and a specific lack of autonomy, then amputations would be contraindicated and would have to be evaluated as bodily injuries of mentally disordered patients. Instead of only curing the symptom, a causal therapy should be developed to integrate the alien limb into the body image.

In this introduction we discuss the motivation behind the workshop “Towards a New Epistemology of Mathematics” of which this special issue constitutes the proceedings. We elaborate on historical and empirical aspects of the desired new epistemology, connect it to the public image of mathematics, and give a summary and an introduction to the contributions to this issue.

The metaphor of a branching tree of future possibilities has a number of important philosophical and logical uses. In this paper we trace this metaphor through some of its uses and argue that the metaphor works the same way in physics as in philosophy. We then give an overview of formal systems for branching possibilities, viz., branching time and (briefly) branching space-times. In a next step we describe a number of different notions of possibility, thereby sketching a landscape of possibilities. In the final section of the paper we look at the place of branching-based possibilities in that larger landscape of possibilities. Our main message is that far from being an outlandish metaphysical extravagancy, branching-based possibilities are epistemically as well as metaphysically basic.

In a recent issue of this journal, P.E. Vermaas ([2005]) claims to have demonstrated that standard quantum mechanics is technologically inadequate in that it violates the 'technical functions condition'. We argue that this claim is false because it is based on a 'narrow' interpretation of this technical functions condition that Vermaas can only accept on pain of contradiction. We also argue that if, in order to avoid this contradiction, the technical functions condition is interpreted 'widely' rather than 'narrowly', then Vermaas' argument for his claim collapses. The conclusion is that Vermaas' claim that standard quantum mechanics is technologically inadequate evaporates.

Since the origins of the notion of emergence in attempts to recover the content of vitalistic anti-reductionism without its questionable metaphysics, emergence has been treated in terms of logical properties. This approach was doomed to failure, because logical properties are either sui generis or they are constructions from other logical properties. If the former, they do not explain on their own and are inevitably somewhat arbitrary (the problem with the related concept of supervenience, Collier, 1988a); but if the latter, reducibility is assured, because logical constructs are reducible, by definition, to their logical components. A satisfactory account of emergence must recognise that it is a dynamical, not a logical, property of natural systems, and that its basis is dynamical rather than logical composition. Collier (1988a) introduced the concept of cohesion as the closure of the causal relations among the dynamical parts of a dynamical particular that determine its resistance to external and internal fluctuations that might disrupt its integrity. Cohesion is an equivalence relation that partitions a set of dynamical particulars into unified and distinct entities, providing the identity conditions for such particulars. Cohesion blocks reduction of dynamical particulars, and is necessary for dynamical emergence. We will give reasons for thinking that cohesion might be sufficient for emergence as well.

In its reaction to the terrorist attacks of September 11th, 2001, the US government threatened Afghanistan's Taleban with war in order to force them to extradite terrorist leader Bin Laden; the Taleban said that they would not surrender to this kind of blackmail – and so they were removed from Kabul by means of military force. The rivalling versions of this story depend crucially on notions such as "terrorism" and "blackmail". Obviously, you will gain public support for your preferred version of the story if you are able to determine how those notions are to be used. So we had better reflect on their very meaning and on the moral implications of their proper usage. To gain a deeper understanding of our notions of "blackmail" and "terrorism" I shall propose an extreme thought experiment: Cassandra's plan. Cassandra foresees that sooner or later one of the nuclear powers might take the liberty to use atomic bombs. Out of fright she founds an NGO for blackmailing the statesmen who are in charge of nuclear weapons; she announces in public that all ministers and leaders of any government whose soldiers drop but one atomic bomb shall be hunted down and executed. (Cassandra's NGO keeps killer teams in constant training so as to increase the effect of the threat; this is financed by private donations.) In my paper I shall raise two questions (without claiming to provide definite answers). First, would we have to say that Cassandra's NGO was a terrorist organisation? Second, would it be morally wrong if Cassandra blackmailed statesmen in the way indicated?

On the basis of the Suppes–Sneed structural view of scientific theories, we take a fresh look at the concept of refutability, which was famously proposed by K.R. Popper in 1934 as a criterion for the demarcation of scientific theories from non-scientific ones, e.g., pseudo-scientific and metaphysical theories. By way of an introduction we argue that a clash between Popper and his critics on whether scientific theories are, in fact, refutable can be partly explained by the fact that Popper and his critics ascribed different meanings to the term theory. Then we narrow our attention to one particular theory, namely quantum mechanics, in order to elucidate the general matters discussed. We prove that quantum mechanics is irrefutable in a rather straightforward sense, but argue that it is refutable in a more sophisticated sense, which incorporates some observations obtained by looking closely at the practice of physics. We shall locate exactly where non-rigorous elements enter the evaluation of a scientific theory – this makes us see clearly how fruitful mathematics is for the philosophy of science.