Thursday, December 29, 2011

A key part of the argument for causal dispositionalism in
Mumford and Anjum’s Getting Causes From Powers is the case against causal necessitarianism (chapter 3 in the
book). Causality is commonly thought to
imply a necessary connection between cause and effect: the authors say this is a mistake, and that
the proper modality of causation is dispositional. Causes dispose toward their effects - they
don’t guarantee them. The insight here
is that other factors can prevent or interfere with the expected manifestation
(and in everyday experience, they often do). In fact, such prevention is always possible
in causal situations, and if one moves to evade this fact by stipulating that
prevention or interference is impossible, then the resulting necessity is not
really coming from the causal process itself, but is being imposed in another
way.

To see this, suppose I specify the causal factors involved
in some manifested effect, and then someone points out another factor which
could possibly interfere (despite my match being dry, a proper striking motion
made and sufficient oxygen being present, a gust of wind might prevent the
match from lighting). Can’t I modify my
scenario to specify that the threatening factor is absent (the wind is
calm)? Leaving aside the potential
problem of listing an absence as a causal factor, the objector might present another
possible interferer (a passing car might splash water on the match as it is
being struck). So, then I, in turn, specify
that there is no nearby traffic, and so on.
In fact, no finite list of factors will ever suffice to rule out every
interferer (however unlikely). And by
the time one is led to propose a “catch-all” condition, covering the whole
state of the universe, we’re really not talking about a process of causal
production anymore.

The authors note something interesting here. They say that their argument against causal
necessitarianism does not mean they are ruling out determinism. This was a
helpful observation for me because I have been guilty of confusion on this
point. One might think “determinism”
means “causal determinism” which means “causal necessitarianism”. However, determinism can be specified in
other ways (including what might be the most common conception – see
below). Then causal dispositionalism
could be compatible with determinism. There
is a causal process, and while it doesn’t necessitate effects, necessity is
imposed in another way.

Note that, for the moment, we are leaving aside the idea of irreducibly probabilistic causation. Such causation is likely a feature of our world (in fact, I think the a posteriori case for it is nearly airtight), and if so, determinism is false. But disentangling these ideas remains philosophically valuable.

As Mumford and Anjum say:
“The core idea in determinism is the fixity of the future by the past” (p. 75). If one wanted to build a model of a deterministic world, causal
necessitarianism is probably not the best tool, since the causal process doesn’t
promise to cover all the possible loopholes - for instance if there are such
things as uncaused events, then they would not be addressed.

It seems to me that the most common notion of determinism (probably
inspired by classical mechanics) is this:
given a specification of all facts at a time, and comprehensive deterministic laws of nature, the future is fixed. There is no reason here to even mention causation – it adds nothing to the scenario. One could be a Humean about causation and
still endorse the deterministic picture.
And given the fact that in this physics-inspired vision the mathematical
depiction of laws is symmetric with regard to time, it would be equally true to
say that the past is fixed by the specification. This is inconsistent with causation, which is
not a symmetric process. One might
believe that the mathematically specified physical laws comprise a model of a
causal world, but the laws themselves don’t constitute a theory of causation,
and may very well be inconsistent with the idea of a causal process.
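The physics-inspired picture just described can be put schematically. This is my own sketch, not a formula from the book: here $S_w(t)$ is hypothetical notation for the complete state of world $w$ at time $t$, and the quantified worlds are assumed to share the same deterministic laws.

```latex
% Determinism as fixity of the future by the past:
\forall w_1, w_2:\quad
  S_{w_1}(t) = S_{w_2}(t)
  \;\Longrightarrow\;
  S_{w_1}(t') = S_{w_2}(t') \quad \text{for all } t' > t
% If the laws are time-symmetric, the same implication holds for t' < t;
% the past is fixed by the present state just as the future is.
% Note that nothing in this schema mentions causation.
```

The time-symmetry point in the text falls out immediately: the schema is indifferent to the direction of $t'$, which is exactly why it sits so uneasily with an asymmetric causal process.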

On this last point, I recalled a paper I had read a few
years ago by Carl Hoefer: “Causality and Determinism: Tension, or Outright Conflict?” In this paper, Hoefer defines
a deterministic world specifically as one governed by deterministic
micro-physical laws, and then goes on to argue that this definition is
inconsistent with the presence of causation, using several philosophical
theories from the literature as examples of how causation might be characterized.

Tuesday, December 06, 2011

Presently I’m reading Getting Causes From Powers by Stephen Mumford and Rani Lill Anjum (and have
finished six chapters out of ten). I
expect to blog more about this book, of which I think very highly. I just wanted to very briefly comment on
events, inspired by the treatment they are getting so far in the book.

Years ago, influenced by reading the later Russell and Whitehead, I acquired the notion that (all else equal) there is an attraction to an ontology which gives a leading role to events rather than one primarily featuring substances (or objects) and their properties. There seemed to be more potential for explaining the dynamic aspects of nature (including mind).

But while there has been an active modern debate on the nature
of events, the most common depictions don’t seem to offer specific advantages
to an event-focused ontology. To greatly
simplify, it seems philosophers would model events either as property exemplifications,
in which case they are in danger of seeming much like static facts or states of
affairs; or else events would be associated with spacetime locations, in which
case they are little distinguished from objects, which are the quintessential
occupiers of spacetime. (The SEP article
on events is here; an IEP article with additional focus on the theories of Kim,
Davidson, and Lewis is here). These
sorts of models of events don’t seem to bring differentiated resources to metaphysical
theorizing.

The goal of the Getting
Causes From Powers book is to develop a theory of causation based on
dispositional properties, or powers. While
powers play the leading role, their theory incorporates an intriguing view of events
(at least causal events: they don’t take a position on whether there
are other sorts). Specifically, causal
events, which are manifestations of powers, are temporally extended processes.
The authors reject as misguided the typical “two-event” conception of
causation, where cause is temporally prior to effect, in part because no one
has a compelling account of how you get from one to the other. Rather, causes and effects are simultaneous –
they are two aspects of a process
which brings about a change. Very Whiteheadian!

Monday, November 14, 2011

I’ve just been starting to read and think more about
essences, in particular the debate which has followed Kit Fine’s argument in
“Essence and Modality” (1993) that essences cannot be understood in modal
terms. The modal understanding is that
an essential property of an object is one it must have of necessity (in order
to be that object), while properties it can (possibly) do without are
accidental. Fine, on his way to
advocating a definitional notion of essence, said the modal understanding was
too broad: an object may have certain
necessary attributes which are intuitively not essential. In the paper, he offered some examples intended to bolster this point.
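The modal understanding Fine targets can be stated compactly. This is a standard formulation from the literature, not a quotation from Fine's paper:

```latex
% Modal account of essence:
F \text{ is essential to } x
  \;\iff\;
  \Box\,\bigl(x \text{ exists} \rightarrow Fx\bigr)
% One of Fine's best-known examples: necessarily, if Socrates exists,
% then Socrates is a member of the singleton set {Socrates}; yet
% membership in that set seems no part of what Socrates essentially is.
```

The right-hand side is satisfied by any property an object has in every world in which it exists, which is exactly what lets intuitively non-essential necessities slip through.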

Now I’ve read a few papers which take issue with Fine’s
argument. In particular, one line of protest, due to Michael Della Rocca, notes
that Fine’s examples are of necessities which seem trivially true of all
objects or existents, and the modal understanding of essence can be recast by a
focus on non-trivial necessary properties.

For present purposes, though, I want to concede that there
is an intuitive sense that essence seems prior to its modal understanding (even
if the latter turns out to be extensionally equivalent): when I try to think of what properties are
necessary to an object I seem to be appealing to some non-modal definition I
have in mind.

But do we really want to add essences as irreducible
elements in our ontology? While I’m attracted to some Aristotelian or
neo-Aristotelian notions (such as causal powers), my inner Occam wants to
resist essences.

One clue to a way to think about this dilemma occurred to me
while reading Michael Gorman’s paper, “The Essential and the Accidental”. Gorman highlights one of the passages about
the nature of essential properties discussed by Fine in another paper (“Senses
of Essence”, which I have not read):
“An essential property of an object is a constitutive part of the
essence of that object if it is not had in virtue of being a consequence of
some more basic essential properties of the object; and otherwise it is a
consequential part of the essence.”
Perhaps this constitutive subset should be the real target of our idea
of essence.

While Fine’s idea of distinguishing consequential properties
from constitutive properties is one of logical consequence, Gorman takes this
as an inspiration to develop an account of essence that depends on the notion
of explanation. Perhaps the essential properties of an object are those which
cannot be explained by appeal to other characteristics (while accidental
properties are those that can be so explained).
The paper elucidates this argument and considers possible
objections. This view is distinct from
the modal understanding because it can accommodate necessary but non-essential
properties.

For myself, being in a very preliminary stage of studying
these issues, I reserve my opinion about Gorman’s particular strategy, but am
led to a desire to link essences to some other metaphysical problem, such as
causation (which is obviously related to explanation). By the time we conceive of an object, we
already have in mind something which has been caused and has its own causal
powers. And we already know (I believe)
that modality alone, say a mosaic of categorical properties distributed across
possible worlds as in David Lewis, doesn’t provide a theory of causation. So, it shouldn’t be a surprise if there is a
problem with defining essence solely in modal terms if essence relates to
causation. I’ll try to see what’s been
written along these lines.

UPDATE, 20 November 2011: Kathrin Koslicki has posted a preprint of a chapter for a forthcoming book called "Essence, necessity, and explanation" which fleshes out a discussion similar to Gorman's. She uses an analysis of Aristotelian notions of explanation, including cause, to account for how necessary but not essential properties follow from constitutive essential ones. However, the scheme here is that essence is basic and prior to cause (I was wondering if there is a way to reverse this priority).

Tuesday, October 18, 2011

Coming up very soon is the next event on this year’s GPPC program: our Public Issues Forum. The topic this year is Philosophy for Children. The date is Saturday October 29th, 1pm, at the University of Pennsylvania. The event is free and open to the public.

Monday, October 10, 2011

I’ve been interested in the metaphysics of dispositional properties (or powers), and I’ve ordered Getting Causes from Powers, a new book from Stephen Mumford and Rani Lill Anjum. I look forward to reading this later in the fall, but in the meantime I have read a couple of related papers by the authors (see an earlier post here).

In “Dispositional Modality” Anjum and Mumford argue that the modal value of dispositions is distinct from necessity and possibility: it is described as “sui generis” and “irreducible”. What I thought was interesting, though, is that the authors themselves include a nice account of dispositional modality in terms of restricted possibility, which seemed to me to have the flavor of a reduction.

I’ll pass over the first several sections of the paper, which covered some familiar ground: the fundamental disagreement with Hume about powers; the failure of the semantic reduction of dispositional ascriptions to conditionals; and the fact that dispositions, by their nature, clearly do not necessitate their manifestations (they are disposed toward, or tend toward their manifestations).

So, dispositional modality is distinct from necessity, but what about possibility? In section 5, Anjum and Mumford argue that dispositional modality is also distinct from possibility, in the sense that it is different from “pure” possibility. By pure possibility, the authors mean the broadest sort of logical or metaphysical possibility. A glass vase is disposed to shatter when dropped. One might suppose it is logically possible that the vase will turn to jelly when dropped, but it is clearly not disposed to do so – at least in our world.

The authors say we might think of dispositional modality as a subclass or restricted version of possibility. But this is already a familiar idea. Modal theorists talk about various nested classes of possibility: logical possibility (typically the broadest notion), various formulations of metaphysical possibility, and nomological possibility -- the subclass which holds the laws of nature fixed (some philosophers would identify one or more of these). Anjum and Mumford see dispositions as giving rise to “natural” possibility: “The reason some things are naturally possible is because there are dispositions for them.”
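The nesting just described might be pictured as follows. This is my schematic reading, not the authors' own notation; each class is taken to be the set of possibilities of the corresponding kind, narrowest to broadest:

```latex
% Nested grades of possibility:
\mathrm{Poss}_{\text{natural}}
  \;\subseteq\; \mathrm{Poss}_{\text{nomological}}
  \;\subseteq\; \mathrm{Poss}_{\text{metaphysical}}
  \;\subseteq\; \mathrm{Poss}_{\text{logical}}
% On Anjum and Mumford's view, membership in the narrowest class is
% grounded in the dispositions actually present in the world:
% the vase's shattering is naturally possible; its turning to jelly
% is (at most) logically possible.
```

Put this way, the question of the next paragraphs is whether the leftmost class is just one more restriction in a familiar series, or something modally sui generis.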

The authors don’t elaborate greatly on this point, but it seems as if natural possibility is the set of all the possibilities inherent in the dispositions contained in the state of the natural world at a given time. Building on this idea: when it comes to individual natural objects or systems, in saying they possess dispositional properties (or powers), we might equally well say they possess a certain restricted bundle of possibilities. Then we might turn to a discussion of how causation is explained in terms of the “actualization” of some of these possibilities (compare the “manifestation” of a disposition), perhaps depending on how a system interacts with other systems which similarly bear possibilities.

If one can retell the story of dispositions with restricted sets of possibilities, might this be a reductive analysis? It’s not completely clear to me, because one might say the introduction of restricted sets of possibility of the type needed is an irreducible extension of our notions of modality. And perhaps the idea of an object bearing a set of possibilities, rather than bearing properties, isn’t coherent (but maybe the terminology can be worked out). In any case, this paper’s comparison of dispositional modality to the idea of restricted possibility was very thought-provoking.

Tuesday, September 20, 2011

One thing that has frustrated me in the past is the fact that folks tend to think “indeterministic” means “just random”, where by random they mean some stochastic process (like a dice roll) in which one can’t predict which outcome will be chosen from some probability distribution. Quantum indeterminism doesn’t work this way, but it’s a difficult subject and experts don’t agree on exactly how to characterize it. It seems clear one cannot simply use a “frequency” interpretation, the way one can with a classical stochastic system. There seems to be something more involved, something spontaneous which resists reduction, but I have a hard time being more precise about this.

Computation theorist Scott Aaronson (home page, blog) recently gave a presentation on free will (at an FQXi conference) which was very thought-provoking (see this Sciam piece with helpful links), and had an interesting take on this issue.

Wednesday, September 14, 2011

The GPPC website has the updated program information for 2011-2012. The site also has other news, including this year's discussion groups and other lectures at the member schools which are open to the public. I'm also anticipating that, like last year, there will be further GPPC-sponsored events added to the calendar as we move forward.

Friday, September 09, 2011

Aquinas follows Aristotle in utilizing the interplay of potentiality and actuality to explain substance and change. And the idea that the actual is prior to the potential, which I got a bit hung up on when reading Edward Feser’s book Aquinas, is from Aristotle as well.

First, actuality is prior in the sense of logos (account or definition), because we cite the actuality or actualities in describing a potential (something is fragile because it is capable of being broken). I think this is a good point, as it relates to everyday examples we can describe. However, if you think there is novelty in the world, then latent potentials exist which we cannot so define. A novel actuality may very well be described after the fact in terms of potentials which were previously unknown.

Next, Aristotle also views the actual as prior in a temporal sense: while an acorn’s potential to be a tree is prior to its actually becoming one, actual adult trees had to exist beforehand. This seems like a chicken and egg situation.

Finally, Aristotle argues that the actual is prior “in substance” because the actuality is the end or telos, and the potentiality exists for the sake of the end – actuality is the final cause of the potential.

An added argument is that Aristotle looks at the bigger picture and sees that potentials may or may not be fulfilled, and are therefore perishable. On the other hand, something eternal would have to be imperishable, hence actual. Since the eternal can exist without the perishable, but not conversely, this is another way to see that the actual is prior in substance. Just to be devil’s advocate here, though, I can easily conceive of eternal potentials, if potentiality is truly a mode of being.

These points about the priority and eternal nature of actuality lead us into the territory of the unmoved mover being seen as pure Act, utilized by Aquinas in arguments for God. I have been entertaining the different idea that if there is an ultimate being it should encompass both (infinite) potential and the power to act. But we’ll see how this holds up with further reading.

Wednesday, August 24, 2011

I recently read Aquinas, by Edward Feser (home page, blog). I would recommend the book; it is an excellent introduction to the thought of Aquinas (it deals with his philosophy – it is not a biography of his life and times, nor does it cover all the theology). It is very accessible to the non-expert, but is best suited for those with some background knowledge of philosophy. In about 200 well-written pages, Feser both presents and advocates for Thomist positions through four chapters devoted respectively to metaphysics, natural theology, psychology, and ethics.

I think Feser’s greatest success is in his arguments for a re-consideration of Aquinas’ Aristotelian metaphysical ideas, especially with regard to causation, but also with regard to an ontology of potency and act, and hylomorphic (form/matter) dualism.

My main criticism is that, while Feser’s assumed role as Aquinas’ champion is usually a benefit to the reader, since Aquinas is presented in the most sympathetic light, he is inclined to insist that all of Aquinas’ ideas are equally meritorious. In some cases this leads him to present arguments which seem to go beyond what would have occurred to Thomas himself.
But, with plenty of references for further reading, Feser has given the reader a roadmap for further study following on from his fine introduction.

Below are somewhat scattershot notes and comments I made while reading the book. To briefly summarize my own views: I'm attracted to some of the metaphysical elements of Aquinas/Aristotle as they relate to causation and mind, and I'm even sympathetic to some of the cosmological arguments. On the other hand, I was unconvinced by significant parts of the Thomist package, including arguments by analogy for some of the divine attributes, God's nature as pure act and his separateness from matter, and the special nature of the human soul.

Sunday, July 31, 2011

William Seager, of the University of Toronto, has written a number of interesting papers over the years on the mind, with panpsychism and emergence/reduction included as frequent topics. I’m grateful for his contributions on panpsychism, which remains a neglected option in philosophy of mind.

Monday, July 18, 2011

The picture of our universe as a machine governed by deterministic laws is hard to shake. Physics has profoundly undermined this vision of course, but even before this was known, it is a bit surprising that philosophers and scientists were inspired by everyday macroscopic experience to form such a conception. Was Hume so talented at billiards that he experienced constant conjunction? I can’t come close! Certainly to build and maintain a machine which achieves any determined outcome with regularity takes a lot of human effort.

This last point was made by philosopher John Dupré in a 1995 paper called “A Solution to the Problem of the Freedom of the Will” (hat tip: tweet by Rani Lill Anjum). This paper has a number of thought-provoking insights: contrary to the title, it doesn’t put forth a full theory of free will, but argues that the way the world works makes human autonomy unsurprising. (Dupré, a philosopher of biology among other research interests, took part in an interesting debate regarding reduction and emergence on Philosophy TV here). Most assume the world is governed by global microphysical laws, such that autonomy would require an exception to these laws. Dupré argues that we actually have no reason to think the world is governed by such globally applicable laws. Using his terminology, the world is far from causally complete.

Thursday, June 30, 2011

How reliable is knowledge gained through introspection? On the one hand, we have been learning that knowledge gained through introspection is surprisingly fallible. A philosopher who has worked on bringing this fact to light is Eric Schwitzgebel. In this paper, among others, he has argued that, based on the evidence, knowledge gained from introspection is less reliable than knowledge gained in other ways. He now has a book out called Perplexities of Consciousness which looks worth reading -- see reviews here and here.

On the other hand, perhaps there is still something special about knowledge gained via introspection. Schwitzgebel participated in a debate on introspection with Brie Gertler last fall on Philosophy TV. In a draft of a forthcoming paper called "Renewed Acquaintance" (Word doc), Gertler defends a view called the acquaintance approach to introspective knowledge of phenomenal qualities (a book is forthcoming titled Self-Knowledge).

Tuesday, June 21, 2011

I'm helping to organize this GPPC forum on philosophy for children coming up on October 29. I've gotten interested in this topic over the last year, and have talked to a number of smart and passionate people who are working in this field. Please check out the preliminary information below and forward to anyone you know who might be interested. This event will be free and open to the public. For updates, check the GPPC website or contact me with any questions. Thanks, and I hope to see you there!

Friday, June 17, 2011

The Russellian view of the mind-body problem explains why experience has qualitative content: the natural world has qualities, and this fact poses no conflict with physics, because physics offers only a formal description of nature’s causal regularities. Our participation in the world acquaints us with these qualities (even if our knowledge is fallible about details). There is a further question, however, beyond this issue of “raw” qualitative content. Recently, in interesting comments on this old post, dnn8350 posed the question of why our experience features macroscopic objects/events and not just a flux of the micro-level entities which are fundamental in physics.*

Now, separately from philosophy of mind, metaphysicians also debate problems concerning the composition of objects (mereology). Perhaps the problem of composition and the problem of why minds feature macroscopic clumps are related. Uriah Kriegel has a couple of papers which connect with these issues: he has a 2008 paper called “Composition as a Secondary Quality”, and now a draft of a forthcoming paper called “Kantian Monism”. The first paper addresses things from the point of view of ontological pluralism, and presents an argument (very roughly) that objects compose a larger composite object if it is the case that a subject would judge it to be so (I’m obviously glossing over many important details and conditions specified by Kriegel). In the new draft paper, Kriegel explores this view from the perspective of monism, and presents the case that the world decomposes into parts just in case it would appear that way to a subject in the world.

If mind-world interaction is responsible for the composition of objects (or decomposition of the world into parts), the task remains of filling in how this works, but perhaps there is a sense that we have combined two problems into one.

*I guess this question may also be related to the so-called binding problem of consciousness – that is the question of how our experience unifies various disparate sensory inputs – but I’m not sure.

Tuesday, May 31, 2011

Sean Carroll had an interesting post at the Cosmic Variance blog. The post discusses the idea, outlined in a couple of recent papers, of finding a concordance between the multiverses which exist according to some speculative cosmological models and the many-worlds interpretation of quantum mechanics. Carroll provides a sketch of his own thoughts about how this might work. (The referenced papers are by Nomura and Bousso & Susskind).

I have some thoughts about this broad question, but for now I want to highlight one key notion utilized in the discussion, which is that of “horizon complementarity”.

I was familiar with the holographic principle, which says roughly that the information about what is inside a region of space-time can be encoded on the surface boundary of the region. This idea developed from the study of black holes, where it was earlier theorized that black hole entropy was proportional to the area of its event horizon. Horizon complementarity is likewise an extension of another idea which was developed in the study of black hole entropy/information paradox. Here’s a lengthy excerpt from Carroll (who is skilled in explaining difficult topics to a general audience - see the original for embedded links):

Tuesday, May 17, 2011

I’ve been very interested in the search for a theory of quantum gravity. General Relativity and Quantum theory, the twin crowning achievements of twentieth century physics, are not compatible, and the hunt has been on for a successor theory which would underlie or reconcile the two.

Approaches include trying to extend or modify the quantum field theory programs which were so successful for building models of particle interactions and forces, but which failed to accommodate gravity (superstring theory falls broadly into this category). Alternatively, some researchers have explored approaches which feature some conceptual rethinking of the issues involved. I’ve been intrigued by recent research programs which posit that the space-time geometry of GR emerges from a more fundamental background theory, such as a dimensionless quantum causal framework of some kind.

In thinking about the conceptual, rather than technical issues involved, it is worth reflecting on the fact that there may be a basic conflict between ordinary quantum mechanics and relativity, which predates the issues of reconciling quantum field theories and Einstein’s theory of gravity.

Seevinck argues that non-locality is simply not consistent with the local causal structure inherent in SR. Now, there is a weaker sense in which one might argue the theories are compatible: while the nonlocal correlations which arise in entangled systems are themselves well demonstrated empirically, we have been unable to utilize these phenomena to create an experimental conflict with SR such as superluminal signaling. Theoreticians also have characterized no-signaling as an essential part of quantum theory, developing no-signaling theorems.

Seevinck is critical of no-signaling theorems, saying they are either circular or else serve as consistency proofs (QM and no-signaling can, rather than must, be compatible). In the case of some theorems derived from QFT it seems clear why they might be circular: quantum field theory obscures the issue at hand because it is formulated against a backdrop of SR – so compatibility is enhanced by construction.

But the compatibility between QM and SR is not all about no-signaling. It can be argued that the spirit of SR is a geometric causal structure of space-time, and there can be no story of nonlocal correlations arising causally in this structure.

Seevinck briefly reviews several general approaches to resolving the conflict through interpretation or modification of the theories, without endorsing one. Part of his discussion references the ideas of Nicolas Gisin. In several papers, Gisin has also argued the case for incompatibility. He has been critical of the traditional discussion of Bell’s results which describe it as presenting a choice to reject either locality or “realism”. He finds no sensible "irrealism" option which gives a reason to reject the conclusion of nonlocality (see brief Gisin papers here and here).

Gisin’s view is that we must accept that nonlocal correlations can’t originate in space-time, and he ponders the alternative, which is that they must emerge from “outside” space-time. What this means needs to be fleshed out, but it seems compatible with the idea that space-time geometry is an emergent rather than fundamental aspect of nature.

Tuesday, May 03, 2011

There was a conference at St. Louis University last weekend on causal powers (Putting Powers to Work). The program looked excellent. I note that the John Templeton Foundation was a sponsor. IMO this is an excellent use of their funds, given how they describe the portion of their mission which is devoted to "Science and the Big Questions". One subset of this funding area is called "Philosophy and Theology" (other areas relate to direct science grants, which are obviously welcome, and also the promotion of "dialogue" between science and "theology and/or philosophy").

The reason this is notable is that, until recently, the Foundation's materials and grant record struck me as marked by a neglect of philosophy in (often quixotic-seeming) pursuit of dialogue between science and religion. Three years ago, I wrote a letter to the foundation about this, and got a polite reply acknowledging the point but saying it was in keeping with the founder's vision.

Since then, the Foundation has been revamping its organization and programs fairly extensively, and philosophy has been getting more into the mix -- notable was a recent significant grant for the study of free will. Hopefully, they will continue to fund pure philosophical research, particularly in the areas of metaphysics which are indispensable IMO if one wants to pursue answers to "Big Questions". Looking at the free will program and the inclusion of a philosophy of religion talk at the powers conference, it appears Templeton will likely insist on some PoR or theology aspect to these programs, but I would think that needn't be a problem as long as it's not heavy-handed.

Friday, April 15, 2011

This is a brief follow up to the discussion at the end of the last post. The problem of being for pure powers is the challenge of explaining their nature or ontological grounding when not manifested (if they are not grounded by other properties). I had been exploring the model that powers have the ontological character of quantum systems between measurements (=manifestations). And I think the best way to evaluate the being of quantum systems is as truly existing but non-concrete propensities (while manifestation events=concreta). I hesitate to use the term “abstract” to describe this sort of being, since that has come to imply “causally irrelevant”, and powers are anything but – indeed they underwrite causality in nature. But I do think we need to accept that there are indeed two modes of being.

I draw connections to two papers recently discussed.

I think William A. Bauer’s notion that pure powers are self-grounding due to ongoing minimally sufficient manifestations can connect with the above view in an interesting way. Our analysis of powers typically begins with paradigm cases involving everyday macroscopic objects (like the fragility of a vase). Well, the strange ontological status of quantum systems is only apparent when very small systems are considered in isolation. In everyday situations larger systems do have ongoing minimal interactions with the environment – leading to the kind of apparent sustained being that Bauer envisions. So while his explanation of the self-grounding of powers is not ultimately correct, it captures an approximate truth about the ongoing manifestation of powers in the rough and tumble world.

Finally, I would note another interesting connection between the view I’ve sketched, lining up powers with quantum systems, and the Neil E. Williams paper I discussed in my post "Power Holism". Williams wondered how multiple powers had the capability of fitting together to produce mutual manifestations, arguing that one may need to appeal to a kind of holism to explain this. I would just quickly note that the ability of quantum systems to become correlated through non-local connections and entanglement provides a model of holism which may address his concerns.

Monday, April 11, 2011

Here are some brief thoughts on papers I read recently on the metaphysics of powers/dispositions.

I thought “A Powerful Theory of Causation” by Rani Lill Anjum and Stephen Mumford was an important paper. The authors set out to show that powers have the right degree of “modal strength” to support a theory of causation. Powers have long been thought to necessitate their manifestations, but necessity (with its sense of constant conjunction) is too strong a modality to describe causation. Indeed, the notion of metaphysically necessary connections in nature has long supplied a basis for arguing against powers and associated theories of causation. Anjum and Mumford say instead that powers (dispositions) “dispose” toward their manifestations, but don’t necessitate them.

One way of seeing that powers fall short of necessity is to note that when placed in a context, a disposition can be enhanced or, importantly, hindered by other powers. The authors use a vector addition model as a heuristic to see how this works. Only when the sum of vectors (with various strengths and directions) exceeds some threshold do we get the manifestation.
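The vector-addition heuristic can be illustrated with a toy calculation. To be clear, the numbers, the function name, and the threshold value below are my own illustrative assumptions, not values from Anjum and Mumford's paper; the sketch just shows how enhancing and hindering powers sum toward (or away from) a threshold.

```python
# Toy sketch of the vector-addition heuristic for powers.
# Each power contributes a signed strength toward (+) or away from (-)
# a manifestation; the effect occurs only if the sum crosses a threshold.
# All specific values here are illustrative assumptions.

def manifests(power_strengths, threshold):
    """Return True if the summed dispositional 'vectors' reach the threshold."""
    return sum(power_strengths) >= threshold

# A dry match, a proper striking motion, and sufficient oxygen
# dispose toward lighting...
powers = [0.5, 0.4, 0.3]
print(manifests(powers, threshold=1.0))  # True: the match lights

# ...but an interfering power (a gust of wind) can subtract enough
# to prevent the manifestation.
powers_with_wind = powers + [-0.6]
print(manifests(powers_with_wind, threshold=1.0))  # False: prevented
```

The point of the model survives the simplification: no listing of positive contributors guarantees the effect, since a further negatively directed power can always push the sum back below the threshold.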

In later sections of the paper the authors deal with various potential objections and place their theory in a historical context of the difficulties faced by causal models, showing again that the unwarranted assumption of necessitation was the key stumbling block.

While the vector model was interesting, my favorite section of the paper (section 5) deals with explaining probabilistic causation. Here Anjum and Mumford endorse a propensity (propensity=probabilistic power) interpretation for a single disposition. I myself think this is the key to understanding how powers can have the right modal strength “all the way down”; it also has the virtue of fitting with our best physical theory of how the actual world works (quantum mechanics).
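The propensity idea can also be put in miniature. On this reading, a single power confers a probability on its manifestation rather than necessitating it; the 0.7 propensity value and function names below are my own illustrative assumptions, not anything from the paper.

```python
import random

# Sketch of a propensity (probabilistic power) interpretation:
# the power manifests on any given trial only with some probability,
# so no single manifestation is necessitated.
# The propensity value 0.7 is an illustrative assumption.

def manifestation(propensity, rng):
    """One trial: the power manifests with probability 'propensity'."""
    return rng.random() < propensity

rng = random.Random(42)  # fixed seed for reproducibility
trials = [manifestation(0.7, rng) for _ in range(10_000)]
frequency = sum(trials) / len(trials)

# Over many trials the observed frequency approximates the propensity,
# even though each individual trial could have gone the other way.
print(frequency)
```

This is the sense in which, as I see it, the modal strength of powers can be "less than necessity all the way down": the statistics are lawlike while each individual manifestation remains unnecessitated, just as in quantum mechanics.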

I eagerly look forward to a forthcoming book from Mumford and Anjum called Getting Causes from Powers – this will elaborate upon the theory in greater detail. Also note that a podcast and slides from a recent talk by Stephen Mumford from the PhilSci forum at UMB (Norwegian University of Life Sciences) are available here (scroll down for previous talks). It is a very nice introduction to powers, and focuses on contrasting a powers approach with a laws-based theory of causation.

Some quick takes on other papers.

Michael Esfeld argues that a metaphysics of powers has an advantage in terms of compatibility with physics in his paper “Humean metaphysics versus a metaphysics of powers.” In the paper, Esfeld summarizes the difference between a Humean approach and the powers approach; in contrast to Anjum and Mumford, he characterizes powers as having necessary connections with their manifestations. He does, however, go on to describe the option of treating powers as propensities to explain probabilistic causation. His main point in the paper is that while physics can be compatible with more than one metaphysical picture, the powers model is the best fit given the commitment of physics (and other sciences) to describing dispositional and functional properties.

William A. Bauer takes up the issue of the ontological grounding of powers in "Four theories of pure dispositions". Can they exist without depending on categorical properties for their being? What is the nature of powers when they aren’t manifesting if they lack such grounding? He reviews several approaches to this problem, and concludes that the best model is one where they are self-grounding via a continuous low-level manifestation (which is distinct from their more pronounced, distinguishing potential manifestation).

This was a thought provoking paper, which prompted me to go back and review others on this topic, including Mumford’s “The ungrounded argument”, Neil E. Williams’s response “The ungrounded argument is unfounded”, and Stathis Psillos’s 2006 paper “What do powers do when they are not manifested?”. My quick two cents on this issue is inspired by QM and the idea of powers as propensities: I think powers have a real-but-not-concrete status akin to possibilities in a framework of modal realism. Unlike a static notion of possibilia, however, propensities causally impact our world through their disposition toward actual manifestation events. This is again consistent with QM: between measurement events, quantum systems don’t have concrete existence, but they certainly exist in a causally relevant way, as their influence on events is apparent and measurable.

Monday, March 14, 2011

{Note: this is a draft of some work that I might develop further with add'l research at some point. Comments or suggestions are welcome.}

I’m gratified that the position in Philosophy of Mind known as Russellian Monism (also known as Russellian theory of mind and probably the best developed account of neutral monism) has gotten more attention in recent years. However, the terminology typically used to describe the position today is different from Bertrand Russell’s, as presented in his 1927 work, The Analysis of Matter. This post discusses some of the issues involved, and briefly looks at how some stances in contemporary debates would fit with the original account.

In a recent post on the Brains blog, Richard Brown (referencing an online discussion he had with David Chalmers) said: “RM [Russellian Monism] is the view that the dispositional properties talked about by physics have as their categorical base phenomenal or protophenomenal properties.” While descriptions vary, the reference to dispositional and categorical properties is common. In his book, Ignorance and Imagination, Daniel Stoljar says the position is a combination of two theses. First: “…that physical theory tells us only about dispositional properties.” And: “The second thesis we need to consider is that the dispositional properties of physical objects do require categorical grounds; that is, for all dispositional properties, there must be a non-dispositional property... (p.110)”

Now, Russell never discusses properties in these terms at all, and certainly not dispositional or categorical properties specifically! These are terms which have emerged in the more recent debates of analytic philosophy. So, how well is the intent of RM captured by this terminology? (Please again note I’m only speaking of Russell’s work in The Analysis of Matter.)

Tuesday, March 01, 2011

Why would God, assumed to be omniscient, omnipotent, and omnibenevolent, create a world which is suffused with gratuitous suffering? There are many responses given by theists, but I’ve thought the most persuasive one was an appeal to a theistic multiverse. I was reminded of this strategy by reading Bradley Monton’s draft paper titled “Against Multiverse Theodicies” (warning – Word document). It was a helpful paper to review because in it he describes various approaches that have been taken in the literature, on his way to formulating an argument against them. There are many variants, but a typical version of the theodicy says that God maximizes total value by creating infinite universes, not just the one we observe, and all we need to accept about our own world is that it is minimally worth creating by the deity -- perhaps the good it embodies just barely outweighs the bad. Then we can imagine that the countless superior worlds of which we can conceive also exist. Terrible worlds unrelieved by sufficient good would not be created.

Now, Monton’s paper argues the strategy doesn’t work. The key to his argument has to do with God’s ability to create duplicate and near-duplicate universes (without end). To greatly simplify: he says that instead of creating a world with a given amount of suffering, God could create duplicates of better worlds and thereby produce more aggregate good. (A counterargument, he says, would have to involve a successful defense of Leibniz’s principle of the Identity of Indiscernibles.)

I recommend the paper, although I’m not going to engage his argument here. Rather I bring this up because, while I’m not a theist, the issues raised in this discussion bear on concerns I have about my own views.

Tuesday, February 22, 2011

The latest FQXi Essay Contest – “Is Reality Digital or Analog?” – attracted a large number of submissions. As in past contests, there will likely be some insightful “diamonds in the rough”. I’ll be looking for these.

My own view is that the right answer is ‘both’, and that the two processes of quantum mechanics give us a clue to this. I would say that concrete reality is discrete (so “digital”), since it consists of a network of distinct measurement events (I think space-time is not fundamental, but emerges from the distribution of events). But events are actualized possibilities. So, reality also includes possibilities or propensities (like quantum systems between measurements), and it appears that these have a continuous (or analog) nature.
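The 'both' answer can be sketched in miniature: a quantum state is characterized by continuous parameters, yet each measurement actualizes exactly one discrete outcome. The two-outcome state, the angle parameter, and the function names below are my own illustrative assumptions, chosen only to show the continuous-propensities/discrete-events contrast.

```python
import math
import random

# Sketch of the continuous/discrete contrast: a two-outcome state is
# parameterized by a continuous angle, but each measurement event
# actualizes one discrete outcome. All specifics are illustrative.

theta = 0.6  # continuous state parameter (any real value would do)
probs = [math.cos(theta) ** 2, math.sin(theta) ** 2]  # Born-rule probabilities

def measure(probs, rng):
    """Actualize one discrete outcome from the continuous propensities."""
    return 0 if rng.random() < probs[0] else 1

rng = random.Random(0)  # fixed seed for reproducibility
outcomes = [measure(probs, rng) for _ in range(5)]
print(outcomes)  # a discrete sequence of actualized events
```

On the view I'm describing, the analog aspect lives in the continuously varying propensities, while concrete reality is the digital record of distinct actualization events.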

In addition to being inspired by an interpretation of QM (such as I’ve discussed many times before), this sort of view comports with a Whitehead-style metaphysics. I’ll mention again here a recent blog post by Stuart Kauffman which covered some of this ground in a nice way.

Tuesday, February 08, 2011

Before leaving the topic of American Grace, I should mention the authors’ notably upbeat conclusion. They argue that despite the substantial religious divisions among Americans (both by denomination and between the most and least religious), the vast majority of Americans are very tolerant of each other.

They say the source of this high degree of tolerance is (simply) the high level of diversity among our extended family and friends. Due to a high degree of intermarriage and religious mixing (outlined by survey data), they surmise most people know an “Aunt Susan” or a “Neighbor Al” who is an undeniably good person of a different religious affiliation.

Admirably, Americans are very generous in allowing that people who don’t share their faith can still go to heaven. Putnam and Campbell report the percentages by affiliation of those who believe “people of other religions can go to heaven”(p.535): evangelicals affirm this 83% of the time, whereas all other groups are at 90% or more (Mormons are the highest at 98%).

But the authors note that this question could be ambiguous with regard to what the respondent conceives of when he or she hears “other religions”. So they asked Christians whether people “not of my faith, including non-Christians, can go to heaven”(p.537): Mormons stayed at 98%, Mainline Protestants and Catholics drop from the 90’s to the low 80’s, and Evangelicals drop to 54% (still much better than one might have guessed).

Friday, February 04, 2011

In an effort to improve the dialogue between scientists and science writers on the one hand and religious folks (who are sometimes science skeptics) on the other, it has been suggested that emphasizing a common spirituality might help. This would be possible because even in the case of atheists, the universe inspires feelings of awe and wonder which might be considered “spiritual”. Science journalist and author Chris Mooney made this case in a recent op-ed titled “Spirituality can bridge science-religion divide.”

Some of the religious survey data I’ve been looking at suggests this is not a well-founded recommendation.

Wednesday, February 02, 2011

In its Religious Landscape Survey, Pew asked Americans the following: “Do you believe in God or a Universal Spirit?” Then, for those who answered affirmatively, they asked this follow-up question:

“Which comes closest to your view of God? ‘God is a person with whom people can have a personal relationship’ or ‘God is an impersonal force’.”

I found the results here surprising. The 92% who replied yes to the first question broke down this way: 60% personal God, 25% impersonal force, 7% other/both. Here’s some of the breakdown by affiliation: 19% of Protestants believe in God as an impersonal force (13% of evangelicals); 29% of Catholics agree, as do 50% of Jews. It would appear many folks are not fully on board with their official theology. 35% of the nones believe in a God who is an impersonal force (representing half of those who reported a belief in God or a universal spirit).

As a check, I looked at data from the ARIS report, which is somewhat less dramatic. Here they asked a different question – no “impersonal force” option, per se. 70% affirmed a belief in a personal God, while 12% selected the option “there is a higher power but no personal God.” This question elicited a bit more in categories called “I’m not sure” and “don’t know/refuse” compared to similar options in the Pew survey (6% each).

Still, as someone who is broadly in the impersonal force/higher power camp, I was interested to learn I might have so much company.

Monday, January 31, 2011

Surveys show that the “nones” (those who report no religious affiliation) are a diverse group. The first observation often made is to note that only a small percentage self-identify as atheists or agnostics. For instance, Putnam and Campbell, on page 16 of American Grace, report that only 5 people in their 2006 “Faith Matters Survey” of 3,108 described themselves by either label. But it would be better to look at some larger surveys which addressed this question.

The Pew U.S. Religious Landscape Survey of over 35,000 Americans in 2007 found 1.6% responding as atheist, 2.4% as agnostic, and 12.1% selecting “no particular religion” (total nones coming to 16.1%). In analyzing the “no particular religion” group, Pew looked at their responses to another survey question: “How important is religion in your life?” They found about half of this group answered “not at all important” or “not too important” while the rest answered “somewhat important” or “very important”. Pew decided to label these two groups as Secular unaffiliated (6.3% of the total) and Religious unaffiliated (5.8%) for the purpose of summarizing the results on other parts of the survey.

For one more comparison, the 2008 American Religious Identification Survey (with 54,461 respondents) found 0.9% atheist, 0.7% agnostic (out of total nones of 15.0%). It should be noted, however, that the percentage of atheists/agnostics in the survey nearly doubled from a previous 2001 tabulation.

The book discusses a huge number of issues, trends, and cross-currents, and makes for thought-provoking reading. A topic which I find particularly interesting is the recent growth in the portion of Americans with no religious affiliation (sometimes referred to as the “nones”), and I was curious how the authors analyze the phenomenon.

Tuesday, January 11, 2011

85 years after the formulation of quantum mechanics, it is still often assumed that distinctively quantum phenomena have no role to play in explaining life or mind. I think this assumption is unjustified.