Sunday, May 31, 2009

Science includes many principles once thought to be laws of nature: Newton's law of gravitation, his three laws of motion, the ideal gas laws, Mendel’s laws, the laws of supply and demand, and so on. Philosophers of science and metaphysicians address various issues about laws, but the basic question is: What is it to be a law? Two influential answers are David Lewis’s systems approach (1973, 1983, 1986, 1994) and David Armstrong’s universals approach (1978, 1983, 1991, 1993). More recent treatments include antirealist views (van Fraassen 1989, Giere 1999, Ward 2002, Mumford 2004) and antireductionist views (Carroll 1994, Lange 2000). Besides the basic question, the recent literature has also focused on (i) whether laws supervene on matters of fact, (ii) the role laws play in the problem of induction, (iii) whether laws involve metaphysical necessity, and (iv) the role of laws in physics and how that contrasts with the role of laws in the special sciences.

Here are four reasons philosophers examine what it is to be a law of nature: First, laws at least appear to have a central role in scientific practice. Second, laws are important to many other philosophical issues. For example, sparked by the account of counterfactuals defended by Roderick Chisholm (1946, 1955) and Nelson Goodman (1947), and also prompted by Carl Hempel and Paul Oppenheim's (1948) deductive-nomological model of explanation, philosophers have wondered what makes counterfactual and explanatory claims true, have thought that laws must play some part, and so also have wondered what distinguishes laws from nonlaws. Third, Goodman famously suggested that there is a connection between lawhood and confirmability by an inductive inference. So, some sympathetic to Goodman's idea come to the problem of laws as a result of their interest in the problem of induction. Fourth, philosophers love a good puzzle. Suppose that everyone here is seated (cf., Langford 1941, 67). Then, trivially, that everyone here is seated is true. Though true, this generalization does not seem to be a law. It is just too accidental. Einstein's principle that no signals travel faster than light is also a true generalization but, in contrast, it is thought to be a law; it is not nearly so accidental. What makes the difference?

This may not seem like much of a puzzle. That everyone here is seated is spatially restricted in that it is about a specific place; the principle of relativity is not similarly restricted. So, it is easy to think that, unlike laws, accidentally true generalizations are about specific places. But that's not what makes the difference. There are true nonlaws that are not spatially restricted. Consider the unrestricted generalization that all gold spheres are less than one mile in diameter. There are no gold spheres that size and in all likelihood there never will be, but this is still not a law. There also appear to be restricted generalizations that could express laws. Galileo's law of free fall is the generalization that, on Earth, free-falling bodies accelerate at a rate of 9.8 meters per second squared. The perplexing nature of the puzzle is clearly revealed when the gold-sphere generalization is paired with a remarkably similar generalization about uranium spheres:

All gold spheres are less than a mile in diameter.

All uranium spheres are less than a mile in diameter.

Though the former is not a law, the latter arguably is. The latter is not nearly so accidental as the former, since uranium's critical mass is such as to guarantee that such a large sphere will never exist (van Fraassen 1989, 27). What makes the difference? What makes the former an accidental generalization and the latter a law?

2. Systems

One popular answer ties being a law to deductive systems. The idea dates back to John Stuart Mill (1947 [f.p. 1843]), but has been defended in one form or another by Frank Ramsey (1978 [f.p. 1928]), Lewis (1973, 1983, 1986, 1994), John Earman (1984) and Barry Loewer (1996). Deductive systems are individuated by their axioms. The logical consequences of the axioms are the theorems. Some true deductive systems will be stronger than others; some will be simpler than others. These two virtues, strength and simplicity, compete. (It is easy to make a system stronger by sacrificing simplicity: include all the truths as axioms. It is easy to make a system simple by sacrificing strength: have just the axiom that 2 + 2 = 4.) According to Lewis (1973, 73), the laws of nature belong to all the true deductive systems with a best combination of simplicity and strength. So, for example, the thought is that it is a law that all uranium spheres are less than a mile in diameter because it is, arguably, part of the best deductive systems; quantum theory is an excellent theory of our universe and might be part of the best systems, and it is plausible to think that quantum theory plus truths describing the nature of uranium would logically entail that there are no uranium spheres of that size (Loewer 1996, 112). It is doubtful that the generalization that all gold spheres are less than a mile in diameter would be part of the best systems. It could be added as an axiom to any system, but not without sacrificing something in terms of simplicity. (Lewis later made significant revisions to his account in order to address problems involving physical probability. See his 1986 and his 1994.)

Many features of the systems approach are appealing. For one thing, it appears to deal with a challenge posed by vacuous laws. Some laws are vacuously true: Newton's first law of motion — that all inertial bodies have no acceleration — is a law, even though there are no inertial bodies. But there are also lots of vacuously true nonlaws: all plaid pandas weigh 5 lbs., all unicorns are unmarried, etc. With the systems approach, there is no exclusion of vacuous generalizations from the realm of laws, and yet only those vacuous generalizations that belong to the best systems qualify (cf., Lewis 1986, 123). Furthermore, it is reasonable to think that one goal of scientific theorizing is the formulation of true theories that are well balanced in terms of their simplicity and strength. So, the systems approach seems to underwrite the truism that an aim of science is the discovery of laws (Earman 1978, 180; Loewer 1996, 112). One last aspect of the systems view that is appealing to many (though not all) is that it is in keeping with broadly Humean constraints on a sensible metaphysics. There is no overt appeal to closely related modal concepts (e.g., the counterfactual conditional) and no overt appeal to modality-supplying entities (e.g., universals or God; for the supposed need to appeal to God, see Foster 2004). Indeed, the systems approach is the centerpiece of Lewis's defense of Humean supervenience, “the doctrine that all there is in the world is a vast mosaic of local matters of particular fact, just one little thing and then another” (1986, ix).

Other features of the systems approach have made philosophers wary. (See, especially, Armstrong 1983, 66-73; van Fraassen 1989, 40-64; Carroll 1990, 197-206.) Some argue that this approach will have the untoward consequence that laws are inappropriately mind-dependent in virtue of the account's appeal to the concepts of simplicity, strength and best balance, concepts whose application seems to depend on cognitive abilities, interests, and purposes. The appeal to simplicity raises further questions stemming from the apparent need for a regimented language to permit reasonable comparisons of the systems. Interestingly, sometimes the view is abandoned because it satisfies the broadly Humean constraints on an account of laws of nature; some argue that what generalizations are laws is not determined by local matters of particular fact.

3. Universals

In the late 1970s, there emerged competition for the systems approach and all other Humean attempts to say what it is to be a law. Led by Armstrong (1978, 1983, 1991, 1993), Fred Dretske (1977), and Michael Tooley (1977, 1987), the rival approach appeals to universals to distinguish laws from nonlaws.

Focusing on Armstrong's development of the view, here is one of his concise statements of the framework characteristic of the universals approach:

Suppose it to be a law that Fs are Gs. F-ness and G-ness are taken to be universals. A certain relation, a relation of non-logical or contingent necessitation, holds between F-ness and G-ness. This state of affairs may be symbolized as ‘N(F,G)’ (1983, 85).

This framework promises to address familiar puzzles and problems: Maybe the difference between the uranium-spheres generalization and the gold-spheres generalization is that being uranium does necessitate being less than one mile in diameter, but being gold does not. Worries about the subjective nature of simplicity, strength and best balance do not emerge; there is no threat of lawhood being mind-dependent so long as necessitation is not mind-dependent. Some (Armstrong 1991, Dretske 1977) think that the framework supports the idea that laws play a special explanatory role in inductive inferences, since a law is not just a universal generalization, but is an entirely different creature — a relation holding between two other universals. The framework is also consistent with lawhood not supervening on local matters of particular fact; the denial of Humean supervenience often accompanies acceptance of the universals approach.

For there truly to be this payoff, however, more has to be said about what N is. This is the problem Bas van Fraassen calls the identification problem. He couples this with a second problem, what he calls the inference problem (1989, 96). The essence of this pair of problems was captured early on by Lewis with his usual flair:

Whatever N may be, I cannot see how it could be absolutely impossible to have N(F,G) and Fa without Ga. (Unless N just is constant conjunction, or constant conjunction plus something else, in which case Armstrong's theory turns into a form of the regularity theory he rejects.) The mystery is somewhat hidden by Armstrong's terminology. He uses ‘necessitates’ as a name for the lawmaking universal N; and who would be surprised to hear that if F ‘necessitates’ G and a has F, then a must have G? But I say that N deserves the name of ‘necessitation’ only if, somehow, it really can enter into the requisite necessary connections. It can't enter into them just by bearing a name, any more than one can have mighty biceps just by being called ‘Armstrong’ (1983, 366).

Basically, there needs to be a specification of what the lawmaking relation is (the identification problem). Then, there needs to be a determination of whether it is suited to the task (the inference problem): Does N's holding between F and G entail that Fs are Gs? Does its holding support corresponding counterfactuals? Do laws really turn out not to supervene, to be mind-independent, to be explanatory? Armstrong does say more about what his lawmaking relation is. He states in reply to van Fraassen:

It is at this point that, I claim, the Identification problem has been solved. The required relation is the causal relation, … now hypothesized to relate types not tokens (1993, 422).

Questions remain about the nature of this causal relation understood as a relation that relates both token events and universals. (See van Fraassen 1993, 435-437, and Carroll 1994, 170-174.)

4. Humean Supervenience

Rather than trying to detail all the critical issues that divide the systems approach and the universals approach, we will do better to focus our attention on the especially divisive issue of supervenience. It concerns whether Humean considerations really determine what the laws are. There are some important examples that appear to show that they do not.

Tooley (1977, 669) asks us to suppose that there are ten different kinds of fundamental particles. So, there are fifty-five possible kinds of two-particle interactions. Suppose that fifty-four of these kinds have been studied and fifty-four laws have been discovered. The interaction of X and Y particles has not been studied because conditions are such that they never will interact. Nevertheless, it seems that it might be a law that, when X particles and Y particles interact, P occurs. Similarly, it might be a law that when X and Y particles interact, Q occurs. There seems to be nothing about the local matters of particular fact in this world that fixes which of these generalizations is a law.
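Tooley's count of fifty-five is just the number of unordered pairs drawn from ten kinds of particle, allowing a kind to interact with a particle of its own kind: C(10, 2) + 10 = 45 + 10 = 55. A minimal sketch of the arithmetic (the particle labels are arbitrary placeholders, not anything from Tooley's text):

```python
from itertools import combinations_with_replacement

# Ten kinds of fundamental particles, labeled 0-9 for illustration.
kinds = range(10)

# Unordered pairs of kinds, a kind pairing with itself allowed:
# C(10, 2) distinct-kind pairs plus 10 same-kind pairs = 45 + 10 = 55.
interactions = list(combinations_with_replacement(kinds, 2))
print(len(interactions))  # 55
```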

John Carroll's nonsupervenience argument (1994, 60-68) begins with a possible world, U1. Focusing on one specific moment t, let's suppose that there is a specific X particle, b, that is subject to a Y field. It has spin up. b's behavior is in no way exceptional. Whenever an X particle enters a Y field, it acquires spin up. Everyone should agree that L1, the generalization that all X particles subject to a Y field have spin up, could be a law of U1. U2 is very similar to U1. It has X particles and Y fields. X particles that enter Y fields even do so at exactly the same time and place that they do in U1. So, for instance, b enters that same Y field at time t. What is new about U2 is that when b enters the Y field at time t it does not acquire spin up. Of course, there must be at least one more difference between these two worlds. Though L1 could be a law in U1, L1 could not be a law of U2; L1 is not true in U2. There is nothing particularly remarkable about either U1 or U2 — nothing to make a Humean suspicious. But here is the catch: It is natural to think that L1's status as a law in U1 does not depend on the fact that b entered that Y field at time t. That is, even if particle b had not been subject to a Y field at time t, L1 would surely still be a law. Yet it is just as natural to think that L1's status as a non-law in U2 does not depend on the fact that b entered that Y field at time t. L1 would not be a law in U2 even if particle b had not been subject to that Y field then. Just as you can't prevent L1 from being a law in U1 by stopping b from entering that Y field, you can't make L1 a law by doing the same in U2. So, it seems that there are two more possible worlds. In one, U1*, X particles are subject to a Y field, all of them have spin up, and L1 is a law. In the other, U2*, X particles are subject to a Y field, all of them have spin up, but L1 is not a law. Like Tooley's case, U1* and U2* constitute an apparent counterexample to Humean supervenience.

Though the universals approach and some of the views to be described in the next section can take Tooley's and Carroll's examples at face value, Humeans must contend that these so-called possible worlds are not both really possible. One objection to the nonsupervenience arguments from the Humean camp comes from Helen Beebee (2000). She accuses Tooley and Carroll of begging the question in virtue of assuming a governing conception of laws. (Also see Loewer 1996, Roberts 1998, and Schaffer 2008.) In a pair of papers, Earman and John Roberts (2005a and b) first address how to best formulate the thesis of Humean supervenience, then they argue based on skeptical considerations that their brand of Humean supervenience is true. Jonathan Schaffer (2008) presses an ontological concern to the effect that nonsupervening laws are ungrounded entities; he also objects to the logic of lawhood that he takes to be presupposed by Carroll's argument.

5. Antirealism

The majority of contemporary philosophers are realists about laws; they believe that some reports of what are laws succeed in describing reality. There are, however, some antirealists who disagree.

For example, van Fraassen, Ronald Giere, and also Stephen Mumford believe that there are no laws. Van Fraassen finds support for his view in the problems facing accounts like Lewis's and Armstrong's, and the perceived failure of Armstrong and others to describe an adequate epistemology that permits rational belief in laws (1989, 130 and 180-181). Giere appeals to the origins of the use of the concept of law in the history of science (1999 [f.p. 1995], 86-90) and contends that the generalizations often described as laws are not in fact true (90-91). Mumford's reasons are more metaphysical; he maintains that, in order to govern, laws must be external to the properties they govern, but, to be external in this way, the governed properties must lack proper identity conditions (2004, 144-145). Others adopt a subtly different sort of antirealism. Though they will utter sentences like “It is a law that no signals travel faster than light”, they are antirealists in virtue of thinking that such sentences are not (purely) fact-stating. Whether this Einsteinian generalization is a law is not a fact about the universe; it is not something out there waiting to be discovered. Reports of what are laws only project a certain attitude (in addition to belief) about the contained generalizations. For example, Barry Ward (2002, 197) takes the attitude to be one regarding the suitability of the generalization for prediction and explanation. (Also see Blackburn 1984 and 1986.)

The challenge for antirealism is to minimize the havoc lawless reality would play with our folk and scientific practices. Regarding science, the examples of laws listed at the start of this entry attest to ‘law’ having a visible role that scientists seem prepared to take as factive. Regarding our folk practices, though ‘law’ is not often part of run-of-the-mill conversations, an antirealism about lawhood would still have wide-ranging consequences. This is due to lawhood's ties to other concepts, especially the nomic ones, concepts like the counterfactual conditional, dispositions, and causation. For example, it seems that, for there to be any interesting counterfactual truths, there must be at least one law of nature. Would an ordinary match in ordinary conditions light if struck? It seems it would, but only because we take nature to be regular in certain ways. We think this counterfactual is true because we believe there are laws. Were there no laws, it would not be the case that, if the match were struck, it would light. As a result, it would also not be the case that the match was ignitable, nor the case that striking the match would cause it to light. Indeed, it is not clear that it would even be a match!

Could an antirealist deflect this challenge by denying the connections between lawhood and other concepts? Would this allow one to be an antirealist about laws and still be a realist about, say, counterfactuals? No, the resulting position is bound to be ad hoc. Concepts like the counterfactual conditional, dispositions, and causation exhibit many of the same puzzling features that lawhood does; there are parallel philosophical questions and puzzles about these concepts. It is hard to see what would warrant antirealism about lawhood, but not the other nomic concepts.

6. Antireductionism

Carroll (1994, 2008) and Marc Lange (2000) advocate antireductionist views. (Also see Woodward 1992.) Regarding the question of what it is to be a law, they reject the answers given by Humeans like Lewis, they deny Humean supervenience, and they see no advantage in an appeal to universals. They reject all attempts to say what it is to be a law that do not appeal to nomic concepts. Yet they still believe that there really are laws of nature; they are not antirealists.

Lange's (2000) treatment includes an account of what it is to be a law in terms of a counterfactual notion of stability. The overall account is intricate, but the basic idea is this: Call a logically closed set of true propositions stable if and only if the members of the set would remain true given any antecedent that is consistent with the set itself. So, for example, the set of logical truths is trivially stable, because logical truths would be true no matter what. The set of all truths is also trivially stable because every counter-to-fact antecedent will be inconsistent with this set, and every other conditional relevant to the set's stability will have a true antecedent and consequent, and so will therefore be true. Lange argues that the set of physical necessities is the only non-trivially stable set. He goes on to distinguish laws of nature from the other physical necessities in virtue of their corresponding to a suitably reliable rule of inference. According to Lange, when we believe P to be a law, “we believe that the corresponding inference rule's reliability is the conclusion arrived at by one of the strategies belonging to the ‘best set of inductive strategies’” (2000, 28).

The starting point for Carroll's (2008) analysis of lawhood is that laws are not accidental, that they are not coincidences. Not being a coincidence, however, is not all there is to being a law. For example, it might be true that there are no gold spheres greater than 1000 miles in diameter because there is so little gold in the universe. In that case, strictly speaking, that generalization would be true, suitably general, and not a coincidence. Nevertheless, that would not be a law. Arguably, what blocks this generalization from being a law is that something in nature — really, an initial condition of the universe, the limited amount of gold — accounts for the generalization. Contrast this with the law that inertial bodies have no acceleration. With this generalization, it seems that it is true because of nature itself. Carroll suggests that, for a generalization to be a law, nature itself must be what makes the generalization true.

To date, challenges to antireductionism have primarily been limited to the challenges to nonsupervenience mentioned at the end of Section 4. (So, once again, see Loewer 1996, Roberts 1998, Beebee 2000, Earman and Roberts 2005a and b, and Schaffer 2008.)

7. Induction

Goodman thought that the difference between laws of nature and accidental truths was linked inextricably with the problem of induction. In his “The New Riddle of Induction” (1983, [f.p. 1954], 73), Goodman says,

Only a statement that is lawlike — regardless of its truth or falsity or its scientific importance — is capable of receiving confirmation from an instance of it; accidental statements are not.

(Terminology: P is lawlike if and only if P is a law if true.) Goodman claims that, if a generalization is accidental (and so not lawlike), then it is not capable of receiving confirmation from one of its instances.

This has prompted much discussion, including some challenges. For example, suppose there are ten flips of a fair coin, and that the first nine land heads (Dretske 1977, 256-257). The first nine instances — at least in a sense — confirm the generalization that all the flips will land heads; the probability of that generalization is raised from (.5)¹⁰ up to .5. But this generalization is not lawlike; if true, it is not a law. It is standard to respond to such an example by arguing that this is not the pertinent notion of confirmation (that it is mere “content-cutting”) and by suggesting that what does require lawlikeness is confirmation of the generalization's unexamined instances. Notice that, in the coin case, the probability that the tenth flip will land heads does not change after the first nine flips land heads. There are, however, examples that generate problems for this idea too.
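The arithmetic behind the coin example can be spelled out: before any flips, the probability that all ten land heads is (.5)¹⁰; after nine observed heads, the generalization's truth turns only on the tenth flip, so its probability rises to .5, while the probability of the unexamined tenth flip itself is unchanged. A minimal check of those numbers:

```python
# Prior probability that all ten independent fair flips land heads.
p_all_ten_before = 0.5 ** 10
print(p_all_ten_before)  # 0.0009765625

# After nine heads are observed, only the tenth flip is still open,
# so the generalization's probability equals that one flip's chance.
p_all_ten_after = 0.5
print(p_all_ten_after)  # 0.5

# The unexamined instance itself: the tenth flip's probability of
# heads is .5 both before and after the first nine flips.
p_tenth_flip = 0.5
```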

Suppose the room contains one hundred men and suppose you ask fifty of them whether they are third sons and they reply that they are; surely it would be reasonable to at least increase somewhat your expectation that the next one you ask will also be a third son (Jackson and Pargetter 1980, 423).

It does no good to revise the claim to say that no generalization believed to be accidental is capable of confirmation. About the third-son case, one would know that the generalization, even if true, would not be a law. The discussion continues. Frank Jackson and Robert Pargetter have proposed an alternative connection between confirmation and laws on which certain counterfactual truths must hold: observation of these As that are F and B confirms that all non-F As are Bs only if the As would still have been both A and B if they had not been F. (This suggestion is criticized by Sober 1988, 97-98.) Lange (2000, 111-142) uses a different strategy. He tries to refine further the relevant notion of confirmation, characterizing what he takes to be an intuitive notion of inductive confirmation, and then contends that only generalizations that are not believed not to be lawlike can be (in his sense) inductively confirmed.

Sometimes the idea that laws have a special role to play in induction serves as the starting point for a criticism of Humean analyses. Dretske (1977, 261-262) and Armstrong (1983, 52-59, and 1991) adopt a model of inductive inference that involves an inference to the best explanation. (Also see Foster 1983.) On its simplest construal, the model describes a pattern that begins with an observation of instances of a generalization, includes an inference to the corresponding law (this is the inference to the best explanation), and concludes with an inference to the generalization itself or to its unobserved instances. The complaint lodged against Humeans is that, on their view of what laws are, laws are not suited to explain their instances and so cannot sustain the required inference to the best explanation.

This is an area where work on laws needs to be done. Armstrong and Dretske make substantive claims about what can and can't be confirmed by its instances: roughly, Humean laws can't be, laws-as-universals can. But, at the very least, these claims cannot be quite right. Humean laws can't? As the discussion above illustrates, Sober, Lange and others have argued that even generalizations known to be accidental can be confirmed by their instances. Dretske and Armstrong need some plausible and suitably strong premise connecting lawhood to confirmability, and it is not clear that there is one to be had. Here is the basic problem: As many authors have noticed (e.g., Sober 1988, 98; van Fraassen 1987, 255), the confirmation of a hypothesis or its unexamined instances will always be sensitive to what background beliefs are in place. So much so, that with background beliefs of the right sort, just about anything can be confirmed irrespective of its status as a law or whether it is lawlike. Thus, stating a plausible principle describing the connection between laws and the problem of induction will be difficult. In order to uncover a nomological constraint on induction, something needs to be said about the role of background beliefs.

8. Necessity

Philosophers have generally held that some contingent truths are (or could be) laws of nature. Furthermore, they have thought that, if it is a law that all Fs are Gs, then there need not be any (metaphysically) necessary connection between F-ness and G-ness, that it is (metaphysically) possible that something be F without being G. For example, any possible world that, as a matter of law, obeys the general principles of Newtonian physics is a world in which Newton's first is true, and a world containing accelerating inertial bodies is a world in which Newton's first is false. The latter world is also a world where inertia is instantiated but does not necessitate zero acceleration. Some necessitarians, however, hold that all laws are necessary truths. (See Shoemaker 1980 and 1998, Swoyer 1982, Fales 1990, Bird 2005.) Others have held something that is only slightly different. Maintaining that some laws are singular statements about universals, they allow that some laws are contingently true. So, on this view, an F-ness/G-ness law could be false if F-ness does not exist. Still, this difference is minor. These authors think that, for there to be an F-ness/G-ness law, it must be necessarily true that all Fs are Gs. (See Tweedale 1984, Bigelow, Ellis, and Lierse 1992, Ellis and Lierse 1994, and Ellis 2001.)

Two reasons can be given for believing that being a law does not depend on any necessary connection between properties. The first reason is the conceivability of it being a law in one possible world that all Fs are Gs even though there is another world with an F that is not G. The second is that there are laws that can only be discovered in an a posteriori manner. If necessity is always associated with laws of nature, then it is not clear why scientists cannot always get by with a priori methods. Naturally, these two reasons are often challenged. The necessitarians argue that conceivability is not a guide to possibility. They also appeal to Saul Kripke's (1972) arguments meant to reveal certain a posteriori necessary truths in order to argue that the a-posteriori nature of some laws does not prevent their lawhood from requiring a necessary connection between properties. In further support of their own view, the necessitarians argue that their position is a consequence of their favored theory of dispositions, according to which dispositions have their causal powers essentially. So, for example, on this theory, charge has as part of its essence the power to repel like charges. Laws, then, are entailed by the essences of dispositions (cf., Bird 2005, 356). As necessitarians see it, it is also a virtue of their position that they can explain why laws are counterfactual-supporting; they support counterfactuals in the same way that other necessary truths do (Swoyer 1982, 209; Fales 1990, 85-87).

The primary worry for necessitarians concerns their ability to sustain their dismissals of the traditional reasons for thinking that some laws are contingent. The problem (cf., Sidelle 2002, 311) is that they too make distinctions between necessary truths and contingent ones, and even seem to rely on considerations of conceivability to do so. Prima facie, there is nothing especially suspicious about the judgment that it is possible that an object travel faster than light. How is it any worse than the judgment that it is possible that it is raining in Paris? Another issue for necessitarians is whether their essentialism regarding dispositions can sustain all the counterfactuals that are apparently supported by laws of nature (Lange 2004).

9. Physics and the Special Sciences

Two separate (but related) questions have received much recent attention in the philosophical literature surrounding laws. Neither has much to do with what it is to be a law. Instead, they have to do with the nature of the generalizations scientists try to discover. First: Does any science try to discover exceptionless regularities in its attempt to discover laws? Second: Even if one science — fundamental physics — does, do others?

9.1 Do Physicists try to discover Exceptionless Regularities?

Philosophers draw a distinction between strict generalizations and ceteris-paribus generalizations. The contrast is supposed to be between universal generalizations of the sort discussed above (e.g., that all inertial bodies have no acceleration) and seemingly less formal generalizations like that, other things being equal, smoking causes cancer. The idea is that the former would be contradicted by a single counterinstance, say, one accelerating inertial body, though the latter is consistent with there being one smoker who never gets cancer. Though in theory this distinction is easy enough to understand, in practice it is often difficult to distinguish strict from ceteris-paribus generalizations. This is because many philosophers think that many utterances which include no explicit ceteris-paribus clause implicitly do include such a clause.

For the most part, philosophers have thought that if scientists have discovered any exceptionless regularities that are laws, they have done so at the level of fundamental physics. A few philosophers, however, are doubtful that there are exceptionless regularities at even this basic level. For example, Nancy Cartwright has argued that the descriptive and the explanatory aspects of laws conflict. “Rendered as descriptions of fact, they are false; amended to be true, they lose their fundamental explanatory force” (1980, 75). Consider Newton's gravitational principle, F = Gmm′/r². Properly understood, according to Cartwright, it says that for any two bodies the force between them is Gmm′/r². But if that is what the law says, then the law is not an exceptionless regularity. This is because the force between two bodies is influenced by properties other than just their mass and the distance between them, by properties like the charge of the two bodies as described by Coulomb's law. The statement of the gravitational principle can be amended to make it true, but that, according to Cartwright, at least on certain standard ways of doing so, would strip it of its explanatory power. For example, if the principle is taken to hold only that F = Gmm′/r² if there are no forces other than gravitational forces at work, then though it would be true it would not apply except in idealized circumstances. Lange (1993) uses a different example to make a similar point. Consider a standard expression of the law of thermal expansion: “Whenever the temperature of a metal bar of length L0 changes by ΔT, the length of the bar changes by ΔL = kL0ΔT”, where k is a constant, the thermal expansion coefficient of the metal. If this expression were used to express the strict generalization straightforwardly suggested by its grammar, then such an utterance would be false, since the length of a bar does not change in the way described in cases where someone is hammering on the ends of the bar.
It looks like the law will require provisos, but so many that the only apparent way of taking all of them into consideration would be with something like a ceteris-paribus clause. Then the concern becomes that the statement would be empty. Because of the difficulty of stating plausible truth conditions for ceteris-paribus sentences, it is feared that “Other things being equal, ΔL = kL0ΔT” could only mean “ΔL = kL0ΔT provided that ΔL = kL0ΔT”.
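Cartwright's worry about the gravitational principle can be made vivid with a simple calculation (the following sketch is an illustration added here, using standard SI values for the constants; it is not drawn from the literature discussed above). For two electrons, the electrostatic force described by Coulomb's law exceeds the gravitational force by more than forty orders of magnitude, so reading F = Gmm′/r² as stating the total force between any two bodies gives a wildly false description in this case:

```python
# Illustration: the force between two electrons one meter apart.
# If Newton's principle is read as giving the *total* force between
# any two bodies, it is falsified here, since the Coulomb force
# dwarfs the gravitational force. Standard SI constants:
G   = 6.674e-11   # gravitational constant, N·m²/kg²
k_e = 8.988e9     # Coulomb constant, N·m²/C²
m_e = 9.109e-31   # electron mass, kg
q_e = 1.602e-19   # elementary charge, C
r   = 1.0         # separation, m

f_grav    = G * m_e * m_e / r**2     # what F = Gmm'/r² predicts
f_coulomb = k_e * q_e * q_e / r**2   # the electrostatic contribution

# The gravitational term is smaller by a factor of roughly 4e42.
print(f_grav, f_coulomb, f_coulomb / f_grav)
```

Amending the principle so that it describes only the gravitational component of the force avoids the falsehood, and it is precisely the status and explanatory cost of such amendments that Cartwright and her critics dispute.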

Even those who agree with the arguments of Cartwright and Lange sometimes disagree about what the arguments ultimately show about laws. Cartwright believes that the true laws are not exceptionless regularities, but instead are statements that describe causal powers. So construed, they turn out to be both true and explanatory. Lange ends up holding that there are propositions properly adopted as laws, though in adopting them one need not also believe in any exceptionless regularity; there need not be one. Giere (1999) can usefully be interpreted as agreeing with Cartwright's basic arguments but insisting that law-statements don't have implicit provisos or implicit ceteris-paribus clauses. So he concludes that there are no laws.

Earman and Roberts hold that there are exceptionless and lawful regularities. More precisely, they argue that scientists doing fundamental physics do attempt to state strict generalizations that are such that they would be strict laws if they were true:

Our claim is only … typical theories from fundamental physics are such that if they were true, there would be precise proviso free laws. For example, Einstein's gravitational field law asserts — without equivocation, qualification, proviso, ceteris paribus clause — that the Ricci curvature tensor of spacetime is proportional to the total stress-energy tensor for matter-energy; the relativistic version of Maxwell's laws of electromagnetism for charge-free flat spacetime asserts — without qualification or proviso — that the curl of the E field is proportional to the partial time derivative of the B field, etc. (1999, 446).

About Cartwright's gravitational example, they think (473, fn. 14) that a plausible understanding of the gravitational principle is as describing only the gravitational force between the two massive bodies. (Cartwright argues that there is no such component force and so thinks such an interpretation would be false. Earman and Roberts disagree.) About Lange's example, they think the law should be understood as having the single proviso that there be no external stresses on the metal bar (461). In any case, much more would need to be said to establish that all the apparently strict and explanatory generalizations that have been or will be stated by physicists have turned or will turn out to be false. (Earman et al. 2003 includes more recent papers by both Cartwright and Lange, and also many other papers on ceteris-paribus laws.)

9.2 Could there be any Special-Science Laws?

Supposing that physicists do try to discover exceptionless regularities, and even supposing that our physicists will sometimes be successful, there is a further question of whether it is a goal of any science other than fundamental physics — any so-called special science — to discover exceptionless regularities and whether these scientists have any hope of succeeding. Consider an economic law of supply and demand that says that, when demand increases and supply is held fixed, price increases. Notice that, in some places, the price of gasoline has sometimes remained the same despite an increase in demand and a fixed supply, because the price of gasoline was government regulated. It appears that the law has to be understood as having a ceteris-paribus clause in order for it to be true. This problem is a very general one. As Jerry Fodor (1989, 78) has pointed out, in virtue of being stated in a vocabulary of a special science, it is very likely that there will be limiting conditions — especially underlying physical conditions — that will undermine any interesting strict generalization of the special sciences, conditions that themselves could not be described in the special-science vocabulary. Donald Davidson prompted much of the recent interest in special-science laws with his “Mental Events” (1980 [f.p. 1970], 207-225). He gave an argument specifically directed against the possibility of strict psycho-physical laws. More importantly, he made the suggestion that the absence of such laws may be relevant to whether mental events ever cause physical events. This prompted a slew of papers dealing with the problem of reconciling the absence of strict special-science laws with the reality of mental causation (e.g., Loewer and Lepore 1987 and 1989, Fodor 1989, Schiffer 1991, Pietroski and Rey 1995).

Progress on the problem of provisos depends on three basic issues being distinguished. First, there is the question of what it is to be a law, which in essence is the search for a necessarily true completion of: ‘P is a law if and only if …’. Obviously, to be a true completion, it must hold for all P, whether P is a strict generalization or a ceteris-paribus one. Second, there is also a need to determine the truth conditions of the generalization sentences used by scientists. Third, there is the a posteriori and scientific question of which generalizations expressed by the sentences used by the scientists are true. The second of these issues is the one where the action needs to be.

On this score, it is striking how little attention is given to the possible effects of context. Mightn't it be that, when the economist utters a certain strict generalization sentence in an “economic setting” (say, in an economics textbook or at an economics conference), context-sensitive considerations affecting its truth conditions will have it turn out that the utterance is true? This might be the case despite the fact that the same sentence uttered in a different context (say, in a discussion among fundamental physicists or better yet in a philosophical discussion of laws) would result in a clearly false utterance. These changing truth conditions might be the result of something as plain as a contextual shift in the domain of quantification or perhaps something less obvious. Whatever it is, the important point is that this shift could be a function of nothing more than the linguistic meaning of the sentence and familiar rules of interpretation (e.g., the rule of accommodation).

Consider a situation where an engineering professor utters, ‘When a metal bar is heated, the change in its length is proportional to the change in its temperature’ and suppose a student offers, ‘Not when someone is hammering on both ends of the bar’. Has the student shown that the teacher's utterance was false? Maybe not. Notice that the student comes off sounding a bit insolent. In all likelihood, such an unusual situation as someone hammering on both ends of a heated bar would not have been in play when the professor said what he did. In fact, the reason the student comes off sounding insolent is because it seems that he should have known that his example was irrelevant. Notice that the professor's sentence needn't include some implicit ceteris-paribus clause in order for his utterance to be true; as this example illustrates, in ordinary conversations, plain old strict generalization sentences are not always used to cover the full range of actual cases. Indeed, they are rarely used in this way.

If special scientists do make true utterances of generalization sentences (sometimes ceteris-paribus generalization sentences, sometimes not), then apparently nothing stands in the way of their uttering true special-science lawhood sentences. The issue here has been the truth of special-science generalizations, not any other requirements of lawhood.

10. Concluding Remarks: What is Next?

How will matters progress? How can philosophy advance beyond the current disputes about laws of nature? Three issues are especially pressing. The first concerns whether laws “govern” the universe, exactly what it means to say that they do, and how that affects our understanding of lawhood. The second is the issue of whether there are any contingent laws of nature. Necessitarians continue to work feverishly on filling in their view, while Humeans and others pay relatively little attention to what they are up to; new work needs to explain the source of the underlying commitments that divide these camps and to figure out what each group is doing right. Finally, more attention needs to be paid to the language used to report what the laws are and the language used to express the laws themselves. It is clear that recent disputes about generalizations in physics and the special sciences turn on precisely these matters, but exploring them may also pay dividends on central matters regarding ontology, realism vs. antirealism, and supervenience.[2]

Bibliography