Wednesday, September 21, 2005

How Objective is Rationality?

There is something subjective about rationality. Given that belief aims at truth, there is a sense in which we "fail" whenever we have false beliefs. But it would be far too harsh, too "objective", to call this a failure of rationality. On the other hand, rationality cannot be completely subjective: the mere fact that I believe myself to be rational does not guarantee that I in fact am! We might do something unreasonable without realizing it. So, where do we draw the line?

John Broome has suggested various rational requirements. We ought not to believe contradictions, we should intend to do what we believe we have conclusive reason to do, etc. If we violate one of these rules, then we are irrational in doing so. This seems straightforward enough when applied to those of us who agree with the rules Broome suggests. But what if someone disputes that non-contradiction is a rational requirement? Let us assume that they are mistaken to dispute it. But nonetheless, is such a mistake necessarily irrational? Surely we cannot say that Priest and other paraconsistent logicians are irrational. Even if they are mistaken about the possibility of true contradictions, they have reasons justifying their position, which would seem to put it on a par with any other reasonably held false belief. They are not irrational in holding that the liar sentence is both false and true. (Broome grants this, and suggests that his initial non-contradiction requirement should be loosened somewhat to allow for this.)

Still, we do need some objective requirements, whatever they turn out to be. Some (e.g. Scanlon) have suggested, contrarily, that rationality is just a matter of abiding by those rules that you accept. That is, being rational "by your own lights". But this is clearly far too weak -- if someone rejects all rules of logic and reasoning, they are irrational, even if they refuse to acknowledge it. Or suppose someone denies the meta-rule that it is irrational to break a rule that you yourself accept. What then? Should we go uber-subjectivist and say that even the meta-rule only applies to those who accept it? We soon descend into absurdity.

Kolodny has similarly suggested that rationality involves acting on those reasons that it seems to you that you have. (He suggests we don't really have reason to be rational. But because of the transparency of beliefs -- the way our beliefs seem to us to be true -- if we believe we have a reason, then this will have the appearance of normative force.) But again, this is insufficient for the same reason as above -- it's just far too subjective. If you arbitrarily cease to believe in any rational requirements, then it won't seem to you that you have any reasons at all. So you will never fail to act on reasons that it seems to you that you have. So, on Kolodny's account (if I've understood him correctly), you would never be irrational. But that is surely a mistake. It is irrational to ignore rational requirements like the law of non-contradiction (with special exceptions for paraconsistent logicians and other principled objectors), and you cannot escape the charge of irrationality by saying that you don't see any reason to abide by the rule. The fact that you don't see these reasons is precisely why you are irrational!

But again, there is definitely something to the idea that rationality is about "apparent reasons" rather than fact-based reasons. There can be reasons that we are not yet in a position to know, and we certainly cannot be blamed for failing to act on those. So it's certainly wrong to say that rationality is about doing what you have most reason to do. That's just far too objective. But the subjective extreme -- that rationality is about doing what you believe you have most reason to do -- is equally implausible. So what's left?

I'd suggest that rationality is about doing what you have most apparent reason to do, where 'apparent reason' is a semi-objective, evidence-based notion. I do not just mean whatever you believe you have reason to do. Beliefs can be wildly mistaken, after all. Rather, it must be a justified belief. I also mean to include reasons that are (objectively) apparent but that you nonetheless fail to recognize. That is, if the available evidence indicates that R is a reason for you to X, then R is an "apparent reason" for you to X.

So I'm taking (fact-based) reasons and (evidence-based) justification as foundational notions, and defining rationality in terms of those. Rationality is a matter of acting on what the evidence indicates to be the best reasons. These apparent reasons might not actually be the best reasons -- we're not omniscient, and might reasonably make mistakes if the evidence has misled us. So this isn't a purely objective notion. But it's not purely subjective either, because not just any old belief about reasons will do. It's irrational to ignore apparent reasons, even if you do not realize that they are apparent reasons. (Indeed, as noted earlier, your failure to recognize apparent reasons is precisely what makes you irrational.) Sound like a good compromise?


4 comments:

- You rightly object to hard objectivity on the grounds that one might not know, or have fair access to, the reasons one has for an action. It would be impossible ever to act rationally on that definition.

- Yet the same can be said as to much of what we consider "apparent" reasons. For example, there's copious psychological evidence indicating that we in fact fail to react appropriately to information. So it's almost as impossible to act rationally on that definition.

So your apparentness-reliant rationality might fall for the same reasons that objective rationality falls.

On the other hand, I think Habermas gives a story of rationality that makes more sense: communicative rationality. As I understand it, his claim is that our thoughts are encoded in language, and our languages incorporate certain rational presuppositions (sorta like those you list from Broome), so we are compelled to accept a certain minimal form of rationality. Moreover, to the extent we interact with others communicatively, we inherently endorse that rationality.

Can anyone really reject the law of non-contradiction, except as a thought experiment?

Good point about how we're not always blameworthy for missing the evidence. I'm hoping we can just modify my definition of "apparent" to get around those sorts of objections though (i.e. if it's too objective, just make it slightly more subjective).

The question then becomes: on what principle does your definition of apparent rest? Is there a social background notion of "apparentness" (which I would suggest is at least partially linguistically constructed) which doesn't itself have any reason for existence, except its necessity?

I guess we need a prior epistemological theory to base this upon. Once we have an account of what one is justified in believing, and blameworthy for failing to believe, then we can use this to define 'apparent reasons' and thus rationality.
