Inhuman Rationality?

We seem comfortable celebrating those who practice complex statistical analysis, even if only one in a thousand can do so. And the one in a billion genius celebrated for greatly improving our understanding or practice of statistics, or other rationality, seems to us an epitome of the best in humanity. Who would call such exemplars "inhuman"?

But when rationality seems to require not rare celebrated abilities, but common despised tendencies, suddenly many complain rationality is "inhuman." For example, many balk at my suggestion that rationality requires us to be more conformist in our beliefs, deferring more to others’ opinions, even if in conversation we explore different evidence and analyses. Yet conformity is widespread, and most people have far more diverse word-opinions than betting-opinions. I’d guess far more than one in a thousand humans realize something close to the degree of conformity I’m suggesting.

It seems to me that the real issue here is not humanity but social status. We are eager to achieve aspects of rationality that get high social status, but not those that get low status. To gain high status, we don’t mind moving far out into the tails of the distribution of human features, but if low status is a prospect we suddenly express grave concern about moving out into the "inhuman" tails.

If you want to believe the truth, … just accept the average belief on any topic unless you have a good (and better than average) reason to think the causes of your belief difference would be substantially more informed than average.

We do not need different beliefs to hold and share different info and analysis. We can talk directly about the clues we have seen, and about the conclusions our analyses seem to favor. … We need not be so stupid as to form our beliefs only on such things.

It’s hard to imagine a society of beings who don’t believe in the theories and positions they argue for. I can’t help thinking that pending a vast realignment of human psychology, belief is a necessary ingredient for vigorous pursuit and advocacy of ideas. One could imagine what we might call "epistemological zombies", beings who act just like people who have strong beliefs, but who don’t actually hold those beliefs at all. (They would mostly all privately believe what seems to be the consensus belief.) … [Robin] envisions beings who are sort of halfway between ordinary humans and e-zombies. … Doesn’t the fact that people have evolved with belief highly correlated with their actions suggest that this semi-e-zombie behavior is not consistent with human cognitive limitations? …

Consider people who vigorously argue for their claims but tend to find excuses not to bet on those claims, nor to act on their claims in other ways. To some extent, such people already are the "zombies" you find it hard to imagine.

These discrepancies … are … usually considered as cases of self-deception or cognitive dissonance. People still think they believe what they claim to. … My e-zombies were meant to … be free of self-deception, but still to act similarly to self-deceived humans.

“Consider people who vigorously argue for their claims but tend to find excuses not to bet on those claims, nor to act on their claims in other ways.”
Well, maybe they are really still unsure. In “Patterns, Thinking, and Cognition” Howard Margolis suggests there are four poles of belief instead of just two: along one axis we have a kind of “gut feeling” belief, and along the second axis we have rational belief (I’d rather guess the second axis rates how satisfied we are with our rationalization, but never mind). While people will vigorously defend their gut-feeling beliefs, they might still be rather insecure about them somewhere inside. That would explain why they often don’t back their proclaimed beliefs when being wrong may be costly.

Mark Spottswood

We do not need different beliefs to hold and share different info and analysis. We can talk directly about the clues we have seen, and about the conclusions our analyses seem to favor. … We need not be so stupid as to form our beliefs only on such things.

As I noted previously, this seems to conflate what we are capable of doing, with what actual human beings are likely to do. You haven’t made the case that people are likely to maintain the same level of productive argumentation if they held systematically more conformist beliefs.

Consider people who vigorously argue for their claims but tend to find excuses not to bet on those claims, nor to act on their claims in other ways.

This may be the best of both worlds: vigorous argumentation motivated by weak (but sincere) belief, without the irrational (and probably harmful) actions that result from using the beliefs as a basis for risk-taking.

Will Pearson

Here is a question: how should you decide which degree to take (say you only care about the amount of money you will earn)?

If there were a market where people could bet on the likely wages of different jobs in three years, should everyone pick the degree best suited for the job that paid the most money?

If lots of people thought that way, this would be a self-defeating strategy, as the supply of graduates would skyrocket and the wage would go down.

alex

An older post of Eliezer’s about moderating internet forums suggested keeping some loonies around, because they act as anchors against others taking up extreme views. While in theory the forum would be better without them, that turns out to be a short-term illusion (or at least so Eliezer’s theory goes).
I believe your ideas regarding this issue are correct, at least for perfect beings. For humans, I’m not yet convinced that it (or a watered-down version) is optimal.

Brian Macker

“For example, many balk at my suggestion that rationality requires us to be more conformist in our beliefs, deferring more to others’ opinions, even if in conversation we explore different evidence and analyses.”

But, rationality doesn’t require this, so there is a flaw in your thinking. The majority of people believe in various irrational ideologies, and conforming would require irrationality.

Caledonian

No. Rationality requires that we derive our judgements directly from the available evidence, not from other people’s judgements that may or may not have been derived from evidence in a rational fashion.

burger flipper

I wonder if self-image maintenance isn’t as important as staking out social status.
Many people are locked into their social strata by innate ability, disposition, etc, and by external factors as well.
But a set of beliefs that allows one to believe himself different/special can provide someone who, say, flips burgers with some solace.

Hanson, get that book to the presses already.

Stuart Armstrong

For example, many balk at my suggestion that rationality requires us to be more conformist in our beliefs […] It seems to me that the real issue here is not humanity but social status.

If there were a way of signaling “I am being conformist, but only because I’m really smart,” then more people would go for it.

My old trick: conventional ideas, unconventionally expressed. You get to show off, without the cost to accuracy.

Nick Tarleton

It sounds very much like Hal is saying not that we should not do what you suggest because it’s inhuman, but that we cannot.

michael vassar

Robin: I don’t think that Hal was saying that e-zombies are worse than humans. At least I hope that he wasn’t. I interpreted him as saying that it wasn’t practical to try to turn humans into them. Even there I think he’s probably overstating the case, but that’s an empirical guess. The ability to entertain ideas without accepting them has long been held admirable in intellectual circles. It is a major point on Obama’s resume for instance.

Caledonian: Other people’s beliefs *are* evidence, just really awful evidence. From Robin’s perspective though, you are one of the “other people”, so your beliefs are also awful evidence, slightly more awful than that of aggregated beliefs across many people.

Mark Spottswood

Rationality requires that we derive our judgements directly from the available evidence, not from other people’s judgements that may or may not have been derived from evidence in a rational fashion.

Other people’s beliefs are evidence, and (contra michael vassar), they are pretty good evidence. It is irrational to refuse to change your probability assessment of a given belief once you hear that a reliable third party strongly believes it to be true. If I know that Bob is usually right regarding a particular subject, it is irrational of me to conclude that he is wrong on a given question, absent a good reason to think that he is less reliable than usual on that particular occasion. And as Robin has noted, I should be very reluctant to reject his opinion unless I have a better track record than Bob does.

Furthermore, redoing every expert’s work, rather than trusting them, wastes a great deal of time that could be productively invested in other projects. It would take me much, much longer to personally assess the validity of Wiles’ proof of Fermat’s Last Theorem than to ask a qualified mathematician their opinion—and in the end, that mathematician would be more likely to be correct anyway. Thus, unless the other benefits of doing it personally outweigh the extra time—such as when you place more value on developing new expertise on that subject than on other uses of your time—it is usually irrational to engage in a time-consuming survey of available evidence when a reliable expert has offered her opinion.

No. Rationality requires that we derive our judgements directly from the available evidence, not from other people’s judgements that may or may not have been derived from evidence in a rational fashion.

I believe Robin is referring to a specific body of research showing that a person X consistently holds to their initial guesses about something’s probability, even when told the differing opinions of a person Y, and even when the circumstances are artificially set up so that X knows Y has at least as much information as X does (sometimes more).

This may not be what anybody else reading this thought of, but this brings to my mind the recent study (sorry, no link) showing that chimpanzees do not cooperate socially and are very “selfishly rational” in their apparent economic calculations, compared with humans. So, John Nash was upset when the first studies of the prisoner’s dilemma by Dresher and Flood showed people cooperating a lot in repeated games, but now we know that the Nash equilibrium in such situations may simply be “inhumanly rational.”

poke

Isn’t a lot of anti-conformist sentiment merely post-hoc rationalization? Many people are strongly attracted to the belief that there’s something inherently wrong with conformity because they find themselves outside the mainstream.

George Weinberg

In cases like prediction markets where being right is rewarded and being wrong is punished, then the consensus view is useful information. But in cases where people won’t receive significant rewards if their beliefs turn out to be correct, won’t receive significant punishment if their beliefs turn out to be wrong, probably will never find out for sure if their beliefs were right or wrong anyway, and will receive approval in their social circles for expressing the “correct” beliefs, expressed beliefs are worthless as information about reality. I think the latter condition is quite common.

I’ve noticed sometimes when arguing with someone that I fall into a more extreme position than I mean to. I am less likely to do this when having a mere conversation. Maybe we slip into new roles depending on how others treat us. Conversation roles: reporter, brainstormer, etc. Argument role: lawyer. Have others had this experience? Feeling attacked seems like the trigger from conversation to argument.

“If you want to believe the truth, … just accept the average belief on any topic unless you have a good (and better than average) reason to think the causes of your belief difference would be substantially more informed than average.”

This doesn’t seem right to me, although I don’t have the time to articulate why. I think a formulation “If you want to believe the truth, just accept the average belief on any topic by people who have a masters in the most relevant field unless you have a good (and better than average) reason to think the causes of your belief difference would be substantially more informed than average.” would be more plausible to me. I think it has to do with the apparent state of ignorance in the general population, combined with the higher status that comes with expressing knowledge or belief rather than admitting lack of knowledge or uncertainty on a topic.

Also, Robin, you and I share an interest in how status-seeking influences behavior and may deform rational interest maximization or the scientific enterprise. What are some of the best blogs, recent books, and experts on status-seeking, in your opinion? Especially ones that draw links between its behavior in humans and in other primates?

George, when other people’s expressed opinions are unreliable, yours may also be unreliable.

Phil, the theory I rely on assumes nothing about who has more info.

Jeff Borack

The idea that an increase in rationality would lead to an increase in conformity is completely biased by the amount of time spent talking about interesting topics and focusing on the things that make people different. People are all nearly identical. If you were truly objective, you would have as much trouble telling people apart as squirrels. Everyone walks forward. It’s logical because that’s where our eyes are and that’s how our knees bend. Everyone wears clothes when it’s chilly. Chilly is uncomfortable. People sometimes do walk backwards, or ski naked… but that’s unusual and irrational.

How different do you think you are from the other people you pass on the street because you spend 1% of your day thinking about transhumanism? Whoa! Now this guy is a non-conformist! He’s only 99.999999939% like everyone else. Oooooo, look over there, it’s a guy who likes a slightly different genre of music based on four beats per measure!

Where did we get this bias to focus on the differences and ignore the similarities?

Z. M. Davis

Jeff, what’s irrational about focusing on the differences? Sure, point taken that compared to the vastness of the space of all possible things, all humans are virtually identical. And compared with the vastness of the space of all possible things, humans and slugs are virtually identical. So what? We care about humans, so we label our axes on an appropriate scale. The things that everyone has in common are ipso facto irrelevant.

Unknown

Michael Vassar: as Mark Spottswood said, other people’s beliefs are evidence, and not awful evidence, but evidence just as good as yours (until you show that you have reason to believe that your evidence on the topic is better than the other guy’s evidence.)

However, you were especially mistaken in addressing your comment to Caledonian, since it is well known on this blog that his beliefs are indeed awful evidence.

Jakub P

“Consider people who vigorously argue for their claims but tend to find excuses not to bet on those claims, nor to act on their claims in other ways. To some extent, such people already are the “zombies” you find it hard to imagine.”

There are many possible explanations of such behavior.

Number 1 is: protecting ego. People want to have an opinion, because that is seen as a good thing among other people. So they will say something just to state an opinion, and defend it “vigorously” (so as not to be caught having been wrong). Some people, of course.

Number 2 is: avoiding change. The fact that I KNOW something and that I ADVOCATE an idea doesn’t mean I will follow it. Following my beliefs throughout my life may be very costly or simply inconvenient, and people tend to avoid changes in life. Example: I may advocate protecting the environment, I may believe it’s very important, but if I personally need to start sorting the rubbish… come on, let the others take care of it. (That’s laziness – not making the change one would need to.)

Number 3 is: fear. I may believe that fighting criminals is good and important, and I will praise people who do it, but if I’m supposed to do it myself, even if it’s a tiny thing such as helping a kid being mugged in a dark street, I won’t take the risk, out of fear.

So: belief and action, especially where my own life is concerned, are two separate things. People are lazy, people want comfort, and they expect others to follow higher standards than they do themselves — isn’t it obvious, then, that they will say grand things (state beliefs) which won’t show up in their lives at all?

Jeff Borack

Mr. Davis, I wasn’t trying to say that people are similar to each other relative to the “vastness of the space of all possible things”. If that were the case, I would have also argued that we are indistinguishable from dogs. The author of this post seems to believe that our differences are evidence that people are irrational. I’m arguing that our differences are almost insignificant relative to our similarities, but we focus on the differences because that’s interesting to talk about. I think we have a bias to only see the differences.

michael vassar

Robin: If you aren’t claiming that we should use *some* weighing function then you aren’t, it seems to me, really claiming much, just relabeling entropy over beliefs as entropy over weighing functions. Insofar as you are claiming something, it seems perfectly wrong, as it seems to me that when a person believes something that no one else believes AND their belief is precise and mathematical, their chance of being correct is, while not high, much much higher than the negligible probability that you are assigning.
There is also some difficulty here regarding who our target audience is. Is it advice for rocks? Dogs? The institutionally insane? Fundamentalists? People who are committed to unsound epistemic practices in general? But what does it mean to provide advice to a person if their decision to adhere to your advice will not be driven by sound epistemic practices? Without the assumption of sound epistemic practices it sounds more like paternalistic coercion than advice, really. With the assumption of sound epistemic practices it’s just bad advice, given a world where such practices are known (from widespread disagreement) to be uncommon.
If you are claiming some particular weighing function for how people should weigh the advice of others, how do you justify that weighing function? This is particularly problematic if it differs from the weighing function it itself implicitly or explicitly uses or gives. That suggests a solution, namely that one is only entitled to use weighing functions which are attractors, i.e. which endorse themselves. The only such weighing functions that seem to actually be implicitly endorsed by usage are those which count only a single person, and those disagree with regard to which person. For explicit endorsement of a single person, it seems clear that we should simply try to share the pope’s positions. Since he agrees, that does seem to be coherent.

In any event, since you are being endearingly sincere lately, so will I. It is emotionally expensive to even write posts criticizing yours, because I know ahead of time that you don’t generally engage with counterarguments, give serious responses, or seriously update your positions. I don’t think there is anything in my comment here that you haven’t seen pointed out many years ago, nor anything that you have engaged with and responded to in any serious way (except maybe implicitly, without acknowledgment, in the suicide rock post), yet you have gone on for years making essentially the same claim, without either updating your beliefs or showing why people making this sort of objection should update theirs. Not only do you generally fail to live up to the Bayesian principle of taking other people’s beliefs as evidence (in this case almost everyone, even informed by you and upon some reflection, disagrees with you), you fail to live up to the more traditional academic principle of taking people’s arguments as evidence. Admittedly, so do most academics, hence the partial truth of “Science advances one corpse at a time” and its near total truth in academia outside of science (admittedly a soft-bounded field).

I have learned so much from you and I appreciate that, but if you don’t learn from me and from my responses to you, you deny me the opportunity to learn from your learning from me, and we both get much less from this blog than we could.

Michael, it is just not possible for me to engage in long debates with everyone who offers a critical comment on this blog. If you want me to engage you in more detail, you need to credibly signal somehow that your comments are much more worthy of detailed responses than the average comment. I’ve wished for a duel system to help identify the few comments I should debate in more detail. Perhaps you might post a detailed thoughtful critique somewhere?

On the substance, the advice is for everyone, whether they listen or not. You seem to ignore my qualifier “unless you have a good (and better than average) reason to think the causes of your belief difference would be substantially more informed than average.” And yes, we would want rational weights to apply well to the question of what the weights should be, but I see no proof that only dictator weights work.

josh

“We seem comfortable celebrating those who practice complex statistical analysis, even if only one in a thousand can do so. And the one in a billion genius celebrated for greatly improving our understanding or practice of statistics, or other rationality, seems to us an epitome of the best in humanity. Who would call such exemplars “inhuman”?”

You don’t watch enough baseball. If any of you are baseball fans and want to read a funny blog about old-timey cranks decrying statistical analysis precisely because it’s inhuman, check out firejoemorgan.com. It’s written by one of the writers on the US version of The Office (the guy who also plays Dwight’s cousin, Mose).

michael vassar

Robin: Your response would be more credible if you engaged in detail with critical responses from anyone, but you don’t really seem to. More indication of significant changes in your beliefs in any context over the last 5–10 years, for instance, would greatly contribute to the plausibility that they are due to sound practices.
Could you elaborate on
“Perhaps you might post a detailed thoughtful critique somewhere?”
It seems to me that my critiques above are fairly detailed, at least enough so to merit more response than I expected or got for them. It’s certainly not clear whether you are asserting that you advise dogs to adopt weighted average opinions. If so, how exactly? Rocks too?
I certainly never claimed that only dictator weights work in principle. In practice, though, surely only dictator beliefs work for explicit beliefs. For implicit beliefs it’s very implausible to me a priori that any set of weights works in this light cone.
You also seem to usually ignore your qualifier “unless you have a good (and better than average) reason to think the causes of your belief difference would be substantially more informed than average,” which isn’t surprising given that it’s very far from clear how it should be interpreted. I don’t significantly doubt that majoritarianism with that caveat is true for some way of defining the caveat, but it seems to me that finding the right weighting for the caveat is little or no simpler than the whole work of getting from a standard Enlightenment perspective to an epistemically correct basis for belief.

Z. M. Davis

Jeff, it’s not obvious to me that there is a well-defined, objective way to compare similarities and differences amongst people without reference to anything else. What do you propose? Making a complete list of similarities and a complete list of differences, and seeing which list is longer? I can’t think of an objective method for compiling a complete list, or a unique, objective way of compiling a list we could actually construct. Furthermore, a lot of the things we would like to count as human universals aren’t really universals, as you note (e.g., we can’t say that literally everyone has two eyes, because some people lost one in an accident). Does “Humans: two eyes” go in the similarities column, or “Smith: two eyes; Jones: one” in the differences column? This problem is why it’s helpful to think of things in terms of a phase space (like Eliezer’s Thingspace, which, admittedly, we can’t construct either): it’s the correlations between trait values that make categories robust (cf. Lewontin’s fallacy). But you’re not talking about phase space; you’re saying we’re objectively more similar than different. And again, I don’t know how to interpret the question without the additional information: as compared to what?

In any case, all this is irrelevant to the point of Robin’s post. Even if the differences in humans’ beliefs are objectively trivial compared to the similarities, the differences aren’t zero, and so it can be argued that there should be less disagreement.

Also–strictly out of curiosity–I notice you address me as “Mr.” Did you have any specific evidence that I in particular am male, or were you just relying on your priors, knowing the demographics of our community?

Manon de Gaillande

The mere fact that you think about a question (collect evidence, form a belief based on it, reject what you were told as a child if the evidence doesn’t point to it) almost always makes your beliefs more accurate than average, given that most people don’t bother. How should we take that into account?

Michael, Robin pointed once to a book that proposed a system for assigning weights to others’ beliefs similar to what you are talking about. They imposed the requirement of consistency, that the weighted weights were the same as the weights, so to speak, and came up with a mathematical formulation for how this could work.
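As a rough sketch of how such a self-consistent weighting scheme could work (this is my own illustration, with an invented weight matrix and invented starting beliefs, not necessarily the book's exact formalism): if each person assigns weights to everyone's opinions and the weighted averaging is applied repeatedly, the group converges to a single consensus belief, and the implied long-run weights are exactly the ones that "endorse themselves."

```python
# Lehrer/Wagner-style consensus sketch: weights[i][j] is the weight
# person i gives person j's opinion (each row sums to 1). Repeatedly
# replacing each belief with that person's weighted average of all
# beliefs drives the group to a fixed point of the mutual weighting.

def aggregate(weights, beliefs, rounds=50):
    """Return the beliefs after `rounds` iterations of mutual weighting."""
    n = len(beliefs)
    for _ in range(rounds):
        beliefs = [sum(weights[i][j] * beliefs[j] for j in range(n))
                   for i in range(n)]
    return beliefs

# Three people, each with an initial probability for some claim.
W = [[0.6, 0.3, 0.1],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]
p = [0.9, 0.5, 0.2]

consensus = aggregate(W, p)
# Everyone ends up holding (almost exactly) the same belief.
print([round(x, 4) for x in consensus])
```

The fixed point here is determined by the left eigenvector of the weight matrix: a set of weights that, when used to weigh the weighting itself, reproduces itself, which seems close to the "attractor" requirement discussed above.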

The Great Paradox of Majoritarianism is that nobody believes it! That means a majoritarian is obligated not to believe it either. Of course, once he stops believing it, he is no longer obligated to respect other people’s opinions so much, hence his original reasons for believing it once again become convincing, and he has to go back to believing it again. It’s enough to keep you quite busy.

Robin’s comments quoted in this entry point to the tantalizing possibility that people really do believe in this stuff, deep down inside, but refuse to admit it (perhaps even to themselves). It does seem that on issues where it matters, issues of life and death, people are generally majoritarian and go along with everyone else’s beliefs. Perhaps on less important issues it is all a matter of posturing and pretense to gain social advantage. That is tempting reasoning to a majoritarian, as it helps to resolve the Great Paradox. But I can’t help thinking it is a little too easy.

Caledonian

“Who are you going to believe: me, or your lying eyes?”

When we’re trying to learn about a phenomenon, our observations of that phenomenon are more valuable than other people’s conclusions about those observations. This is not a matter of degree, but one of kind – there is a hierarchy of evidence in which things in themselves have a deeper priority than things we believe about things.

Thus, when we walk in upon our spouse cheating with a stranger, we are inclined to trust our senses and our interpretation of their data over the words of our spouse and the stranger.

What’s that? We’re taking our own judgement over the judgement of two people? Clearly, we must have left rationality behind… for a decidedly non-standard definition of ‘rationality’.

Manon de Gaillande

Caledonian: They have huge incentives to make you think your spouse didn’t cheat (P(they claim not to have cheated | spouse is cheating) is high). And even setting those incentives aside, what you see and hear is very strong evidence (P(spouse is cheating | what you saw) is high).

Now, if your spouse has incentives to make you think (s)he cheated, and you only saw weak evidence (say, vaguely flirtatious behavior), would you still doubt their claims? I don’t think so.
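In Bayesian terms the asymmetry is that direct observation carries a huge likelihood ratio while the couple's denials carry one close to 1. A toy calculation (every number below is invented purely for illustration):

```python
# Toy Bayes-rule calculation for the cheating-spouse example.
# All probabilities here are made up for illustration.

def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    num = prior * p_evidence_if_true
    return num / (num + (1 - prior) * p_evidence_if_false)

p = 0.05  # baseline chance the spouse is cheating

# Walking in on them: vastly more likely if cheating than if not,
# so the likelihood ratio (0.5 / 0.001) swamps the low prior.
p = update(p, p_evidence_if_true=0.5, p_evidence_if_false=0.001)

# Their denial: they would deny it almost regardless, so the
# likelihood ratio (0.95 / 0.99) barely moves the estimate.
p = update(p, p_evidence_if_true=0.95, p_evidence_if_false=0.99)

print(round(p, 3))  # still around 0.96 despite the denial
```

So trusting your eyes over their words is not a rejection of others' testimony as evidence; it is just weighing each piece of evidence by how strongly it discriminates between the hypotheses.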

Caledonian – no-one’s suggesting that you believe the word of others in defiance of your own experiential evidence, as well you know.

There are two extremes – flatly ignoring the views of others in all cases; and ignoring your own experience to believe everything you’re told. No sane person would advocate either. Robin’s suggestion is that when our data is incomplete, the apparent beliefs of others should have (at least some) weight in informing our decisions. Tell me exactly what your problem with this stance is.

michael vassar

Thanks Hal. I have proposed something very like that in the past, as pointing very close to the direction in which I am trying to point when I say, for instance, that some song or painting is objectively better than some other song or painting; and I genuinely do try to utilize this principle when talking about art and the like, e.g. stating that I don’t like Monty Python, but based on the people who do, they are very good. (Hmm, looks like I said that in the comments of the linked post too.) It seemed to me that “one is only entitled to use weighing functions which are attractors, i.e. which endorse themselves” was an expression of this principle, but apparently not an articulate one, and that the remainder of the paragraph was an exploration of potential problems with such an approach.

Robin: Maybe this should be a “disagreement case study”. If you don’t want to spend the time for that, what about just examining how we are better off, and how worse off, after transferring uncertainty from propositions to the weighing of other people’s beliefs.

Caledonian

Caledonian – no-one’s suggesting that you believe the word of others in defiance of your own experiential evidence, as well you know.

On the contrary, that is PRECISELY what the position expressed by some here requires. Which is why it’s a ridiculous position.

Robin’s suggestion is that when our data is incomplete, the apparent beliefs of others should have (at least some) weight in informing our decisions. Tell me exactly what your problem with this stance is.

If you have no reason to believe that they have evidence at least as good or better than yours, you have no reason to accept their conclusions above your own, all else being equal. If you have no reason to believe that they are more likely to reach a correct conclusion than you, you have no reason to accept their conclusions above your own, all else being equal.

If you’re trying to evaluate your own conclusion-drawing ability, and you are using other people’s abilities as a standard, you have reason to concern yourself with their evaluations. Otherwise, you may introduce bias.

When there are multiple variables involved, in order to evaluate one, we must hold all others constant.

conchis

“If you have no reason to believe that they have evidence at least as good or better than yours, you have no reason to accept their conclusions above your own, all else being equal.”

The symmetric corollary of this is:

“If you have no reason to believe that they have evidence that is worse than yours, you have no reason to accept your own conclusions above theirs, all else being equal.”

Which is precisely the point everyone else is making, and which you seem to insist on misinterpreting.

Michael, it would be fun to do a blog solely devoted to being critical of overcomingbias, with you and a few other consistently critical posters. It could be called “Overcoming Overcoming Bias” or something like that. 😛 As a plus, I think it would be more likely to draw the responses you (and I) would both like to see from Robin.

Caledonian

The symmetric corollary of this is:

“If you have no reason to believe that they have evidence that is worse than yours, you have no reason to accept your own conclusions above theirs, all else being equal.”

So if you have no information at all about the quality of the observers’ judgment, how should you respond? By disregarding your own judgment and accepting theirs, or retaining yours and ignoring theirs?

Z. M. Davis

“So if you have no information at all […]”

Use an ignorance prior?

Jeff Borack

Davis, thank you for responding to my posts. I addressed you as “Mr.” only because of the demographics of this community. I used it out of respect, and I apologize if you were offended. I shouldn’t have made any assumptions. Please don’t tell me if you’re male or female, I’d rather not know.

I don’t think my above posts are irrelevant to Robin’s original post. Assuming that the differences in humans’ beliefs are objectively trivial compared to the similarities, the remaining differences might be a diminishing reflection of rationality. Having one eye is unusual because most people have two, but it certainly isn’t irrational to have only one. However, voluntarily popping an eye out is extremely irrational. People don’t do this, we all agree that not popping our eyes out is a good idea, and this is a good reflection of our rationality.

The ways that people most commonly separate themselves are arguably not a matter of rationality in most circumstances. Let's say it is irrational to believe in God. Lots of people do it, and it forces them to exist in a world where basic scientific concepts are denied while other more complex scientific concepts are accepted and made use of every day. To deny evolution but accept cancer treatment… obviously irrational.

But maybe not. What if believing in God allows you to be part of a community that provides you with something that makes your life more comfortable? If you need social support or monetary support, religious affiliation can provide it. Is it irrational to sacrifice a rational belief to obtain something beneficial? If an immigrant who just moved to this country were faced with a choice between believing in evolution and sacrificing that belief to get some help learning English, setting up a small business, and meeting new friends in the community, I would argue that it is irrational to maintain the rational belief in evolution.

I can’t think of any examples where popping out your eyeball is a good idea.

It’s not obvious to me either that there is a well-defined, objective way to compare similarities and differences amongst people without reference to anything else. I simply propose sitting in a comfortable chair, preferably one that rocks, and thinking about it. I would argue that if the differences in humans’ beliefs are objectively trivial compared to the similarities, then the usefulness in exploring these differences to gain insight into human rationality is trivial as well. I would also argue that rationality is not absolute and should be considered relative to the living situations of the people making decisions. If I believe that 1+2=4, and that belief clearly benefits me, then maybe it’s not so irrational to believe. Maybe there are some biases we shouldn’t try to overcome for the sake of rationality.

Z. M. Davis

“Assuming that the differences in humans’ beliefs are objectively trivial compared to the similarities, the remaining differences might be a diminishing reflection of rationality.”

Except I’m still not convinced the assumption is coherent, let alone true. Maybe I need a rocking chair? I know I’m the one who first used the phrase here (to paraphrase your position, in the antecedent of a conditional), but I would contend that “objectively trivial” is a contradiction in terms.

“Maybe there are some biases we shouldn’t try to overcome for the sake of rationality.”

Probably there are benefits to believing false things, for those to whom it really doesn’t matter whether their beliefs are true or not, but I’m not in a position to care.

Mark Spottswood

Caledonian – no-one’s suggesting that you believe the word of others in defiance of your own experiential evidence, as well you know.

Actually, I am suggesting this, in some cases. If Caledonian sees a small purple unicorn that no one but he can perceive, I would suggest he ponder seriously whether the consensus view (that small purple unicorns do not exist) is correct, and he is hallucinating.

More generally, we should always keep in mind the large error rate in our own perceptual capacities (and, even more starkly, the systematic defects in the reliability of our memories). Read some of the literature on the reliability of eyewitness identification testimony and you will see what I mean. Unless you have reason to believe that you are a more reliable witness than other people, you should follow Robin’s dictum even regarding experiential beliefs.

conchis

“So if you have no information at all about the quality of the observers’ judgment, how should you respond? By disregarding your own judgment and accepting theirs, or retaining yours and ignoring theirs?”

Neither: treat each judgment as equally likely to be right.

Caledonian

Neither: treat each judgment as equally likely to be right.

I didn’t say you had no information about the quality of your own judgment.

Further, the tendency of human beings to rely on the judgments of others means that they’re not independent any longer. Ten people agreeing on something as a group does not have the same weight as ten people agreeing on something as individuals.
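Caledonian’s point about correlated agreement can be made precise with a toy Bayesian calculation (the 70% accuracy figure and the 50/50 prior are illustrative assumptions, not anything claimed in the thread):

```python
def posterior_true(prior, accuracy, n_independent):
    """Posterior probability a claim is true, given that n_independent
    observers (each correct with probability `accuracy`, judging
    independently) all endorse it."""
    like_true = accuracy ** n_independent
    like_false = (1 - accuracy) ** n_independent
    return prior * like_true / (prior * like_true + (1 - prior) * like_false)

prior, accuracy = 0.5, 0.7

# Ten observers judging independently: ten separate pieces of evidence.
independent = posterior_true(prior, accuracy, 10)

# Ten observers all copying one judgment: effectively one piece of evidence.
correlated = posterior_true(prior, accuracy, 1)

print(round(independent, 4))  # 0.9998
print(round(correlated, 4))   # 0.7
```

On these made-up numbers, ten independent agreements push the posterior near certainty, while ten copied agreements leave it at a single observer’s reliability: exactly the difference between a group and ten individuals.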

It may be misleading to refer to what you advocate as conformism. It seems very similar to conformism on issues where most people behave as conformists, but on issues where people are irrationally polarized, your advice suggests people hold rather unusual beliefs. Someone who averages the beliefs of Christians, Muslims, Confucians, etc. will look quite different from a conformist.

conchis

“I didn’t say you had no information about the quality of your own judgment.”

The crucial point is whether you have information about the relative quality of the judgments. If you have information about the absolute quality of your own judgment, that will imply information about the relative quality of the other person’s judgment, given your priors. In theory you can then weight things accordingly.
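The weighting conchis describes can be sketched as a standard precision-weighted average (a toy model only; the variances here are made-up stand-ins for “quality of judgment”):

```python
def combine(estimate_a, var_a, estimate_b, var_b):
    """Precision-weighted average of two independent noisy estimates.
    Each estimate is weighted by its inverse variance, so the more
    reliable judgment counts for more; the combined variance is
    smaller than either input's."""
    w_a, w_b = 1 / var_a, 1 / var_b
    combined = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    combined_var = 1 / (w_a + w_b)
    return combined, combined_var

# No information about relative quality: equal weights, simple averaging.
print(combine(10.0, 1.0, 14.0, 1.0))  # (12.0, 0.5)

# My judgment twice as precise as theirs: the answer leans toward mine.
print(combine(10.0, 0.5, 14.0, 1.0))  # mean shifts toward 10
```

With no information about relative quality this reduces to treating each judgment as equally likely to be right; information about your own absolute reliability changes the weights, which is the sense in which it implies something about the other person’s relative standing.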
