Tyler Vid on Disagreement

We often like to ask lunch visitors what is their most absurd view (in the eyes of others). Alas I have so many choices. On BloggingHeads, Tyler Cowen answers this for Will Wilkinson:

Tyler: My most absurd belief, perhaps, is the extent to which I think people should be truly uncertain about almost all of their beliefs. And it doesn’t sound absurd when you say it, but I don’t, on the other hand, know anyone who agrees with it. … Take whatever your political beliefs happen to be. Obviously the view you hold you think is most likely to be true, but I think you should give that something like 60-40, whereas in reality most people will give it 95 to 5 or 99 to 1 in terms of probability that it is correct. Or if you ask people what is the chance this view of yours is wrong, very few people are willing to assign it any number at all. Or if you ask people who believe in God or are atheists, what’s the chance you’re wrong – I’ve asked atheists what’s the chance you’re wrong and they’ll say something like a trillion to one, and that to me is absurd, that even if you think all of the strongest arguments for atheism are correct, your estimate that atheism is in fact the correct point of view shouldn’t be that high, maybe you know 90-10 or 95 to 5, at most. So that maybe is my most absurd view. Most things are much more up for grabs than we like to say they are.

Will: Yeah, I agree with you.

Tyler: No, you can’t agree with me, because it’s absurd. I can agree with your absurd view, but you can’t agree with mine.

Will: I agree with you that things are more up for grabs than people think they are, but I have real problems with the idea that it’s either possible or desirable that people assign probabilities to all of their beliefs. I think it’s a weird violation of the actual computational constraints of the human mind, that we just don’t …

Tyler: Here, you are more of a philosopher than I am, and I’m more a Bayesian. I’m sure it’s possible. Now I’m not saying it’s desirable, I’m just saying I want people to do it in a lot of instances, maybe just for my aesthetic pleasure. I want to pin people down and get a sense for how sure they are, and interpret these probabilities as betting odds, if you want. Let’s say there are a lot of dying, starving children in India or sub-Saharan Africa, and you are offered a bet, and you know that the money won on these bets will go to feed these children and save their lives, and you have to name what odds you are going to bet at. And you can name a number. You want to name the best number you can because you want to save the lives of these children, so I’m not going to allow any evasion here. I don’t see why there is not always some pick of a number that’s better than a lot of other picks. You are not going to get it right, so computationally of course it’s hopeless. But look, you’ve got to give it your best guess.

Will: But why do you have to give it your best guess? There’s a practically infinite set of propositions that I could entertain, none of which I will ever entertain. But every now and then one of them will come up in a conversation. So all of a sudden I’m entertaining it. I’ve never devoted any time or energy to assigning a credal probability to that proposition. Why should I on the spot be able to say anything about it at all?

Tyler: Well, it depends on the question.
If the question I’m asking is, "What are the odds that right now exactly 23 people in the city of Cleveland, no more no less, are playing Scrabble?," that to me seems like a waste of time to consider. But if we take truly core beliefs like "Will the world end through nuclear proliferation?," "Is there a God?," "Are libertarians correct about economic policy?," and we simply want to ask people, "What do you think is the chance you’re correct?," and people are evasive, that bothers me, and I don’t think they can invoke the "Well, who ever bothers to think about that?" defense, which they could if I asked them about Aunt Milly in Cleveland playing Scrabble. But it’s not that; they are core issues.

Will: So tell me about how perfectly rational Bayesians ought to deal with disagreement. So suppose you say P and I say not-P.

Tyler: That’s right, and let’s say we have equal background on the issue in question and I have no particular reason to think that I’m less biased on it than you are. I don’t think that you can make the Robin Hanson move and say that, well, because we disagree, I look at you and you look at me and we realize that we have to agree, because somehow we are equally competent truth-seekers on this issue. It seems to me that even if I can manage to agree with you, there are other people who will disagree with both of us who are equally competent truth-seekers as the two of us are, and you can’t agree with everyone. So like who’s the disagreer, am I disagreeing with them or are they disagreeing with me, who’s at fault? Philosophically, I don’t think there’s any way, even using Bayesian reasoning, to resolve that problem. I think that computationally it is insoluble.
But I think we can still step back and say, look, no matter what we think, there are equally skilled people, or in reality actually smarter and more competent people, on just about any question other than my personal biography, who know more about it than I do and who are smarter than I am, so the odds, from my point of view, of my being correct I should winnow down really quite radically.

Will: So the implication of that then is that if you actually have a strongly idiosyncratic view, about anything, you must be irrational …

Tyler: I don’t think you must be irrational.

Will: … because all these people who are at least as smart disagree with you, that ought to weigh very heavily against you. So any sort of purist in any ideology ought to see that almost everybody disagrees with them and become a moderate, right?

Tyler: If no one agrees with you, you should be quite worried. If only a small number of people agree with you, you still should be quite worried. I don’t think it’s a numbers game; I think whatever view you end up with, it doesn’t have to be a majority point of view, because reasons have weight, not just adding up whoever agrees with you. But you still ought to say at the end of the day, look, all those other people are against me; maybe I think I’m right with probability 57 to 43, but on any truly controversial question among intelligent people, you should never think it’s 95 to 5 in your favor. Most smart people I know have that kind of attitude.

Amazingly, I agree with all Tyler says in this interview except:

He says the chance our descendants survive past our Sun is less than one percent.

He mis-attributes to me the claim that disagreers must always instantly go to identical beliefs (he’d know better if he’d heard my standard talk on this).

He says Will can’t agree with his claim that we should all be a lot less certain of our opinions – if Tyler can hold the position so can Will.

Hat tip to David Killoren.

Added 27 June: Apparently Tyler thinks it OK to estimate a controversial claim at p = 1% as long as you see only a 60% chance "that is the correct p or even in the neighborhood of the correct p" and think it equally likely your "p estimate could go either up or down." Sigh – this is a loophole so large cosmologists would study it.

“He says the chance our descendants survive past our Sun is less than one percent.”

you think it’s more than that? I’m interested in your explanation.

“Or if you ask people who believe in God or are atheists, what’s the chance you’re wrong – I’ve asked atheists what’s the chance you’re wrong and they’ll say something like a trillion to one, and that to me is absurd, that even if you think all of the strongest arguments for atheism are correct, your estimate that atheism is in fact the correct point of view shouldn’t be that high, maybe you know 90-10 or 95 to 5, at most.”

I’d like to see Tyler’s explanation for that. It seems plausible to me that “a trillion to one” is too low an estimate.

Overall, though, I think Tyler is lying in a particularly obnoxious way. I doubt this is his “most absurd” belief. It’s almost a mainstream smart person’s position, and has been since the “I am wise because I know that I know nothing” line attributed to Socrates. I would hazard a guess that most people in Tyler’s social circle, just like Robin, have a similar position regarding skepticism about personal certainty. But I admit my confidence about this is 90-10, 95 to 5, at most.

“He says the chance our descendants survive past our Sun is less than one percent.”

Hopefully Anonymous: You think it’s less than that? 🙂

For one thing, we have zero examples of other technological civilizations to look at to see how many of them survive so long (about 5 billion more years, it seems), so sheer uncertainty should enter into it. I can’t see how someone could be 99% certain that humans and their descendants will die out in the time period.

“Or if you ask people who believe in God or are atheists, what’s the chance you’re wrong”

This one interests me. I’m an atheist who’s been becoming a bit more agnostic lately. Although the main religions have huge theological holes, when you simply ask “what’s the possibility that this universe was created by an intelligence of some sort?”, I’m not aware of any really strong argument against it.

We have so little understanding of what goes on outside our universe, I think it’s hard to completely rule out the existence of intelligences with the ability to create universes… so my confidence in atheism is now 95% at most… 🙂

“We have so little understanding of what goes on outside our universe, I think it’s hard to completely rule out the existence of intelligences with the ability to create universes… so my confidence in atheism is now 95% at most… :-)”

That’s not the same as believing in the existence of a God, except in a cheating, god-of-the-gaps type of way.

Unknown

One problem with “does God exist” is that people interpret the claim in different ways, and these different claims would have to be assigned different probabilities. I suspect that HA and Allan Crossman meant two different things: HA something very specific, and Allan something more general. In fact, as Allan interpreted the claim, Nick Bostrom says that there’s a 33% chance that God exists (because if we live in a simulation, then the world was designed). Nonetheless, I would go with Tyler and say that “a trillion to one” would be a sign of overconfidence in anyone, even in the more extremely specific cases.

“He says the chance our descendants survive past our Sun is less than one percent.” It’s amazing that he made this claim after saying that people should be much less certain of their positions.

Tiiba

“When you simply ask ‘what’s the possibility that this universe was created by an intelligence of some sort?’, I’m not aware of any really strong argument against it.”

My understanding is that a “god of the gaps” is something that is invoked to explain something difficult to understand.

I don’t think that’s what I’m doing, since the hypothesized intelligence (that I now give a 5% credence to) that created our universe would presumably have to reside in some other “universe” itself, and would probably have to be the product of evolution, etc.

This hypothesis wouldn’t help me explain anything, since it raises more new questions than it answers. But I don’t see how I can rule something like this out. I suppose the best argument against it is simply Occam’s Razor, but that can never give you certainty.

I have often felt the dilemma Tyler raises about disagreement: whether you change your mind and agree with the other person or not, you know that you will still be disagreeing with very many third parties, on any issue where opinion is widely divided. Nevertheless I think the theory says that rationally you do have to come to agreement with the other person, because the evidence from living in a larger world where people believe as they do, and for the reasons they do, has to be treated just like other evidence, and can’t be the foundation for a disagreement between the two of you.

I really like Tyler’s point about atheists being too strong about their atheism, and I assume it goes without saying that he would say the same about theists. It may seem more excusable in the case of belief in God, since many religious people are open about putting faith above reason, and what can you say to that? Whereas with atheists, they are supposed to be basing their beliefs on reason, and as Tyler argues, the evidence on the matter is really too sparse to justify an overwhelmingly strong belief. Of course there are many variants on theism and it may be that any particular one has to be of quite low probability, otherwise they add up too high. But cumulatively, the possibility that the universe we see is the product of intelligent design, well… it would explain a lot. Especially if there is a Great Filter in our past, I think we have to increase the probability of ID.

The ‘probability of ID’ is irrelevant, debating ‘the probability of an ID’ is debating how many angels can dance on the head of a pin all over again. Who created the ID?

John 4

I’m sorry, but the argument at the link is mistaken. It is plausible (at least) to think that, if God exists, God is a necessary being. (That’s part of the concept of God – we can give arguments for why it should be a part of the concept (the concept perfect being would be incomplete without it) – and it doesn’t appear to be in tension with any of the other parts of the concept of God (perfect goodness, maximal knowledge and power, etc.).) But necessary beings explain themselves. To wit, if it is necessarily true that God exists, then it is necessary that it is necessarily true that God exists. And so on, ad infinitum. Hence, nothing is left unexplained. (There is no more full or complete explanation for P than that it is impossible that not-P.)

But it seems extraordinarily counterintuitive to think that a particular event such as the Big Bang, or a very large physical object such as the cosmos, could have happened/exist of necessity. Indeed, we know of no other events that happened of necessity, and of no other physical things that exist of necessity. (Note that all or almost all non-physical things/things outside of spacetime, like numbers, triangles, properties, etc. exist of necessity. So it is not unnatural or implausible to think that God exists of necessity as well.) Hence, the proposition that it is necessary that if the Big Bang occurred, it was necessary for it to occur has much less going for it than the proposition that if God exists, it is necessary that God exists. This despite the fact that the proposition that the Big Bang occurred has a lot more going for it than the proposition that God exists.

I’m not arguing here that the cosmological argument is compelling. I’m just pointing out that this particular (and common) criticism of it is misguided.

prase

It’s easy to say that people should assign numbers to their beliefs, but I wonder how to do that. What are the prior probabilities of theism and atheism? 0.5 each, if we regard theism and atheism as two principal opposing opinions? Or should we count all existing and historical religions, add atheism as one additional item, and distribute the probability uniformly, giving an overall prior probability of some kind of theism being true of around 0.998? (I don’t know how many religions there are, and I am not much interested in that fact, but 500 seems to me a quite modest guess.) Or should I take theism as one of the myriads of conceivable hypotheses people can invent? (This is how atheists arrive at the one-in-a-trillion sort of numbers, I believe.)
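For what it’s worth, the 0.998 figure above is just arithmetic under the stated assumption (500 religions plus atheism, weighted uniformly); a throwaway sketch:

```python
# Uniform prior over 500 religions plus atheism, as in the modest guess above.
n_religions = 500
hypotheses = n_religions + 1  # each religion, plus atheism, counted once
prior_some_theism = n_religions / hypotheses
print(f"{prior_some_theism:.3f}")  # prints "0.998"
```

Of course, the whole question is whether "one hypothesis, one uniform share" is the right way to carve up the space in the first place.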

And further, given any prior probabilities, how do we evaluate the arguments for and against? Let’s say the priors are 50:50 and that the ontological argument for God’s existence is sound (I don’t think it is, but admit it for the sake of argument). How do the probabilities change after considering the ontological argument? Should I also involve the probability of the ontological argument itself being sound? When God’s existence follows directly from the argument, is
P(ont. arg. is sound) = P(God exists),
or should I also assume some uncertainty about the inference? Can we design a consistent, not necessarily precise, but unambiguous algorithm for assigning probabilities to any hypothesis? I am truly curious about these questions. Until a good answer is given, I will gladly say I am very certain, quite certain or uncertain about a proposition, but I will avoid assigning numbers.
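One way to make the last question concrete (a sketch I am adding, not anything proposed in the thread): write $A$ for "the ontological argument is sound" and $G$ for "God exists", and marginalize over the argument's validity:

```latex
P(G) = P(G \mid A)\,P(A) + P(G \mid \neg A)\,\bigl(1 - P(A)\bigr)
```

If a sound argument entails $G$, then $P(G \mid A) = 1$, so $P(G) = P(A) + P(G \mid \neg A)\,(1 - P(A)) \ge P(A)$; the simple identity $P(A) = P(G)$ holds only under the further assumption that $P(G \mid \neg A) = 0$, i.e. that God's existence is impossible unless the argument goes through.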

but on any truly controversial question among intelligent people, you should never think it’s 95 to 5 in your favor.

I disagree – Tyler has a much too optimistic attitude toward people’s reasoning skills, thinking that when most people agree on something, they are more probably right. It is rather that we are evolved to agree, so disagreeing is difficult even if you have good reasons.

That Tyler’s view is wrong can be seen easily: imagine a person versed in the cutting edge of modern science instantiated in the Stone Age: he will find himself disagreeing with most people in the tribe, but surely he will be right about more things than they are? (Believing otherwise is a very postmodernist attitude, against which so much can and has been said that it need not be repeated here.)

Of course, Tyler would say that the information that the future person has is different from the one the stone-agers have. But exactly this is the case in modern science: if you find yourself disagreeing with 95% of people on a subject, this is probably because you are familiar with often very counter-intuitive results of a highly specialized field – and the information has not gone public yet.

Great discussion. I particularly note Sean’s reminder that quite a lot of physicists don’t want to admit that there is a problem with the arrow of time. The underlying motivation is all too often a kind of childish machismo: that’s philosophy, and philosophy is bad. Sadly, the truth is that physicists needed philosophers like Huw Price to keep on telling them what ought to have been extremely obvious: the arrow of time is a real mystery, and our failure to explain it fully is a very strong hint that there is something major missing from our whole understanding of the early universe.

So should one really go believing that everything is ok just because the majority of physicists do? No. The social factors working against posing new difficult questions are much too evident here. So we see that even highly sophisticated disciplines have a problem: they do not take dissenters seriously enough. So much for disagreeing with intelligent people.

As to assigning a concrete number to being right or wrong about a certain belief: in absence of empirical evidence giving some numbers to plug in, that’s just begging for a false certainty of exactness – don’t do it.

Check the reasoning and evidence behind arguments, apply some evolutionary psychology, avoid cached thoughts, and you’re well on your way.

Leonid

I think the discussion misses the important point. Most people believe that adhering to their world-views (atheism/religion, liberalism/conservatism,…) and supporting the associated policies shows them to be morally superior to other people. By acknowledging a significant chance that their opinions may be wrong, they would surrender this feeling of moral superiority.

Hopefully, I agree this isn’t really Tyler’s most absurd belief, and it is not in fact clear it influences the rest of his beliefs.

Leonid, Tyler and I agree with your point, and find it obvious.

Gunther, the fact that some people are right and others wrong doesn’t say how confident the right people could reasonably have been that they were right.

Doug S.

As a declared atheist, I will admit that “the Universe has a Creator of some sort” is entirely possible, but I will also insist that “The God of The Bible is the creator of the Universe” is patently absurd. There’s more evidence for the existence of Santa Claus than for the existence of YHVH.

I’m having trouble reconciling your statement that you are a “declared atheist” with your next phrase “the Universe has a Creator of some sort is entirely possible”.

It sounds to me like you are not an atheist but rather an agnostic.

cerebus

Interminable atheist-agnostic arguments always (alliteratively!) come down to semantics. Surely in the context of this discussion we can see we should be (weakly) agnostic about all our beliefs, but that doesn’t prevent us asserting our favored position in natural language.

The argument that Eliezer has written up in Einstein’s Arrogance is relevant here. It is very improbable that the amount of evidence that people obtain for complex beliefs is just right to keep the probability around 50%. Much more likely, it is off the scale for most such beliefs, in one direction or the other. Mostly, only beliefs for which there is little evidence (I have no idea what the weather is in New York right now) and few possible alternatives (a coin toss) remain within 5%-95%. The problem is that systematic errors add a separate influence that can eat up all of the evidence and skyrocket the belief, probably in the wrong direction, but this doesn’t apply everywhere. The problem with this particular excerpt is that Tyler emphasizes exactly such cases, where bias thrives.

Gunther, do you believe it is the case that for MOST PEOPLE, “if you find yourself disagreeing with 95% of people on a subject, this is probably because you are familiar with often very counter-intuitive results of a highly specialized field – and the information has not gone public yet”? In other words, for most people who disagree with 95% of people on a subject, it is for this reason? I would be skeptical, because I imagine there are many people out there with unusual beliefs.

Or were your comments only supposed to apply to scientists, or to scientific issues? Even then I would find your proposition a little hard to accept. Certainly there are cases where it is true, but wouldn’t they be rare and not last long?

I do agree that we are largely “evolved to agree”, but we are also evolved to make a show of disagreeing, fluffing our feathers and showing off our fine intellectual plumage.

Unknown

“The argument that Eliezer has written up in Einstein’s Arrogance is relevant here.” Probably somewhat less than you think. It’s true that if you calculated a probability based on all of your evidence, it would usually be something extreme one way or the other. But the fact is that you don’t calculate a probability, and when you feel “very certain”, you don’t have a distinct criterion that distinguishes between cases where you have a lot of evidence and cases where you are just very biased. Because you can’t distinguish these cases, you can’t give a calibrated estimate which is nearly as extreme as a calculated probability would imply. And generally, Tyler is right; if you’re talking about significantly disputed questions, you will rarely be able to give a calibrated estimate of better than 95%, even though you might often have much more evidence than that. For in at least one case in 20, you will simply be so biased that you believe you have a lot of evidence, even though you actually have a lot of contrary evidence.

Matthew C, I recommend to you the rest of that sentence, along with the link therein. I think a great many Westerners would be comfortable taking the position “the Abrahamic God is a myth” as a statement of atheism, even if one went on to offer the possibility of some other First Cause. See the old saw about atheists’ just believing in one god fewer than theists. Without brawling about definitions, one might note weak atheism (“there is no reason to believe in the existence of a deity”) and strong atheism (“there is reason to believe in the non-existence of a deity”).

Unknown, are the three prongs of the Simulation Argument equally likely? It is a sobering moment when you realize that taking the Simulation Argument seriously means giving a fairly high p to some form of intelligent design. I will differ with US’s contention that it would explain nothing; I take from Eliezer’s story the notion of us as the AI that gets out of the box, but also there are so many philosophical and theological questions that have intuitive answers if you view the universe as a big version of The Sims with a bored, not terribly empathic player.

Of course it depends on which people you are talking to: if you talk to ill-educated people, you should tell them to agree with most of the stuff they read in the next science book: it will be an improvement on their previous knowledge. (Of course, it would be even better to teach them critical thinking, but that is a time-consuming process.)

I am not so optimistic that these cases of wrong dissent are so rare. Most people work in a system (and paradigm; by system I mean to include their social connections, authority relations, and career issues). And a system/paradigm is always more stupid than a single critical thinker, because a paradigm is fixed, not so flexible when presented with new evidence and new ideas.

You will probably learn more about the real problems in a field from hallway discussions at scientific conferences than from the current published papers. Why is that? Because there people speak their true opinions, about stuff that they simply dare not publish yet because it’s not backed up by enough evidence to topple the reigning paradigm, or because it contradicts some “well-known” assumptions.

Take String Theory, for example; physics has been dominated by this approach for the last few decades, and dissenters (read: grad students) were actively discouraged from pursuing other avenues (I do not want to take a position on the String Theory issue: it is just an example). That kind of behaviour is not conducive to scientific progress.

So if we start encouraging the only people who try to do critical thinking (scientists, intellectuals etc) to go with the crowd, that would be reinforcing the wrong side. A critical thinker is by nature _uncertain_ in his beliefs. This uncertainty is an achievement, still too sparsely sprinkled among humankind to already start paddling in the other direction again.

Of course, Tyler in the interview above says we should be less certain about things (the dissent thing was only a secondary point); so isn’t he paddling in the right direction?

Maybe he and I would find that we agree when discussing things out carefully in person; but I only have access to the interview, and responding to that, I think he is moving in the direction of self-defeating skepticism: if you start being _too_ uncertain, then “anything goes”. ESP? Spaghetti Monsters? And what have you… (see his proposed massive reduction of certainty on the atheism question). Why fund scientific education and not some New Age festival when we are so uncertain?

I always try to be open to new arguments, which probably implies that I am very “uncertain” of my knowledge. But that does not preclude me from saying that other propositions are probably even less likely to be true (sometimes even if they are the current mainstream; that does not mean that I have a better answer, only that I think some assumptions are premature/biased by our human form, etc.).

So what I am trying to say is this: being uncertain is ok and good, but not if it lets you fall into some kind of general “anything goes” skepticism (which I have often witnessed happening in people when skepticism is adopted).

I think the approach offered by critical rationalism is a fine heuristic, and evades the problems of extreme skepticism; and probably the idea that “extreme skepticism is a good thing” is only a belief in belief; and that it is in practice actually only used to shoot down novel theories (as in: “I don’t believe these theories of yours are right, they contradict my intuition and my 30 year old education”) versus being used to shake up belief in some deeply held conviction (that would be a good thing, so seldom seen!).

And in practice, we should adopt those stances which are most conducive to progress.

Unknown

Zubon: Nick Bostrom suggested that the three possibilities were roughly equally probable.

Personally, I would think both human extinction and a human future without any simulations are more probable than the simulation hypothesis, leaving perhaps a 10% chance of a simulation.

What can we plausibly give trillion-to-one or lower odds to? It seems to me we can for lots of things.

As for people redefining a normal, reasonable concept of “God” to whatever created a simulation that we may be a part of, it’s hard to call that anything more than intellectual cheating. I don’t think it’s worth my energy to explain why. It seems to me that the odds of the existence of a normal, reasonable concept of “God” would be on the order of smaller than 1 in a trillion, and much, much lower than 1 in a hundred, based on comparing the concept to things that have reliably been calculated to have 1 in a trillion or smaller odds. But I’d like to see this concept more rigorously hashed out (as a general way of being able to reliably give things odds this low in informal mental estimations and sortings).

Hopefully A, many people describe their belief in God as a rather vague sense of spirituality. For example, a report came out this week about a Pew Forum survey of American religious belief. Page 26 of chapter 1 finds that 92% believe in “God or a universal spirit”. Page 27 then delves into conceptions of God and finds that 60% believe in a personal God (that one could have a conversation with, apparently), while 25% see God as an “impersonal force” and 5% say other/both. It seems that this idea of God as a “universal spirit” is rather more common than many nonbelievers’ perceptions of how religious people think. I don’t know how this fits in your category of a “normal, reasonable” concept of God.

While obviously not many people believe that we are literally living in a simulation created by our descendants, sometimes you’ll hear it said that we are dreams in the mind of God, which is not so different from a simulation in the mind of a future AI. See also Tipler’s notion of God as the Omega Point at the end of the universe, when infinite computational resources become possible and all possible pasts are re-created. All these ideas blur together and make it hard IMO to rule out the more general notions of Intelligent Design. I agree however that the more specifics you layer on that general idea, the less probable it becomes.

Hal, thanks for your reply. I’m more interested in commentary on the first, second, and last sentences of my last comment. (The other stuff has been done to death, and in my opinion encourages dialectical stupidism, much like whether blacks have lower IQ than whites, whether women are underrepresented in elite science positions because they have less extreme IQ variability, and whether the answer to all our political problems is less government regulation and lower taxes.)

Unknown

HA: I think you need to distinguish between calculated probabilities (i.e. in effect frequency calculations) and humanly calibrated probabilities (i.e. subjective estimates.) Robin and Tyler are talking about the latter, and odds of a trillion to one for a subjective estimate are indeed absurd.

Unknown, I think you’re making a false distinction. I’m very interested in reading the input of others about this.

Nick Tarleton

One reason you might not be able to give trillion-to-one odds, at least against statements about the local near future (like “will this random number I generate between 1 and 1 trillion be 189,234,859,124?”), is if the probability that you’re in a practical-joke simulation, where the first thing you assign tiny odds to will happen, is >>10^-9 and >> the probability of being in a simulation where it’s guaranteed not to happen.

I might be able to assign a probability to a “normal, reasonable concept of ‘God’” if I knew what that meant, and I agree that simulators aren’t part of the common-language concept, but I really don’t know what people mean when they say they believe in a “universal spirit” or “impersonal force”, if they’re even expressing a proposition at all.
