Dogmatic Doesn’t Have To Mean Closed-Minded

Over the course of your life you have to decide your position on a number of philosophical/social/political issues. You are open-minded, so you collect as much data as you can before forming an opinion. But you are human, and you can only remember so many facts.

There will come a time when the data you have collected make a very strong case for one particular position on issue A, say the right-wing position. When that happens you are pretty sure that there is never going to be enough evidence to overturn your position.

That’s not because you are closed-minded. That’s because you are very open-minded and, based on the weight of all the evidence you collected and processed as objectively as a person can, you have concluded that it’s very likely that this is the right position on A. And the fact that this is very likely the right position on A does not just imply, but is in fact equivalent to saying, that you attach very low probability to the future occurrence of strong evidence in the other direction.
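To see why that equivalence has to hold, here is a toy check (my own sketch: the 0.99 prior, the 0.7 signal accuracy, and the 50/50 threshold are made-up numbers, not anything from the post). Your future beliefs form a martingale under your own current worldview, so a standard martingale inequality says the chance your belief in A is ever dragged down to t or below is at most (1 - p0)/(1 - t); a 0.99 believer therefore gives at most a 2% chance of ever being pulled back to 50/50.

```python
import random

def belief_ever_falls(p0=0.99, threshold=0.5, n_signals=200, accuracy=0.7, rng=random):
    """Simulate one possible future for issue A, as seen from the believer's own prior."""
    a_is_true = rng.random() < p0                        # nature draws the truth from your prior
    p = p0
    for _ in range(n_signals):
        says_a = rng.random() < (accuracy if a_is_true else 1 - accuracy)
        like_true = accuracy if says_a else 1 - accuracy     # P(signal | A true)
        like_false = 1 - accuracy if says_a else accuracy    # P(signal | A false)
        p = p * like_true / (p * like_true + (1 - p) * like_false)
        if p <= threshold:
            return True                                  # strong contrary evidence showed up
    return False

rng = random.Random(0)
trials = 20_000
hits = sum(belief_ever_falls(rng=rng) for _ in range(trials))
print(f"simulated chance of ever being dragged to 50/50: {hits / trials:.4f}")
print(f"martingale bound (1 - 0.99) / (1 - 0.5):         {0.01 / 0.5:.4f}")
```

The simulated frequency lands under the bound, and most of it comes from the 1% of worlds where A was false all along.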

Now that means that there’s not much point in collecting any more information about A. And indeed there’s not much point in remembering the detailed information that led you to this conclusion. The only reason for doing that would be to weigh it against future evidence, but we’ve already established that this is unlikely to make any difference.

So what you optimally, rationally, perfectly objectively do is allow yourself to forget everything you know about A, including all the reasons that justify your strongly held views on A, and just make an indelible mental note that “The right-wing position on A is the correct one no matter what anyone else says and no matter what evidence to the contrary should come along in the future.”

The reason this is the rational thing to do is that you have scarce memory space. By allowing those memories to fade away you free up storage space for information about issues B, C, and D, which you are still carefully collecting information on and forming objective opinions about, in preparation for eventually adopting well-informed dogmatic positions on those as well.

13 comments

Only two minor flaws:
1) No one is in any danger of having scarce memory space; and
2) There’s zero evidence that allowing some memories to fade frees up storage space for new ones.

It’s certainly tempting to think of memory as the equivalent of your hard drive and RAM, and those metaphors have some use. However, like most metaphors, they only work up to a point, since wetware and hardware are, in fact, very different things.

@Matt: Of course not; what an odd question. Having functionally unlimited memory does not mean being able to find all bits and pieces of the memory, nor does it mean that everything I see or do gets encoded into memory. You’re dramatically confusing concepts.

Now, probably, somewhere in my head, there’s a vague memory of what I had for lunch that day; it’s significant enough that it would have made the jump from working to long-term memory, as evidenced by how we can all probably remember what we had for lunch yesterday. But that doesn’t mean I’m going to connect it with many other memories (which is how we’re able to recall memories). C’mon: you already know that there are things you can eventually recall that you can’t recall at first – but, as you start thinking about connected things you can recall, that leads you to the target memory (and there are also times when there just aren’t enough other connections around a memory to be able to recall it).

We have more memory than we’ll ever need. We *don’t* have the ability to recall all those memories. Very, VERY different things.

I think your point about memory capacity is the line I would take as well, twicker, but Matt’s argument isn’t that trivial. Remembering the fine points of an argument/piece of knowledge often requires an amount of repetition and practice, the opportunity cost of which isn’t necessarily minimal (Oh great, I’ve got to read Burke again this year, and I was so looking forward to reading the Hunger Games…)

So perhaps the real argument about taking a position could more accurately be termed “getting lazy” by someone who’s being sarcastic about it, or “lifestyle optimization” by someone who’s following it. After all, the point of having a position (on some level) is to make decisions easier and more automatic in the most intellectually efficient way possible.

@bellisaurius: Your point is well-made, though I would argue that, while Jeff’s argument isn’t trivial, Matt’s argument (and specifically saying something like, “Then I suppose you would have no trouble telling us what you had for lunch on _date_”) is, in fact, pretty trivial.

Now, Jeff’s original argument, that dogmatic doesn’t have to mean closed-minded, seems absolutely on the mark to me, just not for the reasons Jeff describes. We’re back to the conceptual confusion between memory and recall: one describes the ability to store information, the other describes the ability to then retrieve that information. I’ll refer you to the great work that’s been done on spreading activation: what tends to actually happen is that we recall information that confirms our previously-held opinions more readily than information that disconfirms them, and may even seek out confirming information more than disconfirming information. We may have disconfirming information sloshing around in our wetware (in our memory); however, it doesn’t have nearly the number of neural connections to other nodes that confirming information has, so we can’t get to it.
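To make the spreading-activation picture concrete, here is a toy sketch (my own drastic simplification, with made-up nodes and weights, not a faithful model of any particular theory in the literature): activation flows out from a cue along weighted links, and whatever crosses a recall threshold is what comes to mind. A belief with several mutually reinforcing confirming links lights up easily; a lone, weakly connected disconfirming memory never makes it over the threshold.

```python
def spread_activation(edges, cue, steps=3, decay=0.6):
    """edges: dict node -> list of (neighbor, weight); spread activation out from the cue."""
    activation = {cue: 1.0}
    for _ in range(steps):
        updated = dict(activation)
        for node, level in activation.items():
            for neighbor, weight in edges.get(node, []):
                updated[neighbor] = updated.get(neighbor, 0.0) + level * weight * decay
        activation = updated
    return activation

# Hypothetical memory network: "issue A" is linked to several mutually reinforcing
# confirming memories and to one weakly connected disconfirming memory.
edges = {
    "issue A":   [("confirm 1", 0.5), ("confirm 2", 0.5), ("confirm 3", 0.5), ("disconfirm", 0.2)],
    "confirm 1": [("confirm 2", 0.4), ("confirm 3", 0.4)],
    "confirm 2": [("confirm 1", 0.4), ("confirm 3", 0.4)],
    "confirm 3": [("confirm 1", 0.4), ("confirm 2", 0.4)],
}

threshold = 0.5
for node, level in sorted(spread_activation(edges, "issue A").items(), key=lambda kv: -kv[1]):
    print(f"{node:12s} activation {level:.2f}  {'recalled' if level >= threshold else 'not recalled'}")
```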

It’s not even so much a matter of “efficiency,” depending on what your definition of efficiency is. There may be a short, direct connection to one piece of disconfirming evidence; however, it would be overwhelmed by the large number of indirect, but far more numerous (and, thus, more powerful) connections to the confirming information. You can imagine it like moving a small log: one person might be able to do it, but, if you have four people and a sling to put it in, you can move it marginally faster (assuming they’re all walking) and it’s easier individually for each person – even though, in aggregate, they’re using almost 4x the amount of man-time and they’re doing far more combined work (moving 4 people + 1 log a distance of X instead of 1 person + 1 log that same distance). If efficiency means “faster,” then yes – though that “faster” aspect would seem to go against Jeff’s original point, since it’s faster because the disconfirming information is being ignored in favor of the more-immediately-gratifying confirming information.

There’s also an argument to be made that, if people intentionally look for disconfirming evidence, this will awaken more previously-dormant areas of the brain, so it might require more blood glucose and other nutrients in the long run; not sure if that study’s been done. Certainly, at the initial stage, that would not be my expectation; not sure about 5, 10, 30 minutes into the processing.

Now, someone could be dogmatic and still act to get around this natural bias (i.e., could intentionally seek out disconfirming evidence). A great example of this would be the Jesuits, whose training specifically includes continual challenges to their faith. They constantly test their faith to ensure that it exists not because being dogmatic requires little to no conscious thought (it doesn’t; it just requires blood glucose, since you’re activating larger initial regions of the brain), but because their belief in the dogma has been tested and not found wanting, at least in their eyes. To me, this would be how someone can be dogmatic and not closed-minded.

Here’s a topic for a theoretical exploration in itself: people who think that complicating the obvious makes them sound smart. The trick would be to formalize the intuition in the following quote from Nietzsche:

“Whoever knows he is deep, strives for clarity; whoever would like to appear deep to the crowd, strives for obscurity. For the crowd considers anything deep if only it cannot see to the bottom: the crowd is so timid and afraid of going into the water.”

The case for dogma in a stationary world is compelling. But if the distribution of informative events is not stationary, or if the dogmatic individual’s mapping of realized events to “philosophical outlook” is not stationary, dogma could be suboptimal.
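To put a number on that point, here is a toy simulation (my own setup, with made-up flip probabilities, a 0.75 signal accuracy, and a 99% freezing threshold, none of it from the post): let the truth about issue A flip with some small probability each period. A “dogmatic” agent updates by plain Bayes and, once it is 99% sure, writes the indelible mental note and stops listening; an “open” agent keeps filtering, allowing for the possibility that the world has changed. With no flips the two come out about the same (both make only a handful of early mistakes); once the world can drift, the frozen note goes stale.

```python
import random

def avg_errors(flip_prob, runs=200, periods=500, accuracy=0.75, freeze_at=0.99):
    """Average decision errors over many simulated histories for the two agents."""
    rng = random.Random(42)
    total_open = total_dogma = 0
    for _ in range(runs):
        truth = True                 # is the right-wing position on A correct right now?
        p_open = p_dogma = 0.5       # both agents start agnostic
        frozen = None                # the dogmatic agent's indelible note, once written
        for _ in range(periods):
            if rng.random() < flip_prob:
                truth = not truth    # the world quietly changes
            signal = rng.random() < (accuracy if truth else 1 - accuracy)
            lt = accuracy if signal else 1 - accuracy      # P(signal | position correct)
            lf = 1 - accuracy if signal else accuracy      # P(signal | position incorrect)
            # Open agent: first allow for a possible flip (it knows the hazard rate),
            # then update on the signal.
            p_open = p_open * (1 - flip_prob) + (1 - p_open) * flip_prob
            p_open = p_open * lt / (p_open * lt + (1 - p_open) * lf)
            # Dogmatic agent: plain Bayes until confident, then freeze forever.
            if frozen is None:
                p_dogma = p_dogma * lt / (p_dogma * lt + (1 - p_dogma) * lf)
                if p_dogma >= freeze_at or p_dogma <= 1 - freeze_at:
                    frozen = p_dogma >= freeze_at
            total_open += (p_open >= 0.5) != truth
            total_dogma += (frozen if frozen is not None else p_dogma >= 0.5) != truth
    return total_open / runs, total_dogma / runs

for fp in (0.0, 0.005, 0.02):
    e_open, e_dogma = avg_errors(fp)
    print(f"flip prob {fp:.3f}: open-minded avg errors {e_open:6.1f}, dogmatic avg errors {e_dogma:6.1f}")
```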

The problem, of course, is that Issue A does not (generally) exist in isolation from other issues, and the information relevant to making a decision about Issue A is also likely to impinge on one’s decisions on other issues. Furthermore, when something happens that affects Issue K, it may (it is perhaps likely to) affect how one interprets the evidence bearing on Issue A.

So a strategy of “optimal forgetting” or “optimal dogmatism” is likely only to be appropriate when dealing with issues that are wholly orthogonal to other issues…which is to say, never.
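One way to see the non-orthogonality point with a toy example (my own made-up numbers, not from the comment): suppose the evidence for A came through a source whose reliability is itself issue K. If you keep the evidence, news about K lets you re-interpret it; if all you kept was the note “A is right,” there is nothing left to re-interpret.

```python
def posterior_a(evidence_for_a, p_source_reliable, base_rate_a=0.5,
                accuracy_if_reliable=0.9, accuracy_if_unreliable=0.5):
    """P(A | evidence), averaging over whether the source was reliable (issue K)."""
    acc = (p_source_reliable * accuracy_if_reliable
           + (1 - p_source_reliable) * accuracy_if_unreliable)
    like_true = acc if evidence_for_a else 1 - acc        # P(evidence | A true)
    like_false = 1 - acc if evidence_for_a else acc       # P(evidence | A false)
    return base_rate_a * like_true / (base_rate_a * like_true + (1 - base_rate_a) * like_false)

# Yesterday: you trusted the source (95% reliable) and it vouched for A.
print("belief in A while the source is trusted:    ", round(posterior_a(True, 0.95), 3))
# Today: issue K resolves badly and the source turns out to be mostly noise.
print("belief in A after the source is discredited:", round(posterior_a(True, 0.10), 3))
# The agent who kept only the note "A is right", and forgot the evidence and where it
# came from, has nothing left to run this second calculation on.
```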

I’m a macroeconomist, so I instinctively interpreted your post “recursively”: I think even without talking about memory constraints, you’re first arguing that there is a correct scalar *state-variable* here, which is “strength of belief in position X.”

By definition, this summarizes all the stuff that led you there, so you don’t need the history anymore.

A rational model agent would indeed be “open-minded” and be absorbing stuff all the time, but in a stationary world (as Sean emphasizes), would seem to end up asymptotically with a strong view on (just those?) things warranting a strong view.

Also, it seems to me that in a stationary world, once a lot of time has passed (“a lot” relative to the stochastic properties of the things you’re trying to learn about), shouldn’t *all* your positions be strong–including the ones where you’ve decided to be “strongly middle of the road”?
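A minimal sketch of that recursive reading (my own, with a made-up signal accuracy of 0.6): the only state you carry is the log-odds in favor of position X, updated in place with each observation. In a stationary world the log-odds is a random walk with drift, so it wanders off toward an extreme, which is the asymptotically strong view, and the raw history is never needed.

```python
import math
import random

def log_odds_path(periods=1000, accuracy=0.6, x_is_true=True, seed=0):
    """Carry only the log-odds in favor of X; update it recursively, keep no history."""
    rng = random.Random(seed)
    step = math.log(accuracy / (1 - accuracy))      # evidence carried by one noisy signal
    state = 0.0                                     # the single state variable (log-odds)
    path = []
    for _ in range(periods):
        says_x = rng.random() < (accuracy if x_is_true else 1 - accuracy)
        state += step if says_x else -step          # recursive update; history discarded
        path.append(state)
    return path

path = log_odds_path()
for t in (10, 100, 1000):
    belief = 1 / (1 + math.exp(-path[t - 1]))       # convert log-odds back to a probability
    print(f"after {t:4d} observations: belief in X = {belief:.6f}")
```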

[…] Ely shows how dogmatism can be the consequence rather than the circumvention of rational thinking. We may process information rationally, arrive at a position, discard the workings that got us to […]
