Archive for July, 2007

[I’m traveling today, so I took the liberty of pulling this comment by frequent poster Michael S. up to the front page. If you’re curious as to the context, see the Africa Addio thread. – MM]

I suspect that curious condescension and the myth of the noble savage flow from the most central beliefs of Universalism.

The original theological meaning of universalism was that, at the end of days, no one would be damned – everyone would be saved. This belief was closely associated with early unitarianism, for example that of the Socinians. Universalism is a way of denying that “in Adam’s fall/we sinned all.” If there was no original sin, there is no need for a Savior, hence unitarianism.

If not by original sin, how, then, are we to explain the moral failings of humanity? If, in a state of nature, man is naturally good, the reason must be (as Rousseau suggested) that the institutions of society are to blame. Get rid of them, and man’s inherent goodness will flourish.

The concept of the noble savage and of the corrupting influence of civilization is made much easier to accept by our intimate acquaintance with all the failings of our own social institutions, and our comparative ignorance of those of others. It is easy to see primitive societies as innocent, first because the wish is father to the thought, and secondly, because we don’t know that much about them.

Universalism inverts the thesis of Bishop Reginald Heber’s famous hymn “From Greenland’s Icy Mountains.” In Heber’s lyrics, the lands of the heathen are described as places where “every prospect pleases/and only man is vile” – because ignorant of Christ’s promise to redeem vile men from the original sin that is their portion as heirs to Adam’s fall. But, since according to the Universalist view, man is not inherently vile, it is amongst civilized people that every prospect pleases, and only man is vile because of the perverting influence of civilization. Therefore we have a better chance of finding the good amongst peoples who have never been tainted by its poison breath.

The degree to which this view has persisted since the time of Rousseau, despite all the evidence to the contrary, shows the depth of the Universalist faith. The French revolution alone should have shown that Rousseau’s theory – that sweeping away the old corrupt institutions would inaugurate a return to the Edenic state – was a fraud. Still, the same sort of people who pinned their hopes on the Jacobins later pinned them on the Bolsheviks, and on Mao Tse-tung; and they still think Castro’s Cuba is a promising experiment, if not a proved one (cf. Michael Moore).

Anthropology is merely another venue for such hopes. Consider Margaret Mead’s Samoa, the “gentle Tasaday,” and other false and romantic interpretations of the primitive. Even primate zoology does not escape this. For a long time, chimpanzees, we were assured, shared more DNA with humans than any other species. To be sure, baby chimps are cute and affectionate; but it did not long escape notice that adult ones shared, in addition to DNA, all-too-human tendencies towards territoriality, greed, and violence. Then the primatologists discovered bonobo apes, who supposedly were bisexual, had a communal economy (“it takes a village…”) and a matriarchal social organization. What could more ideally answer Universalists’ fondest hopes? Unfortunately for them, this myth too is now beginning to crumble (see the most recent [July 30] issue of the New Yorker).

Whatever may be said about some of the beliefs of orthodox Christianity, it seems to me that there is deep insight into the human character in the doctrine of original sin.

Pride, anger, avarice, envy, sloth, gluttony, and lust are fundamental and dominant aspects of human nature, semper et ubique et ab omnibus. They are innate, because they are all in one way or another assertions of self-interest. Consider the squalling infant, who is too young to be sensible to any moral instruction. He thinks only of his own comfort, and never mind his mama’s or papa’s inconvenience, he wants it NOW! The infant displays six of the seven deadlies – lust alone fails to appear until the onset of puberty.

All of the character traits we admire, such as kindness, honesty, and generosity, by contrast, are not instinctual in this way; they have to be learnt. It is (among others) the job of civilization to teach them, and that point is one that Universalism fails to acknowledge. It may be the germ of its destruction.

Samantha Power is, as her latest in the NY Times informs us, “professor of the practice of global leadership” at Harvard’s Kennedy School of Government.

In other words, you might say, Power is a professor of power. Global power, to be exact. One would expect her thinking to be characteristic of the Powers-That-Be, which we at UR so fondly call the Polygon.

And the piece linked above does not disappoint. A Times Book Review front-pager in the grand old style, it delivers the true Universalist goods on how to win the “War on Terror.” Without, of course, calling it that.

It should go without saying that Luttwak is a realist and Power is a priest. Or priestess, I guess. I will simply take Luttwak’s point of view for granted – I don’t think the debate is even worth discussing. But I still think it’s worth reading Power’s sermon, because it demonstrates so many of the pathologies that fascinate us here at UR.

First of all, it’s important to note the common denominator of all her policy proposals. Note that every solution Power proposes involves increasing the importance of the State Department, and/or decreasing the importance of the Defense Department. Presumably funding is to shift accordingly.

Of course, this reflects the fact that State is a Universalist (BDH) bastion deep in the Polygon, and the Revelationist (OV) enclave of DoD is its ancient hereditary enemy. With DoD’s defeats in Iraq, State smells blood and entertains a vague hope of capturing this pesky varmint alive, the same fate it meted out to CIA in the ’70s. (Unless you’ve been in a cave for the last five years, you may have chortled a little at how the scourge of Chile, Iran and Guatemala is now joined-at-the-hip with State and the Times. Same Agency – different people.)

But Power, of course, hardly sees herself as a foot soldier in this tawdry bureaucratic melee. (Any such thought is itself heretical.) She has a philosophy to peddle here, and it’s as much a philosophy of power as Machiavelli’s. Let’s look a little more closely at this case study in applied Universalism.

The thing is: “global leadership” is exactly what it says it is. It’s about ruling the world.

Does Samantha Power rule the world? Perhaps the best way to explain it is that she and her ilk pretend to rule the world. In both the common and monarchical meanings of the word.

Power does not have any real influence over Iran, Syria, Venezuela, Hezbollah, etc. Neither she nor anyone else in Cambridge, New York or Washington can call Mahmoud Ahmadinejad and tell him what to do. The Polygon can’t even influence his behavior, except perhaps by feeding him. And feeding certainly does not tame a creature of this sort.

But the Universalist view is that, since Universalism is universal, everyone in the world can and will become a Universalist. Likewise, if they are not antagonized and constantly beaten and threatened, all these rogue states, liberation movements, etc, will eventually settle down and become part of the new world order. All humans everywhere will be subject to the Polygon. There are no aliens, only citizens we haven’t naturalized yet.

A key aspect of the Polygon’s power is its ability to maintain the opinion of its literal constituents – American voting taxpayers – that things are, in fact, moving in this direction. Any policy that denies this violates the vision and is a direct attack on the Polygon.

For example, any Iran policy that says Iran is Iran, that it is and presumably will always be an Islamic republic, that Islam and Universalism are two different things, and that the way for the US to deal with Iran is to make the terms of the relationship clear and provide effective disincentives for Iranian transgressions, is a serious violation. Because this says that Iran is not subject to the “global leadership” of Harvard, that all is not becoming one, that the best interests of the American and Iranian governments and populations may in fact conflict. It says that Samantha Power does not rule the world.

I don’t think it’s a coincidence that no mainstream party or politician in the United States has any such policy on its menu. Power is always hard to let go of, even when it’s imaginary.

The essential assumption of “global leadership” is that Power and her ilk have no actual enemies. The Polygon is so powerful that no reasonable person can oppose it. Thus, anyone who attacks it is either (a) insane, (b) criminal, or (c) enraged by injustice.

Counterinsurgency theory tells us that all members of groups (a), (b), and (c) are easy to reconcile to Power’s world order. To deal with (a) and (b), hospitalize or rehabilitate them (as the ex-Foreign Service blogger New Nationalist puts it, “we got a note!“), or, for the very worst cases, prosecute and imprison them. To deal with (c), redress any grievances they may have, and make very, very sure you – or, of course, your evil twins over at DoD – don’t create any new ones. Obviously (c) is the hard part, but if you can do it, it’s problemo solvato.

It’s very difficult for me to avoid the conclusion that when historians 100 years hence look at this kind of thinking, assuming of course that there are any historians 100 years hence, their overwhelming impression will be of solipsism and hubris. Or possibly hubris and solipsism. It’s sometimes hard to tell them apart.

Note, for example, that in this hefty article on terrorism, there is no discussion of actual terrorists. Power seems completely uninterested in the actual motivation and organization of these friends we haven’t gotten to know yet.

She is similarly uninterested in the actual motivation and organization of unruly teenager-states such as Iran. Presumably borrowing a phrase from one of her reviewees – it’s a little difficult to imagine her actually thinking the e-word – she says “the United States must learn to get inside the minds of its enemies.” Does she follow this with an actual discussion of “the minds of its enemies?” Of course not.

What this diplomatic chestnut turns out to mean is that our quote-unquote enemies actually see the world in just the same way that we do. Quelle coincidence! “A Bush administration that had stepped into Iran’s shoes might have toned down its inflammatory rhetoric…” Indeed. I mean, if there’s one thing you never hear from Iran, it’s inflammatory rhetoric.

The way Power sees the world is about the way a kindergarten teacher sees her charges. For kindergarteners, one policy fits all. The way to deal with their tantrums is to let them simmer down, and never actually get angry at them. After all, they can’t actually harm you. Remember that you’re in control, and you will stay that way.

Of course, this assumes that there is no group (d) – consisting of reasonable people who are perfectly happy to fight a war to gain the usual booty of war, that is, power, or at least money. Since no one besides Power, and her fellow practitioners of “global leadership,” has or can have any real power, group (d) cannot possibly exist.

If they did exist, however, one would expect them to adopt the strategy of pretending to be members of group (c), that is, people enraged by the injustices they have suffered. Redressing these injustices (which are presumably real by at least someone’s definition of “injustice”) involves giving group (d) power, or at least money. Of course this dissuades them from trying the same trick again. And it certainly persuades others that strategy (d) is no good and not worth messing around with.

I wonder if strategy (d) has ever been tried in the past? Hm, I can’t imagine.

People throw around the word “appeasement” a lot. In my opinion, it’s a mistake to use this word, because it now has absurdly negative connotations, and the word was once used seriously by those who promoted it. At the very least, before talking about “appeasement,” we should understand who the “appeasers” were and why they thought it was a good idea.

The idea of appeasement was very simple. The goal was not to bribe or buy off Hitler, but to defeat him. The logic worked like this.

First, Herr Hitler’s speeches are full of ranting about the injustices that were supposedly done to the German people at Versailles. (True.)

Second, injustices were in fact done to the German people at Versailles. (True.)

Third, Herr Hitler is a politician, and he derives his power from popular support. (True.)

Fourth, redressing the grievances of the German people will leave Herr Hitler with nothing to whine about, so he will fall and a reasonable government will replace him. (Not true.)

Statistically, therefore, the logic of appeasement is 75% true. While this is not true enough to be actually true, it tends to be true enough to be convincing. Especially to its supporters.

And this is the thing about how Samantha Power rules the world. The interesting thing, the dog that didn’t bark in the night, is that her approach doesn’t work at all. To borrow Luttwak’s medical metaphor, it is about as effective as bloodletting. The only historical examples I can think of in which it’s led to peace are those in which the equation has just reduced to surrender. And in many others – notably Palestine – the result has been permanent war.

But – the grain of truth behind the kindergarten-teacher mentality is that the US does, indeed, have overwhelming military power. Absent some serious changes in US immigration law, the jihadis are not about to conquer North America. And so the bloodletting is just that: bloodletting. It can continue indefinitely. At present it looks set to do just that.

Therefore, Power’s proposals are not counterproductive at all. They are, in fact, adaptive. They strengthen the Polygon and weaken its real enemies – which are not foreign, but domestic. Call me crazy, but I think the whole thing makes perfect sense.

Conrad has posted a review of Africa Addio here. Since I prodded him to this, I suppose I should say a few words.

Of course this is a lovely review – what more could one expect? Conrad’s choice of screenshots is excellent – revealing the cinematic achievement that is Africa Addio. If not for the content and the title font, it’d be almost impossible to believe this documentary was made in 1966. Merely as moviemakers, Jacopetti and Prosperi (directors of the more famous Mondo films, although they consider Addio their masterpiece) were far ahead of their time.

Addio is not merely a movie, however. It is a historical document. And this is where I must fault Conrad slightly, because the problem is, he is misleading you a little. It is a very pardonable reviewer’s trick. The trouble is that he wants you to watch the movie (the Google Video copy has been pulled, but it’s easy to find on DVD), and so he makes it sound a little bit like a National Geographic Special.

And you almost certainly don’t want to watch this movie. If there is any good reason for the world to have an MPAA, Africa Addio is it. In fact, I don’t think an NC-17 would really do Addio justice. I think it might well be something like an NC-40. Certainly, if you are between the ages of three and ninety-three, there’s no chance you will ever forget the experience.

Let me put it this way. In Addio, you see two men killed onscreen. (These executions are almost certainly not faked – I don’t believe any of the footage is, although clearly some of the events, such as charismatic megafauna being hunted with spears, are orchestrated rather than incidental.) But what’s really appalling is that by the point at which you see this, which is toward the end of the film, it hardly bothers you at all. It seems, really, almost normal.

The genius of Addio is that it sweeps you into the welter of stupendous tragedy that was Africa in the ’60s, engaging your senses and forcing you beyond any conceivable denial. If you saw Hotel Rwanda or Last King of Scotland, perhaps you got a small taste of this experience. But both of these were, of course, fiction.

For example, Addio contains the only footage of actual genocide that I know of. And it’s not just footage of genocide – it’s 35mm film, shot by one of the leading documentary cinematographers of his generation, from a helicopter, of a genocide which I had never even heard of.

The genocide is the murder of the Arabs of Zanzibar in 1964. It’s briefly mentioned on this page, which gives a death count of 5000, a number which anyone who sees the film can tell is understated. You see almost that many people on screen, and that’s just in the rolls of film they used. Murdering 5000 people barely counts as genocide these days, and it hardly requires the use of large, pre-prepared, mechanically dug mass graves.

(My guess is that the memory hole in this case is due to the fact that the Arab ruling class of Zanzibar was generally aligned with the British Empire, while the African party which sponsored the coup and genocide was aligned with Tanganyika – with which Zanzibar merged three months later to form Tanzania, a longtime darling of the Western left.)

Allow me to set the scene. After trying to land in Zanzibar, and being forced to make a quick takeoff when their plane is shot at and a companion plane is burned on the landing strip, our filmmakers rent a helicopter in Kenya and fly over the scene at a reasonable height. We see a line of people, dark-skinned men, women and children in white Arab clothing, walking single file as far as the eye can see, toward the aforementioned mass graves. The soldiers guarding them occasionally look up and take a potshot at the copter, but it’s too high.

Cut to a shallow tidal flat, where hundreds at least, probably thousands of Arabs have been literally driven into the sea. Small boats are collecting a few of them. The rest merely stand around in water up to their ankles, presumably wondering what in the hell to do. There is no answer. The mainland is not in swimming distance. The next day, the helicopter returns, and where there had been people, there are now bodies – still in the same white robes.

Have you ever wondered how, if the Nazis had invented some miracle wonder-weapon in 1944 and actually won the war, they would explain the Holocaust? Because I’m sure there would be a way. Hitler never ordered it. It was war, things happen. It was the Allies’ fault for not accepting Jewish refugees. The British and Americans bombed city centers, boiling Germans alive like rats in tunnels. The Soviets did all kinds of crazy horrible things. All of these and more, I’m sure, would be deployed.

But if the Nazis knew one thing, it was how to distort reality. It’s often forgotten that when Hitler wrote of the Big Lie, he meant – of course – the lies of others. He, Hitler, was debunking these lies, offering truth to the people. But of course he was projecting, and how better to present his own Lie?

And so all educated people on the planet today learn that there was something called “colonialism,” that “colonialism” was evil, and that its death was a “liberation.”

And when I tell you that, in reality, this “liberation” amounted to an orgy of tyranny and murder which surely at least competes with the achievements of Stalin, Hitler or Mao – that the transition from “colonialism” to “postcolonialism” amounted to a transfer of the Third World from one Western faction to another, from Optimate to Brahmin, Revelationist to Universalist, from indirect rule at the local level to the same at the national level – that it replaced governments whose quality of service was generally indifferent to good, with ones whose quality of service was disastrous to mediocre, but whose officials at least had the right skin color – who are you to believe? Me, or every educated person on the planet?

Perhaps I am just like Hitler. After all, Unqualified Reservations has basically the same goal as Mein Kampf – to convince the reader that he has been fooled, that the world he thinks he lives in is a simulacrum, a fiction, a faked documentary. True, I don’t attribute the disparity to the international Jewish conspiracy, or to any conspiracy at all. But at least Hitler’s readers knew his real name.

And this is why I treasure a film like Africa Addio. Because it’s 40 years old and the things it shows happened, and because the men who made it were mad geniuses, who could turn some of the world’s ugliest history into undeniable beauty, whose work can still command our eyes when no sane man should want to see. Again, believe me – you don’t want to watch Addio. On the other hand, if you don’t believe me, you are free to watch it. At least for now.

It’s been a while since I posted anything really controversial and offensive here, and I have a vague sense that there are some new readers who don’t know what they’ve gotten into. Sure, it’s still legal to read UR. But unless you take special precautions, you’re leaving a trail of HTTP requests that future regimes may have no trouble at all in tracing to you personally. These may well qualify you for a stint in one of the new inpatient sensitivity facilities. Mellow out, as Jello Biafra put it, or you will pay. Try tapping on the wall – I might hear you.

In any case. Today I thought it’d be fun to talk about democracy. Unless you are 107 years old and a veteran of the Austrian Landwehr, you probably associate democracy with peace, freedom, progress and prosperity. Since I associate democracy with war, tyranny, destruction and poverty, we certainly have something to talk about.

My guess is that the conventional view of democracy, which I of course grew up with, is what we can call an adaptive fiction. An adaptive fiction is a misperception of reality that, unlike most such misperceptions, manages to outcompete the truth.

For example, suppose we somehow became convinced that warm beer is refreshing, whereas cold beer is poisonous. Obviously a fiction, and obviously maladaptive in our society. However, if we imagine a hot country ruled by brewers, who control their serfs by paying them only in lager, which being warm leaves them both tipsy and unrefreshed, hence quite incapable of revolt… you get the idea.

A political formula (the term is Gaetano Mosca’s) is a belief that makes the ruled accept their rulers. Since the former tend to outnumber the latter, a political formula is, if not absolutely essential, an excellent way to cut down on your security costs. A political formula is adaptive because the rulers have, obviously, both motive and opportunity to promote it.

The best example of a political formula is divine-right monarchy – simply because this formula is defunct. Hardly anyone these days believes in the divine right of kings. Since at one time, most everyone did, we have incontrovertible proof that adaptive fictions can exist in human societies. Either divine-right monarchy is a fiction, and people then were systematically deluded. Or kings do rule by the grace of God, and people now are systematically deluded.

Or, of course, both. Because Mosca’s second example of a political formula is – democracy.

In UR terms, democracy is a core tenet of Universalism. It’s really not possible to be a Universalist and not believe in democracy. It’s like being a Catholic and thinking the Virgin Mary was “just some chick.”

Universalism is the faith of the Brahmins, the intellectual caste whose global dominance has been unchallenged arguably since World War II, and certainly since the end of the Cold War. Since an intellectual is defined by his or her ability to influence the opinions of others, it’s not hard to see why democracy is such an effective political formula. Democracy means that popular opinion controls the State; intellectuals guide popular opinion; ergo, intellectuals guide the State.

As Walter Lippmann pointed out 75 years ago, public opinion in a democracy is a sort of funhouse mirror that reflects – albeit inaccurately, imperfectly and often quite reluctantly – the views of the governing elite. To be fair, it also has a certain filtering effect which discourages some of the nuttiest intellectual fads, if only because they can be positively incomprehensible to anyone who hasn’t been to Harvard. But the history of extraordinary popular delusions does not afford much confidence – and with only a few exceptions, the beliefs held at elite schools in the Unionist (Lincoln to Wilson), Progressive (Wilson to FDR), and Universalist (FDR to now) periods have been leading indicators of American public opinion. Very generally, the consensus at Harvard at year Y is the consensus of America at Y+50. If this isn’t power, what is?

I don’t think anyone reasonable would dispute this. What I do think many reasonable people would dispute is the claim that democracy is a fiction – which, note, I have not justified at all.

In fact it’s perfectly possible for a political formula to be an accurate description of reality. If democracy is the rule of Brahmins, fine. But don’t the Brahmins seem to be doing a pretty good job of it? Don’t we have – with a few small exceptions – peace, freedom, prosperity and progress? And, even more damning, don’t the places in the world that lack democracy also seem to lack these things?

It is all very convincing. But, you see, a political formula has to be convincing. We’re not talking about something some asshole came up with on his lunch break here. We’re looking at the result of 200-plus years of adaptive evolution. We shouldn’t expect a sordid little lie. We should expect a spectacular masterpiece of incredible mendacity. If it is, in fact, an adaptive fiction – and it certainly seems prudent to start by assuming the worst – democracy has fooled pretty much all of the people, pretty much all the time. At least for most of the 20th century.

So I could point out that the Austro-Hungarian Empire had plenty of peace, freedom, prosperity and progress, and hardly any democracy. Or that the same can be said of Dubai, Hong Kong, and even in many ways Singapore. Or that the Founders who created the American Republic for the most part feared and despised mob rule, or that the Civil War more than justified these fears. Or that the so-called democracies of the Progressive and Universalist eras, especially colonial confections such as the EU, combine a homeopathic dose of democracy with an allopathic dose of the Hegelian civil-service state, whose functionaries are intentionally unaccountable to “the People,” and whose jobs would change not at all if elective offices suddenly became familial – as in fact they may be in the early stages of doing.

But this would be the same kind of argument that is made in favor of democracy. A jumble of negative associations to counter the jumble of positive associations. Hardly effective against a sacred status quo.

As Swift said, it’s useless to try to reason a man out of a thing he was never reasoned into, and certainly few of us were reasoned into democracy. However, I do vaguely remember my earliest, and surely entirely received, thoughts on why democracy is so great. And perhaps it’s worthwhile trying to unravel the string from the beginning.

As I recall, I thought democracy was great because America was obviously democratic and free, it was opposed to the Soviet system which was non-democratic and non-free, and both had fought a war against the Nazis, who were non-democratic and evil. It was pretty clear to me, as it still is, that the parties running these non-democratic states were simply mafias.

So we have the association again: democracy equals free and prosperous, non-democracy equals tyrannous and poor. Case closed, it would seem.

In the standard view, democracy is like the cure for a disease. This disease might simply be described as primitiveness. The primitive way of government is tyrannous and, frankly, bestial, going back to the chimpanzees with their chief-chimps and chimp wars. Democracy cures this disease and allows us to have HDTVs and iPhones. Those who don’t take the democracy pill are stuck in chimp world and have to live under chimp government, fishing for ants with sticks.

In the inverted view, democracy is like a poison. The permanent contest for political power that democracy creates is an extreme case of limited war, in which no weapons at all are allowed, and battle is resolved by counting heads. In other words, democracy is a permanent source of friction. Only very stable, healthy and homogeneous societies can withstand this poison. In those that can’t, the cultural convention of limited warfare breaks down, and true civil war emerges, culminating in, of course, chimp government.

So a free, prosperous democratic society is like a person who’s so strong and healthy he can take a dose of arsenic every day – or at least, every four years – and still survive, sort of. The free, prosperous democratic society might be remarkably unfree and unprosperous compared to an undemocratic society that never took the arsenic, but so few of the latter survived the last two centuries that we have no basis for comparison. (You can’t really compare the US or France to Singapore or Dubai. Even the Central Powers of WWI were anything but free from democratic politics. Any exercise in imagining what 180 years of technical progress would have brought to, say, the France of Charles X, is entirely in the department of fantasy.)

Meanwhile, the undemocratic, tyrannous societies are not those which failed to take the democratic arsenic, but those which took it and found it fatal. Of course they are no longer ingesting the medication. Their lips do not move and their throat does not swallow. Civil society has been destroyed. I’m sure there are one or two 20th-century tyrannies which did not get that way as the result of a democratic dégringolade, but I find it hard to think of them.

Both the standard and inverted perspectives are quite consistent with historical fact. And the inverted model is by no means as unusual as one might think. Every time you hear someone decrying the presence of politics in government, he or she is expressing it. Anyone who praises “nonpartisan” or “bipartisan” or, so help me God, “post-partisan” government, or (especially in Europe) decries the existence of “populist” parties or politicians, or even who believes that there is no room for “extremism” in politics, is stating their fear and distrust of democracy.

Yet none of them will put it in these terms. In conventional Universalist discourse, therefore, the democratic state becomes a kind of sickbed patient, an employment opportunity for every chiropractor, homeopath or bloodletter under the sun. Its health is constantly fretted over in the direst of terms. All the problems of democracy can be solved by… more democracy.

Most people don’t know this, but Marxist-Leninist thinkers saw socialism in the same way. Socialism had this problem, it had that problem, yes, it was true, the turnips were rotting in the fields and men were sent to Siberia for speaking their minds. But was this an occasion to discard all the achievements of socialism? Wouldn’t that be curing acne with decapitation? Shouldn’t we instead move forward, to a kinder and more efficient socialism? The temptation to reform, rather than abandon, the adaptive fiction, is omnipresent.

Another way the democratic fiction protects itself is to define “democracy” as “successful democracy.” Therefore, it is easy to see that democracy is always successful. For example, there was a democratic election in Iraq – using one of the most democratic of democratic forms, proportional representation, specifically recommended by the UN – and there is now a democratic government. This government is incapable of enforcing the law or even administering itself, however, so it cannot be true democracy.

(And no one thinks the failure of democracy in Iraq casts any aspersion on democracy. Even the pessimists conclude that Iraq is simply not “ready” for democracy. The ultra-pessimists conclude that Iraq may never be ready, presumably because of its strong tribal culture and its national IQ of 87. No one seems to suspect democracy itself. If your medicine routinely kills the weak and spares only the sturdy, Occam’s razor doesn’t lead you to suspect that it’s bad for sick people, but good for healthy ones.)

In fact, the word “democracy” has narrowed over time to focus on those democratic forms which have been more correlated with success. Reversing this definition creep is a difficult and unenviable task, and so I’ll resort to my usual tricks and define a new word, which corresponds to the literal derivation of “democracy” rather than its present connotations.

Let’s define demotism as rule in the name of the People. Any system of government in which the regime defines itself as representing or embodying the popular or general will can be described as “demotist.” Demotism includes all systems of government which trace their heritage to the French or American Revolutions – if anything, it errs on the broad side.

The Eastern bloc (which regularly described itself as “people’s democracy”) was certainly demotist. So was National Socialism – it is hard to see how Volk and Demos are anything but synonyms. Both Communism and Nazism were, in fact, obsessed with managing public opinion. Like all governments, their rule was certainly backed up by force, if more so in the case of Communism (the prewar Gestapo had less than 10,000 employees). But political formulae were of great importance to them. It’s hard to argue that the Nazi and Bolshevik states were any less deified than any clerical divine-right monarchy.

Most people in democratic states tend to instinctively classify political systems into two types: democracies and everything else. (Of course, this dichotomy is typical of all political formulas – any regime constituted under a conflicting formula must be somehow invalid.) The old monarchist-aristocratic order in Europe, which was certainly not perfect, falls into “everything else,” and thus we wind up putting, say, Elizabeth I and Stalin into the same bag.

The difference between a monarch and a dictator is that the monarchical succession is defined by law and the dictatorial succession is defined by power. The effect of the latter is that the fish rots from the head down – lawlessness permeates the state, as in a mafia family, because contending leaders must build informal coalitions. Since another name for a monarchist is a legitimist, we can contrast the legitimist and demotist theories of government.

Perhaps unsurprisingly, I see legitimism as a sort of proto-formalism. The royal family is a perpetual corporation, the kingdom is the property of this corporation, and the whole thing is a sort of real-estate venture on a grand scale. Why does the family own the corporation and the corporation own the kingdom? Because it does. Property is historically arbitrary.

The best way for the monarchies of Old Europe to modernize, in my book, would have been to transition the corporation from family ownership to shareholder ownership, eliminating the hereditary principle which caused so many problems for so many monarchies. However, the trouble with corporate monarchism is that it presents no obvious political formula. “Because it does” cuts no ice with a mob of pitchfork-wielding peasants.

So the legitimist system went down another path, which led eventually to its destruction: the path of divine-right monarchy. When everyone believes in God, “because God says so” is a much more impressive formula.

Perhaps the best way to look at demotism is to see it as the Protestant version of rule by divine right – based on the theory of vox populi, vox dei. If you add divine-right monarchy to a religious system that is shifting from the worship of God to the worship of Man, demotism is pretty much what you’d expect to precipitate in the beaker.

Demotist political formulas have varied a good bit, but the phrase that expresses demotism as well as any I can think of is “self-government.” I frequently see this term used as if it meant something. In fact it does not, which is perhaps the best debunking of democracy I can offer.

Does “self-government” mean “government by yourself”? Certainly “self-employment” means “employment by yourself,” “self-abuse” means “abuse by yourself,” etc, etc. But the idea of “government by yourself” is inherently tautological. Unless you’re possessed by a demon, you govern yourself by definition. If the term means anything in this sense, it means that there is no other form of government, ie, no government at all – anarchy. But clearly this is not what the people who talk of “self-government” mean. If we are governed at all, we are governed by others – and thus “self-government” is a classic Orwellian paradox.

In practice the term seems most commonly to refer to “government by persons of the same race, culture, language, or social class as oneself.” Since I am not, in fact, a bigot, it’s quite unclear why this should matter to me. Surely I can be either oppressed or treated decently by people of any race, color or creed, whether my own or someone else’s.

From the perspective of its subjects, what counts is not who runs the government, but what the government does. Good government is effective, lawful government. Bad government is ineffective, lawless government. How anyone reasonable could disagree with these statements is quite beyond me. And yet clearly almost everyone does.

If we look at the entire demotist family, consisting of Anglo-American liberal democracy, Marxism-Leninism, and National Socialism, the last two are clearly disasters. (There is a strange tendency in contemporary Universalist thought to see National Socialism as somehow on an entirely different plane of evil than Marxism-Leninism – for example, purging neo-Nazis is routine, whereas purging neo-Communists is McCarthyism. I don’t understand this at all, but then again, I don’t understand a lot of Universalist doctrine.)

This leaves us with liberal democracy. As we’ve seen time and again here at UR, the word “liberal” is meretricious to perfection, so we need a substitute – perhaps “lawful” will do. Let’s define “lawful democracy” as any demotist government that upholds the rule of law.

In other words, Universalist lawful democracy is the least demotist of demotisms, Demotism Lite if you will. Compared to Communism and Nazism, there’s much to be said for it. But this is a rather low bar.

I think it’s pretty clear that, if you lived in 1750 and a djinn appeared to you, explained the history of demotism in the next 250 years, and gave you the option of erasing all of it and just sticking with legitimism, you’d have to be a fairly perverse and sadistic fellow to decline the offer. It’s difficult to even scrape together 10^6 victims of legitimist government, let alone the 10^8 plus that Communism and Nazism racked up – not forgetting the million or so killed in the ruthless Universalist city-bombings of WWII, which were certainly war crimes by the standard of anyone who can produce a river of tears for the sufferers of Guantanamo.

The reason it’s so difficult to oppose lawful democracy is that we have so few alternatives to compare it to. Existential dissidence in the Soviet Union, for example – the desire to defeat the system, not just reform it – derived an enormous percentage of its credibility from the fact that the West clearly existed, and clearly (much propaganda notwithstanding) worked better.

The West has no West of its own. Besides tiny fossils of old Europe like Andorra, Monaco and Liechtenstein, the only successful non-democratic states in the world are Singapore, Hong Kong and Dubai, each of which is interesting and impressive, but none of which are without problems. (I don’t normally spend much time in the Universalist blogosphere, because I consider myself pretty familiar with the product, but these threads on Singapore struck me as interesting and sincere.)

So there is no getting around it: democracy may be, as I contend, a lie, but this lie has us by the gills. It is not going away any time soon. The reason I oppose it is not because I believe there is any chance of getting rid of it in the near future, but simply because I prefer to live with what I consider an accurate perception of reality.

Also, remember that democracy is a state of limited civil war. It is always pregnant with the spark of war proper, at home or abroad. It’s fairly obvious that, in many of the international conflicts of the Universalist era, the two sides have been allied or aligned with different American political parties – even when the US military is involved in the war. To call this phenomenon dangerous would be an understatement, and I’ll say more about it shortly.

Two 20th-century writers who have existentially opposed democracy are Hans-Hermann Hoppe and Erik von Kuehnelt-Leddihn. Hoppe is a libertarian and K-L was a monarchist, so neither’s views are exactly the same as mine, but they are both worth reading. Hoppe is probably the more rigorous thinker; K-L was a much better writer with a broader, more intuitive feel for history. If you’re considering the hard and rocky road of the anti-democratic dissident, you should definitely check out their works.

Having safely arrived in an old Dutch village on the Eastern Seaboard, I find that though my Dutch is nonexistent, they all seem to speak English these days and the Internet works perfectly. So I’m in a position to stave off the growing impression of an abandoned blog.

First, various persons have complained about the moniker “Mencius Moldbug.” I adopted this handle because of my habit of posting as “Mencius” on these blogs and “moldbug” on this rather different one. I agree that it is anything but euphonious, but handles are hard to change – ask The Edge or CmdrTaco, both of whom I’m sure would love to ditch their puerile pseuds. I’d like to think mine is at least better than that. However, it certainly has nothing to do with the Chinese philosopher, Carlos Mencia, fungi, insects, etc, and I apologize if anyone is misled.

In general, if the double-barreled monstrosity is too much of a mouthful to repeat and there is no chance of confusion with the Chinese philosopher, I prefer the name “Mencius.” Because no one is really named “Moldbug” – it only works as a lower-cased handle.

I’ll probably just unmask myself at some point. I mean, it’s not like I have an actual career, anyway. I’m just very resistant to posting under my real name because, if you knew my real name and you searched the archives of an obsolete network called Usenet for it, you’d get far too many hits. Since I am 34, no reasonable person would associate anything written in 1992 with the individual I am now, but unfortunately, not everyone is reasonable. I recognize that this predicament has nothing to do with anything. But we all have our phobias.

I also want to reiterate that I am not, in fact, reducing UR to one post a week. Rather, there’s a narrative thread that runs through this blog and that has generated some long essays, and I want to make this thread slightly more formal and put it on a weekly basis. But there are also little fuzzy pieces of yarn sticking out in random directions, and these will remain.

(Also, there are now a few people with whose email I am extremely delinquent. If you are one of these people, I will get back to you in the next two days – I swear by Odhinn’s spear. And he didn’t hang those nine long nights for nothing, you know.)

Moving back to actual content, one commenter mentions science fiction as a locus of resistance to Universalism. Indeed, I read a huge quantity of SF growing up (and I don’t mean by this to suggest that the entire genre is somehow automatically puerile). Much of it was libertarian in tone, and even when it wasn’t, the exercise of imagining alternative political systems is automatically liberating. I suspect most anyone reading this had more or less the same experience.

However, if you want to seriously consider alternative political systems, it’s not clear to me that a fictional context – despite its distinguished historical pedigree – makes for either the best argument, or the best fiction. Memorability, while not the be-all and end-all, is often an interesting test of quality, and the SF I remember best after a decade-plus of abstinence tends not to be Heinlein or Stephenson, but more imaginative writers like Paul Park and Lucius Shepard. And if you have something to say, why not just say it?

There’s also a case to be made that the sugar-pill of an imaginative context helps communicate these messages to the masses. Perhaps. But, first, this idea that intellectuals have a duty to lead the non-intellectual masses is a lot of how we got into this mess to begin with. I’d much prefer to live in a world in which the masses can think whatever they want to think, ditto for the intellectuals, and no one’s philosophy affects anyone else. Of course to actually accomplish a transition to such a world, or to modify present political realities in any way, the masses must at least show up and affix their imprimatur. However, this can only happen if it follows an intellectual consensus or at least a movement, and any successful intellectual movement tends to attract the masses whether it wants to or not. So I feel it’s much more useful and effective, for libertarians and other dissidents, to simply focus on being right.

Also, it’s not really clear how well the dissident themes in these books are absorbed. To me it’s obvious that J.K. Rowling has had one too many run-ins with moralizing Universalist bureaucrats, and one would expect her readers to be suitably primed for rebellion. On the other hand, to me it’s obvious that J.R.R. Tolkien – a far greater writer – despised the State and Power in every form, and was horrified by the Universalist belief that this Ring could be used, Boromir style, for good as well as evil. But if 0.1% of the people who have read Tolkien, or seen those awful, tone-deaf movies that were made out of his books, understand this, I’d be very, very surprised.

The same commenter also mentions Congregationalism. Of course, Congregationalists were Puritans and are now Universalists, so the link is entirely justified, and the phrase “mutant Congregationalism” is inarguable. Still, it’s interesting to note that the literal meaning of the name – the principle that each church is intellectually independent, and can decide for itself on theological questions – is in fact the direct opposite of the actual orthodoxy that was imposed under this name. Because the churches did not, in fact, differ, and do not differ to this day. The tolerance is entirely illusory.

These kinds of ironies are very common in the whole Protestant complex. And going back to the biological analogy, they represent a kind of misdirected immune response – an attempt to achieve mental independence which in many (if not all) cases only produced a newer, more effective system of indoctrination. This is worth discussing in much more detail.

Someone else wants to know what I think of Ayn Rand. An excellent question, although it’s one I have some difficulty in answering because I’ve never read any of Rand’s books from cover to cover. I simply don’t like her as a writer, which makes it hard for me to express a fair opinion on her as a thinker.

However, with that caveat, my general view is that Rand’s attempt to break out of the Universalist-Revelationist (aka “liberal-conservative”) dichotomy was a bold one and worthy of much respect. Objectivism is one of the few genuine root nodes in the cladogram of Western thought. You simply cannot describe it as Christian in any way, and as such it represents a considerable achievement. For example, it differs from Rothbardian libertarianism here – Rothbardian ethics are basically Lockean ethics, and Locke was certainly a Christian. Connecting natural rights to the Bible is not hard at all.

But there is something much too Papal about Rand. She essentially constructed a system of morality and required all reasonable people to accept it. I don’t find her solution to the is-ought problem any more compelling than anyone else’s – to me, ethics are fundamentally a matter of taste, and I feel no more entitled to tell someone else they should find X ethical or Y unethical, than to tell them they should like poetry and they shouldn’t like badminton.

I feel that all reasonable people should be reasonable. I don’t ask anything more than this, and I certainly have no intention of asking Universalists to stop being Universalists, Revelationists to stop being Revelationists, Muslims to stop being Muslims, etc, etc – at least not in the sense of the value systems associated with these faiths. My view is just that a great many beliefs about the real world have become associated with these value systems, and a great many people who are otherwise quite reasonable fail to evaluate these beliefs reasonably.

For example, Universalists (like Revelationists, and basically all Christians) believe that all humans are ethically equal. No rational argument can be made either for or against this position. It is what it is, and I happen to more or less (like Peter Singer, I make some allowance for diminished states of consciousness) share it. After all, I was raised a Universalist.

But Universalists also believe that this proposition implies the proposition that all humans should be governed by “civil servants” of their own race, color, creed, or at least “nationality” (never mind that this concept should be meaningless to a Universalist). Obviously, I find this derivation – which is all the sense I can glean from the bizarre phrase “self-government” – debatable at best and ludicrously incoherent at worst.

I feel this debate is quite enough for anyone to take on. I don’t think that any set of beliefs about the spirit world, theistic or atheistic, are incompatible with an accurate perception of reality. The same goes for any set of ethical beliefs. My quarrel with the various modern religions, including but not limited to the Universalist and Revelationist versions of Christianity, is that they are all associated with beliefs about reality that are transmitted along with them, many of which are quite sound, many of which strike me as extremely strange and remarkably unreasonable.

I’m sure my judgment of many of these beliefs – metaphysical, moral and temporal – is exactly the same as Rand’s. Nonetheless, I don’t feel it helps anyone to attack the metaphysical and moral elements of Christianity, especially not in the same breath in which one suggests that the temporal elements of their received belief system may be complete baloney – or at least that it may be a useful exercise to treat these elements as if they were complete baloney, if only for the purpose of reconfirming them. If this is misguided altruism, call me an altruist.

In my ideal world, there are still Universalists, Revelationists, Buddhists, Muslims, etc. In fact, in my ideal world, I would have no problem in describing myself as a Universalist. But all I mean by this is that my metaphysical and moral beliefs are basically Universalist. In my ideal world, however, your metaphysical and moral beliefs would be entirely orthogonal to your understanding of reality and how it works, and especially to your understanding of such politically delicate fields as history and economics. Obviously, in the real world, this is not so.

Another commenter mentions – among many interesting points (yes, I certainly do see the whole bizarre Puritan obsession with the Old Testament mythos as a fundamental strand in the Universalist creed) – David Gelernter’s new book, Americanism. I have a copy of this book and I intend to review it here, so please stay tuned. (Also, anyone can call me anything they like – so long as they don’t mention my true name, which I received a thousand centuries ago on the fire-planet Zond. I would be forced to return to my original form, and destroy them. And nobody wants that.)

I feel a blockquote is desirable here, for no particular reason:

However, I think serious discussions began on both sides when CFR personnel began appearing on television news programs with the person’s name and the simple title, “Council on Foreign Relations” beneath. Personally, I keep tabs on the world conspiracy by reading Foreign Affairs every month. At $32 a year (two years for $60) you may peer directly into the core of the Progressive-Universalist nervous system, and monitor the most intimate and therefore banal goings-on there. Wells was right – the conspiracy is totally open. I mean really – what are you going to do about it? FA is evil at its wonkiest: how best to achieve a federated world state without sexism or racism, managed by transnational NGOs of zero accountability? The rest of it is even more trivial: now that we have the levers of power, what settings are best? The siesta-inducing cover story this month is concerned that globalization’s benefits are insufficiently distributed, which any honest economist (not that there are any living) could have told you would be the case. The apparatchik’s solution: a New Deal for Globalism! Well, that can’t go wrong. Once the UN is granted direct power to tax, all of the successes that the US is currently enjoying from the Income Tax, the Great Society, the War on Poverty, the Civil Rights Movement, et al. can be expanded to encircle the globe. The rough beast come round to Bethlehem at last!

Indeed. As for honest economics, I hope to offer a little in that vein myself, but one could do a lot worse than the Mises Institute.

Another commenter more or less answers his or her own question, but observes that views of the UN have changed over time. Indeed they have – reality has shown many Universalists that the UN is not the organization they want it to be. But this does not change their emotional attachment to the idea of the UN, the ideal of world government that it represents. It simply reminds them that there’s a very long road from here to there. Moreover, I think that the number of die-hard right-wing UN-haters in the US – as represented by the set of people who believe the US should leave the UN, an easily pollable question – has in general declined over time. But I have no numbers on this – it’s just a guess.

Lastly, the question everyone’s been waiting for: why don’t the Universalists do something useful, like legalizing pot? A fascinating question, because one notes that all the hippie ideas of the ’60s that involved making the State bigger and stronger have (pretty much) happened, whereas all the others have (pretty much) not. This certainly deserves its own discussion, but I will say one thing: the answer is in Jouvenel.

One, posting will be limited for the next couple of weeks, as I’m traveling. Readers should also expect limited responsiveness in the comments section, although frankly sometimes I feel like my contributions in this department only lower the tone. (UR continues to have the best comments of any blog on the net in my opinion, and since this certainly will not last, it should be enjoyed as the ephemeral mayfly pleasure it is.)

Two, I’ve decided to move to a weekly format for the jumbo-size posts, although occasional squibs, verses, etc, may appear irregularly. Large posts will appear on Thursdays by 9am, US-Pacific – the next one will be July 26. This should provide more time for leisurely digestion and genteel conversation, two of my favorite undefinable universals.

Apparently there once was a kind of obsolete proto-blog that was called a “book.”

A “book” was like a blog except that the author saved up all his posts for a year or two, then dumped them all in one big printout. This product cost thousands of microcredits, and you had to apply to the Department of Facts if you wanted to write one. And even if Facts stamped your party card, you still had to convince Information to promote you, and those who excelled at this gloriously-opaque task tended to make Talleyrand look like Montaigne.

While this sucked really just as much as it sounds, it did have certain advantages. One of them was that your readers were presented with a crisp and structured argument, rather than a great river of instant manure whose color and consistency can vary alarmingly. This was because the “book” could be “revised” and “edited,” practices we now consider unethical.

And rightly so, of course. We don’t want to return to the past. No one wants that. However, if you write online and you want to speak with any kind of confidence, you have to be able to change your mind. Ideally this is not done by surreptitiously editing the archives, as if one were writing a “book.”

As UR readers have been reminded ad nauseam, one of my many eccentric opinions is that the tradition to which most sophisticated Westerners of 2007 conform is best seen as a sect of Christianity. Since this tradition sees itself as a pure product of science and reason, neither sectarian nor Christian nor even traditional, my perspective is heretical in the strict sense of the word. We can’t both be right.

My argument is that though the tradition is theologically atrophied, its moral and political positions, and its personal and institutional patterns of transmission, identify it as the legitimate modern successor of mainline progressive Protestantism. Since this is only the most powerful branch of Christianity in the most powerful nation on the planet, swallowing its claims of dewy-eyed innocence is a little difficult for me.

This heresy implies a substantial qualitative revision of reality as we know it. For example, Richard Dawkins considers himself a follower of something he calls “Einsteinian religion,” which appears to differ not at all from the aforementioned tradition. From Dawkins’ perspective, he is defending reason against superstition. From my perspective, he is prosecuting one Christian sect on behalf of another. Doh.

It’s simply unrealistic to expect to be able to make this revision, or even evaluate it fairly, without adjusting the language we use to “frame” the problem. To this end I’ve field-tested some neologisms, such as ultracalvinism and cryptocalvinism, and also satisfied myself that existing names, such as liberalism, are just as useless and confusing as they seem.

The problem with the neologisms is that they prejudge the argument. It’s impossible to make them nonpejorative. Perhaps this tradition-to-be-named is a bolus of ancient, benighted lies, and perhaps its followers are either deluded zombies or unprincipled opportunists who need to be stopped. But the whole point of naming it is to synthesize a “red pill” that we can feed to the former, and no such pill has any reason to be bitter.

So I’ve decided I like the name Universalism, with a capital U. Most Universalists would accept this name as an improper noun, because after all they consider their beliefs universal. That is, they think everyone should share them, and eventually everyone will. So all they have to swallow is the capital letter. It goes down easily with a sip of water, dissolves quickly in any hot beverage, can be crushed and mixed with applesauce, etc.

Universalism is the faith of our ruling caste, the Brahmins. It’s best seen as the victory creed of World War II, and it’s easy to connect to the various international institutions born in that victory, which Universalists still regard as sacred if occasionally stained by human frailty, much as an intelligent Catholic sees the Roman Church. (It is not a coincidence that “catholic” and “universal” are synonyms.)

Universalism is actually already the name of a Christian doctrine, the doctrine of universal salvation. This idea, that all dogs go to Heaven and there is no Hell, is best regarded as an extremist mutation of Calvinism, in which everyone is part of the elect. The modern idea of universal salvation comes to us from Unitarian thinkers such as Emerson, and forms the second half of UUism, whose devotees are, needless to say, Universalist to perfection. (It’s an interesting exercise to compare the tenets of UUism to those of “political correctness.”)

The Universalist synthesis united two American traditions that in the past had sometimes been at odds. One was the ecumenical mainline Protestant movement, exemplified by institutions such as the Federal Council of Churches, whose most daring theologians were moving toward humanism. The other was what might (with homage to Edward Bellamy) be called the Nationalist movement, a vast raft of secular pragmatists, socialists, anarchists, communists, and other reformers, who flocked to the German-inspired university system that developed in the late 19th century, becoming a sort of roach motel for bad ideas.

(One of the most sensible of the Nationalist philosophers, William James, seriously proposed paramilitary forced labor as the cure for all social ills – in 1906. Oh, Billy, if only you knew! And the utopia of Bellamy’s enormously-influential Looking Backward (1888) is essentially the Soviet Union.)

While these groups had generally cooperated in the Progressive Era, there were some tensions – for example, over Prohibition, which the secular Nationalists found hard to swallow. These eased substantially in the New Deal, largely due to the brilliant coup in which Progressives captured the Democratic Party, their former opposition, and converted it into an extremist Progressive movement – while repealing Prohibition. FDR even had a book called Looking Forward printed under his name.

(Interestingly, both the mainline Protestant and secular Nationalist movements have deep links to the evil John Calvin, ayatollah of Geneva. Mainline Protestantism descends from Calvinism through, of course, the Puritans. The Nationalists were strongly influenced by this tradition as well, in its later Unitarian and Transcendentalist forms, but many also studied in the Prussian university system, where they learned the secular versions of Calvin’s divine State propounded by the Genevan Rousseau, and later by Hegel. Death is a master from Germany.)

After WWII, there was no longer any visible quarrel between these factions. Any views which contradicted Universalism became socially unacceptable in polite society. Progressive Christianity, through secular theologians such as Harvey Cox, abandoned the last shreds of Biblical theology and completed the long transformation into mere socialism. Nationalism also became an inappropriate term, as with the growth in American power it morphed into internationalism and, as most now call it, transnationalism. Instead of sacralized regional governments, transnationalists want to build a sacralized planetary government – on the principle that, as Albert Jay Nock put it, “if a spoonful of prussic acid will kill you, a bottleful is just what you need to do you a great deal of good.”

Creedal declarations of Universalism are not hard to find. I am fond of the Humanist Manifestos (version 1, version 2, version 3), which pretty much say it all. The UN Declaration of Human Rights is good as well. No mainline Protestant will find anything morally objectionable in any of these documents.

In a probably-vain attempt to boil down all this cant, I’ve defined the four principal Ideals of the creed as Social Justice, Peace, Equality and Community. As we’ve already seen, Social Justice means political violence, and Peace means victory. We’ll get to Equality and Community shortly.

The latest chapter in this sad and savage story was written in the 1960s, when the first postwar generation came of age. These young men and women had been educated by the Universalist “Establishment” which won the war, and were quite unaware that any serious and intelligent person could disagree with the Universalist consensus. The result was a sort of creeping Talibanization in which the doctrines of Universalism became constantly more extreme, a process that continues to this day.

Today, perhaps the simplest definition of Universalism is that it’s the belief system taught in American universities (at least, Federally funded universities). But, again, it is fundamentally a Christian sect, and its moral and political tenets will find echoes in Massachusetts and upstate New York at any time since the 1830s. Hawthorne’s Blithedale Romance, for example, is a satire of hippies – written in 1852. All that’s missing is the patchouli.

Universalists, as descendants of Calvin’s postmillennial eschatology, are in the business of building God’s kingdom on Earth. (The original postmillennialists believed that once this kingdom was built, Christ would return – a theological spandrel long since discarded.) The city-on-a-hill vision is a continuous tradition from John Winthrop to Barack Obama. In Britain, the closely-related Evangelical movement used the term “New Jerusalem,” which I’m afraid never really made it across the pond, but expresses the vision perhaps best of all. I always picture the New Jerusalem (“in England’s green and pleasant land”) as involving a lot of enormous concrete tower blocks, with the Clash’s “Guns of Brixton” playing somewhere on someone’s ghetto-blaster, and a forty-year-old grandmother screaming at her junkie daughter, but I’m not sure this is how they saw it in the 1890s.

What’s really impressive about Universalism is the way in which this messianic teenage fantasy power-trip has attracted, and continues to attract, so many people who don’t believe at all in the spirit world, only smoke weed on the weekends, and think of themselves as sensible and down-to-earth. Of course, the belief that all Universalist ideals can be justified by reason alone is a necessary condition. But Christian apologists have been deriving Christianity from pure reason since St. Augustine. You’d think these supposedly-skeptical thinkers would be a little more skeptical.

As a non-Universalist, I can’t help but admire the success of this particular replicator. It is brilliantly designed, like the smallpox virus. The fact that no one actually designed it, any more than someone designed the smallpox virus, that it is simply the result of adaptive selection in a highly competitive environment, heightens rather than detracts from my awe.

The coolest thing about Universalism is that it has the perfect opposition. If a Christian who believes his or her faith is justified by universal reason is a Universalist, a Christian who believes his or her faith is justified by divine revelation – in other words, a “Christian” as the word is commonly used today – might be called a Revelationist.

Suppose you have two faiths. Both claim to be absolutely and undebatably true. Faith A tells you it is an ineluctable consequence of reason. Faith B tells you it is the literal word of God. Which is more likely to be accurate?

The answer is that you have no information at all. Perhaps faith B is the literal word of God, but you have no way to distinguish it from something that someone just made up. Perhaps faith A can be derived from pure reason, but you have no way to know if the derivation is accurate unless you work through it yourself. In which case, why do you need faith A?

In fact, of the two, faith A is almost certainly more powerful and dangerous. As anyone who’s majored in Marxist-Leninist Studies knows, it’s very easy to construct an edifice of pseudo-reason so vast and daunting that working through it is quite impractical. And this edifice is much more free to contradict common sense – in fact, it has an incentive to do so, because nonsensical results are especially subtle and hard to follow.

Whereas when the word of God contradicts common sense, the idea that it might not actually be the word of God isn’t too hard to come by. In other words, if faith A contains any fallacies, they are effectively camouflaged, whereas the “and God says” steps in faith B’s syllogisms are clearly marked and brightly colored, and faith B pays a price in skepticism if God’s opinion is obviously at variance with physical reality.

So a reasonable observer might guess that, in fact, the tenets of faith B are more likely to be true, simply because it is more difficult for them to get away with being false. But in reality, these derivations tell us nothing. Probably faith A is right about some things, and faith B is right about some others.

However, in the struggle between Universalism and Revelationism, the former always wins. Because the Universalists control the mainstream educational and information system, this is really not at all surprising. But since, as we’ve seen, it is not rational for a reasonable observer to choose justification by reason over justification by revelation, a political system in which the Universalists are the Globetrotters and the Revelationists are the Generals is almost certain to be one which systematically propagates lies.

We’ve already seen a few of these lies, and we’ll see quite a few more. However, I think the dynamics of the struggle are better illustrated by questions in which, by whatever coincidence, the Universalists are right and the Revelationists are wrong.

For example, because my zip code is 94114, although I am straight as an iron spear, I happen to see nothing at all wrong with “gay marriage.” In fact I am completely sympathetic to the Universalist view, in which the fact that couples have to be of opposite sexes is a sort of bizarre holdover from the Middle Ages, like the ducking-stool or trial by fire. It’s not clear to me why homosexuality, which obviously has some extremely concrete biological cause, is so common in modern Western populations, but it is what it is.

However, because I am straight etc, and also because I’m not a Universalist, I happen to think the issue is not really one of the most pressing concerns facing humanity. And so it occurs to me to wonder how exactly gay marriage became an “issue,” when no one twenty years ago even thought of it as a possibility. Whatever the force is that brought this about, I find it hard to imagine anyone describing it as “democratic” with a straight face.

If anyone can come up with an example of a way in which American public opinion has changed in this way, but the change has gone against the Universalists and in favor of the Revelationists, I would certainly be interested to hear it. I think there are a few exceptions – notably in the domain of economics – but they all seem to involve an extremely dramatic intrusion of reality, a force which rarely has any direct impact on American opinion.

God is the whick-whick-whack-whack-whock
of gray wings cut so short his elbows knock
like a high-test motor on low-test gas –
can they really lift his plump, silky ass?
He’s the converse of the case from design,
the fake Rolex, the Mississippi line,
and if the thing has the shape of the hand
I almost feel like I’ve worked with the man,
a good guy, if you didn’t let him near
your code. Who’s really born an engineer?
Your pinky’d ought to have a Philips head,
and balance argues for the quadruped;
and yet he flies, this pig, a miracle,
a joke, a love note to the cynical.

There are three rules you need to remember if you want to survive grad school.

Rule (a) is: never go to grad school before you’re either old enough to drink, or old enough to have had a drink. Rule (b) is: never go to grad school without first having had a real job, that is, one which you for some reason were once tricked into actually giving a crap about, at least up till they hired that horrible woman with the bad hair. Rule (c) is: never stay in grad school.

Since I have broken only (a) and (b), and managed to restrain myself on (c), I feel that while I am certainly nothing special in the world, I have some right to present myself as scarred but not devoured. Granted, those who made it all the way into the whale, and especially those who have chosen to remain there, are often wonderful people, and one should in no way be embarrassed to have them as one’s friends. But sometimes one is unsure of their voices. It is hard to always be absolutely sure.

The overwhelming fact of the modern world is that universities are not merely the charming, bucolic gardens of knowledge that they pretend to be. Granted, they retain a few leafy spots. A tree or two, a neatly sprinkled lawn. But the modern American university is a machine, and its business end – which seems to command a rapidly increasing percentage of its abdomen – is certainly sharp and appears to be rotating. It tells us it has no intention of grinding us into paste, but it would be hard to design a more impressive tool for the task.

The charge that universities are directly responsible for almost all the violence in the world today, for example, strikes me as essentially accurate. I’m sure it will strike you as absurd at best, and libelous at worst. But if you can stop these reflexes before they engage, please ask yourself whether you have ever seriously considered whether this accusation is or is not, not in some ideal world, but on the actual Earth planet we inhabit right now as I am typing this, actually true or actually false.

Because if it is true, it sure as hell explains a lot of obviously-insane crap that is otherwise extremely hard for me to understand or make any sense of whatsoever. Perhaps others can offer a better story, but until I hear otherwise – and I would like to – I will continue to assume that the universities, along with all other official information sources, are hostile replicating subsystems and need to be terminated.

Hopefully without any prejudice at all. For example, suppose you were a professor at MIT, or an assistant editor at the Times, or a senior economist at the Fed. Not really a public figure, but certainly someone with a very large pair of balls, or ovaries as it may be. Obviously you would need to find a new line of work, but it’s not clear that you would want the old one on your resume. You could say you were off trying to write a novel, or fighting as a “contractor” in Iraq, or something. Can anyone really check up on these things? Do they even want to know? And it explains your weary mien, otherwise unusual in one with no evidence or prospect of professional growth.

This is how it would go in my imaginary ideal future. Of course this bears no resemblance to anything I actually expect to actually happen in the aforementioned actual real world. I expect it will be quite a bit nastier, and not soon at all. But I certainly think the sooner it happens, the better the whole experience will be for everyone.

There is definitely no point in saving any particular department which claims to be “science,” any university which pleads that it’s “private,” any “newspaper” or “public school,” etc. The entire system of official “education” has to be completely wiped, preferably even swapping out the hardware – as we say in the trade. (Many university campuses, for example, could easily be redeveloped as prisons, luxury housing, police academies or corporate headquarters.)

It’s not clear to me that Digg, Wikipedia, arxiv.org, and other modern systems which solve, or at least purport to solve, the critical problem of separating content from nonsense, are quite ready for their new roles. But perhaps we’ll be surprised. Certainly, industry will not suffer from the impact of a large population of extremely intelligent and potentially productive individuals, who until now have been devoting their nervous systems to what might as well be Neoplatonist astrology. As for “science,” most of the advances in Western scientific history, contrary to popular belief, occurred when scientists were not servants of the State.

In any case. So this is basically my perspective on the American university system. Some will certainly take it as extreme. But I actually think it’s quite moderate.

A Navrozov moment is a moment when you realize that the university, which was established as a refuge whose purpose was to pursue truth without regard for the opinions of the world, has become a power center whose purpose is to impose its own opinions on the world. As such it has no more use for independent thought than a dog has for beets.

The name honors this piece by Andrei Navrozov, which I’m sure that, since he is a gentleman, he and his notorious pit-bull lawyers will allow me to steal. It’s from his Gingerbread Race, which is not nearly as hard to find as it should be. Navrozov, son of the equally eccentric and perceptive Lev Navrozov, is a little too concerned with Skull and Bones and not nearly concerned enough with paragraph breaks, but he is basically a sane man and a brilliant raconteur, and the following is not at all atypical.

‘The trick of being tiresome,’ said Voltaire, ‘is to tell all.’ The great historic upheavals that are the reference points of my childhood and adolescence may all be looked up in Britannica, which can equally be relied upon to furnish a superficial history of Yale, or of American universities generally. Abstractions like cultural diversity, liberal education and academic freedom have lost none of their popularity since the day I first encountered them in the admissions brochures. What no encyclopedia can be expected to suggest, however, is what paranoid misfits like Mill and Orwell have always known to be true, namely that when, for one reason or other, a society lets go of the adversarial principle I have compared with the human soul, it develops therapeutic myths of itself which present its weaknesses as strengths, myths that displace truth in the pages of encyclopedias and allow the many to diagnose the few as paranoid misfits. The popular abstractions I place among the constituent myths of modern civilization’s public religion are not outright lies, of course. They are what Mill called half-truths, noting that ‘not the violent conflict between parts of the truth, but the quiet suppression of half of it, is the formidable evil’. By absorbing the violent shock of dissent once represented by such abstractions into its placidly gaseous whole, the religion quietly dissolves potential opposition, with the consequence that, in Mill’s words, ‘truth itself ceases to have the effect of truth by being exaggerated into falsehood’. But, as the Bach prelude, gently pealing from the chapel’s Gothic tower, stilled my paranoid aspirations, I had no time for formulations of this kind. Courses had to be chosen, and under the influence of the visible environment, which I obediently interpreted as the university had intended, I chose a course of lectures on Hegel.
The first paper assigned by Professor Rockmore was an analysis of a famous chapter in The Phenomenology of Mind in which Hegel examines the relationship between master and slave. I hoped to approach Hegel, and indeed all my studies in those early weeks of my first term, exactly the way such matters had been treated at the Vnukovo dinner table. Obviously none of our guests liked to be thought of as a learned bore, and consequently it was unimaginable that in the course of a conversation bubbling into the small hours, somebody would summarize a chapter from a book everyone else had read. I viewed the professor as my host, and the essay I submitted was intended to divert him by presenting Hegel as a slave to platitude, an antihero of thought, a man so wanting existential imagination that in a Napoleonic Europe steeped in serfdom he was unable to recognize serfdom as a reality transcending the insular concerns of an ambitious Privatdozent. Hegel’s idea that the slave enslaves the master, I reasoned, is not a paradox because in the broad historical context of universal servility it is sycophantic, as Proudhon’s idea that property is theft would not be a paradox in a society of thieves. As I wrote, I imagined Father and our guests, eviscerating an academician’s conceit here, taking a stab at a bureaucrat’s witticism there, Tsinandali flowing amid roars of laughter. I read the essay to Father over the telephone, adding news of this new university life of mine, which I imagined as a continuation of and perhaps even an improvement on the lost life of the Vnukovo enclave, a paradise perfected. The following week I came to class, expecting the thrill of violent conflict between parts of the truth, the thesis being that Georg Wilhelm Friedrich Hegel was a celebrated philosopher and the antithesis, that this son of a Stuttgart government clerk led an intellectually sheltered existence. The dialectic, however, did not work out as I expected. 
‘May I see you for a moment?’ said Professor Rockmore. I noticed red blotches on his face. He told me that he could not give my essay a mark, and that if I wanted to stay in his course I would have to rewrite it. ‘But, Mr Rockmore, this is what I think,’ I protested, ‘these are my thoughts on Hegel’s treatment of the subject.’ He referred me to my college Dean, who received me in the Gothic grandeur of his study. The Dean advised me to withdraw from the course, explaining that it was for advanced students and closed to freshmen anyway. I have been strictly reared, as Mark Twain used to joke, but if it had not been so dark and solemn and awful there in that vast room, I believe I should have said something which could not be put into a Sunday-school book without injuring the sale of it. With a sinking heart I realized that the faux pas I had made was not unlike that of a tramp barging in on a ladies’ circle evening devoted to problems of the homeless. This Shavian dramatization aside, suppose philosophy were a science, like mathematics or chemistry, and a drunken beggar barged in to disrupt a university lecture on metal ethoxides with his ideas about ethanol and its applications. On the other hand, it would never have occurred to me to disrupt Professor Rockmore’s course in this way if the subject discussed was symbolic logic, or any branch of philosophy that borders on mathematics. I remembered that Mill, in his discussions of intellectual freedom, specifically used mathematics as an example of an exact science ‘where there is nothing at all to be said on the wrong side of the question’ in contrast to ‘every subject on which difference of opinion is possible’ and, in Mill’s view, essential to what makes a freethinker’s life worth living. Yet the subject under discussion was not Hegel’s logic but his view of slavery, a subject upon the stark reality of which Mill began reflecting while the Jena timeserver was still alive.
Besides, Hegel’s dialectical vision of the world process added a new dimension to Leibniz’s optimistic myopia, and while I considered myself no more competent to discuss Hegel’s logic than Leibniz’s mathematics, I failed to see why discussion of a subject like slavery by the former should be closed to literary intrusion when, in the case of the latter, such an intrusion had produced Candide. I then approached several of the students attending the unfortunate course of lectures, none of them, admittedly, a fellow freshman. Many were even bearded, after the Young Hegelian fashion of Professor Rockmore himself. One student essay from the unfortunate week was finally produced, complete with a top mark and the professor’s comments, whereupon, with the pain that I can only compare to that of a forcibly extracted illusion, I discovered that the bearded essayist had done just what schoolchildren do the world over, namely, repeated Hegel’s argument paraphrastically, just as if it had been the proof of a Euclidean theorem or the tale of a big bad wolf called Sein. The tramp had not quite expected, perhaps, that he would be given crumpets with tea and asked to tell the ladies what the homeless need. He might not have expected that his ladies’ kindness would outlast the short speech he planned to wind up by demanding a shilling from everyone present. But least of all he expected to find the good ladies naked, or mute, or dead. Yet this was precisely what I, in the role of tramp, admitted to university for reasons that had less to do with diversity than with the homogenizing of diversity, found there.

(It would be fun to imagine that the bearded essayist was, in fact, Daniel Larison. But I believe Larison was in diapers when Navrozov was at Yale, and a beard surpasses even his precocity.)

Now, Navrozov studied, of course, literature. I took a European history class at Hopkins, one each in Chinese, Japanese, and (definitely the most fun) early Levantine history at Brown, one creative writing course at Brown, and one each in hippie economics and hippie law at a pre-college summer at Cornell, and this is the absolute limit and total extent of my formal education in the humanities. I don’t even speak any languages, although I’m told I do a good Indian accent.

Instead I escaped alone to tell thee, for all I studied is computer science. And it is hard to make CS be about anything except actual computers and how to actually work the fsckers, although the Good Lord knows enough people have tried. In particular, there are, or at least in the early ’90s were, a few schools that had very solid programs in system software, and I actually think I wound up with a very good basic education at about a master’s level in CS.

Computer science, I hasten to say, is no less a human garbage disposal than any department at any school. It is just slightly less obvious about it. The blade is slower, the motor less powerful. Perhaps some useful activity exists in this field, perhaps there is some peach-pit stuck deep in its drain with a leaf or two of actual life clinging to the chewed and ruined stem, but do we really need to reach in and retrieve it? I mean, for example, so-called “programming language researchers” have not designed a new language whose reception among the townies, ie, actual programmers who actually program for a living, could be described as even remotely warm, certainly in the last 30 years and arguably ever. There is a reason CS’s entire existence is notoriously debatable.

My only personal advantage in embarking on this heinous and obviously unprofitable course of study was that my parents worked for the US Federal Government, another institution whose abolition I consider urgent, if a slightly lower priority. Even while the Cold War was still on, the odor of Brezhnev was remarkable, especially where my father worked: at State, a department largely responsible for there having been a Brezhnev in the first place. (And whose tentpole can scarce conceal itself now that the EU is actually the sweet Eden those nasty Soviets would never quite let themselves be – but I digress.)

Unfortunately, when Uncle Sam’s testicles expand and press outward, they emit the shocking odor of an adult male marmoset, and this stench is now apparent to everyone between here and Saturn with a nose. It is the smell of power.

As Navrozov explains, the word “power” in Russian means “possession” and is a cognate to the English word “wield.” Since in a democracy public opinion is power and universities are the source of all legitimate opinion, they can be said in a sense to possess and wield our minds. So no one at a university should be surprised to smell the marmoset, not even in an innocent little department called “computer science,” but I knew the stench from childhood and to say I was shocked would be an understatement.

(Of course, all your actual, official dissidents or “activists” are trained from an even earlier age to misidentify the organ behind this secretion. They cauterize the imaginary gland of their paranoid fantasies with bogus moxibustions, ointments and poultices, which applied to the actual source are not just ineffective but often even nourishing. They claim to be stopping the drip, they believe they are stopping the drip, they are sure if they flagged in their effort for a second it would become a full-on faucet or even a flood. In fact the effect of their labors is at best neutral, and often constitutes actual lactation. Though the whole system is a fine case of the proverbial self-licking ice-cream cone, not to mention a substantial source of distraction or as we naively call it “employment,” we do need to remember that the origin of the fluid is not, in fact, a “scent gland.”)

So – in any case, computer science. (And definitely not Hegel.)

The only professor I ever actually learned anything from, at least anything that an ordinary person can’t learn out of a book (CS is not in general hard, and I avoided the hard parts), in one year at Johns Hopkins (don’t ask), three at Brown, and one and a half at Berkeley (really don’t ask), was a fellow by the name of F. Kenneth Zadeck, an assistant professor at Brown, who I’m very confident cannot identify me at all.

Zadeck was (and I think still is) a compiler man and a good one, one of the inventors of static single-assignment (SSA) form, an approach to compiler optimization (basically, making your binaries run as fast as possible) which was new then but has since been widely adopted. But all of this you can get out of a book. It was not the content but the way he taught that was special.
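To give non-compiler people some feel for what SSA form actually is, here is a deliberately minimal toy of my own (not Zadeck’s algorithm): it renames variables in straight-line code so that every name is assigned exactly once, with each use pointing at exactly one definition. The hard part of real SSA – placing phi-functions at control-flow joins – is precisely what this sketch omits.

```python
# Toy SSA renaming for straight-line code only (no branches, hence no
# phi-functions). Each statement is (target, [rhs tokens]).

def to_ssa(stmts):
    """Rename each assigned variable to a fresh version (x -> x1, x2, ...)
    so every name in the output is assigned exactly once."""
    version = {}   # variable -> highest version number so far
    current = {}   # variable -> its current SSA name
    out = []
    for target, rhs in stmts:
        # Rewrite uses on the right-hand side to their current SSA names.
        new_rhs = [current.get(tok, tok) for tok in rhs]
        # Give the assigned variable a fresh version.
        version[target] = version.get(target, 0) + 1
        ssa_name = f"{target}{version[target]}"
        current[target] = ssa_name
        out.append((ssa_name, new_rhs))
    return out

# x = a + b; x = x + 1; y = x * 2
prog = [("x", ["a", "+", "b"]),
        ("x", ["x", "+", "1"]),
        ("y", ["x", "*", "2"])]
for tgt, rhs in to_ssa(prog):
    print(tgt, "=", " ".join(rhs))   # x1 = a + b; x2 = x1 + 1; y1 = x2 * 2
```

The point of the single-assignment discipline is that an optimizer can treat every SSA name as an immutable value, which makes dataflow questions (“which definition does this use see?”) trivial to answer.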

Zadeck ran his graduate seminars in a very interesting way. As in most graduate classes in CS, at least at that time, the style was to do papers. At Berkeley, for example, we would read three or four papers a week, and spend maybe half an hour discussing each. Basically the goal would be to look at this cool smart paper and see how clever the people who wrote it were. Could we be clever, like that? Perhaps we could.

This is not how it worked in Kenny’s class. He did one paper a week. And basically his methodology would be to have us read this one paper – typically a very cool paper, by people who were clearly very smart – and discuss it for at least a couple of hours.

And Zadeck’s goal was almost never to praise this paper. It was to rip it apart. It was to show us the clever ways in which the authors had disguised the fact that their work was, while still cool and certainly not inaccurate in any way, utterly useless for any practical purpose.

As, of course, are 99.99% of the things that all computer scientists have ever built. (It is an error to confuse the open-source community, which for example wrote Linux, with the academic CS world. Basically, the relationship is that the former would like the prestige and power of the latter, whereas the latter would like the success and productivity of the former. This is an unstable relationship and I think it’s not hard to predict how it will play out.)

Zadeck’s adversarial version of CS was incredibly fun. Not, of course, that as a good Brahmin child I needed any convincing, but it helped convince me to go to grad school. I wound up assuming, much like poor Navrozov, that this essentially critical, aesthetic and realistic approach was simply the right way to study system software, which would of course be the way that it was studied at a great center of the art such as Berkeley.

However, I did feel a slight twinge of concern at the realization that there was such a high level of what could only be called dishonesty in the profession. It was certainly not that the authors of these papers had failed to realize the drawbacks of their approaches, and it was also not that they had merely summarized the technical content and noted neither pros nor cons. Rather, they had explored the pros in lavish and impressive detail, and they had set up the entire structure of their problem to avoid the possibility that anyone might consider the cons. Hm.

Then there was the fact that as a class project I actually implemented Zadeck’s SSA form, quite crudely of course, inside the GNU C compiler, which even then was a monster with hundreds of thousands of lines of code. I believe gcc has a proper implementation of SSA form now and I’m sure it works much better, and probably the fault was mine. But it disturbed me slightly that when I used my souped-up gcc, with this groundbreaking optimization model, to compile itself, it found something like three extra optimizations in the whole codebase. Hm.

In any case, off I went to Berkeley, where I had my real Navrozov moment. Basically, what I discovered at Berkeley was that the Zadeck approach to CS is an exception – to put it mildly.

My Navrozov moment at Berkeley came from the one and only paper I published, which was a clever way of reducing the time it takes for an operating system to “context switch,” or shift between working on different processes. In a modern computer this depends on a piece of hardware called an MMU, which can be slow and cumbersome, so my paper described a way of securely separating two processes without using the MMU.
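For the curious, the cost in question is easy to observe for yourself. The following is a rough, Unix-only micro-benchmark of my own devising, in the spirit of the classic pipe ping-pong test: two processes bounce a byte back and forth, forcing the kernel to switch between them on every round trip. It measures pipe overhead along with the switch itself, so treat the number as an upper bound, not a precise figure.

```python
# Rough estimate of OS context-switch cost via a pipe ping-pong
# between a parent and a forked child. Unix-only (uses os.fork).
import os
import time

def pingpong(rounds=10_000):
    p2c_r, p2c_w = os.pipe()   # parent -> child
    c2p_r, c2p_w = os.pipe()   # child -> parent
    pid = os.fork()
    if pid == 0:
        # Child: echo each byte straight back.
        for _ in range(rounds):
            os.write(c2p_w, os.read(p2c_r, 1))
        os._exit(0)
    t0 = time.perf_counter()
    for _ in range(rounds):
        os.write(p2c_w, b"x")  # each round forces two context switches
        os.read(c2p_r, 1)
    elapsed = time.perf_counter() - t0
    os.waitpid(pid, 0)
    return elapsed / (rounds * 2)   # seconds per switch, roughly

print(f"~{pingpong() * 1e6:.1f} us per context switch (very roughly)")
```

On ordinary hardware this comes out to single-digit microseconds – which is rather the point of the story that follows.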

This was not even really my idea. I’d actually gotten it from a professor at Arizona, which had been my safety school and was nice enough to fly me out for a visit, whereas Berkeley knew I was lucky to have earned their blessing and didn’t need to bother. I elaborated slightly on the Arizona professor’s idea, giving him full credit of course, for my OS class at Berkeley, and the Berkeley professor was impressed enough to have me write it up and submit it to a minor conference, where it won “best student paper” – there being one other such.

In any case, this same problem was popular at the time – the only real way to succeed in CS is to invent a new problem which generates more employment for your peers – and other people at Berkeley were working on it. Two of these were a pair of third-year grad students whose names sounded a little like “Sacco and Vanzetti.”

Sacco and Vanzetti came up with an entirely different solution to the slow-MMU problem, one which if I do say so myself was less imaginative than mine, but both more general and more practical. They published theirs in a real conference, received much acclaim for it, and I believe patented it, started a so-called company and eventually sold it to Microsoft. (Ah, Bayh-Dole.)

At some point during this period, however, I realized that the entire problem was a complete and utter pseudo-problem. An MMU context switch is “slow” only by an artificial standard: applied to the problem it was actually designed to solve, it is more than fast enough. The lily needed no gilding at all, and it certainly did not need to be nanofabricated from isotopically pure, individually selected gold atoms. Academic CS researchers at the time, for whatever ridiculous reason (probably something to do with microkernels), thought that there should be many more fine-grained security transitions in an OS environment. In fact if anything the trend is away from multiuser computing and toward virtualized or “shared-nothing” designs in which communication between protection domains is minimal. Furthermore, if this trend actually did reverse, which it hasn’t, it would be very easy to fix MMU-based context switching to make it every bit as fast as needed.

So I am very confident that neither of these techniques, neither mine nor Sacco and Vanzetti’s, has ever been used in practice. There is no need for them, there has never been any need for them, and there will never be any need for them. And this was quite obvious in 1993.

My Navrozov moment, of course, was when I approached one of the two – Sacco, I think – and attempted to have an intellectual discussion of this realization. The story is basically the same as Navrozov’s, so it would be boring to repeat, but basically I came away with the feeling that I’d told someone his Sicilian grandmother liked to get drunk and fuck her own goats.

Which, in fact, I had. Because I’d essentially told him his research was fraudulent. The fact that my research was also fraudulent, and that neither of ours was particularly noteworthy in that regard, did not matter. And why should it? Others’ crimes cannot excuse your own.

Of course “fraud” is a strong word in the world of science, or even “computer science.” It has a generally-accepted technical definition which certainly none of us were violating. But it is also a word in the English language, and most nonscientists would agree that when you lie for money, you are committing fraud.

Suppose you are a CS researcher, let’s say in the area of “programming languages.” You are almost certainly a government contractor. You and/or your students are funded by a grant or grants, which you spent a considerable amount of effort in securing, in competition with many other researchers. The grant was approved by a board at an organization such as NSF or Darpa, and the reviewers were other researchers such as yourself – in fact, you may even know one or two of them. (Try to avoid using the word “mafia” – it is unseemly.) This outfit must in turn obtain its funding (the dirtiest word in the English language) from Congress, before which its lawyer-flanked flacks present themselves on a regular basis. Congress, in turn, receives its paycheck from good old Fedco, which gets it you know how.

As a PL researcher specifically, you are basically a mathematician. That is to say, your work consists largely of stating and proving propositions. For example, one popular area of PL research is what’s called “proof-carrying code,” which solves a problem similar to the one that Sacco, Vanzetti and I were working on. It is equally pointless, because the simplest possible form of “proof-carrying code” is what we programmers call “source code,” and in practice the various approaches to this problem that have been proposed – such as “typed assembly language” – amount to no more than insanely elaborate compression algorithms. Needless to say, no such thing has ever been deployed, nor will it ever be.
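For flavor, the proof-carrying-code workflow can be caricatured in a few lines: the producer ships code together with a certificate of some safety claim, and the consumer checks the certificate before trusting the code. This is a deliberately crude toy – the function names and the “certificate” format are invented for illustration, and a real PCC system checks a machine-verifiable proof about compiled code against a formal safety policy, not a Python string against a hand-rolled test:

```python
# Toy caricature of the proof-carrying-code idea. Everything here
# (producer, consumer, the certificate dict) is hypothetical.

def producer():
    """Ship untrusted code plus a checkable claim about it."""
    code = "def safe_div(a, b):\n    return a / b if b != 0 else 0"
    certificate = {"name": "safe_div",
                   "claim": "never raises ZeroDivisionError"}
    return code, certificate

def consumer(code, certificate):
    """Load the untrusted code, then check the certificate before
    handing the function to the caller. A real verifier would check
    a formal proof; here we crudely probe the claim instead."""
    namespace = {}
    exec(code, namespace)
    fn = namespace[certificate["name"]]
    for b in (0, 1, -3):
        fn(1, b)  # must not raise, per the certificate's claim
    return fn

f = consumer(*producer())
print(f(10, 2), f(10, 0))  # -> 5.0 0
```

The joke writes itself: the shortest artifact satisfying this protocol is the source code plus a reader who trusts it, which is exactly where we started.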

But it is also a source of essentially permanent fascination to students and researchers the world over, because it creates an infinite variety of extremely difficult problems, which one can demonstrate one’s intelligence by solving. And this is after all one of the main purposes of the university system: to employ extremely intelligent people, who might otherwise be out causing trouble, in tasks that consume their spare brainpower.

So let’s look at how the fraud works in detail. Let’s say you are a student in one of the project groups working in this area – for example, there is one at Yale. Let’s say you have some cool idea, for something everyone used to think was absolutely impossible – let’s say, a type system in which one can write a provably-correct garbage collector. Quelle surprise! Of course, the resulting type system is expressed in the form of an 80-page proof, and the idea that any programmer would actually learn and use any such thing makes about as much sense as putting a Wal-Mart on top of Mount Everest and issuing pitons to the “greeters.” But never mind all this. The idea is cool.

So it goes to the professor, who knows how to write grants, and it gets folded into the next grant proposal. A type system is of course essentially a security system – it ensures that your program will behave properly, and not be infected by viruses and such. And Congress is concerned about computer security, which, as we know, is part of national security. With a large quantity of skillful grantsmanship, and various euphemistic ambiguities at all levels, Aunt Gladys’s tax dollars end up funding Feizhuang Zhu’s type system, Prof. Smith’s group gets its funding bumped by 10% and can take on another grad student or even a postdoc, and all is well in Mudville.

Except, of course, for the fact that the whole thing is basically a criminal enterprise, and all these crazy-smart people could actually be out doing real work, instead of spending their lives pulling each other’s dicks in this bizarre, pathetic and dishonest manner.

The problem with CS – and I suspect in other sciences, such as physics, although I am certainly not qualified to fire so much as a BB gun in the great Woit–Motl war – is that science today is, contrary to popular belief, a business.

And it is a very special kind of business. In this business, there is exactly one customer, and his name is Uncle Sam. And there are no companies in this business – apart from your “mafia,” you’re on your own. You can get students to do your programming, true, but you have to do your own research and, more importantly, your own sales.

Selling to Uncle Sam is a fascinating problem. Uncle Sam wants his serfs to know that their tax dollars are being spent on top-notch research which will make America #1. If the dollars are being spent in the constituency of a Congressman with the right seniority, this is even better. Otherwise, Uncle Sam does not give a tinker’s damn what he funds, as long as the result does not actually make him look like an idiot. Fortunately, Sen. Proxmire has departed this earth and all of your big-league journalists are pro-science pretty much the way Pat Robertson is pro-God, not to mention that if they have a BA in anything besides basketweaving it’s a surprise, so Uncle Sam is unlikely to see any trouble from this front.

Congress’s optimization algorithm is obvious: keep funding what you’re already funding. Except for minor porcine concerns, Uncle Sam certainly has no reason to ever cancel, steer, or otherwise redirect any research direction. Why would he? How could he possibly know more than the researchers themselves? Who better to ask about string theory than the string theorists?

The result is that Fedco’s approach to research bears some resemblance to that of the large, and often slightly Fedco-like, software-hardware corporations that have dominated the industry for quite some time. Typically these outfits employ large numbers of researchers, at places like Microsoft Research, Sun Labs, etc. And these researchers, who are PhD types from academia, receive some mild encouragement toward productive directions, but of course have actual rank and can’t simply be told what to do, as if they were mere employees. For the most part (although with some exceptions), these corporate research arms, which are basically run as a tax writeoff and general prestige farm, are simply sponsoring these scientists’ academic careers in a way that provides less status than working at a research university, but does not involve the onerous and degrading T-word.

The result is that the researchers wind up managing themselves. And one of the things I learned after I said my goodbyes to the whale is that, again contrary to popular belief, there is this thing called management and it’s actually necessary. There are individuals who can be productive without active management, but there are no organizations that can. And when basic research is treated as a self-managing organization, you will get unproductive basic research. If you were previously unaware that there was any such thing, I’m sorry to have to break it to you.

Most managers are easy for a scientist to scam, in precisely the manner described above. It’s a case of what economists call “asymmetric information,” and the result is that your research program is simply producing status and credibility for the scientist, who is in the business of demonstrating his intelligence, as if he were in the sixth grade. It takes a really talented manager – General Groves is the all-time great example – to get an organization of super-smart people to work together on a real problem. (It is worth noting that the Manhattan Project’s personnel were veterans not of Federal science but of prewar science, a system under which the profession of “grantwriter” was, I believe, unknown.)

If there is any equivalent of General Groves today, his name is certainly Steve Jobs. I have never worked in a Steve-run company, but I have certainly heard the stories. And my favorite is one I heard from an Apple QA guy (QA, ie testing, is basically the lowest-prestige profession in the Valley) around the time Steve was returning to Apple.

Heads were of course flying left and right, all sorts of people were moving offices and changing jobs and the like, and this guy Dan, who was a project lead or something, got called in by his manager. “Hey, we’re moving to building X next week,” said the manager.

“But isn’t that where – ATG is?” ATG being the “Advanced Technology Group,” ie, Apple’s research arm. Which was of course the most prestigious arm of the octopus.

“Yeah,” said the manager. “Hey, could you keep it under wraps for a little? I don’t know if they’ve heard.”

In fact, unless I have been misinformed, when Steve came back he laid off Apple’s entire research division. No funding cuts, no baselines, nothing. He killed the whole thing, and from what I knew of what they were doing, it was nothing but richly deserved.

Now what do you think Steve Jobs would do if they made him President? Or CEO, perhaps, of Fedco? With a mandate from the board to perform an arbitrary reorg as he saw fit? Frankly, the mind boggles.

I actually haven’t even started to explain how pernicious the university phenomenon is. For example, I haven’t justified my claim that they are responsible for most of the violence in the world today. Please remain on this channel for further eccentric and informative broadcasts.

But I will repeat my policy proposal: I believe the only effective way to deal with the universities is the Henry VIII treatment. That is, unconditional abolition and confiscation. The endowments and campuses can be treated as rough compensation for the vast streams of subsidies the universities have received since 1945. Simply wrap the whole thing up and call it a day. Let it be summer all year long.

However, I am strongly opposed to any prosecution for anyone involved in the university system, even in exceptional cases such as that of Michael Mann. I feel it’s much better to let bygones be bygones. I’m sure some will criticize me for this stand, but I will stick to it.