Taki's Magazine2016-12-09T08:58:04ZCopyright (c) 2016, Joe Bob BriggsExpressionEnginetag:takimag.com,2016:12:08Articles by Christopher RoachThe Pentagon Needs a Choreographertag:takimag.com,2008:article/1.97202008-07-22T13:44:00Z1999-11-30T00:00:00ZChristopher Roachchris@takimag.comTo combat Islamic terrorists more effectively, the US government should spend some real energy on image management, perhaps hiring a big Hollywood guru. Consider Maliki’s alleged insouciance. It’s good for the mission for us to get slapped down by Maliki if it appeases the honor-obsessed locals. Indeed, it may have a double benefit: slowing down the insurgents and providing a face-saving way for the US to leave under honorable conditions to “respect the will of the Iraqi people.”

By contrast, it’s not good that trials of al Qaeda terrorists are held behind closed doors at Gitmo, where government prosecutors quietly and all-too-slowly go about their business. They should be high-drama events on Tru TV, complete with analysis by Geraldo and Nancy Grace. The terrorists should be discussed by shrinks who suggest they’re all repressed homosexuals, along with other such defamations. The goal should be to demystify al Qaeda’s foot soldiers, showing them instead as ordinary, angry, and pathetic figures. There’s something antiseptic and tone-deaf about the manner of these off-shore proceedings. This is unfortunate, because US commitment, resolve, and the enemy’s ugliness would be manifest in any serious exposure of the trials.

Certainly the whining of a Covington law firm partner—complete with dropped “trou”—that these poor bastards are searched for weapons in the same manner as in US prisons would contrast sharply with the professionalism of military prosecutors and with testimony revealing the satanic manner in which al Qaeda does business, trains its cadres, and views the world. Today, the procedure-obsessed defense lawyers for al Qaeda and the anti-American international human rights community dominate this discussion. The war on al Qaeda is supposedly the good war even in the eyes of most liberals, but somehow a great number of them lose a lot of sleep over the treatment of al Qaeda’s prisoners, as if the justice of our war against those who attacked us hinges on these guys getting the OJ Dream Team in their eventual trials.

One key feature of insurgencies and fourth generation conflicts—such as ours with al Qaeda—is that much of the war serves as fodder for an information campaign. It’s a campaign for the allegiance and sympathies of the largely unaligned masses, in this case the masses of the Arab and Islamic world. It’s also a campaign by either side to shore up or demoralize the will of the American people. There’s a reason al Qaeda films all their attacks; they want to show how strong they are to their own people and to us through CNN—in effect, their ministry of information. In light of this reality, it’s often better strategically to deprive al Qaeda of prestige than it is to talk about how tough they are. The latter makes al Qaeda look stronger than it really is by making us, on the stage at least, their equal.

Operations like Colombia’s out-foxing of the FARC have important lessons. Such operations do not tap into the honor-revenge cycle of the hyper-masculine Third World; instead, they subject the enemy to ridicule while showcasing the government’s extreme cleverness. Saddam coming out of his hole the way he did had a similar result; I heard a Turkish woman at the time say, “He’s a coward. He should have gone down fighting.”

We need to be a lot more clever. Whack-a-mole attrition strategies won’t succeed in Iraq, in Afghanistan, or against al Qaeda internationally. One thing Americans can do, however, is image management. Consider all the creepy political events where some word like “Trust” or “Leadership” is emblazoned all over the podium. Likewise, our pop culture is one of our biggest exports; we clearly have a deep bench in this area. But for some reason, when it comes to the war, we’re back to the Five O’Clock Follies. The combination of too much force, putting someone like Karen Hughes front and center in the Islamic World, and habitual hostility to the media within the military is making the campaigns in Iraq and Afghanistan much more drawn out and “kinetic” than they have to be.

Can America’s Nukes Deter Iran’s?
Christopher Roach, July 13, 2008

Deterrence was proffered in the earlier discussion as a legitimate, noninterventionist solution to the problem of Iranian and other Third World nations’ nuclear weapons. Nuclear proliferation to the Third World in general is a problem because such countries are less likely to keep a tight hold on their nukes, rocked as they are by periodic coups, a culture of endemic bribery, and infighting among personnel more loyal to their religion or tribe than they are to their state. Iranian nukes will almost certainly lead neighboring nations to get nukes in response, for realist reasons, and, more important, Third World nukes increase the possibility of such weapons leaking to non-government organizations such as al Qaeda and ultimately being used. This recognition of the dysfunction of states is one of the more important insights of Fourth Generation Warfare theory. Such weapons can slip from state control for ideological reasons or bribes or both. If more than one or two such nations go nuclear, and such “leaks” happen, we won’t know whom to retaliate against in the event a nuclear weapon is used against our people. Any conclusion about “whodunit” will be undermined by the likely gaps in evidence and our nascent conspiracy-thinking culture. It would be wrong to retaliate if we were truly in the dark about the source of a nuclear weapon or, at best, could attribute it to a rogue official. This would be a terrible position for a great and proud nation such as ours to have placed itself in. But this unnecessary position would be the natural result of the indifferentist culture about nuclear proliferation among a sizeable fraction of antiwar critics—paleoconservative or otherwise.

There is another important problem, though, that is particularly apposite in the case of Iran. Nuclear weapons matter for immediate practical reasons short of nuclear war; namely, they make nuclear-armed countries essentially undeterrable within a much broader range of action. When the USSR invaded Hungary in ’56 or Czechoslovakia in ’68, the USSR’s nukes were a key reason we did not intervene to help the liberal revolutionaries and roll back Soviet power in Eastern Europe. A nuclear-armed Iran would create similar challenges in the Gulf region. By way of analogy, Pakistan’s nuclear arms are one reason our war against al Qaeda is so hamstrung by difficulties. Pakistan, a putative ally, is unstable and internally divided. So al Qaeda has found refuge in western Pakistan, and there is little we can do to force Pakistan’s hand. Unlike our current relations with Pakistan, and our likely relations with a nuclear Iran, relations between the USSR and the West settled into a modus vivendi owing to the relative rationality of the two sides’ foreign policies. The USSR was notoriously centralized, unlike much of the Third World, so it was more predictable. Iran shares few of these characteristics, as evidenced by its regime’s acquiescence to the lawless hostage-taking of American embassy personnel and its proliferation of parallel and competing institutions of state.

Since so much of the world’s oil depends on the relative stability of sea lanes in the Middle East, Iran matters even if we don’t conceive of Israel as an ally for whom we should take risks, consistent with the policy of strategic disengagement I have advocated since the end of the Cold War. I’m not so dull as the neoconservatives as to suggest that Iran is the next Hitler or that Iran is going to attack the US conventionally, nor do I think the deus ex machina of liberal revolution is around the corner if only we bomb the hell out of them. But critics should avoid the same kind of unrealism. Iran, like most nations, acts for reasons other than territorial self-defense. This quest for power is a key insight of realist international relations theory. There is no territorial self-defense reason for Iran to fund Hezbollah, but it does. There was no territorial self-defense reason for Cuba to support rebels in Angola, but it did. Iraq, acting rationally, should have pulled out of Kuwait in 1991. Critics would increase their credibility if they acknowledged as much. In this instance, I can conceive of a nuclear Iran exacting tribute from its non-nuclear neighbors, thereby extracting monopoly rents from the region’s oil; of the Saudis—whom I would particularly not like to see so armed—acquiring nuclear arms in response; and of a nuclear Iran easily restricting US rights in the region to travel, to trade, and to engage in other legitimate activities. Further, I expect a nuclear Iran, like Pakistan, to increase the risk of nuclear weapons slipping to undeterrable, non-state actors such as al Qaeda and similar organizations.

Finally, the promise that deterrence will protect us even from the whole world going nuclear is unrealistic for the reasons cited above: any such nuclear weapon use could be well concealed and plausibly blamed on another. More important, such deterrence depends on the kind of hard-headed Machiavellian realism so often pilloried by the same critics. Even I think it a great moral problem that massive nuclear retaliation against civilian cities is proffered as the deterrent promise in the case of nuclear attack. But this technique of shielding ourselves with an arguably immoral promise is now championed by the same people who think a limited conventional strike against Iranian nuclear facilities is a great moral evil, unjustified even in the (admittedly not yet proven) case where Iran is on the brink of nuclear weapons acquisition. Critics say wars are costly to us and to enemy civilians. This is all undoubtedly true. But will the people who consider blowing up Iranian nuclear reactors a great injustice and $4.00 gasoline a massive oppression have the tenacity to support nuking Iranian cities if an Iranian nuclear weapon is somehow used against the U.S.? That grave promise is the key to nuclear deterrence. Between this promise—massive killing of civilians and destruction of infrastructure with nuclear weapons—and possible alternatives like sabotage and precision airstrikes on rural Iranian nuclear facilities, it does not seem like a “slam dunk” in the moral scales to prefer long-term and unpredictable deterrence over precision disarmament now.

The Bush administration has unfortunately (though understandably) created conditions for a reflexive anti-war movement in all instances. We should not allow its mistakes to create overcorrective mistakes in our own reasoning. Its errors of policy should be disaggregated. Bush’s application of power was confused in Iraq by the counsel of disloyal Israeli partisans, Rumsfeld’s desire to experiment in military “transformation,” flawed public rhetoric, and commitment to the quixotic goal of promoting exemplary democracies abroad. This is the reason we’re still in Iraq today. A true American patriot concerned with our basic interests in trade and self-protection and the avoidance of nuclear war and nuclear terrorism could apply a policy of preemption more narrowly and sensibly for the reasons outlined above that have nothing to do with protecting Israel or the forceful imposition of democratic political regimes upon the peoples of the Middle East.

Let’s Not Re-Fight the Last War in Iran
Christopher Roach, July 10, 2008

The Iraq campaign has more or less discredited the idea of preemptive war to stop the acquisition of nuclear weapons by unfriendly nations. But does our unlucky situation in Iraq mean we should never use force to prevent nations such as Iran from getting nuclear weapons? Let me explore some paleoconservative heresies. Nuclear arms would make Iran an order of magnitude more powerful than it is and, more important, would increase the risk of nonstate actors such as al Qaeda obtaining such weapons as they proliferate among corruption-ridden and terrorist-supporting Third World nations.

I certainly don’t think fairness or reciprocity should be a major consideration in the moves we make. International affairs are a bit like a prison yard; it’s survival of the fittest, and what we have and enjoy should sometimes be denied to others who are unfriendly, unstable, or simply “not us.”

I do think prudence matters a great deal, though, and while I don’t want Iran to have nuclear weapons for a great many reasons, short of war I can’t see too many ways for us to stop them. Judging by our experience with North Korea, bribes are easily ignored or exploited to our detriment. That said, there are costs to war too. Contrasting Iraq with the peaceful denouement of the Cold War, it is clear that wars to prevent such acquisitions may be more costly than the alternatives. There simply is no fail-safe playbook that an honest patriot can repair to. This is undeniably a dangerous and difficult game.

Conservatives must keep their wits. We should not be “re-fighting the last war” with Iran. It would simply be one more overcorrection in a series of such errors to take military action, including a land invasion, off the table because of difficulties in Iraq (just as the Iraq invasion was an overcorrection to passivity in the face of Afghanistan’s support for al Qaeda).

If we look at the Iraq War, it’s clear that the initial invasion, the exploitation of WMD sites, and simple regime change were skillfully accomplished by American forces in 2003. It was hardly a campaign of staggering casualties, commitment, or overall cost. The chief reason the Iraq War drones on is that the post-war strategic goals have been remarkably ambitious, the outgrowth of the neoconservative philosophy of democratic revolution and universal human rights imposed by American arms. In evaluating the Iraq policy, the disappointing and inconclusive counterinsurgency operations of 2004–2008 should be disaggregated from the accomplishments of the earlier conventional operations, which include regime change, exploitation of WMD sites, and the capture of Saddam Hussein. On a purely operational level, these were low-cost successes.

Had America left Iraq in 2003 in a beaten and disorderly state, decapitating the regime of Saddam Hussein would have prevented Iraq from posing any significant threat to the US going forward. Its scientists and leaders would be in jail, and the nation’s various factions would likely remain, as they have been, self-absorbed with parochial and tribal goals.

Conventional attacks aimed at weakening and disarming the Iranian regime should not be ruled out if they prove necessary to prevent Iran from acquiring a nuclear capability. It simply is better to attack than to be attacked. The existence or not of Iraqi WMDs does not change this principle. Just because force can be misused and intelligence can be mistaken does not mean that intelligence is always mistaken or that force is always misused. It is an ignorant and womanly form of unreason to contrast the undeniably bad train of events that has resulted from our actual policy with a halcyon alternative conjured up in a counter-factual fantasy world. If we had not attacked Iraq, we would be facing an entirely different set of problems. Life would still be hard, and risk would be the perennial companion of international relations.

So long as Iran’s leadership is not merely self-interested, but concerned with an aggressive ideological program—empowering the Islamic World through isolation, cultivating a terrorist apparatus, and developing a nuclear capability—then the need for the US to prevent Iran from acquiring nuclear weapons will remain. The threat of Iranian nuclear weapons may not be existential. But the threat of their constant bullying, interference in sea lanes, harassment, and overall insanity will be very real. There is no doubt, for example, that Iran has armed insurgents who have killed our soldiers in Iraq. Do the newfangled “pacifist conservatives” have no sense of honor or revenge in the face of these provocations?

My friends on the paleoconservative right are fond of the Confederate cause. Perhaps they would consider John C. Calhoun’s first congressional speech defending America’s robust preparations for and aggressive strategy in the War of 1812:

The gentleman’s imagination, so fruitful on this subject, conceives that our constitution is not calculated for war, and that it cannot stand its rude shock. This is rather extraordinary. If true, we must then depend upon the commiseration or contempt of other nations for our existence. The constitution, then, it seems, has failed in an essential object, “to provide for the common defence.” No, says the gentleman from Virginia, it is competent for a defensive, but not for an offensive war. It is not necessary for me to expose the error of this opinion. Why make the distinction in this instance? Will he pretend to say that this is an offensive war; a war of conquest? Yes, the gentleman has dared to make this assertion; and for reasons no less extraordinary than the assertion itself. He says our rights are violated on the ocean, and that these violations affect our shipping, and commercial rights, to which the Canadas have no relation. The doctrine of retaliation has been much abused of late by an unreasonable extension; we have now to witness a new abuse. The gentleman from Virginia has limited it down to a point. By his rule if you receive a blow on the breast, you dare not return it on the head; you are obliged to measure and return it on the precise point on which it was received. If you do not proceed with this mathematical accuracy, it ceases to be just self-defence; it becomes an unprovoked attack.

It would be unwise in the extreme to allow intelligence failures and the setbacks of the neoconservative segment of the Iraq War to dissuade conservatives from their abiding view that the world is a dangerous place that sometimes requires the use of force against dangerous and extremist regimes. It is a red herring to pin this all on Israeli agitation. For reasons of self-interest, conservatives should care that the US and its people be able to obtain oil, trade with whom we will, and not see our forces bombed and terrorized by Iranian-funded extremists.

Gun Rights, the Militia, and Community
Christopher Roach, June 30, 2008

During the Cold War, conservatives rightly pointed out that the collectivist materialism of the Soviet Union was anti-human in the worst ways. It elevated the state to mythic proportions. It denied the value of individual human beings. It suppressed the human spirit and focused on minimal material comfort to the exclusion of other values. The state could undo social injustices, we were told, but conservatives reminded us that life always would involve certain unavoidable inconveniences and inequalities. No law could completely eliminate evil, and the attempt to do so would lead to other evils that have been the constant fellow traveler of the leftist program.

Every state that has sought heaven-on-earth has imposed crushing burdens on qualities such as initiative, enterprise, idiosyncrasy, self-reliance, law-abidingness, trust, and regard for one’s own. During the post-war period, conservatives made common cause with libertarian critics of “The State.” Individualism was the watchword of the day. But the emphasis on individualism was always a bit out of tune with the conservative ethos. As other disorders worked their way through society since the 50s, including nihilistic disregard for family and social obligations in general, conservatives expressed their concerns about the breakdown of civil society and community, trends rooted in an “atomistic” individualism.

Conservative political philosophy is concerned above all with balance. Excessive individualism and excessive collectivism both exhibit genuine evils in political life. We are skeptical of change not least because the happy balance of traditional Anglo-American liberties avoided the evils of both. It has been difficult to preserve these liberties under the American Constitution and even harder for others to replicate. The uniquely American balance of our historical liberties is expressed perfectly in the Second Amendment:

A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.

The Second Amendment has always flummoxed modern observers. For starters, it has a preamble. In the law, there is always an issue of interpretation—whether in contract law, property deeds, or statutes—about whether a preamble limits the meaning of the words that follow. Is it surplusage, an exhortation, or a restriction on the specification that follows? In this instance, it is what it appears to be: an expression of purpose. The right remains one of “the people,” but that right is in the service of a broader objective: “the security of a free State.” The Founders rightly worried that the federal government’s power to “provide for organizing, arming, and disciplining, the militia, and for governing such part of them as may be employed in the service of the United States” would be abused to create a federal “select militia” that weakened the states’ right to create militias and individuals’ right to bear arms.

The Second Amendment is also confusing today because of the degradation of the militia over the last 100 years. A true militia may be thought of as a cooperative arrangement between the people and the state. Like the jury system, it injects the sensibility of ordinary people into the state’s exertion of power. The formalized National Guard appeared in 1903, taking over the role of the formerly more numerous and less uniform state militias. The routine use of the posse comitatus has also gone by the wayside in the age of professional policing, though it still persists in various locales.

The right to bear arms at the time of the founding, while an individual right, was not conceived completely individualistically. In this sense, Scalia’s recent opinion in Heller, with its focus on self-defense, unfairly downplays the “classical republicanism” of the Founders. The right to keep and bear arms undoubtedly allows arms as a means of self-defense against ordinary criminals, as well as the predators of nature. But the Heller decision’s dicta—including its gratuitous dig at the M-16—pave the way for eliminating weapons chiefly useful for a broader and more political concept of self-defense: resistance to military enemies of the Constitution, whether foreign or domestic, through the actions of the citizen-militia.

The Founders knew that a community was a fragile thing. It can be harmed by moral disorder within, by foreign conquest, and, most insidiously, by the evil of “faction.” A purely individualistic focus on the right to bear arms—typical in the rhetoric of libertarians and the Founding era’s Francophile left wing—does not take into account that the Founding generation, soon after enacting the Second Amendment, imposed certain duties that relate to this right. The federal Militia Act of 1792 provided as follows:

That every citizen so enrolled and notified, shall, within six months thereafter, provide himself with a good musket or firelock, a sufficient bayonet and belt, two spare flints, and a knapsack, a pouch with a box therein to contain not less than twenty-four cartridges, suited to the bore of his musket or firelock, each cartridge to contain a proper quantity of powder and ball: or with a good rifle, knapsack, shot-pouch and powder-horn, twenty balls suited to the bore of his rifle, and a quarter of a pound of powder; and shall appear, so armed, accoutered and provided, when called out to exercise, or into service, except, that when called out on company days to exercise only, he may appear without a knapsack.

Would that the United States mandated such training today! Gun control would have an entirely different meaning, involving shot groups and tactical reloads. The founding era’s rhetoric was more than a recitation of rights. Even among the more liberal elements, a right was rarely disembodied from some sense of community obligation. The right to bear arms existed alongside a duty to bear arms. While the counterbalancing action of the different branches of government figures prominently in the Federalist Papers, the authors of that hoary work emphasized the need for a virtuous citizenry to preserve republican government. They knew that the political liberty of all depended upon the widespread inculcation of individual virtues—such as self-reliance—but also political virtues, such as watchfulness over the state and the willingness to forego private advantage when the common good was at stake. After all, the term republic comes from the Latin “res publica,” literally “the public thing” but better translated as the common good. The limitations on majority control contained in the Constitution could work at most to stop a temporary majority in the grip of some passion or mania. The Constitution could not, in Rube-Goldberg fashion, forever channel any sort of collection of people, however devoid of virtue and public spiritedness, away from the natural results of their collective character. The Founders knew that character and liberty were mutually reinforcing and necessary for republican government to serve the individual and common good.

As Patrick Henry put the matter:

Are we at last brought to such an humiliating and debasing degradation that we cannot be trusted with arms for our own defense? Where is the difference between having our arms under our own possession and under our own direction, and having them under the management of Congress? If our defense be the real object of having those arms, in whose hands can they be trusted with more propriety, or equal safety to us, as in our own hands?

A robust militia serves to improve the virtue of the people and ties their fortunes with those of the state. Military drill instills physical courage and discipline, while also giving the people the skills necessary to resist any threats to their liberties. It is worth remembering that the Founders were not only concerned with preventing tyranny; another important event intervened before the Constitutional Convention of 1787. That event was Shays’ Rebellion, a lawless veterans’ movement that threatened the fragile order that prevailed under the Articles of Confederation. In other words, the Second Amendment in particular evinces the U.S. Constitution’s dual aims: liberty and order. The liberties the Constitution recognizes are historical in nature, and certain seeming inconsistencies—in truth, necessary limitations—flow from their historical contours, which are by necessity more circumscribed than the abstract liberty one might imagine from a purely theoretical point of view.

The Founders’ Solomon-like solution to the problem of creating a government energetic enough to discharge its duties, but not so powerful as to oppress the people, finds its most emphatic expression in the concept of the militia. The militia is simply ordinary male citizens assembled to perform some necessary government task such as preventing a riot, responding to a foreign invader, pursuing a fugitive, or, if need be, breaking off from de jure control and responding to some emergency from within the apparatus of the government itself. For those who find this institution an anachronism in the age of nuclear weapons, consider the relative inability of modern militaries to suppress insurrections with small arms in such varied locales as Iraq, Vietnam, Algeria, and New Orleans. How much happier would the events in New Orleans have been if some reasonable percentage of the citizenry were routinely accustomed to assisting law enforcement and the National Guard in preserving order and responding to disasters?

Like a strong military in foreign relations, a well-organized militia serves as a deterrent to would-be tyrants both at home and abroad. While some standing military is necessary today, how much less of a threat such a military would pose to our liberties if it were counter-balanced by tens of millions of American men armed, trained, and organized at the county and state level, enforcing laws that they have chosen to live under as a free, self-governing people.

As it stands, the American people are disorganized and increasingly servile. Partly because of the proliferation of meddlesome laws, their relationship to law enforcement and the military is typically one of indifference or hostility. The increasing professionalization of law enforcement and military functions has reinforced this gap between the State and the People. A robust militia working hand-in-hand with full-time government officials would do much to restore civic pride, reduce tension between the government and the community, and deter the worst government excesses. The common extreme individualist notion of gun rights is problematic. Without some sense of common destiny and moral courage, an armed but selfish population would be of little use against either foreign or domestic threats. Why? Because it would always be in one’s individual interest to let some other guy do the fighting. To paraphrase General Patton, without teamwork you can’t fight your way out of a “piss-soaked paper bag.” This criticism of the disorganized militia was commonly leveled during the War for American Independence. Consider the account of George Washington in a letter to the Continental Congress dated September 1776:

To place any dependence upon Militia, is, assuredly, resting upon a broken staff. Men just dragged from the tender Scenes of domestick life; unaccustomed to the din of Arms; totally unacquainted with every kind of Military skill, which being followed by a want of confidence in themselves, when opposed to Troops regularly train’d, disciplined, and appointed, superior in knowledge, and superior in Arms, makes them timid, and ready to fly from their own shadows.

Does this line of criticism mean we should not have a militia? Hardly. But it does mean that a disorganized militia is not nearly so useful in securing a free state as an organized and well-armed one. Without some subordination to law and public purpose, armed Americans acting as lone wolves or in some other disorganized groupings would more likely become a rabble like the Quantrill gang. Without some concern beyond the self and without some coordination with self-governing and local political life, the right to keep and bear arms is nearly useless as a bulwark of liberty.

Constitutional and republican government aims to preserve liberty and government without extinguishing either. As Burke put the matter:

To make a government requires no great prudence. Settle the seat of power, teach obedience, and the work is done. To give freedom is still more easy. It is not necessary to guide; it only requires to let go the rein. But to form a free government, that is, to temper together these opposite elements of liberty and restraint in one consistent work, requires much thought, deep reflection, a sagacious, powerful, and combining mind.

The historical right to keep and bear arms is the product of such minds. But conservatives should consider the Founders’ solution in all of its detail. They preserved an uncompromising individual right to keep and bear arms. But that right existed in a larger tableau of duties and institutions that balanced the individual good with the need for cooperation in social life. In an age of out-of-control crime, rampant illegal immigration, natural disaster, and threats of terrorism and urban disorder, a revitalized militia movement to assist local law enforcement and the National Guard, something like a well-armed variation on the Cold War Civil Defense programs, would be a worthy conservative endeavor that would secure a great number of the benefits of our historical right to keep and bear arms.

Sticking It Out
By Christopher Roach, June 22, 2008

Conservatives historically have taken pride in their hard-headedness. It is supposed to be a manly persuasion with a long view, rooted in concepts like deferred gratification, the proper appreciation of applied violence, skepticism of fads and fashions, and a dour view of human nature. In lean economic times, conservatives counsel austerity and sound money, even if this means the very painful effects of liquidation. In foreign policy, conservatives emphasize the anarchic nature of international relations and the need for a strong defense. Hard-headedness, however, always runs the risk of pig-headedness. There is also a time to hard-headedly cease doing something that has proven to be a mistake.

Conservatives should remember that just because many anti-American liberals oppose something, doesn’t make it right. The Iraq War is wrong for reasons pacifists and unpatriotic globalists don’t appreciate. As army veteran Andrew Bacevich observes, “The costs to the United States of sustaining this dependency are difficult to calculate with precision, but figures such as $3 billion per week and 30 to 40 American lives per month provide a good approximation. What can we expect to gain in return for this investment? The Bush administration was counting on the Iraq War to demonstrate the viability of its Freedom Agenda and to affirm the efficacy of the Bush Doctrine of preventive war. Measured in those terms, the war has long since failed.”

The absolute worst reason to stay in this war is for some emotional notion of national honor and commitment to the troops, impulses that undergird the very unstrategic thinking of John McCain and numerous buck sergeants. War opponents and war proponents are both stuck in the same sentimental humanitarianism that justifies or criticizes war with Wilsonian rhetoric of liberation. Both forget that even in the most just wars, war is at best a necessary evil. We don’t go to war to do the conquered a favor. We don’t stay to avenge the deaths of our men like some armed camp of Zulus. A nation sends its military to war to accomplish foreign policy goals. This same nation can and should withdraw these troops when it’s in our interests to do so, when those goals are out of reach, no longer important, or too costly. It is not as if Iraq is sacred American soil with which our nation has any historical connection. This is a foreign land halfway around the world in a very bad neighborhood, populated mostly by uncivilized people, whom we do not understand and who do not appreciate our attempts to impose American-style government upon them.

We will suffer (but not unbearably) if we spend $20 or $30 trillion and a few thousand American lives pursuing the goal of nation-building in Iraq over the next decade. But even if everything turns out for the best, this will accomplish a strategic benefit worth some fraction of that. And then what? We’ll still have al Qaeda to worry about. North Korea will still remain an unpredictable, nuclear power. Our borders will be too porous. Our ranks of third world immigrants will remain too numerous. The Middle East will still produce large numbers of pissed-off young men who receive moral support to vent their anger at the western world in the dictates of their religion. The deterrent value of staying or leaving Iraq is a wash. Iran knows we won’t easily commit to a similar campaign on its territory. Russia and China will still be ascendant in their spheres of influence. Oil will still be scarce and in the hands of unstable autocrats and their resentful subjects.

The modest strategic benefits promised in Iraq to the U.S. and the Iraqis are very unrealistic. Vast swaths of people all around the world will not appreciate Iraq as a model of good government. At best, it will end up as stable and prosperous as Pakistan or Indonesia. Instead of seeing idealistic U.S. sacrifices for democracy, most Arabs and Muslims will perceive a marginally successful U.S. bid for power. Most of the world’s peoples will continue to be more passionate about religion, nationalism, ideology, wealth, prosperity, and tribalism than democracy and the rule of law. Not only that, they’ll treat these tangible goods as far higher priorities than democracy.

A democratic Iraq will remain contested by sectarian parties, and, for this same reason, uncompromising religious fanatics will not accept deviation from the pure regime dictated by Islamic Sharia law. Democracy will be seen as a decadent insult. No traditions of loyal opposition and the peaceful transfer of power will develop in Iraq for these reasons. Worse, the U.S., instead of being seen merely as a self-interested or incompetent party in the Middle East, will be seen as the prime mover of politically-empowered heresy.

Instead of taking the wind out of the sails of Islamic fundamentalism, a “successful” Iraqi democracy will be an irritant to either the United States or Islam. To the United States, it will show that democracy is not the same as constitutionalism, and that the U.S. has brought to power a regime that has a democratic imprimatur for the worst abuses of its ethnic and religious minorities, including Iraqi Christians. If the laws somehow resemble our own, the Iraqi state will be unstable and contested, a heretical insult to Islam, which demands Sharia. It will prove—as Britain and Spain have proven to themselves—that Islam and western freedoms and the rule of law are incompatible. Either way, “success” in Iraq would lead to a mountain of lies and denial. If the facts were looked at fairly, liberalism itself would be discredited, and the associated principles of open borders and multiculturalism would be dragged down in the reckoning.

Populist conservatism has been enlisted to support “sticking it out” in Iraq as a testament that we are indefatigable and serious in the face of liberal weakness. But the ring-leaders of this fiasco have more self-interested reasons for stoking this sentiment: our elites themselves would be discredited in the process of any withdrawal from Iraq. For them, better a long-term U.S. presence in a simmering war than a palpable exposé of their wrong-headedness in the disastrous, illiberal Iraqi state that would exist without U.S. supervision and control.

Who Guards the Guardians?
By Christopher Roach, June 16, 2008

The Supreme Court has provided another nail in the coffin of executive war powers in its recent opinion on the rights of Guantanamo Bay detainees. Earlier decisions by the court in Hamdan and Rasul ignored statutory enactment after statutory enactment that deprived these detainees of access to the courts. This is certainly not an issue of the Court trying to divine legislative intent. After reviewing the latest decision in Boumediene, it’s clear the earlier hand-wringing about the statutory meaning of the 2001 authorization of force against al Qaeda, as well as related, clarifying acts of Congress, was a smokescreen. The Court has ignored its earlier precedents and read into the Constitution an extra-textual right to control the other branches of government in all instances under the banner of constitutional habeas, even though its jurisdiction is subject to the Constitution’s grant of power to the Congress to restrict jurisdiction, and even though the Constitution’s text and history provide almost no basis to extend habeas review to non-citizens captured in war and held overseas.

Conservative critics of Bush should not forget that separation of powers concerns and the need for vigorous executive action against terrorists transcend the particular demerits of George W. Bush. The Constitutional system does not ebb and flow when a good or bad president is in office. All powers can be abused, and this is especially true in the case of the courts, whose own sense of restraint is the thin reed that limits most of their abuses. The same Supreme Court that arrogated to itself the right to interfere with law enforcement activities at the state level and that divined a right to abortion in the sixties and seventies also imagines itself the final say now on all aspects of the war against al Qaeda. Every single decision it has authored since 2001 has allowed al Qaeda detainees the means to clog our judicial system, embarrass the President, potentially obtain release from detention, or otherwise continue their war against the United States by using our legal system to burden the armed forces and our government. These results are all the more disturbing because they are not required by constitutional text, deviate from earlier precedents, and often involve cabalistic interpretations of congressional statutes to circumscribe executive authority.

Consider the various problems with the Court’s approach.

First, the Court has deviated again from its long line of earlier jurisprudence disclaiming any judicial authority over US government actions overseas that involve military detainees, military affairs, or foreign policy. The Johnson v. Eisentrager, 339 U.S. 763 (1950), decision specifically disclaimed any authority for Article III courts to review the actions of US prison officials detaining German POWs and war criminals held in Germany. While the Court then confusingly considered the merits of the petitioners’ habeas petition, it ultimately concluded that the courts were under no duty to actually grant such petitions or require military officials to produce enemy prisoners who would seek relief from US courts. As the Court said then—and forgot in its latest opinion—the “writ, since it is held to be a matter of right, would be equally available to enemies during active hostilities as in the present twilight between war and peace. Such trials would hamper the war effort and bring aid and comfort to the enemy. They would diminish the prestige of our commanders, not only with enemies but with wavering neutrals. It would be difficult to devise more effective fettering of a field commander than to allow the very enemies he is ordered to reduce to submission to call him to account in his own civil courts and divert his efforts and attention from the military offensive abroad to the legal defensive at home.” Consider also the Court’s simple failure to mention its own decisions in other areas that hold US Constitutional rights do not apply to foreigners overseas. See, e.g., United States v. Verdugo-Urquidez, 494 U.S. 259 (1990) (holding that Fourth Amendment protections do not apply to searches and seizures by United States agents of property owned by a nonresident alien in a foreign country).

Second, the Court has deviated from an extensive body of jurisprudence that demands deference to executive interpretations of Congressional enactments. In the numerous cases expanding on “Chevron deference,” the Court articulated the notion that reasonable executive interpretations of laws on which more than one reasonable interpretation may be allowed will not be questioned by the Court, even if some other interpretation is arguably more reasonable than the alternatives. This principle preserves the traditional executive responsibility of implementing congressional legislation, especially when conforming abstract legislation to particular circumstances requires more detail than any statute can be expected to have. Along these lines, Ex parte Quirin, 317 U.S. 1 (1942), recognized Presidential authority to create military tribunals under a Congressional authorization of force with far less explicit language allowing detention of “persons” than President Bush enjoyed under the 2001 Authorization of Military Force. Since the powers of war are refracted through various legislation that clearly weighs in favor of the Guantanamo detention—the initial 2001 Authorization of Military Force and the later Military Commissions Act of 2006—the Court’s actions have nothing to do with ambiguity or filling in gaps in the case of congressional silence. It’s a pure act of substituting its judgment for the combined judgment of the Congress and the President.

Third, the Court has deviated from an extremely deferential standard of review for executive actions that may be grouped under the rubric of military affairs. Even in the Court’s most hands-on realm of expertise and intervention—the First Amendment—the Court has shied away from interfering with actions that affect military discipline and efficiency. See generally Goldman v. Weinberger, 475 U.S. 503 (1986). The Court suggests in Boumediene that the procedures for the review of detainee status will always be inadequate without court review. But there is no requirement of appellate review of any court decision at any level; appellate jurisdiction may simply be removed by statute by the Congress under its powers granted in the Constitution. The Court does not recognize this glaring exception to its theory of what good government requires, because it undermines all of its pretensions as the necessary “final say” on the other political branches’ actions.

Fourth, the Court’s ruling has deviated from the principle expressed in Youngstown Sheet & Tube Co. v. Sawyer, 343 U.S. 579 (1952), that in the case of an explicit grant of authority by Congress, the President acts at the highest level of personal and constitutional authority. The GITMO detentions contrast to cases grounded solely in implied powers or executive actions contrary to explicit congressional directives. In spire of this long-established concept, in Hamdi v. Rumsfeld, 542 U.S. 507 (2004) and Rasul v. Bush, 542 U.S. 466 (2004), the Court (1) second guessed executive fact-finding of unlawful combatant status even for prisoners held overseas and instead demanded another layer of process before special status tribunals and (2) the Court did not find that the 2001 Congressional Authorization of Military Force against Al Qaeda, which authorized the President to “use all necessary force against those nations, organizations, or persons…” involved in the September 11 attacks, also authorized the detention of those the executive finds to be unlawful combatants. The Court took its extremely narrow reading of Congressional authorization a step further in Hamdan.

The Congress responded to these rulings, and the procedural history is telling. In Hamdan, the 2005 Detainee Treatment Act explicitly authorized the use of military tribunals and disallowed habeas review in Article III courts for claims made by the Guantanamo prisoners. It made certain provisions explicitly retroactive in effect, but the statute was less-than-entirely-clear as to its impact on pending challenges, such as Hamdan’s. Nonetheless, the statute expressed a fairly certain congressional intent to strip the Court of plenary habeas jurisdiction over challenges by the Guantanamo prisoners. Even so, the Court engaged in a detailed textual analysis to show that Congress did not explicitly manifest any intent that its 2005 statute be applied retroactively, and it ruled against the executive interpretation (saying it violated both Geneva and the UCMJ’s Article 36(b)). The Congress quickly said otherwise in the 2006 statute, which the Court has now undone with a supposed implied jurisdiction to reach every government action under constitutional habeas, even though jurisdiction on any issue can be deprived from the Court by congressional statute. The Constitution itself says at Article III, Section 2 that “the Supreme Court shall have appellate Jurisdiction, both as to Law and Fact, with such Exceptions, and under such Regulations as the Congress shall make.” How much more explicit could a withholding of Court authority be than the 2005 and 2006 statutes on tribunals? The Court has simply invented a right to hear habeas cases with no limits whatsoever. Under the Court’s logic any person, anywhere, affected in any way by US government action has recourse to US courts. Bush, to his discredit, has not explained why his power in these areas is important, and he also lacks the moral courage simply not to enforce the Court’s ultra vires decisions.

In Boumediene, the Court undertakes a lengthy historical analysis of habeas that is inconclusive by the majority’s own admission. But instead of deferring to coequal branches of government, it forges ahead with interference. Rules of jurisprudence rightly contain certain principles of deference, particularly when traditional prerogatives of another branch of government are implicated. Those principles are especially appropriate in this case of an executive military function. The reasons are plain. First, the Congress has ample power of its own to rein in executive abuses in this arena, not least because it can withhold funding. Second, the President surely must be understood by precedent, structure, and text to have some kind of inherent authority to address military affairs, including treatment of prisoners, under the “take care” and “commander in chief” clauses. Finally, even common sense says the Court has little appreciation for the swift means by which military decisions, including detention decisions, must be made. Warfare is not a situation of “better ten guilty men go free, rather than one innocent be held.” In war, these presumptions are reversed. When non-citizens overseas are involved, it’s not a question of our common welfare as a people being involved at all. The Padilla case, which I admit is more complicated, is very different from foreigners captured overseas in Afghanistan. The only restraints are prudential: military necessity, executive interpretations of treaty obligations, and the common-sense view that we do not want innocents detained. The Court’s interference has been costly already. Under the overly deferential review regime set up to address the Court’s directive in Hamdan, many detainees have been released only to return to the battlefield.

Un-uniformed, extra-national forces that commit terrorism have long been given few rights under the law of war. And that law has long been administered with little outside court control of the executive branch worldwide. This context has been completely lost on the Court throughout its long list of forays since 2001 into the arena of unlawful combatants. The Court, uncomfortable with its duty to defer to the political branches, has simply ignored and defied its prior precedent without regard to the long-run consequences. These consequences include the negative effects of this decision on US morale, the likelihood that under the Court’s oversight dangerous terrorists will be released from custody, and the logistical and military nightmare that appellate review will inflict on detention procedures and military commissions. Today the decision is something of a nuisance; how much more burdensome would it have been in a time like WWII, when hundreds of thousands of POWs and other detainees were held for the duration of hostilities?

The Court’s decision ultimately betrays three major biases, all of which are very dangerous to our constitutional system and the future success of the war against al Qaeda. First, the Court will countenance no distinction between military and peace-time realities. All of the Court’s decisions demand, in effect, the same level of court involvement and scrutiny involving unlawful combatants that are not (and could never be) signatories to the Geneva Conventions entitled to their protections. Second, the Court basically shows at every turn, in spite of its lip-service to the destruction of 9/11, that it does not think this is a real war, with a real enemy, where the safety of actual Americans is in grave danger. Why do I know this? Because the Court has resisted every demand to treat these military measures in military operations against a military organization differently from ordinary criminal procedures. Here, as in criminal cases, the burdens, procedures, rules of evidence, and likely outcomes are designed to favor defendants heavily under the Court’s recent line of cases. Finally, the Court has moved away from its own precedents allowing other branches of government to act without its ultimate approval and involvement. Doctrines of nonjusticiability and “political questions” are apparently out the window when it comes to the war against al Qaeda.

The Court’s concerns are clearly not the human rights of the suspects, so much as the power of the courts as a whole. A friend in a discussion offline put it as follows:

It seems to me that a part of the difference of opinion on the Habeas issue is what the principle of Habeas Corpus is about. The conservative line tends to view it as an individual civil right. Thus, it is designed to protect citizens and those others as determined by law. Kennedy’s opinion seems to me to have a very different slant. He seems to think it is a way to check the power of the executive branch, especially as a separation-of-powers issue. In that way, it seems a right more properly of the judicial branch, and extends to anyone who might come under the power of the executive branch. The right then is more a part of the fabric of our form of government, and only secondarily a right of the individual. To the extent it belongs to the individual, the court seems to consider it more a natural right—a right that exists to people as human persons qua persons (not citizens) to be free from undue governmental power.

There is another factor leading the Court astray: any distinction of citizen and non-citizen is missing from the Court’s view. It’s as if war—the starkest demonstration of conflict of the loyal and the disloyal, the native and the foreign—is just too old-fashioned. It’s clear that Obama and other critics of the GITMO procedure are less concerned with the various strategic mistakes of the Iraq War than they are with a generalized pacifism and discomfort with the inherent distinction of war and peace. Otherwise, they’d be up in arms against these Court interferences with the non-Iraq campaign against terrorists. Obama and Justice Kennedy both reveal a lawyer’s obtuse disregard in general for non-legal situations such as military operations that require alacrity, discretion, and force. Under their logic, a Court could stop D-Day with an injunction, and US commanders could be hauled before US courts to answer for treaty obligation compliance that has, up until now, been left to the President’s interpretation.

All of this is not to deny the obvious: abuses can happen. Innocents may be mistakenly detained. Their detention may last a long time. And this would be unfortunate in the extreme. But the whole difference of war and peace, of foreign terrorists and domestic criminals, is that the former are more dangerous and require different techniques to be resolved. The balance of concern for citizen safety and the rights of the innocent is reversed from the ordinary situation of civilian law enforcement.

For a constitutional system that is supposed to embody a balance of powers, in which unreviewable action by any one branch is suspect, the Court never expresses any doubts about its own rectitude and authority, even when it interferes in traditional executive wartime responsibilities. As always in matters of politics, we should ask: “Quis custodiet custodes?”

Liberalism Forbids An Awakening
By Christopher Roach, June 6, 2008

Liberalism views much of history as a morality play. The past was very bad. We are making progress. The future will be better. But every new achievement serves also as an indictment. The past is guilty of offending the liberal goals of inclusion and equality. As Lawrence Auster has noted, every advance of inclusion and equality is an indication of how the past up until this point was less inclusive and equal, and that we still have a “long way to go.”

An Obama presidency, far from calming down demanding liberal interest groups, may instead add fuel to the fire. Moderates and establishment conservatives are confused. Why can liberals find so little to love in our country and civilization’s history? Why is every defense of the past and its heroes met with predictable bromides: “Well, it’s easy to be patriotic if you’re white and privileged. Most people didn’t have access to the good life in America’s past, and, even if more and more do, many still do not, which is the height of injustice.” The ending of certain inequalities and oppressive forces, including the abolition of slavery, the winning of women’s suffrage, the defeat of the Nazis, and the magnanimous civil rights movement, is of no consequence. Far from decreasing America’s collective guilt, these liberal achievements add to the bill.

Why? Because liberalism does not permit a non-liberal definition of the good. To the extent nonliberal values like safety, peace, and prosperity are admitted as important political goals at all, they are always trumped by the liberal goals of inclusion and equality. Consider the liberal view of racism in the stark example of post-colonial Africa. Much of Africa was once under white rule. By the mid-twentieth century this meant prosperity, efficient government, European-style capital cities, the rule of law, and rising standards of living for white and black alike. Since the overthrow of white rule in places like Rhodesia and South Africa, the continent has declined on nearly every measure of human flourishing: GDP, life-span, violence, and corruption. Nonetheless, the white-ruled past is considered more worthy of condemnation than the tragic present. In the comparison of broke, crime-ridden, and unstable “democratic” states with wealthy, safe, and orderly white-ruled regimes, the latter always loses no matter how extreme the differences. Even if we admit that white rule was both unstable and not entirely fair, its other merits cannot be considered. Liberalism proffers an ideological definition of the good that is immune to facts. Adherence to liberal aspirational principles always takes greater importance in this value scheme, and this unalterable commitment is a great cause of our present troubles.

Liberalism does not allow us to recognize an abiding reality: that the present is better in some ways and worse in others than the past, and, thus, in understanding our own society and that of other peoples, we must weigh the relevant factors. One can, in fact, compare Rhodesia and Zimbabwe. But, for a liberal, any such comparison is no contest. The present is obviously better, though we still have a long way to go. When we get there—wherever there is in the future—we’ll realize how backwards and illiberal we are today. Someday that self-satisfied future will itself be condemned by a superseding liberal future. Eventually, liberalism itself will have to collapse; by ignoring the other goods of political life in theory, it will neglect them in practice, as its votaries have in Africa. A serious post-mortem will not happen, however, whenever that comes to pass. Nothing could possibly happen that would change minds, because those minds are programmed with heuristics and categories and screwy moral reasoning that redefines the good in a lop-sided way. For liberalism consists above all in an exaggerated concern for a single aspect of justice—broadly speaking, equal treatment of all—to the exclusion of all other political and social concerns.

There is a vain hope by some conservatives that reasonable people who actually love this country and are fed up with being guilt-tripped all the time will finally wake up after something really bad and offensive happens in the name of liberalism. That some insult by liberalism will be a bridge too far. Judging from the Watts Riots, the LA Riots, Bill Clinton’s corruption, and Barack Obama’s long-standing affiliation with a racist church, this is very unlikely. Establishment conservatives are in the grips of the same liberalism described above. They have lost the vocabulary with which to define good government and human flourishing outside of the liberal ideals of inclusion and equality. Even George W. Bush cannot imagine fighting a war for the traditional reasons of revenge and national security. It must be for “democracy” and a fight against “evil.” We must prove it’s for democracy—not just ours but everyone’s—by sticking around Iraq and Afghanistan for nearly a decade to impose alien political institutions. Every terrorist attack by the ungrateful Iraqi people is met with attempts to exclude the perpetrators from the mass of the people; they are dead-enders, a small minority, terrorists, etc. Since liberalism is the good, the Iraqis must want it, and those who fight against our war are therefore not real Iraqis. For our liberal crusading President it’s very simple: Iraqis are people. People want liberalism. Illiberal people are malevolent, evil, confused, and never express a widely held view. Thus the Iraqi nationalist opposition that wants power to oppress its sectarian neighbors does not represent the will of Iraqis. But every Iraqi thinks this way about what to do with political power, even our “friends.” Liberalism blinds us to how most people do not want “freedom” and “democracy” but are, rather, tribal. They want tribal goods like wealth for the tribe and the subordination of hostile tribes. This is what democracy in the name of liberalism means to Iraqis, and this is what it means to most of the aggrieved minority groups that make up the Democratic Party.

Liberalism does not allow tangible failures to cause a reevaluation of the liberal hierarchy of values, because that hierarchy defines failure in advance as either growing pains or an improvement over the racist, exclusionary past. Iraq is one example. Gutted black-ruled American cities like Detroit are another. But the best example of this is Zimbabwe, formerly Rhodesia. It was once a nice, orderly place you might want to visit. Unlike in South Africa, some blacks could vote. Even those who couldn’t lived reasonably well compared to every other African nation. Under Ian Smith, the country’s government achieved all of the goods we expect of government and fought a good fight against Soviet-supported terrorists, but that regime was minority-ruled by the whites and wealthier blacks. It was abandoned and then castigated by liberal European nations in the name of liberalism. It eventually fell. Today, Zimbabwe is a hell-hole that is about to experience mass starvation. Yet for liberals, the past was worse than this, because the exclusion of the ignorant black peasantry from political power offends liberal principles of equality, while its present descent into anarchy and starvation does not.

If the horrors of decolonization and black rule in Africa did not wake up liberals—whose trendy cause in the ’70s and ’80s was fighting against apartheid—why should we expect the milder pains and inconveniences of Barack Obama’s rule to wake us up here at home?

The Afghanistan Fallacy
By Christopher Roach, June 1, 2008

America’s wars in Iraq and Afghanistan both involve fractious societies, weak governments installed by force from without, rampant criminality, persistent insurgencies, and the spectre of unknown costs from a U.S. withdrawal. The chief reason we are told to stay on both battlefields—in particular Afghanistan—is that they may become natural havens for terrorists without U.S.-imposed order. Yet the dominant rhetoric of critics is that Iraq is the “bad war,” a distraction at best . . . a major injustice to the Iraqis at worst. These critics—including Barack Obama—describe Afghanistan in essence as the good war. Our counterinsurgency efforts there are widely held to be necessary to prevent the reemergence of terror camps and to avenge the 9/11 attacks. For Democrats in particular, strenuous expressions of support for the Afghanistan War also serve to deflect their post-Vietnam reputation as naive pacifists incapable of marshaling force to defend national interests.

Both wars, however, rest upon the same, mistaken strategic assumption: the idea that we must create and support new democratic states in the chaotic regions where Islamic terrorists train and live and battle insurgents until native forces can take over the fight. Any serious proposal to increase the focus on Afghanistan must explain why our strategy there will succeed where the nearly identical Iraq strategy has so far succeeded only in moving the United States’ position sideways. By this I mean that Americans have alternately fought Saddam, his loyalists, the Sunnis, and now dissident Iraqi Shia factions opposing the Iranian-friendly Shia regime that the U.S. also happens to support. Violence ebbs and flows, but no real light at the end of the tunnel appears in either case, because the structural factors for disorder remain the same. The existence or not of democracy is a relatively minor factor in fueling the persistent violence in these societies. Indeed, the relatively greater primitivism and poverty of Afghanistan suggest that any nation-building-cum-counterinsurgency efforts there face greater intrinsic challenges.

Both theaters are counterinsurgencies among bellicose and tribal people with whom we share very few values and interests. In both theaters, U.S. commanders are embroiled in resolving parochial tribal disputes that have no obvious bearing on American security or the strength of al Qaeda proper. These two counterinsurgencies are especially ambitious because they are not in defense of established regimes, but are being waged alongside new regimes that we have created, similarly to the ideologically-tinged US War in Vietnam or the Russian experience in Afghanistan. We can expect, and so far have witnessed, similarly ambiguous results in both Afghanistan and Iraq. This lack of success in both wars is a distinct issue apart from the retrospective concern over the administration’s description of the casus belli in Iraq as Saddam’s possession of weapons of mass destruction.

Both campaigns are at the heart of America’s fight against al Qaeda, but the strategy in both cases is questionable. Disorderly groups, ranging from pirates to criminals, have long existed on the international scene. History in this regard has not been linear. For a time, western societies had a decisive edge over hostile anti-modern peoples, symbolized most dramatically by Napoleon’s rapid conquest of Egypt. Today, terrorists of all kinds are more deadly and more capable of power projection through home-made bomb technology combined with cheap international travel and porous western borders. The fragility of interconnected modern economies, the West’s doubt about its own self-defense, and high levels of freedom within western societies give terrorists more capability than they enjoyed in the era of the Barbary Pirates. It does not follow, however, that the only way to address the problem is to sort out “the root causes,” whether defined as Islam, poverty, the lack of Middle Eastern democracy, or dysfunctional states in general.

George Bush’s response to 9/11, while bold and superficially effective, comes from the same thinking as the Johnson-era War on Poverty. It aims ambitiously to attack the root causes of terrorism. The seeds of American failure are found in the strategy itself. International terrorism has features in common with other permanent afflictions, such as poverty and crime, insofar as in all of these cases the symptoms can be more effectively treated than the root causes. In the case of terrorism, the way to do this is to develop an overall strategy of defense.

Instead of viewing terrorism as a persistent but manageable problem, Bush proposed to wipe it out forever with an “American exceptionalist” idealism and ambition. The proposal centered on rearranging the internal politics of hostile Islamic societies. Today we would resolve Afghanistan and Iraq . . . tomorrow, Syria, Iran, and Palestine. Since Bush and his neoconservative advisers were so steeped in the foundational principles of liberalism, they could not admit that some combination of militant Islamic religion and American foreign policy choices in the Middle East was the chief source of grievance in the Islamic world, rather than a lack of money, bad p.r., or a lack of democracy.

We were told, “Install democracies, and everything will be fine. We did it in Germany and Japan, which are just like Afghanistan, you know.” Setting aside the now laughable optimism, is it necessary to install even nondemocratic friendly regimes in these countries? Conservatives have discussed at length the impossibility of creating democratic regimes in much of the Middle East, and, where democratic procedures have been adopted, conservatives have correctly noted the unhappy results from the standpoint of U.S. interests. The democracy component of U.S. operations hinges on a series of related and highly questionable premises, namely, that the U.S. should wade into the mire of fanaticism, disorder, and chaos that has always existed in countries like Iraq and Afghanistan. The option of a withdrawal coupled with a cordon sanitaire, surgical attacks, and sophisticated defenses is too often dismissed without analysis.

Conservatives of a manly bent gravitate towards resolving these matters “once and for all,” and that almost always involves going on the offensive. Defensive operations are treated as passing the buck. This is a false dilemma. Defense is doing something as well. Much of warfare and foreign policy, historically speaking, has consisted of various defensive measures, particularly in efforts to deter threats by preparedness and social cohesion at home. In addressing Islamic terrorism, both neoconservatives and the pacifist left are trapped by the same liberal world view. Neoconservatives believe it would be a great moral failing to leave the Afghans and Iraqis to stew in their own juices and, when necessary, to impose collective punishment on nations that do not prevent their nationals from undertaking terrorist attacks on the United States. The sentimental universalism of the neoconservative right makes it difficult to conceive of another nation or tribe as an “enemy” with a collective responsibility. Where a nonideological person sees an enemy, the neoconservatives see an opportunity for do-gooder intervention. The pacifist left is trapped by the corollary: indifference to America’s survival as a distinct people with a right to continue as such. In both cases, rationalist universalism confuses the national interest with fidelity to abstract principles, which are the supposed right of all mankind. The left’s sentimental universalism makes it difficult to define the United States in particular as worthy of protection as a coherent entity. The old refrain says it all: Who are we to judge?

So, while many on the left have criticized Iraq in clear and powerful terms, none of the major Democratic candidates seriously proposes a defensive strategy that would offend liberalism, such as restricting immigration from the Middle East or expelling disloyal Muslims. Likewise, border security in general is pooh-poohed because of the overlapping liberal interest in expanding our diversity, which adds to the natural clients of the welfare state and undermines the influence of traditional American elites. Finally, the pacifism of the Democratic Party stems from its sour view of America as a western oppressor nation, guilty of dispossessing the Indians and of slavery. In this view, we and our people in particular are unworthy of the moral right to self-preservation, even if our enemies have a moral right to resist us. Only by the triumph of liberal ideals will America “be America again.” In other words, our annihilation as an historical entity is the barely concealed goal of the mainstream left.

I should be clear: most leftists do not consciously support the mass murder of Americans by terrorists; instead, they see our peaceful destruction through demographic trends, international pressure, mass immigration, and redistributing our resources in the name of “global justice” as welcome measures. For example, Barack Obama sponsored a Global Poverty bill that would have sent upwards of $845 billion in U.S. dollars overseas. This is an unusually ideological account of American interests in a time of great economic trouble. This lack of identification with an American community with a distinct interest among the world’s peoples is why normal patriotic feelings remain difficult for the left—for them, it’s obvious that the suffering of terrorists in Guantanamo Bay and racial profiling of unassimilated American Muslims are viable campaign issues.

The Afghanistan Campaign is certainly morally justified. But is it strategically justified? Conservatives should ask whether it furthers American safety and independence to fight terrorism by building and supporting ersatz states in naturally disorderly parts of the world. It does not appear so for the reasons outlined above, chief among them that such wars are unnecessary and less effective than defensive alternatives. The goal of national defense is furthered just as much by Osama bin Laden being six feet under in a grave as it is by him hiding six feet underground in a cave, impotently producing videos and audio tapes. If he emerges from the caves, there is little reason a rump U.S. presence consisting of carrier-based air power and small special forces teams could not destroy him. Such punitive and surgical applications of force are certainly far less costly and leave the U.S. with more flexibility than the current approach, where one third of our forces are tied up in semi-permanent nation-building projects. In this model, the Afghan nation-state would have to stand up on its own, with the incentive of major U.S. punitive retaliation if it should not do its level best to stop our enemies who would use its land for terrorist training.

A defensive counter-terrorism strategy would focus on matching America’s comparative advantages to al Qaeda’s weaknesses. The happy example of Switzerland—not blessed, like we are, by an enormous ocean on either side—shows that defensive neutrality, or something like it, is possible in the modern era and brings with it a great number of economic and other advantages. In the counter-terrorism context, such a strategy would focus on securing U.S. borders, restricting immigration from unfriendly regions, enhancing the resources of domestic law enforcement, and undertaking the occasional punitive raid; however, such a strategy would not counsel the U.S. to get bogged down in nation-building, whether for strategic or humanitarian reasons.

For the neoconservatives, the tail is wagging the dog. In their eyes, commitment to Israel and the promotion of liberal democracy are at best coequal with any concern for preserving historical American independence. Any foreign policy advice they give should be viewed skeptically on account of these commitments. Their ideology ultimately renders them and their followers incapable of questioning Bush’s approach. Defensive strategies continue to be dismissed out of hand by other conservatives through a perfect storm of leftover Wilsonian idealism, the identification of terrorism’s root causes in the lack of Middle Eastern democracy, and the interaction of both of these tendencies with a habitual American desire to solve problems permanently as an expression of resolve and hard-headedness.

While manly resolve and hard-headedness are admirable traits without which civilization could not exist, their close cousin is self-destructive hubris. Willpower alone will not transform the world. Doing “something, anything” often has as much to do with the psychological needs of the actor, as it does with accomplishing a particular end. Our troops on the ground have learned through trial and error about the benefits of tactical patience, realism, local understanding, and the need for clear mission guidance from higher command. Tactical excellence, however, cannot overcome the lack of strategic realism among our top leaders, which requires above all a renewed commitment to the defensive art and the jettisoning of liberal categories that make clear thinking about national self defense impossible.

As Clausewitz writes of the defensive form of war in On War:

What is the object of defense? To preserve. To preserve is easier than to acquire; from which follows at once that the means on both sides being supposed equal, the defensive is easier than the offensive. But in what consists the greater facility of preserving or keeping possession? In this, that all time which is not turned to any account falls into the scale in favor of the defense. He reaps where he has not sowed. Every suspension of offensive action, either from erroneous views, from fear or from indolence, is in favor of the side acting defensively. This advantage saved the State of Prussia from ruin more than once in the Seven Years’ War. It is one which derives itself from the conception and object of the defensive, lies in the nature of all defense, and in ordinary life, particularly in legal business which bears so much resemblance to war, it is expressed by the Latin proverb, Beati sunt possidentes. Another advantage arising from the nature of war and belonging to it exclusively, is the aid afforded by locality or ground; this is one of which the defensive form has a preferential use.

What is Paleoconservatism?
By Christopher Roach, May 24, 2008

At the end of the Cold War, conservatives found themselves in a state of disunity and intellectual ferment. The neoconservative faction demanded a continuation of the Cold War model of interventionist foreign policy and rejected the domestic small government conservatism popular in the South and West. Neo-nationalists, such as Pat Buchanan, pushed for a turn inward, the rejection of various liberal cultural trends, and a dismantlement of much of the welfare state, while advocating restrictions on immigration to reduce the growing welfare state’s largest constituency.

If the expanded government power of the Cold War was conceived of as a necessary evil in the eyes of paleoconservatives, for neoconservatives it was America’s finest hour. In the beginning, neoconservatives were chiefly made up of liberal, Jewish defectors from the Democratic Party who rejected its embrace of the New Left at the tail end of the Vietnam War. In the New Left, the neoconservatives saw nihilism, indifference to Soviet expansionism, solidarity with anti-Western (and anti-Israeli) movements for “national liberation,” and deep alienation from the consensus American position of the Cold War. As liberals with strong ties to the civil rights movement of the 1960s, neoconservatives also saw themselves as the conscience of the conservative movement, natural moderates without the taint of racism that fueled some southern conservatives’ opposition to the civil rights movement.

In the early 90s, the burgeoning Welfare State, with its invasive focus on the activities of private life and private businesses, presented itself as a natural locus of unity among traditional conservatives. Paleoconservatives were uneasy with the compromises of the Cold War, and after 1989 these could no longer be justified as necessary and temporary measures to oppose the Soviet Union. Even quasi-authoritarian Catholics (who lionized Franco and Pinochet) had little use for the federal state’s invasiveness because so much of it was in the service of evil and revolutionary ends: undoing state prohibitions on abortion, imposing crushing tax burdens on small business, interference with naturally patriarchal sex relations, and preventing self-defense through gun control. This anti-Welfare-Warfare-State coalition included the self-described paleolibertarians. As the “emergency” needs of the Cold War ended, paleoconservatives urged a major reduction in America’s foreign policy commitments, just as they had urged an end to the “emergency” programs of the New Deal after the crisis of the Great Depression had passed. The divisions between the traditionalist paleoconservatives and the neoconservatives–first hinted at in the derailment of Mel Bradford’s appointment to the NEH–became thoroughly manifest during the reign of George H.W. Bush. The FDR coalition refugees never fully embraced the goal of dismantling the welfare state, including the New Deal. More important, the end of the Cold War led neoconservatives to find new dragons to slay, advocating permanent US interventionism in the Middle East, the quixotic goal of expanding democratic capitalism, protecting Israel, resisting a revanchist Russia, and generally preserving the beneficent application of US power. Pointless interventions in cases of dubious national interest—particularly in Somalia—hardened divisions between these two disparate wings of the conservative movement.

Earlier disagreements on such varied issues as antidiscrimination laws, the meaning of the Civil War, and the existence and nature of “racism” provided additional friction, which burst into flame in the wake of the 9/11 attacks. Where paleoconservatives described the attacks as the fruits of excessive intervention in the Middle East and an overly generous immigration policy, neoconservatives saw the attacks as a pretext to cleave closer to Israel, pursue long-established plans to defang Iraq, and generally pursue an “American greatness” foreign policy. Since liberals, libertarians, and traditionalist conservatives all had various degrees of opposition to the War in Iraq–or developed opposition as the war’s long-term idealist project became manifest–pacifist libertarian ideas on foreign policy allowed paleoconservatism to be reduced to a single, small-government principle in the eyes of recent arrivals. This was a natural enough inference considering the focus of publications like Chronicles and The American Conservative during the last five years. It is erroneous, however, and this is apparent to anyone involved in conservative politics prior to 2001. The identification of paleoconservatism as coterminous with absolutist small-government ideas has confused fellow travelers with long-term believers and wrongly substituted part of its authentic conservative vision for the whole.

Like any species of conservatism, paleoconservatism demands different treatment of differently situated people. If paleoconservatism is for small government at the federal and international level, it often embraces “republicanism” at the local level, a tradition that extols the idea of a small, self-governing society where happiness and virtue follow the salutary act of considering and debating the good, being an active citizen, and expressing that commitment politically in law. By way of illustration, anti-democratic interference in the name of newfangled liberties is one of the core sources of conservative opposition to judicial activism, which has interfered with the right of states to organize schools, address vice, find and punish criminals, and chart a course attuned to local circumstances.

Conservatism is defined above all else by the instinct to defend a known way of life that is under threat. In today’s America, that certainly includes the constitutionalist limited government traditions of the Founders, but it also includes the moral leadership of the WASP elite and the Low Church culture originating in rough-hewn Scotch-Irish pioneers that finds expression today in the scorn for elitism and disdain for dependency among a significant fraction of longer-established Americans.

A “conservatism” that decries everything from 1789 onward and rejects large swaths of historical American practice is not conservatism, but is instead a kind of ideological romanticism. Like any ideology, it does not have to deal with compromise, results, facts, and lived experience. For the ideological romantic, the past and the present can both be dismissed as cynical compromises accepted by inconsistent and unserious men. For folks like these—whether liberal, libertarian, socialist, or something else—the best is yet to come, and, if we enact their a priori proposals, the perfect society will appear just over the next hill, like the Lost City of El Dorado.

An Imaginary Edmund Burke
By Christopher Roach, May 22, 2008

It seems everyone wanted to be on the side of progress in the Seventies, but today everyone’s a Burkean. Gay marriage advocates, Barack Obama supporters, and defenders of the welfare state all identify themselves as the rightful heirs of Edmund Burke, the grandfather of conservative philosophy. This is a strange development. Burke and the conservatism he preached have long been relegated by respectable modern opinion to the realm of curiosity, at best. After all, the “times are a changin’,” and the opponents of change are, in the words of a typical liberal, engaged in “elitist tomfoolery since they are not on the receiving end of racism, sexism, and adoptionism [?] in the US.” In other words, conservatism, Burkean or otherwise, was not what the cool kids were wearing until very recently.

Part of the increased interest in Burke stems from the sharp contrast of his rhetoric of prudence with the extremely imprudent ideological campaign that Bush has undertaken in Iraq. If older veins of conservatism find their inspiration in Burke, Bush and the neoconservatives look to FDR, seeing in him moral clarity and the “national greatness” of America’s campaigns in World War II and the Cold War. Where Burke campaigned against the explicitly ideological violence of Revolutionary France, Bush and his neoconservative advisers see something romantic and providential in the application of massive violence against the Axis powers. In particular, neoconservatives regard any collateral damage to have been atoned for in the more or less successful American reconstruction of the once troublesome nations of Germany and Japan.

For mostly the same reasons, Lincoln looms large in the neoconservative vision because he is a figure that marshaled power, applied violence, and sidestepped the restrictions of tradition and positive law in the service of a grand philosophical vision of natural right. If only Lincoln had been around in 1939, the thinking goes, there would never have been a Holocaust. The superficial pacifist opponents of Vietnam were something of an anomaly. The more abiding liberal tendency remains one of comfort with necessary revolutionary violence to bring about an egalitarian vision of social justice. It started with Robespierre, continued with Lincoln, and culminates today in the far less technically capable crew in the White House. In times like these—times of never-ending ideological wars and incompetence in their prosecution—some prudential restraint is naturally quite attractive to a wide variety of observers weary of remaking the world in our own image.

Burke’s latest fans, however, misunderstand something important about Burke and his philosophy by abstracting from his oeuvre only the following two ideas: (1) criticism of ideological fanaticism, particularly its concern with uniformity, and (2) Burke’s promotion of gradualism under the rubric of “organic change.” Burke had many more themes, all of which find echoes in conservative thinkers today. For example, Burke also defended the necessity of social inequality, political authority over moral matters, organized religion (including state support of the same), sound money, chivalry and traditional sex roles, and traditional political institutions, most notably in the form of hereditary monarchy. Burke was neither a libertarian nor a Classical Liberal. He famously sparred with Thomas Paine, who penned his obnoxious work, The Rights of Man, as a criticism of Burke’s Reflections on the Revolution in France.

Modern “Burkeans,” who defend such varied matters as universal healthcare and gay marriage in Burkean terms, effectively cut Burke’s philosophy in half, focusing exclusively on his concerns for procedure and the pace of political action, while distorting or ignoring Burke’s more controversial treatment of the substantive ends of political activity.

Consider by way of illustration a representative argument from Dale Carpenter, writing at Volokh.com:

It is possible that, from a Burkean perspective, it is some of the opponents of gay marriage who operate on abstract theories that have little to do with real human lives. Some, but not all, opponents of gay marriage appear to cling to an anachronistic view of gay people that is increasingly divorced from all learning, law, life, and experience. . . . Gay marriage is a proposal to change an entrance rule, to let more people in. There have been many changes in marriage entrance rules over our history: interracial marriage, age requirements, consanguinity requirements, to name a few. I am not aware of any evidence that a change in marriage entrance rules has ever harmed marriage as an institution. And gay marriage does not directly affect every marriage, since every other marriage remains heterosexual. To believe gay marriage affects every marriage is to rest on very abstract theorizing about present “social meaning” or wild speculation about distant future social meanings. A traditionalist conservative should distrust such reasoning.

Notice how the burden shifts from those who counsel change to those wary of its unforeseen consequences. Andrew Sullivan echoes this narrow and upside-down view of Burke when he says, “Doubt-based conservatism, in other words, is not just Burkean and English. It is Madisonian and American. This reckless era of big government fundamentalism is exactly the time to recover and celebrate it.” Burke never said institutions must change in a particular liberal direction over time or that existing institutions should themselves be treated by citizens with doubt and timorousness. Indeed, he defended sentimental attachments to the inherited order and was skeptical of change and liberal rhetoric in general, arguing instead:

The people of England will not ape the fashions they have never tried, nor go back to those which they have found mischievous on trial. They look upon the legal hereditary succession of their crown as among their rights, not as among their wrongs; as a benefit, not as a grievance; as a security for their liberty, not as a badge of servitude. They look on the frame of their commonwealth, such as it stands, to be of inestimable value, and they conceive the undisturbed succession of the crown to be a pledge of the stability and perpetuity of all the other members of our constitution.

Burke is even more out of step with modern tendencies in his treatment of religion. Burke defended the established Church of England—hey, no one’s perfect—but, more important, Burke made nearly all of his arguments with reference to an austere vision of Christian truth:

We know, and it is our pride to know, that man is by his constitution a religious animal; that atheism is against, not only our reason, but our instincts; and that it cannot prevail long. But if, in the moment of riot and in a drunken delirium from the hot spirit drawn out of the alembic of hell, which in France is now so furiously boiling, we should uncover our nakedness by throwing off that Christian religion which has hitherto been our boast and comfort, and one great source of civilization amongst us and amongst many other nations, we are apprehensive (being well aware that the mind will not endure a void) that some uncouth, pernicious, and degrading superstition might take place of it.

In short, Burke’s philosophy is far more robust and particular and undeniably conservative than the gradualist liberal skepticism championed by Sullivan et alia.

Today, Burke is being employed to support what should more properly be called “Fabian” change, i.e., deliberate but cautious movement towards a preordained liberal, ideological goal. Burke and the Fabians both rejected revolutionary change. Burke rejected revolutionary change because he thought the goals it pursued were bad or unlikely to achieve the promised results, colored as they were by an uncompromising ideological filter. The 19th Century Fabian Society socialists, by contrast, opposed revolutionary socialism, not because they thought the end result was a bad one, but because they thought revolutionary means would be less effective, might harden opposition, and otherwise presented tactical problems to the achievement of socialism. The Fabians were clearly onto something, judging by the steady increase of government power and socialist thinking in the once free market oriented Anglo-American regimes.

For the pseudo-Burkeans, the goals are almost always liberal in nature, based on the modern and very un-Burkean idea that the justice of any society consists of the equal treatment and provision of equal political power to people in very unequal situations, i.e., same sex couples and traditional married couples, foreign immigrants and native citizens, the educated and the uneducated. Burke was no egalitarian. In particular, he argued that it was unhelpful to emphasize our “common humanity” in discussing political matters, because group identities and the associated differences in station demanded different treatment in proportion to those differences:

The legislators who framed the ancient republics knew that their business was too arduous to be accomplished with no better apparatus than the metaphysics of an undergraduate, and the mathematics and arithmetic of an exciseman. They had to do with men, and they were obliged to study human nature. They had to do with citizens, and they were obliged to study the effects of those habits which are communicated by the circumstances of civil life. They were sensible that the operation of this second nature on the first produced a new combination; and thence arose many diversities amongst men, according to their birth, their education, their professions, the periods of their lives, their residence in towns or in the country, their several ways of acquiring and of fixing property, and according to the quality of the property itself — all which rendered them as it were so many different species of animals. From hence they thought themselves obliged to dispose their citizens into such classes, and to place them in such situations in the state, as their peculiar habits might qualify them to fill, and to allot to them such appropriated privileges as might secure to them what their specific occasions required, and which might furnish to each description such force as might protect it in the conflict caused by the diversity of interests that must exist and must contend in all complex society; for the legislator would have been ashamed that the coarse husbandman should well know how to assort and to use his sheep, horses, and oxen, and should have enough of common sense not to abstract and equalize them all into animals without providing for each kind an appropriate food, care, and employment, whilst he, the economist, disposer, and shepherd of his own kindred, subliming himself into an airy metaphysician, was resolved to know nothing of his flocks but as men in general.

Burke defended gradual change as an exceptional matter, not a general principle of political activity. The au courant Burkeans obscure that Burke’s more abiding tendency is skepticism and hostility to change, particularly change based on the Enlightenment idea of remaking society “rationally” to fit a particular philosophical vision of consistency and justice. For Burke, the goal of social life was not the “rights of man” or equality, so much as it was the prosaic concerns of order, wealth, stability, and happiness. A commitment to these limited and reactionary goals is usually at work when Burke defends a particular change, such as the so-called Glorious Revolution, which Burke identified as restorative of the principle of hereditary monarchy.

Burke was skeptical of change because, for him, the basic problems of social life have always existed. Institutions that had somehow survived deserved extreme deference because they represented the accumulated wisdom of other men dealing with the same problems with the same resources; if the inherited balance were somehow disturbed, society might spiral into oblivion, and its leaders might be unable to restore the status quo ante. He thought ill-advised change more dangerous than the alternative, just as he thought most promises of change were either overly optimistic or matters of deception, in which would-be reformers would succeed only in shifting the locus of political power from the old guard to themselves, acting in the name of “the people”:

Religion, morals, laws, prerogatives, privileges, liberties, rights of men are the pretexts. The pretexts are always found in some specious appearance of a real good. You would not secure men from tyranny and sedition by rooting out of the mind the principles to which these fraudulent pretexts apply? If you did, you would root out everything that is valuable in the human breast. As these are the pretexts, so the ordinary actors and instruments in great public evils are kings, priests, magistrates, senates, parliaments, national assemblies, judges, and captains. You would not cure the evil by resolving that there should be no more monarchs, nor ministers of state, nor of the gospel; no interpreters of law; no general officers; no public councils. You might change the names. The things in some shape must remain. A certain quantum of power must always exist in the community in some hands and under some appellation.

Burke thought happiness and authentic political justice consisted not in equal access to power, nor in certain procedures of political action, so much as they resided in preserving the inherited society as a whole and each person’s distinct role in it. He was particularly skeptical of calls to equalize participation in government simply for the sake of making the rough, but workable, edges of society more consistent. Consider his point here:

The science of constructing a commonwealth, or renovating it, or reforming it, is, like every other experimental science, not to be taught a priori. Nor is it a short experience that can instruct us in that practical science, because the real effects of moral causes are not always immediate; but that which in the first instance is prejudicial may be excellent in its remoter operation, and its excellence may arise even from the ill effects it produces in the beginning. The reverse also happens: and very plausible schemes, with very pleasing commencements, have often shameful and lamentable conclusions. In states there are often some obscure and almost latent causes, things which appear at first view of little moment, on which a very great part of its prosperity or adversity may most essentially depend. The science of government being therefore so practical in itself and intended for such practical purposes — a matter which requires experience, and even more experience than any person can gain in his whole life, however sagacious and observing he may be — it is with infinite caution that any man ought to venture upon pulling down an edifice which has answered in any tolerable degree for ages the common purposes of society, or on building it up again without having models and patterns of approved utility before his eyes.

Burke was, needless to say, no Fabian and no great fan of ideological equality. Nonetheless, his sense of justice frequently compelled him to defend the voiceless, such as the hapless Indian subjects mistreated by Warren Hastings and his oppressed co-ethnics in Ireland, put down by a short-sighted British concern for power. Indeed, imperialism everywhere seemed unnatural to Burke and a natural object of his condemnation precisely because it forced unlike peoples to associate with one another when they would otherwise proceed on their natural courses, guided by the ancient rhythms of their societies. Burke seemed particularly concerned about the corrupting effect of maintaining an empire in the Third World: “[W]e have not feared any odium whatsoever, in the long warfare which we have carried on with the crimes, with the vices, with the exorbitant wealth, with the enormous and overpowering influence of Eastern corruption.”

One might hazard the guess that, because of his high regard for Christianity and long-established English liberties, Burke was only a conservative in liberal England and would welcome more revolutionary changes among unlucky peoples inheriting less effective and more oppressive social structures and mores. This is easily refuted by the main theme of his magnum opus: it opposed revolution in a society that Burke himself admitted was corrupt and in need of reform. The argument that some societies are “too bad to conserve” is a very familiar excuse to deviate from the core teachings of Burkeanism. But there is little hint of this situational approach in his writings; even in pagan India, Burke counseled against imposing English social order in the name of either English superiority or the alleged beneficial influence of British civilization upon the Indians themselves.

Burke was strongly anti-change, not merely skeptical or in favor of gradualism. Major change was a last resort. Consider his vivid comparison of the potential costs and benefits of political change:

To avoid, therefore, the evils of inconstancy and versatility, ten thousand times worse than those of obstinacy and the blindest prejudice, we have consecrated the state, that no man should approach to look into its defects or corruptions but with due caution, that he should never dream of beginning its reformation by its subversion, that he should approach to the faults of the state as to the wounds of a father, with pious awe and trembling solicitude. By this wise prejudice we are taught to look with horror on those children of their country who are prompt rashly to hack that aged parent in pieces and put him into the kettle of magicians, in hopes that by their poisonous weeds and wild incantations they may regenerate the paternal constitution and renovate their father’s life.

Conservatism must by necessity vary with time and place. It is chiefly an attitude about change, and what change means depends in part upon how an existing society is structured. But Burkean conservatism does not concede the goals of liberalism, goals which ultimately counsel un-Burkean methods precisely because of liberalism’s uncompromising account of the good. Our modern era is in many ways the heir of the French Revolution and its ugliness. Modern liberalism’s methods and ideals—uniformity, equality, ahistorical liberté, purifying violence, secularism—find echoes in everything from Bolshevik Communism to the forceful imposition of “democratic capitalism” on the ancient peoples of the Middle East. Burke’s defense of gradualism and social diversity has certain merits standing alone. But standing alone, such ideas are not Burkeanism, nor are they sufficient for guiding political action in a society thoroughly suffused with recent revolutionary change. Burke expressed his concerns for gradualism and diversity as part of a unified view in which these intermediate political goals acquired value only in relation to his substantive concerns for legality, necessary inequality, Christian justice, peace, order, legitimacy, and preserving a known and workable English Christian way of life. Burke’s later writings in favor of counterrevolution and restoration of the French Monarchy give us some insight into how he would address those who would defend gay marriage or affirmative action or some other artifact of modern liberalism in his name.

The Great Education Bubble
Christopher Roach, May 11, 2008

The recent meltdown of the mortgage bubble illustrates a basic insight of Austrian Economics: cheap money leads to distortion and malinvestment, which can only be resolved through mass liquidation. Liquidation is an anodyne term, but in real life it means lost jobs, declining wages, “upside down” bank notes, bankrupt businesses, and stagnant housing values. The Federal Reserve’s decision after the September 11 attacks to drop interest rates precipitously—at one point lowering the federal funds rate to 1%—contributed significantly to the situation the country finds itself in. The sheer cheapness of this money created various pressures on lenders to loosen lending criteria, which in turn led to massive real estate speculation, artificially inflated housing values, and, now that the gap with actual economic demand has appeared, a necessary and precipitous drop in prices.

The real estate boom is not the only bubble in our economic life, and when these other credit-driven bubbles burst, similar episodes of liquidation will be necessary. Federal subsidization, coupled with immaturity and ignorance by borrowers, has created another bubble that threatens to burst: the great education bubble.

The latter half of the Twentieth Century saw an explosion of spending on higher education in the United States. In 1900, 2 percent of young people went to college; today the figure is above 30 percent. The GI Bill was a major factor in expanding college attendance after WWII. Avoiding the Vietnam draft was another. In the years since, the number of educational institutions and offerings has expanded. Many small teachers colleges became universities. Notoriously unrigorous “for profit” universities appeared on the scene, while large and growing McUniversities are found in every state. The cost of education has risen as well; indeed, its rate of increase has been an order of magnitude larger than the rate of inflation in the economy in general. All of this growth—in institutions, students, degrees, and educational investment—is driven by government-subsidized credit. As in housing, excessively cheap credit has created the conditions for a correction, which will lead to fewer universities, fewer degrees, recalibrated obligations to lenders, and, one should hope, a change in the culture of higher education in the United States.

While it is laudable that the United States has become more of a meritocracy with few barriers to higher education for qualified students, human nature has not changed. Only a small percentage of students can really benefit from a traditional university education. Yet many people desire a degree for themselves and their children, even if they don’t really care for what it entails. Underlying much of what we do is the great American goal of upward mobility and social respectability. As Paul Fussell observed, “In the absence of a system of hereditary ranks and titles, without a tradition of honors conferred by a monarch, and with no well-known status ladder even of high-class regiments to confer various degrees of cachet, Americans have had to depend for their mechanism of snobbery far more than other peoples on their college and university hierarchy. In this country, just about all that’s finally available as a fount of honor is the institution of higher learning.”

Nonetheless, money is the other great American passion. People do not attend modern-day universities to study Plato and Aristotle and Shakespeare, so much as to get a credential to obtain a job that gives them access to an upper-middle class standard of living: plasma TVs, SUVs, a nice house in a neighborhood with “good schools,” and the like. To accommodate the ranks of modestly intellectual students, educational offerings have become more vocational in nature, including such dubious degrees as “packaging” and “communications.” For a lot of reasons, university graduates typically earn more than those without a degree. Aspirants and their parents have reasoned that the degree itself—rather than what it once signified in terms of IQ and work ethic—is the key. This is a confusion of correlation and causation of the worst kind. Parents and students of modest IQ are starting to realize that a $100,000 student loan obligation for a Nova University degree hinders, more than it advances, the goal of an upper-middle class lifestyle. As this happens, this racket will collapse. Indeed, the decline of male university attendance suggests that this process is already underway.

There are several reasons why degrees, universities, and the associated educational debt load have risen so dramatically. For starters, the folk wisdom persists that universities are a good investment. More important, the young decision-makers can delay payments on their student loans for four years or more. As with adjustable-rate mortgages, the tangible pain and constraints of these obligations are hard to fathom. Unrealistic hope about future earnings encourages additional indebtedness, and everyone from parents to guidance counselors encourages educational investments without rigorously considering the student, institution, and degree involved. Students and their parents may even reason that the market itself is revealing the relevant information and that the return on investment must be some multiple of the present-day cost of education.

Universities certainly have little incentive not to maximize tuition and encourage the acquisition of student loans, because they get paid up front by students. Subsequent defaults do not affect the university and only create burdens for the lender, the borrower, and the federal student loan insurance programs. Finally, the near impossibility of escaping student loans—even through bankruptcy—is not fully appreciated by borrowers. Everything from deficiency judgments on mortgages to credit card profligacy can be set aside through bankruptcy, but not student loans.

Lenders are already starting to get the message. As the federal government has withdrawn its once-generous support for student loans—through the (now private) Sallie Mae Corporation and its insurance programs—banks have tightened up lending criteria as well. While Harvard and Princeton students will still do fine, folks at the Novas, Fiskes, and Culinary Institutes of America will face more rigorous scrutiny. Many lenders will likely leave the field altogether, a process already begun because of the meltdown of secondary markets in securitized loans of all kinds, including educational loans.

But the biggest revolution will have to come from students and their parents. As young college graduates working as secretaries, paralegals, and restaurant managers see the kids from shop class, military technical skills programs, and other “blue collar” fields buying nicer homes and nicer cars, taking more vacations, and generally doing well, word will trickle down to prospective students and their parents. This is the essential thesis of the best-selling book The Millionaire Next Door. Hoary and well-meaning advice about education notwithstanding, it’s becoming clear that the most secure jobs will remain those that must be done locally. Everything that involves a mobile product—from programming to engineering and other fields that require a college education—has been pummeled by outsourcing and will remain less appealing. If it happened to telephone engineers in the late 1990s, why not accountants and marketers tomorrow?

The declining economic fortunes of college graduates, coupled with tales of white-collar drudgery, suggest that necessary, high-skill blue-collar jobs—plumbing, car repair, cable installation—will become more appealing and more remunerative. Graduates of high school and community college vocational programs, far from being doomed to second-class status, are starting to have the last laugh as marginal college graduates (and dropouts) default on their student loans. As information about inflated degrees spreads, businesses will likely drop college-degree requirements to obtain quality employees, instead giving high marks in “two year” training programs more respect than four years of partying at State U. Not available when Fussell wrote Class, the U.S. News college rankings have done much to demonstrate to the general public how little a degree from a third- or fourth-tier institution is worth.

A certain percentage of students belong in college, benefit from it, and earn higher wages afterwards. Their ultimate life successes stem from the qualities that got them into a top school: brains, a work ethic, comfort with complexity, and creativity. An education aimed at these kinds of skills and abilities is not for everybody. And this is not a tragedy. In America, folks of average intelligence are neither handicapped nor destined to a life of unproductivity. An advanced economy like that of the United States provides opportunities for nearly everyone to do well, in everything from service occupations to repairing complicated devices to people-centered work. But most of these occupations do not justify or require a four-year degree. The reasonable (though modest) incomes of these positions make a $75–150K student loan burden a very real one.

Government insurance, government subsidies, borrower ignorance, and a culture of excessive optimism have done their part to create a distorted market filled with mediocre colleges and mediocre college graduates. Just as the McMansion bubble couldn’t last forever, neither will the education bubble as defaults from low-earning borrowers put extreme stress on lenders to marginal applicants. The borrowers themselves may soon create a political constituency demanding relief in a way that shifts costs from the government to private lenders in the form of more generous bankruptcy protections. This will be easier for liberal politicians to promote than a government bailout, but it won’t be good news for the University of Phoenixes out there, nor all the other mediocre institutions propped up by disappearing cheap credit.

The Relativist Roots of Libertarianism
Christopher Roach, May 6, 2008

Conservatism is sometimes criticized as unprincipled, relativistic, or contradictory. This criticism stems from the very nature of conservatism; it is a philosophy rooted primarily in attitudes about change, so its starting point is always a given society, as it is found in all of its contradictions. Conservatism recognizes how the various goals of civilized, social life are often at odds with one another. These include: good government, liberty, order, safety, national security, prosperity, culture, and a virtuous and educated people.

Libertarians purport to be more principled and consistent defenders of natural rights. They are undoubtedly consistent. As demonstrated in the earlier thread on the drug war, many libertarians view conservative concerns for order and virtue as unprincipled encroachments on the absolute primacy of liberty. Far from being relativists, libertarians imagine themselves to be the most principled voices in political life. Regardless of its “consistency,” popular libertarianism today is unmoored from the classical tradition of political philosophy, and often stands on a relativist epistemology to justify its various formulae. This relativism makes the libertarian philosophy both unstable and unpersuasive to people of a conservative bent.

Relativism is not necessarily the same thing as nihilism. One can be a relativist and have certain beliefs one lives by and wishes others would live by too. Many libertarians from K-Street to Coeur d’Alene live moderate, decent lives. A libertarian’s relativism stems rather from the view that we cannot and should not “impose moral behavior” on individuals. The foundation of this view is often a belief that there is no certain moral knowledge about how to live one’s life that is true across time, place, and individuals. Thus, a relativist might agree that polygamy is wrong in our culture (or at least for himself), but he wouldn’t feel confident saying it’s wrong for a tribe of savages (or rural Mormons).

Many libertarian philosophies are founded on a kind of limited relativism. That is, while these libertarians say there are universal and certain principles of natural rights that we can discern from the nature of man and government, we cannot obtain certain moral knowledge in other areas of human life, particularly in regard to behaviors without direct effects on others. Thus, while we can know with certainty it’s wrong to rob and murder and rape (and tax and conscript in a national emergency etc., etc.), we cannot know with any certainty that it’s wrong to engage in polygamy, drunkenness, drug addiction, or usury. And, moreover, even if there were some knowledge in any of these areas, this viewpoint says that it would be wrong to give government power over those areas, because a moral principle of autonomy would be offended, which gives each of us the right to do things that are bad for us and choose our own path, so long as we don’t (directly) harm anyone else.

As commenter McBrown said in the earlier discussion, “That’s exactly the point. Who decides what’s ‘salutary’? If you take your comment to its next step, alcohol and cigarettes should be illegal.” I think there is a major theoretical problem with this view, even assuming McBrown has enjoyed the pleasures of smoking and drinking. For starters, the answer to the rhetorical question “Who decides?” is the people, seeking answers in good faith through their elected representatives. If small government liberalism represents one important leg of the stool on which the historical American republic rested, “classical republicanism” is the other. The Founders sought liberty on the national and international level so that localities could self-govern, creating communities with laws that aimed to make men more virtuous through such various prohibitions as blue laws, laws against sodomy, laws regulating markets, and even laws requiring firearm ownership as a means of maintaining “a well regulated militia.” The motto was not “taxation is theft” but, rather, that among a free people there should be “no taxation without representation.” The model of freedom was a corporate one: a free people had a right to govern itself.

Much libertarianism expresses a lack of confidence that a republican government can set up a legal regime to stop citizens from engaging in immoral and self-destructive behavior without devolving into an oppressive and meddlesome tyranny. Historically, the American balance on this score was sound. Usury, sodomy, most gambling, and various drugs were outlawed or regulated over the course of the Nineteenth Century, while freely negotiated wages, firearms, religious matters, and education were largely left up to the choices of individuals and families. The premise behind historical “morals legislation” is that we can have reasonably certain moral knowledge on a wide range of matters where laws are long established and effective at tamping down the worst human excesses. Such legislation, while it deprives individuals of pure freedom of action, does not do them harm because the conduct in question is objectively harmful. Defenders of the historical Anglo-American liberties often repeated the formula—from John Locke himself—that “liberty is not license.”

Now the whole basis of such a “natural law” view is that we have a nature, the essentials of which can be understood (and have long been understood), that this nature implies right and wrong uses of human capacities, and that we can know with some confidence what is harmful to individuals, even when their impulsive free will might compel them to harm themselves. In other words, some behaviors are so obviously self-destructive and socially useless that there is no reason not to make them unlawful, or, as is more often the case, keep them from being lawful. This includes everything from suicide and polygamy to bestiality and gay “marriage.” The same understanding of nature which allows us to understand that these things are wrong also allows us to understand something like natural rights. Both insights are founded on a classical view that political philosophy (and the limits of state action) must derive from an understanding of the inherent purposes of human life.

The nature of human action and moral responsibility has long been known through introspection and observation, as well as the accumulated wisdom of Christendom. The fundamentals of this understanding won’t be changing any time soon. There are simply certain areas of long-established law where the only justification for change would be an unflagging concern for consistency. Libertarianism does not allow any deviation from an unpersuasive claim: that the vast majority of historical laws, even laws that we associate with the happiest moments of American history, are indistinguishable from the pervasive invasions of property rights, free speech, and free association that characterize the modern state.

To libertarians skeptical of laws based on a concrete conception of virtuous human behavior, I would ask the following: if we cannot achieve any certain moral knowledge in the areas of traditional morals legislation, on what basis can we obtain any certain knowledge in the other areas of libertarian concern, namely the traditional protection of life, liberty, and property and the restriction of government from any interference in these areas, broadly understood? And if we can achieve certainty in the case of the latter, why not the former? This question should be considered separate from whether or not we choose to enact and enforce a particular piece of legislation as a prudential matter.

Like all Americans, I think a great deal of paternalistic legislation is ill-advised, tendentious, or useless as a matter of policy. But I do not believe this of the subjects of traditional morals legislation (and laws against hard drugs), for a fairly simple reason: those behaviors are wrong, self-destructive, destructive of human reason, and serve no function beyond mere hedonism. More important, by rendering citizens unhealthy addicts, such behaviors (and the purveyors of the same) indulge in the very thing states exist to control: the pursuit of individual good in a way that harms the common welfare of the society.

This historical appraisal of the matter underlies the practical “libertarianism” that many small government conservatives embrace. This view—the ideas of Reagan and Goldwater—is that it’s better to establish a strong consensus against ambitious, do-gooder legislation, in spite of the certainty of our moral knowledge, because we would not want to establish a precedent whereby my neighbor, with whom I may someday disagree, could then constrain me in the manner he sees fit on the basis of his subjectively certain, but defective, reasoning. In a society with moral dissensus, this consensus around the limits of state action is doubly important. As an Irish Catholic, I can think of no better historical example of where this went wrong than Prohibition. But Americans found their way out of that desert, just as they turned from the evil of slavery. This pragmatic commitment to limited government is a much narrower view than the deontological pronouncements of a doctrinaire libertarian, schooled on simple formulae cribbed from John Stuart Mill and Reason magazine, that unwisely compare necessary taxation to theft and criminal punishments to assault and kidnapping.

Liberty is not the only goal of good government. As Burke said regarding that great, failed experiment in tragically consistent governance in France:

I should, therefore, suspend my congratulations on the new liberty of France until I was informed how it had been combined with government, with public force, with the discipline and obedience of armies, with the collection of an effective and well-distributed revenue, with morality and religion, with the solidity of property, with peace and order, with civil and social manners. All these (in their way) are good things, too, and without them liberty is not a benefit whilst it lasts, and is not likely to continue long. The effect of liberty to individuals is that they may do what they please; we ought to see what it will please them to do, before we risk congratulations which may be soon turned into complaints.

Libertarians up in arms against the drug war purport not to care too much about how people exercise their freedom. This stands in sharp contrast to the old view that a free government required a robust civil society. Classical liberals from Sir Henry Maine to James Fitzjames Stephen understood that social opprobrium must take up the slack created by laws that afford ample freedom. The new view, by contrast, appears to be as much against private discrimination as legal restraint, amounting in the end to a merely adolescent and thoroughly ordinary celebration of hedonism.

The superadded egalitarian concerns of the so-called “cosmotarians”—which led to their breaking ranks with Ron Paul—show that they are substantially confused on the level of principle: confused about the actual and natural tensions between liberty and equality, as well as the proper role of opinion in regulating social life. It is noteworthy that the most liberal societies in history—18th and 19th Century America and England—were replete with inequality and characterized by austere, religious cultures. By contrast, anti-clerical “liberals” on the Continent were concerned as much with equality as liberty, and their various revolutions predictably did not fare quite as well, often devolving into bloodbaths, whether in France, Germany, Mexico, or anywhere else philosopher-kings tried to marry the antithetical principles of liberty and equality.

Since the tone of our age is one where concern for equality is lauded, while the virtues associated with liberty—individuality, merit, ambition, self-reliance—are frequently condemned, it’s noteworthy that the new generation of K-Street libertarians can’t even bring themselves to defend free speech when that speech is politically incorrect and may offend some preferred minority group. The end result is a mockery of good sense and good government: strident demands for the government not to police morality are coupled with increasing indifference among self-styled libertarians to the state’s interference in private institutions, businesses, and clubs because of the au courant moral principle of equality. Yet most people in the past, as now, supported limited government for “bread and butter” reasons, including the right to make a living, worship God, and raise a family in peace. A politically effective movement of these productive people will always have to emphasize the necessity of liberty and limited government to accomplish these things, rather than any abstract appeal where the dominant images of “liberty’s triumph” are pot-smoking hippies, blinged-out drug dealers, and other anti-social trash.

The War on Drugs Has Its Advantages
Christopher Roach, April 29, 2008

Though not conservatives, libertarians nonetheless rely on selective portrayals of the past to support their fantasies about drug legalization. They are particularly fond of painting a frightening picture of the drug war replete with thuggish cops, draconian sentences, and an authoritarian new America. They contrast this picture with the recent past, a time without SWAT teams and drug-sniffing dogs. But do these accounts square with reality? Was the policing of the past any more “libertarian” than the present?

The state of a society at any given point in time is typically mixed. We are often declining in one area while improving in another. As Burke noted, “The science of constructing a commonwealth, or renovating it, or reforming it, is, like every other experimental science, not to be taught a priori. Nor is it a short experience that can instruct us in that practical science, because the real effects of moral causes are not always immediate; but that which in the first instance is prejudicial may be excellent in its remoter operation, and its excellence may arise even from the ill effects it produces in the beginning.” Criticisms of the drug war, lodged by libertarians and liberals alike, are too often maudlin and sentimental, ignoring the ways the drug war has mitigated the mistakes of yesterday’s liberals. Critics are particularly fond of misleading anecdotes, ignoring that the policing of the past was harsher than that of the present.

Consider Radley Balko, an archetypal libertarian and sometime staffer at the storied Cato Institute. In a recent entry, he decries police “militarization” as represented by the provision of scary uniforms to a police anti-gang unit. This is an instance where libertarians’ rationalism leads them into error when trying to make sense of the real world. Laws and law enforcement do not solely involve questions of abstract liberty. The question is whether law enforcement leads to an overall improvement in the quality of life: safety, prosperity, and, yes, liberty too. Obviously, most people do not want to be mistreated by police. But most people are law-abiding, and they do want criminals to be scared of police.

Popular attitudes about crime and law enforcement recognize that criminals are atypical. Thus, most people sympathize more with folks like themselves busted on au courant criminal environmental regulations or for carrying a gun for self-defense—i.e., the behavior of typical law-abiding people—than they do when typical criminals are busted for pimping prostitutes or selling dope. The fundamental error of libertarianism is a monomaniacal focus on the state; libertarians ignore the fact that we lose our practical freedom equally in a time of high crime and disorder. The barricaded urban apartments of the 1970s were just as much a signal of restrained freedom as our W-2 forms are today.

Whatever one thinks about whether drugs should, on balance, be legal, most people recognize that drug dealers are law-breakers who are often vicious and violent. Their violation of the drug laws is not so much an expression of their natural rights as it is the manifestation of their naturally anti-social characteristics. Popular movements to increase penalties for drug dealers and to expand the capabilities of police are sensible and healthy.

Before the drug war and the rise of paramilitary SWAT and anti-gang units, police shootings were higher in absolute terms than they are at present. Consider the chart below: police shot and killed more suspects in the 1970s than they do today, even though the country then had nearly 100 million fewer people.

I’ve written about this before, but Radley and other libertarians continue the specious technique of assembling anecdotes about negligent discharges and rogue cops, as if these cherry-picked examples can refute the statistical reality represented above.

Libertarians are correct that the drug war often entails long mandatory sentences, that its impact falls disproportionately on young minorities, and that these crimes, strictly speaking, are not violent. Yet violent crime has dropped in recent years. Consider the data below:

There’s a logical reason that drug laws have been effective at gathering up society’s trash. It’s much harder to prove burglary, rape, and murder than it is to prove a drug offense. The evidence, for starters, is easier to come by. It’s much easier to find a kilo of cocaine in a car trunk or a few rocks of crack in a pack of Newports than it is to match offender DNA or otherwise prove a violent offense. Drug convictions are analogous to punishing Al Capone for tax evasion.

The extent to which drug offenders re-offend (or would offend) violently is harder to predict with any exactness. That is, the net cast by the war on drugs is certainly an overinclusive one, handing harmless mules and big-purchasing users long sentences in ways that do not make sense in particular cases. But the aggregate result is telling: violent crime has dropped markedly as the rate of long-term drug incarceration has risen.

Libertarians ignore another factor in their account of the war on drugs. Lower rates of incarceration in the past mask the fact that the total institutionalization rate of the past was much higher. In other words, more people were in loony bins in the pre-drug-war era. For the rest of us to enjoy the freedom that comes with safety from crime, it is important that the smallish percentage of societal misfits are identified and locked up. Today’s misfits are often locked up for dealing drugs; in the past, they could be easily labeled crazy and put away for life. Surely this aspect of the present-day regime is more libertarian (and overall much better) than that of the 1940s and 1950s, when forced mental institutionalization could take place on a relatively flimsy showing without any criminal behavior having taken place. This important fact doesn’t fit the script, however, and the libertarians instead project their vision of an ideal society onto the much more complicated reality of the past. Instead of sanitariums and the “third degree,” the past is painted as a time of Officer Friendly and brief stays at orderly, safe prisons.

In short, America locked up more people in the past, and law enforcement was frequently more violent in its tactics. Law enforcement effectiveness declined under the impact of liberal utopianism in the 1960s and 1970s. Fed up with high crime, and identifying the culprit in drug gangs, various common sense reforms led to longer sentences for drug pushers. In addition to punishing an inherently predatory and anti-social behavior, these sentences have had the happy byproduct of locking up the self-identified law-breaking young men—most of whom are minorities—many of whom would otherwise be committing violent crimes.

Ron Paul and other critics of the present drug war would do well to explain what aspects of the past law enforcement balance they would restore, which they would reject, and how they would continue to suppress violent crime that the drug war is now tamping down quite effectively, albeit indirectly.

Fads, Tradition, and Real Knowledge
By Christopher Roach (chris@takimag.com), April 24, 2008

One of the most distracting phenomena of modern times is a kind of “hyper-skepticism.” For example, well-known truths—that minorities commit more crime, that men are stronger than women, that many criminals can’t be rehabilitated—are met with demands for statistics, studies, and the like. If a study can’t be found, this often grinds a debate to a halt. Recall the interminable discussions during the 1980s about whether the death penalty had any deterrent effect. The lack of such a “study” supposedly disproved what common sense tells us: hanging people from lamp-posts works, pour encourager les autres.

I’m skeptical of the claim that scientific studies or other secondary tools are the most efficient means of coming up with the right answers. Consider the recent mania for online predictive markets. Somehow the behavior of these bettors is supposed to be superior to that of seasoned observers, common sense, and the like.

Of course, real markets are efficient at pricing things that people pay for. And good studies do measure what can be put into numbers. But information does not always come in such a format. People have other ways of receiving and sharing information in non-market arenas, such as war, courtship, and personal safety. Indeed, the most complex fields defy logic, formulae, and various rationalist short-cuts. Men make do most especially with the gifts of tradition and intuition.

I realize a worthy goal of social science is to move beyond mere intuition to actual knowledge, but decisions must be made all the time without the benefit of double blind studies—studies complete with regressions, peer review, and all the rest. We can neither be paralyzed by their absence, nor foolish enough to think we “know nothing at all” about matters long considered simply because of the format of our knowledge in traditions and folk-ways.

Moreover, traditional understandings benefit from being less certain and less infused with the patina of science than the pronouncements of avant-garde social scientists. Consider all the ways we know a neighborhood is “bad”: the quality of the real estate, the clothing of its denizens, the number of aimless young men, the presence of dirty cars, the loudness of music, the number of police, the number of young children with tired-looking single mothers, the presence of Newport t-shirts, boarded-up businesses, its reputation, etc. In other words, most people can navigate their way through the world without social science. Worse, because social science is burdened by politically correct blinders, the natural implications of the data these observers amass are ignored, suppressed, or otherwise explained away by overly complex (and false) analysis. Imagine we wanted to know, for example, whether a neighborhood was dangerous. Instead of a survey or statistical data from criminologists, we would be better served by seeing whether the “Club” anti-theft device was in common use.

In spite of the shortcomings and numerous harebrained failures of social science, important decisions on crime, sexual behavior, education, the institution of marriage, the production of CO2, and the like are increasingly put into the hands of social scientists, whose teachings are often far inferior to the crystallized common sense expressed in the “way things were always done.”

As Burke famously noted, “We know that we have made no discoveries, and we think that no discoveries are to be made, in morality; nor many in the great principles of government, nor in the ideas of liberty, which were understood long before we were born, altogether as well as they will be after the grave has heaped its mould upon our presumption, and the silent tomb shall have imposed its law on our pert loquacity.” In other words, any properly understood social science has a limited role, fine-tuning what is already known.

Why is this exactly? Social traditions that are around today (or in a very recent yesterday) exist in a kind of Darwinian competition. They would not persist into the present if they did not serve society well, because a tradition disappears if it is wrongly attuned to nature and circumstances. Over time, it may change, but its very existence coupled with its origin in “time immemorial” suggests its value. Far from proving that traditions are useless or infinitely malleable, their rapid envelopment by short-term fads in behavior—feminism, mass promiscuity, pacifism, economic redistributionism, multiculturalism—only shows that traditions and societies are fragile things. It would be shocking if these even more fragile and unstable alternatives survived two or three generations. Witness how post-Christian Europe, for example, is essentially contracepting itself out of existence.

Studies, derivative markets, and other predictions are only as good as the tools used by their participants. Studies and statistical tools that contradict hard-won knowledge, common sense, or the behavior of people with “skin in the game” are almost always dangerous and ultimately wrong. Notice the lemming-like behavior in mortgage markets that preceded our current over-building disaster. Much of this was rooted in “econometric” models that ignored any theoretical understanding of how economies worked and blindly plugged in data to predict what would happen in the future. These models were based solely on trend lines extrapolated from the present. This rather obvious stupidity did not work, and failures became manifest in a short period of three years. Among others, the “sophisticated” banks whose staffs are full of econometricians are bleeding money.

Worse, this stupidity has filtered down into the society. Far from advising folks to diversify, save, and live moderately, the speculators and their studies are giving justification to the worst instincts of common people. “Mad Money” is creating an entire nation of leveraged (i.e., massively in debt) people trying to get rich quick. Instead, they are ending up shipwrecked in a declining market in front of their overpriced homes and overused E*Trade terminals. Any kind of deferred gratification, economic horse sense, or other traditional restraints on bad behavior are going by the wayside. And our short-term memories—the tech boom was only ten years ago—suggest some other manic overinvestment scheme is on the horizon. When such a trend emerges, various studies, statistics, and the false confidence they bring will enable participants and policymakers to again lose their minds.

The Charles Murrays and John Lotts of the world have done something useful in supporting conservative intuitions on IQ, welfare, crime, and gun control through rigorous social science. At the same time, we should never put too much stock in the untried, the novel, and the counter-intuitive. Common sense, tradition, and skepticism should be hallmarks of any real conservatism.

“Never Again” Nation
By Christopher Roach (chris@takimag.com), April 15, 2008

Our view of what kind of nation we are is related to the question of “nationalism.” Are we a normal nation? A “creedal” nation? An “exceptional” nation? For many on the left and the neoconservative right, America is only authentic and just when it uses its immense power in a selfless ideological struggle on behalf of the powerless. This view of the West is a major influence on the neoconservatives, whose historical memory finds an especially important turn in 1939. For them, this is the year when the West, and America in particular, became morally suspect by failing to help European Jews and to put down the dreaded old nationalist forces that the neoconservatives’ parents had recently fled.

A surefire way to get American politicians to take notice of some problem in the world is to tell them it’s a Second Holocaust. Americans and Europeans meekly accept the charge from the Nazis’ Jewish victims that this episode was as much a moral failing of “bystanders” as it was the responsibility of the perpetrators themselves, and that therefore the whole world should stand united in the future when such events occur. We are told that a surplus of nationalism leads to selfishness and indifference and that ultimately such feelings lead to the greatest symbol of evil in the Western World. Post-national states must intervene, militarily if need be, so that such an atrocity would happen “never again.” Neoconservatism’s twists and turns may best be explained as follows: their views of American national identity and foreign policy must always yield an interventionist and open-borders response to the events of 1939; all other views must be rejected as inadequate.

They’ve made some headway with this critique, because the Holocaust is the chief agreed-upon symbol of evil in the moral imagination of the Western World. And this symbol is sometimes used, particularly by the far left, to show the fundamental moral failings of the Western World (as opposed to showing the failings only of some of its members). For them, the Holocaust is the Evil Western World’s apotheosis, the culmination of the crusades, witch burnings, slavery, pogroms, mistreatment of indigenous peoples, etc. Of course, we all agree that this evil event should not happen to this group again.

But much more is required.

Equality and nondiscrimination demand that one puts the citizenship of one’s countrymen on an equal plane with that of strangers. The measure of our worth will not be the advancement of something so parochial as our national security and commonwealth, but, rather, will consist only in the elimination of any distinction between ourselves and the other. This distinction is supposedly the root of all discrimination, all racism, all ethnocentrism, and, by implication, is the root of the Holocaust itself. The neoconservative and idealist agenda is as much a test of our own moral integrity and commitment, as it is a formula for political and foreign policy.

For the neoconservatives’ conservatism is not about conserving anything tangible and historical. It is, instead, about the march of abstractions: Free Markets, Democracy, Color Blindness, Tolerance. America can be defined as a few slogans. Under this grandiose philosophy, a government’s role is not to advance the parochial and particular good of America, even when its interest is as basic as self-defense. It’s instead to support the triumph of these universal values. We all are being asked to take one for the team. And the team is not our country. The team is the whole human race, which would supposedly recoil in horror if we behaved like a normal, self-interested society.

Why else have we not done more to deport illegals after 9/11? Why else hasn’t Bush spoken out forcefully about the Muslim overreaction to a few cartoons in an obscure Danish paper? Why else do people in other nations (such as Nigerian Christians) react so differently and more predictably compared to Westerners when they’re harassed by Muslim minorities? Why else do we help Muslims in Kosovo and Iraq, when it’s so obvious these people are hostile to us, our religion (or what’s left of it), and our way of life?

Like so much else in liberalism, our objective decline and endangerment is described as the march of universal justice. Our meek defenses are recast as offensive “attacks.” This is why James Burnham called liberalism an “ideology of western suicide.” It functions to redefine our destruction as a good thing that we should welcome. This decline serves another function, a spiritual function. We can take solace in our decline as atonement for our participation in a crime that is widely reputed to be the worst in human history, the moral dagger at the heart of the Western World’s pretensions of morality.

Let’s consider reality, though. Gallantry, heroism, and expensive support for strangers are simply too much to ask from the general lot of nations. It’s an unrealistic demand that misdiagnoses the roots of the Holocaust—revolutionary ideology and disregard for Christian limits on state action—while it also misunderstands the real costs that such a “do gooder” ideology imposes not only on one’s own citizens but on foreigners too.

Because when “nations” stop wars and genocides, they do not do so collectively. It is their soldiers, whose interests are a public trust. When “nations” take on refugees, it is not the nation, but individuals and communities that are affected. It is Newark and Wausau and Minneapolis that must absorb the Central American, Hmong, and Somali refugees respectively. Acts of generosity and heroism are noble sentiments that should be praised and encouraged and remembered among communities and individuals. Yet they are rare. They should not be imposed by a faction on a nation’s soldiers and small towns without some proportionate benefit to the nation. And the more common absence of these qualities in nations and individuals should not be an occasion for condemnation by “armchair Oscar Schindlers.”

I also question the ultimate moral calculus of these moralizers. The idea that America or other large nations should “do something” when evil is afoot is the chief reason petty border squabbles in the Balkans can metastasize into something like World War I. In the name of creating world unity against aggression, the interventionists instead create a formula for perpetual and ever larger wars fought by enormous coalitions of people with no direct stake in the conflict. This is madness. Yet this is the fundamental premise of the United Nations, the “New World Order,” and the neoconservatives’ “idealist” foreign policy.

Rejecting this reasoning is only possible when one is a nationalist with a sense of greater responsibility, loyalty, and love to one’s own than to foreigners. This is a perfectly natural love and is immediately tangible when one travels overseas. It’s more than mere patriotism. It requires not just love of one’s own, but rejection of an alluring suitor: the siren song of “universal human brotherhood.” Today, the alternative to nationalism is not localism so much as it is two bad alternatives: a descent into primitive tribalism at home with no sense of common interest between different ethnic groups and social classes, or a sentimental globalism that devalues national pride and glorifies run-away materialism. It’s a pincer movement with Spike Lee on one side and Benetton on the other. Healthy nationalism is an antidote to both of these unsustainable extremes.

If we accept this view, we must revisit the solemn invocation: “never again.” Because if “never again” means we must always go to war to protect the weak from the strong—because Americans, Britons, Eastern European Jews, or Bosniaks must never be valued differently by Americans in this moral calculus—then we’ll always be at war everywhere. Our people will suffer. And we may find ourselves victimized in turn for having created new enemies. Worse, we may be unwittingly strengthening future victimizers posing as victims in far flung locales involving people we know almost nothing about. Consider Iraq as an example of a “humanitarian war” gone awry: who are the good guys again? Is it the Shiites? The Sunnis? Or was that last week?

Most saliently, we should look to how a real ethnostate behaves. Israel, the chief cheerleader for “never again” politics, turned away Sudanese refugees in 2007 in spite of the atrocities they were fleeing. If Israel expends its resources so parsimoniously on behalf of strangers in need, how persuasive is the claim from its supporters that we must do the same for strangers the globe over? In light of this shabby treatment of the Sudanese, how persuasive is the associated claim that America owes Israel substantial military and financial support to atone for our “earlier failing” to intervene more quickly during the Holocaust?

It’s all a bunch of double standards. No one can follow them. So the principles should be revisited. And we should wise up so that Americans and the West do not get brow-beaten into doing things that no sane nation, not least the Israelis, would ever do with its immigration and foreign policies. The start of this critique must be some sense of nationhood, which is to say, some distinct sense of self that prioritizes ourselves, our loyalties, and our proper group concerns above those of every other nation and above those of every imploring claimant.

When we look at wars like Iraq and Kosovo, we should indeed say “never again”: Never again will America be guilt-tripped into doing something so stupid by the flawed “blank check” slogan: “Never Again.”