Historical Perspectives on World Affairs

Last night, as the fireworks of July 4 were fizzling out, CNN reporter Andrew Kaczynski posted a troubling update to the saga of the now-infamous wrestling video that President Trump tweeted several days ago. This video was a clip from Wrestlemania XXIII in which Trump attacked WWE chairman Vince McMahon, but with McMahon’s face replaced by the CNN logo.

No time of year brings on pangs of nostalgia like the holidays. Lights, decorations, timeless songs, shared rituals and traditions – they provide a sense of comfort in the coldest, darkest days of the Northern Hemisphere’s winter, but they also provide a sense of meaning. For the secular-minded, holidays like Hanukkah and Christmas are occasions for fellowship, but they originated to commemorate events that remind Jews and Christians of who they are. Beyond merely reinforcing the group identity that comes with those particular labels, these times are meaningful because they reenact the past in order to impart meaning to the present, for meaning is not possible without history.

And yet, the meaning we find in the past can be seductive – and dangerous. If the present seems to provide no real meaning, then we start to look for meaning only in the past. If we dream of restoring our ideal version of the past, the present state of the world becomes an enemy, something that has to be destroyed before we can put things back the way they were – the way that we believe they were always supposed to be. Unfortunately, the past, by virtue of not being around anymore, effectively exists only in our imaginations. If we tear apart what we have in the here and now, our imagined past will not return. If we realize that and yet still keep tearing away, then we have succumbed to nihilism.

I see Trump, Brexit, and the general rise of illiberalism in the Western world today as products of a nostalgic trend that threatens to lead us down the path to nihilism. These movements turn their backs on the managerial-liberal politics that has prevailed in the West since the end of the Cold War, but without advocating new institutions or practices. Instead, they want to unravel existing norms and structures in hopes that once they do away with supposed deviations like multiculturalism or free trade, a rebirth of national greatness will ensue. I place no hope in such dreams.

Last week I talked a bit about Eric Voegelin’s philosophy of history, especially the role of what Voegelin called “Gnostic” movements. In Voegelin’s terminology, Gnostics are those who believe that they must destroy Babylon – the present unjust order of things – so that the new Jerusalem – a perfect and godly society – can come into being. In Voegelin’s most famous turn of phrase, the Gnostics believe that they alone can “immanentize the eschaton.” Secularization did not destroy the Gnostic schema of history, but rather appropriated it, incorporating it into teleological political movements like Nazism and Marxism.

There is also a teleological aspect to managerial liberalism, even if it does not valorize heroic struggle like Nazism or Marxism, nor aim at a rebirth of any particular ideal moment in the past. Liberalism’s 18th-century origins placed it in opposition to monarchy, aristocracy, and established religion, the forces that had guided European civilization for more than a millennium. While liberalism might look to the past for lessons on things not to do, it rejects the idea of a past utopia as myth, conceiving of the world it is trying to create as something fundamentally new. Liberalism is novelty opposing nostalgia, Babylon at war with Jerusalem.

A wholehearted embrace of novelty, and especially a devotion to “progress,” is dangerous, as it can lead one to believe that the past is irrelevant and that human will is infallible: whatever we imagine, we can create, and it will all function exactly as we had planned. But nostalgia is an equally dangerous chimera. The warm, fuzzy feeling you get when you think about the past is a product of present action, not a memory in and of itself. It is not the way you felt then, but a feeling that comes through the present act of reminiscing. You can’t get that feeling back, because it only comes from looking backward. Turn away from the present in a failed attempt to recreate the past and you will be left with nothing.

Whether the promised earthly paradise is a new creation or a re-creation, I don’t believe that human agency can create it. In his Confessions, the fifth-century bishop and theologian Augustine of Hippo defined the essence of sin as man’s desire to be like God, to live without limitations and make the world in our own image. Achieving this is impossible and yet, as Augustine said, “We can’t not sin” (Non possumus non peccare). Our very nature means that we can never be rid of that desire, and that realization was what led Augustine to Christ. It is not an exclusively Christian problem, though, as Plato wrestled with similar dilemmas centuries earlier: we can perceive higher truths, but we remain perpetually bound by the limits of material existence. This places us in an in-between state that the Greeks called metaxy.

Our place in history is also one of metaxy: our striving is not fruitless, but building the new Jerusalem is beyond our abilities. Like the philosophers, I believe that an ethic of in-betweenness – of moderation – can help make our metaxic state more tolerable. Moderately pursuing a tolerable existence will not sound very exciting to anyone who believes in the necessity of changing the world, as it denies us the exhilaration – and the sense of immanent meaning – that comes from seeing oneself as a protagonist in a heroic struggle of world-historical import. Instead, it requires a belief that meaning is transcendent – that it is somewhere out there, but never fully perceived, never wholly enacted. We may want to see it represented in the here and now, but the best we can do is arrive at some sort of crude approximation. In this manner, philosophy (in all its domains, history and theology among them) is a kind of faith, but its practice ought to inspire humility more than zeal. And my humble vision of a future utopia is merely a world in which we would all have to contend with slightly less zeal.

I’ve had long stays in Spain, Germany, and France, and have an intense scholarly and personal interest in the histories and cultures of all three countries. And yet I would sooner proclaim myself an Anglophile than a “-phile” of any of those other three, even though I’ve spent far less time in the UK. I engage with other countries on a more critical, less emotional level, but I can’t help but feel stirrings of romanticism in England that I don’t feel elsewhere. It is, of course, Shakespeare’s “sceptr’d isle, / This earth of majesty, this seat of Mars, / This other Eden, demi-paradise….” The music of the British Invasion, which I discovered as a teenager, is to me no less evocative than anything by Blake or the Bard. All these things made my first visit to London a kind of pilgrimage. Inspired by the Kinks, that most quintessentially English of British Invasion bands, I timed my exit from Waterloo Station to cross over the Thames at just the right moment, for as the song promised, “As long as I gaze on Waterloo sunset, I am in paradise.”

My training as a medievalist has led me to appreciate the depth of tradition embodied in the British political system. Western political thought rests on rather irrational assumptions, some of which I explored last week. The Enlightenment judged those assumptions to be bad, and the countries of continental Europe have largely attempted to paper over them since the French Revolution, but in the UK their continued presence has become part of the country’s distinct identity.

The pageantry associated with royalty is liturgical in its origins, a sublime expression of the belief that the State participates in the divine order through sacred ritual, a belief as old as government itself. Although the Christian Church claimed to be the sole universal vehicle for salvation, it never denied the State’s divine mandate, for as Paul said in Romans 13:1, “the powers that be are ordained of God.” The purpose of the State in traditional Christian thought was to safeguard the Church until the end of time, when the relationship between God and man would come to perfection in the new Jerusalem.

Throughout Christian history, though, there have occasionally arisen movements that see the new Jerusalem not as the perfect Church beyond time, but as an actual State to be established on Earth by God’s elect. Austrian political philosopher Eric Voegelin (1901-1985) grouped such movements together under the label “Gnosticism,” and theorized that such beliefs lay at the origin of all modern political ideologies.

The core belief of Gnosticism, according to Voegelin, is that divine guidance will lead humanity to establish a perfect, godly society in the here and now. Interpreting the Christian apocalypse in political terms, Gnostics viewed the present order of the world as “Babylon,” the immoral city that appears in Revelation 17 and must fall before the new Jerusalem can arrive. In England’s 17th-century civil wars, the Puritans declared it their mission to dash “the brats of Babylon against the stones,” after which “the inhabitants of Jerusalem, that is, the Saints of God” would build a new and better world. But first Babylon had to fall, so the “Saints of God” had to bring it down. They attempted to do so in 1649 when Puritan parliamentarians executed King Charles I and set up a republic in his place.

In opposition to the revolutionary millenarianism of the Puritans, Thomas Hobbes penned Leviathan in 1651, ascribing the origin of the State to human initiative and arguing that its purpose was to secure the material prosperity and bodily security of its inhabitants. Hobbes’s State would not promote morality for otherworldly ends, but merely for the purpose of maintaining worldly order; thus Hobbes denied the State its traditional role in the cosmic drama of history. In Voegelin’s analysis, what Hobbes really did with Leviathan was make possible a godless Gnosticism. If the State can be justified in and of itself, without reference to a transcendent order, then the State can say to man, “I am that I am” (Exodus 3:14). The State takes the place of God as the source of morality and meaning, while science (both natural and social) becomes the new revelation.

As an academic in 1930s Vienna, Voegelin was an outspoken critic of both Nazism and communism, which he saw as heirs to the Gnostic tradition: neither could build its new Jerusalem without first toppling the Babylon – the immoral world order – that stood in its way. For the Nazis, this was international Jewry; for the communists, bourgeois capitalism. Voegelin fled Austria in 1938, settling in the United States, where he passed away in 1985. I can’t be sure what he would have made of the collapse of Soviet communism, as I’m hardly an expert on Voegelin, but if I were to try extending his analysis to the present, then the managerial liberalism which prevailed in the West after 1989 would seem a sort of “Gnosticism Lite.” It affirms the eventual perfectibility of humanity, but denies that apocalyptic struggle is necessary to accomplish it. Instead, it can be done gradually by tinkering experts following theoretical “best practices.”

The European Union is without a doubt the most ambitious structure ever erected by managerial liberals. The EU’s commitment to “ever-closer union” rests on a teleology in which free trade between member states and deference to supranational regulatory bodies in Brussels would inevitably “generate a European identity to sit alongside and eventually supplant national identities.” I would argue that “European identity” is nothing new, as such an identity united the continent’s ruling classes during the medieval and early modern periods, but that identity was both explicitly Christian and exclusively aristocratic. From the coronation of Charlemagne until the French Revolution, Europe’s interrelated ruling families enforced a continent-wide social hierarchy that they believed to be part of a divinely-mandated cosmic order. In many ways, the UK is the last outpost of that classical European identity, as I hope to address in the next installment.

More than 200 years ago, English poet William Blake extolled his homeland’s ancient connection with Christianity and projected that connection forward into a future in which a new Jerusalem would arise in England. These words from Blake’s preface to his epic poem Milton have since become both anthem and hymn, imbuing English patriotism with a sacrality uncharacteristic of the country’s increasingly secular society.

Jerusalem is a potent and multivalent symbol in the Judeo-Christian tradition, particularly when associated with visions of the future. In the Biblical book of Revelation, Jerusalem sits at the end of time as the culmination of the relationship between God and man, where they will dwell together and “reign for ever and ever” (Revelation 22:5). Jerusalem in the original Greek text of Revelation is described as a polis, a political unit, that will endure eternally. The Latin text carries the same connotation: here Jerusalem is a civitas – a “city” in the sense of an incorporated government as opposed to an inhabited place (urbs in Latin), a political abstraction rather than a geographic location.

The language of Revelation describes the perfection of the relationship between God and man not as a home, a family, or a temple, but as a State in which all previous relationships and institutions will dissolve, as they will no longer be needed. When the final book of Christian scripture chooses the vocabulary of politics to describe the ultimate experience of the sacred, it is not hard to make the assumption that there must be something of the sacred in the political. Until William Blake’s own generation, that assumption was indeed nearly universal.

Last week I lightly sketched a few defining features of Britain’s political history, calling attention to the assumptions that underlie the functioning of the UK’s “unwritten constitution.” My interpretation owes a debt to J.G.A. Pocock’s essay “Burke and the Ancient Constitution,” in which the author – one of the preeminent historians of Anglo-American political thought – notes how the English political vocabulary derives from legal terms relating to property rights, essentially making those rights one of the strands of England’s political DNA.

Like property rights, theology is also part of that genetic material, but is perhaps not as easy for historians to trace. While property rights provided the literal vocabulary for talking about rights in general, theology provided the ontological concepts that underlie Western assumptions about the existence and endurance of the state. Medievalist Ernst Kantorowicz explored English political theology in his 1957 classic The King’s Two Bodies, which functions as a sort of case study testing ideas originally proposed by Carl Schmitt in his seminal 1922 work, Political Theology:

All significant concepts of the modern theory of the state are secularized theological concepts not only because of their historical development – in which they were transferred from theology to the theory of the state, whereby, for example, the omnipotent god became the omnipotent lawgiver – but also because of their systematic structure, the recognition of which is necessary for a sociological consideration of these concepts.

Kantorowicz acknowledged his intellectual debt to Schmitt in spite of the circumstances that separated these two eminent German scholars: Schmitt joined the Nazi Party following Hitler’s appointment as Chancellor in 1933, while Kantorowicz, who was Jewish, fled his homeland in 1938 as Nazi race laws made his life there increasingly difficult. Before being ostracized by his Nazi colleagues, Kantorowicz had been part of conservative intellectual circles that revered the 19th-century German Romantics, supported militant anti-communism, and eventually threw their support behind Hitler. After emigrating to Britain and then to the US, Kantorowicz continued his scholarship in a conservative Prussian mode, skeptical toward secularism, rationalism, and liberal democracy. In the expansive, fascinating, and sometimes frustrating work that is The King’s Two Bodies, he argued that England’s unwritten constitution – and by extension the Anglo-American political tradition – ultimately rests on the dual foundations of Christian theology and traditional kinship structures.

According to Kantorowicz, English political theory assumed (but rarely stated explicitly) that the Crown was a mystical body in which each individual monarch was present in the same way that each individual Christian was present in the Body of Christ. The distinction between the mystical body of the Crown and the physical body of the monarch provided a theoretical basis for opposing royal authority when English jurists translated these otherworldly concepts into the practical language of common-law property rights. The monarch held the Crown in trust, as a guardian would with the estate of an underage orphan in their care. If said guardian used the child’s estate for their own gain, the rest of the family could rightfully remove the child from the guardian’s care. Parliament thus used its position as the kingdom’s preeminent defender of property rights to monitor the monarch’s guardianship of the Crown, reserving the right to remove the Crown from any monarch who treated it in a manner not in keeping with tradition.

The King’s Two Bodies inspired me to investigate similar assumptions underlying the political structures of medieval Catalonia for my dissertation. In archives in Barcelona, I found charters that described the Crown of Aragon enduring “for ever and ever,” intentionally echoing the language of Revelation. I suspect that many more parallels can be found in other countries in Western Europe, with each of them staking a claim to the eternal Jerusalem. The UK is unique among these countries in that it has persisted without a written constitution, continuing to accept tradition as the basis of its polity. Thus it is easier to discern there than elsewhere the mystical glamour at the heart of the Western political tradition, for the British have not yet painted over it with the dull beige hues of Enlightenment rationalism.

MORE TO COME SOON —

Anytime I mention my favorite movies, Children of Men always comes up. If you haven’t seen it, it’s about the impending extinction of humanity. Fun stuff, right? Purely on its surface merits, it’s an excellent post-apocalyptic thriller, but it also functions on an allegorical level. Philosopher Slavoj Zizek explains the multiple levels of meaning in the movie in one of the special features of its DVD release. Even though Zizek is a crazy old Marxist, he still has some worthy insights into the film’s significance.

Zizek proposes that the UK’s lack of a written constitution makes it fundamentally different from other countries, and also makes it the only country where Children of Men could plausibly take place. That a country can govern itself without a written constitution might sound surprising to Americans, for whom our founding document is an object of reverence, but we should recognize that the US only needed to draft such a document because it established its political system through revolution. Until Enlightenment thinkers demanded that government be systematized in the light of rationally determined principles, there was generally little question that the “substance of tradition,” as Zizek puts it, was sufficient foundation for the rule of law.

In spite of Zizek’s Marxist orientation, his comments on Children of Men echo the godfather of English conservatism, Whig politician Edmund Burke (1729-1797). Burke rejected the Enlightenment dream of rationalizing and systematizing government, insisting that tradition alone should provide England with the basis of its polity. The fact that England had successfully resisted both absolutism and republicanism was sufficient proof to Burke that any effort to re-found its political system according to a specific set of declared principles would be folly. General principles of law could not be positively declared, but had to be discerned out of the messy particulars of centuries’ worth of individual cases.

The “substance of tradition” at the heart of the English political system is ineffable by nature: one can describe certain attributes of it, but cannot say definitively what it is; only by looking back on its history can one say what lies outside of it. “Revolution,” like that which Britain experienced in 1688, is only permissible in order to shear off artificialities forcibly grafted on without concern for tradition’s organic nature.

Americans in 1776 saw their revolution in just such a way: the Crown repeatedly denied them “the rights of Englishmen” to which they believed they were entitled, demonstrating that it had rejected its own political tradition and thus forfeited its legitimacy in the thirteen colonies. The Bill of Rights attempted to define this tradition, not to supplant it. The first ten amendments to the US Constitution enumerated rights, but could not create them (as they derived from nature), nor did they deny any rights left unmentioned (as stated in the Ninth Amendment). The US Bill of Rights was thus essentially a gloss on England’s unwritten constitution.

The term “constitution” (constitutio in Latin) originated in the Roman Empire to designate an imperial decree that had the force of law. The emperor’s authority to issue such decrees derived from his position as head of the Senate, which, while not a legislature in the modern sense, still had final say in matters of interpreting law. Beginning with the first Roman emperor, Augustus, constitutiones gradually superseded the senatus consulta as the form such legal rulings took. Imperial constitutions stood above ordinary legislation and jurisprudence, marking the boundaries of what the Roman state could and could not do.

When the Western Roman Empire dissolved at the end of the fifth century, the idea that a particular office, institution, or text defined the nature of the state faded away. Custom was now the only true sovereign, but what custom consists of is malleable. Following the Norman conquest of England in 1066, the Crown authorized judges to decide property disputes based on local, pre-conquest customs. Over time, and with royal support, certain customs of landholding and inheritance became standardized and generalized as “common law” across the entire kingdom.

Parliament emerged in the thirteenth century as an assembly of the kingdom’s main landholders — the Church and the nobility in the House of Lords, knights and burghers in the House of Commons — at which they negotiated with the Crown over taxation. Such negotiations offered landholders the opportunity to defend their common-law property rights against royal encroachment, while the legal expertise needed to do so gave parliamentarians influence over the judiciary.

By 1600, judges and lawyers had come to equate English common law with the Roman constitutiones: it determined the boundaries of what the Crown could and could not do. Unlike imperial constitutions, though, no one could decree common law. It existed beyond time, independent of the will of any individual ruler. In the aftermath of the Revolution of 1688, the Crown finally abandoned its absolutist pretensions to legislate by decree and formally recognized parliamentary sovereignty.

Through a process of six centuries of give and take between the Crown, jurists, and landholders, England arrived at the unwritten constitution celebrated by Burke. The Crown could no longer legislate independently, but only through Parliament, which was still constrained by the common law. Individual pieces of legislation can only elaborate elements of that law; its substance remains ineffable and timeless.

Without a founding document to define it, the UK’s constitution is a mystical body, a Neoplatonic form not of our reality, but actualized within it. In short, it is sacred. This is why the spiritual infertility allegorized in Children of Men produces its most profound despair in Britain: however secular British culture becomes, its political system draws upon theology more than any other in the Western world. Watching Children of Men today, it strikes me as prophetic, foretelling the political paradigm shift through which we are now living. And perhaps more than any of the other ongoing upheavals in the Western world, Brexit best reveals this paradigm shift for what it is: a spiritual crisis.

MORE TO COME SOON —

In case you missed it, there was an election last week. American voters decided that a reality TV star who boasted of committing sexual assault should be the next President of the United States. Well, that’s not exactly true. Hillary Clinton received around 1 million more votes than Donald Trump did. But in spite of winning the popular vote, the former Secretary of State will not become the nation’s chief executive. Instead, that office will soon belong to a man who was brought down by “Stone Cold” Steve Austin at Wrestlemania XXIII.

As with his 2007 humiliation of Vince McMahon, there is a Pyrrhic element to Trump’s victory in Electionmania LVI: lacking a popular mandate, Trump will arrive at the Oval Office thanks to the Electoral College, making 2016 the second election in less than two decades in which a Republican has won the presidency in such a manner.

Bear with me for a civics lesson, O knowledgeable reader. There’s a point coming, I promise.

The President of the United States is not elected by a national popular vote, but rather by the aggregate of 51 individual contests held in each of the 50 US states and the District of Columbia. Winning a state contest secures the candidate a number of votes in the Electoral College equal to that state’s representation in Congress. (Although the District of Columbia is not represented in Congress, it is nevertheless allotted three electoral votes.) With the exception of Maine and Nebraska, which split their electoral votes according to congressional district, these contests are winner-take-all. The candidate who secures a majority of the 538 electoral votes – at least 270 – thus wins the presidential election, to be inaugurated in January of the following year.

The Electoral College has been the method of electing our head of state since 1788, when the office of President was constitutionally established. As with virtually every aspect of the US Constitution, it is a clunky compromise intended to hold together a heterogeneous collection of former colonies under an awkwardly-fitting federal roof.

Inspired by the classical heritage of Greece and Rome, the framers feared that direct democracy would likely lead to tyranny in the name of the majority. They hoped that disconnecting the chief magistracy from a direct mandate would make its occupant less inclined to use adherence to the popular will as a justification for despotism.

As James Madison remarked at the 1787 Constitutional Convention, the states most in favor of adopting the Electoral College were those in which much of the population was enslaved and thus disenfranchised. Given that the slave-holding states had a smaller voting public than the free states, the representatives of these states feared that they would be ignored if the presidential election were decided by popular vote.

This is the “original sin” of the Electoral College: it was designed at once to stave off a tyranny of the majority and to buttress a system of oppression against the country’s largest racial minority. Its origin is hardly a reason to abolish it, though: no human institution was ever immaculately conceived. And in every election from 1892 through 1996, the Electoral College did exactly what it was designed to do: every candidate who won a plurality of the popular vote also won a majority of electoral votes, a feat possible only by winning a broad coalition of states, which in turn requires appealing to a diverse range of voters.

As the UK is learning with Brexit, narrow majorities often reflect the whim of a few rather than the conviction of many; hence, tyranny of the majority is a real thing. The Electoral College favors candidates who can build a broad geographic base of support over those who simply mobilize 50% of the electorate plus one. This is not a design flaw, but a deliberate incentive for consensus-building. The system has become dysfunctional because its extra-constitutional elements – the Democratic and Republican parties – no longer mesh with the constitutional infrastructure as a result of party infighting during the 1970s. Fortunately, reforming a political party is much easier than amending the Constitution.

Democrats frustrated with their Electoral College defeats in both 2000 and 2016 have called for a constitutional amendment to abolish the institution, but the hurdles for achieving this are impossibly high given Republican dominance at both federal and state levels. Just proposing an amendment requires either the support of two-thirds of both houses of Congress or a convention called at the request of two-thirds of the state legislatures. Even then, it still takes ratification by three-fourths of the states for the Constitution to actually be amended. Requiring super-majorities at both steps of the process ensures that the Constitution cannot be amended without a broad national consensus.

That same desire for consensus is at the heart of the Electoral College, but both major parties currently discourage it. Writing in the conservative journal National Affairs following the 2012 reelection of Barack Obama, Jeffrey Anderson and Jay Cost proposed that the Republican Party adopt a presidential nomination process that favored local and state conventions over primary elections. At the time, moderate Republicans feared that cultural and demographic shifts would make it difficult for their party to ever capture the presidency again without significant changes in both structure and message. Instead, Donald Trump made the party his own by taking advantage of the very same weaknesses signaled by Anderson and Cost.

Trump’s hostile takeover of the Republican Party brought him victory in the Electoral College, but not a popular mandate; thus the dysfunction remains. As long as it does, as I’ve argued before, we are likely to see more Donald Trumps. Ironically, the Democrats might now be more tempted to make the kinds of internal reforms that Anderson and Cost were recommending for Republicans. State caucuses favored Bernie Sanders over Hillary Clinton during the nomination process, and the Vermont socialist and his allies are now rising into the party leadership. If they can place consensus and coalition-building above Clinton’s dashed hopes of merely turning out a loyal base, they might have better chances of wooing back disaffected working-class and rural whites in 2020.

I told myself I wasn’t going to write about the US presidential election until I had had a few days to reflect on the results. After all, I’m an intellectual. Reflecting on things is what I do. But as I was drinking my coffee this morning, it dawned on me that I might as well write something today because I’ve basically been reflecting on the results of this election for the past six months. You see, I never feared a Trump presidency as much as I feared his nomination. That was enough to let me know that every possible outcome of this election would be bad.

I don’t have a natural home within the American political spectrum. Part of this comes from temperament, that ruminative process I mentioned above. I find it difficult to get excited about the hot-button issues of the day. This makes me ill-suited for activism, even when I sympathize with the motivations of the activists. Part of it comes from life experience: from 1997 through 2009, I spent more time outside the US than inside, so looking at my country from the outside-in became a habit. And part of it comes from education: Cicero, Augustine, Machiavelli, and Tocqueville provide a running commentary inside my head. Mostly they fold their arms, shake their heads, and grumble.

All this means that you’re unlikely to hear much from me about things like taxes or bathrooms. Temperament and training incline me to seek the Grundstoffe – the raw materials – of politics, i.e., the assumptions about human nature and our place in the cosmos that underlie our vocabulary, our institutions, and our social practices. As I noted in my first blog post, the building blocks of liberal modernity are relics from a time of flux centuries in the past. They began taking shape after Galileo pointed a telescope at the night sky and Odierna placed a fly under a microscope, dissolving the classical world-view that had stood for thousands of years, but not providing anything nearly as comprehensive to replace it. These concepts then spread via the printing press, inspiring a leisured and literate elite to reshape institutions across the globe.

Social media is doing to modern politics what the telescope and the microscope did to the theory of the four classical elements – dissolving it without offering a coherent replacement. Without such a replacement, nostalgia drives politics. The interconnectedness of the modern world has prompted the political movements of our day to reaffirm old identities, to promise to make things as they once were; the media that sustains that interconnectedness has enabled them to push their message relentlessly.

In my writings elsewhere, I diagnosed the current state of our polity as a result of the failure of political parties to function as guiding intermediaries between the state and the people. Hillary Clinton was the calcified apotheosis of the modern Democratic Party – a dynastic candidate offering little apart from rote homilies to multiculturalism and assurances that she was competent enough to marginally tweak the regulatory state left by the current administration. Donald Trump, on the other hand, was essentially post-partisan. For him, the Republican Party was nothing more than a brand that he needed to add to his portfolio in order to open up a new market. He didn’t need the structure of the party at all; the media gave him all the “ground game” his campaign needed.

With Clinton’s defeat the Democratic Party is likely to move further to the left, amplifying the voices of people like Bernie Sanders and Elizabeth Warren. On the other side of the aisle, the Republican Party has seen that Trumpism works as an electoral strategy. Republican legislators who refuse to play ball with the new administration will face primary challenges from a rising class of Trump loyalists.

The pessimist in me says that the future of American politics will be driven by celebrity candidates whose electoral strategy, regardless of their political affiliation, will mimic Trump’s: do whatever it takes to grab the media spotlight, because that’s how you will get votes. Give voice to outrage and the votes will come. If there isn’t enough outrage going around already, manufacture some so you can fire up more of your base. The pessimist in me says that our politics will deteriorate into mob rule.

I can’t offer a prescription for what ails the polity, but I do have some thoughts about what people of any political persuasion can do to perhaps alleviate some of the suffering:

1. Make sure that your social media feed doesn’t turn into an echo chamber. For me this is easy: I was raised in rural Alabama, but my educational and professional ambitions transplanted me to the Northeast Corridor. I see a pretty wide spectrum of political opinions on Facebook and haven’t unfriended (or even blocked) a single person over the course of this election. On Twitter I follow everything from Jacobin to First Things to Reason. They provide food for thought even when I don’t agree with them.

2. Don’t feed the mindlessness of the social media beast. Political memes are prefabricated opinions. We don’t need them because we all have our own. Write yours down, whether in the form of 1000-word blog posts or 140-character tweets. A picture of that politician you love or hate with some dubious “facts” pasted across it isn’t going to change anyone’s mind about anything, but your authentic voice might.

3. If you’re an “independent,” join a political party. Yes, yes, I know you’re not a joiner, but that means you have no role in the party primaries that could actually filter out the candidates you hate. Maybe you don’t agree with all of that party’s platform, but you can’t challenge that platform from the outside. In any case, suggestions #1 and #2 will help you resist groupthink and keep you from turning into anyone’s propaganda mouthpiece.

So those are my thoughts on the election. Now please excuse me while I fold my arms, shake my head, and grumble.

The Southern Poverty Law Center, an organization with a long history of fighting against hate crime in the US, recently compiled a list of supposed “anti-Muslim extremists.” On this list – alongside people who claim that Barack Obama was born in Kenya, or who want to ban Muslim immigration into the US – was Maajid Nawaz, a self-described “liberal Muslim” activist born in the UK.

I first became aware of Maajid Nawaz and his organization, the Quilliam Foundation, following the November 2015 terrorist attacks in Paris. Based on what I know of his work, I do not see him as fomenting bigotry against his fellow Muslims. Instead, he opposes both the politicization of Islam (i.e., Islamism) and the Islamophobia spread by the people now ironically on the SPLC’s watch list with him. The “extremist” sentiments the SPLC attributes to Nawaz are thus nothing short of nonsense.

According to his autobiography, Radical: My Journey out of Islamist Extremism, Nawaz became a member of the Islamist organization Hizb ut-Tahrir as a teen, but grew disenchanted with Islamism following a stint in an Egyptian prison. He and other ex-radicals formed Quilliam in 2008, with the mission to counter radicalization within Britain’s Muslim community.

A 2010 Quilliam report, cited by the SPLC as evidence of Nawaz’s “anti-Muslim extremism,” warned that Islamist organizations in Britain wanted to “bring together all Muslims around the world under a single government and then impose on them a single interpretation of shari’ah as state law.” This ideology, the report continues, makes non-violent Islamists sympathetic to the ends, if not the means, of jihadist groups like al-Qaeda. While some have argued that Quilliam paints with too broad a brush, the document does note that only a minority of British Muslims participate in Islamist organizations and also recognizes that not all members of such organizations are ideologues. Nor does Nawaz see Islamists as clichéd villains: when Tunisia’s main Islamist party formally embraced pluralist politics and the democratic process, he praised the development.

Criticizing the goals and ideology of certain Islamist groups does not equate to anti-Muslim propaganda, and for the SPLC to suggest that it does implies that they regard these groups as the de facto spokespeople for all Muslims. The remaining examples of Nawaz’s “extremism” presented by the SPLC are similarly specious, indicating that they and their progressive partners possess little awareness of (or concern for) diverse points of view within Islam.

Nawaz has called for prohibiting facial coverings like the niqab or the burka in any place “where a balaclava, motorcycle helmet or face mask would be deemed inappropriate.” Such facial coverings are neither universally nor exclusively Islamic, having originated in pre-Islamic Persia. Persian culture influenced art, literature, and dress throughout the Islamic world (to say nothing of the West, as evidenced by the khaki trousers I’m wearing as I type this), but the niqab and the burka have only been mandated under very strict Islamist regimes, like the Taliban in Afghanistan. To limit the wearing of such garments might be a restriction on individual liberty, but it is not an attack on Islam.

Another act inviting the SPLC’s opprobrium was Nawaz tweeting a drawing of Muhammad in 2014, almost a decade after the 2005 Danish cartoon controversy. While it was once acceptable among Turkish and Persian Muslims to create images of the Prophet, such depictions are controversial today largely because of the influence of Wahhabism, a conservative form of Islam that originated in 18th-century Arabia as a reaction against the cultural dominance of those very same Turks and Persians. The oil wealth of the Saudi royal family has allowed them to export Wahhabism throughout the Islamic world over the past few decades, but it would be a mistake to see Wahhabi positions as constituting the whole of Islam.

By far the most ridiculous piece of evidence provided by the SPLC that Maajid Nawaz is an “anti-Muslim extremist” is the accusation that he touched a stripper at his bachelor party. Even if true, such behavior has no connection whatsoever to the bigotry that the SPLC accuses him of promoting. The implication here, rather, is that Nawaz, who identifies as a feminist, is insufficiently liberal and thus a hypocrite. A negative profile of the activist that appeared in The New Republic concluded the same. Author Nathan Lean drew his conclusions not only from Nawaz’s seeming eagerness to cozy up to the surveillance state, but also from his oh-so-fashionable wardrobe. One of these is a germane criticism, the other is not.

Lean depicts Nawaz as a less-than-pious liberal today, but also asserts that he wasn’t a particularly committed Islamist during his younger days, calling Nawaz’s credibility into question. Such questions are legitimate, but what the Southern Poverty Law Center has done in order to paint Nawaz as an “anti-Muslim extremist” is not. For their profile, the SPLC relied on a straw-man version of Islam, cobbled together out of pieces of Wahhabism, Islamism, and pre-Islamic custom. By holding up the burqa-wearing, cartoon-hating Islamist as the normative Muslim, the SPLC reinforces the most negative Western stereotypes of Islam. By then insisting that this fetishized Muslim stereotype is a victim in need of their protection, they also feed the Islamophobic narrative that anyone sympathetic toward Muslims is somehow in league with the terrorists.

The term “extremist” should describe views that fall well outside of a mainstream, centrist consensus, but it’s difficult to find such a consensus between the polarized Western views of Islam today. To counter the view that Muslims are all burqa-wearing, cartoon-hating Islamists who are out to get us, the SPLC insists that the burqa-wearing, cartoon-hating Islamists are really just misunderstood and unfairly persecuted by our racist Western society. Regardless of whether or not his liberal bona fides are 100% up to snuff, Maajid Nawaz puts forth the idea that Muslims are capable of integrating into Western society and that Western society is capable of accommodating Muslims. To me, at least, that sounds like a position eschewing extremes.

John W. McKerley is a research associate at the University of Iowa. His publications include (as co-editor) Civic Labors: Scholar Activism and Working-Class Studies (University of Illinois Press, 2016).

In an April 2016 piece in The American Spectator, written while the presidential primaries were still ongoing, J.E. Blanton argues that the modern primary process – which brought us Trump and briefly threatened Bernie Sanders – has abandoned the public good in favor of factional infighting. Such infighting, he asserts, has the potential to so undermine the nation’s partisan politics as to invite “mob rule,” as one or another populist faction vies for supremacy.

Blanton’s solution is to roll back many of the internal partisan reforms put in place since the early 1970s. Although these reforms were designed to empower rank-and-file primary voters at the expense of party powerbrokers, he argues, “voter turnout has declined, polarization has overtaken the national electorate, and factionalization is deepening within both parties.”

Blanton hopes to avert an even worse national crisis by returning to a system guided by the latest generation of powerbrokers, people whose self-interest in political spoils through electoral success would presumably push them back toward political moderation and compromise. While such an arrangement might be less democratic, he concedes, it would better represent the people’s best interests.

To support his argument, Blanton cites ancient writers and the Founding Fathers, who themselves cited many of those same ancients, regarding their distrust of the alleged excesses of direct democracy. While I can’t speak to the ancient or even eighteenth-century contexts of his sources, I’d like to direct the discussion toward what I think is a much more apt and instructive period in US history – the Progressive Era (roughly 1890 to 1920) – which, I think, suggests very different conclusions regarding American democracy and our current political moment.

The Progressive Era saw its own wave of partisan factional battles and electoral reform. While we in the early 21st century continue to experience the breakdown of Cold War political alliances, Americans of one hundred years ago faced the shattering of a party system forged in the Civil War. By the turn of the 20th century, not only were Democrats and Republicans riven with various (and sometimes overlapping) “reform” and “boss” factions at local and state levels, but those same parties faced vigorous third-party competitors, including the Union Labor, People’s (or Populist), and Socialist parties.

While this period produced some significant and longstanding democratic reforms (including nationwide women’s suffrage and the direct election of US senators), arguably it was distinguished more by an anti-democratic backlash orchestrated by partisan powerbrokers (if not always “bosses”), whose interest in stability was also an interest in moderating or reversing the broad anti-elite reforms made possible through factionalism and third-party politics.

Perhaps the most important expression of anti-democratic backlash took place across the states of the former Confederacy, in which elites within the Democratic Party (sometimes in collusion with frustrated white Populists) enacted a variety of electoral changes aimed at curbing the factional (often biracial) politics that had characterized the South since the enfranchisement of black men and the rise of competitors to Democratic (and white) rule. In the wake of “reforms” like the poll tax and literacy tests, statewide voting percentages fell precipitously, sometimes into the single digits (with both black voters and many working-class whites disfranchised), opening the door for the codification of a new, stricter, and more expansive culture of segregation that would define southern society for at least another half century.

We can see the role of factionalism in this process even more clearly if we look to those former slave states where such black disfranchisement failed – the states of the Border South, slave states that had remained in the Union (even if nominally neutral). In Maryland, West Virginia, Kentucky, and Missouri, proponents of black disfranchisement had pressed their case only to be beaten back, most often not because of resistance from Republicans but because of opposition from other factions within their own party that saw the reforms as contrary to their own interests. While the failure of southern-style black disfranchisement did not solve any of the pre-existing problems with democratic politics in the Border States, it did prevent the success of the radically conservative, anti-democratic strain of politics that reshaped their neighbors in the former Confederacy.

Thus, the lesson of the Progressive Era is that the greatest threat to democracy is not too much democracy but the over-concentration of power in the hands of the very powerbrokers to whom Blanton looks. Far from saving us from ourselves, such political elites (often deeply tied to powerful private economic interests, as in the case of the large southern landowners who benefited from crushing biracial democracy in the South) more often than not use consensus as a way of maintaining an order that benefits them and their allies. Or to put it another way, the more consistent danger in US politics has been oligarchy, not mob rule.

But what, if anything, does this tell us about the 2016 election? To be honest, I’m not sure, but, to hazard a guess, I would say that the recent partisan instability is more hopeful than anything. For over a generation, Republicans have dominated national politics by appealing to the same bloc of former Confederate states that Democrats had used to the same effect during the first half of the twentieth century. And, much like the Democrats, Republicans have found over time that appealing to the elite-driven ethno-nationalism that dominates those states can create significant contradictions and difficulties in winning national elections.

Hopefully, the collapse of a Trump presidential bid will open new opportunities for southern pro-democratic initiatives – like the Moral Monday movements – that can begin to undo the region’s deeply damaged political culture. Likewise, on the Left, perhaps the Sanders movement can energize resistance to the Democrats’ anti-democratic neoliberal consensus. But, whoever wins the election in the next two weeks, Americans will be best served by reviving participatory democracy rather than trusting in the enlightened self-interest of elites.