Archive for December, 2010

1.) More tablets!
The iPad was the first, but there will be more. Low hanging fruit, I know, but I thought I’d warm up with an easy one.

2.) The Democratic People’s Republic of Korea will peacefully fall under the wise stewardship of Kim Jong-un, who will bring about another one hundred years of peace and plenitude.
Hahahah, just kidding! As Kim Jong-il’s health deteriorates, rival factions among the military and his own family will compete for his throne, leading the state into increasingly erratic behavior. On the plus side, this might put some distance between them and China. On the downside, South Korea might soon get the whole peninsula to itself, whether it wants it or not.

3.) From its start, the Republican presidential primary will make Jersey Shore look like the Lincoln-Douglas debates.
I suspect Palin will run, if only because there’s no other way to sustain her cultural relevance past the breathtaking season finale of Sarah Palin’s Alaska. Haley Barbour’s cachet within the party hasn’t decreased by any significant degree, so he might still run as well. Mitt Romney’s been running for the 2012 nomination since somewhere near the end of the 2008 primary battle. As if the lineup weren’t already insane enough, Fox News is going to do as much as it can to heighten the crazy even further. A tight race between increasingly bellicose caricatures (many of whom are Fox News contributors or regulars) is guaranteed ratings Viagra, especially when a lot of these candidates will be reluctant to even speak to other news outlets.

4.) Things are going to get worse for American civil liberties.
Guantanamo’s going to be open for a loooooooong time, and we now have formalized indefinite detention. Plus, I’m fairly convinced that, instead of letting Julian Assange face those Swedish rape charges, the Justice Department is going to have him extradited and dust off the ol’ Espionage Act. But even if they decide to let Assange face justice in Western Europe instead of “justice” over here, thereby maintaining the First Amendment status quo, things will certainly not get better. See here for a further exploration of why.

5.) Your various public and private personas will continue to melt together in increasingly uncontrollable and disconcerting ways.
Online behemoths like Google and Facebook aren’t just becoming even more ubiquitous when it comes to how you interact with people: they’re also finding more points of contact between one another. For a while now, we haven’t so much been managing our public projections of ourselves as we’ve been negotiating with them. That trend is going to continue, and get weirder yet.

6.) Not only that, but forces like Gawker and Anonymous are going to exacerbate the problem.
Whether they have a political agenda, a personal axe to grind, or just want page views, Nick Denton, his faceless hacker enemies, and the lesser imitators of each will continue to publicly humiliate random people. On the plus side, those random people might find themselves elevated to celebrity status, at which point they can choose between a reality show, a political candidacy, or both.

7.) Online espionage and cyber warfare will become a more frequently used tool of states.
The fun part is going to start when this trend clashes with Anonymous’ growing prominence as a political actor.

8.) The economy will continue to suck.
No kidding.

9.) House committee hearings are going to be fucking nuts.
Various Republican committee chairs are going to start holding hearings on the most insane topics imaginable, partially because of wide-eyed conspiracy theorists like Michele Bachmann and partially because of a cynical desire to drown the Obama administration in Whitewater-esque faux-scandals in advance of the 2012 election. Even John Boehner will start to get uncomfortable with all of this.

10.) The national security apparatus is going to find some new secret wars to wage.
After Pakistan and Yemen, who knows what’s next? My money’s on Somalia or Tajikistan.

11.) South Sudan will vote to secede, and violence will ensue.
As if the people of Sudan didn’t already have enough problems, Khartoum won’t let a secession occur quietly and peacefully. If they don’t reject the referendum results entirely, then they’ll pick a fight over where the border with their new neighbor gets drawn.

12.) Putin will clearly signal his intention to become President of Russia again.
Medvedev’s term as understudy-in-chief runs out in May 2012. In the meantime, Putin will make no secret of his intention to retake the crown, thereby cementing Medvedev’s irrelevance in the eyes of the world and turning Russian democracy into an even bigger joke.

13.) On the plus side, TV will still be good.
The new seasons of Mad Men, Breaking Bad, Curb Your Enthusiasm, Community, and others will be predictably awesome. The Walking Dead might finally find its legs. The Daily Show and The Colbert Report are going to mine a lot of gold out of C-SPAN.

14.) But mostly things will stay the same.
For the vast majority of the world’s population, life will go on more or less as it always has. At least until global climate change causes droughts in some places, floods in others, and generally exacerbates resource scarcity. But I’ll save that for my “Predictions for 2020.”

Happy New Year! And remember: Even if I end up being mostly right (which I sincerely hope is not the case), you and most of the people you care about will probably still be doing pretty well. As for the other stuff, well, there’s always room for improvement.


For all my Christian friends, here’s a Christmas song that’s both pretty good and hasn’t already been overplayed to death by every radio station everywhere:

Tonight: the traditional Jewish Christmas celebration with my parents, Chinese food for dinner followed by a movie. Later this week I might do some of the obligatory top-10 year-end lists, and some more atheism stuff.

UPDATE: There’s also this one, which I’d forgotten about.


So yesterday I alluded to the notion that you can’t coherently rebut arguments for the existence of God without either confronting them on their own turf (metaphysics) or challenging the whole concept of metaphysical fact. Ever since I finally rejected the argument from empiricism, I’ve preferred the latter.

Crudely speaking, I’m a logical positivist. My view is that there is no such thing as metaphysical fact, since facts are statements about the world that are verifiable and either true or false. You can create a false statement pertaining to metaphysics if you claim a metaphysical force (like, say, the mind, a ghost, etc.) somehow interacted directly with physical objects, because such a phenomenon is logically impossible. However, statements that are of a purely metaphysical nature can be neither true nor false.

So if you ask me if I think there exists an interventionist God who has agency in the real world, I’m going to give an emphatic no. But if you asked me if I think the God of Einstein and Spinoza exists — an impersonal, abstract intelligence who “reveals himself in the orderly harmony of what exists” — then I would say you’re asking the wrong question.

When it comes to purely metaphysical claims about God, both the people making them and the people listening tend to mistake them for statements of fact about the universe. Instead, I see them as statements about the conscious state of the speaker, and the structure of her perception and cognition. “God is all around me” isn’t a factual statement like “the dog is brown,” but an expression of sentiment. A clearer way to phrase this expression of sentiment might be: “I am having the experience of being surrounded by an omniscient being.”

Put that way, most forms of metaphysics might be better understood as wayward children of phenomenology, or the philosophical study of the structure of our experiences. This discipline is where philosophy overlaps most closely with good old-fashioned psychoanalysis. It also has a certain literary quality to it: phenomenology can illuminate the inner workings of the human psyche in a manner very similar to (but more direct than) that of brilliant first-person narration. It’s no coincidence that phenomenology overlaps quite a bit with existentialism, probably the one school of Western philosophy to have directly inspired more fiction than any other. The phenomenologist, the psychoanalyst, the existentialist and the fiction writer all share a common mission: to articulate what it means to want, fear, and feel like a living person. We remain fascinated by these efforts because they help explain us to ourselves and make us better at being what we are.

Religious narratives do the same thing, although stylistically they’re obviously closer to fiction than phenomenology. This is why it’s disappointing to hear other atheists refer to them as “myth” and use the term derisively. Since when have myths been less than fascinating? Since when hasn’t pretty much everyone used some fictional narrative or another (whether they were aware it was fiction or not) as an explanatory tool for understanding the self? Even phenomenology is more or less a myth: it’s an ongoing attempt to put into words a structural understanding of our “mind” and “consciousness.” These things are purely metaphysical beasts, not real-world entities. Any attempt to articulate them can only end in crude metaphor.

But if crude metaphor is all we have to go on, it suffices. It may not be capital-T true, but it still reveals something in us we couldn’t even catch a glimpse of otherwise. That’s why I compared it to literature, and that’s why I think religion still has something to teach to the atheist. Remember: When we want to commend authors like Herman Melville and Cormac McCarthy for the richness, beauty, purity, and sheer, awe-inspiring might of their fiction, we have a word that encapsulates all of that. We call their work Biblical.


Playing off of my last post, I think one of the worst intellectual traps the atheist can fall into is the shallow argument. Pretty much everyone has a natural bias to arguments featuring conclusions they happen to agree with, whether or not those arguments are totally sound. And when you take an uncharitable view to people who challenge those arguments, it can be hard to effectively judge their point against your own. So you end up with two fairly common fallacies among ardent atheists:

1.) Failure to distinguish between different religious claims. This one is the less common one. After all, these things should be pretty obvious: Not everyone who calls herself a Christian thinks the Bible is the literal word of God. Not everyone reaching for eternal reward thinks that faith in his deity is the only way to get there. Hell, some religious people don’t even think that God is omniscient, or interacts with the physical world in any observable way. I imagine if I were one of those people, I would be pretty wary of being conflated with Creationists.

2.) Overreliance on the argument from empiricism. Let’s talk about this guy:

I love this man. He’s a comic genius, and it’s great that he’s also public about his atheism in a thoughtful, articulate, non-dickish manner. But in his recent column on why he’s an atheist — the one all of my atheist tweeps keep linking around — he makes the appeal for atheism from science. It’s a popular argument, but it’s also a bad one.

The problem with the argument is that it takes multiple arguments and collapses them into one, in a manner not unlike the first fallacy. It confuses empirical claims with metaphysical claims. Science, of course, is only interested in the former.

The difference between an empirical claim and a metaphysical claim is the difference between saying, “Egypt suffered a plague of locusts,” and, “Egypt suffered a plague of locusts because a divine intelligence was displeased with the pharaoh for keeping the tribe of Israel enslaved.” The first one is definitively true or false, and you can look at evidence in the real world to make a judgment one way or the other. That’s where science comes in. But as far as divine intelligences go, science has absolutely nothing to say. You can’t measure or quantify a mind. You might be able to track physical phenomena that are correlated with what one might want to call a mind, but science can’t help us make that determination.

(Aside: This cuts both ways, of course. You might witness something you want to call a miracle, because you see no logical explanation for it. But the fact that there is no explanation that science can currently afford us does not mean you can make any definitive metaphysical claim about the event. As the analytic philosopher of logic A.J. Ayer would point out, the solitary fact that the Red Sea miraculously parted does not mean that God did it — not unless your definition of God is solely, “that which parted the Red Sea.”)

My point isn’t that these questions have no definitive answer. My point is that this reliance on science to explain everything is cheap and intellectually lazy. Any argument over the existence of God has to take metaphysics into account as a discipline entirely separate from empirical observation.

That means taking the other philosophical problems of a godless universe seriously as well. For example: If there is no God, do we have any reason to believe that there are actions or consequences that are good and bad independent of our feelings about them? What is good? Do we have any reasons to be good? What’s the point of doing anything, really?

These questions don’t have scientific answers, either.* And all we accomplish by pretending that the answers are easy or obvious is to make ourselves willing accomplices in our own ignorance. Instead, I find it more helpful to see these questions as a gift to atheists: the universe is far more ambiguous without a God to tell us right from wrong, but it’s also full of so much more mystery and wonder. We squander that gift when we dismiss challenges to our premises out of hand. Better to find out what clues believers can bring to the hunt.


I’m an atheist. A nonbeliever. A heretic. Call me by any of those names, or any others you’d like, but, for the love of Spider-man, never, ever, call me a “bright.” Or, worse, a “freethinker.”

Because the truth is, calling yourself a “freethinker” is self-contradictory. To say someone thinks freely is to suggest that she has no uncritical attachment to any ideology or belief system. It calls to mind someone who is always roaming, always seeking truth, and never satisfied with the easy facsimile of truth someone puts before her. That is what a freethinker would be — at least if you’d never met anyone who actually calls herself that. Because in the real world, “freethinker” is a smug term for someone who’s an atheist, and thinks that anyone who isn’t an atheist is, well, an un-free thinker. They’re all sheep, man.

No one who seriously thinks there is a binary distinction to be made between the atheists and the brainwashed hordes can be said to be thinking freely. In order to hold that view, you would have to find it inconceivable that any sane, intelligent individual could think critically about a particular faith, consider all the alternatives, read the literature, and still sign up.

I have a few friends like that. One of them is Jamelle Bouie. Jamelle is a very smart, literate, and introspective man who also happens to be a committed Christian. He’s not a “freethinker,” but he is a free thinker. Lately, he and I have been having a lot of conversations about faith, and I’ve been learning quite a bit: about Christianity, of course, but also atheism. My own and others’.

I hope all of you godless Americans out there have at least one friend like Jamelle. If not, you should go out and make one. Because if anyone can be said to have actually earned the title freethinker, it’s the guy who welcomes challenges to his own beliefs (or lack thereof) from people smart enough to make the case. When he hears those arguments, this guy — the freethinker — actually listens. He goes through every step to make sure he’s doing the argument justice, and where there’s ambiguity he gives it the most generous possible reading. If that undermines his position, so be it. The goal isn’t to score points.

All of this probably sounds pretty obvious and intuitive, but it’s worth reiterating in the age of brights, freethinkers, New Atheists and so on. I see more and more atheists behaving like there’s no difference between blind faith and self-critical faith. I see more and more atheists presuming that they basically have religion figured out, and believers have nothing to teach them.

That presumption is obviously, patently untrue. Religion doesn’t have to convert us in order to teach us something. For one thing, it can help atheists refine our understanding of what our atheism means, and lead us away from some of the logical fallacies popular atheism too often falls into. Plus, religion can teach us a great deal about what it means to be a human being.

I’ll get into a little more detail regarding those last two assertions somewhere in the next few days. Oh, and by the way: It’s totally a coincidence that I’m doing this the week before Christmas, I swear. Did not plan that.


Nothing seems to me to be rarer today than genuine hypocrisy. I greatly suspect that this plant finds the mild atmosphere of our culture unendurable. Hypocrisy has its place in the ages of strong belief: in which even when one is compelled to exhibit a different belief one does not abandon the belief one already has.
–Friedrich Nietzsche

All of this talk about Mitt Romney as regional manager of an Olive Garden (okay, technically CEO, but I think my version is funnier) has got me thinking again about how underrated political hypocrisy is. I’ve written before that a political figure’s personal hypocrisy is basically irrelevant when it comes to assessing his or her fitness for office, but that’s not what we’re talking about here. Romney practices public, not personal, hypocrisy. And rather than being irrelevant, I think it’s actually a sort of virtue.

The difference between public and personal hypocrisy is this: the personal hypocrite preaches a certain set of values that he believes should guide people’s personal lives, but does not live by them. (Example: Ted Haggard preaches that homosexuality is a sin, but sleeps with male prostitutes.) The public hypocrite may or may not live by the values he preaches, but that’s beside the point. What makes for public hypocrisy is a tendency to change whatever those stated values are, not because of a genuine and publicly stated change of heart, but because of political expedience.

Romney is one of the most shameless and nakedly opportunistic public hypocrites in recent political history. He refuses to let any of his prior accomplishments or stated positions stand in the way of his quest for the Republican nomination, to the point where if you presented him with a poll suggesting that the majority of Republican primary voters believed that electricity was witchcraft, then within the week there would be a Romney-authored op-ed in the Washington Post decrying the homosexual lightbulb agenda. This sort of desperate pandering is not what we would typically characterize as virtuous behavior. But there is something rather comforting about it. It’s almost quaint.

Maybe it would seem a little spookier if Romney didn’t have an actual governing record to accompany his sweaty dance of Tea Party seduction. But he does, and it’s not so bad. It suggests a competent administrator who realizes that rational self-interest demands his constituents be relatively healthy, happy, and secure. Richard Nixon was more or less the same way: the low-key, pragmatic father of the EPA and detente with China when in office, but a dirty trickster and stoker of racial resentment when election season loomed.

Granted, “Nixonian” generally isn’t used in the complimentary sense. But imagine a presidential candidate who espoused Romney’s currently stated views and actually acted on them once in office. Suddenly, no principles starts to look preferable to bad principles.

That, I think, is the major thing we can learn about public virtue from Romney’s candidacy. When it comes to politics, most of us take Walter’s position in The Big Lebowski: “I mean, say what you like about the tenets of National Socialism, Dude, at least it’s an ethos.” But in a democratic system, where rational self-interest usually demands at least some deference to the interests of the voting public, savvy egoism is usually preferable to destructive principles. Of course, it should go without saying that committed (though not uncritical) attachment to well-reasoned principles trumps both nihilism and insanity. But when that’s too much to ask, I’ll go with public hypocrisy over “at least it’s an ethos.”

That said, there are cases where a remarkable lack of hypocrisy about one’s crazy, backwards beliefs can be a good thing. Example A: Our troglodytic friend, Rep. Louie Gohmert, arguing against the repeal of Don’t Ask, Don’t Tell:

No question, Gohmert is a homophobe. He is not, however, a hypocrite. Were he a hypocrite, he would have done one of two things:

Decided that it was in his own interests to vote to repeal Don’t Ask, Don’t Tell, regardless of his personal contempt for LGBT people. (Option A)

Continued to oppose DADT, but never revealed his real reasons for doing so. (Option B)

For sure, Option A, despite being hypocritical, would nonetheless be morally superior to the path he wound up taking. But Option B? In that case, he’s no longer adjusting his stated policy preferences out of political expediency. Instead, he’s just making dishonest arguments on behalf of those policy preferences. That’s a very different kind of public hypocrisy, one that we might well call McCainism. And I would argue that it’s the bad kind of public hypocrisy. Whereas Option A puts a political figure’s weak convictions — or total lack of conviction — in the service of a greater good, Option B is just another version of the same evil compounded by further dishonesty.

So let’s be glad that the Republican Party has both its Romneys and its Gohmerts. The Romneys harness the insanity that already exists in a way that, compared to how your Palins and Angles roll, looks relatively responsible. And when the rest of the party tries to veil the true intentions behind an undeniably unjust agenda by raising nonsensical procedural concerns, the Gohmerts are there to expose the party’s real reasoning to the light of day.


I caught an advance screening of Joel and Ethan Coen’s latest on Tuesday. I wouldn’t say it’s one of the essential entries in the Coen Brothers library, but that’s really just a way of saying it was merely excellent instead of a classic. Of the two revisionist Westerns they’ve made so far, I suspect No Country For Old Men will be the one that lives on.

Partly that’s because True Grit is much more of a movie than No Country For Old Men. The latter film wholly committed to the Coen Brothers sensibility at its most dour, eschewing the comfortable rhythm and pleas for sympathy that characterize most Hollywood films in favor of unpredictability and clinical detachment from its characters. True Grit has a bit of that, particularly in its first act, but it remains the closest thing to an audience-friendly, mainstream feature that the brothers have produced in years.

That said, it bears little resemblance to the generic revenge thriller its marketing campaign promises. As I wrote over on the Ms. blog, the trailers and TV spots downplay the fact that this is very far from an ensemble film: while Jeff Bridges, Matt Damon and Josh Brolin all turn in fine performances (though Brolin’s is more of a cameo), fourteen-year-old Mattie Ross is the undisputed center of the movie.

While the movie around her isn’t among the Coen Brothers’ best, Mattie is one of their more memorable characters. I don’t think they can claim much of the credit, though. Most of it goes to Charles Portis (author of the novel from which both this movie and the 1969 feature of the same name were adapted) and, of course, the phenomenal Hailee Steinfeld, seen here in her feature debut. While Steinfeld is physically dwarfed by the supporting cast, she has a commanding presence to compete with all of them. What really sells her in the role are her terrifically expressive eyes: cold and appraising of everything around her at the start, but gradually widening and softening as she realizes the danger she’s placed herself in. Very few child actors could convincingly convey that sense of toughness, much less the subtle transition in those moments where she lets her guard down.

That’s especially crucial, because the Coens do something with Mattie that they do with very few of their characters: they get the audience emotionally invested. Detractors of the Coen Brothers often complain about their evident contempt for their characters, which is certainly fair in some cases (re: Burn After Reading, though there I would argue the contempt is well-earned and part of what makes the movie great), but more often just a misdiagnosis of the clinical detachment I brought up earlier. Either way, it’s exceedingly rare that they let an audience get close to a character like they do here. It makes the film feel uncharacteristically warm, especially in the relationship between Mattie and Bridges’ Marshal Cogburn.

That’s one of the pleasant surprises afforded by the movie. The other one is just how funny it is. This is no bleak slog like No Country, and while the humor is sometimes gallows-flavored (especially in one scene that takes place at some gallows), most of the laughs stem from Raising Arizona-style gentle tweaking. Damon’s Ranger La Boeuf takes the brunt of this, and Damon gamely commits to both the character’s gallantry and his vanity in equal measure.

But while the Coen Brothers are gentler here than in, say, Miller’s Crossing, this is still a fairly grim movie at heart. And while it will probably be more palatable for the general moviegoing audience than a lot of the Coen Brothers library, that’s not really saying much. Warmth isn’t the same thing as sentimentality, and this movie — especially the ending — is unsentimental to the point of brutality. Thank god for that, too. This movie may be the Coen Brothers’ love letter to American Westerns, but they never let genre convention get in the way of its raw emotional core.


I don’t think you need to accept all of the premises of John Quiggin’s argument to feel profoundly unsettled by the conclusion. So while I’m agnostic on President Obama’s reelection chances (two years is a long time and his poll numbers aren’t terrible right now), and profoundly skeptical that America will ever see a President Palin (because of, god, just take your pick), I do think this part is hard to argue with if you take out “Palin” and swap in whatever political flavor of the month strikes your fancy:

Starting with the worst case, how bad would a Palin Administration be? In policy terms, it would obviously be terrible, but I’m more concerned about the prospect of Palin inheriting the monarchical powers amassed for the Presidency by the Bush and Obama Administrations. These include:

Powers conferred by legislation under the PATRIOT Act, Military Commissions Act and so on. These would surely be greatly strengthened by a Republican Congress under Palin

Powers claimed by Bush and Obama (for example, the power to direct the assassination of any person deemed to be a supporter of terrorism) with no specific legislative capacity

The power, with no legal basis, to pressure corporations into taking actions against real or putative enemies of the state (wiretapping, withdrawing services from Wikileaks)

The Bush-Obama precedent under which admittedly criminal actions taken by the President (such as ordering torture) do not give rise to any prosecution or right of redress

The main saving grace under Bush and Obama has been the fact that most of these powers have been used fairly sparingly, and never (AFAIK) against ‘mainstream’ political opponents of the Administration. I can’t see Palin accepting any such constraint. Given the starting position, four years of unfettered power for Palin would be enough to move the US a long way in the direction taken by Russia under Putin, with a compliant media, an oligarchical ruling class subject to rapid reprisals for any display of political independence, and dissidents subject to all kinds of harassment up to and including assassination.

Look, we already have reason to suspect that these powers have already been wielded against mainstream political opponents. You could certainly make the case that the leaking of Valerie Plame’s identity fits into that pattern, though that wasn’t so much an abuse of formally granted power as the leveraging of privileged information for political gain. Similarly, in 2006 a credible mainstream journalist claimed that he was the target of NSA wiretapping. And while Julian Assange is not a mainstream journalist by any measure, prosecuting him under the Espionage Act as it currently exists would set a damn troubling precedent.

Realistically, I don’t think these powers are going away. Neither major political party even pays lip service to civil liberties anymore, and even if one of them did, the legislature’s ability to keep the executive in check is basically shot to shit. The Senate has hobbled itself to the point where the executive branch might assume even more of the legislature’s traditional roles through various agencies simply because that’s the only way anything will get done.

The creeping normalization of abuse of power has been going on for a while now, but if I had to hazard a particularly dour guess, I’d say that it’s going to reach a watershed moment in the next decade or so. Look at the horrendous overreaction to Wikileaks. Think about what could happen if the economy stagnates for a few more years, and unemployment stays at insane levels. Combine that with America’s declining international influence and the global instability we could see once climate change starts to affect everyone’s access to basic resources. In a world like that, whoever the president is at the time could see knuckling down as his or her only option.

And who’s to stop it? We’ve been fine with the trend for a while now, as long as the effects were more or less invisible to those who hold most of this country’s political and economic power. It turns out that one of the things that makes liberal democracy so fragile is that many, many voters will still cling to freakily illiberal views. When you get down to it, the democratic spirit is a psychological aberration. Nothing erodes it faster than the most basic human instincts.

Alright, enough with the doom, gloom, the rambling, and the grandiose pronouncements. I realize I’m young enough that I don’t have the historical perspective to make a true judgment call on this. America has seen greater abuses of power than the ones we’re looking at right now — way greater. Jim Crow laws are dead and gone. The Sedition Act is barely a memory. There are reasons to not be cynical.

Still, the sunniest thing I can say right now is that Quiggin’s prediction strikes me as a genuine possibility. We’re in a fragile position here. I think most of the time we take great efforts to ignore how fragile. And of course, genuine crises like the kind Quiggin describes are always inconceivable, until we decide, with the great wisdom of hindsight, that they were inevitable.


One of the strange things about blogging as a medium for writing is the pressure it puts on content production. Nobody judges an author by how many books she manages to put out in a five-year span, and the newspaper columnist operates on a fixed schedule of one or two pieces a week. But when it comes to blogging, the more you update, the better. The goal is to make people check your front page at least once a day, and preferably several times throughout the day.

Up until this week, I observed a regimen of 5-6 posts per week, minimum, except when extenuating circumstances made that too much of a hassle. In college, that wasn’t so hard. I was a liberal arts major (a vocation that largely consists of looking rumpled and overworked but retreating guiltily when an actual overworked person from the business school or biology department walks by). Plus, my interests were pretty standard precocious political junkie nonsense. Looking back, it’s amazing how entertained I could keep myself just writing dozens of permutations of the same post about what a terror Sarah Palin is. Especially when I was producing most of NYU Local‘s election 2008 coverage, I could chase after every single campaign story the mainstream media presented as politically significant and never get bored. When boredom became so much as visible on the horizon, I could just do posts kvetching about why something wasn’t really news, and that would hold me over.

The trick of maintaining that level and style of content production lies in being fairly predictable — if not predictable to others, then predictable to yourself. I could dash off three NYU Local posts in about an hour because I already had the arguments for each one laid out in my head. They were there before the news I was ostensibly reacting to even occurred, and I was just waiting for events to go with them.

Writing like that doesn’t interest me anymore. It strikes me as lazy and uninteresting. Plus, as I’ve learned a little bit more about policy, history, and political science, I’ve come to comprehend what a vanishingly small percentage of the things I used to write about were significant events. It’s hard to get all that jazzed about dueling campaign narratives when the state of the economy and name of the incumbent party trump all of that. But you can’t just write, “The economy sucks and the president is a Republican, so Obama is going to win” each and every day, so I helped manufacture white noise.

There are other ways to write about these things. If you’re an expert in a particular policy area, you can cover that. (I am not, sadly, though I’m trying to educate myself on American foreign policy and civil liberties.) Failing that, you need the time and patience to grapple with a challenging subject and let an argument unfold. Now that I’m pretending to be a grown-up (which involves not just holding down a grown-up job but doing other strange, alien grown-up things, like exercising and occasionally cooking for myself), I have less time for that than ever. I can’t do one post a day if I’m also going to do the homework needed to form an intelligent opinion on the subject of that post. (Incidentally, this is something I wish I had realized a couple of months earlier.)

More to the point, I can’t bang out a post a day without having that entire post laid out in my head from beginning to end. Writing like that saps all the joy out of the process. There’s no discovery involved, self- or otherwise. The process of actually getting your thoughts down becomes wholly performative. I’d like to get away from the whole genre of blogging as delivery of a prepared statement.

So in that spirit, this blog is going to be updated a lot less frequently. I’ll try to keep it above once a week and hope that I don’t hemorrhage too much readership. The quality of the writing will go up, though, I think. It’s going to become a little more essayistic, like this post, though I promise that this isn’t going to become a blog solely dedicated to navel-gazing on the process of blogging. The subjects will vary, but the overall mission is to make this blog more about writing and less about evacuating.

We’ll see how that goes. But even if I still suck at this, at least I’ll only suck at it two or three times a week, tops.