Friday, November 30, 2012

The morality of withdrawal from social and political activism had much concerned Merton, who had made the decision to move from activist to contemplative:

Withdrawal from other men can be a special form of love for them. It should never be a rejection of man or of his society. But it may well be a quiet and humble refusal to accept the myths and fictions with which the social life cannot help but be full—especially today. To despair of the illusions and façades which man builds around himself is certainly not to despair of man. On the contrary, it may be a sign of love and hope. For when we love someone we refuse to tolerate what destroys and maims his personality. If we love mankind, can we blind ourselves to mankind's predicament? You will say: we must do something about his predicament. But there are some whose vocation it is to realize that they, at least, cannot help in any overt social way. Their contribution is a mute witness, a secret and even invisible expression of love which takes the form of their own option for solitude in preference to the acceptance of social fictions. For is not our involvement in fiction, particularly in political and demagogic fiction, an implicit confession that we despair of man, and even of God?

Sounds like "scratch a pessimist, find a disappointed idealist," or "withdrawal in disgust is not the same as apathy." It's a pretty rationalization, but, nah. I'd go with Bartleby's "I would prefer not to."

I've made fun of Thoreau's popular image a couple times, but Peter France, in his book Hermits, excerpts a passage of his with which I can nod along in agreement:

I would not have anybody adopt my mode of living on any account; for, besides that before he has fairly learned it I may have found out another for myself, I desire that there may be as many different persons in the world as possible; but I would have each one be very careful to find out and pursue his own way and not his father's or his mother's or his neighbor's instead.

Of course, this could easily read as a generic exhortation to the sort of trivial individuality expressed through, say, different brands of consumer products. I mean, look, people, by definition, you can't all be regular hikers along Frost's road-less-travelled-by. But I prefer to interpret such statements as a fundamental challenge to Kant's categorical imperative, which strikes me as the Golden Rule on steroids; that is, as the deeply ingrained urge to proselytize for one's own preferences, to see differences of opinion as conflicts in need of resolution, to always seek a common denominator.

I'm not talking about cultivating contrarian rebelliousness or aggressively antisocial behavior, though. My ideal is rather a sort of self-contained, laodicean lack of concern for keeping up with and being accepted by the group. A hermit in my own head.

What should believers do if they discover that their belief is getting in the way of their proper connection to God? Would they be prepared to sacrifice their faith for their faith? For the true believer, God is always a mysterious supplement, present in life but never completely known, always in essence just beyond the ability of the mind to grasp. But for a true atheist, this is even more profoundly true: the atheist embraces the mysterious Otherness of God much more wholeheartedly than the believer does. To the point, indeed, of Othering God from existence itself. For a long, long time Christianity has been about an unironic, literal belief in the Trinity. It has lost touch with its everythingness and its difference and its novelty. Disbelief restores that.

Ahem. I do believe this epistemological phenomenon has already been born and named!

Thursday, November 29, 2012

One needn’t be a liberal to be a grammar fanatic. Nor need one be a grammar fanatic to be liberal. It is easy to imagine a world in which the two are independent, orthogonal, like rolling a die and then rolling it again. Indeed, this world is probably our own.

And that is what’s so bizarre. Language use varies along the same factors—class, race, ethnicity, region, age, gender—as other cultural practices and markers. Yet, many self-proclaimed liberals disparage others’ language with a fervor they’d never use to criticize the same people’s music, clothing, diet, spending habits, or parenting techniques.

This again? Okay. I can accept that such people exist somewhere, sure. However, anyone who has spent any time at all reading the progressive blogosphere will have seen a plethora of examples of "self-proclaimed liberals" criticizing every one of those things. Hell, I think the only stories I saw last week about the Hostess bankruptcy happened to be written by discerning authors informing us that they would never pollute the sacred temple of their bodies with such cheap artificial corporate garbage junk food blarg blarg blarg. Freddie deBoer has produced thousands of words examining the phenomenon of progressives who construct rigid caste systems of taste and cultural capital. Suffice it to say, if these fascist-in-grammar, relativist-in-everything-else liberals didn't exist, a lazy writer with several hundred words needed for a deadline would have to invent them.

As to her argument that insisting on rules and consistency in spelling and grammar is just insensitive linguistic imperialism, I can only quote George Carlin:

Yeah, I know you say, "Well many people are using it that way, so the meaning is changing." And I say, "Well, many people are really fuckin’ stupid, too; shall we just adopt all their standards?"

I'm not an educated fellow. I won't pretend to be familiar with the standards of composition that English undergrads study, let alone all sorts of arcane grammatical rules. But I honestly would consider it an insult, however unintentional, for someone to present a piece of writing to me, whether in the form of personal correspondence or a public post, that looked like a careless mess of misspellings and nonexistent punctuation. I mean, if I'm talking to a stranger, someone with whom I have no expectation of intuitive understanding, I go out of my way to speak clearly and explain any necessary background for their understanding. Why should it be any different in print? If you don't take your own thoughts seriously enough to make every effort to be clear about them, don't expect me to take them seriously enough to do the extra work of decoding them.

I do still cling to a naïve belief that professionals and other people who are responsible for communicating to the public should be nearly flawless in their spelling and grammar. But the shit that gets presented to me for revision in my copywriting gig is slowly eroding that idealism. Good lord, you would not believe how illiterate some of these people can sound.

Wednesday, November 28, 2012

Still, “Beware of Mr. Baker” invites you to listen again, and to attend to the rhythmic power and complexity that this drummer brought to the group’s thunderous (and often ponderous) variations on the rhythm and blues playbook.

Mr. Baker was a rocker, in a sense, by accident of birth and association. If you were young, musical and British in the 1960s, rock ’n’ roll was an irresistible career path, and Mr. Baker certainly, at least for a while, lived out the rock star legend to its fullest. But he was by taste and temperament more of a jazzman, captivated at an early age by African polyrhythms and the expansive approach of American drummers like Max Roach and Elvin Jones.

Rather than keeping the beat, Mr. Baker opened it up, adding layers and nuances without sacrificing his innate, unerring sense of time. He was wilder than steady rhythm players like Charlie Watts, and also far more disciplined and subtle than showboating wild men like Moon and Bonham.

My favorite record on which he played is, far and away, Sunrise on the Sufferbus by Masters of Reality. I first heard it on March 22nd, 1994; the date is fixed in my memory by virtue of the immediate impression it made on me as I stood there at the listening center where you could bring any CD in the store to hear before buying. It only took me the first three songs to consider this one a keeper. The jazzy drum fills, which did so much to set the unique mood of that record, reminded me of Bill Ward's work in Black Sabbath (perhaps unsurprising given the reverential nod toward Sabbath inherent in the band's name). You kids should check it out on the Spotify or the Eyetoons or whatever it is you do these days in lieu of listening to CDs.

Over my head, I see the bronze butterfly,
Asleep on the black trunk,
Blowing like a leaf in green shadow.
Down the ravine behind the empty house,
The cowbells follow one another
Into the distances of the afternoon.
To my right,
In a field of sunlight between two pines,
The droppings of last year’s horses
Blaze up into golden stones.
I lean back, as the evening darkens and comes on.
A chicken hawk floats over, looking for home.
I have wasted my life.

Arthur sent this to me this morning (I admit, I'm not familiar with the commercial):

Meanwhile, inspired in part by your comments about autumn and the neglected month of November, I came up with something of my own.

You are significantly younger than I am, so significantly that you may never have seen the TV commercial with the slogan "Pepperidge Farm remembers," said in a suitably old-timey New England voice with a faint gerontological whistle that makes the final consonant a sibilant.

Pepperidge Farm

How many have passed, these crisp and cold Novembers?
When you are young and shiny as any penny
How can you know? You’ve yet to learn how many.
But Pepperidge Farm remembers.

How many times have poets rhymed “Decembers”
With “embers”? Many, far too many times!
Go out into the fields and reap the rhymes
That Pepperidge Farm remembers,

You’ll find slant rhymes among them, umbers, ambers,
Off-colored, dun or dusky, fulvous sheaves,
And more among the leaves, among the graves.
Christ! Pepperidge Farm remembers.

Monday, November 26, 2012

I was gonna go off on one of my rants here, about the shriveled little bags of hate and resentment in the comments here showing what unique sophisticates they are by complaining that a website reviewed the finale of a very popular and beloved show, despite the fact that they probably sit at said website hitting refresh over and over so that they can momentarily distract themselves from their drab, empty lives of misdirected self-hatred and congealed defensive apathy as they wait for the inevitable moment when they realize that they've gone grey and gotten varicose veins and lost the last shreds of their youth without ever once bothering to feel an unguarded emotion, so consumed were they with the fear that some fuckface will make fun of them for actually trying and in so doing cost them a social currency that was absolutely worthless to them in the first place, and sob quietly in the night for having thrown away the only gift life gives us, the gift of being emotional creatures capable of experiencing feelings so intense that they can ruin or save your life, if you only leave your beating human heart open enough to risk the pain that is the price of everything and anything truly worth experiencing.

Some friends and I once had a fun contest to see who could write the best one-sentence horror story. My blue-ribbon submission, a story about a transvestite prostitute serial killer who favored a pizza cutter as a murder weapon, told from the perspective of the john/victim, seems weak in comparison to this.

Coen said it didn’t matter that those targeted in the articles were minors.

“So they need to learn, everybody needs to learn here that there’s no divide between real life and online. What you say online is just as important as what you say in real life. So I don’t think it matters what actually happens to the kids at the schools per se. We’re not acting as judges or juries. Our responsibility is just to the story.”

I'm so old, I remember when it was conservatives who were in favor of expanding the scope and severity of punishments for minors. That's cute, though, the way she invokes journalistic neutrality when convenient. Does working at Gawker Media make you a soulless pageview whore, or does the job itself just naturally select for such people? Kind of a "chicken and the egg" argument.

Saturday, November 24, 2012

I am growing increasingly convinced that people who believe we have an absolute moral duty to see to the well-being of all other human beings, to install water-purifying equipment in villages on the other side of the world, etc., and who, at the same time, happily contribute to the ongoing mass slaughter of animals, are really just picking and choosing their causes. There simply is no compelling reason why I, or anyone, should suppose that all and only human beings are the worthy targets of moral concern. This is not to say that you should care about animals. It is only to say that there is nothing natural or obvious or conclusive about your belief that you should care about all and only human beings. Your belief is a prejudice, characteristic of a time and place, and not the final say about where the reach of moral community ends.

I completely agree, and this is why I would never identify as a humanist.

I feel like our need to slot things into established categories may be one of the deepest problems we have to uproot as human beings pursuing the dharma. It’s a survival skill we need, this categorizing of things. It’s what keeps us alive. If I’m walking down the street in a foreign country at night, as I often am these days, I have to watch people closely. If a group of guys is giving off signals I read as potentially dangerous, I walk down another street.

Maybe they’re perfectly nice people. Maybe they’re just excited about a football match they just saw. But I have no way to know. So I categorize quickly and act accordingly. This is what we all do all the time.

But we also have a tendency to go too far with this. Or to believe that the categories in which we place things are true or absolute. That becomes a problem if our aim is to see all of life just as it is.

This is all true from a descriptive standpoint. Is that really what we want, though, that God's-eye view? What if achieving such a perspective destroyed our ability to live and flourish as human beings? Do we want to transcend humanity itself, or do we just want to become as wise as possible within our human limitations? Is "pursuing the dharma" a goal to be reached, or is it a cyclical, Sisyphean process to be undertaken for its own sake, with no thought of ultimate resolution?

Of course, I'm pretty sure I know what he means when he says that. But that's the thing with labels, signs, and heuristics: they often signify different things to different people, or they suggest different emphases that can widen into huge gulfs of understanding. And it usually requires someone else, with an outside perspective, to point out details that got overlooked or underexplained.

It's terribly inconvenient, this impulse for truth. The human brain is designed to delegate as much activity as possible to the jurisdiction of the unconscious and instinctive. Biologically speaking, truth-seeking for its own sake is a luxury, and not necessarily a psychologically healthy one. In other words, it's not just a bad habit when we quickly scan a few superficial characteristics before assigning a person to an existing type and interacting with them accordingly. It's actually a fundamental quality of humanity, "just as it is". Choosing to arrest this process in order to sift through the neverending stream of information more consciously and carefully may require exempting oneself from much social activity, which is probably why it's best left to Zen monks, hermits and other outliers.

Sunday, November 18, 2012

Whilst Buddhism is not violent in and of itself, as a lived tradition it can lend itself to dark and deadly uses. There are Buddhist dimensions to the Thai state’s violent struggle to control the country’s far South. To make sense of insurgent violence and the response to it, we have to understand the intricate interdependencies and interconnections among “race”, rule and religion in Thailand. To that end, Buddhist Fury examines “the role of Thai Buddhist monks in a religio-political conflict” (p. 5): the impact of violence on Buddhist monks and the ways in which, as actors in their own right, those monks have an effect on the ongoing violence. Its author asks whether the practices and habits of Buddhist monks in a violent environment exacerbate or ameliorate violence.

...Buddhism as a “lived tradition”, outside Western idealized “Platonic” representations, is diverse, fluid and contradictory. Buddhist truth and traditions are not universal and eternal but are rather enmeshed with particular interests and power relations. How, then, is the non-violent image of Buddhism maintained? Such an image is achieved, argues Jerryson, because its practitioners and its analysts create a fantasy version of Buddhism: “fictitious people and practice—virtual religious models, morally airbrushed to enhance the message” (p. 185). The problem with this mythic Buddhism is that its dark side is ignored: extreme phenomena such as monks with guns, “soldier-monks”, militarized temples, Buddhist militia. In Thailand, religious nationalism legitimates violence, offensive and defensive, against the enemies of “nation, religion and king”. Monks as spiritual exemplars are credited with the power to purify and order hearts and minds and social relations, as Christine Gray has argued.

Speaking of the ways in which supposedly transcendent ideals end up subordinate to more prosaic concerns, I have to admit being disappointed to see identity politics being asserted within one of the few philosophical worldviews aimed at dissipating the illusion of identity:

"Buddhism goes against identity. Race is a very superficial way of looking at things," he said. "Hopefully at some point the (people of color) will be relaxed enough within their humanity to be able to come into a greater room full of people and feel that same degree of relaxation, but that's a stage of development and that can't be pushed or forced upon them. And at some point they do, like Tuere, she just naturally started to come [to the broader meditation groups]. But it may take long."

...The goal of "more diverse dharma," as Smith calls it, has proliferated across the nation in recent years. Race is just one factor, though the most easily seen in many cases. In places such as New York and the San Francisco Bay Area, though, diversity has become an ever wider effort. At the East Bay Meditation Center in Oakland, Calif., there are Buddhist groups for gay, lesbian, bisexual and transgender meditators, people with disabilities and those with allergies to perfumes. In New Mexico and Arizona, Buddhists and Native Americans have joined to launch meditation centers that combine teachings from both traditions and include traditional Native healing rituals. In western Massachusetts, meditation communities have formed "diversity councils" to recruit minority practitioners. In Atlanta, meditators thought separate meditation groups were too divisive, so they launched a broad campaign against all "the 'isms."

I've long been aware that the form of Buddhism that interested me as a teenager wasn't necessarily representative of the religion in its many forms around the world, which is why I've never bothered identifying as a Buddhist. When the term, the label, the signifier gets fixed in place like that, it just becomes a strong magnet for misunderstandings and useless distractions. Is my "Buddhism" too white, middle-class, logocentric, etc.? Fine, then, I'm not a Buddhist. You can keep that shriveled husk of a descriptor for yourself. (I'll just start calling myself a panta rheist for the time being.)

New research continues to emphasize the importance of mind wandering for learning. It turns out that not paying attention is one of the best ways of discovering new ideas. Reading books, whether silently or aloud, remains one of the most efficient means of enabling such errant thinking. As our bodies rest, our minds begin to work in a different way. New connections, new pathways, and sharp turns are being made as we meander our way through the book, but also away from it. There is no way to tell if anyone is actually paying attention anymore as I read, including myself. This seems to be one of the great benefits of reading aloud, that you can think of something else while you do it. We may be holding the book together, but our minds are no doubt far apart by now. The fairy tale is the first story of childhood because it tells of such leaving behind (parents and home), of entering the dreamscape of the woods—and the mind. It tells of the crooked path of change. How can one know where reading books ends and dreaming in books begins?

The point of the new reading technologies, it often seems, is to avoid deep immersion, precisely because it's an activity the crowd can't influence or control and thus a violation of the iron rule of digital existence: Never be alone. Deep, private reading and thought have begun to feel subversive. A decade ago, the digital space was heralded for the endless opportunities it offered for individual expression. The question now is how truly individual — as in bold, original, unique — you can be if you never step back from the crowd. When we think and write from within our busyness, surrounded by countless other voices, too often the result is reactive, derivative, short-shelf-life stuff.

People of the book, such as I, not only believe that the replacement of the page by the screen will alter human character, thin it out, empty it of depth, but secretly hope this happens. A deterioration in human character consequent upon the demise of the book will be, for the inveterate reader, an apologia pro vita sua. For we who have spent so much of our lives with, and even for books secretly derived a sense of moral superiority from having done so. This is obvious from the fact that no one says “Young people nowadays do not read” in a tone other than of lament or, more usually, moral condemnation. A person who does not read—and for us reading means books—is a mental barbarian, a man who, wittingly or unwittingly, confines himself to his own experience, necessarily an infinitesimal proportion of all possible experiences. He is not only a barbarian, but an egotist.

...Whether the book survives or not, I am firmly of the opinion that it ought to survive, and nothing will convince me otherwise. The heart has its beliefs that evidence knows not of. For me, to browse in a bookshop, especially a second-hand one, will forever be superior to browsing on the internet precisely because chance plays a much larger part in it. There are few greater delights than entirely by chance to come across something not only fascinating in itself, but that establishes a quite unexpected connection with something else. The imagination is stimulated in a way that the more logical connections of the Internet cannot match; the Internet will make people literal-minded.

Saturday, November 17, 2012

The skreak and skritter of evening gone
And grackles gone and sorrows of the sun,
The sorrows of sun, too, gone . . . the moon and moon,
The yellow moon of words about the nightingale
In measureless measures, not a bird for me
But the name of a bird and the name of a nameless air
I have never–shall never hear. And yet beneath

The stillness of everything gone, and being still,
Being and sitting still, something resides,
Some skreaking and skrittering residuum,
And grates these evasions of the nightingale
Though I have never–shall never hear that bird.
And the stillness is in the key, all of it is,
The stillness is all in the key of that desolate sound.

Friday, November 16, 2012

Public shaming is an integral part of our criminal justice system, although its prominence rises and falls periodically. Many cities have posted the names of drug offenders, deadbeat dads, or public urinators on billboards. Some have required people convicted of drunk driving to affix fluorescent license plates to their cars once they start driving again. Kansas City experimented with broadcasting on a government-owned television channel the names and addresses of men arrested for solicitation. The “perp walk” is a pre-conviction public shaming ritual. Individual judges have ordered offenders to wear signs and shirts, or go door-to-door apologizing to victims of their crimes. Legal challenges to such shaming sanctions typically fail.

With all the evidence that stigma is powerful and dangerous and the historical record showing how it has been put to such bad uses in the past, the law should be careful about invoking humankind's emotions. Yet there are legal scholars, lawyers and judges who think stigma is a fine tool for the legal system to use. They are all for "shame punishments" like chain gangs, prison cams, and license plates that tell the driver's crime to the world.

A better argument is that stigma — as a historical phenomenon and as a psychological and medical experience — is far too dangerous to invoke. Stigma is, as Martha Nussbaum of the University of Chicago elegantly points out, inherently counter to the spirit of law because it acts on irrational, unconscious parts of the mind. An understanding of how humankind's psychology works shows why shame punishments are a terrible idea. These are devices that the law should not use.

On a related note, the perils of instapunditry. Maybe it's a cognitive bias on my part, but I haven't noticed anywhere near the same number of bloggers who originally hurried to make some sort of creepy-Elmo joke correcting their casual condemnation of the poor guy, by name, as a pedophile. You know, as before, perhaps it wouldn't fucking kill us to wait just a goddamned minute before using the power of the Internet to facilitate mass shaming and mob justice.

Wednesday, November 14, 2012

It seems, then, that Posnock wanted another book, something more prescriptive and assertive. But a presentist study, of that type anyway, was never the goal of American Nietzsche. Ratner-Rosenhagen's historicism could never satisfy Posnock's not-so-rhetorical question: "Who got Nietzsche right?" The book asserts, in essence, that this question can't be answered. Nietzsche's thought was plastic; it could be transformed in the heat of one's passions and imagination. Nietzsche's writings are too vague to give solid ground, to provide transcendence. This will never satisfy philosophers, historians, and earnest readers who seek ultimate truths. Then again, mere historical thinking never really satisfies those who put history solely in the service of the present.

As always, I don't see the need to vex you good people with my maladroit musings when I can simply point to the already-existing thoughts of smarter people and better writers, both types of which happen to be contained, conveniently enough, in the singular form of my friend Arthur. This is from a recent email correspondence with him on a similar topic (the Bloom being referenced is Allan Bloom, author of The Closing of the American Mind):

As for relativism, Nietzsche, at least, does not take value to be completely arbitrary (undifferentiated and neutral with respect to its pragmatic effects), and unlike many who claim him (paradoxically!) as an authority (doesn't Zarathustra say "Only when you deny me will I return to you"?), he takes the breakdown of absolutism as a call to create values, to choose one value as better than another (healthier, more life-enhancing) without a metaphysical safety net. This is not the same thing as cultural relativism; it represents a change from a concept of value as something pre-established and discovered to something (quasi-artistically) created or invented. To defend cultural relativism is, again, a self-contradiction, because it elevates that value above its opposite. It creates a value without admitting or being aware that this is what it is doing. But Nietzsche is quite clear on this subject: "Man is the value-making animal." We are always making values and value judgments whether we know it or not. We can't help doing so, though we can deceive ourselves infinitely as to what we are doing when we do so.

This brings me to the problem of Nietzsche, the problem of using his writings to justify a moral, political, or philosophical doctrine of any kind. The problem is that there is no Nietzsche; there are only Nietzsches. What makes his thought (if that's the word for it) a double-edged sword is that it bares its own contradictions and doesn't try to synthesize or in any way gloss over them. Bloom decries the fact that we Americans are such happy-camper nihilists who, unlike Nietzsche (he claims), aren't terrified by the abyss staring back at us: we photoshop a have-a-nice-day face on the monster. But the Nietzsche who is oppressed and horrified by the abyss of nihilism is just one Nietzsche. The other Nietzsche (or one of the other Nietzsches) celebrates the noon-tide of affirmation instead of lamenting the midnight of nihilism. The closest he comes to squaring this circle or resolving this contradiction between the negative and positive aspects of the post-metaphysical condition is to picture the resolution poetically as Zarathustra biting off the head of the serpent that has grotesquely lodged itself in his mouth.

If you dismiss Nietzsche as a philosopher because he seems to revel in contradiction, if you dismiss him as a brilliant manic-depressive who passes off his mood-swings as prophetic insight or philosophical doctrine, you are probably missing the point. Nietzsche is fully aware that his contradictions are unresolved, he refuses on philosophical principle to resolve them philosophically. This makes him a kind of intellectual anarchist more closely related to the Emerson he greatly admired than to the Heidegger who proved himself a charlatan by writing a ponderous two-volume tome "explicating" the philosophical doctrines of Nietzsche. Nietzsche always places his thought at the boundary of a system, the point where in Goedel's logic the system breaks down into either incompleteness or self-contradiction. He is the truth-telling Cretan Liar. As Deleuze says, the systems of Marx and Freud lived and died by the double-edged sword: their ambition was to found institutions, and they thus became just two more lawgivers in a series of lawgivers. History has passed them by. Nietzsche's thought remains vital by the same token that it remains perhaps just barely coherent enough to be useful at all. It is almost pure anti-institutional, open-ended experimentation opposed, finally to all institutions, all laws, all codes.

Any attempt to reduce Nietzsche to a set of philosophical doctrines also runs up against the question of literary style, or rather, literary styles. No one in the history of philosophy since Plato wrote as poetically as Nietzsche, and this is fitting, since Plato (and the half-literary-character Socrates) is his arch-enemy: one measures one's strength by the strength of one's foes. "Plato is boring." Who but Nietzsche would have the nerve to say something so baldly and briefly? It is a kind of insider's joke to those alert to the paramount importance in Nietzsche of style. What it means, unpacked, is that Plato went on a bit, was too blandly urbane and concatenated. He should have written in aphorisms, like Heraclitus. The point of the aphorism is in how briefly and pointedly it says what it says. Implicitly it models the truth or things, the look of things, as aphorism-like: punctual, untimely, nervous, leptic, indifferent to discursive explication, reveling in paradox.

But the aphorism is just one of his stylistic tricks: there is also Biblical parody, dialogue, essay, confession, verse (The Gay Science), the foreword and afterword as modes of self-revision keeping the growing, transformative and provisional edge of thought exposed to the air and alive. In other words, the rapid and unpredictable changes in his styles are at least as important as any individual style he employs at any given moment. In this regard Nietzsche is in the Cynic tradition of serio-comic meta-philosophy, of writing that mocks the premises and personalities of philosophers. Lucian's mock-philosophical writings are vital precursors of Nietzsche's agile, mercurial stance. His seriocomic play of styles internalizes Aristophanic comedy (think of The Clouds and its parody of Socrates) as a kind of philosophical method. The method is in part to foreground how the medium straddles the message. The "content" cannot be separated from the form.

From this point of view Bloom's lamentation over the fact that Americans don't lament enough over the epistemological abyss is not so much false to Nietzsche as one-sided in emphasizing just one among his many aspects, stances, or, most subversively, perhaps, his philosophical moods.

Jebus, what blithering tripe, what pious inanities. This is only the latest atrocity. Fuck the Catholic church. Empty every pew, loot every coffer, disband every level of the hierarchy, take all their property and turn it over to secular authorities to be managed ethically and rationally.

Tuesday, November 13, 2012

So, what do we do? Do we "hold our nose and bear it?" Do we dismiss the elementary school child from the class, leaving her to learn her reading, writing and arithmetic at home? Do we move upwind, if such a thing as upwind exists in a classroom? Do we confront the student? The grandmother? I've a better solution. We grip it. We wallow in it.

While I was living in San Francisco in the 1990s, the printing company for which I worked produced a book titled smell this. Produced by Women of Color in Coalition at the Center for Racial Education in Berkeley, smell this attempted to build a sisterhood for disenfranchised women. The editor offered up musings on her own scent as an apologia for the natural scents of some women, especially women of color. Embracing natural scents can be empowering… even if those scents are overpowering.

That's awesome. The naturalistic fallacy meets identity politics. I wonder if anyone has claimed asparagus piss and rancid farts as equally natural and integral aspects of cultural self-determination. I'd love to see that topic explored in cultural-theory jargon.

I was at a library sale over the weekend, browsing the anthropology section, when my eyes suddenly crossed and began watering. I swear, it was like a bully had pinned me down and proceeded to pummel my poor nose with fists made of garlic bulbs. Squinting through my blurred, teary vision, I turned and saw a hippie academic straight out of Central Casting, complete with professor's ponytail (bald on top, long grey fringe tied back), serape and hiking boots. Well, I suppose soap, hot water and mouthwash would have been oppressive tools of colonialism or something. Oh, my kingdom for a fire hose!

This is so astonishing that it’s easily overlooked. In terms of appealing to the common interests of humanity — even if purely as a cover for smaller or more sordid interests — only the great religions have attempted anything like it. No other secular ideology has tried to be a totalising force in the same way.

...The challenge for humanists and liberals in the face of a transhuman future is daunting: to replace the socialist project — or to revive it. Without something like it to underpin a sense of common human identity and common human interest, people will divide on the basis of other identities. Many on the left, of course, have found in identity politics a replacement for the universalism of their past. But identity can also be seized on by the far right. It can feed a resentful indifference to the plight of others that comes from having one’s own plight disregarded.

All right. So the aim of a peaceful, global community of equality, reasonable security, and material abundance was a fantasy. Make us drink that cup to the dregs, but don’t expect us to be humanists after we’ve wiped our lips. If labour in the white skin can never emancipate itself, why should it care if in the black it is branded?

Overlooked? Honestly, the fact that socialism/communism was a secular version of Christianity pretending to possess scientific rigor has been noted enough to almost qualify as a cliché by now. The similarities even extend to the aftermath, as evidenced by that last paragraph. Monotheists of both the religious and secular varieties, grappling with their loss of faith in universal meaning, attempt one last reassertion of it in an inverted form, that of nihilism. If my existence has no eternal meaning and significance, then nothing ever has meaning and significance in any context. All we are is dust in the wind! Either way, it provides a false sense of preordained certainty to alleviate the difficulty of thinking, measuring, experimenting, judging, attempting, and possibly failing at the Sisyphean task of creating contingent meaning in the course of everyday living.

If you don't believe in the universal socialist brotherhood of humankind, what's to stop you from becoming a white nationalist? Or, to phrase that sentiment in a more typical way, if you don't believe in God, what's to stop you from raping, robbing and murdering? When you realize the fallacious nature of such all-or-nothing thinking, you realize that all the life worth living is done in between such conceptual antipodes.

Monday, November 12, 2012

When the year fell damp and cold,
Long the nights and short the days,
And the forest's fallen gold
Trodden in the miry ways;
Cloud-drifts trailing on the ridges,
Moorland rivers swollen and brown,
Lone birds, from the dripping hedges,
Seeking shelter near the town:
Quite forgotten summer's rays,
Closed we round the glowing ember,
And deem'd the cosiest of our days
The bleak beginning of November.

List'ning to the beating storm,
And the wind up in the vent--
Without, so cold--within, so warm--
Hearts so full of deep content:
Reading legends in the ashes,
Telling tales that charm and move;
Looking underneath long lashes
To devour the eyes we love:--
Eyes are closed and hearts are still'd;
But 'tis given me to remember
The more than summer light that fill'd
The bleak beginning of November.

Saturday, November 10, 2012

Ken Wilber? In academic circles, Wilber remains obscure. A sixty-three-year-old autodidact, he is the author of an ambitious effort to reconcile empirical knowledge and mystical experience in an “Integral Theory” of existence. Yet his admirers include not only the alternative-healing guru Deepak Chopra—who has called Wilber “one of the most important pioneers in the field of consciousness”—but also the philosopher Charles Taylor, the theologians Harvey Cox and Michael Lerner, and Bill Clinton. Wilber’s generally lucid treatments of both Western science and Eastern spirituality have earned him favor with a coterie of highly literate seekers for whom the phrase “New Age” is nonetheless suspect. He’s an intellectual’s mystic, short on ecstatic visions and long on exegeses of Habermas (whom he regards, for his perception of “homologous structures” in human individual and social development, as something of a kindred spirit). At the Integral Institute, a Colorado-based think tank inspired by Wilber’s ideas, scholars like Jack Crittenden, a professor of political theory at Arizona State University, strive to apply his approach to “global-scale problems,” from climate change to religious conflict.

Yes, "an intellectual's mystic", exactly. He's conversant enough with "serious" science and philosophy to appeal to those who would be embarrassed to be seen reading Chopra or Eckhart Tolle, but being fluent in jargon — or, at least, fluent enough to pass a cursory reading — isn't the same thing as knowing what you're talking about (other worthwhile and amusing criticism can be found here and here). I actually read all of his books back in the late '90s, lest you think I'm just being opinionated for the hell of it. The most telling detail I still remember was the fact that he named the German Idealist philosopher Schelling as his apparent intellectual hero; make of that what you will.

Friday, November 09, 2012

The touchstone, instead, is a Buddhist idea that is among the most difficult for Westerners to accept: the concept of anatman, or ‘no-self’. Let’s be clear: Buddhists do not claim that people do not exist. When the Dalai Lama flies to a symposium in Geneva or London, he obtains a ticket with the name ‘Tenzin Gyatso’, and his body occupies a seat. However, for Buddhists there is no self in the deeper sense that no one exists as a singular, permanent structure distinct and isolated in any meaningful way from the rest of the world. This is entirely in line with an evolutionary and ecological approach to our origins and our embeddedness in natural processes.

Each of us arises in conjunction with others, dependent on and inseparable from those others. Trying to locate an inviolate particle of selfhood within anyone (or indeed, in any living thing) is not like finding a solid pit inside an apricot. It is more like peeling an onion: we are layers within layers, with nothing at the centre. Or, like an eddy in a river, each of us can be identified and pointed to, but nonetheless, there isn’t any persistent ‘us’: just a constantly moving pattern of flow, with everyone composed entirely of non-self stuff, all of it passing through. For Buddhists and ecologists alike, we are all created from spare parts scavenged from the same cosmic junk-heap, from which ‘our’ component atoms and molecules are on temporary loan, and to which they will eventually be recycled.

Tuesday's election -- an event that included reelecting a mixed-race President, legalizing marijuana in some states, legalizing same-sex marriage in some states, electing women to the Senate in record numbers, electing the first openly gay Senator, and defeating many hard-line social conservatives -- serves as a reminder that the country continues to move in a more liberal direction.

Liberals, you should rein in the triumphalism. Obama won a narrow 51-49 percent victory and the composition of Congress changed only slightly. This was not a historic vindication of liberalism, and it doesn't mean that we can suddenly decide that demography will sweep us to victory for the next couple of decades. The plain truth is that although an increasing number of voters are turned off by what Republicans represent, that doesn't mean they've become lefty converts. A lot of them are still pretty nervous about a big part of our agenda, and we have a lot of work ahead to get them more solidly on our side.

Thursday, November 08, 2012

Zizek posits that “Western Buddhists” can absolve themselves of any responsibility for changing their environment – and Knabb charges that “engaged Buddhists” will avoid even the mildest confrontation. Whilst I still believe in what I’ve written here and here about the possibility for social change, I am now forced to consider that any course of action would necessitate some violence. Indeed, even if we were to have our Mindful Revolution, would the current powers-that-be allow this? And if not, would that then make violent confrontation inevitable?

One to ponder.

Certainly, political change can be effected peacefully, at however glacial a pace. But utopian ideals aiming at the complete and utter transformation of human behavior tend to be unstable and combustible when exposed to the reality of gaining, holding and exercising power in a realm of competing interests and limited resources. The idea that violence can, should, or will be completely abolished on all levels of society from the personal to the political is probably just an ideological fantasy. An individual, say, can easily enough abstain from violence with consistency, but attempting to make it a normative principle for society as a whole makes me recall John Gray's admonition that attempting to get everyone to believe the same thing is itself a reliable guarantor of conflict. Complete renunciation of violence will likely always be the prerogative mainly of monks, hermits and other social outliers.

Wednesday, November 07, 2012

Carr’s model for what we might call the ‘book-user’—the contemplative literate subject—is grounded particularly in visions from American Transcendentalism and Romantic poetry. It is Nathaniel Hawthorne sitting meditatively in Concord, Massachusetts, prior to having his concentration broken by the intruding sounds of modernity, or the Keats of Ode to Psyche. This figure supplies the norm against which to measure our technological decline. But it surely faces many other challenges at present than the formal character of technology: the generalization of insecurity and economic precarity; the erosion of the separation between work and life; the decline of the home’s integrity as a space external to the bustle of capitalist existence. In this world it is for most of us, sadly, a rare thing to be able to carve out the psychological space that this figure requires, to sit at length with the tranquillity required for ‘deep’ reading. The computer and the Web may well be significant factors in bringing this situation about—not only through our direct interactions with them, but also through their social, economic and cultural implications, many of which are ably traced by Carr across his three books. But there are clearly other factors too, beyond technology. The Web, we might say, is the pre-eminent technological construct of an increasingly sickly neoliberal capitalism. As such, it is a major factor in shaping the vectors of behaviour and experience that characterize this world. But it is also a product of these, and of the society in which they take place. It is hardly surprising that the technology of a hyper-flexibilized, insecure, turbulent world offers little security to the purposefully structured, meditative mind.

I'm sympathetic to the desire for such contemplative space and activity, of course. I just hate the way Carr has carelessly juxtaposed it in such trite, romantic contrast to technology.

Tuesday, November 06, 2012

Call it upper middle brow. The new form is infinitely subtler than Midcult. It is post- rather than pre-ironic, its sentimentality hidden by a veil of cool. It is edgy, clever, knowing, stylish, and formally inventive. It is Jonathan Lethem, Wes Anderson, Lost in Translation, Girls, Stewart/Colbert, The New Yorker, This American Life and the whole empire of quirk, and the films that should have won the Oscars (the films you’re not sure whether to call films or movies).

The upper middle brow possesses excellence, intelligence, and integrity. It is genuinely good work (as well as being most of what I read or look at myself). The problem is it always lets us off the hook. Like Midcult, it is ultimately designed to flatter its audience, approving our feelings and reinforcing our prejudices. It stays within the bounds of what we already believe, affirms the enlightened opinions we absorb every day in the quality media, the educated bromides we trade on Facebook. It doesn’t tell us anything we don’t already know, doesn’t seek to disturb—the definition of a true avant-garde—our fundamental view of ourselves, or society, or the world. (Think, by contrast, of some truly disruptive works: The Wire, Blood Meridian, almost anything by J. M. Coetzee.)

Eh, I don't know how useful the particulars of this classification are. It makes me think of what Ian Hacking called the "looping effect" — the very act of delineating the traits of this supposed character type initiates a self-fulfilling prophecy, in which people consciously identify with it and shape their beliefs and attitudes accordingly. At its worst, you have people who take astrology seriously, or the, uh, personality distinctions between PC/Mac or iPhone/Android users, or other such marketing fictions.

Nevertheless, I do agree that the complacency he describes exists; I just think it's a general privilege of consumerism, period. I've frequently criticized its manifestation in the form of what I call the spiritual-not-religious — people who settle down in metaphysical suburbia and customize their generic, unexamined beliefs with some exotic accessories from the world bazaar: a little Sufi mysticism here, a few references to Buddhism and Taoism there; nothing too challenging or genuinely transformative. Culturally and politically speaking, I sense that sort of complacency in varying degrees from sites like Salon, Slate, the Atlantic Wire, Gawker Media. I see it among the Twitterati who number less than one in five people, but continuously talk to and about each other as if they constitute the entire world worth knowing. And while The Wire is certainly a work of artistic excellence by any standard, I don't doubt for a second that it also represented a rich, previously untapped vein of status and hipster cred for many of its online fans, who could claim a familiarity with a dangerous, foreign way of life from the safety of their living rooms (and get sniffly when David Simon called them out for it).

I see it as a corollary to what others have noticed for some time: the sheer plethora of choice available on the Internet paradoxically helps ensure that our experience becomes more monolithic, as we curate a collection of websites and online relationships that reinforce and flatter our identities rather than challenge them.

Monday, November 05, 2012

This is another lengthy but interesting essay that I will just point to in lieu of commenting on it. At times, it seems to veer dangerously close to drafting Chekhov as a representative of modern political conservatism (which may or may not be fair, for all I know), but not so much as to detract from my enjoyment in reading it.

Sunday, November 04, 2012

It's far too long to justifiably excerpt any particular bit, but this essay by Charles C. Mann is engrossing. I highly recommend making time to read it. (Also, I was unaware that he had written a sequel to 1491. Why does no one keep me informed about these things?)

Saturday, November 03, 2012

Here are a few numbers that don’t add up. Just-released stats from the FBI show that about three-quarters of a million Americans were arrested on marijuana charges last year—most of them for simple possession, as StoptheDrugWar.org reports. Meanwhile, a brand-new Huffington Post poll finds that nearly 60 percent of Americans want the weed legalized. Okay, you might expect such news from the liberal cabal at HuffPo, but their survey comes on the heels of a Gallup poll that declared 50 percent—the highest total ever—supported legalization.

In her book The New Jim Crow, Alexander, a former ACLU staffer, argues that the 30-year-old War on Drugs has created a de facto “racial caste system” to replace the Jim Crow laws that fell in the 1960s. Today, thanks in large part to the drug war, more than 2 million people, disproportionate numbers of them black and Hispanic, are locked up in America’s prisons, giving us an incarceration rate of 750 per 100,000 people, outpacing even repressive regimes like Russia, China, and Iran. In major cities, Alexander says, four out of five black men have criminal records, which not only takes them out of the legitimate economy while they’re in jail, but keeps them out of work after they’re freed because of widespread, and quite legal, discrimination against ex-convicts.

...Readers familiar with the horrors of the Jim Crow era may find it hard to see today’s drug war, harsh as it may be, as destructive a system of social control as decades of sharecropping, systematic disenfranchisement, and lynch law – to say nothing of the centuries of slavery that came before it. In her effort to wake up her readers, Alexander may be guilty of exaggerating for effect. If so, the exaggeration is worth it. The War on Drugs, and the massive buildup of our prison populations, has barely come up in this year’s presidential campaign. Before we lecture the Chinese and Iranians on their treatment of their ethnic minorities and dissidents, we need to look more closely at the millions of our own people we are locking up for years, often for no more than possession or sale of a few grams of weed.

Your personality is revealed in the way you speak, according to new research. Introverts tend to use more concrete words and are more precise, in contrast to extraverts, whose words are more abstract and vague.

...The differences make sense in terms of what we know about social behaviour and the introvert-extravert personality dimension, with the introverted linguistic style being more cautious, and the extravert style being more casual and vague.

My first-grade teacher told my parents that she wished I'd participate in class more often, but that I refused to be drawn into guessing on unfamiliar questions. I either knew the answer or I didn't. If I didn't know it, I didn't want to waste time illustrating that fact. Time's a-wasting, you fuckers, tell me what I need to know and let's get on with the efficient accumulation of facts!

Of course, in my maturity, I've realized there's a place for vague, impressionistic thinking. We call that "poetry".

Hobbes’s combination of pessimism about human nature with a sublime confidence that the human condition can be greatly improved if only power will listen to reason helps place him in a distinct phase of modern thought – that of the early European Enlightenment. Contrary to a popular stereotype, Enlightenment thinkers are by no means always optimists about the future. Modern-day partisans of enlightenment may like to think of history as a saga of continuing progress culminating in their own unrivalled wisdom, but Hobbes was fully aware that the moral and political gains of one generation are very often lost by the next. What he never doubted was the existence of a rational method that could deliver human beings from the worst kinds of conflict. By contracting to create a sovereign with authority to do whatever is needed to ensure peace, humankind could escape life in the state of nature – “solitary, poor, nasty, brutish and short” – and enjoy the amenities of “commodious living”.

Because he had no interest in liberty or democracy as ends in themselves, Hobbes can be seen as the greatest exponent of enlightened despotism. Contrary to silly chatter about “liberal Enlightenment values”, the Enlightenment has always included a highly influential current of authoritarian thinking – a current that includes later thinkers such as Jeremy Bentham and Auguste Comte, along with political leaders such as Lenin and Ataturk. Hobbes belongs in this current but it is part of his greatness as a thinker that he can also be viewed as the founder of liberalism. His best 20th-century interpreters – the Marxist C B Macpherson, Leo Strauss (intellectual mentor of the American neocons) and the sceptical conservative Michael Oakeshott – acknowledged that Hobbes, more than any other thinker, was the progenitor of the most fundamental tenet of modern liberalism – the belief that there is no natural or divine right to rule. The idea of Hobbes as a liberal seems puzzling only as long as you cling to the historically parochial notion that liberal values are essentially to do with a human right to freedom. For Hobbes, government existed only to protect its subjects, but for that very reason rulers were not bound to respect any of the freedoms we now think of as integral to liberalism. A Hobbesian sovereign could legitimately curb freedoms of belief and expression as long as doing so was necessary to keep the peace.

Before I ever took a philosophy class, I knew that Calvin's stuffed tiger companion was one of the wisest philosophers I had yet encountered, so I was always interested to learn about his inspirational namesake. I still prefer the feline version, but this is an interesting take on the human one.

I write in my notebook with the intention of stimulating good conversation, hoping that it will also be of use to some fellow traveler. But perhaps my notes are mere drunken chatter, the incoherent babbling of a dreamer. If so, read them as such.

The One True Blog's prose is immaculate. Scribbs should be an English teacher... Do keep writing; you should get paid for it, but that's hard to find.

—Noel

You are such a fantastic writer! I'm with Noel; your mad writing skills could lead to income.

—Sandi

WOW - I'm all ready to yell "FUCK YOU MAN" and I didn't get through the first paragraph.

—Anonymous

You strike me as being too versatile to confine yourself to a single vein. You have such exceptional talent as a writer. Your style reminds me of Swift in its combination of ferocity and wit, and your metaphors manage to be vivid, accurate and original at the same time, a rare feat. Plus you're funny as hell. So, my point is that what you actually write about is, in a sense, secondary. It's the way you write that's impressive, and never more convincingly than when you don't even think you're writing — I mean when you're relaxed and expressing yourself spontaneously.

—Arthur

Posts like yours would be better if you read the posts you critique more carefully... I've yet to see anyone else misread or mischaracterize my post in the manner you have.

—Battochio

You truly have an incredible gift for clear thought expressed in the written word. You write the way people talk.