Here is a rather long passage from the review, written by Philip Cunliffe.

In terms of Iraq, Roy points out that the failed adventure there is not simply a ‘hare-brained scheme that has fizzled out.’ Underpinning the disaster is an influential doctrine of intervention and democratisation shared by all the leading developmental institutions, both intergovernmental and non-governmental. Its assumptions are: ‘Distrust of existing governments, encouragement of civil society, the development of microprojects, and the central importance of women’s and gender issues, advocacy of the humanitarian approach.’ These ideas grow directly out of influential leftist theories about the ‘duty to intervene’ in other societies to enforce democracy. Existing governments are to be drubbed into submission (with bombs if necessary), and then some new political institutions are quickly thrown up, with little concern as to whether the new institutions have any legitimacy or roots in those societies. Hence the ridiculous outcomes, which would be comedic if they were not so tragic: Roy gives the example of Afghan village committees that are forced to follow gender equality guidelines that would make a Scandinavian Social Democrat blanch.

Using his distinctive vantage point, Roy effortlessly punctures one myth after another. So while leftists imagined that oil pipelines would spontaneously spring up from beneath the GIs’ boots, Roy shows that the oil lobby was cool on the invasion of Iraq. Roy argues that, unlike the imperialist corporations of yesteryear, today’s oil corporations are not interested in controlling oilfields directly. As long as no single country monopolises supply, the energy companies are happy to rely on the market to set prices and regulate production. On the other hand, Roy says that the civil engineering sector and service companies supported the war, because unlike the oil companies, they work on short-term contracts that do not require any long-term strategy and they are paid for by the Western taxpayer (meaning they have no need to cultivate stable relations with foreign governments).

The main myth that Roy wants to cut down in his book is the idea of a monolithic Islam confronting the West. For a start, if Iraq survives in any viable national form, it will be a Shia-dominated state, which will give the religious schism between Sunni and Shia an enduring new geopolitical significance in the region. More profoundly, Roy shows that the rhetoric of the ‘global war on terror’ collapses together four distinct spheres of Islamised politics, each with its own specific logic: al-Qaeda-style terrorists; Islamists proper (those who try to build Islamic political institutions); fundamentalists who want to live under sharia law; and the cultural conservatives who advocate communal autonomy under multiculturalism. None of these represents an existential threat.

Despite its bloody ostentation, al-Qaeda’s violence has no strategic orientation that would allow it to give purpose or political direction to its activities, and it has no lasting institutions or political roots. While Islamist movements have more political traction, they also run up against their limits: the failure of the Islamist political model in regimes such as Iran, Sudan and Afghanistan, and the fact that they must consistently push beyond an Islamist agenda in order to maintain political momentum in any national arena. Roy observes: ‘It is clear that the Palestinians who voted for Hamas in 2006 did not do so because they wanted the destruction of Israel or sharia, but because they wanted good governance, like the Iranians who supported Ahmadinejad in the second round of the Iranian presidential elections in 2005.’

Western states’ inability to establish political gradations between different types of Islamised politics leads directly to impotence: ‘There is not the military capability to attack on all fronts: it is not possible simultaneously to wage war on al-Qaeda, the Taliban, Hezbollah, Hamas, Syria and Iran (and the Muslim Brotherhood, veiled women, inner-city imams, etc).’ Or, more pithily: ‘Moral intransigence leads to impotence if based on a non-political definition of evil.’

As a result of such insightful analysis, Roy’s book is head and shoulders above much of the standard commentary on the Middle East, Islamism and al-Qaeda. Indeed, the main problem with the book is that it does not go far enough in pursuing this trope of the ‘politics of chaos’. Roy draws attention to the fact that the real dynamic driving political and social evolution in the Middle East is not Islam, but deeper undercurrents of national rivalries and uneven processes of state-led modernisation in poor and traditional societies. This emphasis provides a rigorous, scientific counterbalance to the nonsensical depictions of the Middle East as a cauldron of fanaticism.

A spokesman for the Clinton campaign said that the 2007 tax returns of Mrs. Clinton and her husband, former President Bill Clinton, went up in flames along with everything else.

In fact, according to a spokesman for the Terre Haute Fire Department, the tax returns may have been ignited first, setting off the conflagration that burned the building to the ground.

“Based on our initial investigation, it looks like gasoline was poured all over the tax returns and then set ablaze,” said THFD spokesman Tracy Klujian. “Whoever did this has experience in destroying records.”

While the Terre Haute police have yet to name any suspects in the inferno, an eyewitness to the fire said he saw a “person in a yellow pantsuit with black trimming” fleeing the building moments after the campaign headquarters caught on fire.

At a press conference in Pittsburgh, Mrs. Clinton said that the incineration of her tax records was “a great personal loss,” adding, “I was so looking forward to sharing them with the American people.”

Mrs. Clinton cut her press conference short after a protracted coughing fit, complaining of smoke inhalation.

But the truth is, is that, our challenge is to get people persuaded that we can make progress when there's not evidence of that in their daily lives. You go into some of these small towns in Pennsylvania, and like a lot of small towns in the Midwest, the jobs have been gone now for 25 years and nothing's replaced them. And they fell through the Clinton administration, and the Bush administration, and each successive administration has said that somehow these communities are gonna regenerate and they have not. And it's not surprising then they get bitter, they cling to guns or religion or antipathy to people who aren't like them or anti-immigrant sentiment or anti-trade sentiment as a way to explain their frustrations.

Asked to respond, McCain adviser Steve Schmidt called it a "remarkable statement and extremely revealing."

"It shows an elitism and condescension towards hardworking Americans that is nothing short of breathtaking," Schmidt said. "It is hard to imagine someone running for president who is more out of touch with average Americans."

"I saw in the media it's being reported that my opponent said that the people of Pennsylvania who faced hard times are bitter," Clinton said this afternoon. "Well, that's not my experience. As I travel around Pennsylvania, I meet people who are resilient, who are optimistic, who are positive, who are rolling up their sleeves. They are working hard everyday for a better future, for themselves and their children.

"Pennsylvanians don't need a president who looks down on them, they need a president who stands up for them, who fights for them, who works hard for your futures, your jobs, your families."

It's been an interesting week for mind and psychology articles in the news. From the effectiveness of psychology and psychiatry to moral psychology to neuroscience and literature and beyond, here are a few of the best articles I found this week (in no particular order), with some relevant quotes and a little commentary.

Spelling is in decline today, she thinks, not because of the rich diversity of dialects, as in Chaucer's day, but because the dominant mindset of nomadic culture is that language does not matter. We are entering, as she puts it, an age of “linguistic whateverism”. One reason is that people today are writing vastly larger amounts of text than ever before, and “the more we write online, the worse writers we become.” In the eras of quills, pens or even manual typewriters it was hard to write a lot, so people took time and care in clarifying their thoughts. Many nomads today are convinced that they don't have the time to think and care, so they concentrate on speed alone.

Because language is the primary vehicle for thought, this has consequences. Already, Ms Baron detects a new and widespread intellectual torpor among her students. Young Americans used to cut corners before an exam on “Hamlet” by reading the CliffsNotes. Teachers hated them, but they were pedagogic wonders compared with today's method of Googling the passage in question, then using the computer's “find” function to get to the exact snippet. Ms Baron thinks that these days her students even think in snippets, which is to say incoherently. And that is how they write essays. Having internalised the new whateverism, they launch in and stumble through, with nary a thought for what they actually want to say.

This criticism dovetails strikingly with what other sociologists and psychologists are observing in the interpersonal behaviour of some nomads. Older people use their mobile phones to “micro-co-ordinate” with partners during the day in order to run their errands more efficiently and perhaps to spend more time together as a result. But many younger people, who have never known paper diaries or an unconnected world, micro-co-ordinate in order to avoid committing themselves to any fixed meeting time, location or person at all. After all, a better opportunity might yet present itself.

The concern, therefore, is that young nomads not only write without thinking or leave home in the morning without planning but also enter relationships without tying themselves down. Large parts of human interaction, especially the awkward subjects of rowing and separating, can now be relegated to virtual, as opposed to physical, interaction. A worrying trend in recent years has been adolescents' practice of dumping their lovers by text message or, worse, by changing the status of their Facebook profile from “in a relationship” to “single”. This is efficient and instantaneous, but potentially traumatic.

On one hand, the purpose of language, in whatever form, is communication, and as long as people still can understand each other, it's all good. On the other hand, the decline of precise language will likely result in the decline of precise thought, and we already have too little good thinking in this culture. In this case, as in most cases involving language and the arts, I am pretty conservative and tend to think that over time culture will suffer from the over-reliance on electronic media.

A team of American researchers attracted national attention last year when they announced results of a study that, they said, reveal key factors that will influence how swing voters cast their ballots in the upcoming presidential election. The researchers didn’t gain these miraculous insights by polling their subjects. They scanned their brains. Theirs was just the latest in a lengthening skein of studies that use new brain-scan technology to plumb the mysteries of the American political mind. But politics is just the beginning. It’s hard to pick up a newspaper without reading some newly minted neuroscientific explanation for complex human phenomena, from schizophrenia to substance abuse to homosexuality.

The new neuroscience has emerged from the last two decades of formidable progress in brain science, psychopharmacology, and brain imaging, bringing together research related to the human nervous system in fields as diverse as genetics and computer science. It has flowered into one of the hottest fields in academia, where almost anything “neuro” now generates excitement, along with neologisms—neuroeconomics, neurophilosophy, neuromarketing. The torrent of money flowing into the field can only be described in superlatives—hundreds of millions of dollars for efforts such as Princeton’s Center for the Study of Brain, Mind, and Behavior and MIT’s McGovern Institute for Brain Research.

Psychiatrists have been in the forefront of the transformation, eagerly shrugging off the vestiges of “talk therapy” for the bold new paradigms of neuroscience. By the late 1980s, academic psychiatrists were beginning literally to reinvent parts of the discipline, hanging out new signs saying Department of Neuropsychiatry in some medical schools. A similar transformation has occurred in academic psychology.

A layperson leafing through a mainstream psychiatric journal today might easily conclude that biologists had taken over the profession. “Acute Stress and Nicotine Cues Interact to Unveil Locomotor Arousal and Activity-Dependent Gene Expression in the Prefrontal Cortex” is the title of a typical offering. The field has so thoroughly cast its lot with biology, and with the biology induced by psychoactive drugs, that psychiatrists can hardly hope to publish in one of the mainstream journals if their article tells the story of an individual patient, or includes any personal thoughts or feelings about the people or the work that patient was engaged with, or fails to include a large dose of statistical data. Psychiatry used to be all theories, urges, and ids. Now it’s all genes, receptors, and neurotransmitters.

I find this troubling. Mental health is being reduced to materialism, treating the complexities of the human mind as though there is nothing but biology and chemistry. But we are subjective creatures. As much as we can learn from fMRIs and other methods of looking at the brain, we also need to learn a great deal from the first-person experience of what it is to be a conscious being.

Further, we need to recognize that human beings are not singular organisms existing in a vacuum. We are social beings, existing in cultural contexts, and as such we are embedded in a collective mind as much (or more?) as we are singular beings. On that note, on to the next item.

One of the tremendous biases and blind spots of modern neuroscience is that it's almost always forced to see the mind in a social vacuum. While there have been some rudimentary attempts to study human interaction, or what happens to the cortex when it's not by itself - see, for instance, some of the work by Read Montague - our theories of the brain are almost entirely based on brains in isolation. The reasons for this are straightforward: other people are confounding variables. They make everything too complicated. Nevertheless, even a cursory glance at human existence is a reminder that we are profoundly social animals, that our minds are largely shaped by the minds of others.

I am in total agreement. This would obviously be part of an integral approach to mind and consciousness, which is what we need -- seemingly the opposite of where the field is heading based on many of these articles (and my sessions at Toward a Science of Consciousness).

A pillar of the new synthesis is a renewed appreciation of the powerful role played by intuitions in producing our ethical judgements. Our moral intuitions, argue Haidt and other psychologists, derive not from our powers of reasoning, but from an evolved and innate suite of “affective” systems that generate “hot” flashes of feelings when we are confronted with a putative moral violation.

This intuitionist perspective marks a sharp break from traditional “rationalist” approaches in moral psychology, which gained a large following in the second half of the 20th century under the stewardship of the late Harvard psychologist Lawrence Kohlberg. In the Kohlbergian tradition, moral verdicts derive from the application of conscious reasoning, and moral development throughout our lives reflects our improved ability to articulate sound reasons for the verdicts—the highest stages of moral development are reached when people are able to reason about abstract general principles, such as justice, fairness and the Kantian maxim that individuals should be treated as ends and never as means.

But experimental studies give cause to question the primacy of rationality in morality. In one experiment, Jonathan Haidt presented people with a range of peculiar stories, each of which depicted behaviour that was harmless (in that no sentient being was hurt) but which also felt “bad” or “wrong.” One involved a son who promised his mother, while she was on her deathbed, that he would visit her grave every week, and then reneged on his commitment because he was busy. Another scenario told of a man buying a dead chicken at the supermarket and then having sex with it before cooking and eating it. These weird but essentially harmless acts were, nonetheless, by and large deemed to be immoral.

Further evidence that emotions are in the driving seat of morality surfaces when people are probed on why they take their particular moral positions. In a separate study which asked subjects for their ethical views on consensual incest, most people intuitively felt that incestuous sex is wrong, but when asked why, many gave up, saying, “I just know it’s wrong!”—a phenomenon Haidt calls “moral dumbfounding.”

It’s hard to argue that people are rationally working their way to moral judgements when they can’t come up with any compelling reasons—or sometimes any reasons at all—for their moral verdicts. Haidt suggests that the judgements are based on intuitive, emotional responses, and that conscious reasoning comes into its own in creating post hoc justifications for our moral stances. Our powers of reason, in this view, operate more like a lawyer hired to defend a client than a disinterested scientist searching for the truth.

Our rational and rhetorical skill is also recruited from time to time as a lobbyist. Haidt points out that the reasons—whether good or bad—that we offer for our moral views often function to press the emotional buttons of those we wish to bring around to our way of thinking. So even when explicit reasons appear to have the effect of changing people’s moral opinions, the effect may have less to do with the logic of the arguments than their power to elicit the right emotional responses. We may win hearts without necessarily converting minds.

This is an interesting article in many ways. Most importantly, we may have to reconsider the moral hierarchy of development that Kohlberg and Gilligan have made so famous, and which plays a central role in integral psychology.

Perhaps we also need to reconsider the integral notion that intellect is the leading line in determining the evolution of other developmental lines. It may be time to give equal (or greater?) weight to the emotional line. And if we make that adjustment, we might also need to reconsider the somatic line, since nearly all emotions are body-based.

The article raises a whole series of other issues, including the difference in response to two different moral problems -- the Trolley and Footbridge problems. Different brain regions are activated in each of these dilemmas:

This pattern of activity suggests that impersonal moral dilemmas such as the Trolley Problem are treated as straightforward rational problems: how to maximise the number of lives saved. By contrast, brain imaging of the Footbridge Problem—a personal dilemma that invokes up-close and personal violence—tells a rather different story. Along with the brain regions activated in the Trolley Problem, areas known to process negative emotional responses also crank up their activity. In these more difficult dilemmas, people take much longer to make a decision and their brains show patterns of activity indicating increased emotional and cognitive conflict within the brain as the two appalling options are weighed up.

We seem to have emotional reactions (and make decisions accordingly) when we are personally involved in the situation. When we can merely look at an impersonal decision, we are more likely to be rational. However, most moral decisions we have to make are not abstract, they are personal, so we aren't very likely to take a rational approach.

Be sure to read the whole article -- this is a great summary of what we currently know about moral behavior in human beings.

Why do we have emotions? Wouldn't it better to have the heart and soul of a lizard and feel nothing at all?

It's easy to understand why we have good emotions. Happy people live happy lives and make for happy mates. Presumably, all that happiness translates into passing on genes. Other positive emotions such as love and attachment are, in fact, essential for bringing up children, those little packets of genes.

Harder to explain are the "bad" emotions such as fear, anxiety, anger and hate. Why would evolution fill our heads with such negativity?

It may be that emotionality comes as an all-inclusive package and so you have to take the good with the bad; with love comes its evil twin hate, with happiness comes the flip side of sadness.

But evolutionary psychiatrist Randolph Nesse of the University of Michigan thinks that individual emotions are actually adaptations selected by evolution to help us cope with specific situations.

Nesse calls emotions "the mind's software." Faced with a sad situation, the mind brings up the sadness program to cope, and when the situation brightens, the mind gets into the happiness loop.

For Nesse, it's not so much about the specific emotions as the situations, because many emotions have similar cognitive, psychological and physiological effects. Faced with a situation, our feelings ratchet up and any number of emotions can, for example, put the body on alert, shut it down, change thinking patterns or motivate behavior. What matters is not so much the name of the emotion as what the mind and body do with it.

The relatively new field of evolutionary psychology tries to bridge the gap between neuroscience, subjective experience, and cultural embeddedness -- not always successfully. In this case, it builds on some of the findings from social psychology.

When faced with a stressful situation, our interpretation of the physiological state we experience (increased pulse rate, sweating, anxiety, etc.) will be determined to some extent by the social and environmental cues around us. For example, a man on a high footbridge will interpret his physiological state as fear in the absence of other cues. Put a beautiful woman on the bridge and he will interpret his state as arousal or attraction.

A generation of academic literary critics has now arisen who invoke “neuroscience” to assist them in their work of explication, interpretation and appreciation. Norman Bryson, once a leading exponent of Theory and a social constructivist, has described his Damascene conversion, as a result of which he now places the firing of neurons rather than signifiers at the heart of literary criticism. Evolutionary theory, sociobiology and allied forces are also recruited to the cause, since, we are reminded, the brain functions as it does to support survival. The dominant model of brain function among cognitive neuroscientists is that of a computer, and so computational theory is sometimes thrown into the mix. The kinds of things critics get up to these days are illustrated by a recent volume, Evolutionary and Neurocognitive Approaches to Aesthetics, Creativity and the Arts, edited by Colin Martindale and others (New York, 2007), with chapter headings such as “Literary Creativity: A Neuropsychoanalytic View”, and a call for papers for a congress this year on “Cognitive Approaches to Medieval Texts” (cognitive science, neuroscience and evolutionary psychology all welcome); and the emergence of “Darwinian literary criticism” which approaches the Iliad and Madame Bovary through the lens of theories about the evolved brain. Evolutionary explanations of why people create and enjoy literature, “neurocognitive frameworks” for aesthetics, and neural-network explanations for the perception of beauty are all linked through the notion that our experiences of art are the experiences of a brain developed to support survival. Byatt’s approach to Donne’s poetry through neuroscience, therefore, is not unique, nor even unusual.

At first sight, the displacement of Theory, with its social constructivism and linguistic idealism, by talk of something as solid as “the brain” of the writer and “the brain” of the reader may seem like progress. In fact, it is a case of plus ça change, plus c’est la même chose. The switch from Theory to “biologism” leaves something essential unchanged: the habit of the uncritical application of very general ideas to works of literature, whose distinctive features, deliberate intentions and calculated virtues are consequently lost. Overstanding is still on the menu. In many of the critical approaches that reached their apogee in the 1980s, there was a denial of the centrality of the individual consciousness of the writer; in approaches that purport to be neuroscience-based, the consciousness of the writer (and of the reader, as we shall see) is reduced to neurophysiology. Indeed, the reductionism of neuro-lit-crit is more profound. While aficionados of Theory regarded individual works and their authors as, say, manifestations of the properties of texts, of their interaction with other texts and with the structures of power, neuroscience groupies reduce the reading and writing of literature to brain events that are common to every action in ordinary human life, and, in some cases, in ordinary non-human animal life. For this reason – and also because it is wrong about literature, overstates the understanding that comes from neuroscience and represents a grotesquely reductionist attitude to humanity – neuroaesthetics must be challenged.

Yep. This simply strikes me as daft. And useless. And, well, you get the point.

Literature is a subjective experience for the author and the reader. Attempting to analyze it through reductionist, objective neuroscience is just plain wrong-headed, and unproductive, and, well, you get the point. 'Nuff said.

David Brooks, of The New York Times, engages in a rare bit of humor as he (sort of) makes a point.

They say the 21st century is going to be the Asian Century, but, of course, it’s going to be the Bad Memory Century. Already, you go to dinner parties and the middle-aged high achievers talk more about how bad their memories are than about real estate. Already, the information acceleration syndrome means that more data is coursing through everybody’s brains, but less of it actually sticks. It’s become like a badge of a frenetic, stressful life — to have forgotten what you did last Saturday night, and through all of junior high.

In the era of an aging population, memory is the new sex.

Society is now riven between the memory haves and the memory have-nots. On the one side are these colossal Proustian memory bullies who get 1,800 pages of recollection out of a mere cookie-bite. They traipse around broadcasting their conspicuous displays of recall as if quoting Auden were the Hummer of conversational one-upmanship. On the other side are those of us suffering the normal effects of time, living in the hippocampically challenged community that is one step away from leaving the stove on all day.

This divide produces moments of social combat. Some vaguely familiar person will come up to you in the supermarket. “Stan, it’s so nice to see you!” The smug memory dropper can smell your nominal aphasia and is going to keep first-naming you until you are crushed into submission.

Your response here is critical. You want to open up with an effusive burst of insincere emotional warmth: “Hey!” You’re practically exploding with feigned ecstasy. “Wonderful to see you too! How is everything?” All the while, you are frantically whirring through your memory banks trying to anchor this person in some time and context.

A decent human being would sense your distress and give you some lagniappe of information — a mention of the church picnic you both attended, the parents’ association at school, the fact that the two of you were formerly married. But the Proustian bully will give you nothing. “I’m good. And you?” It’s like trying to get an arms control concession out of Leonid Brezhnev.

Your only strategy is evasive vagueness, conversational rope-a-dope until you can figure out who this person is. You start talking in the tone of over-generalized blandness that suggests you have recently emerged from a coma.

Sensing your pain, your enemy pours it on mercilessly. “And how is Mary, and little Steven and Rob?” People who needlessly display their knowledge of your kids’ names are the lowest scum of the earth.

You’re in agony now, praying for an episode of spontaneous combustion. But still she drives the blade in deeper, “That was some party the other night wasn’t it?”

You lose vision. What party? Did you see this person at a party? By now, articulation is impossible. You are a puddle of gurgling noises and awkward silences. After the longest of these pauses, she goes for the coup de grâce: “You have no idea who I am, do you?”

You can’t tell the truth. That would be an admission of social defeat. The only possible response is: “Of course, I know who you are. You’re the hooker who hangs around on 14th Street most Saturday nights.”

The dawning of the Bad Memory Century will have vast consequences for the social fabric and the international balance of power. International relations experts will notice that great powers can be defined by their national forgetting styles. Americans forget their sins. Russians forget their weaknesses. The French forget that they’ve forgotten God. And, in the Middle East, they forget everything but their resentments.

There will be new social movements and causes. The supermarket parking lots will be filled with cranky criminal gangs composed of middle-aged shoppers looking for their cars. As it becomes clear that a constant stream of blog posts and e-mails decimates the capacity for recall, people will be confronted with the modern Sophie’s choice — your BlackBerry or your mind.

Neural environmentalists will emerge from the slow foods movement, urging people to accept memory loss as a way to reduce their mental footprint. Meanwhile, mnemonic gurus will emerge offering to sell neural Viagra, but the only old memories the pills really bring back will involve trigonometry.

As in most great historical transformations, the members of the highly educated upper-middle class will express their suffering most loudly. It is especially painful when narcissists suffer memory loss because they are losing parts of the person they love most. First they lose the subjects they’ve only been pretending to understand — chaos theory, monetary policy, Don DeLillo — and pretty soon their conversation is reduced to the core stories of self-heroism.

Their affection for themselves will endure through this Bad Memory Century, but their failure to retrieve will produce one of the epoch’s most notable features: shorter memoirs.

This post appeared over at Integrative Spirituality a few days ago. It seems like a useful list of values that can be obtained through spiritual practice, no matter which tradition one practices in. It includes an anonymous values quiz that asks some good questions.

When lived in balance, the virtues and values found in this article are the authentic fruits of the Spirit. These classic, modern and postmodern virtues are reliable, observable "proofs," broadly recognized across spiritual communities, of authentic and effective spiritual practice, spiritual growth and a congruent spiritual lifestyle for an individual within the world. Compare your current spiritual lifestyle against these virtues now...

Within the majority of world religions they are also considered to be the time-proven community validity and reality tests for the legitimacy of the results to be achieved if one is experiencing authentic spiritual practice and spiritual growth. In the list below you will find some virtues and values from the modern, postmodern and integral worldviews not specifically mentioned in earlier classical or traditional worldview lists of virtues and values.

The virtues below do not necessarily imply belief in a God/Buddha or being self-aware that one is on a "spiritual" path. People can embody or practice these values and virtues as secular humanists. The values and virtues can also be thought of as one of the best results and proofs of authentic "enlightenment."

For ages, these virtues and values have been called the "fruits of the spirit" in humanity’s religious materials. While one could live and demonstrate these virtues without being spiritual, claiming to be spiritually growing, or to have an authentic and congruent spiritual lifestyle and practice, without increasingly manifesting more of these virtues in balance is a spiritual incongruity.

Click here to use our online tool to rate your relationship to and life application of the virtues listed below. This will make it easier to see just how balanced your application of them actually is.

Accepting - tolerating without protest; on a deeper level, recognizing the inherent and neutral truth/existence/"isness" of every occurrence
Adaptable - capable of adapting to varying conditions
Altruistic - unselfish regard for or devotion to the welfare of others
Amiable - being friendly, sociable, and congenial
Appreciative - having or showing gratitude
Attentive - heedful of the comfort or condition of others
Authentic - true to one's own personality, spirit, or character
Autonomous - existing or capable of existing independently
Aware - having or showing realization, perception, or knowledge
Balanced - mental, spiritual and emotional dynamic steadiness
Benevolent - disposed to doing good
Capable - having general efficiency and ability
Centered - emotionally stable and secure

Charitable - merciful or kind in judging others
Committed - able to act with deliberation
Communicative - able to transmit information, thought, or feeling so that it is satisfactorily received or understood
Compassionate - showing empathy
Competent - having requisite or adequate ability or qualities
Considerate - thoughtful of the rights and feelings of others
Consistent - marked by harmony, regularity, or steady continuity
Cooperative - a willingness and ability to work with others
Courageous - mental or moral strength to venture, persevere, and withstand danger, fear, or difficulty

Co-Responsible - being a co-responsible agent doing at least your approximately one six-billionth fair share to co-evolve our shared world toward the necessary improvements that you see and know need to be made
Creative - an ability to create beauty and/or art in one’s life, as well as bringing increased levels of creativity or co-creativity to solving the problems of life
Decisive - to find out or come to a decision about by investigation, reasoning, or calculation
Devoted - consecrated to a purpose
Direct - free from evasiveness or obscurity
Discerning - showing discriminating insight and understanding
Eco-Friendly - displaying environmental co-responsibility and sustainability

Ethical - guided by that which is morally good
Evolving - moving oneself forward physically, mentally and spiritually
Fair - characterized by frankness, honesty, impartiality, or candor; open; upright; free from suspicion or bias; equitable; just
Flexible - characterized by a ready capability to adapt to new, different, or changing requirements
Forgiving - allowing room for error or weakness, and giving up resentment

Generous - showing or suggesting nobility of feeling and generosity of mind
Good - possessing moral excellence or virtue
Grounded - having a firm foundation
Honest - free from fraud or deception; marked by integrity; marked by free, forthright, and sincere expression
Hopeful - desiring some good, accompanied with an expectation of obtaining it, or a belief that it is obtainable
Humane - compassionate, sympathetic, or considerate towards humans or animals

Humble - not proud or haughty; not arrogant or inappropriately assertive
Inclusive - broad in orientation or scope
Integrative - forming, coordinating, blending or integrating parts into a functioning or unified whole
Interconnected - being, becoming, or understanding our mutual connectedness in the web of life
Interdependent - understanding that both or many parties are needed to be successful
Just - rendering or disposed to render to each one his due; equitable; fair; impartial

Loving - feeling or showing affection
Loyal - true to any person or persons to whom one owes fidelity, especially as a wife to her husband, lovers to each other, and friend to friend; constant; faithful to a cause or a principle
Ministering - to supply things needful; especially, to supply consolation or remedies

Open-minded - ready to entertain new ideas
Patient - good-natured tolerance of delay or incompetence
Peaceful - peacefully resistant in response to injustice
Productive - producing, or able to produce, in large measure; fertile; profitable; also creating or maintaining productive, equitable and ethical relationships of exchange with others, society and the world
Responsible - to be reliable; to be trustworthy
Self-Disciplined - correcting or regulating oneself for the sake of improvement

Sufficiency - the quality or state of being sufficient, or adequate to the end proposed; knowing when "enough is enough," particularly in relation to the excesses of materialism
Tolerant - showing respect for the rights or opinions or practices of others

If you have not done so already, click here to use our online tool to rate your relationship to and life application of the virtues listed above. This online rating tool will make it far easier to accurately see just how balanced your current application of the virtues actually is.

For more recently added information on the values of an integral citizen in the integral age, click here.

For information relating to where the virtues have come from and their relation to both spirituality and nature, click here for more on Evolutionary Spirituality. To discuss your ideas on the virtues, click here.

Area Man Makes It Through Day

Blume summons the strength to brush his teeth.

SCHAUMBURG, IL—Despite an overwhelming, seemingly endless barrage of frustrations, area systems analyst Adam Blume made it through the entire day Tuesday, overcoming the odds against him in a Herculean display of courage, perseverance, and the indomitable human spirit.

According to witnesses, though it seemed on more than one occasion throughout the day that his life would come to an end, Blume valiantly found the wherewithal to carry on. Not only did the 37-year-old successfully get out of bed and leave his apartment, but he somehow found the strength to navigate through the day's many challenges and, once victorious, made his way back home again. Hit from every side with such formidable opponents as suburban conformity, mind-numbing coworkers, and the celebrity "infotainment" magazine he paged through on his lunch break, Blume nonetheless trudged along—permitting nothing, no matter how soul-deadening, to break his will.

"Man, what a day," Blume said regarding his 16-hour battle with everything from public transportation to profound spiritual alienation.

Experts estimate that, by 10 p.m. Tuesday night, Blume had survived exposure to approximately 1,700 advertising images of epic banality, at least 35 emotionless interactions with complete strangers without making any real human contact, and more than 25,000 moments of soul-crushing inner emptiness throughout the almost day-long struggle. In addition, he also surmounted the onslaught of more than 150 separate anxiety-producing forces, including credit card debt, weight gain, hair loss, sexual inferiority, loneliness, a dead-end job, geographical isolation from extended family, virus-laden spam, the need to keep his cell phone charged, in-store Muzak, mortality, mounting laundry and dishes, his cable bill, indefinable longing, fear of terrorism, online gossip, the unavoidable certainty of his own unimportance, nostalgia for a past that never was, severe lower-back pain, and general ennui.

"I only wish I had gotten a chance to pick up those replacement filters for the vacuum cleaner," Blume said only moments after valiantly suppressing the urge to set fire to his carefully cataloged file cabinet of insurance information and old appliance manuals. "The last ones I got were for the wrong model, but I can't take them back because I didn't save the receipt and now I need new ones."

How Blume made it out of his kitchen—let alone his apartment and suffocating cubicle—may never be known.

"And for some reason, I had the song 'Hobo Humpin' Slobo Babe' stuck in my head all day," he added.

Blume's epic odyssey of survival reportedly began at 6:15 a.m., the moment he awoke. After enduring the sudden, unrelenting attack of his bedside alarm clock, Blume resisted the near-overpowering compulsion to press the snooze button a second time. Courageously hurling himself from bed and dragging his almost unconscious body the 15 feet to his bathroom, Blume was almost defeated before even making it to work when, as he was putting toothpaste on his toothbrush, it fell on the floor.

"I thought I was going to lose it right there," Blume later told reporters. "It was lying in that space between the sink and the bathtub, covered in dust, so I had to bend over, grab it, rinse it off under some hot water, and put some more toothpaste on it. I hate when that happens."

According to roommate Joe Tesch, with whom Blume shares an apartment despite already having reached middle age, the physically, financially, and spiritually exhausted man then stared at his hollow face in the mirror for approximately three minutes before showering, shaving, and moving his bowels in time to catch the 7:04 bus.

After arriving at work, Blume's trials and tribulations only continued. Over the next 10 hours, Blume weathered an onslaught against his very humanity, from automated menus on telephones and cash machines, to shrill homeless men yelling in the street, to a coffee stain on his workplace-mandated tie.

This was not Blume's first exposure to adversity. When pressed, he was able to recall several such incidents, including the time in May 1993 when he walked on crutches all the way from the bus stop at the bottom of a large hill in Madison, WI to the unemployment office located at the top, the 72 hours he spent stranded in Chicago's O'Hare Airport during the 2004 Christmas season, and the thousands of other battles before, between, and since.

"Another day, another dollar," said Blume, modestly downplaying the impressive scope of his accomplishments. "I suppose I just did what anybody would have done."

Engine against th’ Almighty, sinner's tower,
Reversed thunder, Christ-side-piercing spear,
The six days’ world transposing in an hour;
A kind of tune which all things hear and fear:

Softness and peace and joy and love and bliss;
Exalted manna, gladness of the best;
Heaven in ordinary, man well dressed,
The milky way, the bird of paradise,

Church-bells beyond the stars heard, the soul’s blood,
The land of spices; something understood.

* * *

Davis McCombs Comments: I first read George Herbert’s “Prayer (I)” nearly twenty years ago this Fall, in a year-long survey course in English Literature I was taking at Harvard: “It’s Tuesday, this must be Spenser,” I used to say.

When I look at my creased and slumping copy of the Norton Anthology from that time, I see that I’ve drawn stars around Herbert’s poem, as well as noting in the margin, NO VERBS! I still remember how thrilled I was by the idea of a perfect English sonnet with not one verb.

Helen Vendler, with whom I was studying at the time, has argued that the poem’s trajectory represents, as she puts it, a “spiritual evolution from dry recitation, through resentment, to spiritual refreshment.” I like that. I also like the idea that the poem has no trajectory; that prayer is all of these things and that it is all of them at once. Others have used the poem as a window into Herbert’s reading of the Bible or into his thinking about various questions of theology. Two decades later, the poem still seems fresh to me. Each phrase or “metaphorical appositive,” as one critic puts it, is like a little puzzle, demanding and rewarding attention, all culminating in the soothing, if still enigmatic, chords of that last phrase: “something understood.” And we haven’t even gotten to the question of why Herbert decided to forgo verbs in the first place: what effect is intended? what message is implied?

I loved all that about “Prayer (I)”; I still do, but I have to admit that the aspiring poet I was in the Fall of 1990, looking back across nearly four centuries to Herbert in his little parish church in Wiltshire, England, was most excited by what I took to be the challenge thrown down by the poem. From that moment on, I wanted to write my own verbless poem, one that, like Herbert’s, wouldn’t be just a static or random grouping of words or phrases, but would have coherence, momentum. How does one get the boxcars of a poem rumbling down the tracks without the rods and pistons of verbs?

Over the years, I’ve collected other verb-free poems: Robert Francis’s “Silent Poem,” for instance. I’ve even made the assignment to write a verbless poem a regular feature of my poetry workshops. Banning verbs for an assignment or two, it seems to me, is a great way to get one’s creative writing students to enter and utilize language in new ways, to let go of narrative, to alter their thinking about poetry and to be altered.

About Davis McCombs: Davis McCombs, a Yale Younger Poets Award winner selected by W.S. Merwin, directs the Creative Writing Program at the University of Arkansas. He attended Harvard University, the University of Virginia (MFA) and Stanford University as a Wallace Stegner Fellow. He is the recipient of fellowships from the Ruth Lilly Poetry Foundation, the Kentucky Arts Council, and the National Endowment for the Arts. His work has appeared in Best American Poetry 1996, The Missouri Review, and Hayden's Ferry Review. He is the author of Ultima Thule (Yale University Press) and Dismal Rock (Tupelo Press).

It seems that the conservative establishment -- led by the often insightful and occasionally infuriating David Brooks -- is starting to realize that Western values (such as democracy) cannot be imposed on tribally-based cultures.

The French and British (and others) tried it when they drew the national borders in the Middle East and Far East, borders that had nothing to do with tribal loyalties or religious affiliations. Now that those externally imposed borders are starting to break down, as Henry Kissinger recently pointed out, and our attempts to impose Western-style government in Iraq have failed, some people are starting to wake up to the obvious.

As Philip Carl Salzman argues in “Culture and Conflict in the Middle East” (brilliantly reviewed by Stanley Kurtz in The Weekly Standard), many Middle Eastern societies are tribal. The most salient structure is the local lineage group. National leaders do not make giant sacrifices on behalf of the nation because their higher loyalty is to the sect or clan. Order is achieved not by the top-down imposition of abstract law. Instead, order is achieved through fluid balance of power agreements between local groups.

In a society like this, political progress takes different forms. It’s not top down. It’s bottom up. And this is exactly the sort of progress we are seeing in Iraq. While the Green Zone politicians have taken advantage of the surge by trying to entrench their own power, things are happening at the grass-roots.

Iraqis are growing more optimistic. Fifty-five percent of Iraqis say their lives are going well, up from 39 percent last August, according to a poll conducted by ABC News and other global television networks. Forty-nine percent now say the U.S. was right to invade Iraq, the highest figure recorded since this poll began in 2004.

More generally, the Iraqi people are sick of war and are punishing those leaders and forces that perpetrate it. “A vital factor in the security improvement is public backlash against the chaos and extremism of the past five years,” declared Yahia Said of the Revenue Watch Institute in written testimony to the Senate Foreign Relations Committee.

And, as one would expect, the local clans have taken control. Iraqi politics have become hyperlocalized, Colin Kahl, a Georgetown professor and Obama adviser, has observed. The most prestigious groups in Iraqi society are tribes and Awakening Councils. Many of these councils earned legitimacy by fighting during the height of the violence and have now come out in the open as local authorities.

I think Brooks, who has been a loyal supporter of the Iraq War even while criticizing Bush, is misreading the situation, but he is on the right track.

Tribal culture is run from the bottom up, so that our efforts to impose rule from the national level are doomed to failure. And it seems that those on the ground know that, or at least have finally figured it out.

These groups have created a fluid network of fragile truces. They squabble over money, power, ideology and sectarian issues. But they have incentives to keep the peace. Sunni leaders have come to realize that they can’t win a civil war against the Shiites. Shiite militia leaders recognize that their own prestige and power drop the more they fight.

As Stephen Biddle of the Council on Foreign Relations observed in his Senate testimony last week: “This does not mean sectarian harmony or brotherly affection in Iraq. But it does mean that cold, hard strategic reality increasingly makes acting on hatred too costly for Sunni insurgents and Shiite militias.”

The surge didn’t create the network of truces, but the truces couldn’t have happened without the surge. More than 70,000 local council members are paid by the Americans. They rely on the U.S. military to enforce bargains and deter truce-breaking. Thanks to these arrangements, ethno-sectarian violence dropped by 90 percent between June 2007 and March 2008. That’s the result of political progress, not just counterinsurgency techniques.

It has become common to belittle these truces. After all, they are not written by legislators on parchment. And indeed there’s a significant chance that they will collapse and the country will devolve into anarchy.

But in certain societies, this is the way order is established, through what Salzman calls “balanced opposition.” As long as the network of truces holds, then the next president (Democrat or Republican) will have an overwhelming incentive to nurture the fragile peace.

I don't think the surge made this happen. But then I am not on the ground in Iraq. My sense is that this is occurring in spite of the surge. If we had been encouraging this type of bottom-up politics from the beginning, it's entirely possible that we would already be out of Iraq.

Here is some of the interview, which focuses more on getting science into a sports magazine:

Michele Wilson: Why did you want to write this very scientific article for the Sports Illustrated package about steroids?

David Epstein: We knew people were paying attention to this, with the Roger Clemens news. We wanted to do a present story, a past story and a future story. I knew the future story would have to be pretty science-based. It couldn’t be too narrative - from the personal angle of an athlete - because nobody is in the future. People aren’t getting caught doing [gene therapy] yet, so I knew it had to be pretty science-oriented.

MW: How did you know gene altering was the right topic?

DE: I have a background both in sports and in science. I had heard about this stuff years ago. I knew there were studies of this German baby who had genetic mutations that caused him to be more muscular. Some athletes would contend, when they were caught and tested positive for drugs, that, like the baby, they had genetic abnormalities that could’ve made them the kind of athletes they are. So I knew this stuff was out there, but I never delved into it to find out how plausible it was, where it was right now. By keeping up with the science news, I knew where the next frontier would probably be. I started calling my sources and they told me that that frontier is closer than I had anticipated.

MW: Why has the press steered away from covering the science of gene doping?

DE: It can be a little technical. I always read through peer-reviewed literature. I think it’s really important to do to get a sense of how unsure some of this research is. It’s not really as black and white as sometimes it needs to be. When you read through those articles, they’re very difficult to get through, extremely difficult if you don’t have somewhat of a science background. So I think that’s one hurdle, that the primary sources are just difficult to understand.

Also, it’s really cutting edge, and no one’s gotten caught yet. Most of the steroid stories that have come out since I’ve paid attention have been based on some kind of law enforcement action, like Balco, where they raided a physical location; they had a trial that people could go to. No one’s gotten caught for this, so it’s harder to generate a news hook. One of the luxuries of being at a place like SI is that they tolerate something a little more expansive and nuanced, where it didn’t take someone getting caught for us to start writing about it. You have to put some faith in your readers that they’ll be interested in reading about it.

MW: Se-Jin Lee, one of the scientists in your article, has been quoted, albeit briefly, in several places. How did you get different information than what’s already been written?

DE: When I first contacted him, he said, “I don’t do sports interviews anymore” [because reporters focused on the athlete angle rather than the science]. But he had been a Sports Illustrated reader for a while, and he said, “If you promise me you’ll explain that my work is not to enhance athletes, we can do it.”

I think the challenge with any magazine reporting is that we’re often not completely breaking something. You have to find a way to do something new. Since I’ve been at SI, I’ve learned that interviewing for magazines has to be so much more detailed. You have to demand more time from people. I probably read fifty peer-reviewed science journal articles before I talked to Lee, and I interviewed other people, so I really had a good feel for the landscape. Like any interview, we hit it off. He was a big sports fan, so he was as happy to talk to me as I was to him. Se-Jin Lee discovered myostatin (the protein that tells muscles when to stop growing). He’s the man. He kind of started this whole thing.

I've been blogging about subpersonality theory for years (see the sidebar). Psychologists have been writing about it for more than 75 years. But all it takes is one book and a good marketing scheme to make it seem as though the author, Rita Carter in this case, has just come up with the theory on her own. Carter's new book, Multiplicity: The New Science of Personality, Identity, and the Self, makes it seem as though no one had ever thought of this before.

So, you might understand my displeasure with the misleading word, "new," in the title of Carter's book.

Dinesh Ramde, an Associated Press writer, reviewed the book and that review has been picked up by most newspapers that still carry book reviews, including the local paper here in Tucson. Here is a piece:

In "Multiplicity: The New Science of Personality, Identity, and the Self," Carter describes the concept of a personality and explains how one's multiple personalities interact. Carter's professional credentials aren't given, and it's unclear whether she's summarizing conclusions from peer-reviewed journals or simply producing her own hypotheses.

Her arguments sound logical. After all, we can all relate to acting differently at different times. Carter's point is that each actor is a different personality, and how we think and feel depends on which personality is taking center stage at that moment.

For example, suppose on Monday an acquaintance invites you to a weekend party. You accept because you're genuinely excited about the event. But by Friday afternoon you might feel so uninterested that you wonder why you accepted in the first place.

It's not that your mood has changed, Carter would argue — your Likes-to-Party personality was dominant when the invitation was accepted, but it later got bumped off stage by your Keeps-to-Yourself personality.

Because we think of ourselves as single personalities, we can't understand why our feelings have changed. But instead of berating ourselves for the change of heart, Carter says, we should focus on controlling the personalities that cause problems. In other words, persuade Keeps-to-Yourself to step back and allow Likes-to-Party to replace it for the night.

Can we really choose which personalities to keep active and which to confine to the background? Certainly, Carter writes. She spends the second half of the book walking readers through simple exercises designed to help them do just that.

But here's where the book begins to lag. The exercises may have value for readers willing to invest the time and mental energy, but a skeptic may remain unmoved.

For example, one exercise has the feel of a tea party among imaginary friends. Carter suggests that readers gather their "personalities" for a discussion: Set one empty chair per personality, visualize each personality sitting there and command the self-destructive personalities to ease up.

This exercise may very well help some people, but it's hard to imagine any but the most hardcore readers actually trying it.

There might be value to rethinking who we are, and the idea of multiple personalities jockeying for temporary control of our brains does help explain those times when we look back and think, "Did I really do that?" or "That was really out of character for me."

But "Multiplicity" promises more than it delivers. To her credit, Carter writes with simplicity and wit, and her anecdotes are engaging. But unless readers are willing to fully immerse themselves in Carter's exercises, the book will be more academic than therapeutic.

I hope this book doesn't find a market because it sounds as though Carter doesn't have the slightest idea how to work with parts.

One does NOT "command" parts to do anything. Nor do we "control" or "persuade" them. Parts exist for a reason. Each of our major parts originated in response to some form of trauma or pain. When those parts were "born," they worked to disown or exile other parts of our psyche that were seen as too vulnerable or dangerous. [See here and here for more complete explanations of how parts develop.]

When we work with our parts, we need to treat them with respect and approach them with curiosity and an open mind. If we are adversarial with them, they may pretend to relent, but they will quickly reassert their presence, usually even more powerfully than before.

Dick Schwartz gives an example from his early work with abuse survivors, who generally have the most destructive parts -- ones that lead to eating disorders, addictions, cutting, and even suicide. One woman, who was a persistent cutter, wasn't making much progress with the family therapy model in which he was trained. So, in desperation, he began to ask her about the cutting, which the woman described as a part.

Feeling a bit too egoic (a big mistake when working with parts), he decided that the woman would not leave the office until the "cutter part" agreed not to hurt her anymore. After a few hours, he and the client convinced the part not to cut her in the following week.

When the woman showed up for her next session, rather than the usual cuts on her arms, she had a large gash on the side of her face. From this painful experience, Schwartz learned that parts like these have to be negotiated with, respectfully, not coerced.

While most of us aren't cutters and don't have these extreme types of parts, nearly all of us have critics, pushers, or perfectionists (among others) that control our lives at times. These parts can be just as defiant when threatened. They think that they are protecting the psyche from pain, humiliation, shame, or whatever, and that if they go away the psyche will be damaged. Most often, these parts are also keeping exiles suppressed, those parts deemed too fragile or vulnerable or dangerous to allow into consciousness.

Back to Carter's book. In what appears to be an advertorial for the book, Carter has an article in the March issue of New Scientist in which she explains the distinction between DID (dissociative identity disorder, which used to be called multiple personality disorder) and multiplicity. This is useful for the lay reader who may have read Sybil or seen The Three Faces of Eve at some point and doesn't want to think of himself as "insane."

It's time for a real book on subpersonalities and multiplicity -- one that is psychologically accurate and capable of grasping the subtlety of work with parts.