The Senate of the United States shall be composed of two Senators from each State, chosen by the Legislature thereof, for six Years; and each Senator shall have one Vote.

Each state has two senators, regardless of the size of its population. Originally, senators were chosen by state legislatures. In 1913 the 17th amendment provided that senators would be directly elected by the people.

Immediately after they shall be assembled in Consequence of the first Election, they shall be divided as equally as may be into three Classes. The Seats of the Senators of the first Class shall be vacated at the Expiration of the second Year, of the second Class at the Expiration of the fourth Year, and of the third Class at the Expiration of the sixth Year, so that one third may be chosen every second Year; and if Vacancies happen by Resignation, or otherwise, during the Recess of the Legislature of any State, the Executive thereof may make temporary Appointments until the next Meeting of the Legislature, which shall then fill such Vacancies

My thought is that the 17th Amendment should be repealed and the original Section 3 of the Constitution reinstated by default. This would help curb the power of the federal government over state governments, which was the intent of Section 3 in the first place. Senators would be appointed by state governments, ensuring their loyalty to the state they represent, not to some special interest group or party. The House of Representatives is the body of government that was intended to be directly voted into office by the people. It's because of the 17th Amendment that the federal government has been able to gain increasing power over states' rights, and it is a major reason for the political deadlock between parties, constant campaigning, corruption, and career politicians out of touch with their states' needs. I don't know what group of idiots in 1913 thought they were smarter than the guys who penned the final draft of the Constitution and the states that ratified it, but they were idiots, and the cause of much of what ails us today. I could keep going, but then my thought would undeniably be a rant.

Mathematician and Economist; Principal, Natron Group

The sophisticated "scientific concept" with the greatest potential to enhance human understanding may be argued to come not from the halls of academe, but rather from the unlikely research environment of professional wrestling.

Evolutionary biologists Richard Alexander and Robert Trivers have recently emphasized that it is deception rather than information that often plays the decisive role in systems of selective pressures. Yet most of our thinking continues to treat deception as something of a perturbation on the exchange of pure information, leaving us unprepared to contemplate a world in which fakery may reliably crowd out the genuine. In particular, humanity's future selective pressures appear likely to remain tied to economic theory, which currently uses as its central construct a market model based on assumptions of perfect information.

If we are to take selection more seriously within humans, we may fairly ask what rigorous system would be capable of tying together an altered reality of layered falsehoods in which absolutely nothing can be assumed to be as it appears. Such a system, in continuous development for more than a century, is known to exist and now supports an intricate multi-billion dollar business empire of pure hokum. It is known to wrestling's insiders as "Kayfabe."

Because professional wrestling is a simulated sport, all competitors who face each other in the ring are actually close collaborators who must form a closed system (called "a promotion") sealed against outsiders. With external competitors generally excluded, antagonists are chosen from within the promotion and their ritualized battles are largely negotiated, choreographed, and rehearsed at a significantly decreased risk of injury or death. With outcomes predetermined under Kayfabe, betrayal in wrestling comes not from engaging in unsportsmanlike conduct, but from the surprise appearance of actual sporting behavior. Such unwelcome sportsmanship, which "breaks Kayfabe," is called "shooting," to distinguish it from the expected scripted deception called "working."

Were Kayfabe to become part of our toolkit for the twenty-first century, we would undoubtedly have an easier time understanding a world in which investigative journalism seems to have vanished and bitter corporate rivals cooperate on everything from joint ventures to lobbying efforts. Perhaps confusing battles between "freshwater" Chicago macroeconomists and Ivy League "saltwater" theorists could be best understood as happening within a single "orthodox promotion," given that both groups suffered no injury from failing (equally) to predict the recent financial crisis. The decades-old battle in theoretical physics over bragging rights between the "string" and "loop" camps would seem to be an even more significant example within the hard sciences of a collaborative intra-promotion rivalry, given the apparent failure of both groups to produce a quantum theory of gravity.

What makes Kayfabe remarkable is that it gives us potentially the most complete example of the general process by which a wide class of important endeavors transition from failed reality to successful fakery. While most modern sports enthusiasts are aware of wrestling's status as a pseudo-sport, what few alive today remember is that it evolved out of a failed real sport (known as "catch" wrestling), which held its last honest title match early in the 20th century. Typical matches could last hours with no satisfying action, or end suddenly with crippling injuries to a promising athlete in whom much had been invested. This highlighted the close relationship between two paradoxical risks which define the category of activity that wrestling shares with other human spheres:

A) Occasional but extreme peril for the participants.

B) General monotony for both audience and participants.

Kayfabrication (the process of transition from reality towards Kayfabe) arises out of attempts to deliver a dependably engaging product for a mass audience while removing the unpredictable upheavals that imperil participants. As such, Kayfabrication is a dependable feature of many of our most important systems which share the above two characteristics, such as war, finance, love, politics and science.

Importantly, Kayfabe also seems to have discovered the limits of how much disbelief the human mind is capable of successfully suspending before fantasy and reality become fully conflated. Wrestling's system of lies has recently become so intricate that wrestlers have occasionally found themselves engaging in real-life adultery following exactly behind the introduction of a fictitious adulterous plot twist in a Kayfabe back-story. Eventually, even Kayfabe itself became a victim of its own success, as it grew to a level of deceit that could not be maintained when the wrestling world collided with outside regulators exercising oversight over major sporting events.

At the point Kayfabe was forced to own up to the fact that professional wrestling contained no sport whatsoever, it did more than avoid being regulated and taxed into oblivion. Wrestling discovered the unthinkable: its audience did not seem to require even a thin veneer of realism. Professional wrestling had come full circle to its honest origins by at last moving the responsibility for deception off of the shoulders of the performers and into the willing minds of the audience.

Kayfabe, it appears, is a dish best served client-side.

Conservatives like to talk about the causes of Western Civilization’s downfall: feminism, loose morality, drug abuse, Christianity’s decline, reality TV. Blaming civilization’s downfall on lardy hagfish such as Andrea Dworkin is like a doctor diagnosing senility by an old person’s wrinkles. The fact that anyone listened to such a numskull is a symptom, not the cause, of a culture in decline. The cause of civilizational decline is dirt-simple: lack of contact with objective reality. The great banker-journalist (and founder of the original National Review) Walter Bagehot said it well almost 150 years ago:


On the morning of May 6, 1783, Guy Carleton, the British commander charged with winding down the occupation of America, boarded the Perseverance and sailed up the Hudson River to meet George Washington and discuss the British withdrawal. Washington was furious to learn that Carleton had sent ships to Canada filled with Americans, including freed slaves, who had sided with Britain during the revolution.

Britain knew these loyalists were seen as traitors and had no future in America. A Patriot using the pen name “Brutus” had warned in local papers: “Flee then while it is in your power” or face “the just vengeance of the collected citizens.” And so Britain honored its moral obligation to rescue them by sending hundreds of ships to the harbors of New York, Charleston and Savannah. As the historian Maya Jasanoff has recounted, approximately 30,000 were evacuated from New York to Canada within months.

Two hundred and twenty-eight years later, President Obama is wrapping up our own long and messy war, but we have no Guy Carleton in Iraq. Despite yesterday’s announcement that America’s military mission in Iraq is over, no one is acting to ensure that we protect and resettle those who stood with us.

Earlier this week, Mr. Obama spoke to troops at Fort Bragg, N.C., of the “extraordinary milestone of bringing the war in Iraq to an end.” Forgotten are his words from the campaign trail in 2007, that “interpreters, embassy workers and subcontractors are being targeted for assassination.” He added, “And yet our doors are shut. That is not how we treat our friends.”

Four years later, the Obama administration has admitted only a tiny fraction of our own loyalists, despite having eye scans, fingerprints, polygraphs and letters from soldiers and diplomats vouching for them. Instead we force them to navigate a byzantine process that now takes a year and a half or longer.

The chances for speedy resettlement of our Iraqi allies grew even worse in May after two Iraqi men were arrested in Kentucky and charged with conspiring to send weapons to jihadist groups in Iraq. These men had never worked for Americans, and they managed to enter the United States as a result of poor background checks. Nevertheless, their arrests removed any sense of urgency in the government agencies responsible for protecting our Iraqi allies.

The sorry truth is that we don’t need them anymore now that we’re leaving, and resettling refugees is not a winning campaign issue. For over a year, I have been calling on members of the Obama administration to make sure the final act of this war is not marred by betrayal. They have not listened, instead adopting a policy of wishful thinking, hoping that everything turns out for the best.

Meanwhile, the Iraqis who loyally served us are under threat. The extremist Shiite leader Moktada al-Sadr has declared the Iraqis who helped America “outcasts.” When Britain pulled out of Iraq a few years ago, there was a public execution of 17 such outcasts — their bodies dumped in the streets of Basra as a warning. Just a few weeks ago, an Iraqi interpreter for the United States Army got a knock on his door; an Iraqi policeman told him threateningly that he would soon be beheaded. Another employee, at the American base in Ramadi, is in hiding after receiving a death threat from Mr. Sadr’s militia.

It’s not the first time we’ve abandoned our allies. In 1975, President Gerald R. Ford and Henry A. Kissinger ignored the many Vietnamese who aided American troops until the final few weeks of the Vietnam War. By then, it was too late.

Although Mr. Kissinger had once claimed there was an “irreducible list” of 174,000 imperiled Vietnamese allies, the policy in the war’s frantic closing weeks was icily Darwinian: if you were strong enough to clear our embassy walls or squeeze through the gates and force your way onto a Huey, you could come along. The rest were left behind to face assassination or internment camps. The same sorry story occurred in Laos, where America abandoned tens of thousands of Hmong people who had aided them.

It wasn’t until months after the fall of Saigon, and much bloodshed, that America conducted a huge relief effort, airlifting more than 100,000 refugees to safety. Tens of thousands were processed at a military base on Guam, far away from the American mainland. President Bill Clinton used the same base to save the lives of nearly 7,000 Iraqi Kurds in 1996. But if you mention the Guam Option to anyone in Washington today, you either get a blank stare of historical amnesia or hear that “9/11 changed everything.”

And so our policy in the final weeks of this war is as simple as it is shameful: submit your paperwork and wait. If you can survive the next 18 months, maybe we’ll let you in. For the first time in five years, I’m telling Iraqis who write to me for help that they shouldn’t count on America anymore.

Moral timidity and a hapless bureaucracy have wedged our doors tightly shut and the Iraqis who remained loyal to us are weeks away from learning how little America’s word means.

Kirk W. Johnson, a former reconstruction coordinator in Iraq, founded the List Project to Resettle Iraqi Allies.

A version of this op-ed appeared in print on December 16, 2011, on page A43 of the New York edition with the headline: The Iraq We're Leaving Behind: Abandoning Our Friends.

"Moral timidity and a hapless bureaucracy have wedged our doors tightly shut and the Iraqis who remained loyal to us are weeks away from learning how little America’s word means."

Thank you for posting this article, P.C. It seems as though this has become a habitual pattern for our nation, and it strikes me as incredibly important. As we engage China (and Iran and many others), like it or not we will need help and we will need to be trusted. Do we mean what we say with Taiwan? Japan? The Philippines? Others? Even if the answer is yes with these particular countries, will we be seen by them (or China) as having a credible commitment? Or is the fact that we do not honor our allies likely to lead to further doubt on the part of our natural and historic allies (and enemies)?

Woof bigdog, Thank you for saying so; the whole thing has me in a very frustrated and angry mood. We know what's coming; it's like watching a train wreck. I hope there's a hell because I don't think karma is going to be able to handle all the people responsible for the carnage to come. It won't be instantaneous, it will happen over time but it will rival Pol Pot's killing fields before it's over. P.C.

I post this here because I think it exceedingly important to put the blame for the crisis where it belongs -- and right at the center, along with Fed interest rate policy, government requirements that the unqualified be given home loans, etc., are the FMs.

===========================

By PETER J. WALLISON

The Securities and Exchange Commission's lawsuits against six top executives of Fannie Mae and Freddie Mac, announced last week, are a seminal event.

For the first time in a government report, the complaint has made it clear that the two government-sponsored enterprises (GSEs) played a major role in creating the demand for low-quality mortgages before the 2008 financial crisis. More importantly, the SEC is saying that Fannie and Freddie—the largest buyers and securitizers of subprime and other low-quality mortgages—hid the size of their purchases from the market. Through these alleged acts of securities fraud, they did not just mislead investors; they deprived analysts, risk managers, rating agencies and even financial regulators of vital data about market risks that could have prevented the crisis.

The lawsuit necessarily focuses on 2006 and 2007, the years that are still within the statute of limitations. But according to the SEC complaint, the behavior went on for many years: "Since the 1990s, Freddie Mac internally categorized loans as subprime or subprime-like as part of its loan acquisition program," while its senior officials continued to state publicly that it had little or no exposure to subprime loans.

The GSEs began acquiring large numbers of subprime and other low-quality loans in the mid-1990s, as they tried to comply with the government's affordable-housing requirements—quotas for mortgage purchases imposed by the Department of Housing and Urban Development (HUD) under legislation enacted by Congress in 1992.

These quotas initially required that, of all the loans bought by Fannie and Freddie in any year, 30% had to have been made to borrowers earning at or below the median income in their communities. The quotas, however, would increase—they rose to 40% in 1996, 50% in 2000, and 55% in 2007. HUD also added and raised quotas for "special affordable" loans that were to be made to borrowers with low or very low incomes (in some cases a mere 60% of the area median income).

It is certainly possible to find prime mortgages among borrowers whose incomes are below the median, but this becomes more difficult as the quota percentages increase. Indeed, by 2000 Fannie and Freddie were offering to buy zero-down payment loans and buying large numbers of subprime mortgages in order to meet the HUD quotas.

According to the SEC, for example, Fannie failed to disclose a low-quality loan known as an Expanded Approval (EA) mortgage—even though these loans had the highest rate of "serious delinquency" (90 days past due, and almost certainly going to foreclosure) in Fannie's book. Those EA loans—as then-Chairman Daniel Mudd told the House Financial Services Committee in April 2007—"helped us meet our HUD affordable housing requirements."

Meeting these quotas made Fannie and Freddie important factors in the financial crisis. Relying on the research of my colleague Edward Pinto at the American Enterprise Institute, I stated in my dissent from the majority report of the Financial Crisis Inquiry Commission that there were approximately 27 million subprime and other risky mortgages outstanding on June 30, 2008, and a lion's share was on Fannie and Freddie's books. That has now been largely confirmed by the SEC's data.

The SEC also charges that Fannie and Freddie's disclosures grossly understated the number of subprime and other risky loans they were holding or securitizing. For example, Freddie's Information Statement and Annual Report to Stockholders, in March 2006, reported that for 2005 and 2004 the company's exposure to subprime loans was "not significant." According to the SEC complaint, subprime mortgages at this point constituted 10% of Freddie's exposures.

Similarly, Fannie held over $94 billion in EA loans in 2007, according to the SEC—"11 times greater than the 0.3% ($8.3 billion)" in subprime loans Fannie disclosed for that year. (According to an SEC press release, both GSEs have agreed with the commission's "Statement of Facts" about their disclosure failures, without admitting or denying liability. They also agreed to cooperate with the commission's litigation against the former executives.)

Fannie and Freddie were the dominant players in the U.S. mortgage markets, by far the largest buyers of mortgages and mortgage-backed securities of all kinds. Statements by these two firms that their exposure to subprime mortgages was "not significant" or "0.3 percent" would be read by analysts and other mortgage market participants as strong indications that relatively few subprime and other low-quality mortgages were outstanding.

My own research, as a member of the Financial Crisis Inquiry Commission (and a dissenter from its majority report), did not turn up any analyst report or other public statement before the 2008 crisis that came close to estimating the actual number of subprime or other low-quality mortgages outstanding.

These failures to disclose subprime holdings meant that banks and other financial institutions, risk managers, analysts, rating agencies and even regulators may well have underestimated the risks of continuing to acquire, hold and distribute mortgages and mortgage-backed securities. Thus, when the bubble deflated in 2007, the financial system, and particularly the largest financial institutions, were primed for immense losses.

First time I've encountered this source; his CV is certainly interesting. I'm no slouch when it comes to making a desktop computer do my bidding, but my kids multitask digitally in manners that amaze me: they keep a lot of different balls in the air while still working on a primary task, often gaming. This gent thinks they might have what it takes to be a next generation war fighter:

The Future of Drone Warfare

Over half of Air Force UPT (undergraduate pilot training) grads are now assigned to pilot drones rather than a real aircraft.* The big question is why are drone pilots, guys that fly robots remotely from a computer terminal, going to a very expensive year of pilot training? I can understand why the Air Force has chosen to send drone jockeys to pilot training:

A shift to piloting drones rather than real aircraft is an assault on the organizational culture of the Air Force. In the Air Force, pilots do the fighting and, as a result, take most of the leadership positions.

A transition to robotics upends that arrangement, which is why the USAF strenuously resisted taking control of the drone mission until recently. In this light, sending these drone jockeys to a very expensive year of UPT is an attempt to ease the cultural transition. However, culture aside, is it the best training?

Drone Pilots Today

I suspect it isn't. Here's why. The assumption that combat with drones is going to be the same as combat without them is flawed. It's going to be VERY different. So far, it's hard to see that. Most engagements today involve:

a drone flying leisurely over a village in Pakistan controlled by a pilot at a terminal in Las Vegas/Nellis, waiting for five or more armed men to assemble in a single house (which is a terrorist "signature" that green lights authorization to eliminate the threat), and then pushing a button and holding a cursor over the house until it disappears.

That's not going to last long.

Drone Combat

How does the addition of drones change the nature of combat/conflict? Why? The tech is moving too fast. Here are some of the characteristics we'll see in the near future:

Swarms. The cost and size of drones will shrink. Nearly everyone will have access to drone tech (autopilots already cost less than $30). Further, the software to enable drones to employ swarm behavior will improve. So, don't think in terms of a single drone. Think in terms of a single person controlling hundreds and thousands.

Intelligence. Drones will be smarter than they are today. The average commercial chip passed the level of insect intelligence a little less than a decade ago (which "magically" resulted in an explosion of drone/bot tech). Chips will cross rat intelligence in 2018 or so. Think in terms of each drone being smart enough to follow tactical instructions.

Dynamism. The combination of massive swarms with individual elements being highly intelligent puts combat on an entirely new level. It requires a warrior that can provide tactical guidance and situational awareness in real time, at a level that is beyond current training paradigms.
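The swarm idea above can be made concrete with a toy flocking ("boids") update, a standard technique for emergent group behavior. This is only an illustrative sketch, not anything from a real drone autopilot; the 2-D setup and the rule weights are assumptions chosen for readability:

```python
def step_swarm(positions, velocities, dt=0.1,
               cohesion=0.01, separation=0.05, alignment=0.05):
    """One tick of a minimal 2-D flocking ('boids') update.

    Each drone steers toward the swarm's center (cohesion), away from
    very close neighbors (separation), and toward the group's average
    velocity (alignment). positions/velocities are lists of [x, y].
    """
    n = len(positions)
    # Swarm centroid and average velocity, shared by every drone.
    cx = sum(p[0] for p in positions) / n
    cy = sum(p[1] for p in positions) / n
    avx = sum(v[0] for v in velocities) / n
    avy = sum(v[1] for v in velocities) / n

    new_pos, new_vel = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        # Cohesion pulls toward the centroid; alignment matches the group.
        vx += cohesion * (cx - x) + alignment * (avx - vx)
        vy += cohesion * (cy - y) + alignment * (avy - vy)
        # Separation pushes apart any pair closer than 1 unit.
        for ox, oy in positions:
            dx, dy = x - ox, y - oy
            if 0 < dx * dx + dy * dy < 1.0:
                vx += separation * dx
                vy += separation * dy
        new_pos.append([x + vx * dt, y + vy * dt])
        new_vel.append([vx, vy])
    return new_pos, new_vel
```

Iterating this rule makes scattered drones converge into a loose cluster around their centroid; the point is that one operator sets the three weights and a goal, while the swarm handles its own spacing.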

Training Drone Bonjwas

Based on the above requirements, the best training for drones (in the air and on land) isn't real world training, it's tactical games (not first person shooters). Think in terms of the classic military scifi book, "Ender's Game" by Orson Scott Card. Of the games currently on the market, the best example of the type of expertise required is Blizzard's StarCraft, a scifi tactical management game that has amazing multiplayer tactical balance/dynamism. The game is extremely popular worldwide, but in South Korea, it has reached cult status. The best players, called Bonjwas, are treated like rock stars, and for good reason:

Training of hand/eye/mind. Speeds of up to 400 keyboard/mouse (macro/micro) tactical commands per minute have been attained. Think about that for a second. That's nearly 7 commands a second.

Fight multi-player combat simulations for 10-12 hours a day. They live the game for a decade and then burn out. Mind vs. mind competition, continuously.

To become a bonjwa, you have to defeat millions of opponents to reach the tournament rank, and then dominate that rank for many years. The ranking system/ladder that farms new talent is global (Korea, China, SEA, North America, and Europe), huge (millions of players), and continuous (24x7x365). Currently, the best StarCraft bonjwa in the world is Flash. Here's his Elo rating.
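The Elo rating mentioned above is the standard chess-style skill metric used on competitive ladders. As a sketch of how the ladder scores a match, the core update rule looks like this (the K-factor of 32 is a common default, not anything specific to StarCraft's ranking system):

```python
def elo_update(r_a, r_b, score_a, k=32):
    """Return the new (rating_a, rating_b) after one match.

    score_a is 1.0 for a win by player A, 0.5 for a draw, 0.0 for a loss.
    The expected score comes from the logistic curve with a 400-point scale:
    a 400-point favorite is expected to win about 10 times out of 11.
    """
    expected_a = 1 / (1 + 10 ** ((r_b - r_a) / 400))
    delta = k * (score_a - expected_a)
    return r_a + delta, r_b - delta
```

An upset win transfers far more points than a win by the favorite, which is why dominating the top rank for years, as a bonjwa must, requires beating opponents from whom there are almost no points left to take.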

By THOMAS MEANEY

As a teenager, Friedrich Nietzsche was fascinated by America. "The American way of laughing does me good," he wrote after reading "The Adventures of Tom Sawyer," "especially this sort of sturdy seaman like Mark Twain." In the essays of Ralph Waldo Emerson he discovered a "brother-soul" who kindled his lifelong passion for truth-seeking. Despite making his name as the greatest anti-democratic thinker of his age, Nietzsche believed that America was a land of free spirits, unburdened by the weight of the European past.

American readers, for their part, have repaid Nietzsche's attentions. More than any other European thinker, he is alive in our cultural bloodstream. But in a country that, from the start, elevated the values of efficiency and equality over the virtues of aristocratic excellence, Nietzsche's message was bound to mutate. We have blunted his challenge to "create yourself" into a commercial catchphrase; we prefer to "like" our fellow citizens rather than to love or hate them; we don't hesitate to declare any child who dabbles in crayons an "artist." As a culture, we have given Nietzsche a happy ending.

What does our use and abuse of Nietzsche's thinking say about us? This is the interesting question that Jennifer Ratner-Rosenhagen sets out to answer in "American Nietzsche," her elegant and revealing account of America's reckoning with the German thinker. She samples the gamut of responses to Nietzsche in an effort to explain how nearly every segment of American culture "discovered in Nietzsche a thinker to think with."

For American thinkers wrestling with the anxieties unleashed by living in a pluralist democracy, Nietzsche not only diagnosed the mentality more acutely than anyone else but for his careful readers—those with "a third ear"—also promised forms of higher fulfillment.

For Nietzsche, as for Emerson, the source of this fulfillment was to be found in a radically new conception of the individual. The self was not a stable entity for Nietzsche, nor was there any "true self" to be discovered. Rather, the self is something that we are constantly becoming. "We shed our old bark, we shed our skins every spring," Nietzsche writes, "we keep becoming younger, fuller of future, taller, stronger." We construct ourselves by assembling our experiences, desires and actions in the way a novelist gives coherence to the incidental plot points of a novel. "Make your own Bible!" declares Emerson. For both Nietzsche and Emerson the point was to generate meaning through a continuous act of self-creation.

Nietzsche's first American popularizer was the journalist H.L. Mencken, who was drawn to Nietzsche's European exoticism. Nevertheless, Mencken understood clearly enough that the self-created individuals that Nietzsche described could never arise easily in a democracy, where the self-creation of one citizen inevitably treads on the self-creation of another. In his 1908 book, "The Philosophy of Friedrich Nietzsche," Mencken excoriated the way that American mass society trampled on the possibility of unadjusted heroes. "It is only the under-dog . . . that believes in equality," he seethed, "it is only the mob that seeks to reduce all humanity to one dead level, for it is only the mob that would gain by such leveling."

Mencken reviled American culture for not producing more genuine artists to match their European counterparts. "The culture of the Renaissance raised itself on the shoulders of a group of a hundred men," Nietzsche wrote, and it was such a cultural avant-garde that Mencken aimed to cultivate.

A 1997 cartoon by Doug Pike.

Mencken's columns put Nietzsche's name on the American cultural map, and the philosopher's ideas provoked murmurs of enthusiasm among a coterie of readers. But Nietzsche's reputation never got off the ground with the general public in the early decades of the 20th century. The first reason was the sensational trial, in 1924, of Leopold and Loeb, who kidnapped and murdered a 14-year-old boy, apparently under the influence of Nietzsche (or so claimed Clarence Darrow, Loeb's defense attorney). The second, more significant, reason was the rise of fascism in Europe.

It was one thing for American intellectuals and academics to invoke Nietzsche in their criticism of liberal democracy when its values seemed to be secure, but it was a considerably less welcome exercise in the 1930s, when those values were on the defensive. In the lead-up to the war with Germany, Nietzsche's philosophy became hopelessly conflated with Nazism, though this association was the result of superficial reading. (Anti-Semitism, for instance, was one of Nietzsche's favorite examples of German stupidity.)

It was left to the German émigré and Princeton professor Walter Kaufmann to rehabilitate Nietzsche's reputation after World War II. In the best chapter of her book, Ms. Ratner-Rosenhagen explains how the Nietzsche we encounter in print today is largely Kaufmann's Nietzsche—mediated by his translations, collations and introductions. Kaufmann became not only Nietzsche's tireless promoter but also, to a degree, the sanitizer of his thought.

By arguing for Nietzsche's place in the Western canon alongside Kant and Hegel, Kaufmann made his subject respectable enough for the college classroom. He was also responsible for recasting Nietzsche as the forerunner of the various strains of existentialism that came into vogue in the 1960s. Nietzsche was suddenly a cultural touchstone with disciples ranging from Hugh Hefner to the Black Panther Huey Newton (the latter apparently misunderstood what Nietzsche meant by "slave morality" and thought it might be a good thing).

If there is a problem with "American Nietzsche," it is that Ms. Ratner-Rosenhagen is not quite up-front about the story she is telling. She claims at the outset that her study "is not even a book about Nietzsche"—and that, in the spirit of her subject, she will be merely presenting us with a series of interpretations in order to understand Nietzsche's "role in the ever-dynamic remaking of modern thought." But the last chapter of her book shows her to be partial to a very particular way of reading her subject. The chapter is devoted to three American Nietzscheans—Harold Bloom, Stanley Cavell and Richard Rorty—who all rediscovered American transcendentalism through Nietzsche and whose inclusion at the end of the book makes Nietzsche's thought seem like a long detour on the way back home to Emerson.

But Messrs. Cavell and Rorty have domesticated Nietzsche in peculiar ways, often sidestepping the main difficulties he presents. For Rorty, for instance, the challenge Nietzsche posed for a democratic culture could be solved by simply signing on to everything he says about the self but quarantining the rest of his unpalatable anti-democratic pronouncements. Nietzsche's two great contributions to American culture, according to Rorty, were that he provided us with an example of how we can all make an art of our private lives and that he showed us that the truth, far from having any absolute value, is simply whatever we find useful. When it comes to our democratic foundations, Rorty advises that we cheerfully embrace our lucky political inheritance, which we only risk squandering by interrogating too closely.

It would be nice if it were all that easy. But one of Nietzsche's major claims was, after all, that some of us will always rebel against the leveling effect of liberal democracy, while others—most of us—will join the herd. Likewise, Nietzsche thought that the truth was rarely ever useful. He thought errors, disasters and profound misunderstandings were much more precious.

Still, there is something to be said for the happy ending America has given Nietzsche. A country that can translate the striving of the Nietzschean superman into a guide for democracy's self-creating everyman may have discovered a rare kind of philosophical agility. The shift may not be quite fair to Nietzsche, but then he was always thrilled by America's powerful misreading of the European past.

—Mr. Meaney is a doctoral student in history at Columbia and a co-editor of the Utopian.

A War Worth Fighting
Revisionists say that World War II was unnecessary. They're wrong.
By Christopher Hitchens | June 14, 2008 11:36 AM EDT

Is there any one shared principle or assumption on which our political consensus rests, any value judgment on which we are all essentially agreed? Apart from abstractions such as a general belief in democracy, one would probably get the widest measure of agreement for the proposition that the second world war was a "good war" and one well worth fighting. And if we possess one indelible image of political immorality and cowardice, it is surely the dismal tap-tap-tap of Neville Chamberlain's umbrella as he turned from signing the Czechs away to Adolf Hitler at Munich. He hoped by this humiliation to avert war, but he was fated to bring his countrymen war on top of humiliation. To the conventional wisdom add the titanic figure of Winston Churchill as the emblem of oratorical defiance and the Horatius who, until American power could be mobilized and deployed, alone barred the bridge to the forces of unalloyed evil. When those forces lay finally defeated, their ghastly handiwork was uncovered to a world that mistakenly thought it had already "supped full of horrors." The stark evidence of the Final Solution has ever since been enough to dispel most doubts about, say, the wisdom or morality of carpet-bombing German cities.

Historical scholarship has nevertheless offered various sorts of revisionist interpretation of all this. Niall Ferguson, for one, has proposed looking at the two world wars as a single conflict, punctuated only by a long and ominous armistice. British conservative historians like Alan Clark and John Charmley have criticized Churchill for building his career on war, for ignoring openings to peace and for eventually allowing the British Empire to be squandered and broken up.
But Pat Buchanan, twice a candidate for the Republican nomination and in 2000 the standard-bearer for the Reform Party who ignited a memorable "chad" row in Florida, has now condensed all the antiwar arguments into one. His case, made in his recently released "Churchill, Hitler and the Unnecessary War," is as follows:

Buchanan does not need to close his book with an invocation of a dying West, as if to summarize this long recital of Spenglerian doomsaying. He's already opened with the statement, "All about us we can see clearly now that the West is passing away." The tropes are familiar—a loss of will and confidence, a collapse of the desire to reproduce with sufficient vigor, a preference for hedonism over the stern tasks of rulership and dominion and pre-eminence. It all sounds oddly … Churchillian. The old lion himself never tired of striking notes like these, and was quite unembarrassed by invocations of race and nation and blood. Yet he is the object of Buchanan's especial dislike and contempt, because he had a fondness for "wars of choice."

This term has enjoyed a recent vogue because of the opposition to the war in Iraq, an opposition in which Buchanan has played a vigorous role. Descending as he does from the tradition of Charles Lindbergh's America First movement, which looked for (and claimed to have found) a certain cosmopolitan lobby behind FDR's willingness to involve the United States in global war, Buchanan is the most trenchant critic of what he considers our fondest national illusion, and his book has the feel and stamp of a work that he has been readying all his life.

But he faces an insuperable difficulty, or rather difficulties. If you want to demonstrate that Germany was more the victim than the aggressor in 1914, then you must confine your account (as Buchanan does) to the very minor legal question of Belgian neutrality and of whether Britain absolutely had to go to war on the Belgian side.
(For what it may be worth, I think that Britain wasn't obliged to do so and should not have done.) But the rest of the kaiser's policy, most of it completely omitted by Buchanan, shows that Germany was looking for a chance for war all over the globe, and was increasingly the prisoner of a militaristic ruling caste at home. The kaiser picked a fight with Britain by backing the white Dutch Afrikaner rebels in South Africa and by butchering the Ovambo people of what is now Namibia. He looked for trouble with the French by abruptly sending warships to Agadir in French Morocco, which nearly started the first world war in 1905, and with Russia by backing Austria-Hungary's insane ultimatum to the Serbs after the June 1914 assassinations in Sarajevo. Moreover, and never mentioned by Buchanan at all, the kaiser visited Damascus and paid for the rebuilding of the tomb of Saladin, announced himself a sympathizer of Islam and a friend of jihad, commissioned a Berlin-to-Baghdad railroad for the projection of German arms into the Middle East and Asia and generally ranged himself on the side of an aggressive Ottoman imperialism, which later declared a "holy war" against Britain. To suggest that he felt unjustly hemmed in by the Royal Navy's domination of the North Sea while he was conducting such statecraft is absurd.

And maybe a little worse than absurd, as when Buchanan writes: "From 1871 to 1914, the Germans under Bismarck and the Kaiser did not fight a single war. While Britain, Russia, Italy, Turkey, Japan, Spain, and the United States were all involved in wars, Germany and Austria had clean records." I am bound to say that I find this creepy. The start of the "clean record" has to be in 1871, because that's the year that Prussia humbled France in the hideous Franco-Prussian War that actually annexed two French provinces to Germany.
In the intervening time until 1914, Germany was seizing colonies in Africa and the Pacific, cementing secret alliances with Austria and trying to build up a naval fleet that could take on the British one. No wonder the kaiser wanted a breathing space.

Now, this is not to say that Buchanan doesn't make some sound points about the secret diplomacy of Old Europe that was so much to offend Woodrow Wilson. And he is excellent on the calamitous Treaty of Versailles that succeeded only—as was noted by John Maynard Keynes at the time—in creating the conditions for another world war, or for part two of the first one. He wears his isolationism proudly: "The Senate never did a better day's work than when it rejected the Treaty of Versailles and refused to enter a League of Nations where American soldiers would be required to give their lives enforcing the terms of so dishonorable and disastrous a peace."

Actually, no soldier of any nation ever lost so much as a fingernail in the service of the League, which was in any case doomed by American abstention, and it's exactly that consideration which invalidates the second half of Buchanan's argument, which is that a conflict with Hitler's Germany both could and should have been averted. (There is a third Buchanan sub-argument, mostly made by implication, which is that the democratic West should have allied itself with Hitler, at least passively, until he had destroyed the Soviet Union.) Again, in order to believe his thesis one has to be prepared to argue that Hitler was a rational actor with intelligible and negotiable demands, whose declared, demented ambitions in "Mein Kampf" were presumably to be disregarded as mere propaganda. In case after case Buchanan shows the abysmal bungling of British and French diplomacy—making promises to Czechoslovakia that could never have been kept and then, adding injury to insult, breaking those promises at the first opportunity.
Or offering a guarantee to Poland (a country that had gleefully taken part in the dismemberment of Czechoslovakia) that Hitler well knew was not backed by any credible military force.

Buchanan is at his best here, often causing one to whistle at the sheer cynicism and stupidity of the British Tories. In the Anglo-German Naval Agreement of June 1935, for example, they astounded the French and Italians and Russians by unilaterally agreeing to permit Hitler to build a fleet one third the size of the Royal Navy and a submarine fleet of the same size as the British! Not only was this handing the Third Reich the weapon it would soon press to Britain's throat, it was convincing all Britain's potential allies that they would be much better off making their own bilateral deals with Berlin. Which is essentially what happened.

But Buchanan keeps forgetting that this criminal foolishness is exactly the sort of policy that he elsewhere recommends. In his view, after all, Germany had been terribly wronged by Versailles and it would have been correct to redraw the frontiers in Germany's favor and soothe its hurt feelings (which is what the word "appeasement" originally meant). Meanwhile we should have encouraged Hitler's hostility to Bolshevism and discreetly rearmed in case he should also need to be contained. This might perhaps have worked if Germany had been governed by a right-wing nationalist party that had won a democratic vote. However, in point of fact Germany was governed by an ultra-rightist, homicidal, paranoid maniac who had begun by demolishing democracy in Germany itself, who believed that his fellow countrymen were a superior race and who attributed all the evils in the world to a Jewish conspiracy. It is possible to read whole chapters of Buchanan's book without having to bear these salient points in mind. (I should say that I intend this observation as a criticism.)
As with his discussion of pre-1914 Germany, he commits important sins of omission that can only be the outcome of an ideological bias. Barely mentioned except in passing is the Spanish Civil War, for example, where for three whole years between 1936 and 1939 Germany and Italy lent troops and weapons in a Fascist invasion of a sovereign European nation that had never threatened or "encircled" them in any way. Buchanan's own political past includes overt sympathy with General Franco, which makes this skating-over even less forgivable than it might otherwise be.

On the one occasion where Spain does get a serious mention, it illustrates the opposite point to the one Buchanan thinks he's making. The British ambassador in Berlin, Sir Neville Henderson, is explaining why Hitler didn't believe that Britain and France would fight over Prague: "[Hitler] argued as follows: Would the German nation willingly go to war for General Franco in Spain, if France intervened on the side of the Republican government? The answer that he gave himself is that it would not, and he was consequently convinced that no democratic French government would be strong enough to lead the French nation to war for the Czechs."

In this instance, it must be admitted, Hitler was being a rational actor. And his admission—which Buchanan in his haste to indict Anglo-French policy completely fails to notice—is that if he himself had been resisted earlier and more determinedly, he would have been compelled to give ground. Thus the whole and complete lesson is not that the second world war was an avoidable "war of choice." It is that the Nazis could and should have been confronted before they had fully rearmed and had begun to steal the factories and oilfields and coal mines and workers of neighboring countries. As Gen. Douglas MacArthur once put it, all military defeats can be summarized in the two words: "Too late."
The same goes for political disasters.

As the book develops, Buchanan begins to unmask his true colors more and more. It is one thing to make the case that Germany was ill-used, and German minorities harshly maltreated, as a consequence of the 1914 war of which Germany's grim emperor was one of the prime instigators. It's quite another thing to say that the Nazi decision to embark on a Holocaust of European Jewry was "not a cause of the war but an awful consequence of the war." Not only is Buchanan claiming that Hitler's fanatical racism did not hugely increase the likelihood of war, but he is also making the insinuation that those who wanted to resist him are the ones who are equally if not indeed mainly responsible for the murder of the Jews! This absolutely will not do. He adduces several quotations from Hitler and Goebbels, starting only in 1939 and ending in 1942, screaming that any outbreak of war to counter Nazi ambitions would lead to a terrible vengeance on the Jews. He forgets—at least I hope it's only forgetfulness—that such murderous incitement began long, long before Hitler had even been a lunatic-fringe candidate in the 1920s. This "timeline" is as spurious, and as sinister, as the earlier dates, so carefully selected by Buchanan, that tried to make Prussian imperialism look like a victim rather than a bully.

One closing example will demonstrate the corruption and prejudice of Buchanan's historical "method." He repeatedly argues that Churchill did not appreciate Hitler's deep-seated and respectful Anglophilia, and he continually blames the war on several missed opportunities to take the Führer's genially outstretched hand. Indeed, he approvingly quotes several academic sources who agree with him that Hitler invaded the Soviet Union only in order to change Britain's mind. Suppose that Buchanan is in fact correct about this.
Could we have a better definition of derangement and megalomania than the case of a dictator who overrules his own generals and invades Russia in wintertime, mainly to impress the British House of Commons? (Incidentally, or rather not incidentally, it was precisely that hysterical aggression that curtain-raised the organized deportation and slaughter of the Jews. But it's fatuous to suppose that, without that occasion, the Nazis would not have found another one.)

It is of course true that millions of other people lost their lives in this conflict, often in unprecedentedly horrible ways, and that new tyrannies were imposed on the countries—Poland, Czechoslovakia and China most notably—that had been the pretexts for a war against fascism. But is this not to think in the short term? Unless or until Nazism had been vanquished, millions of people were most certainly going to be either massacred or enslaved in any case. Whereas today, all the way from Portugal to the Urals, the principle of human rights and popular sovereignty is at least the norm, and the ideas of racism and totalitarianism have been fairly conclusively and historically discredited. Would a frightened compromise with racist totalitarianism have produced a better result?

Winston Churchill may well have been on the wrong side about India, about the gold standard, about the rights of labor and many other things, and he may have had a lust for war, but we may also be grateful that there was one politician in the 1930s who found it intolerable even to breathe the same air, or share the same continent or planet, as the Nazis. (Buchanan of course makes plain that he rather sympathizes with Churchill about the colonies, and quarrels only with his "finest hour." This is grotesque.) As he closes his argument, Buchanan again refuses to disguise his allegiance.
"Though derided as isolationists," he writes, "the America First patriots kept the United States out of the war until six months after Hitler had invaded Russia." If you know anything at all about what happened to the population of those territories in those six months, it is rather hard to be proud that America was neutral. But this is a price that Buchanan is quite willing to pay.

I myself have written several criticisms of the cult of Churchill, and of the uncritical way that it has been used to stifle or cudgel those with misgivings. ("Adlai," said John F. Kennedy of his outstanding U.N. ambassador during the Bay of Pigs crisis, "wanted a Munich.") Yet the more the record is scrutinized and re-examined, the more creditable it seems that at least two Western statesmen, for widely different reasons, regarded coexistence with Nazism as undesirable as well as impossible. History may judge whether the undesirability or the impossibility was the more salient objection, but any attempt to separate the two considerations is likely to result in a book that stinks, as this one unmistakably does.

Not often do I find TF to be worthy of posting here (understatement) but today is an exception.

=====================

So Much Fun. So Irrelevant.
By THOMAS L. FRIEDMAN
Published: January 3, 2012

Two things have struck me about the Republican presidential candidate debates leading up to the Iowa caucuses. One is how entertaining they were. The other is how disconnected they were from the biggest trends shaping the job market of the 21st century. What if the 2012 campaign were actually about the world in which we’re living and how we adapt to it? What would the candidates be talking about?

Surely at or near the top of that list would be the tightening merger between globalization and the latest information technology revolution. The I.T. revolution is giving individuals more and more cheap tools of innovation, collaboration and creativity — thanks to hand-held computers, social networks and “the cloud,” which stores powerful applications that anyone can download. And the globalization side of this revolution is integrating more and more of these empowered people into ecosystems, where they can innovate and manufacture more products and services that make people’s lives more healthy, educated, entertained, productive and comfortable.

The best of these ecosystems will be cities and towns that combine a university, an educated populace, a dynamic business community and the fastest broadband connections on earth. These will be the job factories of the future. The countries that thrive will be those that build more of these towns that make possible “high-performance knowledge exchange and generation,” explains Blair Levin, who runs the Aspen Institute’s Gig.U project, a consortium of 37 university communities working to promote private investment in next-generation ecosystems.

Historians have noted that economic clusters always required access to abundant strategic inputs for success, says Levin. In the 1800s, it was access to abundant flowing water and raw materials. In the 1900s, it was access to abundant electricity and transportation. In the 2000s, he said, “it will be access to abundant bandwidth and abundant human intellectual capital” — places like Silicon Valley, Austin, Boulder, Cambridge and Ann Arbor.

But we need many more of these. As the world gets wired together through the Web and social networks, and as more and more sensors run machines that are talking to other machines across the Internet, we are witnessing the emergence of “Big Data.” These are the mountains of data coming out of all these digital interactions, which can then be collected, sifted, mined and analyzed — like raw materials of old — to provide the raw material for new inventions in health care, education, manufacturing and retailing.

“We’re all aware of the approximately two billion people now on the Internet — in every part of the planet, thanks to the explosion of mobile technology,” I.B.M.’s chairman, Samuel Palmisano, said in a speech last September. “But there are also upward of a trillion interconnected and intelligent objects and organisms — what some call the Internet of Things. All of this is generating vast stores of information. It is estimated that there will be 44 times as much data and content coming over the next decade ... reaching 35 zettabytes in 2020. A zettabyte is a 1 followed by 21 zeros. And thanks to advanced computation and analytics, we can now make sense of that data in something like real time.”
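As a quick sanity check on the figures in Palmisano's quote, the arithmetic hangs together; note that the implied circa-2010 baseline below is my inference from the quoted numbers, not a figure from the article.

```python
# Sanity-check the data-growth figures quoted from IBM's Samuel Palmisano.
# A zettabyte is 10**21 bytes ("a 1 followed by 21 zeros").
ZETTABYTE = 10**21

projected_2020_zb = 35   # projected total for 2020, in zettabytes
growth_factor = 44       # "44 times as much data" over the decade

# The implied starting point (circa 2010) is therefore under 1 zettabyte
# (this baseline is inferred, not stated in the article):
baseline_zb = projected_2020_zb / growth_factor
print(f"Implied ~2010 baseline: {baseline_zb:.2f} ZB")    # about 0.80 ZB
print(f"35 ZB in bytes: {projected_2020_zb * ZETTABYTE:.3e}")  # 3.500e+22
```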

The more information and trends you are able to mine and analyze, and the more talented human capital, bandwidth and computing power you apply to that data, the more innovation you’ll get.

When eight doctors from around the world can look at the same M.R.I. in real time, said Levin, it enables the acceleration of small breakthroughs, which is where big breakthroughs eventually come from. Big bandwidth, he added, would enable these same doctors doing high-risk surgery to practice the life-saving procedures in advance over network-enabled simulators, leading to better results, new kinds of surgical innovations and new forms of medical education. Big bandwidth, combined with 3-D printers, would also allow for the rapid prototyping of all kinds of manufactured products that can then be made anywhere.

Right now, though, notes Levin, America is focused too much on getting “average” bandwidth to the last 5 percent of the country in rural areas, rather than getting “ultra-high-speed” bandwidth to the top 5 percent, in university towns, who will invent the future. By the end of 2012, he adds, South Korea intends to connect every home in the country to the Internet at one gigabit per second. “That would be a tenfold increase from the already blazing national standard, and more than 200 times as fast as the average household setup in the United States,” The Times reported last February.
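The multiples in that Times quote can be unwound the same way; the per-household speeds below are implied values I have back-calculated from the stated ratios, not numbers reported in the article.

```python
# Back-of-the-envelope check of the bandwidth comparison quoted from The Times.
# Only the 1 Gbps target and the ratios (10x, 200x) come from the quote;
# the derived speeds are inferences.
target_mbps = 1.0 * 1000          # South Korea's planned 1 gigabit per second

# "A tenfold increase from the already blazing national standard"
korea_prior_mbps = target_mbps / 10    # implies ~100 Mbps
# "more than 200 times as fast as the average household setup in the United States"
us_avg_mbps = target_mbps / 200        # implies ~5 Mbps

print(f"Implied prior Korean standard: {korea_prior_mbps:.0f} Mbps")
print(f"Implied average U.S. household: {us_avg_mbps:.0f} Mbps")
```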

Therefore, the critical questions for America today have to be how we deploy more ultra-high-speed networks and applications in university towns to invent more high-value-added services and manufactured goods and how we educate more workers to do these jobs — the only way we can maintain a middle class.

I just don’t remember any candidate being asked in those really entertaining G.O.P. debates: “How do you think smart cities can become the job engines of the future, and what is your plan to ensure that America has a strategic bandwidth advantage?”

The members of America's new upper class tend not to watch the same movies and television shows that the rest of America watches, don't go to the kinds of restaurants the rest of America frequents, tend to buy different kinds of automobiles, and have passions for being green, maintaining the proper degree of body fat, and supporting gay marriage that most Americans don't share. Their child-raising practices are distinctive, and they typically take care to enroll their children in schools dominated by the offspring of the upper middle class—or, better yet, of the new upper class. They take their vacations in different kinds of places than other Americans go and are often indifferent to the professional sports that are so popular among other Americans. Few have served in the military, and few of their children have either.

Worst of all, a growing proportion of the people who run the institutions of our country have never known any other culture. They are the children of upper-middle-class parents, have always lived in upper-middle-class neighborhoods and gone to upper-middle-class schools. Many have never worked at a job that caused a body part to hurt at the end of the day, never had a conversation with an evangelical Christian, never seen a factory floor, never had a friend who didn't have a college degree, never hunted or fished. They are likely to know that Garrison Keillor's monologue on Prairie Home Companion is the source of the phrase "all of the children are above average," but they have never walked on a prairie and never known someone well whose IQ actually was below average.

"Therefore, the critical questions for America today have to be how we deploy more ultra-high-speed networks and applications in university towns to invent more high-value-added services and manufactured goods and how we educate more workers to do these jobs — the only way we can maintain a middle class. I just don’t remember any candidate being asked in those really entertaining G.O.P. debates: “How do you think smart cities can become the job engines of the future, and what is your plan to ensure that America has a strategic bandwidth advantage?”"

Is Friedman stating that it should be a matter of government policy and administration to oversee the bandwidth revolution (if you will)? If you ask me, America is at the forefront of this.

Should we have government policy overseeing this? If so, one can then imagine a new government agency. Or, stated another way, smaller government, at least in this area, is not better.

OTOH one could argue that the government should simply get out of the way. But I am not clear it ever was in the way.

I don't see anything on the Romney website about technology in general, other than maybe extending visas to foreigners with advanced degrees.

OTOH it may not be a bad idea to wait with some plans and ideas lest Brockster use them for his trial balloons. Clinton was a master (with an MSM allowing him to get away with it) of co-opting the ideas of the opposing party and acting like it was all him.

My Time at Walmart: Why We Need Serious Welfare Reform
December 13, 2011 | By crousselle

During the 2010 and 2011 summers, I was a cashier at Wal-Mart #1788 in Scarborough, Maine. I spent hours upon hours toiling away at a register, scanning, bagging, and dealing with questionable clientele. These were all expected parts of the job, and I was okay with it. What I didn’t expect to be part of my job at Wal-Mart was to witness massive amounts of welfare fraud and abuse.

I understand that sometimes, people are destitute. They need help, and they accept help from the state in order to feed their families. This is fine. It happens. I’m not against temporary aid helping those who truly need it. What I saw at Wal-Mart, however, was not temporary aid. I witnessed generations of families all relying on the state to buy food and other items. I literally witnessed small children asking their mothers if they could borrow their EBT cards. I once had a man show me his welfare card for an ID to buy alcohol. The man was from Massachusetts. Governor Michael Dukakis’ signature was on his welfare card. Dukakis’ last gubernatorial term ended in January of 1991. I was born in June of 1991. The man had been on welfare my entire life. That’s not how welfare was intended, but sadly, it is what it has become.

Other things witnessed while working as a cashier included:

a) People ignoring me on their iPhones while the state paid for their food. (For those of you keeping score at home, an iPhone is at least $200, and requires a data package of at least $25 a month. If a person can spend $25+ a month so they can watch YouTube 24/7, I don’t see why they can’t spend that money on food.)

b) People using TANF (Temporary Assistance for Needy Families) money to buy such necessities as earrings, kitkat bars, beer, WWE figurines, and, my personal favorite, a slip n’ slide. TANF money does not have restrictions like food stamps on what can be bought with it.

c) Extravagant purchases made with food stamps, including, but not limited to: steaks, lobsters, and giant birthday cakes.

d) A man who ran a hotdog stand on the pier in Portland, Maine, used to come through my line. He would always discuss his hotdog stand and encourage me to “come visit him for lunch some day.” What would he buy? Hotdogs, buns, mustard, ketchup, etc. How would he pay for it? Food stamps. Either that man really likes hotdogs, or the state is paying for his business. Not okay.

The thing that disturbed me more than simple cases of fraud/abuse was the entitled nature of many of my customers. One time, a package of bell peppers did not ring up as food in the computer. After the woman swiped her EBT card, it showed a balance that equaled the cost of the peppers. The woman asked what the charge was, and a quick glance at the register screen showed that the peppers did not ring up as food. (Food items had the letter ‘F’ next to their description.) The woman immediately began yelling at me, saying that, “It’s food! You eat it!”

This wasn’t the only time things like this happened: if a person’s EBT balance was less than they thought it would be, or if their cards were declined, it was somehow my fault. I understand the situation is stressful, but a person should be knowledgeable about how much money is in their account prior to going grocery shopping. EBT totals are printed on receipts, and every cell phone has a calculator function. There’s no excuse, and there’s no reason to yell at the cashier for it.

The worst thing I ever saw at Wal-Mart Scarborough was two women and their children. These women each had multiple carts full of items, and each began loading them at the same time (this should have been a tip-off to their intelligence levels). The first woman, henceforth known as Welfare Queen #1, paid for about $400 worth of food with food stamps. The majority of her food was void of any nutritional value. She then pulled out an entire month’s worth of WIC (Women, Infants, and Children program) checks. I do not mind people paying with WIC, but the woman had virtually none of the correct items. WIC gives each participating mother a book containing actual images of items for which a person can and cannot redeem the voucher. This woman literally failed at image comprehension.

After redeeming 10+ WIC checks, Welfare Queen #1 had me adjust the prices of several items she was buying (Wal-Mart’s policy is to adjust the price of the item without question if it’s within a dollar or two). She then pulled out a vacuum cleaner, and informed me that the cost of the vacuum was $3.48 because, “that’s what the label says.” The vacuum cleaner was next to a stack of crates that were $3.48. Somehow, every other customer was able to discern that the vacuum cleaner was not $3.48, but Welfare Queen #1 and her friend Welfare Queen #2 were fooled. Welfare Queen #2 informed me that she used to work for Wal-Mart, and that the “laws of Wal-Mart legally said” that I would have to sell her the vacuum for $3.48. After contacting my manager, who went off to find the proper vacuum price, Welfare Queen #1 remarked that it must be tough to stand on a mat all day and be a cashier. I looked at her, smiled, shrugged, and said, “Well, it’s a job.” She was speechless. After they finally admitted defeat (not before Welfare Queen #2 realized she didn’t have enough money to buy all of the food she had picked out, resulting in the waste of about $200 worth of products), the two women left about an hour and a half after they arrived at my register. The next man in line said that the two women reminded him of buying steel drums and cement. I said I was reminded why I vote Republican.

Maine has a problem with welfare spending. Maine has some of the highest rates in the nation for food stamp enrollment, Medicaid, and TANF. Nearly 30% of the state is on some form of welfare. Maine is the only state in the nation to rank in the top two for all three categories. This is peculiar, as Maine’s poverty rate isn’t even close to being the highest in the nation. The system in Maine is far easier to get into than in other states, and it encourages dependency. When a person makes over the limit for benefits, they lose all benefits completely. There is no time limit and no motivation to actually get back to work. Furthermore, spending on welfare has increased dramatically, but there has been no reduction of the poverty rate. Something is going terribly wrong, and the things I saw at work were indicators of a much larger problem. Something must change before the state runs out of money funding welfare programs.

CCP: "Should we have government policy overseeing this? If so one can then imagine a new Gov. agency. Or stated another way smaller government, at least in this area is not better."

Yes, Friedman always seems fascinated by the way the trains run on time under totalitarian regimes. He doesn't usually come out and say that our government should be the one to do this, but he frames it by saying the people running for head of the federal government should be talking about it.

For me, no thank you to the central planners. Sure, a super-high-speed network linking Stanford University and all the engineers and labs in Silicon Valley would be nice, if they don't already have one. Then we will need to provide equal services to inner cities, suburbia and rural America, right? We don't want anyone to be disadvantaged.

Information technology is important and dynamic, always changing and advancing. You wouldn't put your government in charge of auto manufacturing, energy or health care, would you? Whoops! It's hard to find examples of things they wouldn't turn over to government control.

The only role I see for the public sector is for local government to help with rights of way for fiber-optic lines. Not the 60% tax they loaded onto home telephone service as they helped kill it off. Or the mortgage oversight committees where they asked themselves, now that we have all this power, what shall we do next with it?

If given the choice, choose the door with the smaller, equal-protection government behind it. That will not leave us insurmountably lagging the Chinese and the Koreans IMHO.

Yes, there is almost no real poverty left to fight in America. Like climate change, there is a theory that living in hunger causes obesity.

Even if EBT can't go directly to liquor, junk foods or prepared foods, it frees up other monies to do that. Dollars on food stamp cards sell on the street for 50 cents on the dollar. Not a good tradeoff for the taxpayer, but with deficits where they are, the taxpayer isn't really paying either.

As scholar James Q. Wilson has stated, “The poorest Americans today live a better life than all but the richest persons a hundred years ago.”[3] In 2005, the typical household defined as poor by the government had a car and air conditioning. For entertainment, the household had two color televisions, cable or satellite TV, a DVD player, and a VCR... and a coffee maker.

"The major role of the eunuch in ancient societies was a political one. Eunuchs were the perfect guardians of harems and provided safe companions and secretaries for royal ladies. They could also be entrusted with the very highest offices of state with no fear that they would want to muscle in and start their own dynasties. Less susceptible than other men to corruption and persuasion by sexual means, they were the ideal politicians and civil servants. Their reputations could not be sullied by the accusations of rape, paternity suits and other scandals that so often blight the careers of public figures."

"The first civilization deliberately to select eunuchs as officers of state was the Assyrian Empire, which dominated the Near East during the early first millennium BC. The practice was continued by its successors, including the Persian Empire, founded by Cyrus the Great (559-529 BC) who, according to the Greek writer Xenophon, 'selected eunuchs for every post of personal service to him, from the doorkeepers up'. Eunuchs were becoming powerful in China during the same period. They were especially influential under the Han Dynasty (202 BC - 220 AD), when some held tremendous power simply because of their looks, and it was normal for emperors to have as many male favorites as the recommended magical number of wives. But most were of the professional variety, trained for a career in government."

"The Roman civil service also employed eunuchs, despite the bans on castration imposed by various emperors. And, although the custom was condemned by the Church, the zenith of 'eunuch power' in the Roman world actually came after it was Christianized, under the Eastern Roman (Byzantine) Empire, which ruled from Constantinople (Istanbul) between AD 395 and 1453. Thousands of young men entered public service by being castrated, providing the empire with some of its most distinguished state secretaries, generals and even Church leaders."

By MARK P. MILLS and JULIO M. OTTINO

In January 1912, the United States emerged from a two-year recession. Nineteen more followed—along with a century of phenomenal economic growth. Americans in real terms are 700% wealthier today.

In hindsight it seems obvious that emerging technologies circa 1912—electrification, telephony, the dawn of the automobile age, the invention of stainless steel and the radio amplifier—would foster such growth. Yet even knowledgeable contemporary observers failed to grasp their transformational power.

In January 2012, we sit again on the cusp of three grand technological transformations with the potential to rival that of the past century. All find their epicenters in America: big data, smart manufacturing and the wireless revolution.

Information technology has entered a big-data era. Processing power and data storage are virtually free. A hand-held device, the iPhone, has computing power that shames the 1970s-era IBM mainframe. The Internet is evolving into the "cloud"—a network of thousands of data centers, any one of which makes a 1990 supercomputer look antediluvian. From social media to medical revolutions anchored in metadata analyses, wherein astronomical feats of data crunching enable heretofore unimaginable services and businesses, we are on the cusp of enormous new markets.

The second transformation? Smart manufacturing. This is the first structural shift since Henry Ford launched the economic power of "mass production." While we see evidence already in automation and information systems applied to supply-chain management, we are just entering an era where the very fabrication of physical things is revolutionized by emerging materials science. Engineers will soon design and build from the molecular level, optimizing features and even creating new materials, radically improving quality and reducing waste.

Devices and products are already appearing based on computationally engineered materials that literally did not exist a few years ago: novel metal alloys, graphene instead of silicon transistors (graphene and carbon enable a radically new class of electronic and structural materials), and meta-materials that possess properties not possible in nature; e.g., rendering an object invisible—speculation about which received understandable recent publicity.

This era of new materials will be economically explosive when combined with 3-D printing, also known as direct-digital manufacturing—literally "printing" parts and devices using computational power, lasers and basic powdered metals and plastics. Already emerging are printed parts for high-value applications like patient-specific implants for hip joints or teeth, or lighter and stronger aircraft parts. Then one day, the Holy Grail: "desktop" printing of entire final products from wheels to even washing machines.

The era of near-perfect computational design and production will unleash as big a change in how we make things as the agricultural revolution did in how we grew things. And it will be defined by high talent, not cheap labor.

Finally, there is the unfolding communications revolution where soon most humans on the planet will be connected wirelessly. Never before have a billion people—soon billions more—been able to communicate, socialize and trade in real time.

The implications of the radical collapse in the cost of wireless connectivity are as big as those following the dawn of telegraphy/telephony. Coupled with the cloud, the wireless world provides cheap connectivity, information and processing power to nearly everyone, everywhere. This introduces both rapid change—e.g., the Arab Spring—and great opportunity. Again, both the launch and epicenter of this technology reside in America.

Few deny that technology fuels economic growth as well as both social and lifestyle progress, the latter largely seen in health and environmental metrics. But consider three features that most define America, and that are essential for unleashing the promises of technological change: our youthful demographics, dynamic culture and diverse educational system.

First, demographics. By 2020, America will be younger than both China and the euro zone, if the latter still exists. Youth brings more than a base of workers and taxpayers; it brings the ineluctable energy that propels everything. Amplified and leavened by the experience of their elders, youth and economic scale (the U.S. is still the world's largest economy) are not to be underestimated, especially in the context of the other two great forces: our culture and educational system.

The American culture is particularly suited to times of tumult and challenge. Culture cannot be changed or copied overnight; it is a feature of a people that has, to use a physics term, high inertia. Ours is distinguished by incontrovertibly powerful features, namely open-mindedness, risk-taking, hard work, playfulness, and, critical for nascent new ideas, a healthy dose of anti-establishment thinking. Where else could an Apple or a Steve Jobs have emerged?

Then there's our educational system, often criticized as inadequate to global challenges. But American higher education eludes simple statistical measures since its most salient features are flexibility and diversity of educational philosophies, curricula and the professoriate. There is a dizzying range of approaches in American universities and colleges. Good. One size definitely does not fit all for students or the future.

We should also remember that more than half of the world's top 100 universities remain in America, a fact underscored by soaring foreign enrollments. Yes, other nations have fine universities, and many more will emerge over time. But again the epicenter remains here.

What should our politicians do to help usher in this new era of entrepreneurial growth? Liquid financial markets, sensible tax and immigration policy, and balanced regulations will allow the next boom to flourish. But the essential fuel is innovation. The promise resides in the tectonic technological shifts under way.

America's success isn't preordained. But the technological innovations circa 2012 are profound. They will engender sweeping changes to our society and our economy. All the forces are in place. It's just a matter of when.

Mr. Mills, a physicist and founder of the Digital Power Group, writes the Forbes Energy Intelligence column. Mr. Ottino is dean of the McCormick School of Engineering and Applied Sciences at Northwestern University.

We see here in the US the nepotism, corruption and control that government control over business can produce. There are some benefits, as noted in multiple articles in the Economist's latest issue. Yet it is thought that eventually this style of governing will lose out. Brockster is leading us toward government control of everything, which is bad in the long run: much more stifling of innovation, and far more corruption. Here is one article:

***

The visible hand

The crisis of Western liberal capitalism has coincided with the rise of a powerful new form of state capitalism in emerging markets, says Adrian Wooldridge

Jan 21st 2012 | from the print edition

BEATRICE WEBB grew up as a fervent believer in free markets and limited government. Her father was a self-made railway tycoon and her mother an ardent free-trader. One of her family’s closest friends was Herbert Spencer, the leading philosopher of Victorian liberalism. Spencer took a shine to young Beatrice and treated her to lectures on the magic of the market, the survival of the fittest and the evils of the state. But as Beatrice grew up she began to have doubts. Why should the state not intervene in the market to order children out of chimneys and into schools, or to provide sustenance for the hungry and unemployed or to rescue failing industries? In due course Beatrice became one of the leading architects of the welfare state—and a leading apologist for Soviet communism.

The argument about the relative merits of the state and the market that preoccupied young Beatrice has been raging ever since. Between 1900 and 1970 the pro-statists had the wind in their sails. Governments started off by weaving social safety nets and ended up by nationalising huge chunks of the economy. Yet between 1970 and 2000 the free-marketeers made a comeback. Ronald Reagan and Margaret Thatcher started a fashion across the West for privatising state-run industries and pruning the welfare state. The Soviet Union and its outriggers collapsed in ruins.

The era of free-market triumphalism has come to a juddering halt, and the crisis that destroyed Lehman Brothers in 2008 is now engulfing much of the rich world. The weakest countries, such as Greece, have already been plunged into chaos. Even the mighty United States has seen the income of the average worker contract every year for the past three years. The Fraser Institute, a Canadian think-tank, which has been measuring the progress of economic freedom for the past four decades, saw its worldwide “freedom index” rise relentlessly from 5.5 (out of 10) in 1980 to 6.7 in 2007. But then it started to move backwards.

The crisis of liberal capitalism has been rendered more serious by the rise of a potent alternative: state capitalism, which tries to meld the powers of the state with the powers of capitalism. It depends on government to pick winners and promote economic growth. But it also uses capitalist tools such as listing state-owned companies on the stockmarket and embracing globalisation. Elements of state capitalism have been seen in the past, for example in the rise of Japan in the 1950s and even of Germany in the 1870s, but never before has it operated on such a scale and with such sophisticated tools.

State capitalism can claim the world’s most successful big economy for its camp. Over the past 30 years China’s GDP has grown at an average rate of 9.5% a year and its international trade by 18% in volume terms. Over the past ten years its GDP has more than trebled to $11 trillion. China has taken over from Japan as the world’s second-biggest economy, and from America as the world’s biggest market for many consumer goods. The Chinese state is the biggest shareholder in the country’s 150 biggest companies and guides and goads thousands more. It shapes the overall market by managing its currency, directing money to favoured industries and working closely with Chinese companies abroad.

State capitalism can also claim some of the world’s most powerful companies. The 13 biggest oil firms, which between them have a grip on more than three-quarters of the world’s oil reserves, are all state-backed. So is the world’s biggest natural-gas company, Russia’s Gazprom. But successful state firms can be found in almost any industry. China Mobile is a mobile-phone goliath with 600m customers. Saudi Basic Industries Corporation is one of the world’s most profitable chemical companies. Russia’s Sberbank is Europe’s third-largest bank by market capitalisation. Dubai Ports is the world’s third-largest ports operator. The airline Emirates is growing at 20% a year.

State capitalism is on the march, overflowing with cash and emboldened by the crisis in the West. State companies make up 80% of the value of the stockmarket in China, 62% in Russia and 38% in Brazil (see chart). They accounted for one-third of the emerging world’s foreign direct investment between 2003 and 2010 and an even higher proportion of its most spectacular acquisitions, as well as a growing proportion of the very largest firms: three Chinese state-owned companies rank among the world’s ten biggest companies by revenue, against only two European ones (see chart). Add the exploits of sovereign-wealth funds to the ledger, and it begins to look as if liberal capitalism is in wholesale retreat: New York’s Chrysler Building (or 90% of it anyway) has fallen to Abu Dhabi and Manchester City football club to Qatar. The Chinese have a phrase for it: “The state advances while the private sector retreats.” This is now happening on a global scale.

This special report will focus on the new state capitalism of the emerging world rather than the old state capitalism in Europe, because it reflects the future rather than the past. The report will look mainly at China, Russia and Brazil. The recent protests in Russia against the rigging of parliamentary elections by Vladimir Putin, the prime minister, have raised questions about the country’s political stability and, by implication, the future of state capitalism there, but for the moment nothing much seems to have changed. India will not be considered in detail because, although it has some of the world’s biggest state-owned companies, they are more likely to be leftovers of the Licence Raj than thrusting new national champions.

Today’s state capitalism also represents a significant advance on its predecessors in several respects. First, it is developing on a much wider scale: China alone accounts for a fifth of the world’s population. Second, it is coming together much more quickly: China and Russia have developed their formula for state capitalism only in the past decade. And third, it has far more sophisticated tools at its disposal. The modern state is more powerful than anything that has gone before: for example, the Chinese Communist Party holds files on vast numbers of its citizens. It is also far better at using capitalist tools to achieve its desired ends. Instead of handing industries to bureaucrats or cronies, it turns them into companies run by professional managers.

The return of history

This special report will cast a sceptical eye on state capitalism. It will raise doubts about the system’s ability to capitalise on its successes when it wants to innovate rather than just catch up, and to correct itself if it takes a wrong turn. Managing the system’s contradictions when the economy is growing rapidly is one thing; doing so when it hits a rough patch quite another. And state capitalism is plagued by cronyism and corruption.

But the report will also argue that state capitalism is the most formidable foe that liberal capitalism has faced so far. State capitalists are wrong to claim that they combine the best of both worlds, but they have learned how to avoid some of the pitfalls of earlier state-sponsored growth. And they are flourishing in the dynamic markets of the emerging world, which have been growing at an average of 5.5% a year against the rich world’s 1.6% over the past few years and are likely to account for half the world’s GDP by 2020.

State capitalism increasingly looks like the coming trend. The Brazilian government has forced the departure of the boss of Vale, a mining giant, for being too independent-minded. The French government has set up a sovereign-wealth fund. The South African government is talking openly about nationalising companies and creating national champions. And young economists in the World Bank and other multilateral institutions have begun to discuss embracing a new industrial policy.

That raises some tricky questions about the global economic system. How can you ensure a fair trading system if some companies enjoy the support, overt or covert, of a national government? How can you prevent governments from using companies as instruments of military power? And how can you prevent legitimate worries about fairness from shading into xenophobia and protectionism? Some of the biggest trade rows in recent years—for example, over the China National Offshore Oil Corporation’s attempt to buy America’s Unocal in 2005, and over Dubai Ports’ purchase of several American ports—have involved state-owned enterprises. There are likely to be many more in the future.

The rise of state capitalism is also undoing many of the assumptions about the effects of globalisation. Kenichi Ohmae said the nation state was finished. Thomas Friedman argued that governments had to don the golden straitjacket of market discipline. Naomi Klein pointed out that the world’s biggest companies were bigger than many countries. And Francis Fukuyama asserted that history had ended with the triumph of democratic capitalism. Now across much of the world the state is trumping the market and autocracy is triumphing over democracy.

Ian Bremmer, the president of Eurasia Group, a political-risk consultancy, claims that this is “the end of the free market” in his excellent book of that title. He exaggerates. But he is right that a striking number of governments, particularly in the emerging world, are learning how to use the market to promote political ends. The invisible hand of the market is giving way to the visible, and often authoritarian, hand of state capitalism.

The Great Divorce
By DAVID BROOKS
Published: January 30, 2012

I’ll be shocked if there’s another book this year as important as Charles Murray’s “Coming Apart.” I’ll be shocked if there’s another book that so compellingly describes the most important trends in American society.

Murray’s basic argument, that America is dividing into a two-caste society, is not new. What’s impressive is the incredible data he produces to illustrate that trend and deepen our understanding of it.

His story starts in 1963. There was a gap between rich and poor then, but it wasn’t that big. A house in an upper-crust suburb cost only twice as much as the average new American home. The tippy-top luxury car, the Cadillac Eldorado Biarritz, cost about $47,000 in 2010 dollars. That’s pricey, but nowhere near the price of the top luxury cars today.

More important, the income gaps did not lead to big behavior gaps. Roughly 98 percent of men between the ages of 30 and 49 were in the labor force, upper class and lower class alike. Only about 3 percent of white kids were born outside of marriage. The rates were similar, upper class and lower class.

Since then, America has polarized. The word “class” doesn’t even capture the divide Murray describes. You might say the country has bifurcated into different social tribes, with a tenuous common culture linking them.

The upper tribe is now segregated from the lower tribe. In 1963, rich people who lived on the Upper East Side of Manhattan lived close to members of the middle class. Most adult Manhattanites who lived south of 96th Street back then hadn’t even completed high school. Today, almost all of Manhattan south of 96th Street is an upper-tribe enclave.

Today, Murray demonstrates, there is an archipelago of affluent enclaves clustered around the coastal cities, Chicago, Dallas and so on. If you’re born into one of them, you will probably go to college with people from one of the enclaves; you’ll marry someone from one of the enclaves; you’ll go off and live in one of the enclaves.

Worse, there are vast behavioral gaps between the educated upper tribe (20 percent of the country) and the lower tribe (30 percent of the country). This is where Murray is at his best, and he’s mostly using data on white Americans, so the effects of race and other complicating factors don’t come into play.

Roughly 7 percent of the white kids in the upper tribe are born out of wedlock, compared with roughly 45 percent of the kids in the lower tribe. In the upper tribe, nearly every man aged 30 to 49 is in the labor force. In the lower tribe, men in their prime working ages have been steadily dropping out of the labor force, in good times and bad.

People in the lower tribe are much less likely to get married, less likely to go to church, less likely to be active in their communities, more likely to watch TV excessively, more likely to be obese.

Murray’s story contradicts the ideologies of both parties. Republicans claim that America is threatened by a decadent cultural elite that corrupts regular Americans, who love God, country and traditional values. That story is false. The cultural elites live more conservative, traditionalist lives than the cultural masses.

Democrats claim America is threatened by the financial elite, who hog society’s resources. But that’s a distraction. The real social gap is between the top 20 percent and the lower 30 percent. The liberal members of the upper tribe latch onto this top 1 percent narrative because it excuses them from the central role they themselves are playing in driving inequality and unfairness.

It’s wrong to describe an America in which the salt of the earth common people are preyed upon by this or that nefarious elite. It’s wrong to tell the familiar underdog morality tale in which the problems of the masses are caused by the elites.

The truth is, members of the upper tribe have made themselves phenomenally productive. They may mimic bohemian manners, but they have returned to 1950s traditionalist values and practices. They have low divorce rates, arduous work ethics and strict codes to regulate their kids.

Members of the lower tribe work hard and dream big, but are more removed from traditional bourgeois norms. They live in disorganized, postmodern neighborhoods in which it is much harder to be self-disciplined and productive.

I doubt Murray would agree, but we need a National Service Program. We need a program that would force members of the upper tribe and the lower tribe to live together, if only for a few years. We need a program in which people from both tribes work together to spread out the values, practices and institutions that lead to achievement.

If we could jam the tribes together, we’d have a better elite and a better mass.

Barack Hussein Obama centered his recent State of Disunion campaign speech on the worn socialist refrain of "fairness."

"We can go in two directions," Obama said. "One is towards less opportunity and less fairness. Or we can fight for ... building an economy that works for everyone, not just a wealthy few."

His subsequent 2012 stump speeches include a variation of these words at his most recent whistle stop in Michigan: "I want this to be a big, bold, generous country where everybody gets a fair shot, everybody is doing their fair share, everybody is playing by the same set of rules."

Let's briefly review our nation's history in regard to Liberty, taxation and "fairness."

The first American Revolution was galvanized by a Tea Party protest against a small three-pence tax surcharge on imported tea.

Our Founders were uniformly concerned about government power to lay and collect taxes and, accordingly, enumerated specific limitations on taxing and spending.

James Madison addressed the issue of unlimited spending, and his words are applicable today: "It has been [said], that the power 'to lay and collect taxes, duties, imposts and excises, to pay the debts and provide for the common defence and general welfare of the United States,' amounts to an unlimited commission to exercise every power which may be alleged to be necessary for the common defence or general welfare."
Rejecting that "misconstruction" of our Constitution, Madison went on to write, "If Congress can do whatever in their discretion can be done by money, and will promote the General Welfare, the Government is no longer a limited one, possessing enumerated powers, but an indefinite one."

To ensure that federal taxation would be limited to these constraints, Article I, Section 8, Clause 1 of our Constitution (the "Taxing and Spending Clause"), as duly ratified in 1789, defined the "Taxes, Duties, Imposts and Excises," but Section 8 required that such "Duties, Imposts and Excises shall be uniform throughout the United States." This, in effect, limited the power of Congress to impose direct taxes on individuals, as further outlined in Section 9: "No Capitation, or other direct, Tax shall be laid, unless in Proportion to the Census or enumeration herein before directed to be taken."

That Constitutional limitation survived until 1861, when the first income tax was imposed to defray costs of the War Between the States. That three-percent tax on incomes over $800 was sold as an emergency war measure. In 1894, congressional Democrats tested the Constitution, passing a peacetime tax of two percent on income above $4,000. A year later, that tariff was overturned by the Supreme Court as not complying with the limitations set forth in Article I.

However, the greatest historical injury to economic Liberty was dealt in the presidential campaign of 1912, when the father of Democratic Socialism, Woodrow Wilson, was elected on his mastery of class warfare rhetoric, as outlined in Karl Marx's Communist Manifesto in the mid-19th century.
He used Marx's populist redistribution theme, "From each according to his abilities, to each according to his needs," to gain passage of the Sixteenth Amendment, which stated, "The Congress shall have power to lay and collect taxes on incomes, from whatever source derived, without apportionment among the several States, and without regard to any census or enumeration."

The top tax rate levied under the new Amendment was just seven percent, on incomes above $500,000 (about $12 million in 2012 dollars). But the ability to impose direct taxes gave rise to a century of class warfare political rhetoric that would be anathema to our Founders and the Liberty they fought so hard to secure for their posterity.

Two decades later, Franklin Roosevelt gained acceptance of his New Deal programs via his refined classist rhetoric -- and American socialist propaganda has been the bookmarked page in the political playbook of all Democrat presidents since.

Though the contrast between, and debate about, Leftist Tyranny versus Essential Liberty was boldly reinvigorated by Ronald Reagan during his two terms of office, never before the election of Barack Hussein Obama in 2008 have so many Americans fully recognized the cumulative manifestation of collectivist socialism.

A second Tea Party protest has been brewing since Obama took office, demanding tax reform -- but no such reformation will succeed unless accompanied by tax conformation, ensuring that taxes collected are only for expenditures authorized by our Constitution.

Laying the groundwork for his 2012 re-election bid, Obama's SOTU was devoid of any free-market economic remedies, and every "solution" was predicated upon government engineering via intervention, regulation or redistribution -- consistent with his perfected version of Wilson's and Roosevelt's Democratic Socialist platform.

Additionally, Obama has dumbed down his classist "fairness" rhetoric to comport
with the latest populist appeals of the "occupy movement."Anticipating that his opponent in the general election will be Mitt Romney, an easy-to-target "rich Republican," Obama has rallied his own stable of uber-wealthy Leftists in support of his "Wall Street v Main Street" disinformation campaign.In his SOTU, Obama declared, "You can call this class warfare all you want. But asking a billionaire to pay at least as much as his secretary in taxes? Most Americans would call that common sense."To further advance his classist "divide and conquer" strategy, he trotted out Debbie Bosanek, the secretary of billionaire Obamaphile Warren Buffett, as a prop for invoking the Buffet Rule -- "If you make more than $1 million a year, you should not pay less than 30 percent in taxes.""Right now, Warren Buffett pays a lower tax rate than his secretary. Do we want to keep these tax cuts for the wealthiest Americans?" asked Obama. Predictably, our Class-Warrior-in-Chief has refused to tell the American people how much Ms. Bosanek is paid in order to be taxed at a higher rate than her boss. Forbes Magazine, however, uses current IRS tax tables to estimate that she makes "well above $200,000 annually." Clearly, Ms. Bosanek isn't just any old secretary."Facts," as John Adams noted, "are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence."The fact is that Warren Buffet, like Mitt Romney and other "wealthy Americans," pays much more than the much-maligned "15 percent" on capital gains. Before being taxed on his profits, the corporations producing them are already taxed at 35 percent -- the highest corporate tax rate in the world. So, in effect, these vilified wealthy Americans are already paying more than 50 percent in taxes, far above the 30 percent rate of Obama's beloved "Buffett Rule," and far, far above his "15 percent" claim.

Here in our humble editorial shop, we call Obama's deceptive prevarication The Big Lie.

Now, Obama might make a pitch for fairness if the top 25 percent of income earners were paying a lower percentage of the nation's tax bill than the percentage of national income they earn. But the top 25 percent are currently paying 87 percent of that bill while earning 65 percent of that income. However, populist rhetoric trumps facts where there is not enough "common sense" to prevail. As George Bernard Shaw said, "A government which robs Peter to pay Paul can always depend on the support of Paul."

Moreover, the real tragedy is that Obama's senseless sycophants don't comprehend the great error of his "fairness" rhetoric. The capital Obama proposed to remove from the economy in the form of even more disproportionate taxes (for expenditures not authorized by our Constitution) will decrease the available pool of capital for economic expansion, job creation and higher standards of living for ALL working Americans.

Obama's "fairness" farce to raise taxes and shrink capital is the last component of his macroeconomic agenda to break the back of free enterprise, in order to achieve his objective of "fundamentally transforming the United States of America" from a nation guided by Rule of Law as supported by economic Liberty to one subdued by the rule of men under the oppression of Democratic Socialism.

In 1819, Chief Justice John Marshall famously observed, "An unlimited power to tax involves, necessarily, a power to destroy; because there is a limit beyond which no institution and no property can bear taxation." Should Obama gain a second term, he is virtually assured "an unlimited power to tax" beyond the limits free enterprise can bear.
The outcome of the next election will be either a sunrise or a sunset on Liberty.

While Leftists may have the constitutional authority to levy direct taxes on income, they do not have the authority to levy such taxes for expenditures not expressly authorized by our Constitution -- though they have done so with impunity for generations. In 1794, as recorded in the Annals of Congress, James Madison declared, "I cannot undertake to lay my finger on that article of the Constitution which granted a right to Congress of expending, on objects of benevolence, the money of their constituents..." To this day, no constitutional articulation of such spending power exists, and challenging that authority exposes the Achilles' heel of the generations of socialist programs espoused by Obama and his Leftist cadres.

So, can an "establishment Republican" defeat Obama with a "tax reform" platform? No -- unless he centers the debate on the fact that our Constitution provides no authority for the expenditures Obama proposes, charges him with violating his "sacred oath" to "support and defend" it, and vigorously makes the case that Obama has offended American Liberty in the process. It is just such a breach of trust that gave rise to the first American Revolution. Focusing on Obama's Breach of Oath will ensure that Liberty can be sustained with ballots rather than bullets.

The next Republican presidential candidate must not only defeat Obama's rhetoric with Rule of Law defining the role of government, but, if he succeeds, he must devote his administration, first and foremost, to real tax reform and implementation of either a flat tax or a national sales tax. As far as the "Buffett Rule" is concerned, Buffett and his secretary, and all American taxpayers, are being overtaxed for illegal expenditures.

Semper Vigilo, Fortis, Paratus et Fidelis! Libertas aut Mortis!

Baroness Sayeeda Warsi is the co-chairman of the British Conservative Party and the first female Muslim to serve as a minister in a UK cabinet. This week she gave a controversial speech about the role of faith in public life at a conference organised by the Vatican.

Today I want to make one simple argument. That in order to ensure faith has a proper space in the public sphere, in order to encourage social harmony, people need to feel stronger in their religious identities, more confident in their beliefs. In practice this means individuals not diluting their faith and nations not denying their religious heritage.

If you take this thought to its conclusion then the idea you’re left with is this: Europe needs to become more confident in its Christianity. Let us be honest. Too often there is a suspicion of faith in our continent: where signs of religion cannot be displayed or worn in government buildings; where states won’t fund faith schools; and where faith is sidelined, marginalised and downgraded.

It all hinges on a basic misconception: that somehow to create equality and space for minority faiths and cultures we need to erase our majority religious heritage. But it is my belief that the societies we are, the cultures we’ve created, the values we hold and the things we fight for stem from something we’ve argued over, dissented from, discussed and built up: centuries of Christianity.

The Christian roots of Europe

It’s what the Holy Father called the “unrenounceable Christian roots of [our] culture and civilisation” which shine through our politics, our public life, our culture, our economics, our language and our architecture. You cannot and should not erase these Christian foundations from the evolution of our nations any more than you can or should erase the spires from our landscapes.

Let me get one thing very clear: I am not saying that everything done in the name of faith has been a blessing for our continent. Too much blood has been shed in the name of religion. But trying to erase this history or blind ourselves to the role of religion on our continent is wrong. We need to realise what drives us, what binds us and what inspires us is a history we are in danger of denying.

I know, in a globalised world, it is easy to think that to relate to others you must water down your identity. But my point today is that being sure of who you are is the only way in which you will be more accommodating of others.

And there is a second strand to this argument. That true confidence has the power to guarantee openness. Because only when you're content in your own identity, only when you realise that the 'Other' does not jeopardise who you are, can you truly accept and not merely tolerate the presence of difference. Just as the bully bullies because he or she is insecure, so too the state suppresses, marginalises, dictates and dismisses when it feels its identity is at stake.

In the United Kingdom, we have guarded against such fear by recognising the importance of the Established Church and our Christian heritage – our majority faith. And that is what has created religious freedom and a home for people like me, of minority faiths. Majority faiths and minority faiths – as a Muslim who was born and raised in – and now serves – a Christian country, I have experience of both.

What truly enabled me to learn about my faith and to practise it was that my country – the bed over which the river of my faith flowed – had a strong Christian identity. This defined, shaped and gave me confidence in my own faith, which, combined with confidence in my country's principles and values, has been evident in the decisions I've taken as an adult.

Good works come from conviction

A strong sense of Christianity didn’t threaten our Muslim identity – it actually reinforced it. It enabled me to make the case for further interfaith debate, discussion and work. It motivated me to stand up and speak out against anti-Muslim hatred, the persecution of Christians and anti-Semitism. And it inspired me to challenge the growing marginalisation of faith in my country and in Europe.

As I look around the world today, my resolve is strengthened. Where we see faith inspiring, driving and motivating good works is where certainty of conviction is at its strongest. As the Bible teaches us: “For even as the body without the spirit is dead: so also faith without works is dead.”

The Qur’an teaches us something similar – that “those who believe and do good works are the best of created beings”. We see the proof every day – globally, locally and individually. From the Catholic Church being instrumental in toppling communism, to its key role in securing peace in Northern Ireland. From the Catholic schools in the UK, many of which are outperforming other institutions, to the domestic response to the earthquake in Haiti, the floods in Pakistan and the drought in East Africa. And day by day, faith sustains people through their darkest, most desperate periods. There is no denying the link between these positive actions and faith.

Don’t dumb down religion

As a UK cabinet minister of the Muslim faith, representing a country with an Anglican Established Church, visiting our friends in the spiritual home of Catholicism, you will find no greater champion of understanding between faiths than me.

But I believe that where interfaith dialogue does not work is where faiths are dumbed down in order to find common ground. Just as the European language of Esperanto, which attempted to build a new tongue, neutralises our component languages, a common language between faiths risks watering down the diversity and intensity of our respective religions.

The point is that in so many ways, being sure of your faith adds a layer of strength to society. Confidence in our own beliefs enables us to defend others against attacks. Faith asks you to stand up for your neighbour. As the fourth Muslim caliph, Ali ibn Abu Talib, said: “Every man is your brother: either your brother in faith or your brother in humanity.”

This is the spirit which inspired Muslims to protect Jews during the Holocaust, which motivated Christians to support Muslims fleeing persecution in Darfur and which led Chief Rabbi Sacks to call for action against persecution in Bosnia. It’s something I’ve been arguing for a long time. That persecution somewhere is persecution everywhere. That if you oppress my neighbour you are oppressing me. That an attack on a gurdwara is an attack on a mosque, a church, a temple, a synagogue.

Marginalisation of faith

But the confident affirmation of religion which I have spoken of is under threat. It is what the Holy Father called ‘the increasing marginalisation of religion’ during his speech in Westminster Hall.

I see it in the United Kingdom and I see it in Europe: spirituality suppressed; divinity downgraded. Where, in the words of the Archbishop of Canterbury, faith is looked down on as the hobby of ‘oddities, foreigners and minorities’. Where religion is dismissed as an eccentricity because it’s infused with tradition. Where we undermine people who attribute good works to their belief and require them to deny it as their motivation. And where faith is overlooked in the public sphere, with not even a word about Christianity in the preface of the “European Constitution”.

When I pledged that the new government in the United Kingdom would ‘do God’, in some quarters there was uproar. More telling were the countless comments I received of quiet support, a relief that finally someone had said what they had been thinking. This fact alone shows the extent to which religion has been sidelined by some.

Because in parts of Europe there have been misguided beliefs that in order to accommodate people from other backgrounds, we must somehow become less religious or less Christian, that somehow society must level itself out so that faith becomes something that is marginalised and limited to the private confines of one’s home or even one’s mind.

But those calls are not coming from other faith communities. They are coming from two types of people. First, the well-intentioned liberal elite who, conversely, are trying to create equality by marginalising faith in society, who think that the route to religious pluralism is by creating a path of faith-neutrality, who downgrade religion to a mere subcategory in public life.

But look at their supposed level playing field. Its terrain is all but impassable to anyone of belief. One of the arguments of the liberal elite is that faith and reason are incompatible. But they don’t realise, as the Holy Father has argued for many years, that faith and reason go hand in hand. As he said to us in Westminster Hall, “the world of secular rationality and the world of religious belief need one another and should not be afraid to enter into a profound and ongoing dialogue, for the good of our civilisation.” In other words, just as reason should not be excluded from debates about faith so too spirituality should not be excluded when we look at worldly matters.

Second, there are the anti-religionists, the faith deniers, the people who dine out on free-flowing media and sustain a vocabulary of secularist intolerance, attempting to remove all trace of religion from culture, history and public discourse, while ignoring the fact that people of faith give more to charity and that the number of people going to a place of worship is globally on the up.

The deep intolerance of militant secularisation

For me, one of the most worrying aspects about this militant secularisation is that at its core and in its instincts it is deeply intolerant. It demonstrates similar traits to totalitarian regimes – denying people the right to a religious identity and failing to understand the relationship between religious loyalty and loyalty to the state.

That’s why in the 20th Century, one of the first acts of totalitarian regimes was the targeting of organised religion. Why? Because, to them, a religious identity struck at the heart of their totalitarian ideology. In a free market of ideas, they knew their ideology was weak. And with the strength of religions, established over many years, followed by many billions, their totalitarian regimes would be jeopardised.

Our response to militant secularisation today has to be simple: holding firm in our faiths; holding back intolerance, reaffirming the religious foundations on which our societies are built. And reasserting the fact that, for centuries, Christianity in Europe has been inspiring, motivating, strengthening and improving our societies. In public life – driving people to do great things, like setting up schools, creating public services, leading the way in charitable acts. In politics – inspiring parties on both the left and the right. In economics – providing many of the foundations for our market economy and capitalism. In culture – influencing our monuments, our music, our paintings, and our engravings.

Faith must inform public debate

Politicians need to give faith a seat at the table in public life. Not the privileged position of a theocracy, but that of an equal informer of our public debate. So we are not afraid to acknowledge when the debate derives from a religious basis. And not afraid to take on board – and take on – the solutions offered up by religion. Politicians must also not be afraid to speak out when we think people who speak in the name of faith have got it wrong.

I am not saying that faith leaders should have a monopoly on morality. Because, of course, as our Prime Minister David Cameron said, there are Christians who don’t live by a moral code and there are atheists and agnostics who do. But for people who do have a faith, their faith can be a helpful prod in the right direction.

Therefore, I’m arguing that religion needs a role when we look at the problems of today, so that even the most committed atheist can find that those who are committed to religion have something to offer, and that faith can be good for society, good for communities and good for those who choose to follow a faith. When religion has a role in public life, it enables us to look at our economy and refer to the Christian principles on which our markets were founded. It means we can take solace from teachings such as Rerum Novarum and Caritas in Veritate, which offer up answers for creating moral markets.

It means we can look at our social problems and be inspired by Catholic Social Teaching [by] looking at our welfare system and thinking, how does this impact on human dignity; [by] looking at social breakdown and thinking, are we reinforcing responsibility between citizens; [by] looking at governance and thinking, are we relying on large organisations to do what smaller units could achieve -- all the while thinking and remembering that many of our values -- loving our neighbours, acting as the Good Samaritan, supporting and championing the family unit, doing to others as you would be done by -- are Biblical, spiritual and religious in their origin.

People need to realise that, in our continent and beyond, Christianity’s teachings and values are as permanent as Westminster Abbey, as indelible as Da Vinci’s Last Supper, and as solid as Christ the Redeemer, and that Christianity is as vital to our future as it is to our past. Our two states have lots to learn and much to teach, and I have hope, and yes faith, that others will continue with us on this path.

This speech has been edited for length. For the full speech, visit Sayeeda Warsi’s website.

If You're Ever Murdered, Here's an Idea

Tuesday, February 21, 2012

I'd like to offer a simple proposal that, if enacted, could generate a great deal of a most precious resource: moral clarity.

It concerns the death penalty.

Opponents of capital punishment for murderers argue that the state has no right to take a murderer's life. Apparently, one fact that abolitionists forget or overlook is that the state is acting on behalf of the murdered person and the murdered person's family, not only on behalf of society.

In order to make this as clear as possible, here is my proposal: Americans should be able to declare what they want the state to do on their behalf if they are murdered. Those who wish the state to keep their murderer alive for all of his natural years should wear, let us say, a green bracelet and/or place a green dot on their driver's license or license plate. And those who want their convicted murderer put to death can wear a red bracelet and/or have a red dot on their license.

Just as I have a pink "donor" circle on my driver's license signifying that in case I die, I wish to provide my organs to help keep some person alive, I wish to make it known that if I am murdered, I do not want my murderer kept alive a day longer than legally necessary.

There are a number of reasons for recommending such a policy.

First, as noted, it is clarifying for the individual. It is easier to take a position in the abstract than when it hits home. It is one thing to oppose the death penalty when others are killed, but if you have to decide what happens if it is you who is murdered, the mind focuses with greater clarity.

Before deciding which color to choose, let a woman imagine herself raped and then stabbed to death. And let her further imagine that if this happened to her, she now has some say in determining what happens to the person who did this to her. She is no longer a silent corpse. Her voice will be heard, perhaps even be determinative of her killer's fate.

Likewise, the woman who truly opposes death for any murderer, no matter how heinous and sadistic his actions, will also now have the ability to speak from the grave. No matter how much her family may seek the death penalty, family members will have no say. Any woman -- or man -- who passionately opposes the death penalty under every conceivable circumstance can now help to ensure that at least in his or her case, a murderer's life that might have been taken might now be preserved. There is no more direct way to give death-penalty abolitionists the right to have a say over the fate of a murderer.

Second, such a choice gives great power to the individual. Abolitionists who live in pro-death-penalty Texas, for example, can now have a say on a matter of enormous moral magnitude. And pro-death penalty citizens living in states that have either legally or de facto abolished the death penalty regain a sense of power over their life (or death, to be precise).

The whole American experiment has been predicated on giving individuals as much control over their own lives as possible. But this has been undermined in the last 50 years, as the state has gotten ever more powerful. Giving murder victims say over their murderer's fate would be a small but symbolically significant step in Americans reasserting the importance of the individual. It's hard to imagine a more appropriate arena than in determining what happens to the person who murdered you.

As dark as thoughts of one's own murder may be, we all think about it. And I don't think I speak only for myself in saying that I would rest just a tiny bit better knowing that if I were murdered, my murderer might not be allowed to watch TV; read books; exercise; develop relationships with people inside and outside of prison; surf the Internet; sing; listen to music; have his health care needs addressed; and be visited by loved ones while I lie in my grave.

And for those opposed to the death penalty, they, too, will be able to rest a bit easier. They will be assured that even men who came to their home, raped all the females in their family, and then set the house on fire with the family inside -- as happened in Connecticut a few years ago -- would never be killed by the state.

Third, it would be interesting to see if these color-coded bracelets and licenses had any effect on who gets murdered. Clearly, when the murder is a crime of passion, it is hard to imagine that a would-be murderer would stop himself from killing someone upon noticing a red bracelet or a red dot on a license plate. But crimes of passion are generally not, in any event, punished by death. On the other hand, in murders that could be capital crimes, it is possible (not necessarily likely, but possible) that a murderer (or even more likely, his accomplice, if there were one) might just re-think murdering the victim.

Fourth, choosing which color bracelet or dot goes on one's license not only forces people to confront their own consciences, but it will undoubtedly engender deep discussions with others. To cite but one example, it can surely help singles who are dating. If you're against the death penalty, and your date drives up with a red bracelet and/or dot on his license plate, you'll have either a far deeper discussion than you would otherwise have had at dinner, or you'll spare yourself the time and expense of a date that will probably go nowhere.

These are some of the arguments for the plan. I can't think of one good argument against it -- unless you're an abolitionist who is fearful of seeing red.

One key thing lacking in much of modern society is the sense of community provided by churches and other religious institutions. In a discussion about his new book, "Religion for Atheists," Alain de Botton talks with WSJ's Gary Rosen about how our sense of kinship and belonging might be reattained in a more secular age.

One of the losses that modern society feels most keenly is the loss of a sense of community. We tend to imagine that there once existed a degree of neighborliness that has been replaced by ruthless anonymity, by the pursuit of contact with one another primarily for individualistic ends: for financial gain, social advancement or romantic love.

In attempting to understand what has eroded our sense of community, historians have assigned an important role to the privatization of religious belief that occurred in Europe and the U.S. in the 19th century. They have suggested that we began to disregard our neighbors at around the same time that we ceased to honor our gods as a community.

This raises two questions: How did religion once enhance the spirit of community? More practically, can secular society ever recover that spirit without returning to the theological principles that were entwined with it? I, for one, believe that it is possible to reclaim our sense of community—and that we can do so, moreover, without having to build upon a religious foundation.

Insofar as modern society ever promises us access to a community, it is one centered on the worship of professional success. We sense that we are brushing up against its gates when the first question we are asked at a party is "What do you do?," our answer to which will determine whether we are warmly welcomed or conclusively abandoned.

In these competitive, pseudo-communal gatherings, only a few sides of us count as currency with which to buy the goodwill of strangers. What matters above all is what is on our business cards. Those who have opted to spend their lives looking after children, writing poetry or nurturing orchards will be left in no doubt that they have run contrary to the dominant mores of the powerful, who will marginalize them accordingly.

Religion in Secular Life: A Proposal


A university alive to its true responsibilities would teach students about things like the tensions of married life, with books like 'Anna Karenina' and 'Madame Bovary' on the syllabus.

Given this level of discrimination, it is no surprise that many of us choose to throw ourselves with a vengeance into our careers. Focusing on work to the exclusion of almost everything else is a plausible strategy in a world that accepts workplace achievements as the main tokens for securing not just the financial means to survive physically but also the attention that we require to thrive psychologically.

Religions seem to know a great deal about our loneliness. Even if we believe very little of what they tell us about the afterlife or the supernatural origins of their doctrines, we can nevertheless admire their understanding of what separates us from strangers and their attempts to melt away one or two of the prejudices that normally prevent us from building connections with others.

Consider Catholicism, which starts to create a sense of community with a setting. It marks off a piece of the earth, puts walls up around it and declares that within their confines there will reign values utterly unlike the ones that hold sway in the world beyond. A church gives us rare permission to lean over and say hello to a stranger without any danger of being thought predatory or insane.

The composition of the congregation also feels significant. Those in attendance tend not to be uniformly of the same age, race, profession or educational or income level; they are a random sampling of souls united only by their shared commitment to certain values. We are urged to overcome our provincialism and our tendency to be judgmental—and to make a sign of peace to whomever chance has placed on either side of us. The Church asks us to leave behind all references to earthly status. Here no one asks what anyone else "does." It no longer matters who is the bond dealer and who the cleaner.


We all stand to learn something from the ways in which religion promotes morality, inspires travel, trains minds and encourages gratitude at the beauty of life.

The Church does more, however, than merely declare that worldly success doesn't matter. In a variety of ways, it enables us to imagine that we could be happy without it. Appreciating the reasons why we try to acquire status in the first place, it establishes conditions under which we can willingly surrender our attachment to it.

It is the genius of the Mass to confront these fears. The building in which it is performed is almost always sumptuous. Though it is technically devoted to celebrating the equality of man, it often surpasses palaces in its beauty. The company is also enticing. As the congregants start to sing "Gloria in Excelsis," we are likely to feel that the crowd is nothing like the one that we encounter at the shopping mall or the bus stop. We gaze up at the vaulted, star-studded ceiling and rehearse in unison the words "Lord, come, live in your people and strengthen them by your grace." We leave thinking that humanity may not be such a wretched thing after all.

As a result, we may start to feel that we could work a little less feverishly, because we see that the respect and security we hope to gain through our careers is already available to us in a warm and impressive community that imposes no worldly requirements on us for its welcome.

If the Mass has done its job and we are awake to its lessons, it should succeed by its close in shifting us at least fractionally off our accustomed egocentric axes. It should also have given us a few ideas for mending some of the more dispiriting aspects of our fractured modern world.

One of these ideas relates to the benefits of taking people into a distinct space where they can be isolated from the usual ideology of the mercantile world. The venue itself ought to be attractive enough to evoke enthusiasm for the notion of a group. It should inspire visitors to suspend their customary frightened egoism in favor of a joyful immersion in a collective spirit—an unlikely scenario in the majority of modern so-called "community centers," insultingly designed structures whose appearance paradoxically serves to confirm the inadvisability of joining anything communal.

The Mass also contains a lesson about the importance of rules for directing people in their interactions with one another. The liturgical complexity of a Missal—the way in which this book of instructions for celebrating a Mass compels the congregants to look up, stand, kneel, sing, pray, drink and eat at given points—speaks to an essential aspect of human nature. To foster a sense of communal intimacy and to ensure that profound and dignified personal bonds can be forged, a tightly choreographed agenda of activities may be more effective than simply leaving a group to mingle aimlessly on its own.

A final lesson from the Mass is closely connected with its history. Before it was a service, before the congregants sat in seats facing an altar behind which a priest held up a wafer and a cup of wine, the Mass was a meal. What we now know as the Eucharist began as an occasion when early Christians put aside their work and domestic obligations and gathered around a table (usually laden with wine, lamb and loaves of unleavened bread) in order to commemorate the Last Supper. They talked, prayed and renewed their commitments to Christ and to one another. Like Jews at the Sabbath meal, Christians understood that it is when we satiate our bodily hunger that we are often readiest to direct our minds to the needs of others.

In honor of the most important Christian virtue, these gatherings became known as agape (love, in Greek) feasts and were regularly held by Christian communities in the period between Jesus's death and the Council of Laodicea in A.D. 364. Complaints about the excessive exuberance of some of these meals eventually led the early Church to the regrettable decision to ban agape feasts and to suggest that the faithful eat at home with their families instead—and only afterward gather for the spiritual banquet that we now know as the Eucharist.

But the Mass is hardly alone as an instructive example, and community is certainly not our only unmet need in the modern world. My premise is that even those who aren't religious can find religion sporadically useful, interesting and consoling and should consider how we might import certain religious ideas and practices into the secular realm.

Everyone stands to learn something from the ways in which religion delivers sermons, promotes morality, engenders a spirit of community, inspires travel, trains minds and encourages gratitude at the beauty of life. In a world beset by fundamentalists of both the believing and the secular variety, it must be possible to balance a rejection of religious faith with a selective reverence for religious rituals and concepts.

Religion serves two central needs that secular society has not been able to meet with any particular skill: first, the need to live together in harmonious communities, despite our deeply rooted selfish and violent impulses; second, the need to cope with the pain that arises from professional failure, troubled relationships, the death of loved ones and our own decay and demise.

Religions are a repository of occasionally ingenious concepts for trying to assuage some of the most persistent and unattended ills of secular life. They merit our attention for their sheer conceptual ambition and for changing the world in a way that few secular institutions ever have. They have managed to combine theories about ethics and metaphysics with practical involvement in education, fashion, politics, travel, hostelry, initiation ceremonies, publishing, art and architecture—a range of interests whose scope puts to shame the achievements of even the greatest secular movements and innovators.

[Photo: Cathedral La Seu in evening light, Palma de Majorca, Spain. Masterfile]

It feels especially relevant to talk of meals, because our modern lack of a proper sense of community is importantly reflected in the way we eat. The contemporary world is not, of course, lacking in places where we can dine well in company—cities typically pride themselves on the sheer number and quality of their restaurants—but what's significant is that there are almost no venues that can help us to transform strangers into friends.

The large number of people who patronize restaurants suggests that they are refuges from anonymity and coldness, but in fact they have no systematic mechanism for introducing patrons to one another, dispelling their mutual suspicions, breaking up the clans into which they segregate themselves or getting them to open up their hearts and share their vulnerabilities with others. At a modern restaurant, the focus is on the food and the décor, never on opportunities for extending and deepening affections.

Patrons tend to leave restaurants much as they entered them, the experience having merely reaffirmed existing tribal divisions. Like so many institutions in the modern city (libraries, nightclubs, coffee shops), restaurants know full well how to bring people into the same space, but they lack any means of encouraging them to make meaningful contact with one another once they are there.

With the benefits of the Mass and the drawbacks of contemporary dining in mind, we can imagine an ideal restaurant of the future, an Agape Restaurant. Such a restaurant would have an open door, a modest entrance fee and an attractively designed interior. In its seating arrangement, the groups and ethnicities into which we commonly segregate ourselves would be broken up; family members and couples would be spaced apart. Everyone would be safe to approach and address, without fear of rebuff or reproach. By simple virtue of being in the space, guests would be signaling—as in a church—their allegiance to a spirit of community and friendship.

Though there wouldn't be religious imagery on the walls, some kind of art that displayed examples of human vulnerability, whether in relation to physical suffering, poverty, anxiety or romantic discord, would bring more of who we actually are into the public realm, lending to our connections with others a new and candid tenor.

Religions are aware that the moments around the ingestion of food are propitious to moral education. It is as if the imminent prospect of something to eat seduces our normally resistant selves into showing some of the same generosity to others as the table has shown to us. Religions also know enough about our sensory, nonintellectual dimensions to be aware that we cannot be kept on a virtuous track simply through the medium of words. They know that their captive audience is likely to accept a trade-off between ideas and nourishment—and so they turn meals into disguised ethical lessons.

[Photo: Temple to perspective. This structure would represent the age of the Earth, with each centimeter of height equating to 1 million years; a tiny band of gold a mere millimeter thick at the bottom of the roughly 150-foot structure would stand for mankind's time on Earth. Thomas Greenall & Jordan Hodgson]

Before our first sip of wine, religious communities offer us a thought that can be swallowed with the liquid like a tablet. They make us listen to a homily in the gratified interval between two courses. And they use specific types of food and drink to represent abstract concepts, telling Christians, for example, that bread stands for the sacred body of Christ, informing Jews that the Passover dish of crushed apples and nuts is the mortar that was used by their enslaved ancestors to build the warehouses of Egypt and teaching Zen Buddhists that their cups of slowly brewing tea are tokens of the transitory nature of happiness in a floating world.

Taking their seats at an Agape Restaurant, guests would find in front of them guidebooks reminiscent of the Haggadah (the text followed at a Passover Seder) or the Missal, laying out the rules for how to behave at the meal. No one would be left alone to find their way to an interesting conversation with another, any more than it would be expected of participants at a Passover meal or in the Eucharist that they might manage independently to alight on the salient aspects of the history of the tribes of Israel or achieve a sense of communion with God.

The Book of Agape would direct diners to speak to one another for prescribed lengths of time on predefined topics. Like the famous questions that the youngest child at the table is assigned by the Haggadah to ask during the Passover ceremony ("Why is this night different from all other nights?" "Why do we eat unleavened bread and bitter herbs?" and so on), these talking points would be carefully crafted for a specific purpose, to coax guests away from customary expressions of pride ("What do you do?" "Where do your children go to school?") and toward a more sincere revelation of themselves ("What do you regret?" "Whom can you not forgive?" "What do you fear?").

The liturgy would inspire charity in the deepest sense, a capacity to respond with complexity and compassion to the existence of our fellow creatures. One would be privy to accounts of fear, guilt, rage, melancholy, unrequited love and infidelity that would generate an impression of our collective insanity and endearing fragility.

Thanks to the Agape Restaurant, our fear of strangers would recede. The poor would eat with the rich, the black with the white, the orthodox with the secular, workers with managers, scientists with artists. The claustrophobic pressure to derive all of our satisfactions from our existing relationships would ease, as would our desire to climb ever higher in social status.

The notion that we could mend some of the tatters in the modern social fabric through an initiative as modest as a communal meal may seem offensive to those who trust in the power of legislative and political solutions to cure society's ills. But these restaurants would not be an alternative to traditional political methods. They would be a prior step, taken to humanize one another in our imaginations.

Christianity, Judaism and Buddhism have made significant contributions to political life, but their relevance to the problems of community are arguably never greater than when they depart from the modern political script and remind us that there is also value to be had in standing in a big hall singing a hymn or in ceremoniously washing a stranger's feet or in sitting at a table with neighbors and partaking of lamb stew and conversation. These rituals, as much as the deliberations inside parliaments and law courts, are what help to hold our fractious and fragile societies together.

—From "Religion for Atheists: A Non-Believer's Guide to the Uses of Religion" by Alain de Botton, to be published March 6 by Pantheon. Copyright by Alain de Botton.

A version of this article appeared Feb. 18, 2012, on page C1 in some U.S. editions of The Wall Street Journal, with the headline: Religion for Everyone.

Can you believe the nerve of employers? Many of them still seem to think that they should be allowed to determine the benefits they offer. I guess they haven't read your 2,000-page health law. It's the government's job now.

That's a good thing, too. Employers for too long have been able to restrict our access to essential health services like contraception by making us pay some of the bill. Really, it's amazing that we aren't all dead. Now, thanks to you, we'll enjoy free and universal access to preventative care just like workers do in Cuba. Even so, there are still many essential benefits that the government must mandate to make the U.S. the freest country in the world.

• Fitness club memberships. Most doctors agree that exercising is one of the best ways to prevent disease. However, gym memberships can run between $240 and $1,800 per year. Such high prices force us to choose between exercising and buying groceries. While we could walk or jog outside, many of us prefer not to. Therefore, employers should be required to pay for workers' gym memberships. Doing so might even reduce employers' health costs, which is why many companies already subsidize memberships. Those that don't are limiting our freedom to exercise.

• Massages. Stress raises the risk of heart disease, obesity, depression and a host of other maladies. About one half of Americans say they're stressed, and studies show that health costs for stressed-out workers are nearly 50% higher than those for their chilled-out counterparts. According to the Mayo Clinic, a great way to reduce stress is to get a massage. However, since few of us can afford massages, it is imperative that employers be required to cover weekly massage treatments or hire in-office masseuses. Think of the millions of new jobs this mandate will create in the therapeutic field, too.

• Yoga classes. Like exercise and massage, yoga reduces stress and can relieve back pain, osteoarthritis and even menopausal symptoms. Yoga is also one of the best exercises for pregnant women since stress raises the risk of birth defects, which in turn increase health costs. While we could practice yoga with the aid of a DVD or Web video, classes offer social benefits that enhance our psychological well-being.

• Coffee. Studies show that coffee can ward off depression, Alzheimer's disease, type 2 diabetes and sleepiness—which makes it one of the most powerful preventive treatments. Workers who drink java are also more productive and pleasant. While many offices have coffee makers, some employers—most notably those affiliated with the Church of Jesus Christ of Latter-day Saints—continue to deny workers this essential benefit. All employers should have to provide workers with freshly brewed coffee. Oh, and workers must also be able to choose the kind of coffee regardless of the price.

Republicans might argue that requiring Mormon charities to serve coffee is a violation of "religious liberty" since the Mormon church's doctrine proscribes coffee, but this argument is a red herring. Leading medical experts recommend drinking coffee. Moreover, 99% of adults have drunk coffee at one point in their lives (including most Mormons).

• Salad bar. Studies also show that eating a lot of salad helps people maintain a healthy weight, which is key to preventing diabetes, heart disease and hypertension. Admittedly, mandating that employers include a free salad bar in their cafeterias would primarily benefit healthy eaters (women like myself) and raise prices for workers who subsist on junk (most men). However, such a mandate is necessary to expand our access to healthy food. Nanny-state conservatives who oppose this mandate merely want to ban salad and control what we eat.

Republicans may complain that these suggested mandates represent an unconstitutional expansion of federal government power. However, I'm sure Attorney General Eric Holder, Health and Human Services Secretary Kathleen Sebelius and your political adviser David Axelrod could produce a legal memorandum explaining why they are necessary and proper to promote our general welfare (and of course, your re-election).

Besides, if you can justify a mandate on individuals to buy health insurance, this should be a piece of cake.

How can we not at least grant Congress the power to mandate that employers, at a minimum, make peas and carrots available to workers during the work day? What kind of a country have we become? Maybe the article is written in jest, but just where nanny-state, cradle-to-grave government begins and ends, no one knows.

Going the other direction with it, Democratic female lawmakers are avenging proposed limits on the slaughtering of our young with Viagra legislation.

Bill introduced to regulate men's reproductive health
Part of a trend, she likens the bill to men legislating "a woman's womb."

COLUMBUS – Before getting a prescription for Viagra or other erectile dysfunction drugs, men would have to see a sex therapist, receive a cardiac stress test and get a notarized affidavit signed by a sexual partner affirming impotency, if state Sen. Nina Turner has her way.

The Cleveland Democrat introduced Senate Bill 307 this week.

A critic of efforts to restrict abortion and contraception for women, Turner says she is concerned about men’s reproductive health. Turner’s bill joins a trend of female lawmakers submitting bills regulating men’s health. Turner said if state policymakers want to legislate women’s health choices through measures such as House Bill 125, known as the “Heartbeat bill,” they should also be able to legislate men’s reproductive health...

I would note, amid this false declaration of gender wars, that it is women who are the most energetic members of the pro-life movement (anti-women's-rights?), and my guess is that it is older women who might like to enjoy their aging husbands' erections one more time.

THIS week, Robert De Niro made a joke about first ladies, and Newt Gingrich said it was “inexcusable and the president should apologize for him.” Of course, if something is “inexcusable,” an apology doesn’t make any difference, but then again, neither does Newt Gingrich.

Mr. De Niro was speaking at a fund-raiser with the first lady, Michelle Obama. Here’s the joke: “Callista Gingrich. Karen Santorum. Ann Romney. Now do you really think our country is ready for a white first lady?”

The first lady’s press secretary declared the joke “inappropriate,” and Mr. De Niro said his remarks were “not meant to offend.” So, as these things go, even if the terrible damage can never be undone, at least the healing can begin. And we can move on to the next time we choose sides and pretend to be outraged about nothing.

When did we get it in our heads that we have the right to never hear anything we don’t like? In the last year, we’ve been shocked and appalled by the unbelievable insensitivity of Nike shoes, the Fighting Sioux, Hank Williams Jr., Cee Lo Green, Ashton Kutcher, Tracy Morgan, Don Imus, Kirk Cameron, Gilbert Gottfried, the Super Bowl halftime show and the ESPN guys who used the wrong cliché for Jeremy Lin after everyone else used all the others. Who can keep up?

This week, President Obama’s chief political strategist, David Axelrod, described Mitt Romney’s constant advertising barrage in Illinois as a “Mittzkrieg,” and instantly the Republican Jewish Coalition was outraged and called out Mr. Axelrod’s “Holocaust and Nazi imagery” as “disturbing.” Because the message of “Mittzkrieg” was clear: Kill all the Jews. Then the coalition demanded not only that Mr. Axelrod apologize immediately but also that Representative Debbie Wasserman Schultz “publicly rebuke” him. For a pun! For punning against humanity!

The right side of America is mad at President Obama because he hugged the late Derrick Bell, a law professor who believed we live in a racist country, 22 years ago; the left side of America is mad at Rush Limbaugh for seemingly proving him right.

If it weren’t for throwing conniption fits, we wouldn’t get any exercise at all.

I have a better idea. Let’s have an amnesty — from the left and the right — on every made-up, fake, totally insincere, playacted hurt, insult, slight and affront. Let’s make this Sunday the National Day of No Outrage. One day a year when you will not find some tiny thing someone did or said and pretend you can barely continue functioning until they apologize.

If that doesn’t work, what about this: If you see or hear something you don’t like in the media, just go on with your life. Turn the page or flip the dial or pick up your roll of quarters and leave the booth.

The answer to whenever another human being annoys you is not “make them go away forever.” We need to learn to coexist, and it’s actually pretty easy to do. For example, I find Rush Limbaugh obnoxious, but I’ve been able to coexist comfortably with him for 20 years by using this simple method: I never listen to his program. The only time I hear him is when I’m at a stoplight next to a pickup truck.

When the lady at Costco gives you a free sample of its new ham pudding and you don’t like it, you spit it into a napkin and keep shopping. You don’t declare a holy war on ham.

I don’t want to live in a country where no one ever says anything that offends anyone. That’s why we have Canada. That’s not us. If we sand down our rough edges and drain all the color, emotion and spontaneity out of our discourse, we’ll end up with political candidates who never say anything but the safest, blandest, emptiest, most unctuous focus-grouped platitudes and cant. In other words, we’ll get Mitt Romney.

We live in a world where groups like Media Matters and Think Progress monitor and record every word uttered by any conservative of prominence in the media, hoping to find something “insensitive” with which to mount a campaign to silence the speaker through intimidation of advertisers.

So what goes around comes around, and Bill Maher is feeling the heat. And Maher doesn't like it:

Let’s have an amnesty — from the left and the right — on every made-up, fake, totally insincere, playacted hurt, insult, slight and affront. Let’s make this Sunday the National Day of No Outrage. One day a year when you will not find some tiny thing someone did or said and pretend you can barely continue functioning until they apologize.

If that doesn’t work, what about this: If you see or hear something you don’t like in the media, just go on with your life. Turn the page or flip the dial or pick up your roll of quarters and leave the booth.

The answer to whenever another human being annoys you is not “make them go away forever.” We need to learn to coexist, and it’s actually pretty easy to do. For example, I find Rush Limbaugh obnoxious, but I’ve been able to coexist comfortably with him for 20 years by using this simple method: I never listen to his program. The only time I hear him is when I’m at a stoplight next to a pickup truck.

Maher is right. But an amnesty is not possible so long as groups like Media Matters and Think Progress exist, because what Maher decries is their very reason for existence.