dotCommonweal


Fellow dotcommonwealer Anthony Annett takes David Brooks to task for singling out the moral failures of the poor, while overlooking the “beam in the eye” of the rich. Annett rightly notes:

During the postwar era in the United States, there was a fair amount of solidarity between capital and labor. Unions were strong and respected, and the fruits of higher productivity were broadly shared. Top income tax rates were high, and it was considered unseemly for top executive compensation to soar to stratospheric levels. … But the social norms that underpinned this model shifted dramatically during the libertarian revival of the late 1970s and early 1980s, heralded by the rise of Reagan. Now, it became acceptable to put self-interest above social solidarity. Top tax rates were cut, unions were attacked, and the financial sector was unleashed. It became acceptable to push wages to rock bottom simply to maximize shareholder returns and top executive compensation. It became acceptable to scrape the bottom of the barrel in terms of ethical standards to make a quick buck. It became acceptable to spend billions in lobbying for your own short-term interest, while demonizing the poor, and fighting for your extra tax cut to come from their extra benefit. And it became acceptable to insist on the God-given right to perpetual pollution, planet be damned.

Annett is right that both structurally and culturally, we've shifted from a stance of solidarity to a stance of selfishness. Given that Brooks’s column offers some horrifying anecdotes of the destructive culture of poverty, it is only fair that Annett summon up the horrifying images of the filthy rich. I don’t deny the truth in either of these descriptions, but what bothers me about these kinds of dueling descriptions of our economic situation is the extent to which they have a tendency to fall into and trade on stereotypes. Again, there's truth here, but it is so easy for these generalizations to go too far, become too sweeping, and then impair constructive progress. In the first chapter of my book on luxury, I note that the tendency to lock discussions of economic ethics into structural debates controlled by “the market-state binary” means that

the debates also tend to leave things out and arrive at an impasse. They often neglect significant differences in behavior within the categories “rich” and “poor.” To put it bluntly, they trade on stereotypes of both groups, whether positive or negative, and resort to an anecdotal story or two to reinforce their preferred stereotype. The rich are either rapaciously greedy or noble “job creators”; the poor are either struggling victims in need of compassion or lazy, dependent freeloaders in need of personal discipline and a sense of responsibility. But surely neither group is in fact homogenous! “The rich” and “the poor” are misleading abstractions. Such stories often “explain” complex economic problems by scapegoating this or that subgroup – “Wall Street” or “welfare queens,” “government regulators” or “insurance company executives.” Sadly, this passes for reasoned, public debate.

So, Brooks and Annett both have valid points. There really are characteristic, if stereotyped, vices that afflict both rich and poor in our society. Both in fact tend toward the “libertarian default,” though in different ways. But a prudent discussion would get past the stereotypes and find ways to recover moral language that should be shared by all. I think luxury is a key part of that, a language of reasonable, self-controlled spending that recognizes the responsibility of using excess wealth for the common good. Wealth is there to be shared. There are rich and poor who in fact practice such sharing; there are also rich and poor who are consumed by consumption. The primary moral vocabulary is not “rich” and “poor”; it should be solidarity and frugality.

But a moral vocabulary “shared by all” is important, too. All this stereotyping and scapegoating does serve an important political function, which is a further consequence of the market-state binary: by focusing on groups of great wealth or severe poverty, the discussion tends to exempt “the middle class.” If we can blame the Wall Streeters or the dysfunctional poor neighborhoods, then maybe our own lifestyles can get off the hook. But consider a different possibility: maybe the need for norms of solidarity, generosity, and frugality might be most powerful if practiced and expressed by the middle class, and particularly what I call the “39%” – that is, the upper two income quintiles below the 1%. The 39% control a lot of wealth, a lot of votes, and a lot of organizations. Solidarity and frugality could go a long way if that’s what the 39% sought. And of course, some do. Perhaps they are the really important cultural catalysts.

In a recent column, David Brooks wades into the debate on the huge gaps in income and opportunity that have arisen in the United States. He focuses on the plight of the poor, and his argument is essentially that the problem is not so much money and policies as norms and virtues.

In other words, he blames the poor for their own plight, and Elizabeth Stoker Bruenig immediately pounces. She argues, quite persuasively, that the moral values of the poor do not differ from the moral values of the rich, and that what keeps the poor down is the daily grind of poverty and its soul-destroying burden. On this point, Paul Krugman is in complete agreement—he has noted for a while that social dysfunction can be traced to a collapse in decent jobs rather than a collapse in virtue.

But I think that Brooks nonetheless makes a good observation. The cause of much of our social and economic malaise is indeed a breakdown in social norms, the habituation of some wholly unvirtuous behavior. He’s right that we need to look at this through the lens of virtue ethics, especially when he asks core questions like: are you living for short-term pleasure or long-term good?

The only problem is, Brooks singles out the poor, when the real culprits are the rich. The real breakdown in social norms over the past few decades has come from the top.

But he might well have titled it, “An Outline of the Pope’s Forthcoming Encyclical.”

Vatican expert and papal biographer Austen Ivereigh called the lecture “a curtain-raiser” from “the man whose council wrote the first draft.”

The lecture’s overall themes and key phrases resound with the language Pope Francis has used since day one of his pontificate. But more importantly, it signals both how scripture will be interpreted anew against the backdrop of ecological degradation and how Francis’s teaching on “integral ecology” builds on the magisterium of the previous two popes.

The phrase “integral ecology” seems primed to become the encyclical’s central idea. Turkson describes it as “the key to addressing the inter-related issues of human ecology, development and the natural environment.”

The article makes some disputable particular points, but overall, the author rightly shows that a strong distinction between fact and opinion is not coherent. A supposed “fact/value” distinction was in ascendancy in some philosophical circles a century ago, but has been cogently criticized for at least fifty years. Innumerable examples can be adduced to suggest fluidity in both directions: one can have a difference of opinion about who is the best baseball player, but such opinion is itself constrained by facts—there may not be one right answer, but there are very many clearly mistaken answers. From the other direction, how one construes what “the facts” are (or at least what their significance is) is affected by what we value, the moral commitments we have. Again, from this side, we can’t completely “make up” facts, but even contemporary neuroscience affirms that what we actually “see” is affected by our commitments.

Oftentimes, it is good practice to ignore comment boxes (except at dotCommonweal, of course), but my concern was amplified by the comments that followed. The Times curates the comments, and so its “picks” rise to the top—and, astonishingly, most of the picks represent seemingly sane and rational responders who strongly reject the author’s claim, and who want very strongly to adhere to this distinction. As one commenter put it:

After reading many of the comments, it seems as though the great majority of the adults reading this blog don't believe in moral facts. And yet, many of them express this by vehemently claiming that McBrayer is WRONG to impose his view on others (implying that it is a moral fact that one shouldn't do this). Believing in moral facts allows us to call certain practices wrong. I believe slavery was and is wrong and that those who ever thought it was permissible had false beliefs—not just that we happened to change our feelings about it. One can believe in moral facts without being the moralistic monster many are claiming Professor McBrayer is (a judgment that seems to be based on the fact that someone apparently noticed he teaches philosophy of religion—an ad hominem if I've ever seen one). The extreme reactions here astound me.

Count me astonished, too. It is well-known that, with a depressing frequency, those on the far Right abandon reasoned discourse about what we should do; it is a supposed virtue that the political Left is more careful about these matters—say, on the economy or the environment. Yet here we have apparently well-educated Times readers displaying very fundamental irrationality. The great achievements of the Progressive Left in the last century—the New Deal, unionism, and civil rights—all sprang from quite firm moral convictions. And indeed, I still think many of these commentators in practice retain these convictions. What is alarming is that they reject a public discourse that could appeal to these moral commitments…at least as anything other than majoritarian preference.

What is going on? Another comment on the McBrayer piece might illustrate the problem:

Can the bedroom of an eleven-year-old girl be objectively a “mess”? To a pair of exhausted, exasperated working parents the answer is obvious. But when the girl in question notes that “mess” is a value claim and thus is not a matter of fact but an opinion, the point must be grudgingly conceded -- though allowance may still be withheld.

Pride in the growing ability of your child to articulate the difference between fact and opinion is tempered by the realization that it’s being turned against you, and that it will soon be deployed in disagreements inevitably more fraught than whether the dirty socks and Taylor Swift t-shirt need to be picked up right now. That my daughter has learned this skill in school on one level validates our decision to enroll her where we did, though on another it suggests continued vigilance is warranted: The Common Core curriculum, under fire from numerous quarters for a number of reasons, is now also getting the attention of moral philosophers who say it “embeds a misleading distinction between fact and opinion.” From Justin P. McBrayer at The Stone blog of The New York Times:

[O]ur public schools teach students that all claims are either facts or opinions and that all value and moral claims fall into the latter camp. The punchline: there are no moral facts. And if there are no moral facts, then there are no moral truths.

The inconsistency in this curriculum is obvious. For example, at the outset of the school year, my [second-grade] son brought home a list of student rights and responsibilities. Had he already read the lesson on fact vs. opinion, he might have noted that the supposed rights of other students were based on no more than opinions. According to the school’s curriculum, it certainly wasn’t true that his classmates deserved to be treated a particular way — that would make it a fact. Similarly, it wasn’t really true that he had any responsibilities — that would be to make a value claim a truth.

McBrayer says he’d realized many of his college students already don’t believe in moral facts, and that conversations with other philosophy professors suggest “the overwhelming majority of college freshmen … view moral claims as mere opinions that are not true or are true only relative to a culture.” The implications are obvious and relevant to the recent discussion here concerning curricula at Catholic universities. Concerns about moral relativism in academia are established, though, and it’s too soon to know how anything specifically inculcated by Common Core will have an effect. College students were cheating, for example, long before Common Core; so were corporate executives; so were spouses. But it bears watching, of course, given that millions of students in more than forty states are being educated according to the standards -- which themselves might have arisen out of the academic environment McBrayer describes.

Plus, given the pace of technological development, it might one day be not just human beings that need moral compassing.

Since “hate crime” is a legal term, and prosecuting under hate crime legislation requires a particular burden of proof, quoting the family as saying, “this was a hate crime” (which they have repeated) rather than naming it as such is understandable within journalistic constraints. But whether the crime qualifies as a hate crime in a court of law, and whether we can talk about prejudice as a factor outside the courtroom are different things. Anger over an everyday event and having religious or racial prejudices are clearly not mutually exclusive attitudes, and prejudice is not a clear strain of thought easily plucked out from other kinds of thoughts. This is true whether we are describing ourselves, or another person. That feelings, fears, and motivations are often subconscious or partially conscious is partly why social prejudice is so pernicious. It is still necessary and useful to name prejudice when it’s there, but we cannot so easily establish, for ourselves or for others, when it’s not. Of course, not being able to confirm absence doesn’t confirm presence; criticism of hate crime legislation is often about that very difficulty.

“In the event of a nuclear attack, which of these items would be the most helpful? Rank them in order of importance.”

This was one of the first worksheets I remember from elementary school. There were about twenty illustrated items. My classmates and I were perplexed. Sure, we had probably watched a filmstrip that mentioned the Geiger Counter, but none of us could remember what it did. And why would we want a broom? Would we be that concerned with the tidiness of our fallout shelter?

IT WAS ABOUT 1983. That same year, the Russians shot down a Korean civilian airliner over the Sea of Japan; the U.S. Catholic Bishops issued a lengthy warning about the buildup of nuclear weapons; and on September 26, a Soviet Lieutenant Colonel secretly saved the world from accidental Armageddon. But more about Stanislav Petrov later.

Growing up in the early 1980s, not far from North American Aerospace Defense Command (NORAD) and the Air Force Academy, I found the Cold War was a hot topic – even for kids. Popular videos on the burgeoning MTV network, such as Genesis’ “Land of Confusion,” satirized and lamented the possibility of nuclear annihilation. Dads took their sons to see “Top Gun” in theaters, and we cheered when Russian MIGs were splashed in the ocean. “Red Dawn” was always checked out of the video store. One of my favorite books, still there in my parents’ house, was titled “Great Warplanes of the 1980s.”

KIDS TODAY don't have the same fears. They don’t know that the broom is to sweep nuclear fallout off your friends.

The globally aware college students that I teach don’t think about nuclear annihilation. Environmental degradation? Yes. Terrorism? Yes. Economic inequality? Yes. Racial injustice? Absolutely. But if they think about nuclear weapons at all, it’s in the context of who might acquire them – namely, North Korea or Iran. The notion that the arsenals of the already nuclear-armed states should be at the center of moral concern seems outdated, like referring to music videos being shown on MTV.

The fact is, the nuclear capabilities that already exist have grown in power beyond human comprehension, and there have been enough “close calls” regarding their deployment to warrant the gravest of fears. In recent years, many influential voices have made the case that – regardless of whether nuclear weapons ever made us safer – they certainly no longer do so.

In the fall of 2013, the Catholic University of America announced a $1 million pledge from the Koch Foundation, one of the many not-for-profit outfits with strong ties to the billionaire libertarians David and Charles Koch. The money, according to the university, would go to the business school, allowing it to hire professors and offer a course on "principled entrepreneurship." You may remember the Kochs from their charitable efforts to undermine public-employee unions, to support a campaign against renewable-energy standards, to suppress the vote, or to discredit the minimum wage (which the U.S. bishops want to raise).

A group of about fifty Catholic theologians certainly remembered. They sent a disapproving letter to Catholic University, voicing their concern that by accepting the grant, the university was sending "a confusing message to Catholic students and other faithful Catholics that the Koch brothers’ anti-government, Tea Party ideology has the blessing of a university sanctioned by Catholic bishops." But university president John Garvey and business-school dean Andrew Abela remained unmoved. They replied by pointing out that several of the professors cash paychecks from universities that accept Koch money, and accused them of trying to "score political points."

If any of those theologians were clinging to the hope that, given enough time, Garvey and Abela might come around to the idea that there's something odd about a Catholic business school accepting money from people who are so deeply committed to shrinking the social safety net, cutting taxes, weakening environmental regulations, ending the minimum wage, and busting unions, they can let go now. Because Catholic University's business school recently accepted another $1.75 million pledge from the Charles Koch Foundation (in addition to $1.25 million from other donors).

The Senate Intelligence Committee's "Torture Report," a 500-page summary of a 6,700-page classified report, was released today.

Even for those of us who follow the torture beat closely, this report contains significant new information and corroboration of previous suppositions. Among the most alarming findings is that a minimum of 20% of tortured detainees were wrongly detained, some in blatant cases of mistaken identity.

My own research on torture in U.S. detention facilities has emphasized the religious aspects of abuse ("The Secret Weapon" and "Disgrace"). And though today's report does not contain as much along these lines as did the Senate Armed Services Committee’s report in 2009, it does analyze assertions made by CIA Director Hayden in 2007 about the role of religion in "enhanced interrogation."

Hayden argued that the CIA’s experience with detainees and “their particular psychological profile” necessitated interrogation so burdensome that the detainees would consider themselves released from their religious obligations:

Perceiving themselves true believers in a religious war, detainees believe they are morally bound to resist until Allah has sent them a burden too great for them to withstand. At that point — and that point varies by detainee — their cooperation in their own heart and soul becomes blameless and they enter into this cooperative relationship with our debriefers.

… it varies how long it takes, but I gave you a week or two as the normal window in which we actually helped this religious zealot to get over his own personality and put himself in a spirit of cooperation. (485-86)

Over the past few election cycles, Colorado has become an important "battleground state" and a bellwether for larger electoral trends. Featuring contested races for both a Senate seat and the Governor's mansion, it is arguably the most important site of the upcoming midterm elections. The gubernatorial contest has Bob Beauprez, an established figure in the Colorado Republican party, attempting to unseat (the previously very popular) Gov. Hickenlooper.

Social issues have entered the two campaigns in some expected ways -- abortion, health care coverage, gun safety laws, and marijuana legalization. But during these gubernatorial debates, the issue of the death penalty has also briefly held the spotlight.

Back in May, Beauprez made a campaign promise that surprised many, since he presents himself as a faithful Roman Catholic. "When I'm governor," he said during a GOP debate, "Nathan Dunlap will be executed." Or, in a headline offered by Mother Jones, "Elect Me, and I'll Kill that Guy."

Paul Krugman marshals German poet and playwright Friedrich Schiller in writing about a new book on what economists and policymakers have and haven't learned about the crash of 2008: "The gods themselves contend in vain against stupidity." He's not specifically targeting the author of the book, Martin Wolf of the Financial Times, but the would-be reformers who, having come to accept the so-called Standard Model of the crisis (complacency bred by faith in deposit insurance, trust in financial "innovation" and the wisdom of the industry, and the belief that the crisis could be "contained" to the mortgage market), continue to propose drastic spending cuts and deficit reduction as pillars of a stable, even expanding economy. Or, as Krugman himself puts it: to reject "orthodox economics ... in favor of doctrines like 'expansionary austerity'--the unsubstantiated claim that slashing government spending actually creates jobs."

Anyone who reads Krugman won't be surprised by his use of the word "wrongheaded" in describing such prescriptions. But it's the "intellectual shifts" he wants to call attention to: "[the unlearning of] the hard-won lessons of the Great Depression, the return to pre-Keynesian fallacies and prejudices." The application of such fallacies prolonged the crisis; the ongoing expression of such prejudices would make worse "the mess we're in."

Krugman's review appears in the same issue of the New York Review of Books as a piece by Priyamvada Natarajan on three new works about science. Or, how science is understood, misunderstood, scapegoated, and rejected by those who aren't scientists -- including politicians and policymakers but also ordinary citizens feeling overwhelmed by it now, whether or not they had a firm handle on it in the first place.

Natarajan's main point is that people don't understand the provisionality of science, the idea that incremental advances arrived at through trial and error lead to greater, though perhaps not complete, understanding: provisionality is "the state of knowledge at a given time." What happens is that people both expect too much of science (how could this earthquake not have been predictable?) but also distrust it. Refusal to acknowledge climate change is one obvious manifestation of the latter, illustrated tellingly in Natarajan's account of a North Carolina law forbidding "the use of any new data and allowing only historical data in making estimates of sea-level rise in awarding permits" for development in coastal regions.

During oral arguments in Hobby Lobby v. Sebelius and subsequent written opinions, the Supreme Court debated the case's unintended consequences.

Would laws requiring vaccinations or prohibiting child labor, for example, now be affected by the new interpretation of RFRA? Or would the "parade of horribles" never come to pass?

A new case from Utah provides a surprising early glimpse: a member of the Fundamentalist Church of Jesus Christ of Latter-Day Saints (FLDS) has successfully refused a federal subpoena based on his religious belief in secrecy.

This past Thursday the Scripps Institution of Oceanography confirmed that monthly levels of atmospheric carbon dioxide have surpassed 400 parts per million, not only for the first time in human history, but possibly for the first time in more than a million years. Annual measurements taken since 1958 show that levels of carbon dioxide in the atmosphere have risen by about 40 percent since humans began burning fossil fuels more than two centuries ago, and that there have been more greenhouse emissions in just the last forty years than in the previous two hundred. Interviewed in Slate, Ralph Keeling of Scripps explains the significance of 400 ppm: "People like round numbers. When you hit a milestone you realize how far you've come. It's a little bit surreal. You think, 'Whoa, OK, not quite used to this one yet.' It's like having a round number birthday. It takes a while to identify with the new era you're in."

Yes, round numbers have clarifying effects, and I think Keeling's birthday analogy sits interestingly alongside something Cardinal Óscar Andrés Rodríguez Maradiaga said in his remarks opening the Sustainable Humanity, Sustainable Planet, Our Responsibility conference now underway at the Vatican's Pontifical Academy of Sciences: "[M]an finds himself to be a technical giant and an ethical child." For all the scientific and mathematical complexities we can comprehend, obvious milestones still seem necessary for us to think about things, well, more thoughtfully.

Mozilla Firefox is my browser and it works. Every now and again I get an update; sometimes a note asking for a contribution to what is largely/wholly a public-spirited effort to keep the internet open source (or something like that). For some reason, I thought it was an Italian effort (as in Mozzarella), but it turns out it is organized right here in the U.S., drawing on a global community of techies.

Brendan Eich, its recently appointed CEO, is now its recently resigned former CEO. The issue: he donated a thousand dollars to California's Proposition 8 campaign in 2008. It was an effort to turn back a California court decision allowing same-sex marriages in the state. Apparently when this contribution was discovered, there was a social media uproar (didn't see it on Mozilla though). There were calls for his resignation, and according to this story in the NYTimes, he did resign.

UPDATE: Saturday's NYTimes story: The issue of Mr. Eich's social skills comes up. What would social skills consist of in a libertarian context? The story suggests to me that no Mozillian has much in the way of social skills! Or at least, it can't be much of a job requirement.

What are crackers good for? As platforms for peanut butter, herring, and cheese. Shortly after New Year's, opening a new box, I found they did not stay intact long enough to break in half along their perforation. Forget peanut butter!!! Subsequent boxes: more crumble.

Took the matter in hand and wrote to the manufacturer, Mondelez. They have replied: "The differences you noted may be due to a change in the production facility and the process we use to make the cracker. We have also made some minor changes to the formula. Some of the changes we made are: Changed the oil; Removed the Whey Powder; Added Ammonium bicarbonate and sodium metabisulphite (used to make dough rise)....We apologize for this experience. We will make sure our Quality team is aware of your comments. Thank you for your loyalty and we hope that your next experience is a good one."

Since the controversy about (and subsequent veto of) Arizona's SB 1062, a pointed debate in newspapers and blogs has ensued about civil rights vs. religious liberty. Ross Douthat's New York Times column expressed frustration that religious dissenters are not being permitted to "negotiate terms of surrender" in a culture "war."

What makes this response particularly instructive is that such bills have been seen, in the past, as a way for religious conservatives to negotiate surrender — to accept same-sex marriage’s inevitability while carving out protections for dissent. But now, apparently, the official line is that you bigots don’t get to negotiate anymore.

But is this best construed as a war, or does a less threatening metaphor suffice? Perhaps we're not fighting an apocalyptic war of religion vs. secularism, but instead tinkering with our delicate balance of Constitutional rights.

There is a fascinating story in today’s Washington Post about an FDA panel debating the question of whether to allow genetic modification of human embryos to “insert” genes of a third person, when there are genetic defects in the original DNA. Quite apart from how this displays our ongoing hostility toward disability, it has produced the most fascinating set of headline developments I have ever seen. On the paper copy of the Post in my library, the headline reads: “FDA debates idea of three-parent babies.” On the Web, the headline is “FDA panel debates technique that would create embryos with three genetic parents.” Gone are the three-parent babies. Instead, bring in the “techniques” and the “embryos”! But the headline on the Web front page is even further from the print: “FDA debates procedure that mixes DNA from three people to form embryo.” Hmmm. Is “procedure” a bit more medical and less manufacturing in its resonance than is “technique”? Does “mixing DNA to form embryo” work better than “parenting”?

Herbert McCabe wrote in his magnificent Law, Love, and Language that what ethics is really all about is not simply law or love; rather, it is about developing a language whereby we could see more and more deeply and richly into the genuine significance of human living. The headline variance here provides quite a test case!

If you’ve spent any time in the last ten days or so watching the Olympics you may have caught the ad from Cadillac and thought to yourself: wait -- what? To synopsize: pugnacious, square-jawed guy speaks directly to camera about why the American way of doing things is so great, as he takes the viewer on a swaggering tour of his holdings: from the vista of his infinity pool, across the natural-lit expanses of his glass-sided home, and ultimately to his serene, manicured driveway, where a shiny new Cadillac ELR awaits the promised imprint of his imperial haunches. The ad is titled “Work Hard,” and on advertising site iSpot it’s summarized like this: “Why do you work hard, foregoing [sic] vacation, family, and personal time? For stuff? No, it’s for a sense of accomplishment.”

Why do we work so hard? For what? For this? For stuff? Other countries, they work, they stroll home, they stop by the cafe, they take August off. Off. Why aren't you like that? Why aren't we like that? Because we're crazy, driven, hard-working believers, that's why. Those other countries think we're nuts. Whatever. Were the Wright Brothers insane? Bill Gates? Les Paul? Ali? Were we nuts when we pointed to the moon? That’s right. We went up there. You know what we got? Bored. So we left. Got a car up there, left the keys in it. You know why? Because we're the only ones going back up there, that's why.

But I digress. It's pretty simple. You work hard, you create your own luck, and you gotta believe anything is possible. As for all the stuff, that's the upside of only taking two weeks off in August. N’est-ce pas?

So: Inspiring, or repulsive? That’s the either/or quality of the debate that’s taken shape in the days since the ad first aired, but after repeated viewings I find it to be neither. Or, at any rate, not simply repulsive; plenty of commercials just by dint of their being commercials are repulsive. But the (quite literal) wink that comes with this ad pushes it into a different category. Come on, it wants to assure us, we know we’re being over the top here; we’re really just joking. But like anything that comes with a wink, there’s the other, underlying assurance to those in the know that it’s not a joke. Don’t be fooled by the appropriation of talismans of cool like Les Paul and Muhammad Ali—these are just two more acquisitions for this guy, accumulated cultural “capital” no more familiar to him than the art he’s purchased for his walls (as others have pointed out, doesn’t he realize that Ali forswore his given American name, converted to Islam, refused military conscription, and criticized U.S. policy on race and economics?). Don’t be fooled that he actually unplugs his little reward to himself—how much of an offset to a carbon footprint like his will an electric car provide? And then there’s the snotty French sign-off, which against the backdrop of international athletic competition underscores the current “maker” contempt toward any system not explicitly tuned to maximize personal wealth, American-style.

But it’s just a joke. And it’s not about wealth or stuff, even though the Cadillac ELR is, according to the advertising, “priced from $75,000,” home-charging station not included.

Of all the things President Obama said in the long New Yorker profile-interview last week, I found it interesting how many people seized on his remark that he wouldn’t let his son play pro football. Syria? The ACA? Obstructionist Congressional Republicans? There were about seventeen thousand other words to choose from, but with the two-week gap between conference championships and Super Bowl Sunday, maybe people were itching for something, anything, football-related to talk about (surely it wasn’t just another reason to criticize Obama for positing “imaginary” offspring and apologizing for America?).

The president said something similar this time last year, only then it was that he’d have to think “long and hard about it.” Of course, the twelve months between have served up still more stories of players now living with (and dying from) the effects of catastrophic brain injury tied to playing football, and still more data confirming the connection. So maybe it’s understandable that his position has solidified. And yet then came what he called his “caveat emptor,” that current NFL players “know what they’re buying into. It’s no longer a secret. It’s sort of the feeling I have about smokers, you know?”

Just how responsible are fans and viewers of football for the well-being of the people playing it?

It's hard to believe that question is still being debated, isn't it? For over 100 years, the definitive answer has been No. Pope after pope after pope, right up to Benedict XVI, has explained this in the most magisterial ways.

But perhaps it has taken Pope Francis's singular history, style, and gift for communication to break through the noise of American-style capitalism. Or perhaps the underbelly of globalization has finally come to light, through a combination of the explosion of financial capital, the worldwide recession, and the opportunities afforded by the Information Age for learning about the distant effects of almost-unregulated markets.

Whatever the reason, Pope Francis is getting through. He is obviously not a Marxist or socialist. But he is leveling strong critiques of the current state of global capitalism -- as it is actually being practiced. And to my mind, one of the best interpreters of his message (especially for those reading from the right wing) has been Michael Gerson.