Harry Frankfurt, professor emeritus at Princeton, is best known among philosophers for what are called “Frankfurt cases”: apparent counterexamples to the widespread assumption that an agent is morally responsible for what she does only if she can do otherwise. Frankfurt cases might be relevant to understanding Trump insofar as they imply that he (like anyone else) should be held morally responsible for his words and actions even if he is incapable of speaking or acting otherwise (as he well might be). But more relevant here is a distinction Frankfurt developed in 1985, and later published in his accessible little book entitled “On Bullshit” (2005). It was this book (or at least its title) that inspired Jon Stewart to interview Frankfurt on The Daily Show, thereby elevating (or lowering?) him to the role of “public intellectual” for a brief period.

“On Bullshit” is not among Frankfurt’s best work, but it’s about as readable as analytic philosophy can be. It’s also just plain fun to read a book by a highly respected professor that is copiously sprinkled with ‘bullshit’. The book’s framework can help to explain the sort of casual disregard for truth and justification (or evidence) that Trump has demonstrated at least since his “Birther” years (2011-2016). The fundamental idea is this: we need to distinguish bullshitters both from people with merely false beliefs and from liars. Anyone can be mistaken, and reasonable people will usually modify their beliefs once they are aware of their falsity. Take, for example, Zeke Miller, the reporter who mistakenly reported last week that a bust of MLK Jr. had been removed from the Oval Office; learning of his mistake, he immediately corrected his report. Liars, by contrast, are usually aware of their statements’ falsity, and – for that very reason – intentionally try to hide it. So they recognize and “respect” (in the sense of fear) at least the particular truth they are trying to hide. But bullshitters have no respect for, or even fear of, the truth. The concept is not on their radar screen. Bullshitters might use the terms ‘true’ or ‘false’ and ‘right’ or ‘wrong’, but they don’t do so seriously. Think of Trump’s multiple exclamations of ‘wrong!’ during the debates with Clinton. He made no attempt to follow up with any evidence that might justify his charge, and thereby demonstrate at least a modicum of respect for truth. Rather, he seemed content merely to express his disapproval… and perhaps to do so in a way he thought his followers might find entertaining or pleasing in some other way.

The failure to take truth, evidence, or epistemic rationality seriously suggests a close kinship between natural bullshitters like Trump and far more intellectual “post-modernists”. Unlike contemporary modernists – many of whom regard objective truth as a mere ideal, but one towards which we can make progress by adopting apparently reliable belief-forming processes – post-modernists (of a certain stripe) reject the notion of objective truth outright, preferring to promote the value of sincerity instead. Sincerity, on such a view, is the best one can hope for, epistemically speaking.

Now, a significant number of Trump’s followers cite his penchant for “speaking his mind” as their main reason for voting for him, and downplay the importance of what he says being true or justified, or even of his beliefs being consistent with their own. (This was evident in Tom Ashbrook’s interview of Trump supporters on On Point this morning.) Clearly, such followers value sincerity over evidence, which they seem quite comfortable without, and I see no reason to doubt their judgment that Trump is sincere (at least some of the time). Surely it would be difficult if not humanly impossible to express such wildly unjustified beliefs as that climate change is a Chinese hoax, or that most undocumented Mexican immigrants are rapists and drug dealers, or – for many years – that Obama was born in Kenya, without holding those beliefs sincerely. On the other hand, Trump seems not to value sincerity for its own sake. Rather, what ultimately matters to him seems to be neither truth, nor evidence, nor sincerity, but rather the practical consequences of expressing a belief. That is, Trump seems to use his expressions of belief like a carpenter uses his tools: to build his base of support, or to manipulate a situation (for example, to strike a deal or make an ally). In this he may appear to more closely resemble a pragmatist than a post-modernist. However, pragmatism, a philosophical outlook pioneered by William James, John Dewey, and Charles Peirce, takes the concept of truth seriously enough to bother redefining it in terms of the practical consequences of holding a belief. Bullshitters don’t care enough about truth to bother redefining it, and since Trump keeps reconfirming the observation that he is a bullshitter, I think we can safely avoid placing him in the lofty company of James, Dewey, or Peirce.

I want to stress that identifying Trump as a bullshitter in no way directly impugns his policy positions. Even the most dedicated bullshitter may (at least inadvertently) speak the truth, or have a good policy idea. Neither should we assume that most of Trump’s supporters value his sincerity merely for its own sake; many may also take it as a sign that he will keep his policy promises. Finally, although Trump often invites insults by insulting others, arguing that we should reject his policies merely because he is a bullshitter would be to commit an ad hominem fallacy, and so to imitate him. However, when Trump argues that we should accept his pronouncements and policies just because he is trustworthy or reliable, then it is not fallacious at all to point out that he is neither. And the more bullshit he spouts, the less we should regard him as either.

Trumpology, the still-nascent study of Trump, is an interdisciplinary pursuit. I have nothing to add to what many pundits and social scientists have observed about Trump’s politics and policies (such as they are, or as they might come to be). But I do think that philosophers have contributions to make to the field (beyond pointing out that Trump’s tweets are a virtual cornucopia of ad hominem fallacies). For instance, there have been several articles on how Harry Frankfurt’s analysis of bullshit seems to apply to Trump’s rhetoric, including this piece by Jeet Heer at the New Republic. (By the way, for an entertaining if over-the-top primer on Frankfurt’s view of bullshit, see this video). Others have noted how Trump may be the first post-modernist president, which is richly ironic, given how Republicans have vilified post-modernist academics for so many decades. But as far as I know, no one has yet noted how Martha Nussbaum’s “moral psychology” – a type of philosophy that focuses on the psychological determinants of moral and immoral behavior – might help to explain the authoritarian (or, if you prefer, the bullying) aspects of Trump’s personality. I have no reason to think that she had Trump in mind when she developed the ideas I discuss below (why would she have, unless her TV happened to be tuned to The Apprentice while she was writing?), but that just makes their “trumpological” relevance all the more remarkable. I also don’t know whether she would agree with how I am applying a very small part of her framework to our new President. Although she made a few remarks about Trump in the context of an interview on anger at The Atlantic, she did not mention how her broader view of the authoritarian personality might apply to him.

The central ideas of Nussbaum’s moral psychology are defended in depth and detail in her Upheavals Of Thought (2001). However, if you lack the time or patience to read that 800+ page tome, many of the same general ideas can be found in her much shorter and more accessible Not For Profit: Why Democracy Needs the Humanities (2010). I highly recommend either book, but for brevity’s sake I’ll focus here only on the ideas as they are expressed in Not For Profit.

As with Freud’s psychoanalytic theory, Nussbaum’s brand of moral psychology has been criticized for over-emphasizing early childhood development. However, Nussbaum regards her developmental views as constituting a mere “narrative”, rather than a scientific theory. As such, she feels little need to present experimental evidence (although she does present some). Rather, she argues that her speculations are well-supported by ordinary experience (including memories of one’s own childhood, observations of other children, clinical observations offered by psychotherapists of their patients, and the deep-dives into experience that the best novelists are capable of). In any case, the question here is not whether her narrative is generalizable, but rather whether it is applicable to the quite particular case of Donald Trump. After all, although Trump’s political ideology and policy prescriptions are far from clear, his personality has been in the spotlight for at least a year and a half, and I believe that Nussbaum’s view of how such a personality might be produced can shed some light on the authoritarian tendencies Trump has plainly displayed. That doesn’t mean it’s the only way of accounting for Trump’s personality… But it’s a possible starting point for biographers and other Trumpologists to consider.

As Nussbaum’s narrative is condensed in Chapter 3 of Not For Profit, it begins with two main observations. First, “Human infants are born, helpless, into a world that they did not make and do not control”, and second, infants (and even newborns) are born with, or else rapidly develop, a significant amount of emotional sophistication. Since for Nussbaum – as for many other philosophers and psychologists – emotions are cognitions or judgments of a situation’s value relative to the subject’s goals or preferences, this implies that young children, infants, and even some non-human animals have a more sophisticated mental life than one might suppose. In the case of most humans, first comes anxiety, which is caused by the infant’s expectation that their immediate needs and desires will be satisfied, along with the fact that sometimes – perhaps often – this expectation is not met. More controversially, perhaps, Nussbaum asserts that this anxiety is accompanied by a primitive type of shame that is caused by the (non-verbal) realization that “one is not in fact omnipotent”. Narcissism, a trait characterized by an obsessive self-focus and an ongoing desire for completeness, power, and control, emerges as a reaction to the primitive anxiety and shame. Complicating matters further, the shame that began as a result of dependency on others is soon joined by disgust at one’s own bodily waste products – an emotion often triggered by toilet training. That disgust, combined with a growing narcissism, leads to projection. We observe this when children stigmatize other children as “having cooties”, but that relatively innocent sort of game can turn much more serious when social influences focus the projection onto subordinate groups. At that point whole populations can be bifurcated into the “pure” and the “impure”. 
Later, particularly among adolescent males, peer pressure based on socially accepted conceptions of “the real man” (that include unrealistic norms of perfection, invulnerability, and control) tends to further exacerbate the narcissism that began in infancy. Finally, given a social milieu in which most people tend to defer to authority, acquiesce to the dehumanization of vulnerable groups (e.g., the widespread objectification of women), and fail to raise critical or dissenting voices, the authoritarian personality grows and thrives.

Obviously, this narrative is controversial. I should also stress that it is only a very partial representation of Nussbaum’s view, as she gives an equally detailed analysis (in Upheavals of Thought) of competing developmental processes – involving innate and learned capacities for empathy and compassion – that allow people to avoid developing intense narcissism or an authoritarian personality. However, if this view of authoritarian personality development is right, its value lies in how it can reveal junctures at which parents and societies can intervene in the process to disrupt it, and (on the positive side of the ledger) to encourage the development of an empathetic citizenry capable of participating in a genuine democracy. To that end, Nussbaum goes on to point out ways of countering projective disgust, the pure/impure bifurcation, narcissism, and unrealistic norms of perfection and invulnerability.

So to what extent can Nussbaum’s narrative help to explain Trump? First, Trump’s having been for most of his life the head of a highly successful (whether multi-million or multi-billion dollar) company seems likely to have involved the social factors that could nurture his authoritarian personality, once it had already started to develop. That is, in such a situation, Trump would likely have been surrounded by people who would defer to his authority, and who would feel uncomfortable voicing much dissent. Also, at some point prior to this, Trump, like most other Americans, would have been exposed to the prejudice, racism, and misogyny that have always permeated American culture. Indeed, these were more intense during Trump’s formative years than they are now. And here is where the rubber really meets the road. For if Nussbaum is right, it was those influences that shaped and amplified Trump’s projective disgust, which was so amply manifested during the campaign. Consistent with the observation that he is misogynistic, it began with his singling out women as targets. For instance, as Alexander Hurst noted in The New Republic:

Last week, Donald Trump was once again disgusted. Commenting on Hillary Clinton’s awkward bathroom break during the last Democratic debate, he said, “I know where she went, it’s disgusting, I don’t want to talk about it. No, it’s too disgusting. Don’t say it, it’s disgusting, let’s not talk.”

It’s not the first time that Trump has been perturbed by a bodily function. As Frank Bruni noted in his New York Times column, Trump has been publicly disgusted by Marco Rubio’s sweat and by the idea of pumping breast milk. Then there was his notorious comment about Fox News host Megyn Kelly, in which he conveyed an almost visceral revulsion: “You could see there was blood coming out of her eyes, blood coming out of her wherever.”

The Trump campaign has stunned bemused pundits by growing in strength with every controversy and outrageous policy proposal, like banning foreign Muslims from entering the United States. It has finally forced them to admit that his success comes not despite these things, but because of them.

What if disgust is a distinct part of that?

What if, indeed?

Similarly, Trump repeatedly manifested the adolescent norms of “real manhood” in his dealings with the other candidates. Perhaps these were cemented into his personality during his years at the military academy he attended, which he fondly remembers as being quite formative. In any case, we all recall how Jeb Bush (and later Clinton) was accused of being “low-energy”, and how Marco Rubio was nicknamed “Little Marco”. Ted Cruz escaped this direct form of belittling, perhaps because Trump sensed that Cruz has narcissistic tendencies (“leadership qualities”?) similar to his own. However, Cruz did have to deal with insults against his wife, which is not surprising on a Nussbaumian analysis. For according to adolescent male ideals, the more attractive a man’s wife or girlfriend is, the more “manly” the man is. If Trump intuited that Cruz shared his own adolescent conception of a “real man”, he also might have supposed that he could successfully attack Cruz by insulting his wife. He did, and it worked: Cruz got suitably rattled. “Real men” don’t get rattled. Finally, when Rubio tried to strike back at the same adolescent level by insinuating that Trump’s little fingers indicated that he has a little penis, Trump proudly remarked that that was certainly not the case. (That’s how “real men” respond to an insult: not by getting rattled, but by doubling down.) And that response worked, because Rubio was trying to out-bully Trump, a hopeless task given that it takes a lifetime to develop an authoritarian personality of Trumpian proportions, and it was obvious that Rubio was not that type of guy (and Bush was even less so). Commentators, of course, decried the level on which this discourse played out, but that level could have been predicted (and explained) by Nussbaum’s narrative.

Finally, Nussbaum’s narrative might be useful in explaining Trump’s “bromance” with Vladimir Putin. For on that narrative, authoritarian bravado masks a deep insecurity born of unresolved anxiety, shame, and disgust. Trump’s oversized self-image needs constant feeding to counterbalance those negative emotions or self-judgments (this, of course, helps to explain his thin skin). Who better to supply this feeding than a fellow authoritarian like Putin, himself a role model of “male perfection” (as Trump may view him)? There might of course be many other factors at work here, including the fact that Trump would feel a natural affinity for another authoritarian who projects disgust at a shared “out-group” (Muslims). The danger, however, is that two authoritarian narcissists make for a very unstable couple. We can only hope that they can resolve whatever problems arise between them by directly comparing their genitals, rather than by launching missiles at each other (and at us).

At this uncertain moment of history, at the very beginning of the Trump Era, my hope is that the authoritarian tendencies Trump has displayed, the obvious (Frankfurtian) bullshit he has spouted, and the negative emotions he has projected, have been mostly for show. Perhaps he is actually more intelligent and compassionate than he seems. Barring that, I hope that he has grown-ups around him – perhaps some of his children! – who can help to mitigate his worst instinctive reactions. The best that can be said of Trump is that his thought processes seem not to be orderly enough to sustain a coherent ideology (for evidence, just try to make sense of his meanderings during this New York Times interview). However, we can only hope that a disorderly mind is not a precursor to a truly disordered one. If it is, philosophy might be our only consolation.

Who would have thunk it? Aristotle analyzed trolling millennia ago! The long-lost manuscript was recently discovered and translated by Rachel Barney, and published in the Journal of the American Philosophical Association! Here’s the first paragraph:

That trolling is a shameful thing, and that no one of sense would accept to be called ‘troll’, all are agreed; but what trolling is, and how many its species are, and whether there is an excellence of the troll, is unclear. And indeed trolling is said in many ways; for some call ‘troll’ anyone who is abusive on the internet, but this is only the disagreeable person, or in newspaper comments the angry old man. And the one who disagrees loudly on the blog on each occasion is a lover of controversy, or an attention-seeker. And none of these is the troll, or perhaps some are of a mixed type; for there is no art in what they do. (Whether it is possible to troll one’s own blog is unclear; for the one who poses divisive questions seems only to seek controversy, and to do so openly; and this is not trolling but rather a kind of clickbait.)

I read Harry Frankfurt’s “On Bullshit” shortly after it was published some eleven years ago. I’ve thought now and then about how I might incorporate it into one of my philosophy courses, but I’ve never found an acceptably seamless way of fitting it in (given everything else I wanted to discuss). Although it does not demonstrate Frankfurt’s meticulous analytic method, this video might be a second-best choice for getting discussion going next time a student wonders aloud whether philosophy itself is bullshit. My answer is: no, it’s not; some philosophers do bullshit, but one of the main purposes of contemporary philosophy is to not let them get away with it for very long.

At the end of a recent PBS NewsHour interview, Jeffrey Brown asked actress Juliette Binoche how she felt about the sorts of roles she can expect to be offered, now that she is older than 50. She gave one of the most positive responses about aging that I’ve ever heard. Here it is, rendered as verse-

I mean, I’ve aged, you know,
I have experience.
And so it’s not as if
I’m not facing it…
but it’s not a fear.
‘Cause time is a tool to grow!
If you don’t have that tool,
how can you grow?
How can you transform?
So, you have to believe that
time is your best friend!
Imagine if you had to die
when you’re young,
you’d feel like, wow!
You know, what I’ve learned with time
is amazing!

There is a very good cover story in Harper’s Magazine this month (September issue) by William Deresiewicz entitled “How College Sold Its Soul… and surrendered to the market.” This story is especially relevant here in Wisconsin, where Governor Walker and the Republican-controlled legislature recently slashed the UW system budget by $250,000,000 while freezing tuition, and “the search for truth” came close to being excised from the UW’s mission statement. Although many students are under the misapprehension that eschewing liberal arts programs in favor of business and professional ones is likely to improve their financial position over the long run, pointing that out isn’t Deresiewicz’s main concern; rather, he’s arguing that college should not be viewed in economic terms at all. Here’s a brief excerpt from the article:

It is not the humanities per se that are under attack. It is learning: learning for its own sake, curiosity for its own sake, ideas for their own sake. It is the liberal arts, but understood in their true meaning, as all of those fields in which knowledge is pursued as an end in itself, the sciences and social sciences included. History, sociology, and political-science majors endure the same kind of ritual hazing (“Oh, so you decided to go for the big bucks”) as do people who major in French or philosophy. Governor Rick Scott of Florida has singled out anthropology majors as something that his state does not need more of. Everybody talks about the STEM fields – science, technology, engineering, and math – but no one’s really interested in science, and no one’s really interested in math: interested in funding them, interested in having their kids or their constituents pursue careers in them. That leaves technology and engineering, which means (since the second is a subset of the first) it leaves technology.

Deresiewicz locates the origin of the problem in the ascendancy of “neo-liberalism”, by which he means “an ideology that reduces all values to money values.” Corporate and other business interests would prefer that colleges act as vocational schools, rather than that they train students to reason critically and creatively. He points out that it is not in the interests of economic elites to have students conceiving of alternatives to the status quo, or at least to have them gaining the skills that would allow them to do so. Whether you agree with his diagnosis or not, his critique of current attitudes towards higher education (even on college campuses themselves) is well worth reading.

If you have trouble finding the article, Kathleen Dunn of WPR interviewed Deresiewicz on Monday 8/31, and they covered many issues not discussed in the article, including Wisconsin-related ones. You can listen to or download the segment here. You can also find the podcast on iTunes.

It’s been fascinating to read the news stories on Sandra, the orangutan who, an Argentine court decided, has a right to freedom as a “non-human person”. Reporting it, UPI made one of the most revealing blunders, declaring-

On Sunday the court agreed with AFADA attorneys’ argument that Sandra was denied her freedom as a “non-human person” — a distinction that places Sandra as a human in a philosophical sense, rather than physical.

Well, no: the distinction doesn’t “place Sandra as a human” in any sense, and especially not “in a philosophical sense”. Rather, the court is implying that non-human animals have rights, not as honorary members of our species, but in virtue of their own cognitive abilities. Some animal rights activists might even take offense at this sort of “discrimination” by cognitive class (at what degree of cognitive impairment does a human cease to have rights?), but at least it avoids the – probably unconscious – speciesism that seems to lie behind the UPI comment.

That’s not to say it is philosophically easy to decide who has rights, and on what basis, partly because there are so many views of what a “right” is. What seems clear is that granting all and only humans rights (on the basis of their species alone) is objectionably arbitrary. An alternative approach is to argue that any sentient creature deserves moral consideration on the basis of its ability to feel pleasure or pain, but such a view has its own complications. While all of the philosophical kinks are being worked out (a process that is notoriously slow), it seems safest to “err” on the side of maximal compassion, which we can hope to be also the side of maximal impartial rationality.

Over the years I’ve posted several audio excerpts from Alan Watts’ talks, but I hadn’t seen any animated illustrations of his suggestions or parables until today. Here are a couple of short ones that draw from his lectures on Zen and Daoism (thanks to Tom via Berry for finding them)-

The second one, in particular, raises some interesting questions: is Watts – or the parable – suggesting that one should never judge an event to be good or bad, just because one can never know all of the event’s long-term consequences, and one can never be certain even of its immediate, short-term consequences? In the case of each of the events in the parable, instead of the farmer’s saying “Maybe”, could he not have said something just a little stronger: “Probably”? True, he would have been wrong about the improbable consequences of the events in the story, but if he made a habit of saying “probably”, wouldn’t he be right at least most of the time? And wouldn’t that be enough to allow for the usefulness of at least some value judgments (the ones past experience teaches us we can be most confident about)?

I guess my point is this: yes, nature is very complex, and our minds are very limited, as are the data we use when we judge some event to be good or bad. But our minds are also part of the complexity of nature, and the somewhat predictable patterns of nature can and should inform our minds. I think the best lesson to be learned from the parable is not that one should never make value judgments, but that one should be very humble when making them, and act only on those judgments one has good reason to believe are true.

I’ve railed against Facebook many times on this blog, and in 2010’s “Facebook: Beyond The Last Straw” I promised I would stop. I managed to keep that promise for nearly four years, but I’ve been roused to rail once again by the confluence of four different interests I happen to have: emotion research (one of my philosophical activities), ethics (a subject I teach), federal regulations covering university research (which I help to administer by serving on my university’s Institutional Review Board) and the internet (which, of course, I constantly use).

In case you haven’t yet heard, what Facebook did was to manipulate the “news feeds” users received from their friends, eliminating items with words associated with either positive emotions or negative emotions, and then observing the degree to which the manipulated users subsequently employed such positive or negative vocabulary in their own posts. Facebook’s main goal was to disconfirm a hypothesis suggested by previous researchers that users would be placed in a negative mood by their friends’ positive news items, or in a positive mood by their friends’ negative news items. As I understand it, the results did disconfirm that hypothesis, and confirmed the opposite one (namely, that users would be placed in congruent rather than incongruent mood states by reading their friends’ positive or negative news items), but just barely.

Although I find this methodology questionable on a number of grounds, apparently peer-reviewers did not. The research was published in a reputable journal. More interesting to me are the ethical implications of Facebook’s having used their users as guinea pigs this way.

The best article I’ve found on the net about the ethical issues raised by this experiment was written as an opinion piece on Wired by Michelle N. Meyer, Director of Bioethics Policy in the Union Graduate College-Icahn School of Medicine at Mount Sinai Bioethics Program. Meyer is writing specifically about the question of whether the research, which involved faculty from several universities whose human-subject research is federally regulated, could have (and should have) been approved under the relevant regulations. Ultimately, she argues that it both could have and should have, assuming that the manipulation posed minimal risk (relative to other manipulations users regularly undergo on Facebook and other sites). Her only caveat is that more specific consent should have been obtained from the subjects (without giving away the manipulation involved), and some debriefing should have occurred afterward. If you’re interested in her reasoning, which at first glance I find basically sound, I encourage you to read the whole article. Meyer’s bottom line is this-

We can certainly have a conversation about the appropriateness of Facebook-like manipulations, data mining, and other 21st-century practices. But so long as we allow private entities to engage freely in these practices, we ought not unduly restrain academics trying to determine their effects. Recall those fear appeals I mentioned above. As one social psychology doctoral candidate noted on Twitter, IRBs make it impossible to study the effects of appeals that carry the same intensity of fear as real-world appeals to which people are exposed routinely, and on a mass scale, with unknown consequences. That doesn’t make a lot of sense. What corporations can do at will to serve their bottom line, and non-profits can do to serve their cause, we shouldn’t make (even) harder—or impossible—for those seeking to produce generalizable knowledge to do.

My only gripe with this is that it doesn’t push strongly enough for the sort of “conversation” mentioned in the first line. The ways in which social media sites – and other internet sites – can legally manipulate their users without their specific consent are, as far as I can tell, entirely unregulated. Yes, the net should be open and free, but manipulation of the sort Facebook engaged in undermines rather than enhances user freedom. We shouldn’t expect to be able to avoid every attempt to influence our emotions, but there is an important difference between (for instance) being exposed to an ad as a price of admission, and having the information one’s friends intended you to see being edited, unbeknownst to you or your friends, for some third party’s ulterior purpose.

As both a fan of the BK Veggie (at least when starving and passing through a small town with only fast food restaurants and no Subway) and a philosophy professor, I found this news item almost as interesting as it is just plain weird: Burger King, in its infinite corporate wisdom, has decided to change its catch-phrase from “Have It Your Way” to “Be Your Way”. BurgerBusiness.com apparently got the scoop–

Fernando Machado, SVP_Global Brand Management, told BurgerBusiness.com that the new tagline is the result of a company reexamination of its brand and its relationship with its customers. “Burger King is a look-you-in-the-eyes brand, a relaxed and a friendly brand. It is approachable and welcoming,” he said. “So we wanted the positioning to reflect that closeness. We elevated ‘Have It Your Way’ to ‘Be Your Way’ because it is a richer expression of the relationship between our brand and our customers. We’ll still make it your way, but the relationship is deeper than that.”

Sure, Be Your Way: be obese, be diabetic, be wasteful, be oblivious (except, of course, when you order the Veggie). We’ll take your money, however you are. Of course, “Have It Your Way” has its own share of unfortunate associations: have a heart attack, have a stroke, have gastric distress… But what seems to be moving the advertisers here is rather this: since being indicates a “deeper relationship” than having, and since what you are is likely to be more important to you than merely what you have, emphasizing being over having should lead you to desire a Whopper more than you would were you still stumbling into one of their establishments under the less efficacious spell of their traditional catch-phrase. However, the relationship between desiring, being, and having can be tricky, as Jean-Paul Sartre made abundantly clear in his epic Existentialist tome, Being and Nothingness. Here’s a quick summary of his view on this, courtesy of the Internet Encyclopedia of Philosophy–

For Sartre, the lover seeks to possess the loved one [or the loved burger – ed.] and thus integrate her into his being: this is the satisfaction of desire. He simultaneously wishes the loved one nevertheless remain beyond his being as the other he desires, i.e. he wishes to remain in the state of desiring. These are incompatible aspects of desire: the being of desire is therefore incompatible with its satisfaction.

So… do the advertisers really want to short-circuit the desiring process, and prematurely emphasize being over having? But wait… the plot thickens–

In the lengthier discussion on the topic “Being and Having,” Sartre differentiates between three relations to an object that can be projected in desiring. These are being, doing and having. Sartre argues that relations of desire aimed at doing are reducible to one of the other two types. His examination of these two types can be summarised as follows. Desiring expressed in terms of being is aimed at the self. And desiring expressed in terms of having is aimed at possession. But an object is possessed insofar as it is related to me by an internal ontological bond… Through that bond, the object is represented as my creation. The possessed object is represented both as part of me and as my creation. With respect to this object, I am therefore viewed both as an in-itself [an inert, untroubled thing – ed.] and as endowed with freedom. The object is thus a symbol of the subject’s being, which presents it in a way that conforms with the aims of the fundamental project [that is, the impossible project of being God, who alone can be conscious of something without being alienated from it – ed.]. Sartre can therefore subsume the case of desiring to have under that of desiring to be, and we are thus left with a single type of desire, that for being.

So, ultimately, if desiring to have is reducible to desiring to be, the advertisers might be wasting their time – much ado about nothing. Or is that much ado about nothingness?

As a rule, musicals tend to strike me as amusing at best (Passing Strange aside), and only time will tell whether Sting and his cohort of Broadway pros can pull off the rare feat of successfully marrying rock, pop, or folk songs to an emotionally resonant and theatrically stageable story. But the more I listen to the numbers Sting has written for his The Last Ship project, the more they grow on me. You can listen to many of those songs, performed live by Sting and several cast members, on this American Masters episode. Meanwhile, here’s one of the more thought-provoking and suggestive songs from the album (not included in that episode), one that demonstrates how a talented – and well-read – songwriter (or two) can relate an interpretation of quantum physics to a theme with a lot of poetic and dramatic potential: how choices create universes, and how those universes might be related to parallel universes not only physically, but – more humanly – by relief, or regret, or resignation, or…

“It’s Not The Same Moon”
by Sting and Rob Mathes

Did you ever hear the theory of the universe?
Where every time you make a choice,
A brand new planet gets created?
Did you ever hear that theory?
Does it carry any sense?
That a choice can split the world in two,
Or is it all just too immense for you?

That they all exist in parallel,
Each one separate from the other,
And every subsequent decision,
Makes a new world then another,
And they all stretch out towards infinity,
Getting further and further away.

Now, were a man to reconsider his position,
And try to spin the world back to its original state?
It’s not a scientific proposition,
And relatively speaking…you’re late.

It’s not the same moon in the sky,
And these are different stars,
And these are different constellations,
From the ones that you’ve described.
Different rules of navigation,
Strange coordinates and lines,
A completely different zodiac,
Of unfamiliar signs.

It’s not the same moon in the sky,
And those planets are misleading,
I wouldn’t even try to take a bearing or a reading,
Just accept that things are different,
You’ve no choice but to comply,
When smarter men have failed to see,
The logic as to why.

I usually teach two books in my “Contemporary Philosophy” class: A. J. Ayer’s Language, Truth and Logic, and Saul Kripke’s Naming and Necessity. Ayer’s book nicely illustrates the limits of verificationist semantics, the problems with phenomenalism, and the futility of trying to eliminate metaphysics from philosophy. Kripke’s book shows how metaphysics survived – and ultimately exploited – the “linguistic turn” taken by 20th century analytic philosophy. One thing that both books have in common, however, is at least a passing concern with unicorns.

Ayer uses the sentence “Unicorns are fictitious” to illustrate how surface grammar can systematically mislead philosophers into spouting metaphysical nonsense (e.g., that since ‘unicorns’ seems to be the subject of this sentence, they must “have a mode of real being which is different from the mode of existing things”). Kripke, on the other hand, uses his scientific essentialism to argue that unicorns not only do not actually exist; they could not even possibly exist.

Well, we were talking about Ayer’s discussion of unicorns in class today, and Shannon, one of my sharpest students, later tweeted me that “‘Back to the unicorns’ is something one only hears in Harry Potter or philosophy classes”, to which I responded with “Indeed…”, followed by the title of this post.

This got me thinking, though: just how extensively are unicorns used in the philosophical literature? (There’s a book to be written here, if it hasn’t already been published). To get a rough idea, I did a quick search of the Stanford Encyclopedia of Philosophy (one of my favorite resources), and found that the mythological creatures trot onto that particular stage in no fewer than twenty-nine – count ’em, 29 – different topics! Here’s a link to the list, for all of you unicorn junkies out there.

I found “Her”, Spike Jonze’s new movie, somewhat difficult to sit through. It feels too long (so little happens), and it treads a very thin line between a psychologically rich character study and a Saturday Night Live parody of a cliché romance. Also, the overall look of the film is bland, as if it were shot through a gray filter, and many scenes are too dimly lit. In fact, one key scene happens entirely in the dark, a stylistic choice I couldn’t help but see as a sign of Jonze’s embarrassment with the scene’s content. No doubt the somberness of much of the indoor photography is meant to underscore protagonist Theodore’s extremely introverted personality. But it’s overkill: Joaquin Phoenix’s spot-on Theodore needs no extra help.

Yet, despite these problems, “Her” is, to my mind, perhaps the most thought-provoking Hollywood film released this year, with the possible exception of 12 Years A Slave (which I blogged about here). I say this even though in most ways Her is the opposite of my favorite Jonze film, 2002’s Adaptation. That movie had an almost frenetic energy: it was saturated by the sub-tropical colors of South Florida, it had a very complex structure (thanks to screenwriter Charlie Kaufman), and it centered on two (or three?) eccentric protagonists, played with all the requisite bravado by Nicolas Cage and Chris Cooper. Her, on the other hand, just plods along, without much to look at (a Scarlett-Johansson-shaped video image of Samantha might have helped a lot in that respect), with the simplest possible structure, and only two substantial characters, one of which is invisible, the other of which is rarely expressive.

But what I like about “Her” is its heartfelt exploration of intimacy, an exploration that goes deeper than what is generally found in your standard relationship flick (which, I admit, is not saying much). The film raises the question of whether it would be possible to be really intimate with the user interface of an operating system (to get some sense of Johansson’s silicon-based Samantha, just imagine Apple’s Siri on both intellectual- and emotional-IQ steroids). But the film is more centrally concerned with the loss of human intimacy in our ever more technologically-mediated world, and that is an even worthier subject. Samantha and Theodore’s dialog reminds us, somewhat poignantly, of what a genuinely intimate relationship at least sounds like – something that’s sorely lacking not only from most other films, but also from many lives. The only thing missing from Theodore and Samantha’s relationship (besides a body, of course) seems at first to be any element of danger. For surely nothing could be less dangerous than a relationship with an entity pre-programmed to satisfy one’s every need. Theodore apparently need not fear that Samantha will ever leave him like his ex-wife did, but there’s the rub: how could such an apparently “failsafe” relationship ever really be fulfilling?

It’s the particular way in which the film first raises and then answers (or subverts) that question that makes it worth watching, and helps to excuse its weaknesses. Here’s the trailer–

Near the end of the film there’s an important reference to Alan Watts, the mid-20th century intellectual, ex-theologian, and pre-New Age disseminator of Asian religious traditions and metaphysical views. For those unfamiliar with Watts’ work, the brief description of him given in the film might suffice for the script’s purposes (though I doubt it). But for those at least passingly familiar with his life and work, the reference will have all sorts of rich resonances, and suggest several different levels on which to interpret the ending. The most obvious level has to do with Watts’ charismatic charm, which seems to have been accompanied by a (no doubt philosophically motivated) lack of shame. The second, slightly less obvious level rides on Watts’ trenchant criticisms of Western Culture, which he viewed as both a cause and effect of its average member’s confusion and neurosis. His prescription was, quite simply, to become enlightened in the down-to-earth, Zen sense he himself clearly sought. Finally, a third level of interpretation rides on the similarity between Samantha and Theodore that Samantha at one point says comforts her. To jump aboard this train of thought, you need to focus on Watts’ thesis that reality is, ultimately, One (a “monistic” worldview that a Buddhist need not accept). To avoid falling into didactic mode, I’ll just add that these three levels of interpretation are, I think, complementary. They leave Theodore with much more to mull over beyond the picture’s ending than just the promises and pitfalls of romantic attachments. The only problem is that the reference to Watts and the relevance of his personality and worldview are such “inside baseball” that the resonances that finally sold me on the film will probably not occur to most of the film’s audience. I’m not sure that they even occurred to the filmmakers.

If you’ve never heard an Alan Watts talk, here’s a 10-minute audio excerpt I once used in an adult enrichment class that focused on his fusion of Eastern and Western perspectives. At one point he mentions “the ceramic myth” and “the fully automatic myth” – ideas he explains earlier in the talk. By the first he just means the monotheistic story that God created the universe (much as people create ceramics). By the second he’s referring to the Newtonian view of the universe as a dumb, fully automatic machine, devoid of consciousness. In this excerpt, the two main themes he riffed on throughout his career – the mental illness of Western culture, and the metaphysical monism (supported by ecology and post-Newtonian physics) that could be part of the cure – are on full display.

I’ve blogged before about Stew (AKA Mark Stewart) – Tony award-winning playwright of the rock musical Passing Strange and accomplished singer-songwriter (check out his latest album, Making It) – but his lecture/performance at UW Oshkosh the night before last gave me another opportunity to share him with you.

The answer he gave to the question – is art necessary? – was, as you might have expected, yes… but the reasons he gave were not the usual ones. For instance, it wasn’t that cultures require art to flourish, or that art is needed to civilize the heathen soul. Rather, Stew riffed on three main themes, and I’ll just state the gist of them here, along with some of my own elaboration I don’t think he’d object to.

First, art is what people do, as people. You simply can’t be a person unless you create art, even if the only art you create is yourself. When you step into your grandma’s house, you notice – if you have any eye for it at all – that she has carefully placed keepsakes and photos on the coffee table, the shelves, etc. Her whole life is (or at least those aspects of it she cares to remember are) on display, if not for others, at least for herself. Then there’s the annual holiday card, letter, or now email that many of us send to our friends and family, updating them on our “true stories”. This is a creative act. It is art. Similarly, we’re all playwrights. Every day we choose our own costumes and dabble with our sets; we also write most of our own lines. I would add that, unlike the days when radio ruled, we’re now our own music supervisors as well, as we carry our music libraries on our phones. But – and here I’m developing Stew’s theme in a way with which he might not entirely approve – for better or worse we’re not entirely in control of the final product. We’re not the sole producers of our art, after all. Our parents, and everyone who came before us, and for that matter the entire universe, also have that honor (or should I say dubious distinction?). Nor, even if we are self-directors, do we contractually have control over the final cut. We all wander onto each other’s stages, often in the middle of productions we have nothing – or nearly nothing – to do with. Narratively this should result in relative chaos, and sometimes it does, but usually we manage to muddle through. It is, as Stew said, what we do.

Secondly, art is necessary in the sense that, paradoxical as this might sound, it keeps life real. It always, though often unintentionally, offers a critique of the status quo: the one-dimensional, black and white, reductive Grand Narratives proffered by politicians, religious leaders, and mass media marketeers. Art does this merely by reminding us of the particular, the personal, and the idiosyncratic. Impoverished art – and here’s my somewhat more Aeolian take on Stew’s relatively Ionian melody – is little more than some permutation of the status quo that the artist has perhaps unconsciously internalized and regurgitated. Impoverished art merely reflects the status quo by being overly simplistic, stereotypical, shallow, sentimental and/or sensationalistic… Sartre would call such art “inauthentic”. When impoverished art is intentionally produced, and therefore bad in addition to impoverished, there might be a temptation to write it off as prostitution – it is often done just for money, and it does similarly satisfy a consumer’s need (so perhaps even bad art is “necessary”, in a sense). But artists who intentionally produce impoverished art invest less of themselves in their work than even the most jaded prostitutes, who at least have to use their own bodies. Such artists merely pretend, without taking any chances, without revealing anything about their actual selves. More “authentic” artists also pretend, but never merely. Their pretending is not deceptive; it’s not pretense.

Not that I have anything against the occasional “guilty pleasure”… For instance, I confess to regularly watching the latest version of “Hawaii 5-0”, mainly for the scenery and, since I grew up in the Islands, its nostalgic value. Sometimes, serendipitously and for purely personal, idiosyncratic reasons, even impoverished art resonates.

Finally, art provides us with at least one half of a real friendship in a world where real friends are always rare, but grow even rarer as we age. Poets, novelists, singer-songwriters, filmmakers, and others put the best of themselves into their works; they represent themselves – or at least how they see the world – as honestly as they can. What more could you ask of true friends, except perhaps that they also show some interest in you? And these friends, unlike the flesh-and-blood kind, are never far away. There they are, under a layer of dust on your bookshelf, in your rarely opened music and movie files, undemanding, patiently waiting to be discovered or re-discovered when you most need them. Of course, just like the flesh-and-blood variety, such friends might fail to live up to expectations, or lose their attractiveness over time. But to co-opt and re-purpose Matthew 7:16- By their fruits you shall know them… not to mention yourself.

Speaking of fruits (or, less metaphorically, works), it seems fitting to end this post with the opening lines of T.S. Eliot’s “Burnt Norton”, the first of his “Four Quartets” – which Stew mentioned as being a very old friend of his, but one that he’s just now really getting to know:

Time present and time past
Are both perhaps present in time future,
And time future contained in time past.
If all time is eternally present
All time is unredeemable.
What might have been is an abstraction
Remaining a perpetual possibility
Only in a world of speculation.
What might have been and what has been
Point to one end, which is always present.
Footfalls echo in the memory
Down the passage which we did not take
Towards the door we never opened
Into the rose-garden. My words echo
Thus, in your mind.