EVENTS

Films can have an enormous emotional impact on a viewer, swaying them in ways that their intellect would oppose. I was reminded of this recently when I watched two films from the silent era, Buster Keaton’s The General (1927) and D. W. Griffith’s Birth of a Nation (1915). The latter was one of the earliest American feature films (the first was made in 1912), while the world’s very first feature film was made in Australia in 1906.

It was purely a coincidence that I happened to watch two films from the silent era so close together, because my reasons for watching each were quite different. I had always wanted to see a Buster Keaton film because I had read that he was a pioneering genius of silent film comedy. I watched Griffith’s film as part of the College Scholars Program that I help teach.

Coincidentally, both films involved the Civil War and were told from a viewpoint sympathetic to the Confederacy. The first thing that struck me about both was how modern they were in the way they told their stories. They did have obvious signs of age, such as the lack of sound, color, and special effects, and the poor quality of the film stock. But apart from these purely physical factors, the narrative structure was surprisingly familiar, with flashbacks being the only modern narrative device that was missing.

Because of the lack of spoken dialogue, the actors had to exaggerate their gestures a little in order for the viewer to get a sense of their emotions and what they were saying. But apart from that, both films were fast-moving and kept the viewer engrossed in their respective stories, despite the absence of spoken words (or perhaps because of it?).

But there the similarities ended.

The General is a comedy in which the two warring sides are just a backdrop for a simple story of a train engineer (Buster Keaton) whose girlfriend and train (named The General) are captured by the other side. The entire film deals with the engineer’s foray into enemy territory to get them both back home.

This is not a political film. The entire film could have been done with the two sides interchanged; all that would have been necessary was to switch the army uniforms. The fact that it was the Civil War was also immaterial. Any two warring factions would have served equally well. In fact, there was not a single black person in the whole film (at least as I recall). The fact that the engineer and his girlfriend were from the South seems to be due to the idea for the film coming from an actual incident in the war. This film is worth watching, if only to see how well Keaton did all the stunts himself.

Birth of a Nation, on the other hand, is a very political film, determined to drive home a very specific message. I had heard of the film before and the comments were of two kinds: (1) that it was a landmark in the development of modern film; and (2) that it was terribly racist. After seeing the film, I have to agree with both judgments.

The film (which runs a little over three hours, surprisingly long for that period) really consists of two parts. The first part starts just prior to the Civil War and deals with events leading up to its end and Lincoln’s assassination. The second part deals with the period of Reconstruction in the South immediately afterwards.

The first part opens with an idyllic portrayal of life before the Civil War, telling the stories of two large, happy white families – one from the North, the other from the South – who are friends and visit one another, and of the budding romances between the son and daughter of one family and the daughter and son of the other. The war then pits the boys against each other in battle and produces deaths in each family.

This first part of the film is not too offensive, and if the film had ended at this point there would not have been much controversy. The chief criticism that would have been leveled at it is its portrayal of all blacks as ‘happy slaves,’ either cheerfully loyal to their masters as house servants or happily working in the cotton fields and waving to the masters as they walk by. Lincoln is portrayed as a good man who did not want to seek vengeance on the South after the North’s victory.

But the second part is set entirely in the South and deals with the Reconstruction following Lincoln’s death. This is where the film’s highly disturbing treatment of race becomes manifest. This period is portrayed as a time when blacks took complete control of life in the South, shutting out white voters in elections and thus getting majorities in the legislatures. The southern whites are portrayed as a horribly oppressed people, being pushed aside by blacks in the streets and suffering various other indignities. The blacks are entirely caricatured, with white actors in blackface portraying them as lazy and drunken and evil, shuffle-dancing in the streets, lecherously leering at the demure white women, and always rubbing it in to the whites that they were now the bosses. Only the faithful house slaves stayed loyal to the whites, to the extent of rescuing them from black mobs at great peril to themselves.

The first part of the film, by showing scenes of these two loving, courteous families, with children playing and puppies and kittens frolicking, then suffering the tragedy of family members killed in the war, had already created sympathy for them in the mind of the viewer. The only black people who emerged as recognizable characters appeared in the second part, and they were two-dimensional portrayals of evil, so that the viewer had no sympathy for them at all.

But the real shocker is that the film portrays the creation of the Ku Klux Klan during Reconstruction as a response by these decent, law-abiding whites to the lawlessness created by black rule. It is started by one of the family members we have already identified with, who is appalled by the breakdown of order and merely seeking to right wrongs. The KKK’s reign of terror is not portrayed. Only one black person is shown being ‘tried’ and found guilty by the KKK, his body later dumped at the home of the evil black leader. Instead, the members of the KKK are portrayed as comic-book heroes, ‘respectable’ citizens who adopt secret identities to fight crime and injustice. Only in this case, the costumes that hide their identities are the notorious white sheets.

There is no surer way of gaining an audience’s sympathy than setting up a scene in which a plucky little band of good people (including the elderly, women, children, and pets) heroically fight overwhelming odds against an evil and faceless enemy. This is a time-tested method of swaying the viewer’s sympathies and is a staple of cowboy films. Griffith heavily exploits this towards the end of Birth of a Nation. So powerfully had the deck been emotionally stacked in favor of the white families that in the climactic scene, when the tiny group of white people is trapped in a small house and surrounded by a large number of advancing hostile black Union soldiers, I found myself rooting for their rescue, even though the rescue was going to be by the KKK.

The spell cast by Griffith was broken whenever the scene cut to show the KKK riding in to save this group, because the sight of people covered in white sheets now has an overwhelmingly negative emotional impact. But one can imagine how in 1915, just fifty years after the Civil War ended, this film could be seen as a huge propaganda coup for the KKK, showing them in an entirely positive light. Although the KKK had been dormant for some time, 1915 saw the second resurgence of the group, and the timing of that had to have something to do with the release of this film.

The fact that Griffith was able to portray a group like the KKK in such a sympathetic light is a warning about the dangerous power that films can have in shaping attitudes and sympathies. It illustrates the importance of having people realize that films and other forms of video can never be taken as the only source of knowledge. We cannot avoid the hard work of reading about and around important events, both historical and contemporary, if we are to piece together a reasonably accurate understanding of events.

POST SCRIPT: Mr. Deity returns

Mr. Deity is taken on a tour of hell by Lucifer.

For all the Mr. Deity clips, see here.


The presidential election campaign for 2008 has already started, with a whole host of declared and undeclared candidates running. George Bush’s performance seems to have persuaded people that anyone can do a better job than he has.

All the candidates face stiff hurdles in getting their respective nominations. But the reality is that almost all of them have no chance. It is not because they are not good candidates, or are incapable of being president, or have unsavory histories, but because two interrelated issues work against them right from the start.

One of those issues is the ability to raise money. It requires a lot of money to run a presidential campaign, something that everyone is aware of. The less obvious but related issue is that the media has already made a judgment about who is ‘worthy’ and capable of being president, and some of the candidates have already been written off. The coverage of their campaigns will reflect this bias against them, which will adversely affect those candidates’ ability to raise money and gain name recognition.

It is clear that the media has already chosen the following as the ‘viable’ candidates, based on nothing more than its own preferences. For the Democrats, they are Clinton, Obama, and Edwards. For the Republicans, they are Giuliani, McCain, and Romney.

The media will be either dismissive of the others, treat them as distractions, or use them as fodder to provide ‘color’ to the campaigns. For example, Michael McIntyre, in his ‘Tipoff’ column in the Plain Dealer on January 20, 2007, described Kucinich’s campaign as ‘futile.’ On what basis? He does not say. The fact is that Kucinich and Paul are the only members of Congress running for president who had the foresight to vote against the Authorization for Use of Military Force Against Iraq Resolution of 2002, the disastrous law that George Bush used to wage his illegal and immoral invasion of Iraq. But that seems to count for nothing in the minds of the media, who continue to give prominence to the politicians and pundits who have been consistently wrong on everything concerning this war. (Obama was also against the war but was not in Congress at that time.)

This is not a new phenomenon. The pack of media journalists that follows campaigns as a group has long tended to decide early on which candidates ‘deserve’ serious consideration, or are even worthy of being president, and to slant their coverage accordingly. Jonathan Schwarz describes an experience he had many years ago that illustrated to him that “the government and corporate media self-consciously see themselves as a governing elite that runs things hand in hand.”

Washington Post columnist Richard Cohen came to talk at Yale in 1988, just after I arrived. Following schmancy Yale tradition, he had tea with a small group of students and then ate dinner with an even smaller group. I weaseled my way into attending.

Gary Hart had recently flamed out in the ’88 presidential race because of Donna Rice. And at dinner Cohen told all us fresh-faced, ambitious, grotty youths this:

The Washington press corps had specifically tried to push Hart out of the race. It wasn’t because Hart had had extramarital affairs—everyone knew this was the norm rather than the exception among politicians. So Hart wasn’t at all unusual in this respect. Instead, Cohen said, it was because the press corps felt that Hart was “weird” and “flaky” and shouldn’t be president. And when the Donna Rice stuff happened, they saw their opening and went after him.

(I wish I remembered more about what Cohen said about the specific gripe of the press corps with Hart, but I don’t think he revealed many details.)

At the time, I remember thinking this:

1. How interesting that the DC press corps knows grimy details about lots of politicians but only chooses to tell the great unwashed when they decide it’s appropriate.

2. How interesting that the DC press corps feels it’s their place to make decisions for the rest of America; i.e., rather than laying out the evidence that Hart was weird, flaky, etc., and letting Americans decide whether they cared, they decided run-of-the-mill citizens couldn’t be trusted to make the correct evaluation.

3. How interesting that Cohen felt it was appropriate to tell all this to a small group of fresh-faced, ambitious, grotty Yale youths, but not to the outside world. And how interesting that we were being socialized into thinking this was normal. . . . If you’re not part of their little charmed circle, believe me, all your worst suspicions about them are true. They do think you’re stupid. They do lie to you. They do hate and fear you. Most importantly, they think you can’t be trusted with the things they know—because if you did know them, you’d go nuts and break America.

CBS News’s Dick Meyer confirms that the media often decides not to tell the public the truth about political leaders:

This is a story I should have written 12 years ago when the “Contract with America” Republicans captured the House in 1994. I apologize.

Really, it’s just a simple thesis: The men who ran the Republican Party in the House of Representatives for the past 12 years were a group of weirdos. Together, they comprised one of the oddest legislative power cliques in our history. And for 12 years, the media didn’t call a duck a duck, because that’s not something we’re supposed to do.

The situation now is not unlike that which existed earlier when Thomas Jefferson said:

Men by their constitutions are naturally divided into two parties: 1. Those who fear and distrust the people, and wish to draw all powers from them into the hands of the higher classes. 2. Those who identify themselves with the people, have confidence in them, cherish and consider them as the most honest and safe, although not the most wise depository of the public interests.

It seems clear to me that the members of the mainstream media and the political classes today tend to fall into the first group. But for a healthy democracy, it is important that we place ourselves in the second group. This is why I think citizenship requires that we not simply accept what is given to us by the media but instead be active seekers of knowledge.


In a celebrated remark in the case Jacobellis v. Ohio (1964) involving “hard core pornography”, US Supreme Court Justice Potter Stewart said that “I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description; and perhaps I could never succeed in intelligibly doing so. But I know it when I see it.”

I am becoming convinced that this is a general feature of life: questions have simple answers only when we don’t examine them too closely. Suppose, for example, that I asked the question “What is the length of my desk?” You would expect that there is a definite length to it and that there should be a straightforward way to get the answer.

At the simplest level, you could take a ruler, measure the desk, and call that the length. But is that the most accurate measurement? A ruler is, after all, a pretty coarse measuring instrument. You could get fancier and use more sophisticated devices, such as laser beams and high-precision timers, to get increasing levels of precision. But at some point you reach a limit, because at a fundamental level, thanks to Heisenberg’s Uncertainty Principle, the length of the desk is not a well-defined quantity. Although the desk looks like an object with sharp boundaries, at the sub-atomic level the atoms on its surface are quantum mechanical systems, so the edges of the desk are not sharply defined but are instead fuzzy, a blur. How do you measure the length of a blur?

At the large scales with which we normally work, we can ignore this and think of the desk as having a definite length but that is because we are not looking too closely.

For another example, although we all have a general intuitive idea about what is science and what is non-science, I have previously discussed how, when you look closely at the question, it is hard to strictly demarcate science from non-science. This is because the problem of finding necessary and sufficient conditions that demarcate one class of objects from another is very hard, and perhaps impossible.

While in everyday life we tend to be coarse-grained in our outlook, universities tend to be places where things are examined in fine-grained detail. This is partly why universities have received the label of “ivory towers.” To those outside the university, it can seem as if academics are engaged in research at a level of detail that is pointless, and the ‘ivory tower’ label is sometimes intended as an insult. But the reality is that universities are one of the few places where people try to examine things closely, to see how far we can go in defining things before we reach the limits at which they break down. While to outsiders this may seem like nitpicking, it is important work, because such fine-grained analyses can have practical consequences.

For example, most people have a clear idea of (say) what is alive and what is dead, of what is human and what is not. But those classifications are not as clear-cut as they can seem. What is considered dead for example, has changed over time, from ‘heart dead’ to ‘brain dead’ to ‘persistent vegetative state.’ Knowing the precise limits of knowledge in this area has important practical consequences. (See part 1, part 2, and part 3 of that series.)

It seems to be the case that as much as we might like to have certainty, we can never have it. At some point, we reach a level of detail where we have to make a decision, a judgment, as to what something is and what we need to do. This is why we often delegate to people (judges, doctors, academics, and other experts in each field) who have studied these things the right to make such judgments on our behalf. It is not that they are infallible and cannot be wrong, but because at least they work with an awareness of the limits of knowledge and of the ambiguities that exist at the fine-grained level.


In a recent online discussion about whether intelligent design creationism should be taught as part of science, one of the participants took exception to a statement by someone else that the theory of evolution is so well established that there was no point in allowing the inclusion of intelligent design creationism. The challenger asked, quite reasonably: “On what things is there no room for debate? Of what things are we so certain that we’re willing to close the door to possibilities? If academics allow themselves to appear dogmatic about their theories, we legitimize dogmatism. We should be careful that scientists themselves do not become the new proselytizers to claim they hold absolute truth.”

This puzzlement is not uncommon, and not unjustified. Seen from the outside, we scientists must seem as if we either cannot make up our minds as to what we know for certain and what we are unsure of, or as if we cynically shift our position for polemical advantage, sometimes arguing that evolution is a fact beyond dispute (in order to exclude intelligent design creationism as a viable competitor) while also asserting that intelligent design creationism is not scientific because it is not falsifiable. On the surface, those two positions seem inconsistent, applying different criteria to the two theories.
It is true that scientists assert that “evolution is a fact,” just as they assert that “gravity is a fact.” They also acknowledge the “theory” of evolution and the “theory” of gravity. And they also assert that ALL knowledge is provisional and subject to change.

How can all these things be simultaneously true? How can something be at the same time a fact and a theory, certain and yet subject to change? These are deep questions and ones that can lead to heated discussions since they affect deeply held core beliefs about science and religion.

These also happen to be questions that form the core of the seminar course I teach to sophomores. We discuss all kinds of things in my course, including science and religion, intelligent design, etc., and it is remarkable that in the four years I have taught it, there have been absolutely no blowups or confrontations or unpleasantness, although colleagues have told me that these very same questions have caused problems in their classes. The relative harmony of my class exists despite the fact that I know that many of my students are quite religious, from a variety of traditions, and they know that I am an atheist. These personal beliefs are not things that we keep secret, because they shed important perspectives on the discussions.

Perhaps the reason for the lack of friction is that my course starts by looking closely at science’s knowledge structure. We read Pierre Duhem, Karl Popper, Thomas Kuhn, Imre Lakatos, Larry Laudan, and other historians and philosophers of science, and see how it is that science, unlike other areas of knowledge, progresses rapidly because of the commitment of its practitioners to a paradigm in which the framework for posing and solving problems is well defined. The paradigm consists of a scientific consensus about which theory (or set of closely related theories) should be used for analyzing a problem, rules for determining what kinds of research problems are appropriate, the kinds of evidence, arguments, and reasoning that are valid, and the conditions that solutions to these research problems must satisfy if they are to be deemed satisfactory. That complex paradigmatic framework is sometimes loosely and collectively referred to as a “theory,” and students quickly realize that the popular meaning of the word “theory” as some sort of simple hypothesis or guess does not apply in the scientific realm.

As long as that paradigmatic framework (or “theory”) is fruitful and brings forth new problems and successes, it remains inviolate from challenges, and practitioners strenuously resist attempts at overthrowing it. The “theory” is thus treated and defended as if it were a “fact” and it is this that is perceived by some outside of science as dogmatism and an unwillingness to change.

But as Kuhn so persuasively argues, it is this very commitment to a paradigm that is the reason for science’s amazing success, because the scientist working on a problem defined within a paradigm can be assured a priori that it is legitimate and important, and that only skill and ingenuity stand between her and the solution. Solving such problems within a paradigm is a sign of superior skill and brings rewards to the scientist who achieves it. Such conditions ensure that scientists will persevere in the face of challenges and adversity, and it is this kind of dogged determination that has resulted in the scientific breakthroughs from which we now benefit.

Kuhn likens this commitment of scientists to a paradigm to that of an industrialist to the manufacturing process that exists to make a particular product. As long as the product is made well, the manufacturer is not going to retool the factory because of the enormous effort and costs involved. Similarly, learning how to successfully exploit a scientific paradigm involves a long period of scientific apprenticeship in a discipline and scientists are unlikely to replace a working paradigm with another one without a very good reason. Learning to work well within a new paradigm is as costly as retooling a factory, and one does not do so cavalierly but only if one is forced into it. The dogmatism of science is thus pragmatic and not ideological.

But we do know that scientific revolutions, both major and minor, occur periodically. Very few of our current paradigms have a long history. So how and why do scientific paradigms change? They occur when the dominant paradigm shows signs of losing its fruitfulness, when it fails to generate interesting new problems or runs out of gas in providing solutions. It is almost never the case that one (or even a few) unsolved problems result in its overthrow because all scientific paradigms at all times have had many unsolved problems. A few counterexamples by themselves are never sufficient to overthrow a paradigm, though they can be a contributing factor. This is the fundamental error that advocates of intelligent design creationism (IDC) make when they argue that just because evolution by natural selection has not as yet explained some phenomena, Darwin’s theory must be rejected.

To be taken seriously, a new paradigm must also promise to be more fruitful than its predecessor, open up new areas of research, and promise new and interesting problems for scientists to work on. It does that by postulating naturalistic mechanisms that make predictions that can be tested. If it can do so and the predictions turn out to be successful, the commitment to the existing paradigm can be undermined, and the process begins by which the paradigm may be eventually overthrown. IDC has never come even close to meeting this requirement.

Some people have challenged the idea that scientific theories have to have as necessary conditions that they be naturalistic and predictive, arguing that insisting they be so is to impose dogmatic methodological rules. But the requirement that scientific theories be naturalistic and predictive are not ad-hoc rules imposed from outside. They follow as a consequence of needing the paradigm to be able to generate new research programs. How could it be otherwise?

This is why IDC, by pointing to a few supposedly unsolved problems in evolutionary theory, has not been able to convince the biology community of the need to change the way it looks at things. Intelligent design creationism provides no mechanisms, makes no predictions, and has produced no new research.

When we discuss things in the light of the history of science, the students in my class understand why science does things the way it does, why it determinedly holds on to some theories while being willing to abandon others, and that this process has nothing to do with dogma in the traditional religious sense. Religious dogma consists of a commitment to an unchanging core set of beliefs. Scientific “dogma” (i.e. strong commitment to a paradigm and resistance to change) is always provisional and can under the right conditions be replaced by an equally strong commitment to a new “dogma.”

Almost all my students are religious in various ways, and while some find the idea of IDC appealing, they seem to have little difficulty understanding that its inability to enter the world of science is not a question of it being right or wrong, but is because of the nature of science and the nature of IDC. IDC simply does not fit into the kind of framework required to be a fruitful scientific theory.


The willingness of our so-called intellectuals to use fiction as a basis for justifying barbaric policy decisions is truly astounding.

I have written before about how people who should know better (and probably do) continue to invoke the TV program 24 to justify the use of torture, because its main character routinely uses it to extract information from captives. It should come as no surprise that the creator of that program, Joel Surnow, describes himself as a “Bush fan” and plans to continue using torture in the show’s plots, even though people who conduct interrogations professionally say that such practices are actually harmful.

In a recent New Yorker article (“Whatever It Takes” by Jane Mayer, February 19, 2007), some senior army interrogators and trainers of soldiers tried to get the program to stop pushing this idea so much, because it was giving army recruits the wrong idea of what kinds of interrogation techniques work, let alone are legal.

This past November, U.S. Army Brigadier General Patrick Finnegan, the dean of the United States Military Academy at West Point, flew to Southern California to meet with the creative team behind “24.” Finnegan, who was accompanied by three of the most experienced military and F.B.I. interrogators in the country, arrived on the set as the crew was filming. At first, Finnegan—wearing an immaculate Army uniform, his chest covered in ribbons and medals—aroused confusion: he was taken for an actor and was asked by someone what time his “call” was.

In fact, Finnegan and the others had come to voice their concern that the show’s central political premise—that the letter of American law must be sacrificed for the country’s security—was having a toxic effect. In their view, the show promoted unethical and illegal behavior and had adversely affected the training and performance of real American soldiers. “I’d like them to stop,” Finnegan said of the show’s producers. “They should do a show where torture backfires.”

Finnegan told the producers that “24,” by suggesting that the U.S. government perpetrates myriad forms of torture, hurts the country’s image internationally. Finnegan, who is a lawyer, has for a number of years taught a course on the laws of war to West Point seniors—cadets who would soon be commanders in the battlefields of Iraq and Afghanistan. He always tries, he said, to get his students to sort out not just what is legal but what is right. However, it had become increasingly hard to convince some cadets that America had to respect the rule of law and human rights, even when terrorists did not. One reason for the growing resistance, he suggested, was misperceptions spread by “24,” which was exceptionally popular with his students. As he told me, “The kids see it, and say, ‘If torture is wrong, what about “24”?’ ” He continued, “The disturbing thing is that although torture may cause Jack Bauer some angst, it is always the patriotic thing to do.”

Gary Solis, a retired law professor who designed and taught the Law of War for Commanders curriculum at West Point, told me that he had similar arguments with his students. He said that, under both U.S. and international law, “Jack Bauer is a criminal. In real life, he would be prosecuted.” . . . The third expert at the meeting was Tony Lagouranis, a former Army interrogator in the war in Iraq. He told the show’s staff that DVDs of shows such as “24” circulate widely among soldiers stationed in Iraq. Lagouranis said to me, “People watch the shows, and then walk into the interrogation booths and do the same things they’ve just seen.”

But Surnow does not care for the testimony of experts in interrogation because, like his hero George W. Bush, what matters is what he feels in his gut: “We’ve had all of these torture experts come by recently, and they say, ‘You don’t realize how many people are affected by this. Be careful.’ They say torture doesn’t work. But I don’t believe that.”

So we have TV program creators helping to create a mindset in the country where illegal and immoral acts are considered just fine. When combined with media commentators and academics who also advocate barbaric acts, it is depressing but perhaps not surprising that there is little outcry when we hear of the torture of people held in the war on terror.

As Austin Cline points out in his essay Medicalizing torture and torturing medicine, the widening rot produced by encouraging and condoning torture extends to the medical profession. Torture cannot take place without the complicity of doctors, nurses, and other medical personnel, who have to treat the tortured and hide the evidence that torture has occurred. Although the recent revelations about conditions at Walter Reed hospital had nothing to do with torture, Cline points out that those conditions could not have escaped notice for so long without the complicity of medical personnel as well, and he argues that this is due to a public mindset that is becoming increasingly comfortable with people being dehumanized.

Once we shrug our shoulders at people being tortured and rationalize it by saying that they would be treated worse by other countries, it is not that far a step to view mistreated hospital patients as whiners who should be grateful for what they get rather than complain about what they don’t get.


In yesterday’s post, I classified the appreciation of films according to four levels. At the lowest level is just the story or narrative. The next level above that is some message that the director is trying to convey and which is usually fairly obvious. The third level is that of technique, such as the quality of dialogue and acting and directing and cinematography and sound and lighting. And then there is the fourth and highest level, which I call deep meaning or significance, where there is a hidden message which, unlike the message at the second level, is not at all obvious but which has to be unearthed (or even invented) by scholars in the field or people who have a keen sensitivity to such things. I classified people whose appreciation does not get beyond the first two levels as low-brow.

The same classification scheme can be applied to books, especially fiction. In recent years I have started reading mostly non-fiction, but when it comes to fiction, I am definitely low-brow. To give an example of what I mean, take the novels of Charles Dickens. I like them because the stories he weaves are fascinating. One can enjoy them just for that reason alone. The second level meanings of his books are also not hard to discern. Many of his books were attempting to highlight the appalling conditions of poor children at that time or the struggles of the petite bourgeoisie of England. That much I can understand and appreciate.

What about his technique, the third level that I spoke of? The fact that I (and so many others over so many years) enjoy his books means that his technique must be good, but I could not tell you exactly what his technique is. It is not that I am totally oblivious to technique. His habit of tying up every single loose end at the conclusion of his books, even if he has to insert extraordinary coincidences involving even minor characters, is a flaw that even I can discern, but this flaw of structure is not fatal enough to destroy my enjoyment of the work.

There is probably a fourth level to Dickens that scholars have noticed but which I will never discover by myself. Here we get into the writer’s psyche: whether certain characters reflect Dickens’s own issues with his family’s poverty, his father’s time in a debtor’s prison, his relationship to his mother, and so on. This is where really serious scholars of Dickens come into their own, mining what is known of his life to discover the hidden subtext of his novels.

My inability to scale these heights on my own is the reason why there are some writers who are said to be geniuses whom I simply cannot appreciate. Take William Faulkner. I have read his novels The Sound and the Fury and As I Lay Dying and his short stories A Rose for Emily and Barn Burning, but I just don’t get his appeal.

In fact, I find his writing sometimes downright annoying. At the risk of incurring the wrath of the many zealous Faulkner fans out there, I think that Faulkner does not play fair with his readers, deliberately misleading them for no discernible reason. In The Sound and the Fury, for example, he abruptly keeps switching narrators on you without warning, each with their own stream of consciousness, but you soon get the hang of that and can deal with it. What really annoyed me was that he gives two characters the same name even though they are of different genders and different generations, a fact that is not revealed until the very end. Since this character is central to the story and is referred to constantly by the different narrators, I was confused pretty much all the way through as to what was going on, since I had naively assumed that the references were to the same person, and the allusions to that person did not fit any coherent pattern. As a result, I found it hard to make sense of the story, and that ruined it for me. I could not see any deep reason for this plot device other than to completely confuse the reader. I felt tricked at the end and had no desire to re-read the book with this fresh understanding in mind.

This is not to say that writers should never misdirect their readers but there should be good reasons for doing so. I grew up devouring mystery fiction and those novels also hide some facts from their readers and drop red herrings in order to provide the dramatic denouement at the end. But that genre has fairly strict rules about what is ‘fair’ when doing this and what Faulkner did in The Sound and the Fury would be considered out of bounds.

More sophisticated readers insist to me that Faulkner is a genius for the way he recreates the world of rural Mississippi, the people and places and language of that time. That may well be true but that is not enough for me to like an author. When my low-level needs of story and basic message are not met, I simply cannot appreciate the higher levels of technique and deep meaning. Furthermore, there is rarely a sympathetic character in his stories. They all tend to be pathological and weird, which makes it even harder to relate to them.

I had similar problems with Melville’s Moby Dick. For example, right at the beginning there are mysterious shadowy figures who board the ship and enter Captain Ahab’s cabin, but they are never seen again, even though it does not appear that they left the ship before its departure. What happened to them? What was their purpose? And what do all the details about whaling (which make the book read like a textbook on the whaling industry) add to the story? Again, the main characters were kind of weird and unsympathetic, and I finished the book feeling very dissatisfied.

James Joyce’s Ulysses seems to me to be a pure exercise in technique and deep meaning that is probably a delight for scholars to pick through and interpret and search for hidden meanings, but that kind of thing leaves me cold. I simply could not get through it, and I also failed miserably with A Portrait of the Artist as a Young Man.

Gabriel Garcia Marquez in his book Love in the Time of Cholera pulls a stunt similar to Melville. His opening chapter introduces some intriguing and mysterious characters who then disappear, never to appear again or be connected with the narrative in even the most oblique way. I kept expecting them to become relevant to the story, to tie some strands together, but they never did and I was left at the end feeling high and dry. Why were they introduced? What purpose were they meant to serve? Again, people tell me that Marquez is great at evoking a particular time and place, and I can see that. But what about the basic storytelling nature of fiction? When that does not make sense, I end up feeling dissatisfied.

I also have difficulty with the technique of ‘magic realism’ as practiced by Marquez in his One Hundred Years of Solitude and Salman Rushdie in The Satanic Verses. In this genre you have supernatural events, like ghosts appearing and talking to people, or people turning into animals and back again, and other weird and miraculous things, and the characters in the books treat these events as fairly routine and humdrum. I find that difficult to accept. I realize that these things are meant to be metaphors and deeply symbolic in some way, but I just don’t get it. These kinds of literary devices simply don’t appeal to me.

This is different from (say) Shakespeare’s plays, which I do enjoy. He, too, invokes ghosts and spirits in some of his plays, but these things are easily seen as driving the story forward, so it is easy to assimilate their presence. Even though I don’t believe in the existence of the supernatural, the people of his time actually believed in those things, and the reactions of the characters in his plays to the appearance of these ghosts and fairies seem consistent with their beliefs. But in a novel like The Satanic Verses, which takes place in modern times, to have a character turn into a man-goat hybrid and back into a man again, with the other characters responding with only mild incredulity and not contacting the medical authorities, seems a little bizarre.

I would hasten to add that I am not questioning the judgment of experts that Faulkner and Melville and Joyce and Marquez and Rushdie are all excellent writers. One of the things about working at a university is that you realize that the people who study subjects in depth usually have good reasons for their judgments and that they are not mere opinions to be swept aside just because you happen to not agree with them. One does not go against an academic consensus without marshalling good reasons for doing so and my critiques of these writers are at a fairly low level and come nowhere close to being a serious argument against them. What I am saying is that for me personally, a creative work has to be accessible at the two lowest levels for me to enjoy it.

I think that there are two kinds of books and films. On the one hand, there are those that can be enjoyed and appreciated by low-brow people like me on our own; on the other, there are those that are best appreciated when accompanied by discussions led by people who have studied those books and authors and films and directors and know how to deal with them on a high level.


Although I watch a lot of films, I realized a long time ago that my appreciation of films (or plays or books or concerts) was decidedly at a ‘low brow’ level. To explain what I mean, it seems to me that there are four levels at which one can appreciate a film (or play). At the lowest level is just the story or narrative. The next level above that is some message that the writer or director is trying to convey, which is usually fairly obvious. People whose appreciation does not get beyond these two levels are those I call low-brow. And I am one of them.

But I am aware there are higher levels of appreciation and criticism that can be scaled. The third level is that of technique, such as the quality of writing and things like acting and directing and cinematography and sound and lighting. And then there is the fourth and highest level, which I call deep meaning or significance, where there is a hidden message which, unlike the message at the second level, is not at all obvious but which has to be unearthed (or even invented) by scholars in the field or people who have a keen sensitivity to such things.

I almost never get beyond the first two levels. In fact, if the first level does not appeal to me, then no level of technique or profundity will rescue the experience. This does not mean that the items in the third level do not matter. They obviously are central to the enjoyment of the experience. It is just that I rarely notice the third level items unless they are so bad that it ruins the storytelling aspect. If the dialogue or acting (for example) is really rotten, then I will notice it but if I don’t notice these things at all, then it means that they were good.

But I don’t even consider these things unless the first two levels are satisfactory. If the first two levels are bad, nothing at the higher levels can salvage the experience for me. I never leave a film saying things like “The story was awful but the camerawork was excellent.”

As an example, I really enjoy Alfred Hitchcock’s films and have seen nearly all of them, many multiple times. But I just enjoy the way he tells the stories. Since I enjoy reading about films after I have watched them, I often find people pointing out subtle effects of technique such as how he uses lighting or sets up a camera angle or how he creates a mood, and so on. While I enjoy having these things pointed out to me, I would never notice them on my own.

The same thing holds with the music soundtrack. When friends tell me that they enjoyed the soundtrack of a film that is not a musical, my usual response is “what soundtrack?” The only films in which I notice the soundtrack are those in which there are obvious songs, such as in (say) The Graduate or Midnight Cowboy, the latter having a wonderful theme song, Everybody’s Talkin’ by Harry Nilsson, and a beautifully haunting harmonica score that so pervades the film that even I noticed it.

The same happens with the fourth level of analysis, which is even more inaccessible to me. Just recently I read that in several of Hitchcock’s films, he was exploring homosexual themes. I had no idea and would never have figured that out on my own. While I have no talent for exploring these deeper levels of meaning, I appreciate the fact that there are people who can do so and are willing to share that knowledge. Reading them and talking about films with such knowledgeable and keenly observant people is a real pleasure.

I once had pretensions to ‘higher criticism’ (which deals with the third and fourth levels) myself, but that ended one day when it became dramatically obvious that I had no clue how to do it. It was in 1975 when I watched the film If. . . (1968) by director Lindsay Anderson. I like Anderson’s films a lot. He creates strange and quirky films that deal with class politics in Britain, such as This Sporting Life (1963) and O Lucky Man! (1973). The last of these has an absolutely brilliant soundtrack, and I noticed it because it consists of songs sung by British rocker Alan Price, and he and his group periodically appear in the film to sing them, so you can’t miss the music. It was so good that it is one of the rare film soundtracks I bought on CD.

Anyway, my friends and I watched If. . . and we noticed that while most of the film was in color, some of the scenes were in black and white. We spent a long time afterwards trying to determine the significance of this, with me coming up with more and more elaborate explanations for the director’s intent, trying to make my theories fit the narrative. By an odd coincidence, soon after that I read an article that explained everything. It said that while making the film, Anderson had run low on money and had had to complete shooting with cheaper black and white film. Since films are shot out of sequence, the final product had this mix of color and black and white footage. That was it, the whole explanation, making laughable my elaborate theories about directorial intent. It was then that I gave up on the higher criticism, realizing that I would simply be making a fool of myself.

There are some films that are self-consciously technique-oriented, and I can appreciate them as such. For example Memento and Mulholland Drive are films that are clearly designed by the director to have the viewer try and figure out what is going on. They are like puzzles and I can enjoy them because they are essentially mystery stories (one of my favorite genres) in which the goal is to determine the director’s intent and methods used. Both films were a lot of fun to watch and grapple with.

But except in those special cases, I leave ‘higher criticism’ to those better equipped to do so. That is the nice thing about creative works of art. One can appreciate and enjoy them at so many different levels and each viewer or reader can select the level that best suits them.

Next: A low-brow view of books.


Former governor of Massachusetts Mitt Romney has declared himself a candidate for the Republican nomination for president in 2008. I argued earlier that Romney’s religion (he is a Mormon) should be immaterial to whether he is qualified to be President.

But at a recent campaign event, he was challenged by someone who called him a “pretender” because as a Mormon he did not believe in Jesus Christ. Instead of answering that a person’s faith was a private affair that did not belong in the public sphere and closing the discussion on that topic, Romney responded that “We need to have a person of faith lead the country.”

Obviously I disagree with that, but it also strikes me that Romney has opened up a can of worms, because once you say that faith is necessary for being president, you have to deal with the issue of what kinds of faiths are allowed. This means that questions about the suitability of a person’s faith can become part of the political discussion. What about Islam or Hinduism or Buddhism? Are believers in those religions considered ‘persons of faith’? What about a person who has faith in tree spirits or voodoo or Satan or the Flying Spaghetti Monster? Are those faiths good enough? There is no question at all that the leaders of al Qaeda are ‘persons of faith’ by any reasonable definition of the term. So are their faiths acceptable?

A good question to ask Romney, which has been made legitimate by his response, is what criteria he uses to determine what constitutes an appropriate faith. Of course, neither the candidates nor the media are going to discuss these kinds of questions because it would be too awkward. They know that there is no answer that can be given that does not (at best) contradict the US Constitution’s provision that religious beliefs cannot be a test for public office or (at worst) come across as rank bigotry. Both the media and candidates tend to use ‘person of faith’ as code for ‘people just like us.’

It’s become the vogue for politicians to make their religious beliefs, their “faith,” central parts of their campaigns. If they do so, it’s quite fair for people to take a look at just what those beliefs are.

Romney says only a “person of faith” can be president. Plenty of people are going to say they don’t want a Mormon to be president. Is this bigotry, an objection to belief (or lack), or both?

Want to make personal religious beliefs a central issue in politics? Fine, bring it on. You guys can fight it out.

“We need to have a person of faith lead the country.”

“We need to have a Christian to lead the country.”

“We need to have a member of the Reformed Baptist Church of God, Reformation of 1915 to lead this country.”

Where’s the line?

The phrase ‘person of faith’ has come to mean someone who believes in some supernatural entity, but more importantly, believes in things similar to what you believe. In particular, it is used to signify a particular stance on certain moral issues.

For example, in response to an earlier post on this topic, a reader emailed me the following:

To me, without a presumption of a Divine Creator, objective morality is impossible. How will we judge anyone, if their retort is effectively, “my behavior might not seem right to you, but it’s right for me”? Unless we can state some objective ground for morality, all our law goes out the window, and anarchy must result. That conclusion seems inevitable.

I am always puzzled by the assertion that belief in a god leads to an ‘objective morality.’ How can that be squared with the blatant contradictions that are so easily observable? After all, we have all kinds of different religions that believe in a ‘Divine Creator’ and yet they all have different moralities. We even have within religions (be they Christianity, Judaism or Islam) different moralities even though they claim to believe in the same version of god. And even within each tradition morality has changed with time, so that what Christians and Jews and Muslims consider moral now is quite incompatible with what was considered moral in the past. To belabor the obvious, owning slaves and the ritual sacrificing of animals was considered quite moral at one time.

Rather than belief in a Divine Creator being the basis of morality, it seems pretty clear that people use their ideas about morality to decide what version of god they would like to believe in. In other words, ideas about morality are prior to belief in a god.

This idea that belief in a god is the basis of morality is so obviously contradicted by the facts that I can only conclude that this is an example of the power of religion to blind people to logic and reason.


Readers will recall that Dayton, TN was where the celebrated Scopes trial on the teaching of evolution was held back in 1925. Well, that state is still fighting against the teaching of evolution.

The latest effort is chronicled in the newspaper the Nashville Post, which reports on a resolution proposed by State Sen. Raymond Finney (R-Maryville). The senator, a retired physician, clearly thinks he has come up with a clever way of putting the state’s Department of Education on the spot, presumably because they teach evolution without mentioning god. So Finney is asking the Senate to endorse certain questions that he would like to pose to the Department of Education. The department has to provide a response by January 15, 2008.

A Tennessee State Senate member has filed a resolution asking the Tennessee Department of Education to address a few basic questions about life, the universe and all that:

(1) Is the Universe and all that is within it, including human beings, created through purposeful, intelligent design by a Supreme Being, that is a Creator?

Understand that this question does not ask that the Creator be given a name. To name the Creator is a matter of faith. The question simply asks whether the Universe has been created or has merely happened by random, unplanned, and purposeless occurrences.

Further understand that this question asks that the latest advances in multiple scientific disciplines – such as physics, astronomy, molecular biology, DNA studies, physiology, paleontology, mathematics, and statistics – be considered, rather than relying solely on descriptive and hypothetical suppositions.

If the answer to Question 1 is “Yes,” please answer Question 2:

(2) Since the Universe, including human beings, is created by a Supreme Being (a Creator), why is creationism not taught in Tennessee public schools?

If the answer to Question 1 is “This question cannot be proved or disproved,” please answer Question 3:

(3) Since it cannot be determined whether the Universe, including human beings, is created by a Supreme Being (a Creator), why is creationism not taught as an alternative concept, explanation, or theory, along with the theory of evolution in Tennessee public schools?

If the answer to Question 1 is “No” please accept the General Assembly’s admiration for being able to decide conclusively a question that has long perplexed and occupied the attention of scientists, philosophers, theologians, educators, and others.

I am always happy to help people out. So in the spirit of pure charity, I offer, free of charge, to the Tennessee Department of Education the answers to the senator’s questions.

1. This is a question that cannot be answered scientifically. (This answer corresponds to his option of “This question cannot be proved or disproved” but I changed it slightly because his wording is awkward since you cannot prove or disprove a question.) So following the senator’s flow chart, we move on to question 3.

2. Not applicable

3. Because creationism is not science, it should not be taught in science classes.

No need to thank me, Senator Finney and the Tennessee Department of Education. I am happy to oblige.


It had to happen some time. I have written before about how most people’s knowledge of the Bible is a CliffsNotes version, just the sketchiest of outlines of what it says. This is convenient because it enables each group or individual within Christianity and Judaism to pretty much adopt any lifestyle and morals and values and claim that it is how god would want them to live.

But in actual practice there are some restrictions. In contemporary America, a consensus has grown up that to be religious means at the very least avoiding drunkenness and profanity and promiscuous sex. Dressing nicely, going to church on Sundays, being polite and nice to others, and shaking hands with strangers in the pews are highly recommended. This has to be limiting to people who like to think of themselves as ‘real’ men and want to drink and swear and run around but still want to be considered Christian. Such people are worried that Christianity is becoming a religion for wusses.