I recently finished reading the book Subliminal: How Your Unconscious Mind Rules Your Behavior by Leonard Mlodinow. It is an excellent summation of the best scientific research on the subject of the unconscious. Mlodinow analyzes and organizes the material beautifully, and he also shares personal anecdotes for clarification and levity. For those unfamiliar with his past work, Mlodinow is a theoretical physicist who has published numerous books on a variety of subjects, worked alongside Stephen Hawking, and written for popular television series, including Star Trek: The Next Generation.

Subliminal is a wonderful book in its own right, but I like it for another reason. Whether or not Mlodinow is even aware of the fact, the book provides scientific and empirical support for psychological ideas advanced by Friedrich Nietzsche in the 19th century. For example, the book presents evidence to support Nietzsche’s claims on everything from the illusion of free will to the will to power (though it does not name this idea explicitly). I am especially fond of Mlodinow’s chapter on “Feelings,” which explores the role that our unconscious feelings play in our choices and actions. This role is indeed a greater one than that played by our conscious, rational faculties. Our reasoning and thoughts are always just the post-hoc justifications for our behaviors, never the true motivations. As Nietzsche said, “Thoughts are the shadows of our feelings—always darker, emptier, simpler.”

Take the following passage from the “Feelings” chapter:

We ask ourselves or our friends questions like “Why do you drive that car?” or “Why do you like that guy?” or “Why did you laugh at that joke?” Research suggests that we think we know the answers to such questions, but really we often don’t. When asked to explain ourselves, we engage in a search for truth that may feel like a kind of introspection. But though we think we know what we are feeling, we often know neither the content nor the unconscious origins of that content. And so we come up with plausible explanations that are untrue or only partly accurate, and we believe them. Scientists who study such errors have noticed that they are not haphazard. They are regular and systematic. And they have their basis in a repository of social, emotional, and cultural information we all share.

This brings up issues similar to those I have discussed previously in “Hume, Kael, and the Role of Subjectivity in Criticism.” Just as with the questions Mlodinow asks above, the question of why a person likes a particular film or artwork is also always answered with a convenient narrative rather than an honest account or an objective reason. This is why criticism is subjective and why objectivity is an illusion. This is also the meaning behind Stanley Kubrick’s statement: “The test of a work of art is, in the end, our affection for it, not our ability to explain why it is good.” To quote Nietzsche again: “It is hard enough to remember my opinions, without also remembering my reasons for them!”

This is not to discount criticism, of course. I offer a solution to this supposed discrepancy in my aforementioned “Subjectivity” essay. But if you are at all curious how your unconscious affects your aesthetic judgments, or if you would like a greater understanding of just how deeply your unconscious governs your behavior and shapes your identity, I wholeheartedly recommend Mlodinow’s Subliminal.

The first thing we notice is the noise: loud machinery, clanking metal, grinding chains. Then we catch abstract glimpses of the moving parts—and, for brief seconds, the sight of the dark ocean crashing below. But we can’t seem to catch our bearings. The camera is purposefully disorienting us, unsettling us. And it only gets worse from this point forward.

The soundtrack will soon give way to the wet scaly slaps of dying fish, the rattle of cracked shells, the gurgles of submersion, and the prehistoric calls of ravenous gulls. The visuals will move somewhat rhythmically between machines and flesh, metal and viscera. (One may easily be reminded of mid-90s Nine Inch Nails music videos.) This is Leviathan, a captivating documentary by Lucien Castaing-Taylor and Verena Paravel of the Sensory Ethnography Lab at Harvard University.

In regard to theme, narrative, or even setting, we have no firm footing. We are on a fishing vessel, but we might as well be on another planet. The voices of the crew sound alien. Their faces are the only evidence that they are human. And they are our only respite from the dripping blood, the dancing fish heads, the bulging eyeballs. Indeed, the animals look horrifically distorted and bloated, like demons out of Hieronymus Bosch. The aforementioned birds, in flight against the black sky, recall both the Ride of the Valkyries from Wagner and the flight of dancing spirits in the Night on Bald Mountain sequence of Disney’s Fantasia. This should give you an idea of the film’s overall tone, as neither reference supplies much comfort.

Leviathan opens with an epigraph from the Book of Job, and it ends with a credit reel that lists the scientific names of the depicted species. The significance of these details, if any, is left for the viewer to decide. Some have read Leviathan as a parable about the viciousness of humanity against the environment, which it rapes and wastes with abandon, its hulking fishing vessels being construed as the true “Leviathan” of the title. There is perhaps good evidence to support this reading. However, I think that the film is better experienced with no such narrative in mind. It should be felt viscerally, like a psychological horror movie that creeps under your skin like botfly larvae. As already mentioned, it uses the frequent disorienting cinematographic effects typical of films in that genre, and the audio track embodies the very essence of foreboding disquiet. On top of this, a few scenes of systematic butchering are certainly unnerving for anyone who has seen slasher films like The Texas Chainsaw Massacre.

A close relative to Leviathan is Werner Herzog’s Lessons of Darkness, a film that presents Kuwaiti oil fires as alien phenomena. Both films offer us an alternative view of the world we think we know so well, and both make no attempt to shield us from the horror that runs so close to the surface of all that we do, breaching it here and there like starfish limbs through a fish net. But Leviathan does it better. It’s truly an astonishing and unforgettable work. Let it wash over you; let it nauseate you and stir up your unconscious fears. Maybe you’ll enjoy it as much as I did.

Once a week, Criticwire asks a group of film critics a question and compiles their responses. This week’s Criticwire Survey seems to have caused a bit of a stir. Here is the question posed by Matt Singer:

What movie widely regarded as a cinematic masterpiece do you dislike (or maybe even hate)?

This question and its responses were promoted under the incendiary headline: “Overrated Masterpieces.” Needless to say, this provoked some outrage, both in the comments and across the web. Only one critic, Glenn Kenny, appears to have left the proceedings unscathed. The reason for this is that he refused to name a film:

I find this question especially dispiriting, as it’s really just a form of bait, and a cue for individuals to come up with objects to snicker at, feel superior to, and all that. I’m sure many critics will have a blast with it.

Kenny follows this with a passage from Richard Hell’s autobiography where Hell writes of an encounter with Susan Sontag in which she laments the fact that she has opinions because, as Hell puts it, “opinions will solidify into prejudices that substitute for perception.”

First of all, I would argue that Kenny himself is using this opportunity to “snicker at” and “feel superior to” his fellow critics. Second, I would argue that the point of this particular survey is to counter popular opinions that may have solidified into prejudices, not the other way around. Finally, I think that it is A. O. Scott who is being “glib” in his dismissal of the exercise as “pseudo-contrarianism.”

Each individual critic (Kenny included) will have points of divergence from the critical community to which he or she belongs. This is only natural; individuals have individual tastes (e.g., likes and dislikes) based on individual life experiences. But here is an unsettling fact: many people will accept that certain films are sacred—sometimes irrationally and without having actually seen them—for the single reason that the films have been blessed with critical approval and labeled masterpieces. The critics who answered the Criticwire Survey are simply challenging this automatic acceptance, some even going so far as to offer rational and articulate defenses of their opinions (the opposite of pseudo-contrarianism, I would say).

Film critics are not the only ones who court this kind of outrage. Consider the opening of a recent piece by the food writer Ramsden:

The full English breakfast is the most overrated of British dishes – even the name is shuddersome. How did we become shackled to this fried fiasco?

Just as with the Criticwire Survey (and perhaps again due to the word “overrated”), Ramsden experienced a lot of backlash. He felt compelled to write a response (published only a day after the Criticwire Survey): “Which well-loved foods do you hate?” In this piece, we learn that Ramsden received accusations similar to those received by the film critics. For example, he, too, was accused of trolling (maybe by the A. O. Scott of the British food blogging world). However, Ramsden understands where the attacks are coming from:

I understand it because I’ve felt it too. It is perhaps not a rational reaction to a subjective aversion […], but we feel strongly about food and are thus oddly offended by someone vehemently opposing that which we cherish.

Yes, and people apparently feel strongly about film as well and will oppose subjective aversions to well-loved films with equal vehemence and irrationality. Ramsden, after providing a long list of similar aversions from some notable chefs and food critics, ends his piece by stating:

The common denominator with all of these dislikes is the mutual conviction that the other person is a loon, even a heretic. There are certain aversions – anchovies, haggis, balut, kidneys – that are entirely understandable (you don’t often hear cries of “you don’t like kimchi?!” except perhaps in certain foodish circles), but when it comes to dissing curry, fish and chips, pasta, or indeed a fry-up, it turns out people are, at best, going to think you very odd indeed. Still, can’t blame a man for trying.

Glenn Kenny chose not to name a film on which his opinion differs from that of the masses. Does that mean he holds no such opinion? That no such film exists? Hardly. As I said, he used this opportunity to elevate himself above his fellow critics under the pretense that criticism has loftier goals than this sort of muckraking. I think that he just didn’t want to get his hands dirty. I prefer the “loons” and the “heretics” who are unafraid of their own subjectivity. On a related note, I believe that Pauline Kael would have loved this week’s Criticwire Survey. Especially the word “overrated.”

Once again, I feel compelled to address some claims made by the art critic Jonathan Jones at The Guardian. This time, Jones has written a piece attacking Banksy. This in itself is not the problem. The problem is that the attack makes very little sense under close examination.

Here is the crux of Jones’s argument:

Some art can exist just as well in silence and obscurity as on the pages of newspapers. The Mona Lisa is always being talked about, but even if no one ever again concocted a headline about this roughly 510-year-old painting it would still be as great. The same is true of real modern art. A Jasper Johns painting of a network of diagonal marks surrounded by cutlery stuck to the frame, called Dancers On a Plane – currently in an exhibition at the Barbican – was just as real, vital and profound when it was hidden away in the Tate stores as it is under the gallery lights. Johns does not need fame to be an artist; he does not even need an audience. He just is an artist, and would be if no one knew about him. Banksy is not an artist in that authentic way.

I strongly disagree that art can exist in a vacuum; I think it needs an audience to be art. Thus, I find absurd the statement that Jasper Johns “does not even need an audience” to be an artist. How would that work, exactly? It wouldn’t. Jones is simply presupposing a metaphysical reality in which art possesses inherent value independent of humans, a presupposition he leaves entirely unsupported. How can a work remain profound if no one is around to bestow the value of profundity upon it? And does it not take a human mind to transform Jasper Johns’s “network of diagonal marks surrounded by cutlery stuck to the frame” into a cohesive whole? Truly, then, one cannot dismiss Banksy on the grounds that his work demands an audience. All art does.

Another problem that I have with Jones’s argument is that he takes the properties that make Banksy aesthetically interesting to most people and transforms them into Banksy’s aesthetic shortcomings:

Banksy, as an artist, stops existing when there is no news about him. Right now he is a story once again, because a “mural” by him (street art and graffiti no longer suffice to describe his pricey works) has been removed from a wall and put up for auction. Next week the story will be forgotten, and so will Banksy – until the next time he becomes a headline.

Part of Banksy’s “art” is in the impermanence of his pieces and in the confrontational nature of his “murals” that are designed to disrupt people from their daily routines to make them stop and notice something, to see things differently. Perhaps comparisons to static pieces like the Mona Lisa are not the best means to understand performance-based work of this nature (though I admit that because the art market has laid claim to Banksy, such comparisons are not necessarily off base, either).

But “street art” is hardly the first recognized art form to be temporary and confrontational in the manner adopted by Banksy. And why does Jones consider fame and branding to be faults or weaknesses of the artist? These attributes were obviously as essential in solidifying the legacies of the artists whom Jones admires as they were in elevating Banksy above his peers.

Jones claims that he wants “art that is physically and intellectually and emotionally real.” Unfortunately for him, as his blog on Banksy makes clear, he seems to have no idea what that even means.

An interesting question has been making the rounds in certain critical circles since the release of Kathryn Bigelow’s Zero Dark Thirty this past December. And I’m not talking about the question of whether or not the film endorses torture (it doesn’t). I’m talking about the broader question that has been phrased this way by Danny Bowes at Movie Mezzanine:

[…] is a critic under any obligation to render a moral judgment on a film?

After pointing out that the debate extends beyond Zero Dark Thirty to films like Django Unchained and Beasts of the Southern Wild, Bowes states:

With each of these films, critics praising the aesthetics of each have been accused of ignoring, rationalizing, or even siding with offensive content therein. In response, critics have been forced into a “no I do not” defensive posture, and a great deal of huffiness about art for art’s sake and the primacy of the work over the given critic’s personal beliefs and austere objectivity and so forth has ensued.

In the past, I would have agreed with the l’art pour l’art critics who claim that they can separate their personal beliefs from their aesthetic evaluations of a given film and adopt an “objective” or an “impersonal” position from which to judge the work in question. But not anymore. It is now my understanding that an aesthetic judgment is inseparable from a moral judgment, and vice versa. I think that Bowes agrees:

Every act of criticism is a moral judgment, and not in a glib, media-trolling, mid-’60s Jean-Luc Godard way, either. However objective any critic tries to be in evaluating any work, the evaluation is being conducted by a matrix of observation, cognition, and the innately unique assembly of life experience and education that makes up all the things the critic knows and how s/he knows them.

Yes. Each person who makes an aesthetic judgment on a work of art cannot escape his or her “unique assembly of life experience and education,” and this assembly includes a person’s adopted morality. Thus, I cannot consciously separate my moral leanings from my critical evaluations of artworks any more than I can separate my aesthetic taste from my moral judgments, no matter how hard I might try to hide the influence of one over the other. As the character Bill Haydon says in regard to his treason in Tinker Tailor Soldier Spy, “It was an aesthetic choice as much as a moral one.”

Bowes writes at the end of his piece:

The decision a critic makes to approach a movie on its own terms with as much objectivity as s/he can muster is a moral decision. Not everyone succeeds in completely divesting their preexisting baggage.

Not exactly. I would say that no one succeeds in this and that the morality present in a work of criticism is never a “decision” but inevitable. In addition, we can never really know the multitude of factors that have brought us to our critical assessments (factors as disparate as temperature, mood, and peer pressure), so how can we choose to ignore some while allowing for others? We can’t.

In Daybreak, Friedrich Nietzsche writes:

You dislike him and present many grounds for this dislike—but I believe only in your dislike, not in your grounds! You flatter yourself in your own eyes when you suggest to yourself and to me that what has happened through instinct is the result of a process of reasoning. (D358)

Though criticism remains our best attempt to account for our likes and dislikes, we must recognize the limitations of the undertaking (e.g., the fact that it might just be a post-hoc rationalization of a knee-jerk judgment). And we must stop pretending that we can consciously control what influences our opinions and what doesn’t, whether it be our moral conditioning, environmental factors, or something else entirely. The best we can do is be honest regarding the extent of our knowledge in this area. In most cases it will be minimal.

I have wanted to write about video games as art for some time now, but I worried that the question was no longer relevant–that most people (including me) had finally accepted the fact that video games can be art. This past November, Disney released Wreck-It Ralph, a film which brings video game characters and worlds to life in the manner of Pixar’s Toy Story. In his review of the film in The New York Times, A. O. Scott writes:

The secret to its success is a genuine enthusiasm for the creative potential of games, a willingness to take them seriously without descending into nerdy pomposity.

Clearly, I thought, this means that we’ve reached a turning point–that critics like A. O. Scott are now on board and willing to accept the aesthetic potential of games.

But I was wrong. On November 30, Jonathan Jones, the art critic at The Guardian, published a blog entitled “Sorry MoMA, video games are not art.” His blog is a response to the fact that the Museum of Modern Art in New York plans to curate a selection of video games as part of its Architecture and Design collection. Despite the fact that this is not the first time that an art museum will be playing host to video games (the Smithsonian American Art Museum held such an exhibit earlier this year), Jones has decided to put his foot down and play the predictable role of arbiter of what is and isn’t art (the role once famously played by Roger Ebert in this particular debate). He writes:

Walk around the Museum of Modern Art, look at those masterpieces it holds by Picasso and Jackson Pollock, and what you are seeing is a series of personal visions. A work of art is one person’s reaction to life. Any definition of art that robs it of this inner response by a human creator is a worthless definition. Art may be made with a paintbrush or selected as a ready-made, but it has to be an act of personal imagination.

Whether through ignorance or idiocy, Jones has made an argument that is simply not applicable to video games. If he were to watch the great documentary from this year on the subject of independent game design, Indie Game: The Movie, he would realize that he has no right to claim that video games are not the work of personal imaginations. In that film, we see just how personal games can be to their creators. We watch Phil Fish, for example, as he obsesses endlessly over every detail of his game FEZ, postponing its scheduled release for years and revealing how much of himself is in the game–how it has become his identity. We also watch Edmund McMillen and Tommy Refenes as they complete Super Meat Boy, an ode to their childhood video gaming experiences. From the Wikipedia synopsis of the film:

McMillen talks about his lifelong goal of communicating to others through his work. He goes on to talk about his 2008 game Aether that chronicles his childhood feelings of loneliness, nervousness, and fear of abandonment.

Surely this suggests the extent to which games can be the works of personal imagination. Another film playing the festival circuit this past year, From Nothing, Something, a documentary about the creative process, also features a video game designer among its artist subjects: Jason Rohrer, who “programs, designs, and scores” his games “entirely by himself.” It does not get more personal than that.

And this is not even limited to independent game design (a field which Jones might not even know exists). Surely the games of Nintendo’s Shigeru Miyamoto are recognizable as products of that creator’s personal vision. Through pioneering works such as Donkey Kong, Super Mario Bros., and The Legend of Zelda, Miyamoto became one of the first auteurs of game design.

Regardless, Jones ends his argument against video games as art by making a point about chess:

Chess is a great game, but even the finest chess player in the world isn’t an artist. She is a chess player. Artistry may have gone into the design of the chess pieces. But the game of chess itself is not art nor does it generate art — it is just a game.

Jones’s use of chess to illustrate his case against the aesthetic value of games is interesting because he writes about the game in a previous blog titled “Checkmates: how artists fell in love with chess.” In this piece, he doesn’t necessarily call chess art (he seems content to assign it the role of muse), but he comes awfully close:

It is a game that creates an imaginative world, with powerful “characters”: this must be why artists were inspired to create designer chess sets long before modern times.

On top of this, Jones seems willing to concede that chess pieces can be art. Would he also concede that pixelated characters, orchestral scores, and other “pieces” of a video game can be art? (To be sure, there are clearly “traditional” artists who work on individual aspects of games: graphic designers, writers, and musicians.) My question would then become: Why can’t the many artistic pieces cohere into a single work of art that also happens to be a game? Architects create buildings that serve as works of art as well as living spaces. Imagine an art critic who recognizes the artistry in a stained glass window yet says condescendingly of the cathedral in which it is found: “It’s just a building.” The idea is absurd.

I am all in favor of meaningful distinctions between objects. We can have art and games as separate categories. But we must acknowledge that there can indeed be overlap. I already demonstrated on this blog how food can serve both instrumental and aesthetic ends. The same is true for games.

As Arthur Danto famously put it in his 1964 essay “The Artworld”:

To see something as art requires something the eye cannot descry — an atmosphere of artistic theory, a knowledge of the history of art: an artworld.

The fact of the matter is that video games have now been allowed into two respected art museums (the Smithsonian American Art Museum and the Museum of Modern Art), the National Endowment for the Arts has started to allow funding for game designers, and the conversation about the artistic merits of games is alive and well–within the general populace, yes, but also within the hallowed halls of academia. This is enough, in my opinion, to qualify video games as art. Clearly, in practice, that is simply what they are. Psychologically, people are experiencing them in the same way that they experience objects more commonly classified as art (e.g., novels and movies). The fact that critics such as Jonathan Jones and Roger Ebert will not allow the status of art to be extended to games–and that they rely on smug and silly arguments to prove their points–says more about them than it does about the reality of the situation. They are great critics, but here, where perhaps they feel their grasp loosening on a domain in which they believed themselves experts, they are simply wrong. We see some metaphysical justifications for their beliefs, but primarily we see the constricting influence of habit and conditioning–their inability to see other than what they have been trained (or educated) to see. But no matter. Others seem to have a much easier time seeing the artistic potential of games.

In an interview with USA Today about composing the theme song for the game Call of Duty: Black Ops II, Trent Reznor says:

I’ve watched with a kind of wary eye how gaming has progressed. I was there at the beginning with Pong in the arcade, and a lot of my great childhood memories were around a Tempest machine. I really looked at gaming as a real art form that is able to take a machine and turn it into something that is a challenging, human interaction puzzle game strategy.

Video games are culture; they are a new way of doing art. You know, I fought against them at first. I used to say that, you know, being able to make up a story as you went, I fought against that. I did a couple of whole speeches about how you want the plot in Shakespeare. But I’ve now understood.

Ridley Scott’s Prometheus is chilling science fiction, a Lovecraftian space odyssey that poses some big questions about the origin of life and its ultimate purpose. David Denby has called it “a metaphysical ‘Boo!’ movie.” Andrew O’Hehir compared it to Terrence Malick’s The Tree of Life:

Both are mightily impressive spectacles that will maybe, kinda, blow your mind, en route to a hip-deep swamp of pseudo-Christian religiosity.

I want to counter those claims by demonstrating that, though characters in the film may have faith in something beyond the material world, the film itself (mostly through the android David) depicts a world incompatible with that faith.

The film opens with a humanoid on what is presumably primordial earth. A spaceship is seen in the distance, apparently abandoning him. He drinks something from a cup and begins to disintegrate. His genetic material, we’re led to believe, helped spawn life on earth. Thus, we’re immediately given the film’s premise: an alien race “engineered” humans through this initial act of terraforming. This premise, quite naturally, invites skepticism. Even if an alien race did spark life on earth, there is no way that they could have predicted the paths that this life would take. There is no way that they would have been able to engineer the many happy accidents that allowed a branch from this seed to evolve into humans. Later, we will meet a biologist among the crew of the spaceship Prometheus. He knows how life evolved on earth and voices his skepticism at the idea that we were somehow designed. How does the script handle this contradiction? It renders the biologist irrelevant, as nothing more than a cowardly stock character. But skepticism hardly matters; we have already seen the creation of life on earth, so we must accept this premise, believable or not, as a fact in the world of the film.

This brings us to our protagonist, archaeologist Elizabeth Shaw. She (along with boyfriend Charlie Holloway) is the one who uncovered the cave paintings supporting the theory of extraterrestrial parentage. The mission of the Prometheus, we learn, is to find our alien ancestors and ask them why they created us. The assumption, of course, is that there is a meaning to human life, a reason for us being here. And this meaning, according to Shaw, is out there among the stars for us to discover. She wears her faith in this idea like a virtue; she also wears a cross.

But Shaw isn’t the only one who has a religious worldview at stake. Even Peter Weyland (the sinister corporate interest who is funding the mission) expresses faith in metaphysical gobbledygook when he says that David, his android creation, differs from humans in that he does not possess a “soul.”

One analysis of David’s character puts it this way:

We are told that David is different from humans because he has no soul — but is the trick really that David knows humans don’t either? Where humans pretend that they are different, that we have creators with answers to our questions, gods who will elevate us above the rest of the universe, David accepts the empty desert and the trick is simply: not minding that it hurts.

I agree with this analysis, and I think it is a key to understanding David’s function in the film and his obsession with Lawrence of Arabia. His fondness for the David Lean film is particularly fascinating. He even attempts to mimic Peter O’Toole through his appearance and mannerisms. In this ability to learn through experience and observation and to mimic the behavior of model figures, David is perhaps more human than the other characters can comfortably realize, despite his lack of a “soul.” As the author of the character analysis suggests, maybe David differs most from humans in that he can accept the meaninglessness of existence. For example, David knows all too well why he was created:

DAVID: Why do you think your people made me?

HOLLOWAY: We made you because we could.

DAVID: Can you imagine how disappointing it would be for you to hear the same thing from your creator?

In exchanges such as this, David perfectly undermines the metaphysical delusions of his companions.

So what of Shaw’s faith? What does it mean in this context? As I already discussed, we are shown the creation of life right at the start, so we at least know that Shaw’s theory of extraterrestrial parentage is correct (absurd as it is). We then see Shaw and Holloway uncover physical evidence to support their claim (cave paintings around the world that depict giant figures pointing to a specific star system). People are reasonably skeptical, but rather than argue from the strength of her evidence, Shaw relies on a typical religious defense: “It’s what I choose to believe.” She clearly possesses a metaphysical bent; she demands a meaning for her life outside of her own making, and as I said earlier, she wears her faith in this objective value like a virtue. But the manner in which life was created, designed, or engineered is depicted as a material process–not a spiritual one.

Thus, Shaw can accept her theory of extraterrestrial parentage without the need of a metaphysical foundation for this belief. She has data to support it (including strong DNA evidence), even if it contradicts the established body of scientific knowledge. So her conviction and her cross are peculiar affectations, much like Captain Janek’s Christmas tree (a cultural symbol that survives through habit and custom). What’s even more interesting is that Shaw does not discard her faith at the film’s end, even after she exclaims quite exuberantly: “We were so wrong.” She requests her cross back from David, who had removed it earlier. He asks: “Even after all this, you still believe, don’t you?” It’s a valid point. How can we take Shaw seriously as a scientist if she is so willing to turn a blind eye to all that she has just witnessed? We are left silently snickering at this all-too-human foible, just as David mocks it in his own special way.

So Prometheus does not support a metaphysical outlook, even if its characters adopt one. As Jim Emerson points out: “Not unlike Star Trek V: The Final Frontier, Prometheus uses god as a MacGuffin.” Furthermore, David the android serves as the perfect foil to the humans and their odd beliefs. Toward the end of the film, on the brink of death, Weyland declares: “There is nothing.” “I know,” David responds with appropriate coldness. “Have a pleasant journey, Mr. Weyland.”