December 8, 2014

I’m seeing a lot of conversations stir up around victim advocacy and feminism in policy and reporting regarding the recent Rolling Stone story about campus sexual assault. A lot of it is the appropriate review of journalistic standards after a story that, at this point, has definitely not gone well for anyone. Some of it is veering into a critique of whether sensitivity to victims should play a role in reporting at all, rather than responding to a situation in which such sensitivity was misapplied in a way that launched a campaign against victims.

For anyone who’s still confused or anxious about the need for feminist perspectives, allow me to put it this way.

One of society’s challenges addressing sexual assault is that dealing with a real story will almost automatically entail some cognitive dissonance. When you are speaking to and about victims who talk about sexual assault, your presumption (based on overwhelming statistical likelihood) should be that they are being truthful and potentially traumatized — yet when you are speaking to and about offenders, all who are implicated in a crime are presumed innocent until found guilty. Both of these are in some ways “liberal” principles and both are necessary to prevent irreparable harm to people. Yet they are cognitively dissonant. It can be confusing to understand how you can empathize totally with victims and at the same time remain fair on matters of due process.

This is why society needs a mindful and sophisticated system for dealing with sexual assault. That system includes victim advocates (who occupy a role separate from judge and jury, combining elements of a mental health professional’s role with elements of a personal attorney’s), and it is why the public and media should strive for a similarly mindful and sophisticated attitude.

What else should be part of that sophisticated system? Part of it addresses the bullshit, shaming and contradictory expectations we pile on women regarding their sexuality.

Rape is wrong because it violates a person’s bodily integrity, privacy, and trust in others — it does so in principle, but also with the likelihood of doing psychological damage. (You’ll notice that this is completely neutral towards a victim’s gender.) In the eyes of traditional society, however, rape is wrong because it tarnishes a woman’s sexual purity and innocence (which is a gendered interpretation). Therefore, society tends to assess the harm done to women by sexual assault by evaluating a victim’s state of purity and innocence prior to being sexually assaulted, and continues to question their purity and innocence afterward by evaluating whether they display too much or too little trauma to be trustworthy. Society also tends to be distrustful and patronizing toward women’s memories and interpretations — this can include police, investigators and prosecutors, who are usually men. Putting a victim through that is utterly immoral and damaging.

Identifying these gender-based double standards, and shifting towards gender equity, is feminism. I don’t think society’s approach to sexual assault can be considered sophisticated, or even competent, without universal understanding of feminist perspectives.

Campaigns to change attitudes towards sexual assault are often challenged by skeptics speaking as if there are “two sides” to the issue. Granted, on any topic, every individual has her or his own experience and so there is an unlimited number of “sides.” But I think it’s very ironic when a cause that seeks to improve general understanding of what sexual assault is and the harm it does to victims, so that the incidents don’t happen in the first place, is asked to consider the other “side” — the experience of the accused. Mind you, the main goal of anti-rape activists, who are painfully aware that many offenders are said to “seem like nice guys” who for whatever reason fail to register the humanity of a person they victimize, is to change the culture so there are no victims or offenders in the first place.

We are universally educated about due process — yes, even feminists — so we don’t need to lecture women, feminists or victim advocates about due process. They all know what it is. But feminist perspectives, though diverse, still face misunderstanding and resistance. So, yes, you need to learn about feminism to speak competently about sexual assault, because in the absence of that knowledge there is bias. There’s room for different viewpoints when you get there, but you have to have that basic groundwork of knowledge to play a positive role.

July 3, 2014

There’s a tendency to think of persuasion in terms of winning an argument or debate. You’re trying to be rational: you’re thinking that if you defend your position with logic and evidence, and prove that the counter-arguments are false, the other person is now obligated to switch to your view.

If only life were so simple. In your mind you have proven you are right, but the other person is just thinking she or he needs to do some homework to come up with a better rebuttal tomorrow. Once you set someone up as your intellectual adversary they are not going to cross the line to join you.

That might be why political debates don’t have a significant influence on elections, no matter how well one side does or how eloquently the positions are argued. People root for the candidate they already support, and the arguments can stir up enthusiasm but they don’t “convert.” There’s a very different way that candidates vie for votes.

The reality is that people in the day-to-day world believe things because they want to. There are just far too many competing ideas and decisions to make to investigate each one objectively. Everyone is able to weigh different factors — pros and cons — to arrive at an opinion or decision, but the weight people grant to each factor comes down to how much they like and identify with it.

Persuading effectively comes down to this simple approach:

Identify an idea that you and your audience can already agree on. (Common values or principles, something you might both say is a problem, a universal need, etc.)

And that’s basically it. People are very inclined to side with you simply because you are someone with similar concerns who reached the conclusion you did.

The main goals are to be relatable — the person you are trying to persuade is your peer, not inferior to you — and to emphasize your area of agreement so much that the other factors fade from mind.

Here’s the important part: DO NOT argue points where you disagree. Acknowledge them as valid points and then steer the conversation away from them. No matter how ridiculous you think they are and no matter how strong your evidence against them is, you’re not going to convince people their own interests or ideas are wrong. They will always be a factor, but you can say that a different set of factors (the ones that support your cause) are more compelling for you.

Think back to the last time you observed a political campaign. One thing you won’t see a candidate do is try to get voters to change their minds about their basic ideals and principles. They won’t try to convert liberals into conservatives or convert conservatives into liberals, but the candidates will jump and tumble over each other trying to validate the experiences and values of crucial swing voters.

They’ll say, “We know that families in Ohio are struggling.” (something that they’ve poll-tested to be sure the audience they’re targeting agrees). “Jobs have been shipped overseas, and too many people are worried whether the manufacturing industry — once the backbone of the American economy — is ever going to come back…” (validating the audience’s existing worries and experiences) “…which is why I have a plan to create more than a million new manufacturing jobs over the next five years.”

This candidate might disagree with the target audience when it comes to immigration policy, foreign policy or social issues, but isn’t going to try to sway their minds on those things. She’ll just keep hammering on areas where she and the audience agree, and try to make the election all about those things.

You could call this a “trick,” because that’s what it is; while it comes instinctively to many people, it’s a technique that can serve high-minded goals as well as goals that are very selfish and manipulative. If you’re trying to get people to turn against their own interests, eventually they’re going to figure that out and you might never gain their trust again. And if someone senses over time that you’re only feigning commonality to persuade them, without being open to their ideas as well, they’re going to get really annoyed and stop listening to you. You’re better off if you’re looking to learn from people as much as to persuade them.

April 10, 2010

Journalism is an important institution in the free world. Few dispute this. Journalism is an important institution in America, too; one so valuable that the framers of our Constitution chose to codify the freedom of the press in the First Amendment.

We live in a complicated world. Few dispute this. Nobody has time to absorb information found in every corner; ordinary Americans do not have time to work out, for themselves, the inner happenings of the titans on Wall Street or the troubled alleys of Afghanistan. Ordinary citizens do not have time to investigate potential corruption in government or the potential outcomes of current economic trends.

We the people rely on other people to gather that information for us: we rely on journalists.

People will criticize the press for doing just that. This is not a new or controversial statement. The act of journalism has been dogged by constant accusations of “bias” since its beginning, and on many occasions in history governments as well as private institutions have tried to shut it down. The way this story is written clearly benefits the Left, or benefits the Right, a detractor will say.

March 28, 2010

If a straight man suddenly strips off his shirt in public, it means he wants to fight somebody. Either that or he just found out there’s a bee in it. But the bottom line is that wearing shirts is considered the norm for most straight guys.

Not so for gay men, who take great pleasure in letting everyone know when they’ve been to the gym, and reminding them of it, again, and again, and again. A gay man requires little provocation to bare skin and any of the following factors suggest shirtlessness: It’s a nightclub. He’s jogging. It’s a sunny summer day. Somebody shouted “strip!” A webcam is on. He’s just been brutally defeated at beer pong. It’s his Connexion profile. He’s in California. He wants to get back at his boyfriend, or ex-boyfriend. He is in a gay-themed advertisement. His drunk friend made an unserious comment about him taking off clothes so now he has to make her regret it. He’s at Pride. Someone is taking a picture. He is dancing. It is part of his Halloween costume. He wants you to see his tattoo.

You get the idea.

And when it comes to willingness to take off pants in public, gay guys are light years ahead of straight guys. (more…)

March 23, 2010

We had a popular new Democratic president with outstanding rhetorical skills, elected with the biggest percentage of voters in 20 years – largely on plans to reform healthcare – allied with the biggest Democratic majority in Congress since 1976 – and in spite of all that, it took a year-long, caustic and fierce battle to the brink of political suicide to enact a bill so moderate and incremental that a liberal Republican could have thought of it. Indeed it has key elements John McCain supported in 2008, and it looks somewhat like what Mitt Romney enacted in Massachusetts.

I’ll say it again: “Obamacare” is moderate and incremental. It doesn’t go as far to cover everyone as we will need to go in the future, and some will say it doesn’t even go far enough for now. Yet we’ve come out with a country more divided, and a status quo more fearful, than we have seen since the Civil Rights era.

Let’s create a scale of government involvement in a healthcare system for perspective. A totally government-run and non-optional healthcare system where all doctors and healthcare workers are government employees – say Cuba’s system – would rank as 100 in government involvement. A totally unregulated “Ayn Rand’s Dream” free-market system, where you only get what you can personally pay for even if you’re dying and providers can set whatever price they want, would be a 0.

That would mean “Obamacare” moved us from about a 25 to a 35. Most of the developed world is between 50 and 90.

The National Health Service in the U.K., in which the government employs all doctors but a small minority of citizens still choose private plans and there are small fees for most services, would be a 95. Canada’s government-insured system, where the government pays for care but you get it from private doctors’ offices and hospitals, would be about a 60, with some government and some market. A private insurance system that contains one “public option” letting people buy insurance from the government if they want – a true balance letting individuals opt for a government or private system – would be about a 45. Switzerland’s system of compulsory health insurance from nonprofit private companies (banned by law from earning a profit on their services) would be about a 40. America’s pre-2010 system, which guarantees care in worst-case scenarios where you are broke but dragged to the hospital bleeding, and provides mostly-free care to seniors, some poor people and veterans – but is mostly market-run and leaves many uninsured – would be about a 25. The new system, when fully phased in after 2014, will ensure that anyone can get some level of routine care if they want it and enforces penalties to encourage everyone to do so, but the care comes from private companies that earn lots of profit for providing it. It’s a 35.
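For what it’s worth, the scale reads cleanly as a little lookup table. Here is a sketch in Python, using the post’s own ballpark scores (every number is the essay’s rough estimate, not measured data):

```python
# The essay's rough 0-100 scale of government involvement in healthcare.
# Every score is the post's own ballpark estimate, not measured data.
systems = {
    "Ayn Rand's Dream (pure free market)": 0,
    "U.S. pre-2010": 25,
    "U.S. post-'Obamacare' (after 2014)": 35,
    "Switzerland (compulsory nonprofit insurance)": 40,
    "Private insurance with a public option": 45,
    "Canada (government-insured, private providers)": 60,
    "U.K. National Health Service": 95,
    "Cuba (fully government-run)": 100,
}

# Most of the developed world sits between 50 and 90; even post-2010,
# the U.S. score falls below that band.
for name, score in sorted(systems.items(), key=lambda kv: kv[1]):
    print(f"{score:3d}  {name}")
```

Laid out this way, the ten-point move from 25 to 35 looks like what it is: an incremental step, not a leap.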

The changes will make a big difference for many uninsured and under-insured Americans, but the post-“Obamacare” American healthcare system is still one of the most right-leaning and market-oriented systems in the developed world. And the right-leaning half of the country is treating it like the plague.

March 10, 2010

You can’t convince me that you believe in the potential of kids from low-income, failing schools, and then in the same breath argue that people who grew up in those schools are bad candidates to be teachers.

But that’s what a recent article in Newsweek seems to do, in a discussion of the need to fire teachers whose students underperform, and the need to recruit new teachers who come from more prestigious colleges.

I don’t dispute the article’s sentiment towards bad teachers, but this quote from a sidebar in the Newsweek article caught my eye: in “2000, 37% of teachers [came] from colleges with SAT scores in the lowest 5%,” explaining that this happens because teaching is an “undesirable” fall-back job.

The SAT, like the ACT and every standardized test, does not measure intelligence: it measures the value of your pre-college education. So if the public education system is flawed – and the Newsweek article argues yes, it is – it seems ridiculous to judge students or their colleges on what their SAT scores were or what their school’s average SAT scores were. Consider also that colleges with SAT scores in the lowest 5% do not represent the lowest 5% of students; they are the lowest-scoring 5% of colleges, which still select from higher-performing high school graduates and whose students sit closer to the 50th percentile of all students.
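The college-versus-student distinction is easy to make concrete with a toy model. This Python sketch uses made-up numbers – 100,000 graduates, a 60% college-going rate, 100 equal-sized colleges enrolling in strict score order – purely for illustration:

```python
# Toy model of the point above: the lowest-5%-SAT *colleges* do not
# enroll the lowest-5% *students*. All numbers here are illustrative.

N = 100_000                        # high-school graduates, ranked 0..N-1
students = list(range(N))          # rank doubles as an SAT-like score

# Suppose the top 60% go on to college (an illustrative cut, not a
# real statistic), and colleges enroll in strict score order.
college_bound = students[int(0.4 * N):]

per_college = len(college_bound) // 100
colleges = [college_bound[i * per_college:(i + 1) * per_college]
            for i in range(100)]

# Students at the bottom 5 colleges by average score:
bottom_5 = [s for c in colleges[:5] for s in c]
low = min(bottom_5) / N * 100
high = max(bottom_5) / N * 100
print(f"lowest-5% colleges draw from percentiles {low:.0f}-{high:.0f}")
```

Even in this deliberately unflattering setup, the “bottom 5%” colleges draw entirely from students who outscored 40% of their graduating cohort; with real-world overlap between colleges, the figure drifts toward the middle.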

Essentially, the statement in Newsweek is like saying low-income people who graduated from urban schools with average test scores and worked their way through school at the city college are a black stain on the educational system as teachers, compared to students who went to major universities and lived in the dorms. This is a prime example of the politics of privilege.

Yes, there are myriad problems with schools in America, which is why, as Newsweek itself cites, kids who grow up in low-income households underperform middle-class students, and black and Hispanic kids underperform white kids in public schools year after year. It has been a persistent problem plaguing the country, and proof that some injustice is taking place. And while teacher incompetence might be a factor in school districts everywhere, it does not explain the fullness of this disparity.

Poverty is one of the most obvious unaddressed factors here, but here’s something else that stands a chance of explaining much of this issue. The vast majority of teachers are white and come from middle-class backgrounds. In schools where the majority of students are Hispanic or black, the white teachers are a “ruling class” of sorts in an intrinsically sensitive situation as the ones making crucial decisions for and wielding authority over people who are different from them. We know that to grow up white in America is to be instilled with subtle and overt cues that your own culture, values and experiences are superior; considering the power a teacher has over her or his students, it would be so easy for conscious and unconscious biases to affect the students. Teacher training programs often pay lip service to diversity, but cannot be truly effective unless they are led and organized by people from diverse backgrounds who aren’t afraid to “go there.”

Perhaps the fact that white, middle-class people who have not been thoroughly trained in anti-racism dominate the American teaching class is responsible for some of the following facts: (more…)

February 18, 2010

I was inspired by this video where J.K. Rowling, the author of Harry Potter – who was living off of government welfare when she wrote her first book – explains the benefit of personal failure and disadvantage.

I’ve often had mixed feelings about academia, which seems, on one hand, to be a noble realm of liberalism and ideas, designed to encourage our brightest minds to collaborate for a common good. On the other hand, academia is full of institutions of privilege that are designed to take small distinctions in competence and intelligence at admission and turn them into big distinctions in opportunity and prestige by the time of graduation.

It seems that the ultimate goal of presidents and faculty members who aim to make their academic institutions “prestigious” is to aid and increase social inequality and stratification. They want to give their graduates the best footing compared to everyone else, and the differences between the Ivy Leaguer and the non-Ivy Leaguer are much greater after graduation than before.

My personal experience colors my mood towards academia. I don’t think I lacked the intelligence to go to a prestigious out-of-state school; teachers and counselors pointed me in that direction from an early age and insisted I was sufficiently talented. I think I didn’t have that opportunity because it was too expensive and because personal burdens I have faced made a single-minded quest for success impossible. My dream was to go to NYU, and while my high school grades and test scores were sufficient, I realized it was out of my price range by a factor of five. Similarly, I think that the privileges I have had by being white and middle-class and male are unearned, and I don’t know if I could have gotten even to where I have gotten if my background had been different.

These are all questions we have to face living in a society that deems itself a “meritocracy,” yet still holds a comparative model of success even as it attempts to develop a fair and coherent system for evaluating it.

Those feelings really come to a head in my thoughts on an institution such as Harvard, which, as a liberal person, I want to defend from the irrational hatred of the Right, yet I have seen much conceit and privilege come from there, as well as ideas that are much less than progressive. How does one, then, gracefully address the unearned privileges that go unrecognized in Harvard’s student body, and yet honor the hard work and merit of its graduates?

I think that J.K. Rowling, who saw some period of despair after she graduated from college, does a brilliant job of putting the culture of success in privileged academia in proper context without making assumptions or rabble-rousing.

Harry Potter is often slighted by elites in literature and art as unintellectual and philistine; I had plenty of English professors who reflected that sentiment. But Rowling is also the person who made writing – a traditionally poorly-paying profession – into one of the most lucrative, and she is among the richest people in the United Kingdom today. It is hard to say her interpretation of literature is less valuable or prestigious than theirs in light of that, and it’s hard to knock her in light of a speech that is similarly nuanced and insightful.

This video is two years old, but I stumbled upon it today and had to pass it on.

January 11, 2010

I used to reflexively put all my friends’ application items on “hide” when they showed up on my Facebook news feed. That included annoying, meaningless notifications like “Christina could use some help fertilizing her crops!” or “Dan found a baby calf on his FarmVille, will you give it a home?”

When I saw a friend obsessively tending to her FarmVille farm, I asked what it was about, and she was happy to explain. FarmVille looks kind of like SimCity – a game I loved as a kid – except that instead of zoning for homes you plant crops, and come back to harvest them when they are grown, which brings in money you can put into buying equipment or more crops. There is also a social element that the traditional computer games of my youth didn’t have: peer networks. In FarmVille you can send gifts to other users or add them as “neighbors” in the game so they come fertilize your crops, which scores you both points. You can decorate your farm to make it look nice for when visitors see it on their own computers, and there are even certain items on FarmVille that you can only get with the help of others, which help you advance or bring in cash.

The game looked amusing enough, so I said why not and signed myself up – it’s free, and I figured you can put in as much or as little time as you want to. But there’s a problem with that kind of test-the-waters approach – FarmVille is addictive. It’s designed to give you early rewards, along with increasing responsibility. They start you off with a surge of excitement as you harvest fast-growing crops (strawberries mature in four hours) and advance through the early levels quickly. But as soon as you plant something, you give yourself the obligation of coming back soon, or you face the risk of having unharvested crops linger too long and “wither” to brown twigs, turning moneymakers into money sinks.

A quick Google search will indicate how many people are hooked on this game – there are dozens of blogs devoted to FarmVille tips and strategy, and communities both for and against the rapidly-growing phenomenon. As I write this on January 11, 2010, there are about 75 million people using FarmVille across the world (and growing exponentially as each user brings in two or three friends). To put that in perspective, if each of those users spent an hour a day on the game (which would not be far-fetched, and many people put much more time into it), the application would be accumulating as many hours of attention as the entire economy of a small country or a medium-sized American state. Picture every working-age person in Ohio waking up at dawn to harvest the digital pumpkins and plant daffodils, which will be ready in two days.
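That back-of-envelope comparison is simple arithmetic to check. A quick Python sketch, using the 75-million figure and the hypothetical hour per user from above, plus a round guess at Ohio’s working-age population (an assumption for scale, not a census figure):

```python
# Back-of-envelope check of the comparison above; every input is rough.
users = 75_000_000            # FarmVille users, per the post
hours_each = 1                # hypothetical hour per user per day

farmville_hours = users * hours_each      # 75 million hours per day

# Call it 6 million working-age Ohioans (an assumed round number, not
# a census figure), each putting in an 8-hour workday:
ohio_hours = 6_000_000 * 8                # 48 million hours per day

print(f"FarmVille: {farmville_hours / 1e6:.0f}M hrs/day; "
      f"Ohio workdays: {ohio_hours / 1e6:.0f}M hrs/day")
```

On those rough numbers, an hour a day from every player really does outweigh the combined workday of a mid-sized state.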

I remember when I was sixteen and people treated the Internet as a geeky thing that socially-awkward people were drawn to. “You actually have a website up there?” someone would ask with a wrinkled nose, talking about my Friendster account or, later on, my profile on an early version of Myspace. “How creepy. You talk to people? Are they, like, stalkers or something?” Now, virtually everyone under the age of 40, along with a hefty dose of those over 40, has a Facebook account. Maybe FarmVille is the next Facebook, if the 75 million users (and growing!) are any indication: this stuff can catch on fast.

Now I’m checking my Facebook friends’ news feeds for brown or golden chicken eggs somebody found in a chicken coop that, if clicked, will give my farm new chickens or occasionally other treats like a fig tree or water trough.

A friend told me I was being ridiculous. “That isn’t even real,” she told me, and asked me how a pursuit that can only serve to be cyclical (harvest crops to earn money to plant crops) is worthwhile.

Well, I said, isn’t that cycle a lot like the way the real world works? FarmVille is like capitalism, intrinsically connected to the idea of perpetual growth. Think about it: all you really need to survive as a human being is 2,000 calories a day, a roof over your head and maybe, you could argue, medicine. Things like television, computers, brand name clothing and updated styles in furniture are all vanity and excess. Things we grow so used to that we consider them “necessities” and couldn’t imagine living without them. Plop a guy from 40,000 BCE into our society and he’ll be thinking how about I set up a leather tent in your back yard, work an hour a day to pay for canned beans and rice and have 23 hours of free time to do whatever the hell I want? That’s the life! Most people aren’t the caveman, though; most of them work hard for extra pay and buy nice things all for the sake of keeping up with the Joneses, which is all I’m doing on my farm – harvesting crops, buying expansions and keeping up. I don’t want my friends reaching level 15 before I do!

I have only been on FarmVille for three days. I am generally motivated by novelty, but novelty by definition doesn’t last. I’m sure that as time goes by the excitement will wane and harvesting virtual crops will settle into the same smooth daily predictability of checking my inbox or brushing my teeth. I’ll let the artichokes sit ripe for a few hours till I have time to get them, maybe losing one or two to withering every now and then but generally keeping things regular and comfortable.

In the meantime, my first crop of yellow bell peppers is ninety-five percent grown and I am absolutely thrilled. A few of my friends fertilized them for me, so we’ll have bulging, sparkling yellow produce in no time. I could leave them alone to look pretty on my farm, but instead I’ll harvest them the second they finish, to plant new seeds – time is money! We value hard work here on FarmVille!

In the beginning, and I suppose even now, I’m a little embarrassed to be caught up in this. I’ll click “ignore” when the game prompts me to post eggs on my wall for friends to collect from me – instead I email them directly to someone who sent eggs to me first. I don’t want people to see lots of FarmVille notifications on my public wall and realize that I’m obsessed with a computer game.

But outsiders oughtn’t be so quick to judge, really. If you haven’t gotten on FarmVille yet, open an acre and see if you can play for a half an hour without getting as hooked as you were as a kid when you brought home your first baby pet.

January 1, 2010

I was hoping that 2010 would be the first year we start pronouncing years the way they were pronounced every year of my life until 2000. That is, 2010 would be “twenty ten” rather than “two thousand and ten,” the same way that 1999 was “nineteen ninety nine” instead of “one thousand nine hundred and ninety nine.”

But if you follow YouTube and television – which are collectively the pulse of American culture – it seems that less than a full day into the year we have already gotten into the habit of pronouncing 2010 the same way we pronounced 2009. Maybe this won’t be the year it changes back to the more efficient pronunciation, and it’s possible that a change won’t become popular for a few more years, perhaps not until 2013; we’ve already gotten accustomed to saying “two thousand twelve” when we talk about the film 2012, and those habits stick. According to some, the last decade was full of “the two thousands” in part because of the way we pronounce the title of the 1968 film 2001: A Space Odyssey.

Speaking of the last decade, the way we refer to 2000-2009 as a historic period might be an even more pressing issue of nomenclature, since everyone, even now, stumbles when describing the decade we just got through. The “two thousands” sounds extremely clumsy, and does not parallel “the sixties” or “the nineteen hundreds” the way “the twenty hundreds” would. (Some have suggested “the naughts” as a good way to say it, reflecting the zeros as well as the fact that we’re leaving the decade on a sour note of lost productivity and political stalemates.)

In most online polls, a majority of respondents preferred “two thousand and ten” to “twenty ten,” and a few left irritable or outright angry comments directed at those who would pronounce 2010 in a new way. They’d be singing a different tune if newscasters switched their pronunciation and everyone had to follow suit.

To me, “twenty-ten” sounds better and smoother, but whenever I mention the year the words that come out of my mouth are the same old “two thousand and ten,” running just on instinct. Perhaps the naysayers are not being stubborn or moralistic about a petty issue so much as just being more honest than I am about what they’re likely to do.

November 16, 2009

Few would see a movie like 2012 expecting a heartrending story line or touching lesson in human nature. From the outset the purpose of the film is clear: we’re going to blow this place up.

That was certainly the reason I rushed to see it on opening night, after the trailer laid it out for us in telling sneak peeks: Acres of city blocks slide like clods of dirt into the Pacific Ocean. A Buddhist monk rings a sacred bell as a tidal wave the size of a continent washes over the desolate Himalayas, killing him. 2012 delivers scenes that are as awe-inspiring as they are disturbingly beautiful.

As far as that goes, the only criticism I can offer is that the film pivots, about a quarter of the way in, from going too slow to suddenly going way too fast. A few mild tremors shake Los Angeles in the first part of the film – during a painfully tedious exposition that lets characters reveal their clichéd family backstories – and then without warning, Southern California splits itself apart in an abyss as deep as the Mariana Trench. The next thing we know, Yellowstone Caldera is erupting as a supervolcano in a bang so powerful it sends out shockwaves and a mushroom cloud akin to a hydrogen bomb’s. (True to action movie form, the explosion rips over the hills at thousands of miles per hour until it approaches our protagonists fleeing in a lumbering motor home, and slows to their pace for their nail-biting escape. The explosion seems to pause or even retreat for several minutes as they scramble out of the vehicle into a parked airplane, but quickly re-accelerates to barely kiss the tail of their plane as it speeds away, churning with black smoke, glowing rocks and lava all the while.)

By the time Yellowstone has erupted, the world is basically over. The skies darken. Subsequent scenes of destruction are posed as an afterthought. “Oh, by the way,” the President’s Chief of Staff explains to aides in Washington, “Rio de Janeiro was just destroyed by an earthquake” (cue footage of Christ the Redeemer crumbling off its perch); then, in a similarly decontextualized scene, we watch St. Peter’s Basilica collapse and kill the entire hierarchy of the Catholic Church along with prayerful masses (we see nothing of the rest of Rome, though). When our protagonists get their airplane to Las Vegas, most of its hotels and casinos have already anticlimactically collapsed into the core of the Earth in a giant earthquake, and we witness the ruins of Las Vegas astride a seemingly infinite abyss. Washington D.C. is next to go, at a pace that is almost too fast to comprehend; an earthquake takes down the Washington Monument, then a gigantic, yawning tidal wave rises out of the Atlantic and looms over the city (which is covered by volcanic ash from Yellowstone), but the film cuts away just as the U.S.S. John F. Kennedy, an aircraft carrier, lands on top of the White House.

Scenes like this continue through the rest of the film, revealing impending disasters but cutting away before the actual destruction ensues. Tidal waves loom up, but we cut away before they strike, and we assume the characters standing there watching are killed. Earthquakes begin but leave the scene before they reach full strength. In other cases, we happen upon a city that has already been destroyed, as with Honolulu, covered in lava. Most often we get even less than this: a military commander tells a diplomat that “Tokyo has been destroyed by an earthquake. Singapore has been wiped out by a wave.” The most dramatic disaster scene of the film remains the drawn-out Los Angeles earthquake from the beginning.

The thing notably missing from the film is any discussion of the Mayan prophecy (or rather, rumor) of destruction in 2012 that is its impetus and namesake. If the film had a spiritualistic or supernatural overtone, then maybe the rapid succession of perfect coincidences that destroy the Earth’s cities would have a haunting poignancy. Instead, we are left only with science to explain what is going on, and the science falls flat. (According to 2012, solar flares make the Earth’s lower crust melt, so the continents slide around through the ocean like leaves on a pond.) We learn that the planets line up in such a way as to trigger these events every 650,000 years, even though there has certainly never been an event like this in Earth’s history, which has passed through almost seven thousand 650,000-year periods in its 4.5 billion years. As exposited in the film, the geophysical anomalies destroying the Earth cause the continents to shift thousands of miles in the blink of an eye – which would require them to move at tens of thousands of miles per hour – yet nobody on the ground is flung into space, as would realistically happen if that occurred. This all happens magically on the ground without so much as a cough of disturbance in the atmosphere, as observers in airplanes don’t figure out that the land masses have moved until what they thought should be Guam turns out to be Tibet.

It is the end of the film that really gets me, though (Spoiler Alert). Beyond all the drama of survivors taking refuge in gigantic futuristic arks (tickets cost a billion euros, so it is the world’s richest people who survive – our middle-class protagonists are stowaways) that the world’s governments had been building all along for mankind’s survival, the last scene in the movie tells us where humanity will go to rebuild: Africa.

According to the final scenes of 2012, in all the earthquakes and tectonic shifting, Africa has risen in elevation so much that it avoided being washed over by tidal waves that obliterated the rest of the planet. The Cape of Good Hope is now the highest point on Earth, which is, confusingly, where the world’s governments decide to set up humanity again, on the peaks of what will likely turn out to be glaciated mountains (sounds like paradise, right?). Our protagonist tells his children, who mourn the loss of their Southern California home, that they will find new homes where they are going.

But – uh – don’t people already live in Africa? The Mayans were the first ethnic group to be written out of 2012, Latinos are strikingly absent from the casting, and now the narrative suggests that native Africans are absent from the Earth. Forgive me my politics, but having the world’s billionaires land on a dark-skinned continent to “re-build humanity,” as the story puts it, seems just a tad colonialist. There is no reference to African governments, which were evidently not even part of the international ark-building program to begin with. I can’t tell whether the film’s writers see all Africans as tribal nomadic peoples, or as so militarily primitive that it simply doesn’t matter whether they already own the land you want to take. Perhaps the pending television series will elucidate this further.

I would say that 2012 has the basic structure of a great disaster movie, with awe-inspiring computer-generated explosions akin to Armageddon and Independence Day. But it tries too hard to be something else; if it’s all about the disasters, then the disasters should follow the same natural arc that any good storyline does: subtle at first, introduced to witnesses (in this case, the world’s population) through a gradual process of discovery – first small bad news, then steps toward full awareness. In 2012, everyone outside a secretive government agency finds out that the world is ending only after it is already well under way. There are no great “pending doom” scenes like those in the latter parts of Knowing, where the horizon glows red with what is to come. There is no gradual escalation of events like the small eruptions and gas emissions that precede the big climactic pyroclastic explosion in Dante’s Peak. For the full effect, panic needs to build slowly through political wrangling, small cities taken down, riots, disasters following realistic trajectories, looting, and then outright terror before the world’s ultimate demise – not mundane obliviousness until suddenly your home and city are swallowed by the Earth.

Finally, the end of the movie leaves out the bits of information that nerds like me are most interested in. What is the state of natural flora in preserved Africa, or in the rest of the world? The closing scene zooms out to show us a remodeled Africa (with drastically altered coastlines), the lone continent that has not been stripped bare by waves, and its central parts are still green. But what about the rest of the Earth? Does Florida border Argentina now? Will Antarctica turn into a tropical paradise? The producers were either too lazy or just didn’t think audiences would care enough to want to see what has changed. For a movie that utterly lacks a decent human story, it places far too much emphasis on the human story, when those of us who did like the film liked it for one reason alone: the awe-inspiring natural world.