You'd be hard-pressed to find anyone who really liked 2010's Clash of the Titans remake, despite its earning more than $493 million at the worldwide box office. The movie received a pathetic 28 percent rating at Rotten Tomatoes, and Rolling Stone film critic Peter Travers, who is known to love popcorn flicks, derided it as a "sham" starring "good actors going for the paycheck and using beards and heavy makeup to hide their shame." Audience-polling firm CinemaScore graded enthusiasm for it at "B," and IMDb users rated it a mediocre 5.8/10.

So there was not much of a clamor for the 3-D sequel Wrath of the Titans, opening today. Nonetheless, the big-budget machinery kicked into full gear, and the second installment of the warring-Greek-gods franchise arrives, complete with grimy, mundane wall-to-wall action and terrible dialogue spouted by Liam Neeson and Ralph Fiennes in long beards and silky robes.

Though it's likely to be slammed by the second weekend of Hunger Games mania, the film appears headed toward a decent opening performance at the domestic box office. While there are no guarantees, the movie should prove successful enough on a global scale to concretely green-light the obligatory third Titans flick that's reportedly being planned. This is big studio business as usual: low-risk and low-reward, favoring a familiar franchise that offers familiar goods. If it works financially, it works, and if it doesn't, oh well.

But there's a deeper tragedy here. Think about how Titans compares with the furor that's greeted another big-budget event movie that opened in March: Disney's John Carter. In many respects, they're similar enterprises. Sword-and-sandals epics set in otherworldly pasts, both films feature heroes on quests to save civilization. They have complex labyrinths, allusions to mortality, sprawling battles captured in wide shots and kinetic close-ups, and universes populated by supernatural creatures and dashing, fiercely independent women. They're fundamentally old-fashioned event movies, structured to offer grand, big-screen escapism.

And yet, before its release, Andrew Stanton's ambitious, imaginative $250 million sci-fi epic had already been written off as a disaster, an overpriced folly. Competing studio executives gleefully anticipated a monumental financial shortfall. And despite some initial hopeful signs, in the form of a strong opening weekend internationally, the skeptics have been proven right. The movie fizzled at the domestic box office, the studio announced a $200 million write-down and—in the public consciousness at least—the film has been condemned to a permanent place in the hall of infamous bombs.

But where Titans is a dry, rote affair, Carter embraces bold ideas. It's full of creative touches, offering a loving mash-up of genres. Stanton balances tones, effortlessly shifting from tongue-in-cheek humor to stark drama and large-scale action. Drawing on his time at Pixar, the filmmaker stresses the story and his characters just as much as the spectacle. The movie has been derided for its 132-minute running time, which flies in the face of the "keep it short" ethos that's so predominant these days, but the extended length gives Stanton the chance to flesh out a creative, full-scale vision that gives John Carter the look and feel of an auteur effort. Most importantly, there's a sense of wonder embedded in the work, a gleeful childlike joy in its depiction of the sights and sounds of Mars. It's the sort of earnest adventure, alive to imaginative possibilities, that would have been celebrated in the time of 20,000 Leagues Under the Sea.

And yet these days all anyone can talk about is Carter's $200 million write-down.

Wrath of the Titans, on the other hand, is a dime-a-dozen sort of movie, with the familiar muddy visuals, sword-clanging action, and poorly explained narrative, in which the giant, fiery demon father of the gods threatens to rise from his underworld prison and destroy the universe. It plays things painstakingly safe, without even the cheesy humor of its predecessor. There's no attempt to offer a bigger, broader picture of life in ancient Greece, and the grubby, indistinct action takes center stage. Beyond the occasional spark of ingenuity, as in a madcap fight between the main characters and giant cyclopes, it's assembly-line filmmaking. And it's been received with far less outrage, far less righteous anger, than the swarm of negativity that greeted John Carter.

There's a lesson here: It might be better to burn out than to fade away, as Neil Young famously sang, but not in the film business. If Wrath of the Titans ultimately flops, it will do so in exactly the form today's Hollywood prefers: safely, quietly, without much of a fuss.
