Fall is the most important season in the book business: From the Monday after Labor Day until close to Thanksgiving, the biggest books by the biggest writers and the most promising debut novels of the year relentlessly spew forth. Almost all will fade; each year sees fewer than a dozen extraordinary critical and popular successes. And every author knows that he or she is engaged in a zero-sum game: readers have limited time, and most of them have limited book-buying budgets, so if they are spending hours and money on one writer's book, they are almost certainly ignoring another's.

The competition this year is especially severe, owing to the anniversary of 9/11. Publishers knew the somber commemorations would leave no room on the Today show for author interviews and would divert attention from book reviews—so the fall season starts late, the week of September 17.

Most galling for writers is that to a large extent the game has been decided long before their books are in the stores. Publishers, of course, don't distribute advertising and marketing budgets evenly among their titles—they bet on the surest things and the sexiest subjects; at sales conferences in the spring they decide which books have the best chance of breaking out in the fall. The rest are pretty much left to die—no ads in major newspapers, no national author tours. During the summer, sales reps report which fall titles have won over booksellers across the country, further refining the list of titles the publishers will push. By late summer the trade journals—Publishers Weekly, Kirkus Reviews, and Library Journal—have nominated the upcoming season's winners and losers, which greatly influences which books, and how many of them, bookstores and libraries will order. By summer's end the cognoscenti in New York have already decided which fall titles to anoint with elaborate author profiles in The New York Times (the only paper that matters in the publishing world) and other publications.
This seemingly idiosyncratic sanctification alone can explain the reams of flattering attention devoted to the Yale law professor Stephen Carter, which helped to propel his stilted, swollen, and predictable novel, The Emperor of Ocean Park, onto best-seller lists. We don't want to tell you which books will be the hot books. Rather, we want to tell you which to read—and which to ignore.

The second and concluding volume of Browne's life of Darwin covers the publication and reception of On the Origin of Species, Darwin's subsequent fame, and the gestation and writing of The Descent of Man. It is a masterpiece. At once wide-ranging and tightly woven, The Power of Place is as profound an intellectual history of Victorian Britain as has ever been written; an incisive consideration of Darwin's mind, personality, marriage, and tragic family life; and an elegant exegesis of his ideas, influence, and literary style and technique. Browne took on an enormously ambitious project, and only an astonishingly skillful writer and a masterly historian could have pulled it off. She has.

The decipherment of the Mycenaean script known as Linear B, inscribed on clay tablets discovered at Knossos and Pylos, was probably the most significant development in twentieth-century classical archaeology. Robinson reconstructs that elaborate code-breaking process with precision—or, rather, with as much precision as possible, given that he stresses the role intuition played. What makes this book so intriguing, though, is its depiction of the code breaker. Michael Ventris, the epitome of the gentleman scholar (he was an architect by training and profession), was a sweet, sad genius, and Robinson touchingly chronicles his life-long obsession with those mysterious clay tablets.

Books about Napoleon should, as Orwell said of saints, be judged guilty until proved innocent; the world needs many things, but not another pass at the Little Corporal—to whom more ink has been devoted than to probably any other historical figure except Jesus Christ. But Roberts, one of Britain's most talented and stylish young historians (his sparkling biography of Lord Salisbury still, alas, lacks a U.S. publisher), has written a brilliant work—an astute examination of Napoleon and Wellington's military rivalry in the Peninsular War and at Waterloo, a subtle and revealing character study of the two commanders, and a penetrating look at their evolving historical reputations and complex relationship (they shared, among other things, two mistresses).

This study of the 1919 Paris Peace Conference, written for the general reader, will—you'll see—get a lot of favorable critical attention. But bland and bloated (with a foreword by the appropriately superficial and self-important Richard Holbrooke), it falls between two stools. It's neither as entertaining and revealing as Harold Nicolson's irreverent and elegant (if somewhat dated) popular history Peacemaking 1919 nor as intellectually sparkling as Arno J. Mayer's commanding and controversial scholarly books Politics and Diplomacy of Peacemaking and Political Origins of the New Diplomacy. Better to read Nicolson for entertainment or Mayer (or such specialized studies as Lorna Jaffe's The Decision to Disarm Germany and Piotr Wandycz's France and Her Eastern Allies) for discerning scholarship.

"What was it like in the Middle Ages when darkness was nearly unbroken from the setting to the rising of the sun?" To answer this question Verdon mines his sources with dexterous imagination as he elucidates an extraordinarily wide swath of medieval life, from crime to sexuality to architecture to religion.

Through elaborate text and photographs this book helps to define the distinct look of the urban Midwest by examining the late-nineteenth- and early-twentieth-century aesthetic of the great Chicago architect Louis Sullivan and his disciples and imitators. Schmitt traces the development of a commercial midwestern architectural and decorative style, at once solidly handsome and sprightly, that is embodied as much in the banks and stores of Wisconsin and Minnesota downtowns as in Chicago's celebrated Carson Pirie Scott and Auditorium Buildings.

This unusually intelligent and straightforward cultural history (the topic virtually demands obfuscatory, PC-laden jargon) convincingly shows that our coming to view "biological sex"—the physical markers of femininity and masculinity—as malleable rather than immutable constituted one of the most profound moral, social, legal, and medical changes in twentieth-century America.

This vivid book takes readers through the daily life of European families at every economic level over three centuries. Sarti is a nosy guide: she barges into the houses of the past and shows us who hired a wet nurse and why, how often men changed their shirts, what toilets were like, how underwear fashions evolved, what kinds of soup people ate and when they ate it. With keen intelligence she explains what these facts reveal about relations between masters and servants, husbands and wives, parents and children. The publisher has recently issued a number of important, specific studies on the history of European family life, but this book, with its clear writing and wealth of arresting details, will fascinate and beguile the general reader.

September 17, 1862, the day Union and Confederate forces fought at the Battle of Antietam, was the bloodiest day in American history. More than 6,000 Americans were killed—well more than twice the number lost on D-Day. The battle was also the political and diplomatic turning point of the Civil War. McPherson's latest book (its publisher's lead title for the fall season) is a slim account of the battle that does little harm but little good, and hardly justifies its marketing campaign—which seems designed to synergize Civil War and 9/11-anniversary solemnity. Those looking for a detailed, dramatic, and authoritative narrative of Antietam will be far better off reading Stephen W. Sears's Landscape Turned Red.

Pritchett's graceful and elegiac portrait of London, its history, architecture, literature, and daily life, originally published in 1962, is finally back in print. I've never read a more penetrating or lovelier book on the city. The author, the most accomplished English critic of the second half of the twentieth century, concludes that London is above all a city of conversation, conversation that is "light, sociable, discursive, enquiring ... and is regarded as a relaxation and not as a means to an end."

The publisher has already issued the definitive texts of Lewis's greatest novels, Main Street and Babbitt, and with this volume readers now have easy access to all the author's important works from his most creative period, 1920-1930. Through their ceaseless, precise detail these books capture bourgeois midwestern life—its speech, its houses, its gadgets, its caste marks, its stultifying social rounds. Lewis's absorption in this milieu made him able, as E. M. Forster marveled, "to lodge a piece of a continent in our imagination."

In this reprint of his seminal, disturbing, and excoriating 1992 study, Carey argues that modernist literature is founded on an almost murderous loathing of common men and women. His argument is at times overdrawn, but Carey's assessment of Pound's, Joyce's, Woolf's, and Eliot's crude and unconscionable detestation of the newly educated reader is chilling, and his intricate dissection of the relationship between politics and aesthetics—along with his clean and scintillating style—should serve as a model for literary historians and cultural critics.

Whereas MacMillan's Paris 1919 takes a large subject and reveals little new about it, Hughes-Hallett's book examines a small topic—a single London dinner party in 1817, which included William Wordsworth, John Keats, and Charles Lamb—to illuminate an entire world. This discursive and lively book uses the party's conversation (which was recorded in great detail in the host's diary) to elucidate not only the personalities and work of the writers but also such subjects as surgical techniques, the London theater, and African exploration.

This exhaustive but trenchant biography limns Brittain's heartbreaking transformation from shallow jingoist in the First World War to committed pacifist in the Second—and thus, like Brittain's own memoirs, is as much a portrait of a generation as of an artist. With great sensitivity the authors focus on the central struggle in Brittain's professional life (and in that of many women writers): the effort to reconcile the attractions of domesticity and marriage with those of solitude and independence.

Bradsher, The New York Times's Detroit bureau chief for five years, examines the environmental, economic, and—most important—safety problems created by sport-utility vehicles. An intelligent reader will conclude from this meticulous and sober investigation that the makers of these behemoths have exploited a lucrative market of self-regarding urban and suburban consumers who care not a whit that by driving such menacing and wasteful machines they are committing a horrendously antisocial act.

The author, the foreign-affairs columnist for The New York Times, occupies the same position in the cultural and political landscape as that once held by Walter Lippmann—which will confirm for any independent-minded person that our civilization has utterly collapsed. Friedman's latest book, a collection of his columns, displays his peculiar propensity to be at once hokey and pretentious (a typical column takes the form of a memo to "The Arab Street"—a phrase signaling that the conventional wisdom is sure to follow—from "President Bill Clinton"). Friedman says in twelve words ("This book is the product of my own personal journey of exploration") what a competent writer could say in—actually, wouldn't say at all. What's worst about this book's publication, though, is the sickening display of mutual ingratiation on Charlie Rose that will, inevitably, kick off its promotional campaign.

Should you drink more coffee? Should you take melatonin? Can you train yourself to need less sleep? A physician’s guide to sleep in a stressful age.

During residency, I worked hospital shifts that could last 36 hours, without sleep, often without breaks of more than a few minutes. Even writing this now, it sounds to me like I'm bragging or laying claim to some fortitude of character. I can't think of another type of self-injury that might be similarly lauded, except maybe binge drinking. Technically the shifts were 30 hours, the mandatory limit imposed by the Accreditation Council for Graduate Medical Education, but we stayed longer because people kept getting sick. Being a doctor is supposed to be about putting other people's needs before your own. Our job was to power through.

The shifts usually felt shorter than they were, because they were so hectic. There was always a new patient in the emergency room who needed to be admitted, or a staff member on the eighth floor (which was full of late-stage terminally ill people) who needed me to fill out a death certificate. Sleep deprivation manifested as bouts of anger and despair mixed in with some euphoria, along with other sensations I’ve not had before or since. I remember once sitting with the family of a patient in critical condition, discussing an advance directive—the terms defining what the patient would want done were his heart to stop, which seemed likely to happen at any minute. Would he want to have chest compressions, electrical shocks, a breathing tube? In the middle of this, I had to look straight down at the chart in my lap, because I was laughing. This was the least funny scenario possible. I was experiencing a physical reaction unrelated to anything I knew to be happening in my mind. There is a type of seizure, called a gelastic seizure, during which the seizing person appears to be laughing—but I don’t think that was it. I think it was plain old delirium. It was mortifying, though no one seemed to notice.

His paranoid style paved the road for Trumpism. Now he fears what’s been unleashed.

Glenn Beck looks like the dad in a Disney movie. He’s earnest, geeky, pink, and slightly bulbous. His idea of salty language is “bullcrap.”

The atmosphere at Beck’s Mercury Studios, outside Dallas, is similarly soothing, provided you ignore the references to genocide and civilizational collapse. In October, when most commentators considered a Donald Trump presidency a remote possibility, I followed audience members onto the set of The Glenn Beck Program, which airs on Beck’s website, theblaze.com. On the way, we passed through a life-size replica of the Oval Office as it might look if inhabited by a President Beck, complete with a portrait of Ronald Reagan and a large Norman Rockwell print of a Boy Scout.

Why the ingrained expectation that women should desire to become parents is unhealthy

In 2008, Nebraska decriminalized child abandonment. The move was part of a "safe haven" law designed to address increased rates of infanticide in the state. As under other safe-haven laws, parents in Nebraska who felt unprepared to care for their babies could drop them off at a designated location without fear of arrest and prosecution. But legislators made a major logistical error: They failed to implement an age limitation for dropped-off children.

Within just weeks of the law's passing, parents started dropping off their kids. But here's the rub: None of them were infants. A couple of months in, 36 children had been left in state hospitals and police stations. Twenty-two of the children were over 13 years old. A 51-year-old grandmother dropped off a 12-year-old boy. One father dropped off his entire family: nine children, ages one to 17. Others drove from neighboring states to drop off their children once they heard that they could abandon them without repercussion.

Since the end of World War II, the most crucial underpinning of freedom in the world has been the vigor of the advanced liberal democracies and the alliances that bound them together. Through the Cold War, the key multilateral anchors were NATO, the expanding European Union, and the U.S.-Japan security alliance. With the end of the Cold War and the expansion of NATO and the EU to virtually all of Central and Eastern Europe, liberal democracy seemed ascendant and secure as never before in history.

Under the shrewd and relentless assault of a resurgent Russian authoritarian state, all of this has come under strain with a speed and scope that few in the West have fully comprehended, and that puts the future of liberal democracy in the world squarely where Vladimir Putin wants it: in doubt and on the defensive.

The same part of the brain that allows us to step into the shoes of others also helps us restrain ourselves.

You’ve likely seen the video before: a stream of kids, confronted with a single, alluring marshmallow. If they can resist eating it for 15 minutes, they’ll get two. Some do. Others cave almost immediately.

This “Marshmallow Test,” first conducted in the 1960s, perfectly illustrates the ongoing war between impulsivity and self-control. The kids have to tamp down their immediate desires and focus on long-term goals—an ability that correlates with their later health, wealth, and academic success, and that is supposedly controlled by the front part of the brain. But a new study by Alexander Soutschek at the University of Zurich suggests that self-control is also influenced by another brain region—and one that casts this ability in a different light.

“Well, you’re just special. You’re American,” remarked my colleague, smirking from across the coffee table. My other Finnish coworkers, from the school in Helsinki where I teach, nodded in agreement. They had just finished critiquing one of my habits, and they could see that I was on the defensive.

I threw my hands up and snapped, “You’re accusing me of being too friendly? Is that really such a bad thing?”

“Well, when I greet a colleague, I keep track,” she retorted, “so I don’t greet them again during the day!” Another chimed in, “That’s the same for me, too!”

Unbelievable, I thought. According to them, I’m too generous with my hellos.

When I told them I would do my best to greet them just once every day, they told me not to change my ways. They said they understood me. But the thing is, now that I’ve viewed myself from their perspective, I’m not sure I want to remain the same. Change isn’t a bad thing. And since moving to Finland two years ago, I’ve kicked a few bad American habits.

Modern slot machines develop an unbreakable hold on many players—some of whom wind up losing their jobs, their families, and even, as in the case of Scott Stevens, their lives.

On the morning of Monday, August 13, 2012, Scott Stevens loaded a brown hunting bag into his Jeep Grand Cherokee, then went to the master bedroom, where he hugged Stacy, his wife of 23 years. “I love you,” he told her.

Stacy thought that her husband was off to a job interview followed by an appointment with his therapist. Instead, he drove the 22 miles from their home in Steubenville, Ohio, to the Mountaineer Casino, just outside New Cumberland, West Virginia. He used the casino ATM to check his bank-account balance: $13,400. He walked across the casino floor to his favorite slot machine in the high-limit area: Triple Stars, a three-reel game that cost $10 a spin. Maybe this time it would pay out enough to save him.

A report will be shared with lawmakers before Trump’s inauguration, a top advisor said Friday.

Updated at 2:20 p.m.

President Obama asked intelligence officials this week to perform a “full review” of election-related hacking, and plans to share a report of the findings with lawmakers before he leaves office on January 20, 2017.

Deputy White House Press Secretary Eric Schultz said Friday that the investigation will reach all the way back to 2008, and will examine patterns of “malicious cyber-activity timed to election cycles.” He emphasized that the White House is not questioning the results of the November election.

Asked whether a sweeping investigation could be completed in the time left in Obama’s final term—just six weeks—Schultz replied that intelligence agencies will work quickly, because preparing the report is “a major priority for the president of the United States.”

A professor of cognitive science argues that the world is nothing like the one we experience through our senses.

As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.