I see that Thomas Levenson is on the warpath against my post on what constitutes a 1953 kitchen. How can I say that 1953 kitchens didn't have stand mixers when the KitchenAid was invented in 1919? The blender not much later? WHY DON'T I KNOW HOW TO USE THE INTERNETS!!!!!

It's certainly possible that I got something wrong in that post; I am of course not a historian of kitchens. But neither is Mr. Levenson, and I can't plead guilty to the offenses I am charged with based on his evidence.

Perhaps I was unclear that I was talking about things that the typical 1953 kitchen had, not the things that had been invented--the microwave oven had also been invented, in the late 1940s, but I think we can all agree that this was not a typical feature of kitchens of the era. Drip coffee (the Chemex), stand mixers, blenders, and microwaves had all been invented, but they weren't widely diffused. In context, I think this is obvious:

For that matter, how should we define what a 1953 kitchen was? Is it a kitchen with anything that had been invented by the time? Or is it a kitchen with the things that an average income family could afford? Surely it must matter not merely that something existed, but that it was cheap enough to become widespread?

As it happens, my kitchen--a galley kitchen in an urban apartment--was probably typical of 1953 in terms of major appliances (a stove and a refrigerator) and cupboard space. And yet, in some of the most important respects, it still wasn't a 1953 kitchen. 1953 kitchens did not have electric drip coffee brewers, stand mixers, blenders, food processors, or crock pots.

For some reason, Levenson has snipped off the context. Taking just the last sentence, he then launches into a diatribe about how wrong I am:

Stand mixers in the 1950s? Oh, you mean the standing mixer invented in 1908 by Herbert Johnson, sold to commercial bakers in 1915, and released for the home as the KitchenAid Food Preparer in...wait for it...1919? Sunbeam released its cheaper alternative in the '30s, and in 1954, (that kitchen of the 50s thing again) one could actually purchase a KitchenAid in a color other than white.


Blenders? Same story. The blender was invented in 1922 first as a tool for soda counters, and the iconic Waring Blender hit the market in 1937. By 1954, one million had been sold. As a sidenote, the Vitamix Corporation introduced a competitor to the Waring machine, and in 1949 sold it with the aid of a thirty minute broadcast on a brand new medium: WEWS TV in Cleveland, in what is thought to be the first ever direct response ad.

You get the idea. In the list above, food processors and slow cookers are in fact inventions that have their roots in the sixties and their commercial release in the early 70s. Give McArdle that -- but the point to take away from this is that in a list of five statements of fact, McArdle gets two wrong unequivocally, is deceptive in a third case (there were no automatic drip coffeemakers, but automatic makers using other brewing methods were readily available) and right only in two cases. .400 may be fabulous in baseball. In journalism, it wouldn't even propel you to the Cape Cod League.

I'm sorry that Mr. Levenson thinks I was being deceptive, but I thought it was obvious why I was specifying drip coffeemakers: at least for people from my generation, percolated coffee tastes horrible. I do not think of percolators as drip coffee equivalents; I think of them as something close to a crime against humanity. As for the rest, my understanding is that the stand mixer was not widely diffused in American households until the early 1960s; the stand mixer invented in 1919 was commercial grade; the home versions appeared in the 1930s and sold well, but were somewhat derailed by the dearth of consumer production during World War II. Of course, if anyone has data better than I was able to find, I am open to correction.

Some of this, however, may simply be an argument about what constitutes "common"; Sunbeam sold over a million mixers before World War II, but there were 31 million family households counted in the 1940 US census, and 38 million ten years later. The hand mixer didn't become available, as far as I know, until Sunbeam's handheld Mixmaster in 1952. The blender, according to research on appliance diffusion, wasn't found in as many as 20% of US households until 1967 (it hit 50% three years later). Again--are we looking at "typical" or "available"? After all, we don't think of television as having been a feature of American life in 1949, even though some percentage of US households had one. I lean towards requiring some amount of broad diffusion, but I can see the argument for the other side--indeed, that's what I was getting at in the post, that the definition of a "1950s kitchen" is tricky.

Then there's this, in which he supports my point while somehow arguing that this proves me wrong:

...aside from the privileged few who could afford copper, most Americans were cooking on thin, low-quality stainless steel and aluminum pans that deformed easily...

McArdle knows this how? It's a pretty bald declaration that would have come as a shock to a company like Lodge (founded 1896) or Wagner (founded 1891). And if you want to think about the availability of high-end cookware aimed at more regular folks, what about the company born of a trip to Paris in 1952, on which Chuck Williams first encountered "classic French cooking equipment like omelet pans and souffle molds whose quality I'd never seen in the U.S." Williams opened his first store in 1956 in the then very ordinary small-town farming community of Sonoma, California. Williams-Sonoma proved to have legs, I believe.

Lodge makes a very fine cast iron product, but unlike some of the die-hard fans, most people do not want to spend all their time using cast iron, because there are many things it isn't good for, and it's very heavy--which is why women abandoned it pretty quickly as new technologies became available. In fairness, in this case, my assertion is based on personal, not academic, research: in the cookbooks, advertisements, cinema and television of the time, the pans are simply much flimsier than what a Wal-Mart chef is now used to, a view that is confirmed when you come across relics from the era in thrift stores or Grandma's kitchen. If Mr. Levenson has contrary research, I will be pleased to retract my statement. However, his efforts so far seem to bolster, rather than weaken, my case.

But Mr. Levenson and I may simply differ on how we read the documents of the era; he criticizes my reading of the Betty Crocker 1950 Picture Cookbook by noting that many cookbooks of the era were still showing the legacy of rationing. This is certainly true of British cookbooks (rationing there ended in the mid-fifties), but it seems like a strange thing to say about American ones. Here, rationing of meat and butter started in the spring of 1942 and ended in the late fall of 1945, a shorter span than the distance between the end of rationing and the publication of the Betty Crocker cookbook (which remained the bestselling non-fiction book of its decade). To me, cookbooks pre- and post-rationing look more like each other (in the ingredients they call for) than the recipes I've read from the war years, which are heavy on alternate foods like (unrationed) organ meats and "meat loaf" made of nuts, beans and grains. But perhaps Mr. Levenson has a different experience, or some research that I haven't seen.

Mr. Levenson also seems to think that I am unaware of the invention of the refrigerated train cars which gave birth to the vast Chicago meat packing industry so memorably muckraked in Upton Sinclair's The Jungle (1906).

And how about this:

I don't believe that they have gone without fresh produce for six to eight months at a time, as my mother did in her childhood-and was told to be grateful for the frozen vegetables which hadn't been available when her mother was young...Is the shift to flash frozen produce greater, or less great, than the shift from flash frozen to the fresh produce made possible by falling trade barriers, rising air travel, and the advent of container shipping?

There's a lot wrong with this little passage, but here, let me just point out that McArdle is simply wrong in what she implies here about the history of the transport of refrigerated food. The earliest prototype of a mechanically cooled railroad car received a US patent in 1880. It certainly did take a long time for that to yield practical diesel-powered refrigeration on rails, but the use of natural ice for refrigerating specially designed rail cars--"reefers"--dates back to the mid 19th century. By the early 1880s, the Swift company were using ice-cooled cars to deliver 3,000 carcasses a week from the midwest to Boston. When ice production on industrial scale took off around the turn of the twentieth century, refrigeration on rails became so pervasive that 183,000 reefer cars were on US rails by 1930.

All of which is to say that the delivery of fresh food to locations distant from production is something that has evolved over the last century and a half--and is not simply, or even mostly, the result of falling trade barriers, air transport or containers.

Or, in other words, McArdle--again--knows not whereof she speaks.

I am afraid that Mr. Levenson seems to have somehow confused "container shipping" with refrigerated boxcars. While his gloss on their Wikipedia entry is interesting, it's sort of beside the point. I wasn't referring to our ability to ship produce in some sort of container, which predates the boxcar, and probably the invention of the box. Nor was I referring to the invention of the refrigerated train car, which I took as a given (doesn't "flash frozen" produce also imply refrigerated transport to most people?). Rather, I was referring to the revolutionary impact that containerization has had on global commerce.

I assume that if someone who teaches science writing at MIT is so unfamiliar with the containerization revolution that he can mistake "container" for "refrigerated", then it must be much less common knowledge than I had thought. So apologies to my readers for not explaining myself better. For those not familiar with containerization and its impact on global logistics, I hope you'll be able to take in the Smithsonian's fantastic exhibit on the topic. The idea of packing everything you want to ship in a standard-size container designed to move smoothly between ships, trains and trucks is simple--I don't know any obvious reason that we had to wait so long to develop it.

But the fact is that we didn't develop it until the late 1950s, and it was in the 1960s that containerization really took off, transforming logistics and making possible a huge increase in global shipping traffic as costs fell and shipping times shrank. As Wikipedia says, "The impact on society of reefer containers is vast, allowing consumers all over the world to enjoy fresh produce at any time of year and experience previously unavailable fresh produce from many other parts of the world." I don't think I have to explain why this trend was accelerated by faster, cheaper, more plentiful air cargo space, and falling trade barriers.

When my mother was growing up, it was theoretically possible to buy fresh strawberries or asparagus in December, shipped in a refrigerated boxcar from a hothouse or maybe California. But I doubt the grocers in her small town would have stocked them, and if they did, my grandparents, who were solidly middle class but whose memories of the Great Depression died hard, would never have dreamed of buying them; they would have cost a fortune. Now most people have access to a dozen kinds of lettuce every day of the year.

This is exactly the sort of question I was trying to get at with the original entry. Containerization wasn't even one of those inventions that was waiting on a lot of prior technology development, as far as I can tell; by 1956, it was just waiting for someone to think of it. And then it took society several decades to really process the implications (the longshoremen's strike in California a few years back, and the panic over Dubai Ports World operating a US port, are both part of the transformations that are still going on, 55 years after Malcolm McLean stuck that first load of truck trailers on a cargo ship bound for Texas). But the end result has been that fresh produce is amazingly more available than it was to my grandmother when she was my age.

This has not exactly been a technological revolution, though things like computer networks have certainly aided it. Should it count towards "kitchen innovation"? A hard question. The biggest change happened outside of the kitchen itself. And unlike earlier innovations, it hasn't necessarily made us more productive, in the way that a refrigerator, by making storage easier, probably let a housewife spend less time shopping and cooking--in fact, judging by the number of "mix one can of mushroom soup into two cans of string beans" recipes I see in sixties cookbooks, maybe it's made us work harder. On the other hand, it's a pretty large improvement in our quality of life. Shouldn't that count?
