
The crepes craze, which began in the 1960s, became intense in the 1970s. By the late 1980s it had all but disappeared.

But before crepes achieved popularity, they were almost unknown in the U.S. The exception was Crepes Suzette, thin, delicate pancakes with an orange-butter sauce and liqueurs that were often dramatically lit aflame at the diners’ table. Like Cherries Jubilee, Crepes Suzette usually only appeared on high-priced menus, such as the Hotel Astor [1908 quotation].

Before 1960 even fewer restaurants served savory crepes, and those that did seem also to have been expensive ones. In 1948 the Colony in New York City served Crepes Colony with a seafood filling. And in the late 1950s New York’s Quo Vadis offered Crepes Quo Vadis, filled with curried seafood and glazed with a white sauce, as hors d’oeuvres.

Although few Americans had ever eaten Crepes Suzette, it’s likely that the fame of this prized dish helped pave the way for the craze for creperies, restaurants primarily featuring crepes. Crepes were regarded as an exotic luxury dish that, by some miracle, was affordable to the average consumer, sometimes costing as little as 60 or 75 cents apiece around 1970.

Crepes enjoyed a mystique, offering a link to European culture and a break from the meat and potatoes that dominated most restaurant menus in the late 1960s and early 1970s.

At a time when America was seen as the world leader in modern ways of living – including industrially efficient food production — Europe was imagined as a romantically quaint Old World where traditional ways were preserved and many things were still handmade.

American creperies catered to their customers’ wish for a taste of Europe. With country French decor, servers in folk costumes, and names such as Old Brittany French Creperie and Maison des Crepes [pictured at top, Georgetown], diners were imaginatively transported to a delightfully foreign environment quite unlike the brand new shopping malls in which many creperies were located. Another exotic touch employed by quite a few creperies was to use the French circumflex mark in crêpes (which I have not done in this blogpost).

Filled with creamed chicken, ratatouille, or strawberries and whipped cream (etc.), crepes soon became a favorite lunch, dinner, and late-night supper for college students, dating couples, shoppers, and anyone seeking “something different.” Along with crepes, menus typically included a few soups, most likely including French onion soup, a spinach-y salad, and perhaps a carafe of wine.

San Francisco’s Magic Pan Creperie led the trend and, after being acquired by Quaker Oats in 1969, spread to cities across the country, with the chain eventually totaling about 112 locations. The first Magic Pan, a tiny place on Fillmore Street, was opened in 1965 by Paulette and Laszlo Fono, who came to this country in 1956 after the failed anti-Communist uprising in their native Hungary. A few years later they opened another Magic Pan in Ghirardelli Square, and Laszlo patented a 10-pan crepe-maker capable of turning out 600 perfectly cooked crepes per hour [pictured here].

As Quaker opened Magic Pans, the new restaurants invariably received a warm welcome in newspaper food pages. It was as though each chosen city had been “awarded” one of the creperies, usually situated in upscale suburban shopping malls such as St. Louis’s Frontenac Plaza or Hartford’s West Farms Mall. When a Magic Pan opened in Dallas’ North Park shopping center in 1974, it was called “as delightful a restaurant as one is likely to find in Dallas.”

Among Magic Pan amenities (beyond moderate prices), reviewers were pleased by fresh flowers on each table, good service, delicious food, pleasant decor, and late hours. Many of the Magic Pans stayed open as late as midnight – as did many independent crepe restaurants. [Des Moines, 1974]

In hindsight it’s apparent that creperies responded to Americans’ aspirations to broaden their experiences and enjoy what a wider world had to offer. It was a grand adventure for a high school or college French class or club to visit a creperie, watch crepe-making demonstrations, and have lunch. [below: student at the Magic Pan, Tulsa, 1979] But what one Arizona creperie owner called the “highbrow taco” did not appeal to everyone. The operator of a booth selling crepes at Illinois county fairs reported that hardly anyone bought them and that some fairgoers referred to them as creeps or craps.

I would judge that crepes and creperies reached the pinnacle of popularity in 1976, the year that Oster came out with an electric crepe maker for the home. Soon the downward slide began.

Quaker sold the Magic Pans in 1982 after years of declining profits. The new owner declared he would rid the chain of its “old-lady” image, i.e., attract more male customers. Menus were expanded to include heartier meat and pasta dishes.

Even though new creperies continued to open here and there – Baton Rouge got its first one in 1983 – there were signs as early as 1980 that the crepe craze was fading. A visitor to a National Restaurant Association convention that year reported that crepes were “passé” and restaurants were looking instead for new low-cost dishes using minimal amounts of meat or fish. A restaurant reviewer in 1986 dismissed crepes as “forgotten food” served only in conservative restaurant markets. Magic Pans were closing all over, and by the time the 20-year-old Magic Pan on Boston’s Newbury Street folded in 1993, very few, if any, remained.

Throughout the 20th century the number of mobility-impaired Americans grew – due to medical advances, lengthening lifespans, polio epidemics, wars, and rising rates of automobile accidents. In the late 1950s and early 1960s the problem of physical barriers confronting those using wheelchairs, braces, canes, and walkers began to get attention, largely as a result of activism by the disabled.

At first the focus was on public buildings, but it soon expanded to include commercial sites such as restaurants. One of the early efforts to ease the way was the publication of a 1961 Detroit guidebook that devoted several pages to describing features of two dozen popular restaurants that were at least minimally accessible. For instance, The Village Manor in suburban Grosse Pointe had a street-level front entrance and a ramp in back, as well as main-floor restrooms outfitted with grab bars. But several of the restaurants listed had steps at entrances, narrow doorways, restrooms too small to maneuver a wheelchair, and tables too low for wheelchair seating.

In 1962 the National Society for Crippled Children and Adults (NSCCA, an organization that had added “Adults” to its name during WWII) joined with the President’s Committee on Employment of the Handicapped (established in 1947) to launch a nationwide movement to change architectural standards and building codes so as to remove barriers affecting people with mobility limitations. This marked a new attitude acknowledging that handicapped people wanted to “do more things and go more places” but were blocked by the built environment. It was becoming apparent, reported one newspaper, that those “who were no longer ‘shut-ins’ were ‘shut-outs.’”

In 1963 the NSCCA began sponsoring surveys of public and private buildings, including restaurants. In various cities local volunteers equipped with measuring tapes recorded the width of doorways, the number of steps, the presence of ramps and elevators, and the placement and design of restroom facilities. Meanwhile, in New Jersey the Garden State Parkway altered its restaurants and restrooms for disabled travelers.

Overall, though, there was very little action. The surveys showed that accessibility in the United States – not only in restaurants, but in schools, courthouses, hospitals, churches, and all kinds of businesses – was rare. A 1968 survey of Oklahoma revealed that only 32 of the first 2,144 public facilities checked were fully accessible to anyone operating their own wheelchair, while 60% were entirely inaccessible. In Oklahoma City, the state’s capital, only one of the 20 restaurants surveyed at that point could accommodate a wheelchair user.

Official recognition of the problems presented by architectural barriers came in 1968, with passage of a federal law (the Architectural Barriers Act) decreeing that any building constructed even partly with federal funds had to be barrier-free. Although restaurants remained unaffected by the law, it was significant for demonstrating a growing recognition that accessibility problems arose from the environment as much as from the disabilities of individuals. It would, however, take another 22 years, until passage of the Americans with Disabilities Act in 1990, before serious attention was given to eliminating obstacles in all kinds of public facilities.

Despite a common (and illogical) attitude held by numerous restaurant owners that there was no need to make their restaurants accessible since disabled people did not frequent them, a few owners voluntarily removed barriers before the ADA passed. When the owner of the Kitchen Kettle in Portland OR remodeled in 1974, he built an entrance ramp and a low lunch counter. In Omaha, Grandmother’s Skillet, co-owned by Bob Kerrey, who had lost a leg in the Vietnam War (and later became governor of Nebraska and a U.S. senator), was designed in 1976 so that anyone in a wheelchair or on crutches could use it. In California, a builder constructed accessible homes as well as fast food restaurants with ramps and restroom grab bars in the mid-1970s.

In the 1980s it became fairly common for restaurant reviewers to note whether an eating place was accessible or, more likely, not. Most of America remained inaccessible. As irony would have it, that included much of Future World at Disney’s Epcot Center. Several fast food cafes there required patrons to get into a line formed by bars spaced too narrowly for wheelchairs. Even more depressing were the ugly letters advice columnist Ann Landers received in 1986 after she defended the right of a handicapped woman to patronize restaurants. “Would you believe there are many handicapped people who take great pleasure in flaunting their disability so they can make able-bodied people feel guilty?” wrote one reader.

Passage of the ADA was a big step forward, but it didn’t work miracles. Even in the late 1990s it took enforcement activity from the U.S. Justice Department to get some restaurants to comply. Friendly’s, a family restaurant chain, was fined and compelled to alter entrances, widen vestibules, and lower counters, among other changes. Wendy’s settled out of court and agreed to remove or widen zigzag lanes at their counters.

Although many restaurants have gone to great lengths to guarantee accessibility, problems remain. Even when a restaurant is in compliance, there’s a good chance that disabled patrons will have an uncomfortable experience. This was detailed beautifully in a 2007 NYT story by Frank Bruni titled “When Accessibility Isn’t Hospitality.” His dining companion Jill Abramson, then managing editor of the paper and using a wheelchair following an accident, found that even luxury restaurants could present dismal challenges to patrons with mobility limitations.

The salad bar most likely developed from the Americanized version of the smorgasbord, which by the 1950s had shed its Swedish overtones and turned into an all-you-can-eat buffet. The smorg concept lingered on for a while in the form of salad “tables” holding appetizers and a half dozen or so complete salads, typically anchored by three-bean, macaroni, and gelatin. Eventually someone came up with the idea of simply providing components in accordance with the classic three-part American salad, which structurally resembles the ice cream sundae: (1) a base, smothered with (2) a generous pouring of sauce, and finished with (3) abundant garnishes. Or, as a restaurant reviewer summarized it in the 1980s, “herbage, lubricant and crunchies.”

Whatever its origins, the salad bar as we know it – with its hallmark cherry tomatoes, bacon bits, and crocks full of raspberry and ranch dressings — became a restaurant fixture in the 1970s. Introduced as a novelty to convey hospitable “horn-of-plenty” abundance and to mollify guests waiting for their meat, it became so commonplace that the real novelty was a restaurant without one. Though strongly associated with steakhouses, particularly inexpensive chains, salad bars infiltrated restaurants of all sorts except, perhaps, for those at the pinnacle of fine dining. Salad bars were positively unstoppable at the Joshua Trees, the Beef ’n Barrels, and the Victoria Stations, some of which cunningly staged their salad fixings on vintage baggage carts, barrels, and the like.

Although industry consultants advised that a salad bar using pre-prepared items could increase sales while eliminating a pantry worker, restaurant managers often found that maintaining a salad setup was actually a full-time task. Tomatoes and garbanzos had a tendency to roll across the floor, dressings splashed onto clear plastic sneeze-guards, and croutons inevitably fell into the olde-tyme soup kettle.

The hygienic sneeze-guard came into use after World War II, first in school and hospital cafeterias. Although a version of it had appeared in commercial restaurants in the early 20th century with the growth of cafeterias, many restaurants served food buffet style into the 1950s and 1960s without any kind of barrier. In 1952 the Minneapolis Board of Health required that uncovered smorgasbords either install sneeze-guards or close down, but it seems their use did not become commonplace nationwide until the 1970s. Eklund’s Sweden House in Rockford IL thought the device novel enough to mention specifically in a 1967 advertisement. Massachusetts ordered sneeze-guards for restaurants with buffets or salad bars in 1975.

On the whole salad bars went over well with the public – and still do — but by the late 1970s professional restaurant critics were finding it hard to hide their disdain. Judging them mediocre, some blamed customers who were gullible enough to believe they were getting a bargain. Others were wistful, such as the forbearing reviewer in Columbia, Missouri, who confessed, “It would be a nice change to get something besides a tossed make-it-yourself salad, and to have it brought to the table.” The trend at the Missouri college town’s restaurants, however, was in the opposite direction. In the 1980s Faddenhappi’s and Katy Station ramped up competition by offering premium salad makings such as almonds and broccoli while Western Sizzlin’ Steaks pioneered a potato bar.

Chocolate concoctions have always been found in the dessert section of restaurant menus. Right? You’ve already figured out that I’m going to say no. But, naturally, it’s a bit more complicated than that.

Until the later 19th century the main form in which Americans consumed chocolate in public eating places was not as a dessert but as a hot beverage.

Confusion arises over the meaning of dessert, a term used in various ways on American menus. In the 19th century, dessert often was the very last course, coming after “Pastry,” which included pies, cakes, puddings, and ice cream. In this case dessert meant fruit and nuts. But sometimes ice cream was listed under dessert. For example, the Hancock House hotel in Quincy MA arranged its menu this way in June of 1853 [menu pictured].

In cheaper eating places there was no fruit-and-nuts course, and dessert came closer to what we mean today. That is how I will use the word for the rest of this post: sweet dishes that come toward the end of the meal, rarely nuts and usually something other than simple fruit.

The absence of anything chocolate on the Hancock House menu was not unusual for that time. I looked at quite a lot of menus – of course only a fraction of those still existing – and the first instance of chocolate other than as a beverage that I found was chocolate ice cream in the 1860s. It was not too unusual to find chocolate eclairs on a menu in the later 19th century, and chocolate cake turned up in the 1890s. According to an entry in The Oxford Companion to Food and Drink, however, chocolate cake in the late 1800s could refer to yellow cake with chocolate frosting.

By the early 20th century chocolate appeared on menus in various forms: as pudding, layer cake, devil’s food cake, ice cream, eclairs, and ice cream sodas and sundaes. In the 1920s chocolate shops, similar to tea shops, appeared, offering light meals, desserts, and chocolate as a drink or as candy. They were popular with women, as were department store tea rooms, another type of eating place heavy on sweet things. At Shillito’s department store in Cincinnati, a 1947 menu offered quite a few chocolate treats.

A chocolate frenzy began in the 1970s, reached a high point in the 1980s, and continues today. With the help of restaurant marketing, millions of Americans discovered they were “chocoholics.”

If you stepped into San Francisco’s Pot of Fondue in 1970 you could do Cheese Fondue for an appetizer, Beef Bourguignonne Fondue as a main dish, and Chocolate Fondue for dessert. But the Aware Inn in Los Angeles pointed more forcefully at dessert trends to come with its 1970s “dangerous Chocolate Cream Supreme” costing $2 and described as “somewhere between chocolate mousse and fudge.”

Adjectives such as “dangerous” continued the sinful metaphor conveyed earlier by “devil’s food.” Soon “special” chocolate desserts were named for immoral inclinations (“decadence”) or perhaps fatal pleasures (“death by chocolate,” “killer cake”). All this led at least one journalist to protest against the unsubtle marketing of chocolate desserts in the 1980s. She pleaded with servers: “Do not expect me to swoon when you roll back your eyes in ecstasy as you recite a dessert list that offers nothing but chocolate, via cheesecake, chip cake, profiteroles, madeleine, mousse, bombe, eclair, napoleon, torte, tart or brownie.”

Restaurant reviews from the 1980s show that most reviewers jumped on the chocolate bandwagon, with descriptions along the lines of “scrumptious” chocolate desserts “to die for.” But quite a few were critical, especially of chocolate mousse, which was readily available to restaurants powdered or wet, even “pipeable.” After a 1978 visit to a restaurant expo overflowing with convenience food products, the Washington Post’s restaurant reviewer Phyllis Richman observed, “The final insult of your dinner these days could be chocolate mousse made from a mix, but that is only another in the long line of desecrations in the name of chocolate mousse.” Critical reviewers often deplored chocolate mousse that tasted as if made of instant pudding mix combined with a non-dairy topping product, which very likely it was.

“Chocolate Decadence” cake took a beating in a review by Mimi Sheraton who in 1983 no doubt irritated many chocolate lovers when she referred to the prevalence of “dark, wet chocolate cake that seems greasy and unbaked, the cloying quality of such a sticky mass being synonymous with richness to immature palates.” More recently, what I call a “fantasy escape” restaurant in upstate New York was cited unfavorably for serving a boxed cake provided by a national food service that it merely defrosted, sprinkled with fresh raspberries, grandly named “Towering Chocolate Cake,” and placed on the menu for a goodly price.

Let the buyer beware, though no doubt many restaurant patrons do in fact realize that they are willing co-conspirators in fantasy meals. Along these lines, nothing can be too chocolate-y, triple obviously outdoing double. Decorations of some sort are de rigueur. Along with whipped cream, ultra-chocolate desserts might be adorned with slivers of orange rind, raspberry sauce, or dripping frosting. In 1985 the Bennigan’s chain brought its “Death by Chocolate” into the world: two kinds of chocolate ice cream and chopped-up chocolate candy bars on a chocolate cracker crust, the whole thing dipped in chocolate and served with chocolate syrup on the side.

One theory about what brought about restaurants’ chocolate dessert blitz relates it to declining sales of mixed drinks in the 1980s as patrons became aware of the dangers of drinking and driving. Then, according to a 1985 Wall Street Journal story, elaborate, expensive desserts offered a way to make up for lost cocktail sales. Fancy desserts are undoubtedly higher-profit items than many entrees, but I suspect that another major factor favoring the rise of ultra-chocolate desserts was the culture of consumer indulgence that increased restaurant patronage in the 1970s, 1980s, and beyond.

Nothing decorated more restaurant plates in the 20th century than parsley, most of it by all accounts uneaten.

Why use so much of what nobody wanted? The best answer I can come up with is that parsley sprigs were there to fill empty spaces on the plate and to add color to dull looking food.

Parsley was not the only garnish around, but it has probably been the most heavily used over time. It has shared the role of plate greenery with lettuce, especially after WWII when lettuce became readily available, and to a lesser extent with watercress.

Parsley has long been a favorite in butcher shops where it is tucked around steaks and roasts. As early as 1886 restaurants were advised to emulate butchers and decorate food in their show windows with “a big, red porterhouse steak, with an edge of snow-white fat, laid in the center of a wreath of green parsley.” By the early 20th century, almost the entire U.S. parsley crop, more than half of which was grown in Louisiana and New York, went to restaurants and butchers. By 1915 parsley sprigs were a ubiquitous restaurant garnish that many regarded as a nuisance. Diners sometimes suspected that the parsley on their plate had been recycled from a previous customer.

While European chefs use garnishes as edible complements to the main dish, Americans have focused primarily on their visual properties.

Around 1970, when convenience foods invaded restaurant kitchens, garnishes took on heightened significance in jazzing up lackluster, monochromatic frozen entrees. In the words of the Convenience and Fast Food Handbook (1973): “The emergence of pre-prepared frozen entrees on a broad scale has revived the importance of garnishing and in addition, has led to innovative methods of food handling, preparation and plating. If an organization is to achieve sustained success in this field, emphasis must be placed on garnishing and plating. These are the two essentials that provide the customer with excitement and satisfaction.” [partial book cover shown above, 1969]

Excitement?

The head of the Southern California Restaurant Association admitted in 1978 that he hated to see all the food used as garnishes go to waste in his restaurant, including “tons” of lettuce. But this was necessary for merchandising, he said: “We have to make food attractive. It’s part of the cost of putting an item on the table.” It was – and is – probably true that an ungarnished plate such as shown here looked unattractive to most Americans.

So many garnishes decorated food in American restaurants in the 1970s that food maestro James Beard got very grumpy about it, calling it stupid and gauche. He could allow watercress with lamb chops or raw onion rings on a salad, but put a strawberry in the center of his grapefruit half and he was outraged. Next to orange slices and twists, his most detested “tricky” garnishes were tomato roses and flowers. Funny that he didn’t mention radish roses such as the one shown above.


Last week my brother found the following curious notice in his local newspaper offering aggrieved consumers a free pickle, cookie, or soda (valued at $1.40). The offer was the result of the settlement of a class action lawsuit by a woman who failed to get sprouts on her sandwich as a Jimmy John’s menu had promised.

I could not help but appreciate that the claimant was a resident of California, the state that originated Truth in Menu laws (aka Truth in Dining), which demand, under penalty of fines, that restaurants provide exactly what is stated on their menus.

Menu advertising is covered under a variety of consumer protection laws, but many people have felt that restaurants’ misrepresentations deserved more focused attention. Ralph Nader, from a restaurant family himself, may have been the first to call for a Truth in Menu law, in 1972. The first attempt to enact such a law, in the form of a city ordinance, came in San Francisco in 1974 under the sponsorship of Dianne Feinstein, then president of the Board of Supervisors and later a U.S. Senator from California.

The impetus behind the San Francisco ordinance was to stop restaurants from serving convenience entrees that had been prepared elsewhere, frozen, and reheated in the restaurant without being identified as such, leaving diners to believe they originated in the restaurant’s kitchen. Also at issue were the restaurants at Fisherman’s Wharf that purported to serve locally caught fish yet were known to substitute frozen fish shipped in from other states. Restaurant owners such as Tom DiMaggio, brother of baseball’s Joe DiMaggio and owner of DiMaggio’s Restaurant, argued that they had to fall back on frozen fish at times when fresh-caught local fish was not available. DiMaggio admitted to serving frozen prawns from Louisiana. Proponents of the Truth in Menu law, however, claimed that some of the Wharf’s restaurants regularly served nothing but frozen fish.

San Francisco’s Board of Supervisors chose not to pass the ordinance, but Los Angeles took up the cause and became, probably, the leading enforcer of menu honesty. Other states and localities also adopted such laws but their enforcement has tended to be weak. The 1970s was the high point for restaurant inspections and TiM enforcement. Fines were issued for margarine referred to as butter, Maine lobster not from Maine, real maple syrup that wasn’t, frozen entrees touted as home-made, 8 oz prime steaks that weighed less and were lower grade, chicken and veal dishes made of turkey or pork, and fish that wasn’t what its name implied. As “home-made” became “home-baked,” restaurants learned to play it safe with their claims, as the postcard image above shows. Menu printers did a brisk business.

The use of frozen entrees eventually became an accepted practice in many restaurants as consumers grew comfortable with dishes prepared in a factory and microwaved in the restaurant’s kitchen. Restaurants are not required to acknowledge that they serve frozen entrees (as the Feinstein ordinance would have required), and many customers would not be horrified by the revelation: as long as a dish tastes good and costs less than food made on-site from scratch, that’s fine with them.

Where do things stand today? Restaurant chains are the most likely targets for lawsuits and have been diligent in avoiding false claims. Elite restaurateurs wither at the very notion that they could use convenience foods or mislabel anything. Yet misrepresentations certainly occur, sometimes even among the staunchest supporters of truthfulness.

There’s the meat glue scandal in which chunks of beef were glued and pressed into shape as filet mignon.

But, if there is a single type of food most likely to be misrepresented on menus it is fish. Not too long ago I ordered grouper in Florida at three different restaurants. Each time it was quite different, indicating that at least twice I was served something else. As recent investigations show, fish misidentification is rampant among restaurants, suppliers, and retailers, always involving the substitution of a less expensive fish for a more expensive one.

It is always gratifying to find a piece of ephemera that marks a transition. The postcard above is blank on the back, probably to allow McDonald’s franchises to imprint it with their locations as they completed the changeover from the old building style to the new in the 1970s.

The old-style McDonald’s was based on the original design of California architect Stanley Meston, who had once worked for Wayne McAllister, noted designer of modernistic 1930s drive-ins. The design was made for the McDonald brothers, who in the 1950s had begun to franchise their California drive-in.

When Ray Kroc obtained a franchise from the brothers and spread McDonald’s outside the West and across the nation, he made modifications to Meston’s design, simplifying the arches and adding a glass-enclosed vestibule to the front as shown on the postcard at the top.

The original Meston design was of an exuberant style known as “Googie” that featured eye-catching elements such as swooping roofs, extensive plate glass, neon, and the use of shiny industrial building materials (but sometimes also lava stone as shown in Pick’s). I recommend the books Googie and Googie Redux by Alan Hess, which I have drawn upon for this post, along with Orange Roofs, Golden Arches by Philip Langdon.

In the 1960s Kroc’s McDonald’s (he had bought out the McDonald brothers in 1961) began to run up against resistance from local zoning boards that wanted something more restrained than the “franchise schlock” look of the golden arches model. In 1968 the corporation went to work on a new design for a brick-faced building with a dark mansard-style roof and indoor seating. “We have taken off the gaudy materials and eliminated the circusy atmosphere,” said a McDonald’s executive in charge of design. The arches, on their way to becoming an ever-smaller letter M logo, were relegated to the sign. The first mansardized McDonald’s opened in the Chicago suburb of Matteson in 1969.

The little red, white, and yellow stands began to disappear. By 1972 most of them, about 75%, had been remodeled or replaced, leaving only about 250. By 1980, fewer than 50 remained out of a total of 5,082 McDonald’s in the U.S. Preservationists in Oregon and Virginia tried to have old-style McDonald’s placed on historic preservation lists on the grounds that they were symbols of America; they were turned down. By 1990 only five remained. A McDonald’s in Downey CA, which opened in 1953, has been preserved, probably the only remaining example of the original design other than the corporation’s recreation of Kroc’s first unit in Des Plaines IL.

The cultural climate that brought McDonald’s and other fast food restaurants into contention with critics who sought to keep Googie buildings out of their towns and neighborhoods was in stark contrast to the optimistic futurism exhibited at the 1964-65 New York World’s Fair. Philip Langdon has used the term “the browning of America” for the turn away from buildings that were shiny, colorful, and blatantly commercial to ones that were low-slung, dark, and of natural looking materials. He suggested this shift signified a downcast attitude toward America. “The demand for a less garish roadside strip, when combined with other currents in the culture – a growing awareness of the nation’s faults and a fading away of the once-euphoric attitude toward futuristic technology – fostered a more subdued esthetic,” he wrote.

But another interpretation would explain the change as a progressive corrective to the post-WWII abandonment of nature, as evidenced in commercial roadside strips, napalm warfare, chem-lab convenience foods, and the widespread despoliation of the environment.
