Posts from the ‘history’ Category

One of my favourite scenes in Alice in Wonderland is when the Caterpillar asks Alice ‘Who are YOU?’ Having spent the day being shrunk, telescoped, and grown again, Alice is at a loss: ‘I—I hardly know, sir, just at present—at least I know who I WAS when I got up this morning, but I think I must have been changed several times since then.’ During a period obsessed with lineages, classes, and groups, Alice’s inability to slot herself into the correct category feels profoundly transgressive. Her ontological uncertainty—she remarks to the Caterpillar ‘I can’t explain MYSELF…because I’m not myself’—is more mature than the Caterpillar who will, as Alice argues, turn into a chrysalis and then a butterfly. Nobody is one thing for very long.

The same can be said, of course, for confectionery. Periodically, Britain convulses in a fraught debate over the status of the Jaffa Cake. In their commercial form these are rounds of Genoise sponge topped with orange jelly, and covered with chocolate. Supermarkets sell bright blue packets of McVitie’s Jaffa Cakes in the same aisle as Digestive biscuits, Hobnobs, and shortbread. So to the uninformed, the Jaffa Cake is – despite its name – a biscuit.

But is it really? Legally, the Jaffa Cake qualifies as a cake. A long and complicated court case in 1991 ruled in favour of McVitie’s, confirming that the Jaffa Cake is indeed a cake and should not, then, be subject to VAT. Harry Wallop explains:

In the eyes of the taxman, a cake is a staple food and, accordingly, zero-rated for the purposes of VAT. A chocolate-covered biscuit, however, is a whole other matter—a thing of unspeakable decadence, a luxury on which the full 20pc rate of VAT is levied.

McVitie’s was determined to prove it should be free of the consumer tax. The key turning point came when its QC highlighted that cakes harden when they go stale, whereas biscuits go soggy. A Jaffa goes hard. Case proved.

So this is a confection which looks like a biscuit but is really a cake.

Orange trees in Perth, Australia.

But this ontological uncertainty extends beyond its position as cake or biscuit. Jaffa Cakes are named after Jaffa oranges. (McVitie’s never trademarked the name Jaffa Cake, so chocolate-and-citrus flavoured confections are often described as ‘Jaffa.’) These oranges were developed in Palestine – in and near the port city of Jaffa – during the 1840s. Sweet, seedless, and with a thick rind which made them perfect for transporting, Jaffa or Shamouti oranges became Palestine’s most important export in the nineteenth century. The arrival of Jewish immigrants in the 1880s and 1890s revolutionised citrus growing in the region. These new arrivals introduced mechanised, ‘scientific’ forms of agriculture, dramatically increasing yields.

By 1939, Jewish and Palestinian farmers – occasionally working collaboratively – together employed some 100,000 people, and exported vast numbers of oranges abroad. Britain was a major importer of Jaffa oranges, particularly after Palestine became a Mandated territory under British control in 1923. The Empire Marketing Board – which promoted the sale of imperial produce – urged Britons to buy Jaffa oranges, something picked up by McVitie’s in 1927 with the invention of the Jaffa Cake.

An Empire Marketing Board advertisement for Jaffa oranges.

Jaffa oranges were – and, to some extent, are – held up as an example of successful Palestinian and Israeli co-operation during the interwar period. But after 1948, the same oranges became a symbol of Israel itself. Similar to the boycott of Outspan oranges during apartheid, organisations like BDS have urged customers not to buy Jaffa oranges as a way of weakening Israel’s economy and demonstrating their commitment to a free Palestine. (Jaffa oranges are no longer, though, a major Israeli export, and are grown in Spain, South Africa, and elsewhere.)

The changing meanings of Jaffa Cakes – cake, biscuit – and their constituent ingredients – symbol of collaboration, symbol of oppression – show how the categories into which we slot food are themselves constructs. (We could, really, compare apples and oranges.) But also, the Jaffa Cake helps to draw our attention to how taxes, trade agreements, and the politics and practicalities of shipping shape the ways in which we eat, buy, and think about food. Last year, the supremely British McVitie’s – producer of the Jaffa Cake, the most widely recognised biscuit (I mean, cake) in Britain – was sold to Yildiz, a food group based in … Turkey.

Last week some friends and I had supper at the Cube Tasting Kitchen. I should emphasise at the outset that, for all that I write a blog about food, I’m not a huge fan of the mad flights of fancy which characterise fine dining at the moment. I’m not into molecular gastronomy. I think it’s really interesting—and for a number of reasons, not only culinary—but given the choice between that and the sublime comfort food served at The Leopard and Woodlands Eatery, pizza at Stella e Luna, or dim sum at the South China Dim Sum Bar, I’d probably choose one of the latter.

But Cube was, really, entirely wonderful. And fun. It’s a small, box-shaped, white-walled restaurant in Joburg’s Parktown North, in a row of good and unpretentious middle-range restaurants, including Mantra, which is one of my favourite places at which to eat saag paneer. It was an evening of delights over fifteen courses. We began with six starters, each themed around a vegetable—tomato, cucumber, cabbage, potato—or a deconstructed—pissaladière—or reconstructed—Parmesan ice cream with balsamic vinegar made to look like vanilla ice cream and chocolate sauce—version of a familiar dish. The cucumber came with a gin cocktail, the cabbage soup was blue and then turned purple, and the Parmesan ice cream didn’t really work.

Blue cabbage soup…

…that turns purple. (Apologies for the grainy photographs.)

That was okay, though. The fact that not every course was an absolute success was part of the fun. The infectious enthusiasm of the young chefs—who cook almost in the middle of the restaurant—and of the serving staff turned this into a game and an adventure. I had vegetarian main courses. The oddest, but most successful, was a combination of asparagus, hummus, and shards of meringue with black pepper. The most delicious was a mushroom soufflé and a curry reduced to its most basic elements. The most beautiful was a Jackson Pollocked plate of beetroot and leek, which was also, paradoxically, the least flavourful.

Beetroot and leek.

And pudding—after baklava and cheese, and a palate cleanser of sherbet, pomegranate jelly, and orange sponge consumed as you would tequila with salt and lime—was a forest floor of pistachio marshmallow, rice crispy and cranberry cookies, chilled chocolate mousse, dried flower and chocolate soil, coffee biscuits, lemon gel, and wheat grass. Then there were chocolate brownies and coconut ice.

Forest floor pudding.

The size of the portions and the length of time it took to eat all of this—we were there for more than three hours—meant that we could digest at leisure. Because this was as much an intellectual and sensory exercise as it was supper. It would be easy to criticise this kind of dining on the grounds that its purpose is not really to feed people: it uses good, expensive food to allow fairly wealthy paying customers to have fun. But it is equally true that food has always been about more than nutrition. Human beings have long consumed—sacrificed—food in the name of status and power, in performing rituals, and marking celebrations.

It is, though, interesting that molecular gastronomy—which has its roots in the nouvelle cuisine of the 1970s and 1980s—came to prominence before and during the 2008 crash, in a period marked by ever widening social and economic inequality. (On a side note, it’s worth thinking about relative definitions of wealth: our meal at Cube was expensive, but within the realms of financial possibility even for someone on a fairly modest researcher’s salary. I would never be able to afford the same menu at a similar restaurant in London, for instance.) Molecular gastronomy does not—despite the grandiose claims of some of its practitioners—represent the future of food.

It does, though, represent the past. What sets the foams, pearls, and flavoured air of molecular gastronomy apart from other iterations of fine dining is its reliance on technology. Indeed, the twin gurus of this kind of cuisine—academics Nicholas Kurti and Hervé This—were interested in researching the chemical processes which occurred during cooking. Their acolytes—from Heston Blumenthal to Ferran Adrià and René Redzepi—have used this knowledge to disrupt, deconstruct, reconstruct, and undermine what we think of as ‘food.’

This work, though, does not fundamentally challenge our eating habits and choice of things to eat. Noma might serve insects and Blumenthal may have invented snail porridge, but molluscs and insects have been part of human diets for a very long time. I think that a more accurate name for molecular gastronomy is, really, modernist cuisine—the title of Nathan Myhrvold’s 2011 encyclopaedic guide to contemporary cooking. In all of its reliance on and enthusiasm for technology, molecular gastronomy is supremely modern: this is the food of industrialisation. It is as heavily processed as cheese strings. Modernist cuisine is the logical extreme of an industrialised food system.

On my fridge, I have a collection of business cards from cafes and shops visited on trips abroad. This afternoon—months late—I added another few from a recent month-long stay in Canada and the US, and I was reminded of a fantastic breakfast at the August First bakery in Burlington, Vermont.

I was in Burlington for a conference and spent a couple of days beforehand working and wandering around a small university town – I grew up in a small university town so I have a professional interest in them – which has a reputation for extraordinarily progressive and inclusive politics. There were posters advertising make-your-own banjo classes (out of gourds, apparently), vegan Thanksgiving, and homebrew nights; the local Democratic party was next door to a Tibetan dumpling shop; and I have never been so aware of the plight of the Dalai Lama as I was in the week I spent in Vermont. And there was the most amazing co-operative, which had a wall – a wall! – of granola. Progressive America is, truly, the most amazing place. (In a similar vein, Ann Arbor’s community co-op is opposite a Birkenstock shop.)

I had, then, granola at August First. And it was wonderful granola, with whole walnuts and fat raisins, and with plenty of really good plain yoghurt. Burlington has embraced its granola. But – and I write this as one who makes her own granola – there is a contradiction at the heart of the association of granola with progressive living: a lot of the time, it’s full of sugar. Unlike muesli, which is left raw, granola is usually baked with honey, maple syrup, or (sometimes and) sugar, as well as oil, and, occasionally, egg white. This is not necessarily the healthiest breakfast.

So why does granola signify healthy eating? This isn’t the only food to be linked to left wing politics. Paul Laity notes:

‘Socialism,’ George Orwell famously wrote in The Road to Wigan Pier (1936), draws towards it ‘with magnetic force every fruit-juice drinker, nudist, sandal-wearer, sex-maniac, Quaker, “Nature Cure” quack, pacifist and feminist in England.’ His tirade against such ‘cranks’ is memorably extended in other passages of the book to include ‘vegetarians with wilting beards,’ the ‘outer-suburban creeping Jesus’ eager to begin his yoga exercises, and ‘that dreary tribe of high-minded women and sandal-wearers and bearded fruit-juice drinkers…’

Orwell’s ‘cranks’—a term reclaimed by the London vegetarian restaurant in 1961—were the free-thinking and -living British Bohemians of the early twentieth century, who experimented with new forms of comfortable dress, sustainable eating, eastern religions, egalitarian social arrangements, and alternative sexual identities. This early counter culture was strongly influenced by late nineteenth-century dieticians and naturopaths—many of them based in Germany—who advocated raw, simple eating in contrast to the meat- and starch-heavy meals which characterised most middle-class diets. As Catherine Carstairs remarks in her essay ‘The Granola High: Eating Differently in the Late 1960s and 1970s,’ it was immigrants from central Europe who brought health food shops to North America, stocking vitamin supplements, wholewheat bread, and, inevitably, fruit juice. It was these shops that made widely available the foods eaten at more exclusive sanatoriums in Europe and the United States.

Like muesli and bircher muesli, granola was invented in a health spa. In her excellent and exhaustively detailed history of granola, Karen Hochman argues that Dr James Caleb Jackson—a farmer, journalist, and doctor—invented granula in 1863 for the patients at his spa, Our Home on the Hillside, in upstate New York. Relying heavily on Graham flour—invented by the dour evangelical preacher Sylvester Graham—he baked sheets of biscuits and crumbled them into granules to be soaked in milk and then eaten for breakfast. It’s likely that granula—the predecessor of Grape Nuts—would never have moved beyond the confines of Our Home on the Hillside had it not come to the attention of a rival sanatorium doctor and Seventh Day Adventist, John Harvey Kellogg, who used rolled, toasted oats instead of Graham flour biscuits. He renamed his product granola, and it became for a while a significant money earner for his Sanitarium Food Company (renamed Kellogg’s Food Company in 1908).

But enthusiasm for granola remained—largely—limited to the relatively small numbers of people who shopped in health food stores until the 1960s and 1970s. Then, concern about the effects of pesticides and additives on human, plant, and animal health; suspicion of the food industry; a desire to experiment with diets from elsewhere; and a back to the land movement all coincided to produce an interest in purer, healthier, more ‘natural’ foods. Hippies—another food counter culture—looked back and found granola. So did big food companies, as Hochman writes about the US:

The sweet, nut- and dried fruit-filled granola we eat today is derived from the granola reinvented in the 1960s and 1970s. Despite having been popularised by Quaker and General Mills—the enemies of the second food counter culture—granola retained its association with progressive, healthy living.

This cultural history of granola tells us three things, I think. Firstly, that the food counter culture has roots in alternative experiments in living stretching as far back as the late eighteenth century, when vegetarianism and lighter diets were picked up as markers of enlightened, rational eating. Secondly, that business has long taken advantage of the experiments done by people working and living on the fringes of respectability.

Finally, it also traces the shifting meanings of what we define as ‘healthy.’ Despite evidence presented to us by nutritionists, what we think of as being healthy food depends on a range of factors, including whether, historically, a product has been associated with health-conscious living.

A few months ago, I was interviewed on a radio station about changing attitudes towards food and eating. After a caller commented that when he’d lived in rural Limpopo, he’d happily eaten frogs, but preferred McDonald’s having moved to Johannesburg, I managed—somehow—to talk myself into an urgent appeal to the nation to eat insects. I’m still not entirely sure how this happened, but I think it was partly connected to the recent slew of articles on why we need to eat insects to save the planet.

This insect turn in culinary fashion is, of course, nothing new. In 1885, the entomologist Vincent M. Holt published Why not eat insects? To some extent, current arguments for eating insects deviate little from this slim manifesto. Holt remarks, rightly, that there is nothing inherently dirty about insects—in fact, crustaceans, being bottom feeders, are potentially more dangerous to eat—and that they can form part of a balanced diet. He suggests that Western aversion to eating them is linked strongly to culturally specific ideas about what is fine and not fine to eat. He cites the example of a Chinese banquet at an exhibition in London, pointing out that Britons happily sampled a menu which included cuttlefish, sea slugs, and birds’ nests because it was both exotic and, apparently, healthy. Past Europeans ate insects, and societies in Africa, Asia, and elsewhere happily, according to Holt, eat insects:

Beginning with the earliest times, one can produce examples of insect-eating at every period down to our own age. Speaking to the people of Israel, at Lev. xi. 22, Moses directly encourages them to eat clean-feeding insects: ‘These ye may eat, the locust after his kind, and the bald locust after his kind, and the beetle after his kind, and the grasshopper after his kind.’ …

Cooked in many and various ways, locusts are eaten in the Crimea, Arabia, Persia, Madagascar, Africa, and India. … From the time of Homer, the Cicadae formed the theme of every Greek poet, in regard to both tunefulness and delicate flavour. Aristotle tells us that the most polished of the Greeks enjoyed them… Cicadae are eaten at the present day by the American Indians and by the natives of Australia.

He appeals to his readers:

We pride ourselves upon our imitation of the Greeks and Romans in their arts; we treasure their dead languages: why not, then, take a useful hint from their tables? We imitate the savage nations in their use of numberless drugs, spices, and condiments: why not go a step further?

Contemporary interest in eating insects is, though, strongly connected to anxieties about a food chain which seems to be increasingly ecologically unsustainable. Current methods of producing enough protein for the world’s population come at the cost of animal welfare and good labour practice, consume vast quantities of water, and produce methane and other greenhouse gases. Something needs to change, and insect enthusiasts argue that crickets, grasshoppers, and caterpillars are a viable alternative to beef, chicken, and pork. In a 2013 report for the Food and Agriculture Organisation, Dutch entomologist Arnold van Huis—academic and author of The Insect Cookbook: Food for a Sustainable Planet (Arts and Traditions of the Table: Perspectives on Culinary History)—notes that more than 1,900 species of insects already form part of the diets of ‘at least two billion people.’ A lot of these insects are high in protein—higher, in some cases, than beef—and other nutrients. Many of them consume waste, and farming them is comparatively cheap and requires little labour.

While it is certainly true that we can and have chosen to eat foodstuffs once deemed to be dangerous or socially taboo—potatoes in eighteenth-century France, beef in Japan during the Meiji Restoration—these shifts in attitude take time to achieve. Also, in the case of potatoes and beef, these societies were strongly hierarchical with powerful aristocracies. Thankfully, most of us no longer live in a world where the king’s decision to consume a formerly shunned ingredient changes the way that all of us eat.

As every recent article on entomophagy notes, the main obstacle to the widespread incorporation of insects into, particularly but not exclusively, Western diets is a strong aversion to eating them. If only, the argument goes, picky Westerners would give up their hypocritical dislike of insects—they eat shrimp and prawns, after all—and then we’ll all be fine. But I think it’s worth taking this dislike seriously. As the journalist Dana Goodyear points out, a lot of these insects aren’t particularly delicious. She tries embryonic bee drones picked from honeycomb:

the drones, dripping in butter and lightly coated with honey from their cells, were fatty and a little bit sweet, and, like everything chitinous, left me with a disturbing aftertaste of dried shrimp.

I’ve eaten fried, salted grasshoppers at a food festival on London’s south bank, and they were crunchy and salty—improved, like most things, by deep frying—but otherwise memorable only for having been grasshoppers.

Making insects palatable involves processing, something which almost inevitably increases the ecological footprint of the product. Perhaps even more importantly, as the caller I referred to at the beginning of this post said, insects are widely associated with poverty and deprivation. Modernity—life in the city—requires a new diet. While it is true that in many societies, people do eat insects out of choice, it is equally significant that when they can, people stop eating insects as soon as possible.

Our current anxiety about sustainable sources of protein is driven partly by concern that the new middle classes in China and India will demand to eat as much beef, in particular, as their Western counterparts. I wonder to what extent this concern is part of a long tradition of Malthusian yellow peril: that China, in particular, will somehow eat up all the world’s resources. I don’t have any objection to promoting entomophagy—although trickle-down strategies have a fairly low level of success—but I think we should look more carefully at the reasons underpinning our interest in investing in alternative forms of protein, and also be careful not to dismiss the interests and tastes of people clawing their way out of poverty.

When I was finishing my PhD, my friend Jane gave me a t-shirt emblazoned with the slogan ‘tea is not a food group.’ She used to shout that into my room—we lived a few doors down from each other in the same student residence—as she passed me on the way to the lift. She had good reason for doing so. When I’m absorbed in writing, I can forget that the world exists: that it’s necessary to brush your hair, dress properly, cook, and not have conversations with yourself out loud. And that it’s unwise to subsist on tea.

Over the past couple of months, I’ve been in the final throes of completing a book manuscript and I’ve tried—probably not always successfully—to maintain at least a semblance of normal, civilised behaviour, but tea has remained a constant. It’s a kind of writing comfort blanket; a small routine in the middle of anxious typing. In some ways, then, it was a misfortune to be in the United States for much of this period. I could drink as much excellent coffee as I could cope with, but tea? Good strong, hot black tea? Until I discovered a branch of TeaHaus in Ann Arbor, not so much.

I know that I’m not the first to complain about the difficulty of finding a decent cup of black tea in the US, and, to some extent, this belief that Americans don’t understand hot tea is something of a misconception. Teavana, Argo, and TeaHaus all attest to an enthusiasm—an apparently growing enthusiasm—for well-made tea. I’ve never encountered so many different kinds of tea in supermarkets. (And, truly, Celestial Seasonings is the best name for a brand of tea.) But it is true, I think, that it’s hard to find really good black tea in the average café. While this is probably linked to the fact that most tea drunk in the US is iced tea, it’s also because tea in these establishments is made with hot—not boiling—water. This is crucial. Tea leaves need to steep in freshly boiled water.

Tea.

This aversion to boiling water can be traced back to a 1994 civil case: Liebeck vs McDonald’s Restaurants. Two years previously, Stella Liebeck, an elderly Albuquerque resident, had spilled a cup of scalding hot coffee over her lap. She sued McDonald’s, and was initially awarded $2.7 million in punitive damages. While for some, the case has become emblematic of the madness of a litigious legal system, the truth is considerably more complex. Not only had Liebeck suffered third degree burns—resulting in extensive reconstructive work and long stays in hospital—but she and her family only sued McDonald’s as a last resort. When their reasonable request that McDonald’s cover her medical bills was turned down, they decided to go to court. Moreover, in the end, Liebeck received considerably less than $2.7 million: the judge reduced that sum to $480,000, and she was awarded, eventually, between $400,000 and $600,000 in an out-of-court settlement with McDonald’s.

This was not, then, a frivolous lawsuit. But it was interpreted as such, and became one of the examples cited in efforts to reform tort law—the legislation which allows people to sue others in the case of injury to themselves or their property—in the US. As some lawyers argue, the Republican-led tort reform lobby isn’t really about reducing the number of lawsuits submitted by greedy people, but is, rather, an attempt to protect business from having to pay for its mistakes.

For tea drinkers, though, this misperception (fanned by tort reform campaigners) has resulted in tepid, unpleasant cups of tea. Concerned about similar lawsuits, restaurants now serve hot—rather than boiling—water. But perhaps there is a kind of poetic—or historical—logic to having to search high and low for decent tea in the US. The tipping of chests of tea into Boston’s harbour in 1773 was both an act of defiance against the Tea Act and a rejection of Britain’s right to tax the thirteen colonies. When patriots switched to coffee—indeed some refused even to eat the fish caught in or near the harbour on the grounds that they could have consumed some of the tea—it was in defiance of British rule. In the land of the free, shouldn’t tea be hard to come by? This association of coffee and freedom wasn’t new, even then. Coffee houses in eighteenth-century Britain and Europe were places where middle-class men could gather to talk and think. The work of the Enlightenment was done, to some extent, over cups of coffee. But coffee was produced on slave plantations, and coffee houses—and the freedoms discussed in them—were largely for white men. Coffee represented, then, the freedom of the few.

Like so many people recently, I’ve been thinking about the historical contexts which produced the principles on which liberal democracies are founded. Freedom of expression and of thought, freedom to gather, and freedom of religious belief are fundamental to the functioning of liberal democracies. Regardless of the fact that these principles originated during a period in which they applied mainly to white men—and regardless of the fact that they have not prevented injustices from being committed (sometimes in their name) in liberal democracies—they remain the best, albeit imperfect, protection of the greatest number of freedoms for the greatest number of people.

To suggest that they are somehow a western invention inapplicable to other parts of the world would be an enormous insult to Egypt’s cartoonists who continue to criticise successive oppressive governments despite risking potential imprisonment or worse; to Saudi Arabian blogger Raif Badawi, who received the first fifty of a thousand lashes last Friday, for writing in support of free expression; to the Kenyan MPs who last year so strongly opposed a new security bill which will dramatically curb journalists’ ability to report freely. Also, it would be a profound insult to the vast majority of Muslim people in France and elsewhere—members of a diverse and varied faith—who managed to cope with the fact that Charlie Hebdo and other publications ran cartoons which insulted or poked fun at Islam.

Whether you think that the cartoons in Charlie Hebdo were amusing or clever or blasphemous or racist is beside the point. Free speech and free expression were no more responsible for the killings in France last week than they were for the murder of more than two thousand people in Nigeria by Boko Haram. This isn’t to argue that we shouldn’t discuss—loudly, freely, rudely—how right or wrong it was to publish these cartoons in a society which many feel has strongly Islamophobic and racist elements—in the same way we should debate potentially misogynistic, anti-Semitic, racist, homophobic, or transphobic writing, art, or speech too. But to begin to suggest that there are times when we shouldn’t criticise and satirise is to suggest that there should be limits to what we may think and imagine.

I never expected to receive an email from the Wayne County Airport Police. I had been so disoriented by the unpleasantness of immigration, crossing from Canada to the United States, that I’d dropped my travel notebook in Detroit airport. I’d only discovered its absence when unpacking in Ann Arbor and, as with most deep, unhappy losses, had only begun to realise how much I missed my small, black Moleskine diary a day or two later. But it was found, and a policewoman emailed to ask if it was mine. It arrived in a FedEx box six times its size within the week.

The diary would mean very little to anyone, I think. It contains addresses and phone numbers; lists of places to visit, things to buy, books to read, what to pack. It also includes recipes and descriptions of food I’ve eaten in Australia, Europe, Canada, and the US. It was these that I was particularly sorry to lose. In Kingston—a few days before arriving in Michigan—I’d written down the recipe for apple pie made by Elva McGaughey, my friend Jane’s mother, and an encyclopaedia of information on the home cooking of Ontario families.

That it was apple pie was significant. A week previously, Jane and Jennifer and Jennifer’s small son Stephen and I had picked apples in Québec’s Eastern Townships. We drove from Montréal, through bright green, softly rolling countryside. The sky was low and it drizzled. At the orchard, as Stephen snored gently in his sling, we filled deep paper bags with McIntosh and Cortland apples.

Several people pointed out to me that the saying should be, really, ‘as Canadian as apple pie’ because—in their view—the best pie is made with Macs, a popular variety developed by John McIntosh, who discovered these tart, crunchy apples on his farm in Ontario in 1811. The Mac now constitutes 28% of the Canadian apple crop, and two thirds of all the apples grown in New England. It is—as I discovered—excellent for eating straight off the tree, and cooks down into a slightly sour, thick mush in pie.

Today, the Mac is one of only a handful of apples grown commercially. Industrialised food chains demand hardy, uniform, easily grown varieties which can withstand long periods of storage and transport without going off or developing bruises. Until comparatively recently, there were thousands of apple varieties to choose from. Writing about the United States, Rowan Jacobsen explains:

By the 1800s, America possessed more varieties of apples than any other country in the world, each adapted to the local climate and needs. Some came ripe in July, some in November. Some could last six months in the root cellar. Some were best for baking or sauce, and many were too tannic to eat fresh but made exceptional hard cider, the default buzz of agrarian America.

Nomenclature of the Apple: A Catalogue of the Known Varieties Referred to in American Publications from 1804 to 1904, by the pomologist WH Ragan, lists 17,000 apple names. I wonder if a small part of the enthusiasm fuelling the current rediscovery of old varieties—even neglected apple trees will continue bearing fruit for decades—is due to the multiple meanings we’ve attached to apples over many, many centuries. They feature prominently in classical and Norse mythology, where they are symbols of fertility, love, youth, and immortality, but also of discord. They are fruit with doubled meanings. The apple in fairytales represents both the victory of the evil stepmother and the beginning of our heroine’s salvation: her prince will kiss her out of the coma induced by the poisoned apple. In her novel The Biographer’s Tale, AS Byatt represents the two wives—one in England, the other Turkish—of the bigamist Victorian explorer Sir Elmer Bole with green and red apples. The fruit in the Garden of Eden—described since at least the first century CE as an apple—bestowed both knowledge and banishment.

If the name McIntosh seems oddly familiar, it may be because of a now-ubiquitous Californian brand: the Apple Macintosh, launched in 1984, took its first name from Steve Jobs—apparently then on a fruitarian diet—and its second from the Mac apple, a favourite of one of the company’s top engineers. It is appropriate that these sophisticated machines, which offer access to so much knowledge—licit, illicit, open, secret—should be named for apples.

When it was ready, she would put an apple in the little oven to bake. Before long, the grating of the burner door was outlined in a red flickering on the floor. And it seemed, to my weariness, that this image was enough for one day. It was always so at this hour; only the voice of my nursemaid disturbed the solemnity with which the winter morning used to give me up into the keeping of the things in my room. The shutters were not yet open as I slid aside the bolt of the oven door for the first time, to examine the apple cooking inside. Sometimes, its aroma would scarcely have changed. And then I would wait patiently until I thought I could detect the fine bubbly fragrance that came from a deeper and more secretive cell of the winter’s day than even the fragrance of the fir tree on Christmas eve. There lay the apple, the dark, warm fruit that—familiar and yet transformed, like a good friend back from a journey—now awaited me. It was the journey through the dark land of the oven’s heat, from which it had extracted the aromas of all the things the day held in store for me. So it was not surprising that, whenever I warmed my hands on its shining cheeks, I would always hesitate to bite in. I sensed that the fugitive knowledge conveyed in its smell could all too easily escape me on the way to my tongue. That knowledge which sometimes was so heartening that it stayed to comfort me on my trek to school.

The baked apple—Proust’s madeleine for twenty-first-century theorists—both opens up Benjamin’s memories of childhood during a period of acute homesickness and, for the child, contained the ‘fugitive knowledge’ of what lay ahead. It could fortify—sustain—him on the journey between the dark warmth of home and the noise and brightness of school.

Notebooks contain the same fugitive knowledge: they are both guides for future action, and repositories of information, memory, fact gathered over time and place. They travel in pockets and backpacks and book bags from Drawn and Quarterly, accruing meaning, emotional and intellectual. They belong to time present, as well as time future and past.

I spent most of October and November in the United States and Canada, coinciding with Canadian Thanksgiving, Hallowe’en, and, probably most importantly, pumpkin spice season. This blend of cinnamon, nutmeg, and cloves—the flavourings associated with pumpkin pie—has become comically ubiquitous in the US. Alongside pumpkin spice muffins, macaroons, and cupcakes, I saw pumpkin spice air freshener, rooibos tea, and beer. I tried pumpkin spice chips (inadvisable) and Icelandic yogurt (odd).

Too far, I think.

The pumpkin spice phenomenon originated in 2003, when Starbucks—then on the cusp of almost-global domination—debuted a new flavour for autumn. As reported last year to mark the drink’s tenth anniversary, the company was hesitant to introduce the pumpkin spice latte. It already sold several flavoured coffees, and was not entirely sure that another seasonal drink would take off. They needn’t have worried. Forbes reports that by 2013 Starbucks had sold more than 200 million pumpkin spice lattes:

If you just do the math, that means Starbucks has sold an average 20 million beverages a year whose flavoring once belonged primarily in a seasonal pie…

At the basic price of about $4 for a 12-ounce tall size, PSL means at least $80 million in revenue … for Starbucks, which serves it beginning in September. … The company says the PSL is by far the most popular seasonal beverage in its lineup.

In fact, the pumpkin spice latte was held responsible for a bounce in the chain’s revenues this year. Outrage and fear over pumpkin spice shortages, and the annual dash for the first pumpkin spice latte of the season, are canny marketing strategies which have helped to position the drink—the #PSL on Twitter—alongside Starbucks’s red cups as a marker of the beginning of autumn and the holiday season. Unsurprisingly, other chains and supermarkets have begun to produce their own versions of the PSL.

Small, independent coffee shops—the alternatives to corporate caffeine—have also developed ways of cashing in on the pumpkin spice craze. I had a pumpkin pie flavoured latte at New Moon—an excellent café in Burlington, Vermont—and a lumberjack latte at Babo in Ann Arbor, Michigan. They were sweet and spicy: less coffee than coffee-flavoured drinks.

I don’t think that the wild enthusiasm for pumpkin spice—as a flavouring—is particularly surprising. After all, in the US, Europe, and some other parts of the world, this combination of spices has long been a feature of winter or festive cooking and baking. A more interesting question is why Americans drink so much flavoured coffee. In the interests of research, I also tried vanilla, and brown sugar and sea salt flavoured coffees, and resolved never to waver from the true path of Americanos, flat whites, and the odd cappuccino. For all that new technologies and techniques—drip, siphon, cold brew—have gained wild popularity for making coffee which tastes, apparently, more acutely and complicatedly of coffee, the popularity of flavoured coffees continues unabated.

America remains the largest coffee market in the world, with a third of consumers drinking ‘gourmet’ (or specially prepared) brews every day. To some extent, the ubiquity of coffee today is linked to a major fall in the price of the commodity twenty years ago. In 1962, John F. Kennedy shepherded the International Coffee Agreement into existence. Including mainly Latin American countries—the producers of superior Arabica coffee beans—the ICA controlled the price of coffee globally and was also intended to stabilise these countries’ economies, immunising them against potential Soviet influence. The ICA favoured the US and Brazil, giving both countries veto rights on policy decisions.

The collapse of the ICA, along with the Berlin Wall, in 1989 was produced both by shifting Cold War politics and by the emergence of new coffee-producing countries—like Vietnam—which were not signatories to the Agreement. The fall in the price of coffee meant a coffee boom, particularly in the US, where enthusiasm for Arabica had grown steadily over the course of the 1980s. It is no coincidence that you may have tried your first cappuccino—in the US and elsewhere—in the early 1990s. The growth of Starbucks—founded as a small independent in Seattle in 1971—traced the demise of the ICA and the fall in the international coffee price.

It is now easier than ever to buy extraordinarily good coffee for relatively little money. I wonder if this could account for the amazing variety of coffee-based drinks available in the US. As a cheap beverage—an affordable luxury, as Sidney Mintz describes the consumption of sugar in the nineteenth century—has coffee become unmoored from its position as a bitter drink to be had in small quantities at defined moments of the day, turning into a sweet, comforting snack to be consumed at any time?

A few weeks ago, my friend Nafisa sent me a photograph of a banner outside a cafe in Linden in Johannesburg’s northern suburbs. In a particularly good demonstration of why punctuation helps to avoid horrific confusion, it advertises that it ‘now serves TIM NOAKES’—with ‘breakfasts and lunches’ in smaller script below.

In Linden, Johannesburg. Courtesy of Nafisa Essop Sheik.

Personally, I would prefer to eat neither Tim Noakes nor his high-fat, low-carbohydrate diet. This sign is interesting, though, because it still refers to a Noakes, rather than a Banting, diet. In the past couple of months, restaurants all over South Africa have added Banting-friendly meals to their menus, and I think it’s worth taking a closer look at Banting, his diet, and his context. William Banting (1796-1878) was a prominent undertaker and funeral director whose family had long been responsible for organising the Royal Family’s funerals. He and what became known as ‘Bantingism’ rose to prominence in 1863 with the publication of A Letter on Corpulence, Addressed to the Public. In it, he described how he shrank from obesity to a ‘normal’ weight as a result of a miraculous diet. The aptly named Michelle Mouton explains:

After many vain attempts to find a doctor with a cure for corpulence, and after futile experiments with Turkish baths and the like, it is ironically diminished sight and hearing that incidentally lead Banting to his miracle. His ear surgeon suspects a constriction of the ear canals, Banting reports, and advises him to abstain from what Banting terms ‘human beans’—‘bread, butter, milk, sugar, beer, and potatoes’—so called because they are as harmful to older persons as are beans to horses.

The diet was so efficacious that Banting lost forty-six pounds in a year, and reported feeling healthier than ever before. So what did he eat?

For breakfast, I take four or five ounces of beef, mutton, kidneys, broiled fish, bacon, or cold meat of any kind except pork; a large cup of tea (without milk or sugar), a little biscuit, or one ounce of dry toast. For dinner, five or six ounces of any fish except salmon, any meat except pork, any vegetable except potato, one ounce of dry toast, fruit out of a pudding, any kind of poultry or game, and two or three glasses of good claret, sherry, or Madeira—champagne, port and beer forbidden. For tea, two or three ounces of fruit, a rusk or two, and a cup of tea without milk or sugar. For supper, three or four ounces of meat or fish, similar to dinner, with a glass or two of claret. For nightcap, if required, a tumbler of grog—(gin, whisky, or brandy, without sugar)—or a glass or two of claret or sherry.

Noakes-ites will note that Banting included some carbohydrates in his diet, and seemed to shun pork (if not bacon) and salmon, possibly on the grounds that they were too fatty. His injunction against sugar is mildly ridiculous considering the amount of fortified alcohol he drank. No wonder he enjoyed the diet so much—it gave him licence to remain in a permanent state of gentle tipsiness.

Much of Bantingism’s popularity was linked to the fact that it emerged during a period when diets, perceptions of physical and moral beauty, and ideas about health were undergoing rapid change. The wild success of his pamphlet in Britain, the United States, and elsewhere caused intense debate within a medical profession which was increasingly linking weight—Banting’s corpulence—to health. Urban living and industrialised food production reduced the price of food and altered eating patterns. For the middle classes, for instance, meals were now eaten three times a day, with dinner moving to the evening. At the same time, thinness was increasingly associated both with physical beauty and moral behaviour. This diet seemed to offer an easy way to achieve both ideals. Self-denial would result in a more moral, thinner person. Mouton writes:

Toward the end of 1864, George Eliot wrote to a friend, ‘I have seen people much changed by the Banting system. Mr A. [Anthony] Trollope is thinner by means of it, and is otherwise the better for the self-denial,’ she adds.

The diet also offered the new middle classes a way of navigating new food choices, in much the same way that their embrace of evangelical Christianity assisted them in finding a place for themselves within Britain’s class system. As Joyce L. Huff observes, Banting chose to write his pamphlet as a tract. Similar to other confessions of earnest Christians who had come to the light of God’s grace, Banting’s Letter traces the journey of a humble man—a sinner in a fat body—to the light and clarity of a high protein diet. He had achieved full mastery of both his body and his soul.

Enthusiasm for the diet petered out fairly quickly, but Banting’s writing has been resuscitated more recently by pro-protein evangelicals like Robert Atkins, Gary Taubes, and Noakes. Thinking about Banting’s diet in historical context draws attention to a few exceptionally important points:

Firstly, anxieties about diet occur in the midst of major social change. I don’t think that it’s any accident that Noakes has found an audience among South Africa’s middle classes, whose numbers are growing, but who are also feeling the impact of the global recession. Diets—particularly strict diets—offer a sense of being in control, and of group belonging, in times of radical uncertainty.

Secondly, as a closer look at Banting’s day-to-day eating demonstrates, his diet and that advocated by Noakes are fairly different. In fact, I wonder if Banting lost weight simply because he was eating less food overall, rather than as a result of his switch to greater quantities of protein. Noakes cites Banting and other eighteenth- and nineteenth-century high-protein dieters to lend his writing greater validity. This is knowledge, he implies, that has been around for some time. All he’s done is to bring it to wider public attention. Yet it’s clear that what we define as high protein has changed over time. Noakes’s diet is a diet of the early twenty-first century.

Thirdly, as the short-lived initial enthusiasm for Bantingism suggests, this diet is no more successful than other diets at causing weight loss. Put another way, while eating a high-protein diet will cause initial, dramatic weight loss—partly through dehydration—those who follow diets which encourage greater exercise and generally lower calorie intake lose the same amount of weight over a longer period of time. This has been demonstrated by study after study. More worryingly, we have no idea what the long-term health implications of high-protein diets may be.

Connected to this, Noakes argues that it is largely industry—Big Food—which has been behind efforts to discredit high fat diets. Although Banting was ridiculed by some doctors during the 1860s, this was at a time when medical professionals jostled with quacks for recognition, and did not occupy the same position of authority that they have since the mid-twentieth century. Doctors could not band together to suppress this kind of information. Moreover, food companies were in their infancy. Clearly, people chose to relinquish the diet for a range of other reasons.

Finally, this—as Banting’s contemporaries pointed out—is a diet for the wealthy, and for a planet with unlimited resources. It is out of reach for the vast majority of people who are obese, most of whom are poor. We know that intensive livestock farming has a devastating impact on the environment. Addressing poverty and rethinking agriculture offer the best means of improving the health of the world’s population and of mitigating climate change. Not eating more animal protein.

Two years ago today, police opened fire on a group of striking mineworkers encamped on a koppie outside Marikana. Mainly rock drill operators, doing some of the most basic and difficult work on the mine, these men demanded that Lonmin – in whose platinum mine they worked – raise their salaries to about R12,500 per month, to match those of literate, better-skilled miners.

After weeks of sporadic violence on both sides – during which policemen, shop stewards, and workers were injured and killed – mine bosses urged the police to end the standoff. Jack Shenker writes:

It was the police who escalated the standoff at Marikana mountain, bringing in large numbers of reinforcements and live ammunition. Four mortuary vans were summoned before a single shot had been fired. Lonmin was liaising closely with state police, lending them the company’s own private security staff and helicopters, and ferrying in police units on corporate buses. Razor wire was rolled out by police around the outcrop to cut the miners off from Nkaneng settlement; pleas by strike leaders for a gap to be left open so that workers could depart peacefully to their homes were ignored.

Police opened fire as workers approached them. In the end, thirty-four were killed, seventeen of them at a nearby koppie where it appears that they were shot at close range. The Marikana massacre has been described as post-apartheid South Africa’s Sharpeville. As the inquiry into the events near the mine has revealed, police arrived not to keep order, but, rather, to end the strike through any means possible.

The poster for Rehad Desai’s documentary on the Marikana massacre, Miners Shot Down.

The killings were followed by a strike – the longest in South African history – which lasted until May. Of all the details to emerge in the coverage of life in the platinum belt, the one that seemed to encapsulate the desperation of striking miners and their families was in a 2006 report commissioned by Lonmin: researchers had discovered children suffering from kwashiorkor near the mine.

Although already identified in 1908, kwashiorkor was named by Dr Cicely Williams, a Colonial Medical Officer in the Gold Coast, during the 1930s. Tom Scott-Smith explains:

she noticed a recurring set of symptoms amongst children who were aged between one and four: oedema in the hands and feet, darkening and thickening of the skin followed by peeling, and a reddish tinge to the hair in the worst cases. There was a clear pattern in the incidence of this disease, since it occurred in children who had been weaned onto low-protein, starchy foods such as maize, after being displaced from the breast by a younger sibling. Williams’ description first appeared in print in 1933, and two years later she identified the condition by its name in the local language: kwashiorkor, the ‘disease of the deposed child’.

Williams diagnosed kwashiorkor as a form of inadequate nutrition – similar to pellagra, which is caused by a diet insufficient in vitamin B3 – related specifically to an intake of too little protein. She had noticed that newly weaned babies and young children – the ‘deposed’ children referred to by the word kwashiorkor – were particularly vulnerable to the condition, and surmised that longer breastfeeding, or a diet rich in the nutrients non-breastfed children lacked – protein especially – would eradicate kwashiorkor.

By the 1970s, though, doctors argued that this emphasis on protein supplements – which had driven the United Nations’ and other organisations’ efforts to address kwashiorkor – was incorrect. Kwashiorkor, they argued, was the product of undernutrition: of not consuming enough energy. Scott-Smith writes:

Evidence from the 1960s demonstrated that a less protein-rich, more balanced diet could cure kwashiorkor equally well, and by the 1970s a number of other causes for the disease were suggested – even today, the details of kwashiorkor are still not fully understood.

Had scientists paid closer attention to the name ‘kwashiorkor’, they might have come to this realisation sooner. It is a disease of poverty, in which adults are unable to provide weaned children with adequate nutrition. As a result, its solution is distressingly simple: better and more food.

If there is any indicator of the extent of poverty in the platinum belt, then it is the fact that children suffer from kwashiorkor. While Lonmin has ploughed some of its profits back into communities surrounding the mines – opening schools and running feeding schemes, for example – it remains the case that mineworkers and their families are still desperately poor.

Keith Breckenridge argues that the wealth generated by workers operating in exceptionally dangerous conditions is channelled largely to a small group of beneficiaries. He adds:

Under the current arrangements in the platinum belt there is almost no movement of resources from mining to the wider problem of maintaining the physical and emotional well-being of the general population working in the mines. Mine managers have retreated from maintaining order and health in the hostels, and they have ceded control over the key human resource questions – employment and housing – to union officials and their allies. Like foreign shareholders and local royalty owners, these union leaders, using their monopoly over jobs and housing, have tapped into the demand for employment to enrich themselves (often at the expense of the working and living conditions of union members). Local government – caught between the mines and the prerogatives of tribal authorities – has all but abandoned the project of regulating the living spaces around the mines.

Where once miners were corralled into the prison-like conditions of single-sex hostels, where their food, accommodation, and other expenses were covered by mining companies, now meagre housing allowances are meant to support these workers and their families in the otherwise badly provisioned and serviced towns and villages of the platinum belt. Salaries tend to go straight to paying interest on loans granted by micro-lenders charging exorbitant interest rates.

As the incidence of kwashiorkor reported to Lonmin suggests, these men were not earning enough to feed themselves and their children. Under cross-examination at the Farlam Commission of Inquiry into the Marikana massacre, Cyril Ramaphosa – current Deputy President and a Lonmin board member who had emailed the then-Police Minister demanding an end to the workers’ strike – remarked:

The responsibility has to be collective. As a nation, we should dip our heads and accept that we failed the people of Marikana, particularly the families, the workers, and those that died.

This week I attended the Johannesburg launch of Gabeba Baderoon’s Regarding Muslims: From Slavery to Post-Apartheid. In it, she traces the long history of the representation of Muslims in South Africa, arguing that this is crucial to understanding how ideas around race and sexuality, for instance, have changed over time in this country. Importantly, though, she also looks at how Muslims themselves have both responded to and challenged the ways in which they have been portrayed.

She devotes an excellent chapter to the meanings and uses of ‘Cape Malay’ cooking. This is a cuisine, Baderoon notes, that carries with it the memory of enslavement and violence – a memory which was erased, in particular, by the recipe books written by white authors about the cooking of the Cape’s Muslim population, most of whom are the descendants of slaves. Part of the purpose of books such as Renata Coetzee’s The South African Culinary Tradition (1977) was to use this cooking to demonstrate the existence of a particularly South African cuisine which was linked more strongly to Europe – albeit heavily influenced by southeast Asia – than Africa.

When Muslim women – both in the Cape and elsewhere – began to write their own books during the early 1960s, they acknowledged the ‘Africanness’ of their cooking. Their recipe books

meant that Muslim food would no longer be a realm presided over by white experts who drew from silent or apparently submissive black informants in their kitchens, and spoke on their behalf. The transformation of Muslim cooks from silent informants to spokespeople of tradition began to subvert the use of ‘Malay’ food to solidify a ‘general’ South African cuisine that marginalised Africans and centred a European-oriented whiteness.

Baderoon uses the example of the Hertzoggie to demonstrate how Malay food encodes a fraught, but also subversive, history.

Hertzoggies are small – delicious – cookies consisting of a layer of pastry, a blob of jam (usually apricot), and a dome of desiccated coconut. They’re named after JBM Hertzog, Prime Minister of South Africa between 1924 and 1939. Representing largely the interests of white Afrikaners, Hertzog oversaw legislation which further entrenched segregation. The landmark 1936 Native Trust and Land Act and Representation of Natives Act not only further restricted the land that Africans could hold, but also removed those Africans who qualified to vote from the voters’ roll in the Cape, and dashed any hopes of extending the franchise to blacks nationally.

According to Cass Abrahams – an authority on Cape cooking – it was in this context that the Hertzoggie was invented. Baderoon quotes her:

[Hertzog] made two promises … He said that he would give the women a vote, en hy sal die slawe dieselfde as die wittes maak – he will make the Malays equal to the whites. Achmat [Davids, the late linguist and historian] reckoned the Malays became terribly excited about this and they put this little short-crust pastry with apple jelly underneath and then had the egg white and coconut on top of it and baked it and called it a Hertzoggie in honour of General Hertzog. However, when he came into power he fulfilled one promise, he gave the vote to the women, but he didn’t make the slaves the same as the whites. So the Malays became very upset and they took that very same Hertzoggie and covered it with brown icing, you know, this runny brown icing and pink icing and they call it a twee-gevreetjie [hypocrite].

I had never heard this account of the origins of the Hertzoggie before, and it rings entirely true for me – particularly because of the widespread use of desiccated coconut in Cape Malay baking. It demonstrates particularly well Baderoon’s point about the use of food as a form of subversion by people otherwise socially, politically, and economically marginalised.

However, I think that it’s also worth thinking about the Hertzoggie in relation to other baking traditions. It’s a little difficult to keep these different strands of South African cooking apart. Hertzoggies appear in recipe books written by – and, presumably, for – white, middle-class women during the 1930s. And, often, they were placed alongside recipes for Jan Smutsies or Smuts-Koekies.

Jan Smuts – statesman, war general, philosopher – was Hertzog’s main political rival. Although the differences between Smuts and Hertzog’s politics, particularly as regards segregation, should not be overstated – after all, they formed the fusion government between 1934 and 1939 – they tended to represent opposing liberal and conservative impulses within South African politics during the 1920s and 1930s.

Smutsies are similar to Hertzoggies, but have a plain pastry lid, rather than coconut, covering the jam. Was the relative austerity of Smutsies a commentary on his asceticism? That said, other recipes imply that Smutsies and Hertzoggies are, in fact, exactly the same – only the name changes according to the political sympathies of the baker (or the eater).

The same recipe books which include Smutsies and Hertzoggies also refer to puddings and cakes named after other white, Afrikaans heroes: General de la Rey cake (after the hero of the South African War) and President Steyn cake (after the President of the Orange Free State during the same conflict). Both are cakes heavy with dried fruit and nuts, although Steyn’s is decorated with meringue.

I don’t write this to undermine Baderoon’s argument, but, rather, to note how entangled South Africa’s culinary traditions are. Also, I want to reinforce her point about the subversive potential of food: that a biscuit invented by poor, black, Muslim women first in support of, and then in criticism of, a political figure could be taken up and celebrated by precisely the people who voted for him.

I’m Sarah Emily – that’s me about to eat an enormous breakfast – and welcome to my blog. I’m a South African historian who’s specialised in histories of childhood, food, and medicine.

This is not a food blog, but, rather, a blog about food – and, more specifically, about food, eating, and cooking. The world has enough recipes for red velvet cake floating around the internet. Here, I’m taking a closer look at the complex relationships between eating and identity; between cooking and politics; and between food and power.