Although I was outraged by my government’s apparent indifference to the plight of Somalis, I did begin to wonder if that money could be used more wisely. Of course, South Africa must – and can – contribute to the international effort to distribute food in Somalia. Given the scrutiny of aid agencies working in the region, as well as the awareness of how aid money has been channelled to elites over the past few decades, it’s likely that South Africa’s donation will go to those who need it. But giving money to alleviate the famine is a short-term fix.

Possibly because of the way it echoes Africa’s other best-known famine, the Live Aid-engendering Ethiopian famine of 1984-1985, the famine in the Horn of Africa has generated an enormous amount of coverage in the international press. More information and analysis can only ever be a good thing, but much of the discussion around the famine suggests that it’s a crisis which emerged suddenly and without any warning. As the Guardian’s John Vidal put it, ‘A massive drought, as if out of nowhere, has settled over the Horn of Africa’. Moreover, some commentators, like Vidal, have blamed the famine on only one or two factors, usually climate change or Western indifference to African suffering.

The causes of famines are complex, but they are never entirely unpredictable. Counterintuitively, they are not necessarily caused by a lack of food, but are, rather, the result of long-term systemic failure: in agriculture, trade, and, most importantly, in government. When I suggest that South Africa’s paltry million-rand donation would be better spent, my point is that South Africa’s involvement in the Somali crisis should go beyond giving money for food. South Africa needs to help stop famines from happening in the first place, and that is not impossible.

We managed largely to eradicate famine in the twentieth century. Before then, food shortages and famines were part of the rhythms of everyday life. In societies where food production was inefficient both in terms of labour and technology – and until the eighteenth century, eighty per cent of the population of Europe was engaged in agriculture – frequent crop failures meant that famine occurred often. But during the 1700s, an agricultural revolution allowed greater, more regular, and, crucially, more reliable yields to be produced by smaller numbers of people. International trade also meant that countries could buy food to supplement local shortfalls. For example, during the 1870s, the failure of the European grain crop boosted Canadian and American wheat exports, and these two countries fed Europe for almost a decade.

Although initially developed in the Netherlands and Britain (and there is a strong link between the development of capitalist economies and efficient food systems), the methods pioneered during this green revolution of the eighteenth and nineteenth centuries spread around the globe. By the early 1900s, famine was caused increasingly by people, rather than only by nature. The Great Famine in Ireland (1845-1852) was certainly the product of the potato blight, but it also occurred at a time when Ireland was an exporter of wheat: there was enough food to go around, it was just that those who were starving couldn’t afford to buy bread. The Cattle Killing Movement in South Africa (1856-1857), in which cattle were slaughtered and crops destroyed on a mass scale, caused widespread famine among the Xhosa. Around 40,000 people died of starvation, 33,000 moved away from the eastern Cape to seek work, and the authority of the Xhosa polity was fatally undermined.

Equally, some twentieth-century famines were caused partly by crop failure, but were also the product of bad governance and ineffective systems of food distribution. As Cormac Ó Gráda explains:

Wars, blockades, poor governance, and civil unrest can also lead to famines; panics about the food supply and poorly performing markets can exacerbate them. In such cases…factors other than crop shortfalls reduce the purchasing power or ‘entitlements’ of vulnerable sections of the population: the size of the loaf matters less than its distribution.

The Nobel Prize-winning economist Amartya Sen argued in Poverty and Famines (1981) that – contra Thomas Malthus, who suggested that exponential population growth would result inevitably in famine – famines can occur in times of peak food production. Why? I think it’s worth quoting Sen in full:

In every society that exists, the amount of food that a person or a family can command is governed by one set of rules or another, combined with the contingent circumstances in which that person or that family happens to be placed vis-à-vis those rules. For example, in a private ownership market economy, how much food a person can command will depend on (1) what he owns, and (2) what he can get in exchange for what he owns either through trade, or through production, or some combination of the two. Obviously, in such an economy a person may suddenly face starvation either because his ownership bundle collapses (e.g., through alienation of land to the money lenders), or because the ‘exchange entitlement’ of his ownership (i.e., the command of what he owns) collapses (e.g., through his becoming unemployed and not being able to sell his labour power, or through a decline in his terms of trade vis-à-vis food).

In other words, people starve when they can’t buy food – either because they no longer have the money to exchange for food (as a result of unemployment, for example) or because food prices become prohibitively high. Peaks in food prices could be due to droughts and other ecological factors, conflict, and speculation.

The crisis in Somalia demonstrates particularly well how state intervention can prevent or cause famine. In 1960, British Somaliland and Italian Somalia became the independent Republic of Somalia. Nine years later, Major-General Mohamed Siad Barre seized power in a bloodless coup and ruled Somalia through a military dictatorship until the collapse of his government in 1991. Somalia’s experience of food shortages and famine must be understood in this context of Barre’s government (or lack thereof) and economic policies. In 1970, he announced the implementation of ‘scientific socialism’, introduced strict central planning, and viciously stamped out all forms of opposition. Peter T. Leeson writes:

The government slaughtered civilians who posed threats to the government’s plans or political power, used coercive intimidation to create artificial support for its activities, and forcibly relocated others to further the political or economic ends of Barre and his cronies. ‘Both the urban population and nomads living in the countryside [were] subjected to summary killings, arbitrary arrest, detention in squalid conditions, torture, rape, crippling constraints on freedom of movement and expression and a pattern of psychological intimidation’. The state ruthlessly suppressed free speech and controlled all forms of information reaching Somalis. Newspapers (only one was officially permitted by the government), radio, and television were fully censored and dissent in any form squelched with force. Under Somalia’s National Security Law No. 54, ‘gossip’ became a capital offense. Twenty other basic civil freedoms involving speech, association and organisation also carried the death penalty.

Funds were diverted from public works, education, healthcare, and infrastructure to the military, on whose support and ability to terrify and brutalise the Somali population Barre depended. The nationalisation of land and industry in 1975 was, predictably, a disaster. The abandonment of socialism at the end of the 1970s in order to attract assistance from the International Monetary Fund made very little difference either. Somalia was heavily dependent on international food aid during the 1970s and 1980s. The Horn of Africa is prone to drought, but it’s worth noting that despite catastrophic droughts in the mid-1970s and mid-1980s, Somalia managed to avoid famine – unlike its war-torn neighbour, Ethiopia, whose government ignored the plight of its population.

Somalia’s last major famine was in 1992 and was not caused by drought. Nearly 300,000 innocent people starved to death because of sectarian politics. The epicentre of that famine was in Bay, one of the country’s most productive agricultural regions, and starvation was induced by warlords who used food as a weapon against farmers and pastoralists.

Barre’s government collapsed in 1991, plunging Somalia into civil war and a chaos from which it has yet to emerge. It’s telling that a country which had managed to avoid famine for over half a century, despite drought, food shortages, and incredible food insecurity, saw widespread famine only after food supplies were disrupted by war.

So why famine now? Over the course of sixteen years, Somalia has been the subject of fourteen reconciliation conferences, none of which managed to produce a stable government. In 2004, the Transitional Federal Government (TFG), an anti-Islamist, pro-Ethiopian political grouping, was put into power in Somalia under the leadership of Abdullahi Yusuf and with the support of the United Nations. However, the TFG was neither popular nor effective as a government. In the absence of effective leadership, a number of attempts were made by Islamic groups, war lords, civil society organisations, and others to create some sort of order in Somalia, and particularly in Mogadishu. One of these, the Alliance for the Restoration of Peace and Counter-Terrorism, was formed by a group of war lords in February 2006. They were backed by the United States, which saw them as allies against Islamic groups in the region.

Armed clashes between the Alliance and Islamist groups soon broke out and developed into a war which the Islamists won decisively. By the middle of 2006, they had taken control of Mogadishu as well as central and southern Somalia. Not only was this an embarrassment to the United States and its ally Ethiopia, but for the first time it seemed that Somalia was offered the possibility of a relatively popular and effective government in the hands of the Islamists, who quickly organised themselves into the Council of Islamic Courts (CIC). However, an invasion by Ethiopia at the end of 2006 caused the collapse of the CIC, the reinstatement of the almost entirely ineffective TFG, and the beginning of a new civil war between the Government and opposition groups. The most successful of these was Al-Shabab. Originally the CIC’s youth wing and affiliated with al-Qaeda, Al-Shabab is an Islamist group which now controls most of southern Somalia.

Years of political uncertainty, conflict, and chaos (best exemplified by the way piracy has flourished along the Somali coast) have left Somalis particularly vulnerable to drought and the less predictable effects of climate change. A combination of the US- and UN-backed blockade of the parts of Somalia controlled by Al-Shabab and this organisation’s unwillingness to allow the World Food Programme to deliver food to southern Somalis has caused the famine. Samatar explains:

Normally, societies have three lines of defence against mass starvation: local capacity, national government and the international community. When a disaster hits a region, the first help comes from local administrations and the communities themselves. If events overwhelm the first responders, then the national government takes charge of operations; and when the crisis exceeds the wherewithal of the nation, international actors come to the rescue.

It is clear that all three levels of livelihood protections have failed in Somalia. Al-Shabab has prohibited the local population from organising their municipal governments and charities to fend off the disaster. Similarly, Somalia’s national government, which is beholden to sectarian leadership and international patrons, has been oblivious to the emerging calamity, and has thwarted the international community from coming to its aid.

This was a famine which could have been avoided had order been established in Somalia. Here, Somali politicians and war lords are as much to blame as the international community, East Africa’s Intergovernmental Authority on Development, the UN, and, crucially in my view, the African Union. This famine is not the result solely of dastardly foreign countries plundering Africa, nor can blame be laid entirely on Somalis themselves. But after the effort to feed Somalis has ended, reconstruction needs to begin. And it’s here that South Africa must – and I think is obliged to – take a leading role.

Somalia also demonstrates the extent to which food security is linked to strong, functioning governments. Countries which are badly run, have weak economies, and, most importantly, are authoritarian, are the most prone to famine. Last year’s narrowly avoided famine in West Africa was due largely to the malfunctioning, undemocratic political dispensations of Niger and Chad. Only the spread of democratic and open government, with, crucially, a free flow of information, will prevent famines from happening in Africa. As Sen remarked, ‘There is, indeed, no such thing as an apolitical food problem.’

Note: I try to use sources which are easily available, but for this post I’ve relied on articles from academic journals. Unfortunately, these are securely behind paywalls. If you’d like copies of them, let me know.

Further Reading

Texts cited here:

Joyce Appleby, The Relentless Revolution: A History of Capitalism (New York and London: W.W. Norton, [2010] 2011).

I can’t wait to read this: Frank Dikötter’s Samuel Johnson Prize-winning new account of the 1958-1962 Great Famine in China.

Why industrial agriculture won’t feed the world – and why we need to stop industrial farms from denying us access to their operations.

The American Dietetic Association – an organisation providing supposedly objective and scientific advice on diet – has been accepting money from Coca-Cola and PepsiCo. Not good.

The amazing ‘jellymongers’ Bompas and Parr organised a Rabbit Cafe in Brighton to celebrate Easter. It’s partly in celebration of the hundredth anniversary of the opening of the first Futurist restaurant. The Cafe, though, isn’t the first homage to Futurism’s fascination with food – this is an account of one recreation of the Futurist aerobanquet.

I’ve had an explosively sneezy cold this week, but with bed rest and pain killers to help me to sleep, I’m almost well again. (Unfortunately, my Head of Department remains unconvinced by my theory that I’ve been suffering from a bad allergy to undergraduate lecturing.) I really don’t see the point of taking anti-cold medication. It certainly won’t get rid of the bug, and the only time I’ve ever taken tablets for a cold – just before a long flight home from Paris – I hallucinated so badly that I thought it best never to repeat the experience. Taking it easy, avoiding dehydration, and being generally sensible seem to work every time. I’ve also had a range of advice about what I should eat: vitamin C supplements, garlic, zinc, lemon, and ginger. I’ve managed to consume nearly all of these over the past few days (although not at the same time), and – who knows? – maybe they made a difference.

We know that our diet influences our health. We know that the better we eat, the stronger our immune systems are and the longer we’ll live. It’s for this reason that many seem to believe that it’s possible to eat ourselves well: that we can both prevent and cure illnesses by eating some things, and avoiding others. I was struck forcibly by the strength of this thinking when I saw that Gwyneth Paltrow wrote a recipe book partly because she believed that her father’s eating habits caused the cancer which killed him. No, I am not completely mad, and, yes, I do realise that, at best, Paltrow can be described as a ray of ‘demented sunshine’, but this is an enormously popular and influential woman who really does think that had her father eaten more brown rice, he wouldn’t have had cancer – or, at least, wouldn’t have died from it.

There’s a logic to this thinking: if we eat pure, wholesome food, then, surely, we should be healthy and strong. The problem is that it’s difficult to define what is ‘pure’, ‘wholesome’, and ‘good’ food. However much nutritionists may dress up their work as ‘science’, we don’t know precisely what diet is best for our health. In the past few weeks new studies have demonstrated that drinking eight glasses of water and eating five portions of fruit and vegetables per day…will have very little effect on us at all. Oh, and vitamin supplements and probiotics are of dubious value too. It’s certain that we should eat plenty of fruit and vegetables and lessen our intake of red meat and saturated fat, but everything else remains guesswork. That study about Omega 3 supplements and children’s brains? It was nonsense. As is the advice spouted by Patrick Holford. So, no, drinking green tea and eating mung beans and quinoa will not stave off cancer. (Sorry.) The amazing people at Information is Beautiful have provided a helpful visualisation of the relative benefits of dietary supplements.

Our ideas around healthy diets have changed over time, and are inflected by a range of factors, including current debates in science and medicine, the interests of industry and food lobbies, and religious belief. In his magnificent study Flesh in the Age of Reason: How the Enlightenment Transformed the Way We See Our Bodies and Souls (2003), Roy Porter traces a shift in thinking about health and eating during the mid-eighteenth century. He argues that during the early modern period, stoutness and eating heartily – if not in excess – were seen as signs of good health. In Britain, a taste for roast beef was also connected to support for an incipient national ‘English’ consciousness.

But from the 1750s onwards, physical beauty was associated more frequently with slimness. (Compare, for example, portraits by Rubens and Constable.) Enlightenment bodies needed also to be fed in restrained, rational ways. One of the most popular prophets of the new eating orthodoxy was the physician George Cheyne (1673-1743), who based his views on plain, wholesome eating on his own experience of being morbidly obese. In The English Malady (1733) he argued that ‘corpulence produced derangements of the digestive and nervous systems which impaired not only health but mental stability. … Excess of the flesh bred infirmities of the mind.’ Porter explains:

Cheyne’s call to medical moderation was, however, also an expression of a mystical Christian Platonism trained at the emancipation of the spirit – he can thus be thought of as recasting traditional Christian bodily anxieties into physiological and medical idioms. For Cheyne, the flesh was indeed the spirit’s prison house. Excessive flesh encumbered the spirit; burning it off emancipated it.

Following the teachings of the German mystic Jakob Boehme, he imagined prelapsarian bodies innocently feeding on ‘Paradisiacal Fruits’. After the Fall, the flesh of the newly carnivorous humans had been subjected to the laws of the corruption of matter. …his works aimed at recovering the purity of the prelapsarian body.

Cheyne recommended a vegetarian diet on the grounds that it most closely resembled that eaten in the Garden of Eden. It was, in other words, the diet of spiritual perfection. Much of the success of his writing was due also to the rise of a vegetarian movement in Europe during the eighteenth century. These Enlightenment vegetarians argued that it was cruel to slaughter animals merely for food, and also believed that ‘greens, milk, seeds and water would temper the appetite and produce a better disciplined individual.’

There has long been an association between corpulence and moral or spiritual laxity, and thinness with (self-) discipline. But what Cheyne advocated went further than this: he argued that rational individuals were partly responsible for their own ill-health because they could choose what they ate. Moreover, because he connected eating meat with sinfulness, deciding what to eat was also a moral choice.

Cheyne’s thinking proved to be remarkably durable. In the late nineteenth century, left-leaning social reformers promoted vegetarianism as the best example of ethical consumerism. Vegetarianism was healthy and it did not – they believed – cause the needless sacrifice of animals (although they didn’t address what happened to the bull calves and billy goats produced by lactating cows and nanny goats). In her magnificent biography of the immensely influential socialist writer Edward Carpenter (1844-1929), Sheila Rowbotham describes how Carpenter’s dictum of simple living took hold among the members of the Fellowship of the New Life, the forerunner of the Fabian Society. Carpenter argued for simple clothing, simple houses, and simple food:

Carpenter combined his evangelical call for a new lifestyle with an alternative moral economy. This recycled, self-sufficient praxis involved growing your own vegetables, keeping hens and using local not imported grain – American produce was forcing down British farmers’ prices.

But this met with some resistance. The physician and social reformer Havelock Ellis

protested against Carpenter’s advocacy of vegetarianism on the grounds that meat was a ‘stimulant’. Ellis wanted to know why meat? Why not potatoes? Was not all food a stimulant?

I’m with Ellis on this one.

The food counterculture of the 1960s embraced vegetarianism and an enthusiasm for ‘whole foods’ as a manifestation of a way of living ethically and sustainably. Last week I discussed Melissa Coleman’s memoir of her childhood on her parents’ homestead in rural Maine during the early seventies. Her father, Eliot Coleman, is dubbed the father of the American organic movement, and he fed his growing family mainly from the garden he soon established. They supplemented their diet with bought-in grains, seeds, honey, nut butters, and oils, but were strictly vegetarian. Their role models, Helen and Scott Nearing, were highly critical of immoral ‘flesh eaters’. Their book, Living the Good Life (1954), which became the homesteading Bible, argued that it was possible to feed a family on produce grown organically. Again, the choice of what to eat was a moral one. Eliot and Sue Coleman believed that their diet guaranteed their good health:

Papa often quoted Scott’s sayings, ‘Health insurance is served with every meal.’ As Papa saw it, good food was the secret to longevity and well-being that would save him from the early death of his father. The healthily aging Nearings were living proof that a simple diet was the key.

But, as Melissa Coleman notes, this was not a diet that suited everyone. The family suffered from a lack of Vitamin B, and at times they simply didn’t eat enough. It also didn’t prevent Eliot from developing hyperthyroidism.

His heart seemed to beat too quickly in his chest, and he had a cold he couldn’t kick, despite gallons of rose-hip and raspberry juice. … He tried to make sense of things in his mind. Health insurance, he believed, was on the table at every meal. In other words, the best way to deal with illness was to invest in prevention – eating a good diet that kept the body healthy. … He’d read up on vitamins and minerals, learning which foods were highest in A, B, C, D, and minerals like calcium, magnesium, and zinc. He drank rose-hip juice for vitamin C, ate garlic and Echinacea to build immunity, used peppermint and lemon balm tea to soothe the stomach, and used chamomile to calm the nerves, but perhaps all this wasn’t enough.

She concludes: ‘He never thought to question the vegetarian diet espoused by the Nearings.’

I don’t – obviously – want to suggest that vegetarianism is deadly. Rather, my point is that the choices we make about our diets are influenced as much – or even more – by a set of assumptions about morality, our responsibility for our health, and other beliefs as they are by information about the nutritional benefits of food. I am concerned by two aspects of this belief that we are somehow able to eat ourselves better. We need to acknowledge that what we eat will not prevent us from falling ill. Sickness is caused by many things, and although important, diet is not an overriding factor.

Eat food. Not too much. Mostly plants. That, more or less, is the short answer to the supposedly incredibly complicated and confusing question of what we humans should eat in order to be maximally healthy.

This won’t make terribly much money for nutritionists or the food industry, hence their interest in promoting things which, they suggest, will do miraculous things for our health. They almost certainly won’t. Unless you suffer from an ailment which needs to be treated with a special diet, deciding what to eat is not a complicated, mysterious process. No amount of goji berries will make you a healthier, happier, or better person.

Further Reading

Texts quoted here:

Melissa Coleman, This Life is In Your Hands: One Dream, Sixty Acres, and a Family Undone (New York: Harper, 2011).

Roy Porter, Flesh in the Age of Reason: How the Enlightenment Transformed the Way We See Our Bodies and Souls (London: Penguin [2003] 2004).

Other sources:

Warren Belasco, Meals to Come: A History of the Future of Food (Berkeley: University of California Press, 2006).

There’s recently been a gloriously self-important spat between (some) South African food bloggers and food writers. This is Mandy de Waal’s excellent article for the Mail and Guardian which started it, and this is the hilariously bonkers response from one blog.

Jay Rayner considers the latest research into the relationship between meat consumption and cancer.

The UN’s Food and Agriculture Organisation reports that the world wastes or loses 1.3 billion tons of food per year – that’s a third of the total supply.

Donald Paul for the Daily Maverick discusses South Africa’s food security.

How to make Cornish pasties. (And flapjacks – crunchies to South Africans.) And in praise of sandwiches.

I am not by nature a joiner. I became a member of the Green Party in the UK mainly to spite Phil Woolas after he made some more than usually daft comments about non-EU immigrants. That the Green Party did exceptionally well in the last general election and seems, to me, to offer the only credible way out of the global recession were pleasing perks of membership, but otherwise I didn’t appreciate being told to toe the party line on a few issues, homeopathy (that is, sugar pill-based quackery) being one of them. I suppose that I don’t particularly enjoy being told what to think. This is why I’m in academia which is, as a friend put it, the last refuge of the sociopathic.

It’s partly for this reason that I’m fascinated by groups of people who set out, purposefully, to create alternative communities away from mainstream society: people who base these experiments in new living on complex rules for behaviour and thought. It’s something I would never do, and I am curious as to why others find it so attractive. I wrote my MA thesis about the first boarding school for the daughters of the Cape Colony’s Dutch-Afrikaner middle classes in the nineteenth century. This institution was a secluded, strictly evangelical retreat from colonial society for the pupils who lived there, many of whom complained that they found it difficult to return to the habits and routines of normal family life. Mission stations run by societies like the Moravian Brotherhood were similar. There, at places like Genadendal and Elim, residents were required to adhere to strict rules regarding work, dress, and speech.

The best known of these retreats were Robert Owen’s utopian socialist communities in the United States during the 1820s. The first of these, New Harmony, lasted only a few years. But there have been hundreds of similar examples, most of them unsuccessful. It seems that nearly every generation of reformers has a fringe which believes that the best way to reform society is to leave it, and construct a new way of living on its margins. There are elements of this in the recent Dark Mountain Project founded by Paul Kingsnorth and Dougald Hine. They argue that the best way to prepare for a post-peak oil world and catastrophic climate change is to retreat, and to learn how to live sustainably and self-sufficiently away from society.

One of the most striking features of these experiments is the primacy they give to food. The cultivation of crops and, less frequently, the care of animals (and it’s interesting, although not surprising, how many alternative communities were vegetarian) were central to life in these societies. This importance was due not only to practical reasons – before the beginning of the twentieth century, at least, it would have been too expensive to buy in adequate food supplies in rural areas – but also to symbolic ones: ‘pure’ food produced by hardworking and hardthinking workers was bound to be better than that grown by exploited wage labourers.

The Colemans bought their land from the Nearings, and, using Living the Good Life as their Bible, set about living in accordance with the rules established by the Nearings. Eliot built their wooden cabin himself; they lived without electricity and running water; they cultivated most of their food themselves; and they bought as little as possible from the local shopkeepers. However idyllic this life may have appeared, it was precarious and dependent on backbreaking labour:

That my parents had chosen this lifestyle over an easier one wouldn’t matter in the moment when the goats had eaten the spring lettuce, there was nothing left in the root cellar, the drinking water was muddy with runoff, and there was no money under the couch for gas to get to town – not to mention that Jeep’s registration had expired, and we had no savings account, trust fund, or health insurance policy, no house in town to fall back on.

They soon realised that complete self-sufficiency was impossible. The Nearings, for all their status as homesteading gurus, bought in a range of luxuries, and the Colemans had to purchase oats and other grains, yeast, seeds, bacteria for making yogurt, and vitamin B supplements for their diet. And the absolute seclusion they enjoyed during their first year or two of homesteading – when Melissa was born – came to an abrupt end as a result of an article in the Washington Post by a sympathetic journalist, and Eliot’s ambitions to spread the gospel about organic gardening. He was already selling the surplus from their garden, and believed that organic methods offered an alternative to the new farming orthodoxy espoused by Nixon’s Secretary of Agriculture, Earl Butz, who advised farmers to plant maize ‘from fence row to fence row’.

Eliot’s increasing renown, his ever longer absences to study and lecture, as well as the numbers of enthusiastic students who came to work on the garden in the summer – often in the nude – put strain on the Colemans’ marriage. And it’s here that one of the main problems of these alternative communities becomes especially apparent. For all their desire not to replicate the power structures of mainstream society, they invariably do. Women continue to undertake the burden of domestic labour. Eliot Coleman worked unbelievably hard – to the extent that he developed hyperthyroidism as a result of stress and exhaustion – but, as a contemporary article on homesteading pointed out, he did the ‘fun’ bit: the growing. When he finished his work in the garden at night, he could rest. Sue, though, was responsible for keeping house and doing laundry without soap, detergent, or appliances. She ground their flour, made yogurt, sewed and mended their clothes, and bottled, canned, and preserved food to see them through the winter. She had three daughters under the age of seven to care for. Oh, and she ran their vegetable stall too. Her work – invisible and largely unappreciated – was unremitting.

Michelle Nijhuis suggests that one of the reasons why women find homesteading so difficult is the absence of labour-saving devices like washing machines and vacuum cleaners: without them, otherwise easy chores become difficult, time-consuming, and very, very boring. But I’m not so sure about this argument. (Although who am I to disagree? I wouldn’t touch homesteading with a bargepole and she’s a paid-up member of the movement.)

Much of This Life is in Your Hands reminds me of John Matteson’s Pulitzer Prize-winning double biography of Louisa May and Bronson Alcott, Eden’s Outcasts (2007). Louisa May is now best remembered as the author of Little Women (1868), but during the mid-nineteenth century her father, Bronson, was a well-known and controversial educationalist and philosopher with strong links to the Transcendentalists. He also experimented with living away from society and, like the Colemans, his and his family’s time at Fruitlands, a commune in Massachusetts, ended in disaster. Established in 1843, Fruitlands lasted slightly more than a year, and Bronson was largely responsible for this: he and his small group of acolytes planted the fields too late, ran out of money, and constructed a set of rules which actively hindered work on the farm. All animal products and labour were banned, and the members spent as much time raising funds and discussing whether or not they should drink coffee, as they did actually working the land.

Indeed, most of the work was done by Abba Alcott, Bronson’s long-suffering wife. She cared for their four young daughters, cooked, sewed, cleaned, chopped wood, and washed laundry. This was not an unusual lot for a woman in nineteenth-century America, but it was made worse by their poverty, and by a wilful refusal to use ‘luxuries’ – warm clothes, hot water, a greater range of foodstuffs – which would have made the work easier or, at least, more interesting. And, of course, the point of the commune was that it was meant to be wholly egalitarian. In the end, Abba did the same work – in possibly worse conditions – as women living in nearby Concord.

Towards the end of her memoir, Melissa Coleman describes her mother’s mental breakdown after the drowning of Melissa’s little sister, Heidi. But she makes it clear that this was the trigger for something which had long been developing:

Just that morning the gardens were bustling as usual with apprentices and customers and vegetables needing to be picked. It was a humid-hot day, a Saturday near the end of July. Baby Clara was strapped to Mama’s back in Heidi’s old sling, sleeping mouth-open as Mama cooked lunch, skin glowing and tan from summer. Skates was coming to visit, and Mama needed time to clean the house, to hide from her mother-in-law the chaos her life had become: Bess and Papa having breakfast together that morning, mud tracked in from the gardens, piles of laundry to be washed by hand, Heidi and I running around the small kitchen pulling each other’s hair and screaming.

In a recent post for Grist, Tom Laskawy makes the point that the longer hours worked by Americans – and I think that this is true elsewhere as well – have been sustained by the greater availability of cheap food – food which is not necessarily nutritionally sound, nor ethically produced. At the same time, appliances and a greater variety of affordable food in supermarkets have facilitated women’s entry into the workplace in greater numbers. We know, nonetheless, that this is part of a food system which is entirely unsustainable.

So what do we do? I certainly don’t want a retreat into homesteading. I suggest that we take another look at the ways in which we work: both at home and outside it. There is a significant body of work which suggests that a reduction in the number of hours we work would be good not only for our and the planet’s wellbeing, but also for the economy. If we had more time to cook and to grow our own food – within reason – we would have the beginnings of a more stable food system. Importantly, most of the labour performed in the home is still done by women and, clearly, men need to share more of it. The burden of ensuring a shift to more sustainable lifestyles cannot be women’s responsibility alone.

In this week’s Mail and Guardian, Mandy de Waal describes a spat between established food journalists and South Africa’s increasing ranks of food bloggers. This tension between professional food writers and amateurs who write simply for pleasure isn’t particular to South Africa: in the United States, Pete Wells was notoriously rude about food bloggers, and Giles Coren of the Times referred to them as ‘pale, flabby’ and ‘wankerish’ (although, presumably, he didn’t include his food blogging wife in this description). In a sense, it was inevitable that the same argument would boil over in South Africa.

The focus of De Waal’s piece is on charges of unprofessionalism levelled at the new media, as well as on the amusing pettiness of food bloggery in this country – particularly in the Cape, where most of the country’s top restaurants are situated. I’d like to pick up on one point that she makes in passing. She quotes JP Rossouw, author of the eponymous restaurant guides:

Yes, [food blogs] are playful and fun, but the mistake we make is to attach too much importance to what essentially is candyfloss.

Exactly. One of the most comical features of many food bloggers is their incredible self-importance. (De Waal mentions one author who refused to accept food served to her at a special lunch at Reuben’s Restaurant because it came on a large platter and not in individual portions. Good grief.) They dress up recipes and accounts of meals as moments of incredible profundity – moments which demonstrate the authors’ connectedness with the local restaurant in-crowd, insider knowledge of international culinary trends, and superior ability to understand the ‘real’ significance of food. These blogs are, in other words, manifestations of food snobbery.

It’s little wonder, then, that so many food bloggers describe themselves as ‘foodies’. This term has travelled a long way since it was coined by Ann Barr and Paul Levy in the early 1980s. Now it’s usually taken to mean a love of and enthusiasm for eating, cooking, and finding interesting new ingredients. It’s shorter, and sounds less pretentious, than ‘food lover’. I think that many food bloggers use the term in this sense. But it was originally meant to describe a kind of food snobbery. Stephen Bayley explains:

We have, in the past, had epicures, gourmets and gastronomes, but today’s foodie is rather different. A foodie is someone whose interest in comestibles is not only ardent, but also exquisitely self-conscious. Foodies treat their asparagus kettles not as mere utensils, but as badges of honour in the nagging battle for self-identity.

The same goes for ingredients: while once we had only Sarson’s non-brewed condiment to put on our chips, the foodie store cupboard now contains vinegars with genealogies, rare and costly vintages of balsamic, fruit vinegars made with herbs, herb vinegars infused with fruit to put on their pommes allumettes. A defining foodie product is verjuice, or what used to be known as filthy cooking wine. And foodies explore more than the palate: they hunt and collect restaurants, too.

Barr and Levy’s The Official Foodie Handbook (1984) walked an uneasy line between being a spoof of a new middle-class trend (and this could only be a fashion followed by those wealthy enough to buy the exotic and expensive ingredients and meals demanded by foodie-ism) and a guide to it. Angela Carter commented that it was best to understand the Foodie Handbook within the context of the other Handbooks published by Harpers & Queen:

the original appears to be The Official Preppy Handbook, published in the USA in the early days of the first Reagan Presidency. This slim volume was a light-hearted check-list of the attributes of the North American upper middle class, so light-hearted it gave the impression it did not have a heart at all. The entire tone was most carefully judged: a mixture of contempt for and condescension towards the objects of its scrutiny, a tone which contrived to reassure the socially aspiring that emulation of their betters was a game that might legitimately be played hard just because it could not be taken seriously, so that snobbery involved no moral compromise.

The book was an ill-disguised celebration of the snobbery it affected to mock and, under its thinly ironic surface, was nothing more nor less than an etiquette manual for a class newly emergent under Reaganomics. It instructed the nouveaux riches in the habits and manners of the vieux riches so that they could pass undetected amongst them. It sold like hot cakes.

In other words, the books began a process which has recently been completed: satirising while simultaneously approving, even celebrating, snobbery.

I realise that many food bloggers don’t know the etymology of ‘foodie’ and don’t mean to use it in its original sense. And I have nothing at all against what most food bloggers do: sharing recipes, advice, and useful information about food and cooking. They do what cooks and food enthusiasts have been doing for hundreds of years. A greater flow and availability of information about food can only be a good thing.

But I do object very, very strongly to the foodies – in Barr and Levy’s terms – of the internet who use their blogs and, occasionally, presence on social media to write about food, and good food, as the exclusive preserve of those who have the knowledge, sensitivity, and right social connections truly to appreciate what is worth eating. A few years ago, the BBC aired a fantastic comedy series called Posh Nosh. Starring Arabella Weir and Richard E. Grant as a social-climbing (her) and upper middle-class (him) pair of foodies, the series lampooned the deep, moral seriousness of foodie-ism. As Grant’s character says in the first episode (and I urge you to watch the series – most of it seems to be available on YouTube), ‘Food is beauty. And beauty is food’.

One of the useful things about foodies is that it takes very little to show how ridiculous they and their pretentions are.

Food, then, is like cars, furniture, or clothes: it’s another way of signifying people’s class status and social positioning. Of course, we’ve used food to do this for hundreds of years. But the difference with foodie-ism is that it attaches a moral value to eating in a particular way. Foodies confuse this snobbery with doing and being good. For foodies, knowing about and eating good food is a badge not only of class status and social and cultural sophistication, but also of moral superiority. Roasting organic purple-sprouting broccoli and then drizzling it with an estate-origin extra virgin olive oil signifies the foodie’s commitment to being green and supporting small, artisan producers. This dish is a manifestation of why that foodie is a Good Person.

It’s for this reason that I object so strongly to foodie-ism. Not only does it mystify cooking and eating, and elevate them to experiences that can only be appreciated properly by appropriately trained foodies, but it judges those who – for whatever reason – eat less well than foodies themselves. Foodies, then, don’t really care that much about food and eating.

The subtitle of the Official Foodie Handbook is ‘Be Modern – Worship Food’. By elevating – or fetishising – food to the level of something which needs to be worshipped, foodies no longer think of food as food – as nourishment which we all need to consume – but as something which is simply an indicator of status and value. They don’t aim to inform and educate about food, nor do they work towards making good, wholesome food more widely available. They simply congratulate themselves on eating well. In a time when food prices are sky-high – and look likely to remain that way for the foreseeable future – and the planet’s diet is looking worse than ever, it strikes me that to ignore these crises while claiming to be interested in food and to enjoy eating is deeply hypocritical.

I’ve spent the past fortnight in New York – mainly for a conference at Columbia – and on my last morning had breakfast at a restaurant which could only have been in New York, and, more specifically, in Morningside Heights. The Hungarian Pastry Shop is a shabby, comfortable cafe much adored by local residents and Columbia’s students and academics. It serves a range of unbelievably good cakes and pastries, the menu for which is an ancient and faded handwritten banner above the counter. Mothers with small children munch apple strudel alongside workmen in overalls, lecturers with textbooks, and small old ladies with thick foreign accents.

The Hungarian Pastry Shop in Morningside Heights, New York

Founded by immigrants, this could only be called The Hungarian Pastry Shop outside of Hungary. Over the years, it’s been tweaked to satisfy the demands of now elderly mittel-European customers, a group of whom was sitting in the sunshine when I arrived, as well as the undergraduates who spend long hours reading over its big mugs of strong coffee. The Shop has a menu in German and table service, as well as an exterior decorated with murals, a graffiti-covered loo, and posters advertising digs, extra tuition, and auditions for student productions.

Breakfast at the Hungarian Pastry Shop

Over a cherry danish, orange juice, and iced coffee, I considered a comment made by my friend Ester a few weeks ago when we had lunch at a new cafe which has recently opened in Cape Town. Skinny Legs and All (yes, as in the novel by Tom Robbins) in Loop Street serves ‘real food, unadulterated, and unadorned’. We had homemade lemonade, soup, and excellent coffee.

As we were admiring the cafe’s interior, Ester noted perceptively that we could have been anywhere – that we could have found this restaurant and eaten similar food, underpinned by the same values and ideas about cooking, in any other city with a demand for sophisticated good food, be it Melbourne, San Francisco, or London. I think that this is a point worth exploring.

The menu at the Hungarian Pastry Shop

In New York I had coffee and lunch in cafes which I could have described in precisely the same terms. At Bubby’s in Brooklyn’s Dumbo, Tablespoon in the Flatiron District, and the City Bakery off Fifth Avenue I could have been anywhere. Of course, all of these restaurants say a great deal about New York, its gentrification and the role of food and restaurants in this process. The City Bakery was founded in 1990, at a time when the slow regeneration of Manhattan was nearing completion and when enthusiasm for artisan bread (best exemplified by the craze for sourdough in San Francisco) was beginning to peak. Bubby’s and Tablespoon – both of which emphasise the extent to which they source seasonal ingredients locally – ride on the City Bakery’s success. In a similar way, Skinny Legs and All is an indicator of the success of Cape Town’s central city improvement district, and also of the very, very slow emergence of a food-focussed South African green movement.

To note this similarity isn’t a criticism – it’s simply to point out that these cafes are local manifestations of a global phenomenon. But not all aspects of globalised eating are seen in such positive terms. Since the 1980s at least, there has been a heightened concern that globalisation is causing diets to become homogenised: that the international popularity of fast food chains, supremely McDonald’s, signals the end of discrete, local food cultures.

The apparent ubiquity of the golden arches seemed to indicate a kind of culinary ‘end of history’: as liberal democracy appeared to triumph with the collapse of the Soviet Union, so did the eating habits of the West. The opening of a branch of McDonald’s in Red Square in Moscow in 1990 was the final nail in communism’s coffin. I remember clearly going to eat at one of the first McDonald’s to open in South Africa after the end of the international business boycott. Eating there was as much an affirmation of South Africa’s re-entry into the world as was the country’s participation in the 1992 summer Olympics.

I think it’s fair to say, though, that McDonald’s no longer means these things – which isn’t to suggest that it’s not doing well. A recent article in the Economist predicts that McDonald’s and other budget chains, like Aldi, are set to profit from a world in recession. However much revelations about the chain’s profoundly unhealthy products and poor environmental and labour practices have dented its apparent invincibility, it is still seen as part of a broader, international Westernisation of diets. This was confirmed, apparently, by Oxfam’s recent report on the global food crisis, Growing a Better Future, which claims that pasta is the world’s favourite food.

The City Bakery, off Fifth Avenue

But is this anything new? And is it possible for all of us, truly, to eat the same diet? As I wrote a few weeks ago, the survey on which Oxfam bases its report on favourite foods seems pretty dubious to me. It’s also worth noting that the success of global brands depends on their ability to ‘localise’ their products. McDonald’s has diversified its menu to appeal to local tastes, with a greater number of vegetarian options in Indian branches, smaller portions in Japan, rice products in Singapore and Taiwan, kebabs in Israel, and pita bread in Greece. In other words, the success of McDonald’s lies not in the imposition of a foreign brand, but in its ability to make its products at once familiar and enticingly exotic.

Restaurants on the upper end of the scale use precisely the same strategy. Writing about the opening of a branch of Les Halles in Tokyo, Anthony Bourdain describes how he adapted his French bistro cuisine to suit Japanese tastes:

I…scale[d] down the portions and [prettied] up the presentations. …I rearranged plates to resemble smaller versions of what we were doing in New York: going more vertical, applying some new garnishes, and then observing customer reactions. I looked for and found ways to get more colour contrast on the plates, moved the salads off to separate receptacles, stuck sprigs of herb here and there.

Then there was local taste. Some ingredients simply didn’t sell. If he brought in pigeons, he told me, they would lie in the fridge for a week, neglected by the customers, until, in desperation, he would turn them into a terrine. ‘And then I would eat the terrine.’ He also found himself serving a lot of meat well done.

On a domestic scale, the middle classes have eaten strikingly similar things all over the world since at least the nineteenth century. The movement of people within the British Empire caused the same dishes and menus to be served up on at least four different continents. When Abbie Ferguson and Anna Bliss arrived at the Cape from Connecticut in 1873 to establish an elite girls’ school, they were pleased – and relieved – to find that their middle-class Dutch-Afrikaner hosts ate the same meals, and in the same way, as they had done in the United States. Bliss wrote to her mother:

thus far I have seen quite as well regulated families & as much attention paid to ‘propriety’ as in America. … Wherever I have taken a meal there has been a servant in the room to wait on table or one has come at the tap of the bell, & all done so quietly & orderly.

The circulation of recipe books and advice on cookery in newspapers and in private correspondence around the Empire demonstrates the extent to which these diets remained fairly similar. They were, as today, inflected by local tastes and produce. In the Cape, the American teachers commented on the colonial habit of eating ‘yellow rice’ (rice cooked with turmeric and raisins and flavoured with cinnamon and bay) with every meal – something introduced by slaves from southeast Asia during the seventeenth and eighteenth centuries.

The City Bakery, New York

In other words, the diets of the wealthy have tended to be fairly globalised since international travel was made easier, and more common, from around the beginning of the nineteenth century. With the invention of the jet engine in the mid-twentieth century and, latterly, the internet, these trends have moved around the world more quickly and we’re also considerably more aware of them. It’s the poor – those whose diets we have an unfortunate tendency to romanticise – who have historically tended to eat a fairly limited range of things.

The difference now is that there are far more middle-class people wanting to eat similar diets. Oxfam also notes that the newly affluent Indian and Chinese middle classes consume more meat and dairy products than ever before. Exactly the same trend occurred in Europe during the 1950s and 1960s, but this was a shift on a far smaller scale and in a world where food systems were not as globalised as they are today.

How to find the City Bakery

I think that it’s misleading to suggest that diets are becoming progressively more Western. Rather, particular ingredients – meat and dairy above all – are increasingly popular in societies which, traditionally, have tended to eat more fish, vegetables, and other starches. Our planet simply can’t sustain meat- and dairy-heavy diets. Refocusing our attention on responding to the demand for these foodstuffs would be considerably more effective than simply bemoaning the Westernisation and homogenisation of global diets. That argument not only draws an impossible distinction between ‘bad’ global and ‘good’ local diets, but also ignores a long history of global culinary exchange which has been mediated by local tastes and preferences.


I’m Sarah Emily – that’s me about to eat an enormous breakfast – and welcome to my blog. I’m a South African historian who’s specialised in histories of childhood, food, and medicine.

This is not a food blog, but, rather, a blog about food – and, more specifically, about food, eating, and cooking. The world has enough recipes for red velvet cake floating around the internet. Here, I’m taking a closer look at the complex relationships between eating and identity; between cooking and politics; and between food and power.