This is compounded by a drive for healthy eating in the UK that tends to emphasise a tropical diet. From public messaging about 5 A Day, which almost always includes pictures of bananas, to cookbooks such as The Happy Pear and I Quit Sugar, which rely heavily on coconut and avocados, there are strong messages that it is not possible to eat locally and ecologically grown food while also being happy and well nourished.

Is this true?

Not in my experience.

Is all British food bland and stodgy?

Well, we do grow the stereotypical staples, and they are really tasty by the way, but we also grow well over a hundred other foods, ranging from sweet juicy figs to exquisite mizuna salads and the best pumpkins I’ve ever tasted. We also grow grains (rye, spelt, oats) in small amounts and keep a dozen chickens who lay the most delicious, nourishing eggs I have ever eaten.

All this on one acre of otherwise windy, chalky hillside near the sea (we have great hedging that protects us), with four to six people working leisurely two afternoons per week, approximately the equivalent of one person’s normal working week.

Is eating local only viable for the middle classes?

While many others have been demonstrating the potential to eat well and locally in Britain – from the BBC’s Great British Revival programme (and cookbook) to those keen on foraging the free weeds on our doorsteps – it remains a fairly middle-class preoccupation. People on low incomes are less likely to be able to afford a veg box or buy organic, much less spend time volunteering on a community food project.

Lack of policy support is a major factor keeping local and organic foods out of reach for the wider public. Part of the problem could also be that policymakers, like consumers, are unaware of the potential for and benefits of growing nourishing food to be consumed locally. Farming policies and reports in Scotland (PDF), for example, tend to be based on the idea that animal grazing is the only viable way to farm in the hills.

However, some people – crofters and the Tap o’ Noth Farm, for example – do continue to grow arable crops, including not just oats and barley but also vegetables such as purple sprouting broccoli, and even tea, in addition to keeping sheep and cows.

Such initiatives tend to be dismissed as anomalies but what if we were curious about what the outliers could show us?

Treating food and agriculture as separate entities contributes to the problem

Another problem is that many policies treat agriculture separately from food. Farming policies aim to boost exports, support farmers and, more recently, reduce environmental degradation (whether they effectively do this is another matter). They do not consider what foods are being produced or who is consuming them.

Scotland is a bit ahead of the game in that it does have a National Food and Drink policy (in contrast to England). While it does include a focus on local production for local consumption, the emphasis is still strongly on exports and I cannot help but think that the two are contradictory. As a recent report by the Food Research Collaboration argues, we need to transform Agricultural Policy into Sustainable Food Policy.

Time, costs, and affordability perceptions

As for why people do not have the time and money to buy good quality local food, this seems to be a question that goes beyond the issue of farming to issues of housing, economy and lifestyles more broadly.

Value added at the farm represents only 5% of what UK consumers spend on UK food and drink, so increasing (or decreasing) costs there will not make a huge difference. Looking at mark-ups for processing and retailer margins would be more relevant. And with some of the highest housing costs in the world, it is questionable whether the lack of access to quality food in the UK stems from the cost of food itself or from the cost of housing and other basic needs.
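The back-of-envelope arithmetic behind this point can be sketched as follows (the 5% farm share is the figure above; the 20% cost change is an arbitrary illustration, not a figure from any source):

```python
def retail_price_change(farm_share: float, farm_cost_change: float) -> float:
    """Fraction by which the retail price changes when farm-level costs
    change, assuming all other margins in the chain stay fixed."""
    return farm_share * farm_cost_change

# Even a 20% rise in farm-level costs, at a 5% farm share of
# consumer spend, moves the retail price by only about 1%.
print(round(retail_price_change(0.05, 0.20), 4))  # 0.01
```

Which is why, as the paragraph argues, the processing and retail mark-ups are where the real leverage lies.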

Where do we draw the line(s) about what we consider to be local food?

So, if we were to consume more local foods, what would count as ‘local’? The answer is not as straightforward as you might think.

The Food Meters project, for example, indicates that the geographical radius of ‘local’ differs from area to area. This is further complicated by the fact that fewer food miles do not necessarily make foods more sustainable – organic tomatoes imported from Spain may have been grown and transported using less energy and producing less pollution than tomatoes grown in the UK in heated polytunnels or with high levels of fertilisers and pesticides. A third option, and the one that makes the most sense to me, would be to give up the idea of eating tomatoes all year round and simply enjoy them in the summer.

Who should be making decisions about our food systems and how?

According to principles of Food Sovereignty (PDF), decisions about how to improve our food systems should be centred on the realities of farmers (particularly small-scale and family farmers) and consumers themselves, rather than solely determined by the momentum of markets, the priorities of large businesses (including large-scale industrial farms) or the perceptions of high level government officials who are often removed from the realities of farming and eating on a budget.

But how can we manage our limits in understanding and our own biases?

One option is to use processes in which farmers and consumers come together to reflect and to hash it out, which we are using in our Pathways to Agroecological Food Systems project. This includes bringing together diverse groups of people to discuss and debate what an agroecological food system might look like at a local and national level. It entails building on people’s own knowledge while also presenting them with information that may challenge their perceptions.

One of the most important aspects is that the process includes a focus on the potentials (note the plural) for things to be different. It entails thinking both about the implications of our current food systems and about the type of food systems we want to create for future generations.

Last month members of BSUFN took part in an event which explored the Future of Food. The event was held at the Science Museum, London, as part of the 50th birthday celebrations of the Science Policy Research Unit (SPRU) at the University of Sussex.

Members of the public were given the opportunity to interact with, and ask questions of, SPRU researchers and experts on four topics concerning the future of food. Conversations were held around four questions:

How will Brexit impact UK farmers?

Could we be growing more food underground?

How can the UK government best respond to the rising obesity problem?

What impact does our food system have on climate change?

Some of these questions raised contentious issues among the members of the public who engaged in the debates. There were areas of agreement, while on other issues people disagreed about priorities and the future.

BSUFN would like to continue this debate. If you have any thoughts, opinions or ideas about the above questions please leave them in the comments box below and we will build the ongoing discussion around these comments.

You can read more about the SPRU anniversary event at the Science Museum and other celebratory events here.

In the 1930s Cecily Williams identified a condition of advanced malnutrition which she called kwashiorkor, or ‘disease of the deposed child’. She made a cautious suggestion that the disease was associated with the loss of protein when a mother weans a toddler abruptly on the arrival of a new baby.

Little attention was paid to this discovery until the 1949 FAO/WHO Committee became involved in nutrient deficiency diseases and announced kwashiorkor was ‘one of the most widespread nutritional disorders in tropical and subtropical areas’. The condition was treated with skimmed milk, so it was assumed that it must be caused by a deficiency of protein (Brock and Autret, 1952). Sathyamala (2016) is one of several with an interesting take on this:

‘This discovery of the cure coinciding with the availability of dry skimmed milk in the USA was a fortunate by-product of a domestic surplus-disposal problem. It was clearly more satisfactory in every respect to dump [skimmed milk] in developing countries than to have to bury it in the United States as was contemplated by the Department of Agriculture at one point’ (McLaren, 1974).

Thereafter the contentious term ‘the protein gap’, or ‘crisis’, was born. The idea took hold when various committees believed child protein requirements to be high in comparison with currently accepted values, but successive downward adjustments of the value over the years made it clear that children in areas where kwashiorkor was prevalent did not have a protein deficiency unless their overall energy intake was low (Briend, 2014). So although the symptoms of this disease are ‘persuasively consistent with protein deficiency rather than energy deficit, acute shortage of energy would, however, lead to use of protein as an energy source’ (Webb 2012, 283).

To her credit, Williams later wrote, ‘For the last 20 years I’ve been spending my time trying to debunk kwashiorkor’ (McLaren, 1974). Even today its aetiology and pathogenesis remain unclear, although textbooks authoritatively report it as a protein deficiency disease.

This apparent enormous ‘protein gap’ initiated a whole new field of research which also satisfied many commercial interests, specifically the mass production of protein-rich functional foods from sources such as fishmeal and microbes. In 1972 a ‘Protein Advisory Group’ was established to monitor this research. Sathyamala (2016) summarises the situation as follows:

‘…once the marketing of the surplus of skimmed milk in the USA had ceased to be a problem and given that the meat industry was unlikely to be able to play a role because of the levels of poverty in countries that were said to be afflicted with this condition, industry turned its attention to developing new, synthetic protein foods and exploiting them commercially… Despite heavy funding and promotion, with few exceptions most of the protein-rich foods never reached commercial viability, with some products costing four times more than the original they were said to replace (McLaren, 1974).’

One product which did succeed was Quorn but this is now marketed as a meat substitute for vegetarians in the West, rather than the high-protein food for needy children in developing countries (Webb, 2012).

Unfortunately, as McLaren noted (1974), ‘As a result (of the protein gap) measures to detect protein deficiency and treat and prevent it by dietary means have been pursued until the present time. The price that has had to be paid for these mistakes is only beginning to be realised.’ Newman (1995) suggests ‘the unwarranted attention to protein ended up by wasting a great deal of time, money and lives’, and more recently Webb (2012, 279) refers to ‘the huge costs, both financial and medical, of exaggerating human needs’. He suggests ‘much of this effort was wasted’ and ‘directed towards solving an illusory problem’. McLaren (1974) went so far as to entitle his historically important article describing the so-called protein gap ‘The great protein fiasco’.

Sathyamala (2016) explains the focus on kwashiorkor in the 1960s by describing it as ‘the construction of a pure protein deficiency disease’, which shows how ‘scientific discourses in nutrition are shaped by the needs of capital and capital determines scientific truth’. She describes how:

‘The entry of the pharmaceutical and food industries into functional food and supplements has created a new epistemic authority for truth claims whose strength lies in the ability to convince through propaganda with little pretence of a scientific base’.

The recommended dietary intake of protein has been progressively lowered: infant protein requirements decreased significantly, from approximately 40 g/day in 1943 to 13 g/day in 2005. In the 1970s a recalculation at the stroke of a pen unwittingly closed the ‘protein gap’ and shattered the theory of a pandemic of ‘protein malnutrition’.

The former director of India’s Institute of Nutrition (Gopalan, 2007) made the following comment in his evocatively titled article ‘Farms to Pharmacies: Beginnings of a Sad Decline’: ‘No arbitrary cocktail of synthetic nutrients’ – which he called a ‘blunderbuss pharmacy’ approach to undernutrition – ‘can substitute for a judicious combination of natural foods. What an undernourished or overnourished population requires is access to appropriate and adequate amounts of conventional, regular foods and not their allegedly superior functional products… a diet of cereals, pulses, legumes, fruits and vegetables can meet these micronutrient requirements’. Of course natural disasters, civil war and the like may deny this basic requirement to many, but supplements should not be the norm.

It appears that the ‘diverse, largely plant-based, diet of people in colonial territories was regarded as deficient in comparison to the flesh-based diet of the colonizers’ (Arnold, 1994), a notion which has fuelled the ‘protein gap’ and still influences us today.

My interactions with students reinforce my fear that this attitude cannot easily be dispelled; the notion is, metaphorically speaking, in our genes.

After 10 years of lecturing in biochemistry and nutrition, when discussing plant-based diets I am still asked by the majority of students, ‘But where do they get their protein?’ After 10 years of lecturing on proteins, I am met with stunned faces when I say that all foods (except gelatine) contain all the essential amino acids. After 10 years of lecturing, I still see exam answers saying we cannot survive without animal protein. And after 10 years of lecturing, I am asked by some young men whether 300 grams of protein a day will increase their muscle mass, despite the fact that only approximately 56 g/day is recommended.
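The 56 g/day figure follows from the widely used adult reference intake of roughly 0.8 g of protein per kilogram of body weight per day; a minimal sketch of that arithmetic (the 70 kg reference weight is an illustrative assumption, not a figure from this article):

```python
def protein_reference_g_per_day(body_weight_kg: float,
                                g_per_kg: float = 0.8) -> float:
    """Estimate daily protein reference intake from body weight,
    using the commonly cited ~0.8 g/kg/day adult reference value."""
    return body_weight_kg * g_per_kg

# For an illustrative 70 kg adult this gives roughly 56 g/day,
# a fraction of the 300 g/day some students ask about.
print(round(protein_reference_g_per_day(70)))  # 56
```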

The rat experiments unwittingly started the train, and perhaps kwashiorkor was the fuel remorselessly pumped in. It thundered along on tracks built by corporations and vested interests until it was finally derailed in the 1970s. The trouble is, some people are still sitting on that train, undeterred, oblivious that it may be heading for a collision, unaware or indifferent to the reality that our choice of food can no longer be a personal matter, and that the future of our grandchildren may depend on it. The N8 AgriFood founding director Sue Hartley said: “We cannot grow our way out of this problem; we have to try to change the way that we behave.”

It is a widely cited statistic that it takes ten kilograms of feed to produce one kilogram of beef, an overall loss of nine kilograms of food produce. Increasing populations and climate change, driven partly by our voracious appetite for animal produce (the so-called complete proteins), will have an explosive impact if they continue unabated. So how should we proceed? If we do nothing and global warming is as bad as predicted, our grandchildren will surely suffer the consequences. If, on the other hand, the deniers are correct, we may have wasted yet more time and money, but in that case at least the next generations will have a future.
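The feed-conversion arithmetic above can be sketched in a few lines (the 10:1 ratio is the widely cited approximation quoted in the paragraph, not a measured value):

```python
def feed_lost_kg(feed_per_kg_product: float, product_kg: float) -> float:
    """Kilograms of feed that never reach a plate as food when
    producing a given amount of animal product."""
    return feed_per_kg_product * product_kg - product_kg

# At a 10:1 feed-to-beef ratio, each kilogram of beef
# represents nine kilograms of food produce lost.
print(feed_lost_kg(10, 1))  # 9
```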

This article by Dr Caroline Hodges is shared in two parts; the second part will be posted on Thursday the 4th of August 2016. Caroline teaches nutrition at the University of Brighton and is a member of the BSUFN steering group.

The following description of what we should eat might surprise many people: “Households should select predominantly plant-based diets rich in a variety of vegetables and fruits, pulses or legumes, and minimally processed starchy staple foods.”

In my previous blog I presented the so-called ‘myths’ relating to dietary protein which I encounter in biochemistry and nutrition textbooks. In this one I would like to discuss two specific events which contributed to the perpetuation of these myths, and to consider their long-term negative impact on our health and the environment. First I will describe the early experiments on rats to determine optimum protein requirements; second, I will outline the discovery and repercussions of a disease called kwashiorkor.

Before I unravel the effect of these two events, I would like to explain what initiated my concerns about this subject. The following multiple-choice question is used in my nutrition exam (the correct answer is C, but A is the answer given by approximately 50% of students):

What is the primary source of protein for most of the world’s population?

A. Meat

B. Dairy

C. Grains and vegetables

D. Fruits

The number of students who get this wrong always surprises me, despite lectures which should lead them to the correct answer and me going so far as to announce that this, or a similar question, will be in the exam! Every year I go further; this year I told the students that approx. 50% of students might get the answer wrong, in the vain hope of alerting them to this error.

It is not absenteeism from lectures that causes the mistake, so I find it both disturbing and revealing to ponder the reason. It seems many students, even those at medical school, are so convinced of our need for animal protein that whatever else they read does not register. I still find comments in exams suggesting that we cannot survive without animal protein and that in its absence we become ill. It seems the perceptions of my students (and the public) have been shaped by decades of poor information on this subject.

It does not help that textbooks still describe proteins as ‘complete’ (animal sources) and ‘incomplete’ (plant sources), the insidious and seemingly pervasive implication being that we cannot survive without the ‘complete’ ones. After all, these textbooks can’t be wrong, can they? Most dictionaries define incomplete as ‘lacking a part’, but as applied to plant protein this is not so: every one of the essential amino acids is present, just in varying proportions in different plants. If we only ate one food item all day, every day, those proportions would be hugely important, but in most countries this is unlikely to be the case. So what has caused this misunderstanding?

Rat experiments

The first of the two subjects I would like to describe relates to animal experiments conducted over a century ago to determine the optimum protein requirement for humans, the legacy of which still prevails. The myth, described by Young et al (1994), is as follows:

‘Animal procedures can provide good indices of the human nutritional value of food proteins’.

In 1914 Osborne and Mendel studied the protein requirements of laboratory rats and demonstrated nutritional requirements for the individual amino acids of which proteins are made. At that time it was not known that rats have much greater protein requirements than humans (Rose 1948) because, by comparison, they have a much more rapid tissue growth. This difference in protein requirements is further demonstrated by the comparison of breast milk from both species; the protein content of rat breast milk is 10 times greater than the milk intended for human babies (Bell 1959; Reeds 2000).

The other damaging outcome of this animal-based work is the concept of complete and incomplete proteins, also referred to as first-class or superior (from animal sources) and second-class or inferior (from plant sources) proteins. These descriptions are based on the premise that animal products provide the ideal pattern of essential amino acids for humans, which is now known to be incorrect. These animal experiments and the subsequent definitions of protein quality are much less relevant in human nutrition, as our metabolic requirements are quite different. This is substantiated by the difficulty in demonstrating, in normal healthy adults, any difference in nitrogen balance (an indication of appropriate protein intake) between diets based on plant protein and those based on animal sources (Rand et al, 2003).

So misinterpretation of animal experiments (Ioannidis, 2012) and inappropriate extrapolation to humans have encouraged both inflated estimates of protein requirements, especially in children, and erroneous distinctions between the quality of plant and animal protein. The estimated protein needs of children are now half what they were in the 1940s, and it is becoming apparent that the much greater risk (in the West at least) is over-consumption of protein.

It seems that comparisons of ‘complete’ and ‘incomplete’ proteins are much more academic than practical and require rethinking, as described so well by Bender (2014, 255):

‘While protein quality is important when considering individual foods, it is not relevant when considering total diets, because different proteins are limited by different amino acids, and hence have a relative excess of others. The result of mixing different proteins in a diet is an unexpected increase in the nutritional value of the mixture… The average Western diet has a protein score of 0.73, whilst the poorest diets in developing countries, with a restricted range of foods and very little milk, meat, or fish, have a protein score of 0.6’. (The difference is minimal.)

Miller and Payne (1969) concluded that ‘almost all dietary staples contain sufficient protein to meet human needs and that even diets based on very low protein staples are unlikely to be specifically protein-deficient’. Webb (2012, 279) points out that since 1969 this view has become the nutritional consensus. Unfortunately, it seems this message has not permeated to the lay public.

The second event known to have had a huge influence on policy and recommendations led to what is called the ‘great protein fiasco’. Caroline’s article on the ‘great protein fiasco’ will be posted on Thursday the 4th of August 2016.

References

Arnold, D. (1994) ‘The Discovery of Malnutrition and Diet in Colonial India’, The Indian Economic and Social History Review, 31:1, 1–26

On the 16th of June 2016, the Brighton and Sussex Universities Food Network hosted its third annual symposium. The event was the biggest BSUFN has held to date, with attendance from students, staff, and faculty across Brighton and Sussex Universities, as well as wide representation from non-academic organisations, community groups, food producers, activist networks, and other academic institutions across the UK.

The programme reflected the diversity of contemporary food issues and the prevalence of food in practice, policy and research at present. Topics addressed during the day included obesity, food poverty, international food security, novel foods and edible insects, sustainable food systems, and food manufacturing.

Presentations and the poster display triggered much discussion during the day, both in the room and online via social media.

Issues of diversity, privilege and marginalisation, and representation within the food system and academia were addressed by the keynote speaker, Dr Tom Wakeford (Coventry University), and again during other presentations, including that of Beth Kamunge (University of Sheffield). This discussion focused in particular on the dominance of white middle-class voices within the UK food system and academia, and called for current structures to explicitly seek the involvement of people from ethnic minorities and other social classes. Perspectives of activists, researchers, and food producers were reflected in the discussion on this topic.

You can read more about Gilly Smith and Jo Rallings’ discussions surrounding the influence of TV chefs on the way we eat and Jamie Oliver’s Sugar Smart Campaign here.

The results of the poll of food issue priorities, as referred to by a number of speakers, can be found here.

Details of some of the arts and creative methods projects outlined in the poster display are available through BSUFN’s online resource on creative methods and food here.

The BSUFN annual symposium yesterday was a lively one and there was a particularly heated discussion during our parallel session on ‘Consumers, Identity and Culture’.

My media colleague Gilly Smith and Jo Ralling from the Jamie Oliver Foundation talked about TV chefs, the changes they might effect in wider food culture and the materials and structures which accompany those changes. For example, Gilly mentioned new restaurants and a foodie tourist trade in Hungary, in part the product of a Hungarian version of Jamie Oliver. Jo talked up the successes of Jamie’s food campaigns in the UK, including the sugar tax and changes to school dinners.
A couple of people in the room took issue with their account, accusing Jamie-style cheftivism of unforgivable smugness and asking why Oliver doesn’t raise the issue of food poverty more often.

Others worried that these chefs’ campaigns tend to shame people: that lifestyle TV formats of problem, expert advice and redemption are inherently judgemental, assume lack of information is the reason for poor eating, and don’t recognise the varied and individual circumstances in which people eat and cook. The same might be said, however, for public health campaigning the world over…

To her credit, Jo acknowledged that she and the team at the JO Foundation were aware of these issues, discussed them, and tried to produce programmes which took account of them. They wanted to avoid shaming at all costs, she said.

Gilly argued that people who make sweeping statements about foodie campaign TV tend not to have watched the half-hour programmes, only read the soundbites in news reports, which leave less room for nuance.

This discussion about lifestyle TV activism is a really important one, and we didn’t even scratch the surface on the day. It seems to me the question of form or format is key – Jo and Jamie and other media producers are bound by generic conventions like the quest or the transformation, and commissioners need to show they are moving with ‘the next big thing’ and won’t always stick around to follow things up (a point Gilly made). Soundbites may get read more, but long-form journalism and longer programmes have the potential to be entertaining, even a bit judgemental, and also so much more.

I think we should all be mindful that food debates of all kinds are mediated, and all are affected by the medium. Academic journal articles, conference presentations and Q&As are no exception. Our discussion was at least as inadequate on the day as any news report – but it was, I hope, an important opener to a much longer conversation.

Food insecurity and food poverty have come top of our poll on contemporary food issues, receiving 32% of the votes.

In the lead up to our annual symposium on the theme of contemporary food issues, we’ve been running a poll to establish which food issues our members and wider audience consider to be the top priority. We’re announcing the results of the poll at the start of the symposium on the 16th of June 2016.

Other priority issues include nutrition and diet, food waste, and food sovereignty.

Results of the online poll of priority areas for contemporary food issues

The full results of the poll are listed below. In the coming months we’re going to feature more posts and articles about the topics which have ranked as high priorities for BSUFN members, and we will incorporate these issues into future events and research.