Cooking, comedy, cartoons and codswallop

Tag Archives: Guardian

This week’s episode of “No Shit, Sherlock” is brought to you by the consumer group Which?, whose analysis of the worthy-looking sandwiches and pasta salads we buy for our lunch indicates that they can be as unhealthy as so-called junk food.

Today’s Guardian, in its analysis of the analysis, notes that “Caffè Nero’s brie and bacon panini was highlighted as having more calories (624) than a McDonald’s quarter-pounder with cheese (518)”.

All that surprises me about this is that we’re expected to find it surprising in the first place. Break down the two sandwiches into their constituent parts – bread, red meat and cheese – and there’s no meaningful difference. So what would possess anyone to imagine that the brie and bacon panini – or panino, for all you Italian grammar pedants out there – would be somehow healthier than a burger comprising much the same stuff?

Other findings reported in the Guardian article are similarly illuminating. Mayonnaise – a mixture of a little egg and vinegar and an awful lot of oil – is quite high in fat. Adding a packet of crisps and a bottle of Coke to your lunch will increase its calorie count.

The Which? report, and the fact that it appears to qualify as news, say little we didn’t already know about packaged food, but a great deal more about our relationship with brands. To be fair to Caffè Nero, I’m not sure they’ve ever advertised the offending panini as some kind of healthy option. But if you perceive McDonald’s as representing the very worst of everything, it’s only logical that you’d imagine anything with an M&S or Pret a Manger logo to be less bad for your physical and moral well-being.

The only problem with this is that it’s patent nonsense, and always has been. If we accept the vices and virtues of capitalism in all its other aspects, why would food be the exception? Everyone in the business of selling food, be it McDonald’s, Caffè Nero or your local kebab shop, wants you to buy their wares. Within the scope of the brands they’ve defined for themselves, they’ll try to sell you what you’re most likely to buy. And they don’t much care if you get fat, develop type 2 diabetes or die; unless, of course, their brands become widely associated with those things, which might affect their sales.

We overconsume certain types of food – meat, salt, fat, refined sugars and starches – because we like the way they taste. So it’s hardly surprising that our convenience foods, from the Big Mac to the mayo-laden chicken pasta salad, are packed with these food types. Any food vendor could change the composition of its products tomorrow to include less of any or all of them. But we’d enjoy them less, so we’d be less inclined to buy them; and so it would be commercial suicide.

Of course, the holy grail of food marketers is to convince us that we’re eating better when in fact we’re doing nothing of the sort. And if we really are willing to believe that one red meat and cheese sandwich is fundamentally healthier than another, it looks like they’re on to a winner. But if we pause for even a moment to assess what we’re about to eat, we can work out that the difference exists purely in our imagination.

Which? executive director Richard Lloyd has stated that he wants “all manufacturers to adopt traffic-light nutrition labelling […] so consumers can see exactly what products contain.” And who knows, maybe that would have some impact on what we buy; but I have my doubts. What I can say with rather more certainty is that the big vendors would do as they’ve done with all previous legislation to give us improved information about the food we buy: first, they’d resist it, then they’d circumvent it. (For a longer and more considered view on this particular topic, see this piece I published last year.)

Ultimately, if we genuinely do want to eat better – and I’m yet to be convinced we really do – the solution will be found not on food labels but through our senses, thoughts and actions. You don’t really need to be told that Subway bread is chock-full of sugar; you can taste it. (If you walk within a few yards of one of their outlets, you can even smell it.) Nor should you need a Which? report or Guardian article to tell you that most processed food contains a lot of not-so-good stuff; buying it will always represent an act of blind faith, even if the labels end up 90% covered in nutritional information.

As was always the case, the best way to be confident in what you’re eating is to buy unprocessed (or less processed) food and prepare it yourself. Your lunchtime pasta salad might contain almost anything; an orange, labelled or otherwise, is still an orange.

How much importance you place on all this is, of course, entirely up to you. Few of us have the time or inclination to eat the “right” thing all the time. And I’m certainly not averse to the odd double cheeseburger. But if you’re as surprised by today’s news as Which? seem to expect you to be, it might be time for a little more lateral thinking and a little less brand loyalty.

As you surely guessed from that last deliberately clunky sentence, I’m referring to the frankly splendid news that Sweden has added the gender-neutral personal pronoun “hen” to its dictionary.

The Guardian reports that “the pronoun is used to refer to a person without revealing their gender – either because it is unknown, because the person is transgender, or the speaker or writer deems the gender to be superfluous information”.

Far from being “political correctness gone mad”, this is grammatical correctness gone sane. For writers, the lack of such a pronoun in English is a right pain in the non-gender-specific arse. When we want to refer to an unspecified individual, all our options are unsatisfactory. We can attribute a gender to the person; we can find a longwinded way of making clear the gender neutrality (as in the “he/she” example above, or the even more irritating “(s)he”); or we can opt for the gender-neutral but grammatically incorrect “they”.

In spoken English, almost all of us choose the third option, largely because “he/she” sounds even dafter than it looks. Many of us do so in writing as well; but while I don’t think I’m the most ardent grammar pedant around, I can’t quite bring myself to write something I know to be wrong. I make more than enough grammatical errors by accident without throwing in deliberate ones as well.

Some writers choose to alternate between male and female for their hypothetical examples. Daniel Kahneman does this to good effect in his 2011 masterpiece, Thinking, Fast and Slow. But this only really works in longer pieces of writing, where the reader has time to get used to the convention. Use an arbitrary “he” or “she” in isolation, and the reader is unnecessarily distracted by the question of why the author has chosen that particular gender for that particular example.

In many languages, the issue is less acute. In French, every noun is gendered, but possessive adjectives agree with the thing possessed rather than its owner: “sa plume” can mean “his pen” or “her pen”, although the pen itself remains resolutely feminine. Additionally, French-speakers have the option of using the impersonal “on” in their hypotheses, whereas the English equivalent – “one” – has the unfortunate effect of making one sound as if one is talking out of one’s pompous posterior.

So we end up simply muddling through, finding all sorts of elaborate ways to remain readable without being wrong. We make our examples plural so we can use the gender-neutral “they”/”their” without fear of correction, or we reconstruct entire sentences so that the personal pronoun doesn’t feature. Basically, we go to disproportionate lengths to replace one short word that our language has evolved not to have.

As many have noted – including Gary Nunn in this Guardian piece, which itself draws on Dennis Baron’s geekily glorious The Web of Language blog – this issue has been rumbling on for 150 years and more, championed by people far more eminent and zealous than me. New pronouns such as ze, ip and thon have been proposed, discussed and roundly ignored.

But I can’t quite bring myself to accept Baron’s conclusion that “after more than 100 attempts to coin a gender-neutral pronoun over the course of more than 150 years, thon and its competitors will remain what they always have been, the words that failed.” After all, if a century of failure were considered reason enough to abandon a cause, Brighton and Hove Albion fans like me wouldn’t exist.

So I’m proposing a modern solution to an age-old problem: online petitions. After all, if the campaign for the BBC to reinstate Jeremy Clarkson after his unfortunate producer-thumping misdemeanour can attract over a million signatures, just imagine how many people would support a petition to transform our language forever.

Of these, twelve included cheaper meats in addition to lamb; seven contained no lamb whatsoever; and five had been so heavily processed that the scientists were unable to identify what meat(s) they contained. In other words, we can decipher the human genome, but not the doner kebab.

Granted, none of this is very pleasant. It’s undoubtedly fraudulent. Depending on what the mystery meat turns out to be, it could be stomach-turning. (The Mail, with characteristic restraint, took the discovery of the “UNIDENTIFIED” meat as its cue to ask, “Is there rat in your kebab?”)

But is it surprising? Hardly.

As long as there are humans, there will be fraud. And while we tend to think of fraud in purely financial terms, it occurs in relation to any tradeable commodity, from fine art to fishcakes. As such, every government devotes resources to combating it.

But not every form of fraud is pursued with equal vigour, or with equal success. Tax fraud might cost the UK 16 times as much as benefit fraud, but you’d hardly know it from reading the papers or listening to Government ministers. And food fraud, no matter that it has existed for as long as food trading, barely enters our consciousness until a particular scandal happens to capture the attention of the media and the public.

The inevitable consequence of such a scandal is that we get terribly worked up about one particular symptom, with no great attention paid to the others (let alone their causes). Such was the backlash to last year’s revelations, it’s unlikely we’ll find much horse in our burgers for the foreseeable future. But that’s no indication that the fraudsters have gone straight. And as public investment in food safety analysis continues to be cut, we’d be impossibly naïve to imagine that the situation is destined to improve.

But then, we are impossibly naïve; and what’s more, it suits us to remain that way, because our chief expectations around food are fundamentally incompatible.

We expect to take pleasure from the food we eat; but at the same time, we expect to pay as little as possible for the privilege. And achieving both of those things at once depends to a great extent on the questions we choose not to ask, and the truths we don’t much care to think about.

When we eat a £1.99 portion of fried chicken and chips, it doesn’t suit us to consider the conditions in which the birds lived, or the drugs with which they were pumped in order to survive them. Deep down, we might have a fair idea of the grim reality; but at the moment of consumption, the truth would impinge unacceptably on the pleasure.

So it is with our “lamb” curry. We choose to believe it’s full of good-quality diced lamb, of the kind we see in our local butcher’s or supermarket – though when we compare the price of fresh lamb with that of the curry, it’s hard to see where the takeaway is making its money. But anyway, the meat looks like lamb – though admittedly it’s taken on the colour of its surroundings – and even though its flavour is overwhelmed by the curry spices, we’re still reasonably certain it tastes like lamb. But as a few of the Guardian’s writers recently discovered, our palates and expectations frequently conspire to hoodwink us. And so the opportunities for substitution and dishonesty are far greater than we’d care to admit.

What allows us consistently to enjoy the cheap food we eat is the same thing that lets us engage with a far-fetched book or film: our capacity to suspend disbelief. Our empathy with the heroic space warrior, taking down alien after alien with his trusty plasma gun, is contingent on our suppressing what we know perfectly well: that the entire scenario is utterly implausible.

This suspension of disbelief doesn’t only apply to the ethics of the fried chicken meal or the authenticity of the lamb curry; it applies to every aspect of food processing, from farm, to factory, to retailer, to restaurant. Every time our food passes through a different pair of hands, another opportunity arises for fraud, adulteration or corner-cutting to take place. And at every stage, we take great pains not to think about it.

Even when the “fraud” is entirely inconsequential, we choose to pretend it doesn’t happen, then become indignant when confronted with the reality. Anything that compromises our fantasy of what goes on in a professional kitchen – for instance, the common and harmless practice of food being prepared off-site and heated up on the premises – is sufficient to provoke our largely directionless outrage.

This is ludicrous. Face it: everyone involved in your food’s journey from origin to plate needs to make a living, legitimately or otherwise. And the more steps this journey involves, the more people need to get paid. Even if the final product seems remarkably cheap, they all have to make their money somehow. So the only way you’ll make actual rather than perceived savings, while reassuring yourself that nothing untoward has gone on, is by taking the various jobs on yourself.

The trouble is, not many of us are in a position to grow our own vegetables, catch our own fish, raise our own animals or even cook all of our own meals. So what are the alternatives?

At one extreme, we have the option of saying “sod it”. Buy whatever’s cheap and tasty, and simply accept the fact that we could be eating almost anything. To a large extent, this is what most of us do already; but when we do, we should at least have the balls to admit it, and not get affronted when we find out that our 3am kebab might contain something other than prime lamb.

At the other end of the spectrum, we can go all out to establish the provenance of our food – if we have the time and money. The simplest way to do this is to go organic: not particularly because of any inherent superiority, but because organic producers are required to submit to a regime of scrutiny, testing and animal welfare that goes way beyond any Government-imposed standards. (Tacitly, we appear to believe organic food to be more trustworthy, judging by the way we feed our children: organic produce accounts for only around 2% of overall UK food and drink sales, but the figure for baby food is a startling 54%.)

Between these two extremes, of course, there’s a substantial middle ground: it isn’t a straight choice between doing everything or nothing.

The more we cook for ourselves using fresh, unprocessed (or, at least, less processed) ingredients, the less risk we run of eating something unsafe or unexpected. The broader the range of foods we learn to cook, the better placed we are to eat with the seasons, enjoying ingredients when they’re plentiful and inexpensive (ironically enough, lamb is cheap as chips at the moment…).

Many of us, myself included, have a tendency to lose interest in writers’ and artists’ work, and to doubt its worth or significance, as their success and profile rise.

I’m far happier in a cellar bar among an audience of 50 people, sharing the secret of a brilliant but undiscovered band, than in the arena or football ground they might eventually graduate to playing. Once they do, I tend to leave the adoring masses to watch their gigs – often from 100 yards away, while nursing a £5 pint of piss – and return to my preferred dingy haunts to discover someone new.

It’s an attitude that’s both healthy and unhealthy at once. It’s healthy because it allows me to discover new and potentially exciting artists as a matter of course, and so to maintain a state of near-constant renewal and curiosity.

But it’s unhealthy, not to mention illogical, because there’s absolutely no good reason to turn your back on someone whose work you’ve long enjoyed and admired, simply because other people happen to have noticed it too. That’s nothing more than cultural snobbery, which is hardly an attractive trait. And it’s hard to escape the uneasy feeling that such changes in perspective have relatively little to do with the quality of the artist’s work, and plenty to do with your own tacit resentment of that person’s success.

In the modern world of blogging, social media and instant celebrity, there’s more scope for scorning the successful than ever before. A blog post that happens to go viral, usually by virtue of a well-placed retweet or two, can be the catalyst for turning a virtual unknown into a ubiquitous media personality. And when success really does arrive overnight for a select few, but never at all for the vast majority, it’s little wonder that some of the latter group should come to resent the former.

All of which brings me to one Jack Monroe.

It’s possible, though unlikely, that you haven’t yet heard of Jack Monroe. In case you haven’t, she’s a food writer who rose rapidly to prominence through the frugal, meticulously costed recipes on her blog, A Girl Called Jack. Now, she’s a Guardian columnist, Labour Party activist and anti-poverty campaigner, with two books and (surely) a TV series on the horizon.

Other than Nigella, I doubt there’s a food writer or broadcaster who has occupied more column inches over the past six months or so – thanks in large part to the Daily Mail and its perennial rent-a-prat columnist Richard Littlejohn, whose ill-researched character assassination (I refuse to link to it, but search it out if you like) was promptly and brilliantly dissected by Monroe herself.

In Monroe’s case, the majority of the backlash has come not from her blogging and tweeting peers but from the reactionary right within the “old media”: those who genuinely seem to believe that anyone who’s ever called on the social security system is, by definition, a workshy scrounger. Given that I believe that view to be complete bollocks, it’s probably unsurprising that I’m inclined to take her side on most things. But there’s a further reason too: I’m convinced she’s doing something that’s not just new but genuinely important.

Now, on the face of it, I don’t cook like Jack Monroe. As regular readers of this blog will know, I believe in shopping without lists and cooking without recipes. So her approach – recipe-driven, measured to the gram and costed to the penny – might appear to be the antithesis of my own.

But in reality, it isn’t; because without a cogent, costed demonstration that home-cooked can be cheaper as well as better than mass-produced frozen filth, the single biggest justification for my way of cooking is removed. All the other reasons for home cooking – recipe or none – are, to me, secondary to the fact that it makes financial sense. With that point proven, cooking for yourself can be seen as a logical lifestyle choice. Without it, a sceptic could reasonably argue that it’s little more than an indulgent hobby.

By breaking down all her recipes into exact unit costs – based, it should be said, on a single supermarket (Sainsbury’s) that she began using, like most of us, purely for reasons of proximity and convenience – Monroe gets us thinking about one of the more important but ambiguous words in the English language: “value”.

Take a look at her latest piece for the Guardian, published online yesterday. It chronicles her attempts to create home-made versions of cheap ready meals for a lower price. In all but one case, she succeeds; but that’s only part of the point.

When we think of value, our first association is likely to be the one I’ve made already: monetary cost. But “cheaper” doesn’t necessarily equate to “better value”. If you were looking to debunk the Jack Monroe rationale – and God knows, enough people have tried – you might observe that it fails to take account of the value of people’s time. After all, no-one could reasonably claim that it’s quicker to make a lasagne from scratch than to microwave a shop-bought one.

But this is where other, less obvious notions of value come in.

Monroe’s lasagne, as well as being cheaper than the bought version, uses free-range meat – whereas most ready-made ones contain unspecified, untraceable mince that, for all we know, could well be somewhat horsey. So if you place value upon the provenance of what you eat, the DIY approach enables you to feed your family with a relatively clear conscience, while still undercutting the frozen food giants. (Try to find a ready-made lasagne that uses free-range meat, and the cost differential will come into far clearer focus.)

And, as the dietitian Sasha Watkins points out in the same Guardian article, going home-made gives you far more control over what goes – and what doesn’t go – into the meal. Excess fat can be poured away and excess salt omitted. So, if you have your eye on another kind of value – nutritional – there’s a further reason to cook for yourself.

The only reason I haven’t cooked any of Monroe’s recipes – though I’ve certainly taken several ideas from them – is the same reason I rarely follow anyone’s recipe to the letter: because it takes the fun out of both shopping and cooking. And one of the things I value about creative cooking is that it’s stimulating and fun; whereas following a recipe, however good it might be, feels like a chore. When I cook, the satisfaction of a job well done – or, more specifically, a set of instructions well followed – isn’t quite enough. I want to be able to say, “I invented that.”

Nonetheless, as I make my way round the supermarket, my thought processes are, I suspect, very similar to Monroe’s own, rooted in the time-honoured principles of home economics.

In other words, if an ingredient is on the cheap, I’m much more likely to buy it. I might fancy some cashew nuts on my stir-fry; but if I can’t find the “value” brand at one-third the price of the regular one, I won’t bother – or, more likely, I’ll seek out an alternative way of delivering the desired crunch. (Fried breadcrumbs – a Monroe favourite garnish – offer a cheap and effective option in this case. Granted, you’ll never find them on a Chinese restaurant menu – but really, what does that matter?)

So, while I’ll leave others to test out Jack Monroe’s recipes, I’ll continue to endorse the principles behind her work – even as she becomes a near-constant presence on our cookery pages and TV screens, as she inevitably will. Should I ever feel any momentary pangs of resentment towards her success, I’ll do my best to dismiss these for what they are: simple jealousy.

And where I fit in – if I fit in – is at the margins of Monroe’s work. She’s doing the measuring, the costing and the testing – and, if I’m honest, taking quite a few of the bullets – so the likes of myself don’t particularly need to. What I’ll continue to do instead is to explore the myriad possibilities that exist beyond the confines of the recipe, and to encourage anyone who appreciates the value (there’s that word again) of her approach to cooking to develop the confidence to take it a step further.

I’m one of a growing number who have come to be mildly addicted to the lurid red stuff, though I’m not quite at the point of putting it on everything I eat – a stage I reached with sweet chilli sauce around a decade ago, before the excitement wore off as rapidly as it had developed.

Still, I don’t much like to find myself sriracha-less. As Sue Quinn observes in the article, “a dash of sriracha, with its rich combination of chilli, vinegar, garlic, sugar and salt, can hide a multitude of culinary sins.”

True enough; though I’d offer a more positive assessment than that. As an ingredient as much as a condiment, it’s a relatively cheap and convenient pathway to a multitude of virtues. And while the rise of sriracha might be perceived as a prime example of our collective chilli addiction, I don’t believe it’s all about the heat.

Sugar and vinegar are, for me, the unsung seasonings. Most of our table sauces, spicy or otherwise, rely on their capacity to offset one another. But in much of our cooking, we tend to forget about them.

Much of our passion for sriracha arises from the fact that it achieves the sweet-sour balance that best suits our tastes. Sweet chilli sauce is too sickly, the sharpness of the vinegar obliterated by an excess of sugar. With Tabasco, it’s the other way round. But sriracha gets it just right. It’s the chilli sauce Goldilocks would go for.

To my mind, appreciating the power of the sweet-sour balance is a fundamental part of cooking, whether or not chillies are involved. But if you always cook from recipes, it’s an appreciation that you may never gain.

Most recipes will invite the reader, almost as an afterthought, to “season to taste with salt and black pepper”. I’ve rarely, if ever, seen a recipe that directs the cook to season a meal the way I normally do: with salt and pepper, yes, but also with something sweet and something sour, judiciously added and counterbalanced to lift the flavours of the dish at the last moment.

But the dutiful recipe-follower, obeying the writer’s instructions to the letter, is left somewhat hamstrung. He or she may possess the tools to enhance the dish, but without the explicit authorisation of the recipe’s creator, is reluctant to use them. The role of enhancing and balancing the flavours is handed over to the eaters, armed with ketchup, mustard or, these days, sriracha. And the shared perception at the end of the meal is that the cook has produced something rather dull, only rendered interesting by the welcome presence of various types of magic dust on the table.

Get the balance of flavours right before you serve the meal, and it will have a quite different impact. If a finished stew fails to inspire and you’re not sure what to do, think sweet and sour, not just salt and pepper. And if a further flavour boost is required, bear in mind that if something works as a condiment, it will work just as well as an ingredient (perhaps with the exception of mayonnaise).

That last observation is central to my favourite post-pub meal, ideally suited to those times when knife work is too hazardous to contemplate.

Fill a shallow oven dish with a single layer of spare ribs and douse with sriracha, a little soy sauce and enough water to (just about) cover the ribs. Cover with foil and cook in a medium oven for an hour or so, or a low oven for just about as long as you like, then remove the foil and turn up the heat, allowing the ribs to brown while the sauce reduces.

Accompanied by a pile of lovingly microwaved rice – 2 parts rice to 3 parts water, covered and microwaved on medium until the water has been absorbed – it’s a meal that suits both my tastes and my capabilities after a night on the sauce (and for once, I don’t mean sriracha).

It works because sriracha does. The sweet-sour balance is already just about right, and the chilli and garlic I crave are present and correct, saving me a chopping job I’m ill-suited to undertake. Nothing else is needed, other than a little extra salt (from the soy sauce) to suit my tipsy tastes.

A big bottle of sriracha, costing as little as a couple of quid depending on where you look, will be enough for dozens of meals along these lines, with plenty to spare for table use. Compare that to the price of almost any jar or sachet of sauce in the supermarket, and the prospect of a sriracha drought becomes as much of a worry for the pocket as the palate.

Working out the answer to this question is one of the things I look forward to each day, whether I’m planning to shop for the ingredients (I don’t yet know what, of course), improvise a meal from what’s already in my fridge and cupboards, or some combination of the two.

What I’m assuredly not going to do is go shopping for a prescribed combination of ingredients, assemble them to somebody else’s specification, then take a photograph of the results and send it to a national paper on the off-chance of winning a cookbook.

But plenty of people are, courtesy of this competition from the Guardian. Cook your favourite Nigel Slater dish, send in your photo, and you might just win a signed copy of his new book.

Looking purely at the ratio of required effort to potential rewards, you’d be better off buying a lottery ticket (and I’m not going to do that either).

But this competition has next to nothing to do with what the entrants might win, and almost everything to do with the kudos of seeing their “creations” appearing in the pages of the Guardian, Observer Food Monthly or wherever.

It’s designed to appeal to the people who habitually photograph their meals and post the pictures on Facebook or Instagram, most probably accompanied by the caption “NOM NOM!”.

The fundamental pointlessness of this is generally well understood, at least by the silent majority who don’t do it. I suppose it’s just about forgivable – apart from the “nom nom” bit, obviously – if you’re posting a snap of a meal you’ve created yourself, perhaps accompanied by some insight into how you made it.

But when the height of your ambition is dutiful emulation, the act of photographing your dinner reaches a new level of ridiculousness. Undertake a household task, take a photo of the results and send it off into the ether. You might as well post a picture of your completed washing up.

In fact, I think I will.

(My washing up. Today.)

Or, if replication is now perceived as an art form in itself, why not have a competition to find the reader who can produce the most accurate reproduction of the Mona Lisa? It would be utterly futile, of course. But is it really that much dafter than the contest they’re running at the moment?

None of this is intended as a dig at Nigel Slater himself. I like his writing, and I’ve no idea whether he had anything to do with devising this spectacularly silly competition. But what it represents – a perfect storm of obedience and vanity – sums up the flawed relationship we’ve developed with food and cooking.

Years of watching cookery programmes on telly – and, in particular, shows such as MasterChef or The Great British Bake Off, where cooking meets reality TV – have fundamentally affected our perceptions and priorities.

It’s an inevitable consequence of a visual medium: we can’t taste the food that the chefs or contestants produce, so we become obsessed with its appearance. Even where actual sampling is involved, we can never be the ones to do it, so the analysis of the food becomes secondary to what we can see; except perhaps when things go hideously wrong, and Gregg Wallace and friends get the enjoyable opportunity to dust off some of their more colourful figures of speech. In other words, what food programming isn’t about, and arguably can never be about, is the most important thing of all: the taste.

And yet, rather than allow our own palates, judgements and preferences to guide us, we persist in trying to replicate other people’s creations, whether we’ve Sky-plussed them from the TV or, more likely, read them from a cookbook, newspaper or website. We’ve never tasted these people’s cooking, and we never will; yet we follow them nonetheless, in what amounts to an act of blind faith. And if the end result fails to inspire, we don’t question the merits of the recipe; instead, we presume we must have done something wrong, and vow to do a better copying job the next time. As behaviours go, it’s bizarre to the point of masochistic.

Add to this the many other factors that militate against a recipe-driven approach to cooking – the drudgery, the inherent deference, the potential for wastefulness – and the arguments for an alternative methodology become compelling.

Elsewhere in the Guardian’s pages, you can read the work of a different kind of food writer: the newly ubiquitous Jack Monroe, whose rapid journey from impoverished single mother to successful blogger and Labour Party campaigner has earned her the coveted accolade of being smeared by Richard Littlejohn in the Daily Mail. (I can’t bring myself to link to the odious Littlejohn’s original piece, but Monroe’s eloquently indignant riposte is well worth a read.)

Her articles include recipes, naturally – newspaper food editors aren’t ready to let go of that particular comfort blanket just yet – but they also explore more interesting and relevant issues around resourcefulness, inventiveness and cost. In short, she writes about a subject that’s long since gone out of fashion, but remains as relevant as it has ever been: home economics.

While the term itself isn’t exactly alluring, taking your lead from home economics doesn’t mean that cooking becomes boring: quite the reverse. Even if you’re relatively well off, there’s immense satisfaction to be gained from finding value, making use of what you have, avoiding waste and turning the proverbial sow’s ear into an equally proverbial silk purse. And as with any creative process, the act of invention can bring enormous pleasure in itself.

The end results may or may not be worth photographing. That doesn’t matter – and anyway, you don’t want your dinner to go cold while you’re getting that perfect shot. What matters is that the food is nourishing, satisfying and tasty.

Mind you, if the Guardian were to run an alternative competition, inviting readers to photograph and describe the best meals they’ve ever made for a quid a head, that would be a hell of a lot more interesting, and infinitely more meaningful.

I was going to write a piece on texture for the blog, then I remembered I’d already written one and published it to my personal blog a few weeks back.

In the likely event you missed it, I’ve reproduced it here. Hope you enjoy.

Originally published on 1 October 2013

AUTHOR’S NOTE: On the Guardian website today is a piece by Amy Fleming on the changing shape of the Dairy Milk bar. I admit that this particular furore had passed me by; but as it happened, I’d just finished a piece on a connected subject, but with the emphasis on what the shape and texture of our chocolate can teach us about cooking creatively. Here it is.

A couple of connected questions for you.

Firstly, how many dishes do you know how to cook? Five? Ten? Twenty? More? Enough to keep you and yours from staring sadly at your plates while thinking “oh, not spag bol again”?

Secondly, what’s your favourite chocolate bar?

The link between the two questions may not be immediately obvious, but bear with me. If you’re able to answer the first question with an exact figure, you might do well to spend some time thinking about the second.

These results aren’t exactly surprising. What is even less surprising is the fact that the supermarket is using the findings to promote its range of pre-prepared meals. The message is unambiguous: if you don’t want to eat the same thing over and over again, look no further than the ever-expanding ready meals section.

Well, here’s an alternative idea. If you’re about to cook the same meal for the 521st week running, don’t just admit defeat and reach for the convenience food. Instead, borrow a little trick from the chocolate-makers: take those familiar old ingredients, and look for a new way to put them together.

And if you doubt whether that will make any significant difference to your meal, may I refer you back to the chocolate question.

Wispa campaign

Do you remember the outcry when Cadbury withdrew the Wispa from sale in 2003? Attempts to rebrand it as a variant on Dairy Milk were unsuccessful, and the bar was finally restored permanently to our shelves in 2008, following a coordinated protest on social media – a Wispa campaign, if you will.

I can certainly recall being one of the outraged many when the Wispa disappeared; but why? Why didn’t I just shrug my shoulders and buy a Flake, a Twirl, a Spira – itself discontinued in 2005, prompting a Facebook campaign of its own – or any of the other milk chocolate bars made from exactly the same ingredients?

The answer, of course, is in the texture. The ingredients might be the same, but the eating experience is quite different in each case, solely as a result of the relative distribution of chocolate and air.

Whether we realise it or not, we have a pretty sophisticated understanding of the power of texture, at least as far as confectionery is concerned. As a nation of eaters, we know our Twirls from our Wispas. But when we cook, it tends to be the forgotten factor. We’re forever looking for new and exciting flavour combinations; but we’re oblivious to the textural possibilities of the ingredients we buy every week.

Where flavour meets texture

Writing in the Scotsman, Tom Kitchin discusses the years he spent as a trainee chef, learning different ways to chop and prepare ingredients to produce a range of effects. And he makes the crucial point that “cooking isn’t just about recipes. It’s about taking ingredients and making them taste as good as you possibly can.”

This is a sentiment I’d wholeheartedly endorse – to the extent that I’ve just written an entire book about the benefits of cooking without recipes – but I have a slight problem with the terminology. To return briefly to matters chocolatey, is there actually any difference in taste between a Dairy Milk and a Flake? I’d argue not; but their contrasting textures lead us to perceive them differently.

So why wouldn’t the same apply to savoury ingredients? Our eating experience is determined by the combination of flavour and texture. The two factors might not quite be equally weighted – in that no amount of textural magnificence can rescue a meal that tastes repulsive – but they are as fundamental as they are inseparable. A gelatinous, mouth-coating, lip-smacking sauce is a world away from a watery broth, even if they “taste” about the same. And the coleslaw in your sandwich would be an altogether cruder – and, let’s face it, weirder – experience, if the vegetables were roughly chopped rather than finely grated.

Safe experimentation

When I’m encouraging people to get creative with their home cooking, I invite them to think of their kitchens as their own personal research and development departments. The potential problem with this, of course, is that few of us can afford the time or expense of a failed experiment when we’ve got a family to feed.

But this is exactly where textural innovation comes into its own. Experimenting with flavour can be a fraught business. Attempt to pair lamb with banana, and you might just create something wonderful, but there’s every chance that it’ll be disgusting to the point of inedible. Focus on the texture, however, and you run none of the same risks. The ingredients are all familiar, you already know you like them, and you know they work well together. So you can get as creative as you like, secure in the knowledge that there’s not an awful lot that can go wrong.

So when you’re next faced with the ingredients for that over-familiar spag bol, why not try putting them together in a different way? Roll the minced beef into balls – you won’t need any additional binding agent, as the tackiness of the meat will be enough on its own – rather than using loose mince. Try putting the garlic in the meatball mix rather than the sauce, so that each morsel carries a distinct garlicky hit. If you’re in the habit of leaving the vegetables as chunky dice, try chopping them as finely as you can, then frying them gently so that they melt away into the sauce. Experiment with solid cuts of meat instead of mince, and with how finely you chop them.

Alternatively, why not play around with how the constituent parts (pasta and sauce) are divided? Leave the bacon out of the Bolognese and the Parmesan off the table, and instead, toss the spaghetti with Parmesan and fried pancetta before serving alongside the sauce. And feel free to take your pick from the dozens of shapes of pasta on the supermarket shelves, knowing that each will produce a slightly different effect.

It’s true that several of these examples would fail to meet any accepted definition of spaghetti Bolognese. But to put it bluntly: so what? If it turns out that I prefer it, then give me “bucatini al Tom” any day.

The fallacy of authenticity

Here’s one final question. If it’s so straightforward, why aren’t we all in the habit of experimenting with texture every time we cook?

In my view, there are two reasons. I’ve mentioned the first already: we tend to underestimate the significance of texture in our meals. The solution to this is straightforward: think back to the chocolate bar question, and remind yourself that the same principles apply to everything you cook and eat.

The second reason is that we’re all too bloody obedient for our own good. We follow recipes dutifully, rarely bothering to ask why. And we have an unhealthy obsession with authenticity, as if there were some omniscient spaghetti God watching our every move, ready to strike us down at the first sign of non-compliance.

Well, I’ll risk an eternity of pasta damnation by saying to you now: there isn’t.

Food, like language, evolves constantly. Moreover, there are only two characteristics shared by all of the world’s most celebrated dishes, from paella to haggis. The first is that they were invented not by design, but by happy quirks of necessity and circumstance. And the second is that no two cooks can agree on the “right” way to make them. So our quest for authenticity is doomed to failure, because the holy grail we seek simply doesn’t exist.

So, with all that in mind, might I nudge you gently in the direction of a little textural experimentation? Take the meals you know only too well and reassemble them in a way you don’t. You never know: you might just stumble upon your own savoury equivalent of the Wispa bar.

And best of all, the next time anyone asks how many dishes you know how to cook, you’ll be able to answer honestly and with pride: “I have absolutely no idea.”