Category Archives: Media advice

“Never use ‘I’ in columns,” was one of the first lessons I received as a journalist. Oh, there I go – breaking the rule…

Still, my errant behaviour is a small ripple in comparison to the first-person obsession of modern journalism. The thinking behind not using ‘I’ is simple: the reader wants to read your opinions on a subject, not the story of your life. Few people are interesting enough to write in the first person (God, yes. The Queen, maybe).

There’s another reason for not using the word ‘I’. You’re writing a column, so everyone knows it’s your opinion. In other words, you’re stating the obvious. And it’s boring.

Which makes the national media’s obsession with first-person accounts slightly baffling. The Mail, for example, presents a daily collection of extended rants – telling the reader how the journalist bravely gave up fish paste for Lent. Or something of that ilk.

No paper is immune. On Saturday, The Guardian Magazine splashed with one man’s story of why he doesn’t eat meat anymore (there was another cover story from the same author in today’s G2). I don’t eat meat either; can I have a book contract?

So, what explains the rise of I-journalism (see Steve Jobs, I can be clever with ‘I’ too)? Probably a combination of factors: limited resources; the rise of celebrity culture, where everyone is famous for 15 minutes; and the cult of the individual, where everyone believes they have something interesting to say – and everyone else is meant to find it interesting.

Which raises one final question. Why am I writing this self-obsessed blog…?

My wife knows nothing about technology. She doesn’t have a Facebook account and watching her search the web is more frustrating than watching Aston Villa fail to score in four successive Premier League matches.

She cares nothing for the bits and bytes of technology, like much of the world (an oft-forgotten detail). But she did mention that she’d heard Apple had released some new technology.

“The Apple iPad,” I said, recognising that while she cares nothing for Steve Jobs’ latest device, she is equally unable to avoid media hype. The iPad – depending on your chosen review – is either a big phone, the greatest innovation ever (since the last Apple innovation, anyway) or the saviour of the publishing industry. Such hype suggests we’re all about to start reading books and papers on our iPads; my wife’s response to that suggestion?

“Reading is all about relaxing, so why would anyone choose to read a computer screen?”

Quite (now get your own Facebook account and stop using mine to connect with your mates).

The nature of work has changed. Want proof? Search LinkedIn and see how many people choose to define their job role with what might previously have been seen as non-traditional, even esoteric, terms:

4,475,626 results for owner

3,677,739 results for consultant

1,911,106 results for specialist

537,068 results for advisor

469,201 results for founder

468,044 results for expert

405,901 results for freelance

398,420 results for contractor

365,215 results for writer

138,506 results for speaker

84,840 results for strategist

61,032 results for ambassador

50,013 results for thinker

45,848 results for visionary

42,614 results for guru

20,318 results for blogger

16,270 results for evangelist

4,582 results for entreprenuer

3,094 results for gatekeeper

1,637 results for futurist

Old favourites – like freelance and contractor – are still popular. At the start of the last decade, such descriptions were seen as catch-all phrases for individuals operating at the fringes of the formal economy and providing an outsourced service to larger businesses.

Twenty years ago, futurologists predicted something called ‘the internet’ would allow us to all work flexibly. Now, in a new economy driven by collaborative technologies, freelancing has become the mainstream. A global economy of contractors is fast-developing, with individuals selling their expertise on-demand.

Old monikers – such as freelance and contractor – do not necessarily encapsulate the act of work. The result is a collection of meaningful/meaningless terms that are used to describe what people actually do, or would like to do.

I wonder how the table will develop as the economy changes. Feel free to suggest other esoteric descriptions.

My Twitter feed seems to be a constant trickle of lists, with the latest bunch of social media gurus keen to impart their knowledge on topics like search engine optimisation and social networking. Good for them.

And good for me, as I jump on the top tip bandwagon and ride into the search-optimised sunset. Ladles and gentlespoons, let me unveil my top five types of numbered list stories:

Top 10: The all-time favourite – most top tip lists come in tens and there’s a reason for that; it’s a round number

Top 100: The ultimate list story – particularly good for top album blog entries. And for Channel 4 TV shows compiling clips from the 1980s

Top 5: Half a top ten but not necessarily half as good. Great for your basic, short tip list

Top 6: Also has a nice, round feel. Useful for list compilers that are aiming for ten, but who quickly run out of ideas

Top 9: There’s an honesty about giving a top nine; the compiler knows they’ve only got nine points and they’re admitting as much

I recently saw a ‘Top 9’ list story where one respondent complained that the journalist hadn’t bothered to round the list up to ten. That might be so, but at least the journo was honest – the scribe clearly got to nine and ran out of ideas. I mean, it’s not like these list stories take five minutes to put together.

Social networking is great. You can use Facebook to see photos of people you lost touch with years ago, celebrating the birthday of someone you don’t actually know. You can use LinkedIn to hype yourself up as the latest, greatest ‘social media guru’. And you can use Twitter to find out that loads of people got up this morning, ate some food, listened to a bit of music, were busy at work, went home, watched TV and went to bed.

But social networking is also a bit odd. I was watching the news on TV earlier and there was a lot of coverage of Peter Harvey, the teacher from Mansfield who has been charged with attempted murder. After I’d finished my fix of retro information gathering (news on the TV), I went all Web 2.0-tastic and did a search for the teacher on Facebook. And there was quite a bit of stuff, some of which surprised me – names, alleged actions, etc. You know, the kind of stuff the retro media aren’t meant to print in case of prejudicing a trial.

But all that stuff is fair game in the world of social networking. Isn’t it?

When I was little, community was the first word in ‘community centre’. It still is, of course – even if I no longer live in the countryside and my small world no longer revolves around a small building in the middle of a small village.

While community centres still exist (I think), the word community seems to have taken on a life of its own. Rather than just a simple adjunct to another word, like centre or rural, community is a term imbued with its own connotations.

When TV reporters head into the field (usually a place, rather than a green piece of land), they refer to the community – they talk about ‘the reaction of the community’, ‘the feelings of the community’. With a knowing tone, we are all supposed to know what they mean – we’re supposed to feel their interviewees’ collective pain. Because in the end, we’re all part of a wider and understanding community. But are we?

I would suggest not, actually. What we have instead is rampant individualism; what started in the 1980s has come to a head in the consumer- and celebrity-obsessed noughties. The internet isn’t (always) helpful. Web 2.0 is meant to be about collaboration and community but often manifests as individualism, with everyone worried about their own presence – look at my Twitter page, pay attention to what I think, please read my blog. Talking of which, please read my blog.

In this webtastic age of search optimisation, one answer is a lot of references to stuff that will get you a high Google ranking. The basic theory goes something like ‘never mind the quality of my story on service-oriented architecture, just check out how many times the headline mentions the recession, Angelina Jolie and Twitter’. Goal.

Or is that an own goal? Back in the world of ink, paper-based headlines are usually short. There’s often a pun involved, too. I worked with a sub who thought song titles by The Smiths made the best headlines. The theory works well to some extent, such as in the case of ‘How Soon is Now?’, ‘What Difference Does it Make?’ and ‘This Charming Man’. But ‘Girlfriend in a Coma’ and ‘The Queen is Dead’ have a more limited applicability.

And of course, there’s the online problem. ‘This Charming Man’ is a nice title for a magazine article on a friendly CIO. But most paper-based articles end up on the web and would you click on the article if you weren’t a fan of The Smiths? More importantly, would you be able to find it?

The end result is that puntastic magazine headlines get rewritten for the age of webtastic search optimisation. In fact, stories start to exist simply because people know they will get hits, such as ‘Top 10 tips’ articles. As journalist Andy McCue said to me the other day: “I’ll go mad if I see another ‘Top 10 tips for beating the recession’ article.” Check out Google News, my friend – there are plenty to push you over the edge.

Still, the headline and the content are no guarantee of attention anyway. I heard a woman on the train say to her friend the other day: “So, what was that story about a plane landing on a river? I missed that.”

It is difficult to understand how she could have missed the story of the US Airways plane landing on the Hudson River. Well, unless she’d sold her TV, refused to read, smashed up her radio, disconnected the computer, refused to talk to another human being and moved to Venus.

Which, I assume, she hadn’t. In short, you just can’t grab some people’s attention – even when the headline is great and the content of the story is even better. But good luck trying.

There’s been a lot of guff about the carbon cost of Google searching during the last couple of days, with the debate prompted by research from a Harvard academic, which suggests two Google searches produce the equivalent CO2 of boiling a kettle. If you’ve found this post through a Google search, I hope you’re enjoying your ‘equivalent’ of half a cup of tea.

The research doesn’t really tell us anything we don’t already know – in short, searching for stuff, using energy-hungry computers and data centres, eats a lot of power. So, I started thinking about stuff we’re doing that eats power – especially the stuff that is meant to be green.

Take Christmas cards, for example. No one posts Christmas cards anymore (except my wife and her Mum). People send emails, Facebook pokes and electronic cards – it’s meant to convey the same message and can be sent with a cheery: ‘I am saving the environment by not posting a paper card’.

Except you’re not, because all this electronic stuff eats carbon, too. And it’s rubbish anyway – cards are much nicer and much more personal. And I bet posting a card causes less of a drag on resources than all those tweets, emails and pokes. Long live the Christmas card!