
There’s a certain amount of, well, let me call it self-pitying whining going on in some quarters about how hard it is to run a tech content site in Britain. Oh, it’s expensive. Oh,

“£30,000 (my personal investment into this business) doesn’t get you very far in media.”

Well, cry me a river while I play the world’s smallest violin. (Let me also ask: where exactly has that £30,000 gone? My understanding is that pretty much nobody who worked on the early site setup got paid, and that the continuing web hosting is provided at, well, generous rates.)

Because you know what? There are plenty of people who are making media, and tech writing, work in Britain. Yes, startup tech news sites in Britain. How do they do it? Not by insulting the people they’re writing about, or flying the most ludicrous stories (proven wrong by the passage of time – six months now), but by making contacts, understanding the market, working hard, and being prepared for it to be a real slog.

Want some names? Stuart Miles, creator of Pocket-Lint – who is a fantastic journalist: first – as far as I know – in the UK with details about the Nokia Lumia 800 having a micro-SIM, about the iPhone 5 having a nano-SIM, about Orange/T-Mobile/EE having the iPhone 5 with LTE. And that’s just off the top of my head; someone who knows the site’s, and his, output better could doubtless name more.

How did he manage it? By knowing people, knowing the industry, making friends, listening, talking. And all this while being a father to a young child too. (The latter is the really tough part.) Oh – and he didn’t hire people on vague promises of money. He just built the site until now it employs multiple staff. And is a charming guy as well as being very good at his job.

Or how about Tom Warren, creator of WinRumors? How on earth do you create a site writing about Microsoft and all of the wrinkles of Windows and everything else when you’re situated in the UK? By being determined, and prepared to slog away at it, that’s how. I haven’t met Tom – I only know him through Twitter – but I have been constantly impressed by the fact he could spot the nuances in announcements, or see the angle for his audience in something everyone was covering, or just get in there with a rewrite of something that was running, and get the story up. Was WinRumors a gigantic money-spinner, or money-sink? Neither, as I understand it; I think Tom was doing another job while running the site. Often, it’s just sheer determination to post that makes the difference in the modern world. Tom has since been picked up by The Verge, which is well-funded; but if anything were to go wrong, you can be sure another site would pick him up rapidly. Or he could start his own.

Or – there are tons of these – Rafe Blandford of All About Symbian? It’s an impressive-looking site – and he also grabbed All About Windows Phone when he saw which way the wind was blowing, Nokia-wise.

Want another? How about Electricpig? And not forgetting the granddaddy of them all, The Register, which started in 1994 (yes, really) when you’d only have broadband – or perhaps any internet connection – if you were in an office. It’s a good employer and quick payer (at least in my experience; I wrote for it as a freelance in 2005, and only have good memories of the experience).

Or of course Mike Butcher at TechCrunch Europe, who ran it pretty much as a one-man show (and before that had his own mbites offering) for a substantial time; it’s hardly as if Michael Arrington was leaning over his shoulder or pouring money into his bank account.

There are tons of sites like them – British tech journalists, doing stuff their readers want, making contacts, breaking news, remembering the adage that news is “stuff you care about, and/or stuff you want to pass on”. (Give me some more names in the comments, I’ll add them here.)

Which is why journalism is done best when it’s done with the readers in mind, and when it’s not trying to annoy for the sake of annoying but instead with the aim of shaking up the reader’s expectations. For while there are plenty of companies that find The Register’s style irksome, they can’t deny that it gets facts in front of readers. Lots of readers.

So yeah, it’s hard, but it’s not that hard as long as you don’t have delusions of grandeur, and approach it in the expectation that it will be really hard. I haven’t created a tech site – though as a freelance (twice) I’ve had a mortgage to support, and the second time kids as well.

Equally, starting a blog has never been easier. You just have to bring some quality to it if you’re going to make money at it.

OK, so this is a review, and it contains spoilers. Though that raises the question of whether you can spoil a film that is irredeemably bad in the first place.

To begin:
I had high hopes for Prometheus. I love the original Alien film – as I’ve blogged here previously, its script and screenplay (and design and direction) are timeless marvels. The fact that it was made pre-personal computers (so that all the computer interaction is utterly clunky to modern eyes) is actually a blessing, because it lets you focus on the thing that is always fascinating in a good film – the interplay of the actors, the things they say and do, and the plot.

If you want a hilarious dissection of the first half-hour or so of Prometheus, do enjoy yourself by going over to Digital Digging, which starts by looking at it from an archaeologist’s perspective, and then just the perspective of someone who wants people to behave a little more rationally than just “that hole looks dark, I think I’ll stick my head in it and then turn the light on”. The comments (especially the dimmer bulbs transported over from Boing Boing) are worth a laugh too. Bear in mind, of course, that some day those people will be eligible to vote. You could also enjoy James Whatley’s post on the many WTFs in the script.

Focus, always focus

But I want to focus on those elements that Prometheus missed, which are the essential things of a successful film. In part it’s because I’d like to be able to imagine what a good screenplay would look like, but also because it’s only the very worst of things that shows you quite how badly things can go wrong.

I now discover that one of the screenplay writers worked on Lost. Oh, the TV series that endlessly threw off loose ends and never bothered to tie them up, and sprawled over six series before gasping over the line? Sure, that would be ideal discipline for writing a self-contained film. Not.

Not that a film has to answer every question. You can’t. In a screenplay, some things just have to be accepted: why someone is a stepchild, why they are rich, why there are a bunch of pods that seem to just be sitting there, where the blue light that plays over them came from. (Answer: from The Who, who were rehearsing in the studio next door. Sorry, did I spoil that?)

I knew in the first moments of Prometheus (which I watched in 3D at an Imax – I told you my hopes were high) that something was very, very wrong. Why? The music. Whereas the original Alien runs violin bows up your back, this was playing jolly major chords as though you’d just accomplished something. How can a film that’s going to discover the makers of the Alien be jolly? That’s all wrong.

Cut to a scene with SuperOffWorldMan drinking something and curling up and dying and his DNA all splurging into the already rather fecund streams around Iceland. Er, why? Why does he need to do this in order to seed the planet? Eh? This is not explained, and while it’s OK to have some things be mysterious, it would be nice to feel they fit into a broader picture. Later we learn (after being told “you can’t cast off hundreds of years of evolutionary theory”) that human DNA is a 100% match with mateybloke’s. Which raises the question rather forcefully of the whole animal kingdom and the preservation of DNA and genes throughout the entire phylogeny. Seriously: if you’re going to play around with science in a film, try not to insult those in the audience who might have even a vague scientific knowledge, because you’re going to piss them off. None of the science in the whole thing was the least bit convincing. None of it. It’s not even worth bothering to write why it wasn’t. None of it is remotely how scientists behave – that is, thoughtful, rational, reflective.

Next moment I knew this was a wrong ‘un: Noomi Rapace, as an archaeologist, finds a cave (how? Not explained) and shouts down a Skye valley to another archaeologist. If you’ve ever gone anywhere in a valley of any description, then you know that you can’t shout down them and expect anyone to hear a damn thing. But, magically, blokey down in the valley does. Though by the time he’s made it up the valley, she’s tidied it all up and dated it. Uh-huh.

Space stupidness

Some space stupidness follows; my heart sank as I saw that the Prometheus spacecraft is meant to have a crew of 17. Seventeen people. Now, that might actually be what you need to run a spacecraft. However, for a film it breaks a key rule.

Rule No. 1: how many characters? People can only follow a story with a maximum of seven, perhaps eight, characters to worry about. Seventeen is a bus load. (Alien: seven people. Friends: six people. Thirtysomething: six people. Mad Men: six people, plus a few who come in and out – Don, Peggy, Joan, Roger, Bert, Pete, and the wives and some of the others.) Do not try to write more than six people into your script unless you absolutely have to have the seventh. (Stuff Magazine’s Mat Smith, who has been to see it twice – he’s a man of some taste – says that he still doesn’t know who some of the people are.)

Rule No. 2: pacing. This film doesn’t have it. There’s no obvious motor. Yes, we know that it’s an expedition, and we in the audience are all on edge expecting an alien to jump out; but that’s not the same as a motor, the reason why you keep watching. In thrillers, it’s called the MacGuffin – the excuse for keeping you interested. (So in Mission: Impossible – Ghost Protocol it’s the stolen Russian nuke codes, for example. Doesn’t matter if you’ve seen the film – you already understand that stolen Russian nuke codes are probably something you want to be unstolen, or grabbed.) In Prometheus, what is the MacGuffin? What are you waiting to find out? You’re never sure, and that’s a key weakness.

Rule No. 3: tidiness. If you drop hints about events or people, don’t then abandon them. So Idris Elba, having emerged from the sleep things, decorates a Christmas tree, because they’ve missed a whole load of Christmas parties. Aw. Except that’s the last reference to Christmas or parties. We don’t learn whether he’s a party type, or whether Christmas has some deep meaning, or what.

Rule No. 4: character. This is so important. Idris Elba again: he’s the captain of the ship. This should in theory mean that he can tell anyone what to do in order to keep the ship safe. If he can’t do that, then he’s just another Red Shirt. But we never find out which he really is, because he never has the sort of confrontation with Charlize Theron which would tell us what he thinks of their relative positions. Only that he would like (and gets, apparently) a shag with her, which doesn’t actually advance either of them as characters; although it does tell us that he’s perfectly happy to leave the bridge uncrewed while two of the crew are marooned off-ship, leaving them effectively without radio contact. So, basically, a completely crap captain. Except that at the end he then becomes big brave captain, prepared to try to wallop the departing alien ship. Why? What? When did that change come about? And how exactly did they choose him in the first place, if he’ll abandon the bridge like that? The character makes no sense. You can’t predict what he’ll do at any time – although I did realise after a while that it would always be “the utterly stupid thing”. Someone been attacked by an organism off-ship? Let them on! Someone attacking the crew down below at the ground hatch? Open the hatch a bit more so you get a good look – don’t want to close it off or anything. Hell no.

Rule No. 5: consistency and plausibility. Exhibit 1, as above: Idris Elba and his wandering characterisation. (I blame this on the script, since his Luther and Stringer Bell were so powerful.) Exhibit 2: plausibility. Not explained: how do the archaeologists know that the star formation is… how the hell do they know anything, actually? Why does the “invitation” turn out to be a pointer to what we are led to believe (perhaps wrongly, mind) is a military dump?

And while we’re on the military dump thing: another part of the plotting/pacing/plausibility thing is that at no point do the characters get together and try to figure out what’s going on. In Alien, after the alien escapes, they gather and plan how to catch it; after Brett gets done, they gather again and plan what to do. See? Talking. A council of war. Some discussion of quite what they’re dealing with.

Update: Rule No. 6: script. I was reminded of this by this tweet (if you can’t be bothered, it says: “Ok, yes, Prometheus was awwwwful. What a disappointment. I feel like I need to scrub the black goo out of my brain by watching Contact.”)

Damn, yes. Contact – the film in which we discover aliens beaming messages to us instructing us how to build a rather large and scary structure – has one single exchange which deals with questions about faith and belief and existence far better than anything in the whole of this film.

So the setup in Contact is this: Jodie Foster (rationalist scientist) is debating with a reasonable, but religious fella, about how you can “prove” things in religion, and whether science can prove everything (she maintains it can). Foster’s father, we’ve already learnt, is dead. And the fella asks: “Did you love your father?” Of course, she says; very much. “Prove it.”

What a line. It’s the sort of line that completely floors you. And of course it floors Foster. That’s great scriptwriting – create a situation where the viewer is drawn in, and then leave them in the same place as your main character.

Doesn’t happen in Prometheus.

What Hollywood wants

But no, that’s all ignored; instead Hollywood wants what Hollywood gets, which is a daft action movie, with loud noise and unscary creatures, unexplained motives (I couldn’t figure out what Fassbender’s android was meant to be doing at all) and a refusal to deal with stuff like plot in favour of Sturm und Drang. Kids might enjoy it, but I think that actually kids can tell the difference between lazy scripting and good scripting.

If I hadn’t been told this was an “Alien prequel” (which it can’t actually be, because it’s not the same planet – the number differs, there isn’t an astronaut in the pilot chair, it hasn’t been attacked by an alien, there isn’t a message warning people off) nor that it was made by Ridley Scott, then my expectations would have been much lower; I might have tolerated it, but I’d still have thought that it was crap, with stupid behaviour and cardboard characters who don’t do things real people do.

Let’s leave with this, from Red Letter Media, which asks most of the questions that I couldn’t be bothered to ask.

What was that black goo – was that different to the sparkly green goo… Why did Ridley Scott let his 12-year-old son do the makeup for the old man… How did the old man know where to point where the scientists were when he did the introduction…

Basically, it’s just as bad as it was in 2D, but some of the sequences are a bit more 3D-y.

But the bits that you really want to be more 3D-y – to take advantage of the (limited, but still not zero) possibilities that the three-dimensionality offers – aren’t.

Specifically, the pod race, which you’d think would offer terrific opportunities, if it were properly done, to give you that “oof” feeling in the pit of the stomach, is just the same pod race as in the 1999 version, but with a shimmer of 3D-ness added. The stone pillars don’t loom at you, the tunnels and canyons don’t zoom out of the screen.

And the climactic fight between Gingerchops (Darth Maul to you) and the Linen Sack Wearers (Jedi Knights if you prefer) really wants to have some bullet time added. (The Matrix and this film originally came out in the same year, 1999.) But of course they can’t. You’d need to call everyone back, redo the sets, and reshoot it.

However there’s a worrying indication that 3D is being used just as the CD was in its early years – for the industry to shore up its revenues and profits by redoing films that did well the first time. As we left the cinema (the boys had wanted to see it) I noticed an ad for something else from the past – so old I’ve already forgotten what – being re-offered in 3D.

It’s a bad trend. Star Wars might have its own fanbase who’ll go to everything (and a new younger audience who have never seen it in the cinema) but if the film industry tries to rely on 3D as its moneymaker, things are going to go badly.

Not that they’re going swimmingly as it is. There were ads for tons of rubbish films there – Prometheus (looks like nonsense), Abraham Lincoln: Vampire Hunter (er, what?), Battleship (more Transformers-like nonsense). No doubt they’ll have 3D versions. Not worth the money, I’d wager.

In fact, besides Avatar (which I haven’t seen) is there any 3D film that really makes good use of the technology? Actually, is there any good use of 3D in normal cinema films?

In case you’re interested in how the book will read, here’s something that I wrote last night. It’s looking at one of the key stages in the iPod’s development: the very early stages. So here’s some draft content. It’s got notes and repetitions and things that need to be tweaked, and the name of the main interlocutor has been removed because, well, that’s for the book, isn’t it?

Comments welcome (eg “you left out the bit where…” or “just as important in 2002 was…”). And I’m really interested in hearing from anyone who:
– worked for/with Microsoft around the time it was trying to get Windows Media Player/Audio/Janus implemented

– worked for/with Microsoft on its “online services” system – MSN – while it was being passed by Google in 2002-4 for revenues and market share: what did Microsoft think, internally? (I’d be just as interested in talking to someone who mentioned this to Microsoft as an ex-Microsoftie.)

– worked for/with Google pre-IPO who could talk about its thinking over whether it wanted to confront Microsoft.

>>> The launch of the iPod in 2001 intrigued MusicMatch, and soon they were talking to Apple about the possibility of tweaking their software so that the millions of Windows users – a huge, untapped market for the iPod – could use it with their machine. At the time the iPod’s iTunes software only worked on Macs, and required a high-speed Firewire connection – which every Macintosh since 1999 had, but which was comparatively rare on Windows machines. Even so, enough had it (because the Windows PC market was so big and various) that it made sense for MusicMatch to offer it.

In July 2002, Apple introduced its second-generation iPod, with up to 20GB of storage – and introduced “iPod for Windows”, which used MusicMatch’s software to connect to Windows PCs.

BBB knew that the relationship with Apple was on borrowed time: “we could see that if it took off then they would write iTunes for Windows and steamroller us,” he recalls. But the experience was fascinating, and there was always the possibility that MusicMatch might be able to engineer some way to hold on to Apple – or perhaps to get Apple to hold onto it.

He had a number of meetings which Jobs attended: “generally he would walk in, say ‘this is shit’, and walk out,” he recalls. “Or he would say ‘this is far too big. It’s too bulky.’”

At the time the music business was in flux. The original incarnation of the file-sharing network Napster had been downed in the courts, but that had led to a hydra-headed decentralised sharing system called Gnutella, which had no central index as Napster had had. The record labels had nothing to aim at.

Since they were unable to shut down those networks, the record labels’ logical next move was to prevent music being ripped from CDs onto computers; that would prevent new songs being uploaded and shared, and should tamp down piracy. “Sony had had success in Japan with the MiniDisc format, which prevented you from copying songs back and forth,” said BBB. “Together with Sony Music, they seemed to have the formula. And Sony Electronics was huge in those days.” So the labels pressed for similar copy-prevention technology – known in the business as “digital rights management” software – to be included in music players and ripping software, and separately on CDs.

BBB adds his own context to the labels’ drive to get DRM instilled everywhere: “in the record business, everyone feels that they got screwed in their last deal. So in the next one they’re always looking to get the best possible deal. Songs will have different publishing rights in different countries. And the record labels and the publishers don’t see eye to eye. It’s a recipe for disagreement.” And for stalemate.

But Microsoft was listening to the record companies’ calls. It was a company full of skilled programmers who would be able to write software that would implement DRM to prevent copying. It quickly devised a strategy built around its Windows Media Audio format (which “independent” tests suggested gave better listening results and smaller files than MP3 at the same compression ratio). Files ripped on PCs using Windows Media Player, the default system, would be transferred with DRM onto digital music players so that the songs could not be copied onto another PC. That would tie the player to its owner’s computer. And uploading WMA files protected in that way to file-sharing networks would mean they wouldn’t work on the PCs of anyone else who downloaded them.

It was a brilliant strategy, except for two things. First, CD-ripping was still a minority sport, limited to people who understood how to do it and what its purpose was; that meant they were specialists who were wise to Microsoft’s machinations, especially the DRM. (The high profile of Microsoft’s conviction in the antitrust case had eroded user trust that it was really acting in their best interests, rather than the interests of its partners.) They instead downloaded other programs – such as MusicMatch – which could play WMA files but could also rip songs into MP3 format.

The second problem was Microsoft overcooked the software, says BBB: “it was just too hefty for the hardware. It didn’t quite work right. There would be glitches, and the drivers didn’t quite work right, and the transfer was really slow.” That was because they relied on USB 1.1, rather than Firewire, connections. Firewire was about ?20-40 times faster[how much faster Firewire than USB] and USB 2.0, the faster standard that was comparable in speed, wouldn’t arrive until XXX[when USB 2 released?] and would take some time to become widespread in consumer electronics devices – particularly digital music players.
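The draft’s bracketed queries should stay for the author to resolve, but the scale of the gap is easy to sketch from the nominal bus speeds: USB 1.1’s “full speed” mode ran at 12Mbit/s against FireWire’s 400Mbit/s. A back-of-envelope calculation (mine, not the book’s; it uses raw line rates and ignores protocol overhead, so real-world figures differed):

```python
# Rough illustration only: nominal line rates, no protocol overhead.
USB_1_1_MBIT = 12     # USB 1.1 "full speed", megabits per second
FIREWIRE_MBIT = 400   # FireWire / IEEE 1394a, megabits per second

def transfer_minutes(size_gb: float, mbit_per_s: float) -> float:
    """Minutes to move size_gb gigabytes at the given line rate."""
    return size_gb * 8 * 1000 / mbit_per_s / 60

library_gb = 5  # a few hundred albums' worth of MP3s
print(f"USB 1.1:  {transfer_minutes(library_gb, USB_1_1_MBIT):.0f} min")
print(f"FireWire: {transfer_minutes(library_gb, FIREWIRE_MBIT):.0f} min")
print(f"Ratio:    {FIREWIRE_MBIT / USB_1_1_MBIT:.0f}x")
```

A 5GB library takes nearly an hour over USB 1.1 but under two minutes over FireWire – a roughly 33x gap, consistent with the “20-40 times” range in the text above.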

Then there was the industrial design aspect. BBB recalls seeing the prototype for the third-generation iPod during a discussion with Apple executives; Steve Jobs made an appearance – “he would kind of drift in and out”, is how he puts it – to pick the prototype up and criticise it for being too thick and then walk out.

A month or so later BBB was at the headquarters of Dell Computer in Austin, Texas. Dell was eager to get into this burgeoning market: it reasoned that it could use Microsoft’s software and design its own hardware (as it did with PCs), but that unlike Apple it would be able to use its buying heft to drive down costs and so undercut Apple. The market was there for the taking.

BBB was handed a prototype for the Dell DJ player, which like the iPod used a 1.8” hard drive. “Jeez, this thing is HUGE!” he blurted out.

It was indeed noticeably deeper than Apple’s existing iPod, and substantially more than the forthcoming iPod – which MusicMatch knew about but about which its team had been sworn to secrecy, on pain of extremely costly legal action. “One of the Dell designers explained that that was because the Toshiba version of the hard drive had its connector on the side, and the Hitachi one had it on the bottom, but because they were dual-sourcing they could get the price down by 40 cents,” BBB recalls. “That was the difference in a nutshell. Apple was all about the industrial design and getting it to work. Dell was all driven by their procurement guys.”

[NUMBER IPODS SOLD PREV QUARTER]
[AUTOSYNC IN IPOD]

In April 2003 Apple launched its third-generation iPod, supplanting the one that Dell’s engineers had been comparing their design against. This one was notable for two features: four touch buttons just below the screen, instead of being embedded into the scroll wheel – a feature that was abandoned in the next generation as unwieldy – and a proprietary 30-pin dock connector on the bottom of the device. That allowed it to connect to a Firewire or USB 2.0 port, via a cable. (The buyer had to specify which cable they wanted.)
>>>

Something on Twitter reminded me of this. This was written for the January 27 2000 edition of The Independent.

BY CHARLES ARTHUR, Technology Editor

Britain’s Greenwich meridian could become the new reference point for time over the Internet, after two rival groups of British businesses resolved their differences over whose measurement they should use.

Greenwich Electronic Time (GeT) will be a powerful brand which could guarantee that companies based in different countries doing business deals could be certain of when they happened.

With more and more time-sensitive data being exchanged – such as online stockbroking and consumer purchases – it is increasingly important to be able to confirm when transactions take place, said James Roper, chief executive of the Interactive Media in Retail Group.

“Who owns a product at what time if you buy it over the Internet?” said Mr Roper. “If you don’t agree about what time it is, you could find that there is a time during which people think they own it – and if both of them then try to sell it you could have real problems.”

By using GeT as a single reference time, confirmed by a network of super-precise clocks around the Internet, Britain would be “at the forefront of Internet development,” said the Government’s newly appointed “e-envoy” Alex Allan, the former British High Commissioner to Australia.

Comparing timestamps of online transactions has already helped to track down fraudsters, said Ian Collins, managing director of Cybersource, which provides the software that powers many e-commerce Web sites. Extending GeT further would help to do that in future, he said.

Yesterday’s launch saw the unification of two factions that had threatened to split the initiative before it started. The Prime Minister Tony Blair initially launched GeT on January 1 – but it did not then have the essential backing of the London Internet Exchange (Linx), which represents the major Internet service providers in the UK.

Linx, whose offices lie on the Greenwich meridian, had planned to launch its own Greenwich Net Time earlier this month – but was persuaded not to by lobbying from the Government and other industry bodies. Instead the two merged their efforts to produce the single brand.

The Internet already has a network of clocks which are meant to contact each other and confirm their time by connecting to other precision clocks, usually running on “Coordinated Universal Time”, a global standard adopted in 1982.

A key step in promoting the GeT “brand” will be the availability of free software from its Web site at www.get-time.org which will enable businesses and users to ensure that their computers are in tune with GeT, and to timestamp e-mails and Web transactions against them. That software should be available in the next three or four months, said Mr Roper.

//ends

—-

Great idea! (Well, inside the civil service it seemed great. I thought it was a pile of nonsense. After all, you already had UTC, coordinated via atomic clocks over the net.) What could possibly go wrong?
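And UTC over the net really was a solved problem: the Network Time Protocol had been distributing UTC-traceable time between machines since the 1980s. For a sense of how little a “single reference time” needs, here is a minimal SNTP client sketch (mine, not GeT’s; the server name is whatever public NTP host you choose to trust):

```python
import socket
import struct

# NTP counts seconds from 1900; Unix timestamps count from 1970.
NTP_EPOCH_OFFSET = 2208988800

def parse_sntp_reply(data: bytes) -> int:
    """Extract the transmit timestamp (seconds field, bytes 40-43)
    from a 48-byte SNTP reply and convert it to a Unix timestamp."""
    seconds = struct.unpack("!I", data[40:44])[0]
    return seconds - NTP_EPOCH_OFFSET

def sntp_time(server: str, timeout: float = 5.0) -> int:
    """Ask an NTP server (UDP port 123) for the current UTC time."""
    packet = b"\x1b" + 47 * b"\0"  # version 3, mode 3 (client request)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        s.sendto(packet, (server, 123))
        data, _ = s.recvfrom(48)
    return parse_sntp_reply(data)
```

Calling `sntp_time("pool.ntp.org")` (an example host; any public NTP server works) returns UTC as a Unix timestamp – which is roughly all the “free software” GeT promised would have needed to do.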

And then in August:

—–

BY CHARLES ARTHUR, Technology Editor

A high-profile scheme launched by Tony Blair in January to make Greenwich the reference point for “Internet time” has run into a dead end. It cannot work with Microsoft’s Web browser, used by the vast majority of Net surfers.

Now, the team behind the “Greenwich Electronic Time” (GeT) initiative are wondering if they will ever be able to persuade people to use their product.

“Overhyped? Er, that would be true and fair I suppose,” said James Roper, chief executive of the Interactive Media in Retail Group (IMRG), one of the scheme’s backers. “We have encountered a nightmare of problems that were so compounded we hardly knew where to start.”

Announcing the plan to create “Greenwich Electronic Time” (GeT) at the start of the year, Mr Blair suggested it would put Britain back at the centre of timekeeping in the new millennium just as the invention of Greenwich Mean Time (GMT) did during the age of sail.

But the reality has proved rather different. The GeT team had suggested in January that within four months they would offer free software for PCs which would be accurate to 0.003 seconds against an existing world standard set by atomic clocks. Instead, the project only last week produced the first version of its software – and The Independent found that it can display times on the same screen which are out of sync with each other by nine seconds or more.

The problem stems from Microsoft’s Internet Explorer browser, used by more than 80 per cent of Web surfers. Computer code within the program behaves unpredictably, creating the differing time display. But the software giant shows no signs of changing its product to please Mr Blair or the GeT team.

“You would have to ask Microsoft why their version of their own software doesn’t do what their published details say it will,” said Keith Mitchell, executive chairman of the London Internet Exchange (Linx), who is exasperated by the mismatch. “I don’t know why it doesn’t.”

The failure is another embarrassment for the Government’s repeatedly proclaimed desire to make Britain an e-commerce capital. Last week the House of Lords passed the Regulation of Investigatory Powers (RIP) Bill, which has been criticised by business and consumer groups for infringing on civil liberties. A number of Internet companies have said they will relocate outside Britain to avoid the email and communications snooping that the RIP Bill allows.

The flaw in GeT is caused by differences between Microsoft’s version of a computer language called “Java” and the public standard created by Sun Microsystems. Microsoft is being sued by Sun for breaking its licence to use Java in the browser. No resolution is in sight.

The GeT team had hoped that their system – backed by a network of atomic clocks around the Internet – would rapidly become a reference point for all sorts of online transactions. The IMRG, which backs the scheme, suggested last week that it could be used to help people doing online share dealing, gambling and auctions: these, it said, could hinge on messages which would have to be time-stamped to an accuracy of less than a second from a central reference point. The Government’s “e-envoy”, Alex Allan, said it would put Britain “at the forefront of Internet development”.

Instead, despite the non-appearance of GeT, electronic commerce has snowballed this year. Online gambling, sharedealing and auctions are all booming, used by millions of users worldwide. “The world is muddling through,” insisted Mr Mitchell, “but the volume of transactions compared to their potential is still small.”

The same applies to GeT, though: its present network of atomic clocks could handle “tens of thousands” of users, said Linx. That compared with projects like Napster, which has an estimated 20 million people using its software.

The GeT project, meanwhile, was reluctant to publicise the release of the first version of its software in case too many people try to use it: there are fears that the atomic clocks would be unable to cope with a large volume of demands for the time.

—-

Oh God, you have to believe that I was just astonished at how bad that was. And how fundamental the mistakes were.

Still, we don’t have that sort of idiocy any more in the civil service or government. Do we?

So there’s lots of people reading my post about the evils of PR done badly.

But who ever suggests how to do it correctly?

Well, here’s a start.

Emails: have a meaningful subject line. Often it’s the only thing the journalist will read before deleting it. Journalists delete lots of emails. Never, ever leave it blank.

DO include the content of what your client insists should be attachments in the body of the email. More and more journalists are reading their emails on the move, so they can’t necessarily view attachments, and won’t set their phones to download them. Text is cheap. Put it in the body of the email. And then tell the client you don’t need to include the 1MB attachment, because it’s all been dealt with in 50K of text in the body – the only thing left out is the vast logo nobody cares about.

DON’T send PDFs as attachments. Can’t get the text out cleanly, can’t read them easily.

DON’T include pictures unless they’re the very smallest thumbnails, for the reason just given above: mobile data is an expensive pain.

DO include a link where we can get the entire press release and/or the images for it. We might want to link to it so readers can gasp at your brilliance. Plus it means we don’t need to copy or retype stuff. If it’s embargoed, give a username and password to log in so we can look at it. But set that login to expire when the embargo lifts, so everyone can see it in time.

DO, if you’re going to inflict a survey on people (mostly: please don’t) include a link to the original data where the journalist can download it and play about with it. Normal humans might like to do the same.

DO understand that journalists get gazillions of emails every day, plus we’re looking around at blogs, plus we have stuff to do ourselves. We don’t necessarily have time to respond to every one. In fact, we definitely don’t. (See above about deletion.) That follow-up phone call just gets in the way of us writing a story, linking to your press release, or writing our own hard-hitting exposé. That’s why journalists are so arsey on the phone. Well, some of them.

DO read my post about how PR and journalism are orthogonal. You don’t ring up McDonalds asking them to fix your car. A lot of PR is getting too mailing-list driven. Know your journalist before you email them.

But most of all do include links. Put this stuff on the web. It’s 2010, not 1995. News organisations have changed. Why hasn’t PR?

This story was first written for The Independent to appear in its 13 April 2001 edition. $2.50 for every copy of iTunes? One wonders if Apple will ever remove the facility to encode in MP3 from iTunes….

BY CHARLES ARTHUR

Technology Editor

Are you still listening to MP3s? Microsoft wishes you wouldn’t; and so does the record industry – the first because it would rather push its own, proprietary music-digitising format, and the latter because MP3s have, it claims, undermined the business through web sites such as Napster.

Although millions of Internet users have shown themselves to be hooked on the MP3 format, which can turn music tracks into small files that can be swapped and transmitted over the Net, Microsoft said that its next consumer operating system, Windows XP, due out in autumn, will “not include” the ability to produce high-quality MP3s.

That will severely restrict the listening quality of any music turned into an MP3 with that program. Instead, anyone trying to digitise music will be encouraged – not particularly subtly – to use Microsoft’s own “Windows Media Audio” (WMA) format.

Meanwhile RealNetworks of Seattle, which was set up by a former Microsoft employee, is also pushing its proprietary RealPlayer format for digitising music.

The intent: to ease computer users to a position where they cannot send each other copies of music without paying for them. Both the Microsoft WMA and RealPlayer formats have “digital rights management” software, with copyright protection built in that will automatically police the use and sharing of music between computers. Only people who can show they have permission to listen to a WMA or RealPlayer file could listen to it on their computer – unlike MP3s, which can be swapped freely.

The WMA format does have the advantage that songs take up less room on disks. But with new technologies providing exponential increases in storage in all formats, that is unlikely to be a burning issue for consumers.

The intent of the two companies to have their own formats used by consumers belies the obvious popularity of MP3s, which are produced under an open standard: anyone can write a software program that will decode them, although software to create MP3s calls for a licence fee payable to the Fraunhofer Institute, which developed the format. That costs $2.50 for every copy of the software produced. For Microsoft, which hopes to sell millions of copies of XP, that could add up.
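The licence-fee arithmetic the article gestures at is simple enough to sketch. A minimal illustration in Python – the per-copy fee is from the article, but the unit count is an invented assumption, not Microsoft’s actual XP sales figure:

```python
# Rough licence-fee arithmetic for bundling an MP3 encoder.
# FEE_PER_COPY is the $2.50 Fraunhofer fee cited in the article;
# copies_sold is a purely hypothetical illustration of "millions of copies".

FEE_PER_COPY = 2.50          # dollars per copy of encoding software
copies_sold = 50_000_000     # assumed figure, for illustration only

total_fee = FEE_PER_COPY * copies_sold
print(f"${total_fee:,.0f}")  # with these assumed numbers: $125,000,000
```

At that scale the fee dwarfs the cost of simply pushing a royalty-free in-house format instead – which is the business logic the article describes.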

“We think at the end of the day, consumers don’t really care what format they [record] in,” said Dave Fester, a manager at Microsoft’s Digital Media Division. He said that despite the new restrictions, XP will do “a great job of making sure our player will play back MP3s.” But for new content that users might want to create, he says there “are clear advantages” to not using MP3.

Clear for Microsoft, and also for the record industry, which has been driven to distraction by the success of MP3s, particularly in the form of the Napster file-swapping service, which has allowed tens of millions of people to download literally billions of tracks without paying for them.

That is where consumers and the record industry diverge. “The industry doesn’t want [MP3] pushed, and Microsoft and RealNetworks don’t want it pushed. The consumer is going to eat what he’s given,” said David Farber, former chief technologist at the US’s Federal Communications Commission, who generally opposes the company.

He thinks that XP will be a major weapon in that. “When Microsoft decides to put something in their operating-system support, it becomes the standard,” says Mr Farber, who testified against the company during the Microsoft antitrust trial. “The average consumer will use what comes on the disc when he buys the machine. They’re very effective in that way.”

But even those who wish MP3s would disappear allow that that might never happen. “It’s a little like the VHS tape,” said Steve Banfield, general manager at RealNetworks. “DVD is great, but VHS is ubiquitous and it isn’t going away anytime soon.”

Observe the trailer above: it tells you pretty much from the outset that this is a comedy – it’s going to be one of those dash-everywhere-oh-my-god-can-they-do-it, rather like the last 10 minutes of Notting Hill (hope that doesn’t ruin it for you).

If you look at an early version of the poster (here at Coming Soon) then you get the message straight away: Comic Sans font! Hey, it’s a laugh!

If you went by the trailer, or the Comic Sans font and the rib-nudging tagline, you’d think that The Concert is just a bit of comic nothing – an easy way to pass 90 minutes or so.

No. Completely not. It’s a terrific film which packs a huge emotional punch in a closing section which has no dialogue at all but explains all the loose ends in the story. (There’s a question about whether some of what you see in that section is a flash-forward or just an ambition – I think it’s a flash-forward which, for reasons of keeping the ending tidy, had to be put before the climax).

Don’t just believe me – IMDB, the movies database, is a reliable guide to what people think of a film. And people there give it 7.5/10 (I’d give it higher, personally).

It’s one of the rare examples I’ve seen where the trailer gives you no idea of the emotional power of a film; it makes it look like a silly comedy, but it makes many more points – some of them in comic fashion, sure, but the heart is serious.

It’s unusual, isn’t it, for a trailer to undersell a film? Before seeing The Concert, I saw the trailer for Knight and Day, the Tom Cruise/Cameron Diaz effort – trailer here (no embedding allowed, it seems). Every time I’ve seen that trailer, I’ve thought “I’d like to see that film. Looks fun. Cruise not taking himself too seriously.” But people on IMDB give it 6.6. Apparently not so much fun, going by those who’ve seen it.

So do you know other trailers that have undersold the film? Do tell. Obviously, reference to IMDB to prove that it’s a great film may be needed…

It is irritating to see the annoying company Ryanair – whose motto I imagine to be “if they’re stupid enough to fly with us, they’re on a mental level with sheep and should be treated as such” – given occasional credibility over ludicrous ideas without anyone asking the straightforward question.

Such as: would implementing that idea actually cost Ryanarse money, or profits?

When Michael O’Leary makes a stupid pronouncement, the media seems happy to repeat it. None seems happy to examine it and throw it back at O’Leary to ask whether he has lost his mind and is trying to annoy his shareholders as well.

For instance: charging people to use the toilet. (That’s a Google search link: the top link at the moment is to an April 2010 story saying that Ryanair is going ahead with it… and the third link is from February 2009, with “pilots aghast at proposal to bring in £1 charge”, which shows you how long this story has been bing-bonging around the mediasphere.)

Let’s examine this the way it should be examined: from a business standpoint. If Ryanarse starts charging for access to the toilet, I think it will lose money. Here’s why.

If Ryanarse starts charging for the toilet, fewer people will use it. Obviously. It may also face more cleanups when parents of young children run out of money. It’ll also have to get staff to watch over the toilet to make sure people don’t hold doors open for each other – which will be unpopular with the aircrew, since nobody likes to be toilet cop.

So it will get a bit of money from people paying to use the toilet, though there will be fewer visits – meaning that the fixed cost, cleaning the toilet reservoir, will only be slightly offset by the takings. And aircrew will have two new grievances: cleanup and toilet cop rota.

But while Ryanarse makes some money from selling toilet access, it will lose money from sales of coffee, tea and other liquids. This is stupid, because it already has the highest prices for coffee and tea and food according to a 2008 survey by Which? Holiday:

The Irish airline charges £2.50 for a bottle of water and £2.50 for a cup of coffee while a small bottle of red wine costs £5.00.

Why will it lose there? Because people will think “Hmm, if I drink this coffee I’ll have to pay for letting it out too.” So the passengers won’t buy the coffee or use the toilet. Ryanarse is suddenly losing money: the profit it used to make on coffee/tea sales. And that is pure profit: apart from heating the water, pretty much everything that it buys for coffee/tea – instant coffee, teabags – can be reused on another flight if it isn’t used. Whereas the toilet reservoirs have to be emptied every time; it is actually more efficient to encourage their use – that way, you get your money’s worth for the cleaning services.

Michael O’Leary – who I think is despicable; if you want to think of the future driven by his credo, imagine Adam Smith’s invisible hand slapping the human face forever – ought to be able to see that charging for access to the toilet is a stupid move, economically. It would actually make better business sense to announce that the “toilet charge” will be rescinded – and raise the price on coffee and tea. In fact, expect it.

And if O’Leary is too stupid to see it, then perhaps his shareholders could show him this blogpost.

And finally, to the business press: next time O’Leary puts forward a stupid idea like this, ask whether it can make business sense. Think about fixed costs and operating costs. And quiz him. When he can see he’s going to lose, he caves in. I think if this is implemented, it will be a money-loser. But you’d need to ask the hard questions – how many drinks are sold per flight before, how many after, what’s the take – to know, when Ryanarse announces it’s not implementing (or is withdrawing) these charges, precisely why it’s doing so.
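Those hard questions boil down to one line of arithmetic. A back-of-envelope sketch in Python – every figure here (passenger load, visit rates, drink margins) is an invented assumption for illustration, not Ryanair data:

```python
# Back-of-envelope model of the toilet-charge idea. All default values
# are invented assumptions, not real Ryanair figures.

def net_change_per_flight(passengers=180,
                          toilet_fee=1.00,    # the mooted £1 charge
                          paid_visits=0.3,    # toilet visits per passenger once charged (assumed)
                          drinks_before=0.5,  # drinks bought per passenger today (assumed)
                          drinks_after=0.3,   # drinks bought once "letting it out" costs £1 (assumed)
                          drink_margin=2.00): # near-pure profit per drink, in pounds (assumed)
    """Change in per-flight profit if the toilet charge is introduced."""
    toilet_revenue = passengers * paid_visits * toilet_fee
    lost_drink_profit = passengers * (drinks_before - drinks_after) * drink_margin
    return toilet_revenue - lost_drink_profit

print(f"£{net_change_per_flight():+.2f} per flight")  # negative with these assumptions
```

With these made-up numbers the lost drink profit swamps the toilet takings; plug in real before-and-after sales figures and the sign of the answer is the whole story.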

My suggestion: it won’t be because of an outbreak of warmth in O’Leary’s heart, which I imagine as a coal-black thing that would make Lord Voldemort shudder.

This piece first appeared in The Independent around September 2000. Given all the talk about some handheld(ish) computer released by some company or other, I thought it might be interesting to look back on…

A couple of notable phrases: “Microsoft’s failure in this market is unusual…” and at the end that “In the long term though functionality is sure to win out over form”. Debate among yourselves whether this was just history talking…

Handheld computers

BY CHARLES ARTHUR

Technology Editor

Handheld computers cannot do what most people want them to. This may seem surprising, given that millions of models using operating systems from Palm, Psion and Microsoft have been sold since 1984, when the British company Psion introduced its first handheld model.

But all are severely limited compared to the expectations placed upon them, which can be traced back to two sources: the 1960s TV series Star Trek, and the hit BBC radio series first broadcast in the 1970s, The Hitchhiker’s Guide To The Galaxy written by Douglas Adams. Only at the turn of the [21st] century does it look like people will soon be able to buy products with the facilities that people have been hankering after for decades.

The sight in the 1960s of William Shatner as Captain Kirk landing on alien planets and flipping out a palm-sized machine which could act as a radio, intelligent locator and general categoriser of knowledge had a subtle effect on the baby boomers’ belief about what computers of the future could and should do. It was voice-activated, and context- and location-sensitive. Similarly, in the radio series, the Hitchhiker’s Guide to the Galaxy was actually the name of a computerised guidebook. It contained as much information as the galactic hitchhiker could need. While its indexing method was hopeless by any standards – the traveller had to look up a number in the index and enter that in order to get the corresponding entry (“so bad it could have been designed by Microsoft,” Adams later quipped) – it did create the belief that someday one could build a handheld machine able to hold all the knowledge not just in the world, but in the galaxy. And if aliens had them, why shouldn’t we?

The reality of the first retail products was rather different. The Psion 1, the brainchild of David Potter, was launched in 1984. It had a mighty 10K of non-volatile memory, an alphabetic keypad and a one-line 16-character LCD screen. Entering data was tedious. It would not have passed muster with Captain Kirk. However its descendants are now widely used by people in jobs requiring simple data collection, notably including traffic wardens.

In August 1993 Apple Computer launched its $700 Newton, which seemed at the time to promise at least some Star Trek functionality. It had handwriting recognition software able to “learn” your specific cursive style; there were promises of wireless communications and word processing.

It turned out to be an example of the computer industry’s occasional hubris. The software did learn your writing style, but often failed to interpret the letters correctly. The Newton was a flop (officially abandoned in 1998, but dead some years earlier) which poisoned the well for entrepreneurs in the US handheld market for some years. Bill Gates of Microsoft reckoned it put the market for such products back by two years. (Probably an underestimate.) Palm Computing, founded by Jeff Hawkins and Donna Dubinsky in 1992, only managed to survive by selling itself to the modem maker US Robotics early in 1996.

However, in the UK and Europe Psion was thriving, and had developed its Psion Organiser 3 series, which had a miniature keyboard and inbuilt software including limited word processing, a calendar, contacts book and spreadsheet. It looked like a miniaturised version of a laptop computer, and proved very successful in its local market.

But on the west coast of the US, Hawkins and Dubinsky were developing a palm-sized machine which would have some, at least, of the ease of use both of Captain Kirk’s communicator and The Hitchhiker’s Guide. Hawkins envisaged a machine – which later became the Palm series – that would not stand alone, but would synchronise and back up its files with a standard PC. Thus it would not have to do everything; only have enough functionality to be useful while out of touch with the PC.

For data entry, he developed a shorthand cursive system called “Graffiti” which all Palm users have to learn. He tested the ergonomics of the product by carving a block of wood into a size and shape that he could carry comfortably around in his pocket. Function and form thus developed in parallel.

The Palm operating system was hugely popular, even though the basic machine only offered a calendar, address book, task (“To-do”) and notes list, plus a calculator and search system. Its success stemmed from its ability to coordinate with a PC; the openness of the operating system; and the coincidental rise of the Internet. The first point meant users could access their databases more easily than with tiny keyboards; the second that software developers could write programs to enhance the machine; and the third, that those programs could be widely and quickly distributed. Psion, with its EPOC operating system, had attracted some software developers but was held back by its European location (where Internet development lagged by a couple of years compared to the US) and lack of connectivity to PCs.

Launched early in 1996, the first Palm computer sold 1 million units in 18 months. In 1998 Hawkins and Dubinsky left with Ed Colligan, marketing head of Palm: they were dismayed by the slow working of the monolithic 3Com, which had bought US Robotics. They set up their own company Handspring, and licensed the Palm OS, which then had 80 per cent of the world market, served by 100,000 developers, while Psion and Microsoft scrabbled over the remainder.

Microsoft’s failure in this market is unusual, but seems to stem from its WindowsCE operating system (renamed and rebuilt as PocketPC in spring 2000) being too complex for the limited power of the machines. WindowsCE is used in petrol pumps and set-top boxes for decoding digital TV signals.

The future promises rapid change. Until 2000, handheld computers sat apart from mobile phones: an address list on one could not be transferred to another. As usability expert Jakob Nielsen noted, this is absurdly inconvenient. Mobile phones are no good for noting data (such as phone numbers) while you are in a call; but handhelds have been little use for making phone calls.

But Handspring especially has been forcing the pace, as its Visor machines, which use the Palm OS, include a slot called the “Springboard” where the user can plug in items such as a camera, memory module and – from autumn 2000 – a GSM modem.

That abruptly made the Handspring into the potential killer combination of handheld address list and mobile phone. Palm rapidly announced that by the end of 2000, all of its products would have wireless capability. Separately, IBM demonstrated a version of a Palm machine with an add-on board which gave it voice recognition capability, using the ViaVoice technology. Suddenly, the humble handheld was beginning to look like the machine which would be able to do everything.

But mobile phone makers and Psion are not finished. The so-called “third generation” of mobile phones, which will have high-speed data connections, were being designed in 2000, and the Symbian consortium (which uses the EPOC OS) won a contract to provide the OS for a number of phone companies.

What was still unclear at the end of 2000 was whether handheld computers would swallow mobile phones, or vice-versa. The handhelds had the functionality; the mobile phones had the usability. However the mobiles rapidly lost that edge as new WAP (Wireless Application Protocol) phones attempted to squeeze Internet interactivity into a few lines of a monochrome LCD screen. In some respects, it was a step back to 1984. But the market’s explosive growth may mean that there is room for everyone to survive. In the long term though functionality is sure to win out over form.