Monday, 29 December 2014

Cry me a river, said the Ferryman. Rise and shine for the colour of mercury; golden fingers taste the destination. If modern society is stupid and trivial then perhaps the most stupid and trivial things are the most representative. And perhaps they aren't so trivial after all, and the things we value for their intelligence and complexity are actually well-presented shams. Or perhaps the state of human thought has moved on, leaving a trail of obsolete genius in its wake; and perhaps a thing of genuine complexity and intelligence is simply out of step with prevailing fashion.

Few people argue that fast food restaurants represent any kind of human triumph, but fast food is the pragmatic application of a set of complex industrial processes to one of mankind's most fundamental problems. Fast food restaurants are the final step in a chain of processes - one of the final steps, because sewage works exist beyond them, and ultimately human endeavour is a process whereby soil is transformed into more soil. The soil nourishes protein which is bred and slaughtered and chopped into bits and delivered and cooked and put in a bun and given to the customer at record speed; the customer excretes this and the faeces are transported and sterilised and returned to the Earth and thus the human animal is a soil-processor, soil goes in and soil comes out.

There are trains and trucks and aeroplanes to enable all of this. On top of that, fast food restaurants use a sophisticated arsenal of psychological weapons to grab attention. The fried chicken joint next to the bus station sells much the same product as the KFC a few doors down from the Halifax, but KFC has meaning, because it understands that human beings crave meaning and it has the resources to act on this, whereas the fried chicken joint is a haunt for chavvy drug dealers and working people. And so KFC can charge a higher price, and no-one need feel shame for eating at KFC.

Three G4s from the actual, distant past of 2001-2002; the top at Dorkbot London 1.0, the bottom two elsewhere.

Beyond fast food there is the fashion industry, which uses a globalised manufacturing network to produce inessential goods that sell at an enormous mark-up, to a clientèle that has invested more emotional value in a handbag than in any human being. This is us. McDonald's and Gucci are part of a system as complex as Apollo or the Internet. An alien viewing us from space would see a network of lights, a global network of lights, of container ships transporting frozen meat and high-heeled shoes, he would see lights burning in the desert so that the lights would not go out. He would see highways of death running to the shining city on the hill, lines of light burning through fissures of the mind.

A lot has changed since 2005. Who now remembers Apple?

The Titanium G4 PowerBook was launched in January 2001, just as the dot-com boom was turning sour. My recollection is that Apple came through the dot-com bust relatively well, perhaps because the company already had masses of problems. The G3 iMac is indelibly associated with the period, but it continued to sell even as Webvan and eToys were going under, and in fact the iMac brand survives to the present day.

The G4 Cube flopped, but in general Apple's product launches of the early 2000s sold steadily and came to a natural end. The Titanium G4 itself was discontinued towards the end of 2003 in order to make way for the new, aluminium-bodied G4. Apple fans grumbled that the new G4 wasn't much faster than the old G4, but looking through the reviews and dredging my own actual personal memory of the actual period of actual history, the aluminium G4s were even more desirable than their titanium ancestors.

In general the "TiBook", as it is often called, tends to be overlooked nowadays. In terms of design and materials modern MacBooks owe a lot more to the aluminium G4s, and there is the simple matter of age; the most modern TiBook is eleven years old. The average internet user is too young to remember them. Nonetheless the machine has left traces on the internet. The chap from ExplodingDog owned one and wrote a sweet essay about it; his machine ended its life as a battered wreck, but don't we all. Some people have an emotional response to Apple gear; owning Apple equipment is like taking a holiday from the grey rain and misery of everyday life. Apple laptops are like heroin, socially acceptable heroin.

The first model launched at $3499, but later models were $2499-$2999 depending on specification.

The TiBook was only ever available with a 15" 3:2 widescreen display, and for the love of God I'm not writing TiBook again; it's the Titanium G4, or just the G4, from now on, depending on context.

Fifteen inch screen. Mid-way through the run Apple increased the resolution slightly, although the screen remained the same size. The first run of aluminium G4s was launched in early 2003, but it wasn't until September of that year that a 15" aluminium G4 came out, and so the titanium model soldiered on for a few more months. The Titanium G4's wide screen was a novelty at the time, and contemporary reviews tend to make a point of mentioning it, so I'm reasonably confident that the G4 was the first ever laptop with a widescreen display.

Widescreen displays in general were unusual in 2001. Not just for laptops, they were unusual for desktop computers as well. For that matter LCD monitors were still only a couple of years old in 2001. People still had CRT screens. I had a CRT. God it was awkward. My hunch is that a lot of people who bought titanium G4s ended up with a laptop that was more advanced, perhaps even faster than their desktop computer; the G4 was part of the first generation of laptops that could be your one and only machine, and in that respect the price was less of an issue than it seems.

Nowadays depreciation has wiped almost 99% of their value away. Literally so; ratty models fetch $20-30, and only the absolute best used examples of the fastest titanium G4s with the original packaging fetch over $100.
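For the record, the arithmetic behind "almost 99%" - a quick Python sketch using the launch price and the used prices quoted in this piece (illustrative figures, not market data):

```python
# Depreciation of a Titanium PowerBook G4, using the prices in the text:
# $3499 at launch in January 2001, ~$30 for a ratty example today,
# ~$100 for the very best boxed 1GHz machines.
launch_price = 3499.0

for label, price in [("ratty", 30.0), ("best boxed", 100.0)]:
    lost = 1 - price / launch_price
    print(f"{label}: {lost:.1%} of the original value gone")
    # ratty: 99.1%, best boxed: 97.1%
```

So "almost 99%" holds for the ratty examples; even a pristine boxed machine has shed about 97% of its sticker price.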

The titanium G4 was launched during a period of transition for Apple. The company's design language was shifting away from the brash, translucent plastics of the original iBook towards a minimalist, metallic aesthetic. Apple was also introducing its new operating system, OS X, and the company was taking tentative steps on the path to being a consumer electronics giant rather than purely a computer manufacturer. The original iPod was launched in October of the same year, and the first physical Apple Store had opened a few months earlier. There had been a Cult of Apple in the 1980s and 1990s, but it was nothing compared to the fully-fledged religion that was about to emerge.

The G4 emerged in parallel with OS X. OS X was a big thing for Apple. It was in theory the reason for Steve Jobs' return. His company, NeXT, was bought up so that Apple could have a new operating system. In much the same way that Ash persuaded Ripley to bring the infected Kane back into the Nostromo. Within a few years the metaphorical alien foetus that metaphorically burst from Jobs' chest had devoured all of Apple and covered the inside of the company's HQ with a kind of biomechanical slime, and in that respect it's ironic that Jobs was eventually killed from within, by cancer. On a fractal level the cancer asserted its dominance over Jobs, Jobs asserted his dominance over Apple, Apple asserted its dominance over the mobile phone and tablet market - and also the world of money - and where does it end, hmm? From the microscopic world of energy particles to the macroscopic domain of colliding galaxies, there is a continuum, and we are part of that continuum, like lengths of conductive metal welded into an electricity conduit. The energy flows through us, we must not break the chain.

By all accounts OS X was a solid piece of work that got better as it went along. From 2002 until 2011 Apple sold rackmounted OS X server boxes - the Xserve range - but, facing competition from Linux, the company eventually threw in the towel. Most people who use a computer have used Windows at some point; the same is not true of OS X, and unless you own or use a Macintosh there's a good chance that you've never seen it outside an Apple Store. It tends to be overlooked when magazines run comparative tests of operating systems, presumably because PC owners can't install it, and Macintosh owners have no need for anything else. iPad and iPhone owners use iOS, which is so transparent that I imagine a large percentage of iOS users aren't aware that their machines even have an operating system. And why not? It's good that iOS is transparent; it gets out of the way.

The original 400MHz and 500MHz titanium G4s came with Mac OS 9.1. OS X was launched to consumers a couple of months later, and the subsequent 550/667MHz models dual-booted OS 9.2.1 and OS X 10.1, with OS 9 as the default. The 667MHz-1GHz models had OS X 10.1.3, with an option to boot into OS 9. The Titanium PowerBooks were the last Apple laptops that could boot into OS 9 - the aluminium G4s dropped OS 9 entirely.

The titanium G4 introduced Apple's modern laptop look, although it feels semi-formed. It's not quite as minimalist as the later, aluminium G4, or the contemporary, white plastic G3 and G4 iBooks. Apple's design language in the 1980s and 1990s was conservative and stylish; in fact the company was unusual in that it actually had a design language - a language that it applied consistently across its entire product range, over a period of several years. Apple called it "Snow White". Snow White Macintoshes had subtly curved corners and stylish mock cooling vents, and generally managed to be cute and formal at the same time. In the 1990s Snow White was gradually smoothed out, and Apple's designs of the 1990s - the Quadras, Performas, early PowerPC Macintoshes and PowerBooks - were generally very bland and anonymous.

It takes 321 seconds for the 667MHz G4 to calculate Pi to just over 2m digits with SuperPi. A chap in this thread got a time of 177 seconds with his 1.25GHz G4, which suggests that the PowerPC's performance progressed in a linear fashion; the 1.25GHz model is clocked at 1.8x the speed of the 667MHz model and calculates Pi 1.8x faster. Comparing this across platforms is futile, but for the sake of history my 500MHz Pentium III-powered ThinkPad 600X calculates exactly 2m digits in 532 seconds, a single core of a 2GHz Core 2 Duo takes 62 seconds, and a single core of a 3.3GHz i5-2500K takes 23 seconds. A casual reading of these figures suggests that the 667MHz G4 was on a par with an 800MHz Pentium III for this one very narrow task, but there are too many imponderables to draw a firm conclusion.
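The linear-scaling claim is just a ratio check; a quick sketch with the two G4 timings quoted above (the digit counts differ slightly between runs, so treat it as rough):

```python
# Clock-speed ratio vs. SuperPi time ratio for the two G4s quoted above.
# If performance scales linearly with clock speed, the two ratios should
# come out roughly equal.
g4_667 = {"mhz": 667, "seconds": 321}    # my 667MHz titanium G4
g4_1250 = {"mhz": 1250, "seconds": 177}  # the 1.25GHz G4 from the thread

clock_ratio = g4_1250["mhz"] / g4_667["mhz"]   # ~1.87x
speedup = g4_667["seconds"] / g4_1250["seconds"]  # ~1.81x
print(f"clock ratio: {clock_ratio:.2f}x, speed-up: {speedup:.2f}x")
```

Both ratios land near 1.8x, which is what linear scaling predicts; the small gap between 1.87x and 1.81x is within the noise of a benchmark like this.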

Apple turned things around abruptly, in 1998, with the brightly-coloured, translucent plastic G3 iMac. The original iMac was brash, extroverted, garish, post-modern, silly, but very influential; it was launched in a brash, extroverted age. It fit right in with the mood of the times and it was surprisingly good value, and sales were brisk. By selling it as an internet terminal that could also run Microsoft Office, Apple sidestepped the Macintosh platform's biggest problem - its lack of software support - and I remember seeing the iMac on sale in mainstream supermarkets, which was unprecedented for a Macintosh, especially in the UK. Apple didn't open a store in the UK until 2004, and in doing so they gave hope to a nation that had only known hope once before.

I have the impression that Apple fans don't like to be reminded of the translucent era. It puts me in mind of Pierce Brosnan's stint as James Bond, or Batman Forever, fashionable at the time with an air of newness and modernity but not something people want to revisit. The iMac still exists, although over time Apple moved away from the idea that it should be outrageous and playful. The G3 iMac was replaced by the extraordinary G4 iMac, which resembled an anglepoise lamp, but the G5 iMac was essentially a large monitor with a computer built into it, and Apple has continued to use this concept to the present day.

Top defunct universe simulator Celestia runs on the machine, but only low-resolution textures work; the 16MB Mobility Radeon presumably isn't up to the task.

Have you ever thought about the word "carpetbagger"? It's one of those words that pops up every now and again, and you read it and don't think about it - some kind of ne'er-do-well perhaps - but it haunts you. Is it a man who puts carpets into bags? Well, the great thing about the internet is that I can instantly look up the answer. A carpetbagger is a man who has a carpet bag, essentially a rucksack made out of a carpet. These bags were capacious and hard-wearing, and people from the north of the United States packed their belongings into them and travelled to the south in the wake of the civil war, hoping to profit from the ravages of conflict. Twenty years ago I would have had to look that up in a dictionary, assuming I had access to a dictionary, but it wouldn't have had pictures; now I have instant knowledge at my fingertips. Thank you, Apple, for making the modern world.

This part gets very hot.

Marketing types often go on about benefits, the idea being that it's a mistake to simply list the features of a product, you have to sell its benefits. The internet allows me to look up the meaning of odd words, and find new words, such as trigonal and gyromancy and geocyclic; that is its benefit, and if I had to market the internet to someone who had never used the internet I would emphasise... well, I would probably emphasise an endless supply of free porn, so many naked women that you would get sick of looking at naked women, plus social media which allows you to maintain the illusion that you have actual friends who really care about you. Steve Jobs understood all of this. His keynote speeches rattle on about how the Macintosh and iTunes and so forth will allow people to do things faster; he only delves into the nitty-gritty of the Altivec engine and seamless file synchronisation in order to illustrate the benefits of these technologies. The internet has a video of the 1997 Macworld expo, during which Apple's then-CEO Gil Amelio announces the return of Steve Jobs to the company Jobs had co-founded; Jobs immediately outlines Apple's mission, to "provide relevant, compelling solutions that customers can only get from Apple", and he lifts his voice at the end and there is applause. And then, ominously, he moans that Apple does not have many cars in the parking lot at 3pm on a Sunday. Apple was entertaining to watch from afar, I imagine it wasn't much fun to work for.

On the level of industrial design, Steve Jobs' return to Apple was a complicated thing. Nowadays it's assumed that he immediately set to work rejigging Apple's physical products, but he was initially hired for the operating system, the development framework, application integration etc. It seems that he saw in Apple chief designer Jonathan Ive a kindred soul, and immediately promoted him to godhead of Apple's physical realm. Nonetheless I have the impression that Apple's industrial design process was something of a mess at the time. The translucent G3 iMac had presumably been on Ive's mind before the return of Jobs - it was a cross between Ive's translucent Apple eMate and the Power Macintosh G3 All-In-One - but the contemporary, second-generation G3 PowerBook looked like something from the pre-Jobs era. The "Wallstreet" model was curvaceous and black and came across as an evolution of Apple's earlier designs, rather than a clean break; I'm not sure who designed it, but my hunch is that the work had been done before Ive and Jobs turned their attention to the laptop side of things.

The G4 was designed by Jory Bell, Nick Merz, and Danny Delulis, melancholic figures who have been written out of history. Bell and Merz left Apple shortly before the G4's launch to set up their own laptop company, OQO, which specialised in teeny-tiny Windows palmtops running on Transmeta Crusoe chips. The OQO Model 01 (2005), a tablet with a slide-out keyboard, is the spitting image of the titanium G4. Presumably Bell failed to persuade Ive and Jobs to release it as an Apple product.

Merz's name has popped up a few times since the G4 - it seems that he filed some patents that were used in the later iPhone - but on the whole Bell, Merz, and Delulis are yesterday's men, forgotten men. In future it will be remembered that Ive oversaw the design of the titanium G4, and then it will be remembered that Ive designed the titanium G4.

Running VLC. Some men look good in suits, some men do not (it seems that Turkish and Scottish men do).

Early-2000s Apple was translucent, but the professional laptop range skipped that period. The pre-Jobs G3 PowerBook remained on sale through the translucent era, and by the time of the titanium G4 Apple had moved on. Apple's consumer-level laptop, the iBook, began life with a translucent case, but this felt like a mistake and was quickly abandoned in favour of the minimalist white G3 iBooks, which have aged well. When I think of the archetypal mid-2000s "blogging in Starbucks" laptop I actually think of a G4 iBook rather than one of the PowerBook machines.

To my eye the titanium G4 is almost a great design. The plastic border surrounding the body ruins it; the hinges at the back of the machine look ugly, like parts of a hot water boiler; and the thick body and thin screen look odd from the side, because the lid doesn't quite mirror the base. On the other hand the screen itself is excellent. The bezel is still narrow even by modern standards, and from the right angle the colours and backlighting are consistent across the screen. The G4's titanium panels are apparently not very robust, and the body tends to accumulate dents and scrapes. The hinges were notoriously weak. Dents on the lid can press against the back of the LCD panel, short-circuiting it; used titanium G4s often end up with spurious vertical lines on the screen because of this.

Technology-wise the titanium G4 went through three basic iterations, each one better than the last. The original 400-500MHz models had an 1152x768 screen, VGA out, a 100MHz system bus, a rubbish graphics card, and headphone audio out but no line-in beyond the built-in microphone. An AirPort Wi-Fi card was available as an option. The transitional 550-667MHz models introduced a gigabit ethernet port, with the 667MHz model upping the bus to 133MHz. This is the model I have. Did Apple think that laptop owners cared about high-speed ethernet? I have no idea. The transitional models came with AirPort Wi-Fi as standard, although the original AirPort uses an older Wi-Fi protocol that doesn't support modern encryption, so in practice I use a USB Wi-Fi dongle instead.

The final models - 667MHz, 800MHz, 867MHz, and 1GHz - increased the resolution to 1280x854, and added a DVI port that could drive a 2k, 2048x1536 monitor, although nobody said "2k" back in 2002. All models came with analogue 56K modems and s-video output, which is antiquated nowadays. There were detail differences - the later models added audio in and stronger hinges - but otherwise the case remained the same. They can all be upgraded to a maximum of 1GB of memory and will officially run OS X 10.4.11, which hasn't been supported for several years, and unofficially OS X 10.5, which also hasn't been supported etc. OS X 10.4 is the last version of OS X that officially ran on the slower PowerPC Macintoshes, and it still has a wide range of applications. Browsers are awkward, however. Chrome post-dated 10.4 and was never released for it; Firefox ended support for 10.4 many moons ago. There is a fan-made port of modern Firefox to 10.4 - TenFourFox - but it's very slow on my machine. I tend to use Safari, which also isn't supported but is very fast.
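To keep the three runs straight, here is the same information as a little Python table - a sketch assembled from the specs described above (the run names are my own labels, not Apple's, and details vary within each run):

```python
# The three basic iterations of the titanium PowerBook G4 as described
# above, simplified. Clock speeds in MHz; the 133MHz bus only arrived
# with the 667MHz transitional model, so it isn't listed per-run here.
TIBOOK_ITERATIONS = [
    {"name": "original",     "clocks": (400, 500),
     "screen": (1152, 768), "video_out": "VGA"},
    {"name": "transitional", "clocks": (550, 667),
     "screen": (1152, 768), "video_out": "VGA"},   # adds gigabit ethernet, AirPort as standard
    {"name": "final",        "clocks": (667, 800, 867, 1000),
     "screen": (1280, 854), "video_out": "DVI"},   # adds audio in, stronger hinges
]

# Every screen in the range is (near enough) the same 3:2 shape:
for model in TIBOOK_ITERATIONS:
    w, h = model["screen"]
    assert abs(w / h - 1.5) < 0.01, model["name"]
    print(f"{model['name']}: {w}x{h} ({w / h:.3f}:1)")
```

Note that 1152x768 is exactly 3:2, while 1280x854 misses it by a single pixel row (1280 ÷ 1.5 is 853.3), which is why the assertion uses a tolerance.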

On the whole the range only improved as it went on, and if you want a Titanium G4 nowadays for your collection there's no reason not to buy the best 1GHz model you can afford. Getting hold of OS X 10.4 is difficult nowadays, so try to find a G4 that comes with the original system disks. I replaced the hard drive in mine, cloning the original slow 4800rpm drive in the process, and I keep the original hard drive as a backup.

Apple
I'm old enough to remember when Apple was stereotyped as a frivolous brand for the terminally hip; a company that made things for people with more money than sense; a kind of weird cult for shallow people, who worshipped it with an almost religious fervour.

Things were very different in the 1980s and 1990s. Apple was fashionable, but it was a different kind of fashion; nowadays Apple gear is fashionable in the same way that fashion is fashionable - it's sexy, young, exciting and cute. The gear is cheaper and people have more money, and young people can aspire to owning it. In the 1980s and 1990s however Macintoshes were very expensive and had a Scandinavian air of high seriousness. They were Ikea-fashionable, worthy-fashionable, Raspberry Pi-fashionable. And stereotypes have changed. In Apple's early days computer nerds were poxy schoolboys, and designers were thirtysomething men who wore Miami Vice suits. Apple in the 1980s was the tidy beard of Timothy Busfield, star of television's thirtysomething. The hipster Nathan Barley figure that people associated with Apple in the 2000s didn't exist in the 1980s.

And computers have changed. Nowadays everybody has a computer of some form or another. Your parents have computers. The internet made computers relevant for the great mass of people. Men like the internet because of porn. Women can look at pictures of cute cats. Transgender people can do both of those things. In the pre-internet age computers were a minority taste; most people didn't need one. In the 1980s Apple's machines appealed to wealthy middle-aged types who had chosen the easiest degrees. English professors, marketing people, designers and the like. While you and I spent hours tinkering with CONFIG.SYS and AUTOEXEC.BAT in order to free up enough memory to load a laser printer driver, these shallow lamers went on holiday, had jobs, and wasted their evenings having meals in restaurants with ladies. You and I owned a computer so we could tinker with computers, whereas Macintosh fans owned a computer so they could do stuff. Non-computery stuff. Reports and things. I don't know what people did with Macs before the internet.

In the 1990s Macintoshes were sold in generic computer shops rather than dedicated Apple stores, and Apple had a programme whereby Mac fans could hang around these shops, showing people how to use Macintoshes, in exchange for a t-shirt. Think of the kind of men who evangelise Linux today; now imagine them making a beeline towards you when you go to the shop, so that they can talk to you about Macintoshes. You would run a mile, and I wouldn't be surprised if they did more harm than good. While researching this paragraph I stumbled on this forum thread from October 2001, in which a Macintosh fan stalks a woman. One of his friends chimes in with "I just told my future girlfriend how to use smilies an hour ago". My future girlfriend. "I just showed a woman I've barely spoken to how to use smilies, I hope she rewards me with sex, and if she doesn't then women are awful because they friendzone me all the time I'm not surprised they get raped etc".

But what about the machines? The problem with evaluating Apple's hardware is that it's hard to separate the computers from the Macintosh ecosystem. Compared to the contemporary Atari ST and particularly the Commodore Amiga, Apple's 1980s hardware was nothing special. But the Macintosh ecosystem encompassed the operating system, a consistent cross-application interface, plus the LaserWriter, the seamless networking, Apple's particular weltanschauung, PageMaker. All of this could have been implemented by Commodore or Atari, and in fact Atari had the Calamus DTP package and an own-brand laser printer, but Apple was more committed. If Apple had, for some reason, licensed Jay Miner's Amiga as the basis for the Macintosh, the Mac would have been a better machine; but in our world this did not happen, and in general the Macintosh system-as-a-system was ahead of the curve, until the PC ecosystem caught up. It took a long time for the PC to catch up, but as with the open source development model, once it finally did catch up it had enormous inertia on its side and swept most of the competition away.

Apple originally toyed with the idea of making the Macintosh a mass-market product, but decided in the 1980s to aim for the mid-to-high-end market, and so although the company made hefty profits the machines simply weren't very widespread. The general public could go through life without ever using one or even seeing one. This gave Apple machines something of a mystique, although as time went on it became a hindrance. Developers turned away from the platform, because the effort involved in porting software to the Macintosh was only rewarded with access to a small market of Macintoshes. Apple's strategy even started to backfire. At a time when computer hardware was inexorably diminishing in price, Macintoshes came to seem extravagantly overpriced. The company's attempt to launch cheaper models resulted in the LC and Classic ranges, which were technically retrograde but sold well, especially to the education market. This had the unfortunate effect of reinforcing the perception that Apple hardware wasn't worth the price. For the average student in the computer lab, Apple in the 1990s was not the fast Quadra 840AV or one of the new PowerPC models; it was the ratty old LC II sitting in the corner, with its tiny 12" monitor, which together cost more than a fast 386 with a bigger monitor, more memory, a larger hard drive, more software, and Doom at a pinch.

Apple's original operating system was undeniably more sophisticated than DOS shells and pre-95 Windows, and although the Amiga was more advanced, Commodore had none of Apple's commitment and passion. Apple pushed the Macintosh as if it was the company's last chance, whereas Commodore launched the Amiga in the hope it might stick; ditto Atari and the ST, although Atari tried harder to target businesses. For a few years the ST sold well as a kind of cheap mini-Macintosh, especially in Germany, and both the ST and Amiga had their own niches. I mention the Amiga and ST because those two platforms had the same basic architecture as the Macintosh, and in another world they might have won. Atari, Apple, and Commodore had similar histories, at first selling a popular 8-bit home computer before transitioning to 16-bit designs. But Apple was always slightly ahead of the pack. This by itself was not a guarantee of success. Xerox had been even further ahead of the pack; the computer world is littered with pioneers whose dreams came to nought. What separated Apple was an iron will, and an understanding that being early to bed and early to rise was no jolly good if you didn't advertise, to paraphrase Captain Mainwaring from Dad's Army. Apple's adverts of the 1980s were simple affairs with bold text that emphasised the benefits of the Apple system, at a time when computer adverts generally consisted of a photograph of some PC components with some text that emphasised the low, low price. Apple's adverts were run in newspapers and magazines rather than just the specialist press; on one memorable occasion the company even advertised on TV.

The Amiga and ST never came close to the sales figures of the Macintosh. They flopped in the United States; the US audience generally preferred games consoles for games and an IBM PC for work, and looking back through old issues of Infoworld and PC Magazine it seems that neither Commodore nor Atari set any money aside for marketing. Here in the UK the ST and Amiga were popular, but the UK is a very small market. The irony is that although the Macintosh became stereotyped as a multimedia machine for multimedia types, its graphics and sound capabilities were very basic compared to the Amiga. The 1985 Amiga had a technical lead right up until the 1987 Macintosh II, which cost over $5,000 (the colour monitor itself cost more than a contemporary Amiga system). As a publicity stunt Commodore sent an Amiga off to Andy Warhol, who tinkered with it, although the company seemed to give up on marketing it to artists or indeed marketing it at all thereafter. Despite Commodore's indifference it was adopted by the television industry for TV graphics and transitions, and eventually 3D animation, at a time when Macintoshes were essentially just desktop publishing machines. Meanwhile the Atari ST dominated music production, because it had built-in MIDI ports and Cubase, and it was a fraction of the price of a Fairlight. Ultimately both Commodore and Atari were swept aside by the PC.

The two companies were run by hard-nosed businessmen who wanted to sell "to the masses, not the classes". In fact they were run by the same hard-nosed businessman - Jack Tramiel, who founded Commodore but left the company under a cloud in 1984. By all accounts Tramiel had Steve Jobs' abrasiveness but none of his charm and, like Britain's Alan Sugar, he was essentially a shrewd opportunist who just happened to gravitate to the computing industry. His abrasiveness was legendary and permeated Commodore, although it was not hard to see where it came from. He was born Jack Trzmiel in Poland in 1928 to a Jewish family. Tramiel and his parents were rounded up by the Nazis in 1939 and sent to the ghetto in Łódź, and thence to Auschwitz. The family was separated, with Tramiel and his father being sent to a work camp while his mother remained behind. Tramiel's father died. By some miracle his mother survived. He was eventually liberated in 1945. I spent my teenage years going to school; he spent his teenage years being forced at gunpoint to work himself to death. I imagine that must have eroded his faith in humanity.

In 1984 Warner wanted to get rid of their loss-making Atari division, and so they gave it to Tramiel for a pittance, which led to a peculiar situation where Atari's new ST was designed by ex-Commodore people working for the former boss of Commodore, while Commodore's Amiga was based on a design created by ex-Atari people who had left the company some years earlier. Although it's easy to feel sympathy for Tramiel given his personal history, as a businessman his approach traded a shot at future glory for short-term gain. It's certainly feasible to continue targeting the low-end market indefinitely, and lots of visionary companies have gone bust chasing a dream, but it takes hard work to win the commodity end of the market, and with slim margins only a couple of failures can destroy a company. Commodore and Atari had several failures; the magic left them, then the money, and then they were no more.

The lovely palmrests. You press the black buttons with your fingers and symbols appear on the screen. Some combinations of symbols form words; some combinations of words can make people cry or give them a hard-on or make them angry or inform them that they have an appointment to see the dentist on a certain date.

OS X
The last time I used a Macintosh in anger it was a Macintosh Classic running System 6. It dated from a period in Apple's history when the company sold a range of cut-down machines at a price that was still very high, on the arrogant assumption that the general public would pay for the Macintosh operating system because it was posh. My high school had bought a room full of Classics in order to teach us desktop publishing with PageMaker, which is forgotten now but was the market leader in the pre-Quark days; a second room had some IBM PC clones running Windows/386, plus what was probably MS-DOS 3 and a mixture of dBase and WordPerfect. Even at an early age we were taught that Macintoshes were effete and dainty and that PCs were tough and unsentimental.

My impression was that the PC clones could do most of the things that the Macintoshes could do - Flying Toasters excepted - but the Macintoshes were easier to use and a lot more elegant; it took me no time to get used to Aldus PageMaker, which I learned by looking at the menus, clicking things, and trying things out. If I had done that with a PC, it would have crashed. I could probably have used WordPerfect for DOS to emulate the functionality of PageMaker, but it would have been difficult, and the effort would have been better spent improving the document. Although I had no part in setting up the Macintoshes, I imagine that they were put on the desks, hooked up to a server with cables, switched on, and they just worked, whereas the IT staff had to spend their weekend getting the PCs to talk to each other.

I distinctly recall the Classic's pin-sharp black and white screen, and the fact that you could turn the machines on by pressing a button on the keyboard rather than by flicking a physical power switch. This seems silly nowadays but it felt like magic at the time. On the whole the Macintosh felt like a complete, well-thought-out system rather than a bag of bits. It had meaning. It felt real and permanent, in a world otherwise composed of transient rubbish. I can understand why Apple attracted damaged people. Divorcees, the mentally ill, alcoholics, drug addicts. They have bleak, empty lives of self-pity and self-hate, largely devoid of purpose and meaning, sleeping in a sleeping bag in the office, sitting in McDonald's until the hotel room has been cleaned; the Macintosh gave them something pure and untainted to restore their faith. As I gazed upon the Mac Classic's ADB keyboard and its software-controlled power button I imagined that I was Gustav von Aschenbach, from Thomas Mann's Death in Venice, gazing upon the beautiful Tadzio. There was beauty in this world, beauty I could never have, beauty I would probably destroy if I touched it, probably destroy if it was aware of my gaze. That is why men dress up as women in those hideous rubber masks; they want to be beautiful, not so much for the looks, but because they adore the concept of beauty. They cannot accept that there are things beyond their grasp.

The original Macintosh operating system evolved from the early, single-tasking System software into Mac OS 9, which was very attractive but had some major limitations - no multi-user support, unimpressive multi-tasking, poor memory management, constant crashes. Apple attempted to replace it with an in-house project called Copland, but although Apple was a large company, it was split into several different departments, and the operating system division had neither the resources nor the authority to make Copland work. OS X was essentially a fresh start, a completely new operating system bought in from outside the company. It was launched in parallel with the titanium G4, and although it took a few years to win the hearts of Apple fans it was technically solid, and is by all accounts an excellent operating system.

In the past it was often opined that Apple's focus on making its own computers by itself was misguided, and that it should instead have developed a Macintosh standard that other manufacturers could use. In 1985, a year after the Macintosh had been launched, Bill Gates wrote to Apple's John Sculley to argue the case that Apple should open up the Macintosh standard and encourage the development of third-party clones. We tend to think of Microsoft and Apple as bitter arch-enemies, but for most of their history they competed in different fields; even Windows itself does not directly compete with the Macintosh, because it was never intended for Macintosh hardware. That Windows PCs came to outsell Macintoshes by a huge margin was, I suspect, incidental; besides which, the IBM PC had been outselling Macintoshes even before Windows came along.

Gates' memo implied that HP, Wang, DEC, TI and the rest should launch their own Macintosh-compatible machines. Would they have done so, even if Apple had asked them? A Catch-22 would have been in effect - the Macintosh market was small because there were no cheap Macintoshes, but why build cheap Macintosh clones for such a small market? My hunch is that Apple would have charged steep licensing fees, that a Wang-intosh would have been a pile of junk, and that the Mac clones would have quickly died out. Apple did try the idea many years later, with essentially the result I have just outlined. It's tempting in retrospect to assume that Bill Gates had a nefarious purpose to his memo, but I suspect he had a soft spot for Apple hardware and wanted Apple to do more to justify Microsoft spending money developing for the platform.

OS X runs on the same architecture as my desktop x86 PC, but sadly Apple doesn't officially let people install OS X on non-Apple computers. Given the tepid response to Windows 8, you have to wonder whether OS X for Intel at a knock-down price would have sold well and expanded the OS X ecosystem. But which major PC manufacturer would have been brave enough to sell PCs with OS X, thus dooming their relationship with Microsoft? And is Microsoft still the tyrannical giant it once was? I assume Apple has thought about all of this and whispered no, but it's a shame that OS X is restricted to Apple computers. It seems locked into an ever-narrowing niche, as Apple fans transition to the iPad, and there will come a time when further development of OS X is just not economical. As the Cloud Age advances upon us the choice of desktop OS is increasingly irrelevant, in which case OS X faces strong competition from Google's Chrome OS and Android.

On a technological level OS X is fundamentally UNIX with a fancy graphical shell - almost a kind of Linux distribution, except that it is actual, certified UNIX. OS X was originally developed with the same basic look as the original Macintosh OS, but by the time of launch it had a new interface design, Aqua. My recollection of the OS X launch is that everybody fixated on that. It had buttons that looked like tiny droplets of water, and brushed metal menu bars, and in general it was a visual analogue of the translucent, candy-coloured Macintoshes. The Aqua look became a target for mockery and Apple eventually toned it down. OS X 10.4.11 is positively restrained in comparison to earlier versions. Its simple menus and low-key icons complement the titanium G4 very well. The animations can be turned off, the menus are neat and simple, and it's less silly than Windows Aero.

I like OS X. Version 10.4 was designed for hardware twice as fast as my titanium G4, with twice as much memory, but it has never crashed, the audio system works well, it eats up the USB peripherals I have chucked at it - it just seems to get them working instantly - and in general it doesn't become choppy and unresponsive when too much is going on. My assumption is that subsequent versions of OS X running on more powerful hardware are even better.

It's striking how flexible OS X has been. It was originally designed for single-core, single-processor 32-bit PowerPC systems running at 400MHz, with 128MB of memory. Since then it has been made to work on a huge range of configurations, nowadays settling on quad- and eight-core 64-bit Intel machines running in a post-GHz era, with gigabytes of memory and SSDs. The iPad and later iPods use iOS, which is built on the same fundamental UNIX kernel as OS X, with different components layered on top. OS X 10.2 Jaguar and OS X 10.9 Mavericks are of course very different beasts, and XP supports a wide range of hardware as well, but still, in my opinion OS X was the right decision on a technical level.

I'm not too keen on some of OS X's interface elements. The Windows XP-era start menu and taskbar are good designs that work well; OS X 10.4 has nothing that combines the thoroughness of the start menu with the intuitiveness of the taskbar. Little things are irritating, too - OS X has no simple maximise button, no obvious "cut" command, it puts the machine to sleep when you close the lid, whether you want to or not, etc. The " and @ keys are swapped. Windows Explorer is a better file manager than the Finder. These are little things.

Still, it works, it works well, it's fast and smooth. Unlike Linux it supports a wide range of applications that I want to use; Logic in particular is very impressive, you get a lot for your money. OS X 10.4.11 dates from 2005, and I remember using Ubuntu around that time. I had to extract the firmware from my broadband modem using Windows and port it to Ubuntu before it would work, and then in order to get widescreen monitor support I had to edit xorg.conf - a text file - and meanwhile Ubuntu fans were telling the world that Ubuntu was a drop-in replacement for Windows. They are a bunch of ideologically-driven liars with tiny, highly-focused minds. My hunch is that Linux fans use Linux so that they can tweak Linux, and that for the most part their genuinely productive work consists of text editing, which is not a great advert for Linux. They seem to believe that if Logic or Ableton is not available for Linux, people should simply not create music, or not use a computer at all, ditto Photoshop and image editing.

Incidentally, I used this particular G4 to write the last couple of posts, largely because the palmrests are fantastic and the screen is lovely. The palmrests are just painted metal, but they feel smooth and warm. The keyboard isn't lit in any way. It's a bit rubbery but it is anchored well. I can type for extended periods with it.

Is it me writing, or the drugs? There is a battle for supremacy that will not end until one of us is dead, except that drugs cannot die because they are a substance. I use Bean, a simple text editor, which runs quickly, which appeals to me because my mind runs quickly. Could I have done all of this with my desktop PC? Yes, undoubtedly, but writing is a creative act, and I have to be in the mood. The G4's smooth palmrests get me in the mood. I like to move my hands over them. I like to feel good.

The Register's design never changes.

PowerPC
The titanium G4 was one of Apple's PowerPC machines. The original Macintoshes were powered by Motorola 68000 chips, but in the 1990s Apple, IBM, and Motorola - the AIM Alliance - felt that they could do better. Working together they came up with the new PowerPC architecture, a fresh start based on then-novel RISC technology. PowerPC was fast enough to run 68000 applications under emulation, and did more work than the new Pentium at a given clock speed. It was cheaper to make, used less power and generated less heat, although the last two factors were mostly ignored at the time. The G4 in my G4 is a low-power version of the 7450 that uses between half and two-thirds of the electricity of contemporary, comparable Pentium III Mobiles. As a consequence the titanium G4 gets about four or five hours out of the battery, which is still not bad.

Nowadays people care about heat output and power consumption, but my recollection is that until the mid-2000s it was simply accepted that computers should be hot, hungry beasts with eight fans. In contrast the G4 Cube of 2000 was released without a fan at all, because Steve Jobs wanted his machines to be quiet and cool, and at the time it seemed like a silly affectation. The titanium G4 has a fan, but it only runs when the machine gets really hot; otherwise the titanium case acts as a heatsink. This creates something of a paradox. PowerPC Macintoshes generated less heat than contemporary Pentiums, but the machines often ran very warm because their cooling was less aggressive. Apple eventually had to abandon this philosophy when it became apparent that the only way to keep the G4 competitive was to speed it up, and later G4 desktops were notorious for having very loud fans, at which point the PowerPC architecture was obviously in trouble.

The AIM Alliance had high hopes for PowerPC. It was supposed to replace x86, and by the year 2000 businessmen would be running Windows NT and OS/2 on their PowerPC computers while Macintosh owners would run the Macintosh OS on PowerPC Macintoshes, and it would be easy to port applications from one platform to the other because there was a common PowerPC standard, and there would be PowerPC UNIX workstations and PowerPC cash machines and PowerPC space probes and the Intel x86 would be just like whiny little Pluto, crying out in space because it can't go into the planet club any more boo hoo because it's not a planet it's a stunted little speck of dirt. But the AIM Alliance fumbled the ball, which was immediately kicked out of their hands, or perhaps they never came close to grabbing the ball, I don't know. IBM seemed to fumble the ball the most. The company still makes POWER chips for its server range, and my hunch is that they were more interested in developing high-power chips for the servers than piddling little low-power chips for Apple's silly laptops. IBM recently sold its server division, and it is generally assumed that POWER is not long for this world.

Eventually the biggest threat to Intel's dominance wasn't the AIM Alliance at all, it was AMD, and then only briefly; and a few years later it was ARM, and that drama has not yet played out. PowerPC ended up in a few games consoles and lots of Macintoshes, and it was always a good mobile chip, which made sense for Apple given the success of the PowerBook range. On the whole it was a worthwhile idea but it might have been so much more. In a way the dream of the AIM Alliance came true - modern Macintoshes can run OS X and Windows, and the two platforms share common hardware at the very least - but it was a long and winding road that seems to have been navigated in the dark.

In the context of Macintoshes the early PowerPC chips were wasted on a bunch of anonymous machines that nobody remembers. The G3 generation was part of Apple's renaissance; the G4, to my mind, is the definitive PowerPC. It was the apotheosis of the design and sadly its peak. The titanium G4 reached 1GHz in late 2002, but three years later, when the G4 was finally laid to rest, it had only progressed to 1.67GHz. In theory the G5 should have taken over, but IBM never managed to design a satisfactory mobile version of the G5, and so Apple's PowerPC laptop range ended with a G4.

The G4 desktops dealt with the performance bottleneck by using multiple processors, but by the time of the G5 the PowerPC was starting to reach a point of diminishing returns. In practical terms, although the PowerPC was generally competitive with Intel chips, it never reached a point where it was inarguably better. The 667MHz G4 in the desktop contemporaries of my PowerBook is supposedly on a par with a 1GHz Pentium III, especially if the Macintosh application uses the G4's AltiVec vector engine, but by that time the Pentium III was hitting 1.2GHz and 1.4GHz, and the even faster Pentium 4 was available. The G4 may well have been more efficient, but what did that matter if the end result was still slower? The G5 was competitive, but it was also very hot and power-hungry, and Intel's Core Duo architecture was inarguably better.

It's fascinating to imagine what might have happened if Apple had turned down the G5 and abandoned the PowerPC earlier. In the early to mid 2000s Intel's biggest competition came from AMD. At the time, the Pentium 4 seemed very expensive for what you got; the AMD Athlon and Athlon 64 were cheaper and outperformed it. Given AMD's subsequent history, an Apple-AMD alliance would probably have been a bad idea, but if Apple had turned down the G5 and opted for the AMD Athlon 64 instead it would surely have given AMD a shot in the arm. Would a few hundred thousand more sales have helped AMD? Probably not, and I imagine that Apple would have ended up wedded to the wrong partner, although they could always have dumped AMD in favour of the Core Duo and its heirs. OS X had apparently been ported to the x86 architecture several years before the switch was announced, so that would have been less of a problem than it seems. It's ancient history now.

I have an old Pentium III laptop, a ThinkPad 600X running at 500MHz. It's a year older than the G4 and belongs to an earlier generation. It has a single USB port, a much dimmer 1024x768 screen, a maximum memory limit of just over half a gigabyte. Physically it feels a lot more robust. It's made of glass fibre-reinforced plastic. The lid stays rigid, whereas the G4's lid wobbles back and forth because it's made of very thin metal. The G4's battery lasts longer. The two machines weigh more or less the same. It's an unfair comparison, but for the sake of completeness the G4 feels a lot faster, even with half the memory (mine originally came with 256MB, and I only managed to expand it to 1GB after most of this post had been written; the music above was made with a 256MB machine). OS X loads up and then gets out of the way, whereas XP SP3 has a nasty habit of booting to the desktop - and then continuing to boot.

Are Apple fans nostalgic for the PowerPC? It seemed to reach an ignominious end. From my perspective, as a PC owner, Apple announced the new G5 as if it was God's gift to mankind, and then almost overnight the G5 and the PowerPC architecture were gone. I pity anyone who paid money for one of the later, faster G5 machines. They had a liquid cooling system that was prone to leaks, so not only have they depreciated like mad they probably don't work any more.

Envoi
Apple's history as a computer company, rather than as a consumer electronics giant, is a long and melancholic story that defies summarisation. Like Mercedes and Wright, Apple played an instrumental part in the development of a new industry; it still exists and is doing well, which is impressive given that the industry is vastly different. When Mercedes was founded cars were exclusive toys of wealthy people, but within twenty years they were mass-market commodities. Nonetheless there is still a demand for Mercedes saloons, and people will pay more for them, and they are genuinely good cars. And of course Apple now targets several different markets, whereas Mercedes is still fundamentally an automobile manufacturer. A history of Apple would be a history of the consumer computer industry, a massive undertaking. The titanium G4 was one of the key products that helped Apple survive during a period when it had staved off collapse but was not yet dominant, and along with OS X it showed that Apple was still fighting. And yet in a way it was all for nothing, because despite OS X and an impressive product portfolio, Apple's share of the home computer market has actually declined since the 1990s. But the market is much larger, there are other markets, and domination of the PC market no longer seems the victory it once was.

The Apple Macintosh range began with a single, desktop machine, an all-in-one that remained on sale largely unchanged for a couple of years. The range expanded to include tower machines, plus a series of laptops that almost became an entirely separate line; in the 1990s it briefly seemed that Apple would abandon desktops in favour of PowerBooks and Newtons. As of late 2014 the Macintosh range consists of a professional tower that feels superfluous, a kind of budget home media server, and an all-in-one iMac with a fantastic screen that everybody wants. The 5K iMac is essentially the direct heir of the original Macintosh, and I can't think of another computer company that has managed to successfully sell the same concept for thirty years, certainly not without coming across as a throwback. Ultimately computers exist to serve mankind, and mankind - including women, who are basically men - has not changed since 1984. We breathe the same air and have the same desires.

What will Apple be like, thirty years from now? In the future of 2044 we will travel by downloading our bodies into 3D printers at our destination, except that we won't travel because there will be no reason to do so; instead we will sit at home and send our drones to go on holiday for us, except that every square inch of the world will have been photographed and everything will be the same, so we will probably just sit at home in front of the television. Thirty years from now computers will have generated every text, thought every thought, produced every creative work possible in this universe, and all of humanity will exist as a set of symbols engraved on a silicon chip embedded in an asteroid slowly making its way to the site of the Big Bang, in the hope that it might re-enter the cosmic womb and become unborn. Short pause Apple something-or-other. Apple.

Wednesday, 24 December 2014

A while back I bought a stash of old film on eBay, mostly without the packets; among them was a roll of Agfa APX 25, an old and very slow black and white film. It was introduced in 1989. Google Books suggests that it was discontinued in 2002 or 2003, with the last rolls expiring a couple of years later, by which time Agfa itself had collapsed.

With only one roll I had no idea what to expect, and I couldn't test it in any way, but I was pleasantly surprised. My approach to odd finds like this is to just shoot it; just flick the switch. "One more body amongst foundations makes little difference. Well? What are you waiting for? Do it. DO IT."

ISO 25 equals roughly 1/30th of a second at f/4 in overcast daylight, but this isn't a huge problem with a TLR. One of the best things about slow film is that you can use wide apertures in daylight, which equals narrow depth of field, which looks neat. It brings in the punters.
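For the record, the sunny-16 arithmetic behind that figure can be sketched in a few lines; the EV values are rules of thumb (dull overcast at roughly EV 11 for ISO 100 is my assumption, and published tables vary by a stop), and the function name is mine:

```python
import math

def shutter_time(iso, aperture, ev100):
    """Shutter time in seconds for a scene of brightness ev100
    (exposure value quoted at ISO 100, as in sunny-16 tables)."""
    ev = ev100 + math.log2(iso / 100)   # rescale EV to the film speed
    return aperture ** 2 / 2 ** ev      # from EV = log2(N^2 / t)

# Dull overcast daylight is roughly EV 11 at ISO 100.
t = shutter_time(25, 4.0, 11)
print(f"about 1/{round(1 / t)} s")      # about 1/32 s - close enough to 1/30
```

Each stop of extra cloud cover halves the shutter speed again, which is why slow film and a leaf-shuttered TLR get along so well.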

I stand-developed it with R09 / Rodinal, but despite this the result was basically grain-free. There is grain, but it looks like a low-level blur rather than individual little spots. APX 25 sharpens well, and generally it's very Photoshoppy - low contrast, good resolution, similar to digital. You can add your own contrast. Shame it was discontinued. I have no idea how the previous owner stored my roll, but in general slow black and white films seem to age well.

A couple of these images have a pattern, particularly the BMW. I'm not sure if this is from the film's storage or from my scanner's dirty glass flatbed. Half and half, probably.

"... in exchange for £1,200 a month"

Seriously, the old sharknosed BMWs were wicked cool. If I ever become a multi-millionaire I will either buy an old sharknosed BMW or a Porsche 912, which I will park on the street thusly, and never use - because I would already live in London. Heathrow is at the end of the tube and the rest of the UK isn't worth visiting.

There will be cocaine hidden under the seat cover in a ziploc bag inside a second ziploc bag, and many years later I will feel slightly sad that the police never stopped me, not once. It was because I looked too boring. And that hurts.

In an age of ISO 25,600, twenty-five is distinctly retro. As far as I can tell the only remaining slow film is Ilford PAN F 50, which I can't remember ever using. I'll have to dig into my box of things to see if I have some.

Sunday, 14 December 2014

Off to Marrakech (and Essaouira) with an Olympus Pen FT. At this time of year Britain's weather is the meteorological equivalent of a boot stamping on a human face forever, and I can't think of any reason to stay. In fact Britain doesn't have much going for it even at the best of times. On a strategic level the location is superb; our bombers can reach the Continent, Scandinavia, North Africa and the Middle East, and so can our airlines. Historically Britain once had vast coal deposits, which came in handy during the industrial revolution and meant that Britain was, for a time, an energy powerhouse. Nowadays Britain is an attractive investment haven. It has a stable government, a strong military, and a police force that doesn't ask questions of the very wealthy or demand bribes.

But unless you are wealthy Britain is no good. The people are stupid and fat, the weather is awful, there's no cultural or artistic tradition, everything is very expensive, and yes, not all of the people are stupid and fat - Stephen Hawking is clever, Keira Knightley has a BMI of 18.6, which puts her at the lower end of the normal range - but the vast majority are worthless. Keira Knightley obviously is an exception. She is valuable. She is good and pure, and when I am King she will be Queen. Or Helena Bonham Carter if Keira Knightley refuses. I know that some people don't like Keira Knightley. When I am King they will learn to love her. Or else.

It's not so much that I dislike stupid fat people. I don't have a problem with stupid fat happy people. It's not even a lack of intellectual curiosity that bugs me. The problem is that Britain has a tradition of aggressively stupid people; people who aren't simply dumb, they're actively ignorant and suspicious of knowledge. As if they were dimly aware of their own inadequacies but unable to man up and confront them. I mention this because despite the obvious merits of eliminating most of the population plus the existing Royal Family and replacing them with myself and Keira Knightley and our offspring and some robots to do the work, neither of the main political parties have so far agreed to implement my plan. Indeed the police were downright angry when I tried to present my ideas to David Cameron and they have forbidden me from sending letters to the Duke of Hamilton as well.

The Pen F is a half-frame SLR from the 1960s. Olympus made almost half a million of them spread across three different models. Mine is a Pen FT, the most popular of the three; it accounted for over half of the F's sales. Compared to the original F, the FT has an uncoupled lightmeter, single-stroke film wind-on and a couple of other tweaks. Judging by the serial number mine is apparently the very first Pen FT to leave the factory in 1971, which means that it's five years older than me.

The original Pen F was launched in 1963, in the midst of a short-lived half-frame boom. Design-wise it was the work of top late genius Yoshihisa Maitani, who had sparked off the boom in 1959 with the original viewfinder Pen, a tiny but well-made compact that shot twice as many images on standard 35mm film as the competition. Maitani wasn't yet at the top of his game - in my opinion the OM models were his high water mark - but at least on a visual level the Pen F is one of the best-looking cameras of all time, one of the most attractive objects of the last half-century. It looks like a cross between a rangefinder and a compact camera, but it's actually a full-blown SLR, with a through-the-lens viewfinder and an interchangeable lens mount. The half-frame format allowed Maitani to fit the shutter, viewfinder, and lens into a space about three inches cubed, with the mirror arrangement turned on its side, and as a consequence the F has the lens mount shifted off to the left.

What was half-frame? Half-frame cameras used standard 35mm film, but the images were half the size. They were the same shape as standard 35mm pictures, but turned on their side, and so when you look through the viewfinder of a Pen F the image is in portrait orientation. Half-frame was popular for a while because you could take lots of shots with short film rolls, but perhaps because of this Kodak never embraced the format, and it didn't take off in the United States. Kodak wanted people to buy more film, not less, and in the US half-frame was dwarfed, nay crushed by Instamatic and 110. Most blog posts that talk about half-frame blame the format's demise on the Minox and Rollei compact 35mm cameras of the 1970s, but half-frame was already dying off by then, and the German cameras were generally aimed at a posher market.

I've always had a soft spot for half-frame. The resolution is essentially the same as 35mm motion picture film, and it's neat having 72 shots per roll. Photo labs process it without any issues because the frame gaps are in the same place as standard 35mm, and my Epson V500 scans it without any problems. The scanner assumes that each pair of pictures is a single 35mm image with a black lamp-post running through the middle. I like the way that the images are paired; I can see a little bit of my thought processes as I selected shots, and the juxtapositions are sometimes striking.

The original Pen F was manual everything. You had to use an external meter or informed guesswork if you wanted the pictures to turn out right. It was sold between 1963 and 1966. Olympus enlisted top late photojournalist and Hemingway-esque photographic icon Eugene Smith for the adverts, although I have no idea whether he actually used it. He was a fan of smaller film formats, and had been sacked by Newsweek in the 1930s for refusing to use 5x4 plate cameras, and photojournalists are a pragmatic lot, and (breathe in) the Pen F would have been pretty much a drop-in replacement for contemporary Leicas, and I began this sentence some time ago, and it now has a life of its own, and it would be wrong to end it

The FT was sold from 1966 right until the end of the run, in 1972. The uncoupled electronic lightmeter means that you still have to set the shutter and aperture yourself, but with the FT there's a readout in the viewfinder that helps you. It has a strip of numbers, running 0-7 from top to bottom, which match up with numbers on the lens. You're supposed to set the shutter to roughly the correct speed, check the meter, and then select (say) 3 on the lens. The lens also has a conventional aperture scale which you can bring into play by pulling and twisting the aperture ring.
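Under the usual description of that system - each number being one full stop down from the lens wide open - the numbers translate to f-stops straightforwardly. This is a sketch based on that assumption, and the function name is mine, not Olympus's:

```python
def ft_aperture(max_aperture, n):
    """f-number implied by Pen FT meter number n, assuming each number
    is one full stop down from the lens's maximum aperture."""
    return max_aperture * 2 ** (n / 2)   # one stop = a factor of sqrt(2)

# For the standard 38mm f/1.8:
for n in range(8):
    print(n, "=> f/%.1f" % ft_aperture(1.8, n))
# 0 => f/1.8, 1 => f/2.5, 2 => f/3.6, 3 => f/5.1 ... 7 => f/20.4
```

Which is why the numbered scale only works with the lens it is matched to - a slower lens starts its scale at a different f-stop.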

The shutter runs from one second to 1/500th, with flash sync at all speeds. The Pen FT has a PC terminal but no hotshoe, so you have to mount the flash on a bracket. As far as I can tell the original Pen F didn't even have a PC terminal, so I'm not sure whether it supported flash at all. My FT was made in 1971, forty-three years ago, but the shutter and meter are still accurate. The shutter fires with a reassuringly solid sound. This is one respect where rangefinders have the Pen F beat; rangefinders are usually much quieter.

The meter uses PX625 mercury batteries, which were discontinued long ago. I have an adapter that lets me use silver oxide 386 batteries instead. Of late a Russian source on eBay has started selling PX625s, apparently from a military stockpile, so you have to ask yourself whether you want to (a) buy one of these batteries for £10 but run the risk that it will explode or contain a bug, (b) spend £30 on an adapter, or (c) spend £100 on a second-hand Sekonic L-308. Good luck! Assuming you want to use the meter at all. Mine is still accurate, but you might want to check yours before you shoot slide film. For this article I used an external meter for the first five minutes and then gave up on it, because the FT's meter is accurate enough for slide film.

Velvia 100F

The FT had a couple of physical tweaks to support the meter. There's a window in the top plate that channels sunlight into the meter display so that you can see it clearly. The viewfinder itself is apparently dimmer than the F's, because some of the light is diverted to the meter, but I haven't used an F so I can't compare them. The film speed setting on the shutter speed dial runs from ISO 25 to ISO 400, which means that you're going to have to do some mental mathematics if you want to use faster film. Given that this is 2014, and the only films left are Ektar 100 and Fuji Superia 400, that's less of a problem than it was in the 1990s.
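The mental mathematics is just a ratio of film speeds: meter with the dial pinned at its ISO 400 maximum, then close down one stop (or go one shutter speed faster) for every doubling of the real film speed. A minimal sketch, with a function name of my own invention:

```python
import math

def stops_to_close(film_iso, dial_max=400):
    """Stops to stop down (or shutter speeds to go faster) when metering
    film faster than the dial allows, with the dial set to dial_max."""
    return math.log2(film_iso / dial_max)

print(stops_to_close(800))    # 1.0 stop
print(stops_to_close(1600))   # 2.0 stops
```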

The FT was followed by the FV, which was essentially an FT without the meter - but with the original Pen F's brighter viewfinder. It was available in chrome and black, and was sold as a budget alternative to the FT. As fate would have it the black FVs are now the most sought-after Pen F model, because they look awesome and they have a nice viewfinder. The Pen F range was discontinued in 1972, making way for the compact, full-frame Olympus OM.

The Pen F was the only half-frame SLR with interchangeable lenses, although surprisingly it wasn't the only half-frame SLR. In 1988 Yashica tried to revive the format with the Samurai, an autofocus bridge camera with a through-the-lens viewfinder and a built-in zoom lens. It looked like a video camera and seems to have sold well enough to spawn a plethora of badge-engineered copies, although nobody remembers it nowadays.

Olympus sold a modest range of Pen F lenses, which ran from a 20mm wide angle to a rare and expensive 800mm mirror lens, plus a couple of telephoto zooms. For years after the demise of the Pen F they languished in obscurity, until a new wave of interchangeable-lens mirrorless digital cameras made them desirable again. They're well-made, cute little metal lenses that can be adapted for Micro Four-Thirds and Sony NEX bodies. Half-frame has a crop factor of 1.4x, roughly the same as APS-H, and so the 20mm f/3.5 is a 28mm, the standard 38mm is a 53mm, the 60mm f/1.5 portrait lens is an 84mm, etc. On M43 digital bodies the focal length is doubled; on NEX it is multiplied by 1.5.
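The conversions above are all the same multiplication. A minimal sketch, assuming the usual rounded crop factors (roughly 1.4x for half-frame, 2.0x for Micro Four-Thirds, 1.5x for Sony APS-C; the dictionary keys are my own labels):

```python
# Approximate crop factors relative to full-frame 35mm.
CROP = {"half-frame": 1.4, "m43": 2.0, "nex": 1.5}

def equivalent(focal_mm, body):
    """Rough 35mm-equivalent focal length of a lens on the given body."""
    return round(focal_mm * CROP[body])

# The Pen F lenses mentioned in the text, on their native half-frame body:
for lens in (20, 38, 60):
    print(f"{lens}mm -> {equivalent(lens, 'half-frame')}mm equivalent")
```

The same function gives the adapted-lens figures too: the standard 38mm becomes a 76mm equivalent on M43, or a 57mm equivalent on NEX.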

The big star lens was a 42mm f/1.2. The slightly posh normal was a 40mm f/1.4. Apart from the 800mm mirror lens there were no exotics - no fisheye, no soft focus, no tilt-shift, although there was a 38mm macro and a 38mm pancake. Zeiss built a prototype 54mm f/2, but otherwise there were no third party lenses for the system. Olympus sold adapters that allowed standard SLR lenses to fit on the Pen F, but these adapters are rare and expensive nowadays. I have only ever seen M42 and T-Mount adapters, but there were apparently Nikon and OM adapters as well. There was also a Leica M39 adapter, although sadly this only allowed M39 lenses to focus in the macro range.

I only have the standard 38mm f/1.8. EDIT: Although many years later I bought the 25mm f/2.8, which is just as good but less bokeh-licious. I don't have a problem with it. It focuses very closely, and I didn't notice any distortion or CA or glow. The filter thread is an odd 43mm. It's interesting to compare it with the later OM lenses - the aperture stop-down and lens release buttons are mounted on the lens rather than the camera body, but whereas OM lenses mounted them at 180 degrees from each other, Pen F lenses mount them at 45 degrees, like Dalek ears.

I mentioned further up the page that the Pen F is a good-looking camera. The styling has something of the jet age about it, particularly the dynamic step in the top plate. Maitani reused this design idea in the later Trip 35, which seems to have been based heavily on the Pen F but with the lens mount put in the middle rather than off to one side. The gently curving body is nice to hold and looks as if it is bulging with goodness. It's roughly the same size as an Olympus OM or Pentax ME but shorter and flatter; it's not as small as I was expecting, although it fits into a pocket much more easily than a conventional SLR because it's smooth. It's surprisingly heavy, too, although that makes it feel more expensive. I'm sure that Mr Man-About-Town circa 1962 would have been thrilled to buy a Pen for his wife, and would have felt that he got his money's worth. Judging by an article in Popular Mechanics, September 1964, the price of the Pen F was around $140 back then, which works out at around $1,000 nowadays.

On the negative side the shutter speed dial looks like an afterthought. It's not quite in the right place to turn without taking your hand off the camera, and it spoils the body's smooth lines. The OM system had the shutter dial mounted around the lens mount, and it's a shame Maitani didn't have the idea ten years earlier. On a mechanical level the Pen F is apparently a tricky repair. The shutter is a metal semi-circle that rotates very quickly rather than flapping up and down. It's made of titanium, but it's very thin and easily damaged. It seems that there were no obvious design flaws, but with twice as many shots per roll the cameras apparently wear out faster, and the compact innards are difficult to work on. Mine has a sticker inside the film area stating that it was serviced by South Coast Camera Engineers, Southsea, although sadly there's no date. The company appears to have gone bust in 1985. It's odd to think of the Pen F and 1985 in the same mental breath. The F's metal body and curvaceous styling would have seemed centuries old in the mid 1980s.

And, yes, Marrakech. And Essaouira. It was the beginning of December but not once did I hear Slade's "Merry XMas Everybody". Marrakech has not changed very much since I was there last, the reason being that it is starved of investment. There is a new shopping mall under construction between the centre of town and the airport but in general the potholes are the same, and the crumbling buildings are still half-finished. The poverty is charming if you're a tourist - poor people are cute - but probably not much fun if you have to live there; if you're a young man looking forward to a future of sitting behind a market stall until you die, earning just enough to live but nowhere near enough to improve your life or the lives of your children.

My solution would be to demolish the marketplace, move the poor people to slums on the outskirts, slash tax, basically strip-mine the country and use the locals as slave labour, with myself as king and the most beautiful Moroccan woman as queen, although we would spend most of our time outside the country, in an expensive house in London that the people of Morocco paid for. This wouldn't help the people of Marrakech very much, but as King of Morocco I would have to ask myself whether it is better for all Moroccans to be poor, or for one of them to be very wealthy. And if the burden of wealth has to fall on my shoulders, so be it.

Putting on my serious hat, it seems that without oil or any other means of "cheating", Morocco's economy relies on slow, steady growth, and that in order for this to work and for the people to have a better life, the economy has to be managed; managed efficiently; managed efficiently over the course of decades, nay centuries; and that it has to be managed openly and transparently, so that the population understand why they can't have 4K televisions now, but their children might. The problem is that stable, effective long-term economic management is difficult, and Morocco is at the mercy of crises both external and internal.

Is it that simple? Are the problems facing North and South Africa really that simple? Is it the case that wisdom, strength, unity and respect are the only things needed to manage a country effectively? Why haven't more countries adopted those values? I can only attribute it to human error.