Posted
by
timothy
on Thursday December 01, 2011 @08:50AM
from the still-living-in-the-basement-probably dept.

Alioth writes "The BBC has an article on the BBC Microcomputer, designed and manufactured by Acorn Computers for the BBC's Computer Literacy project. It is now 30 years since the first BBC Micro came out — a machine with a 2 MHz 6502 — remarkably fast for its day; the Commodore machines at the time only ran at 1 MHz. While most U.S. readers will never have heard of the BBC Micro, the BBC's Computer Literacy project has had a huge impact worldwide, since the ARM (originally meaning 'Acorn Risc Machine') was designed for the follow-on version of the BBC Micro, the Archimedes, also sold under the BBC Microcomputer label by Acorn. The original ARM CPU was specified in just over 800 lines of BBC BASIC. The ARM CPU now outsells all other CPU architectures put together. The BBC Micro has arguably been the most influential 8-bit computer the world has had, thanks to its success in creating the seed for the ARM, even if the 'Beeb' was not well known outside of the UK."

People used to get excited when a CPU clock was measured in MEGAHERTZ! Now we're jaded...

The fucking things did not run a GUI that emulated transparent glass. Nor could they process the video or images that we use today. People used to get excited about ASCII art and how clever that was. Today you can see pictures Hubble has taken in intricate detail, and instead of playing ASCII strip poker people are viewing HD porn instantly.

When home computers were new anything they could do was a marvel. Now we've seen what more processing power can do. We have a lot of bloat. We also have a lot of functionality that is taken for granted. You have to remember that international direct dialing was considered a wonder when the BBC microcomputer was introduced. ("What, you mean no operator connects you!?")

In the late 1980s Apple Computer and VLSI Technology started working with Acorn on the second generation of the ARM core. So once again Apple is there. It's getting like the black monolith in 2001. Pick anything: Apple may not have invented it, but they did shape what it became.

I really think that's the issue when people talk about computers of yore. It's not that we now have 3.8GHz quad-core chips with megabytes of RAM as cache that appear to perform the tasks we set them with less user responsiveness than the old computers; I think it's because in the old days, you got the chance to be clever to make it work. Today's computers are built up with layer after layer of bloat that is designed to make it easier to code, but really makes the overall experience for the end user less than optimal.

Once the eye-candy novelty factor has worn off (which takes, ooh, 3 minutes), no one cares one way or the other. All it does is waste energy by forcing the GPU to do pointless calculations. You couldn't have picked a worse example to explain why computers are better today.

That depends. If I'm in a first-person shooter, I most certainly do want a good emulation of glass in the windows. If I'm using a loupe in an application, then I certainly do want a good emulation of a glass magnifier. And in a GUI, if you want to indicate that a button is clickable, it's best to give it some kind of 3D effect so it looks like a physical button. And a glass effect can be part of that.

But I assume you're talking about glass borders around windows on the Windows OS. In which case I agree. But

(Please note, above linked project is actually pretty fucking cool: "In the summer of 2009, working from a single 6502, we exposed the silicon die, photographed its surface at high resolution and also photographed its substrate. Using these two highly detailed aligned photographs, we created vector polygon models of each of the chip's physical components - about 20,000 of them in total for the 6502. These components form circuits in a few simple ways according to how they contact each other, so by intersecting our polygons, we were able to create a complete digital model and transistor-level simulation of the chip.

This model is very accurate and can run classic 6502 programs, including Atari games. By rendering our polygons with colors corresponding to their 'high' or 'low' logic state, we can show, visually, exactly how the chip operates: how it reads data and instructions from memory, how its registers and internal busses operate, and how toggling a single input pin (the 'clock') on and off drives the entire chip to step through a program and get things done."

It is, however, the case that this might not be the fastest way to execute 6502 instructions...)

I loved Micro Men. The Beeb shows absolutely no interest in releasing it on DVD, but it's on YouTube. I'm getting really fucking tired of waiting for banner ads to load on YouTube, but if you want to see it, start here [youtube.com].

Some of the original people have cameos, like Sophie Wilson as the exasperated barmaid toward the end.

But there were far more megahertz than we'll get in gigahertz! You'll never get a 150GHz machine...

I'm with you totally...that's like well more than an order of magnitude faster than what we commonly have now, it would take huge advancements in technology to get there...heck, that would make the totally advanced and awesomely powerful computers we have these days feel like pocket calculators. Never'll happen;)

Which means we go back to the same strategies we did in the 80s and early 90s -- coprocessors. Or, put another way, multiple cores, stacked GPUs, DMA, hardware DSPs. And (gasp), the Second Coming of CISC.

At the end of the day, RISC was a way to get cheaper megahertz (and later, gigahertz). Now that we've largely maxed out clock speed to the point where it's almost counterproductive, CISC is just about the only place we have left to go. Instead of wasting 50 cycles loading values into registers, operating on those registers, evaluating the outcome, and branching based upon it, you can have complex variable-length opcodes that use billions of transistors and have sinful amounts of silicon dedicated to niche operations that would have been absurd to even contemplate 25 years ago with far fewer clock cycles.

There's a reason why a 16MHz 68000 can still run circles around a 100MHz ARM, and why a 1GHz Pentium-M beats a 1GHz Atom or ARM to a bloody pulp -- the CISC chips get more done behind the scenes with every public clock cycle. The fact that behind the curtain, they're secretly executing chains of RISC instructions with private, semi-asynchronous clocks as fast as they can & just presenting the public facade of a CISC architecture responding to a system-wide clock is a quibble. The point is that every time the public system clock ticks, they're getting WAY more done than a conventional RISC architecture could ever fantasize about. In effect, a modern AMD64 (or Core2) CPU is like a container full of virtual, disposable/pooled RISC processors that get instantiated to execute a single public opcode while privately dancing to the beat of their own drummer.

You are very sadly deluded if you think that a 16MHz 68000 could run circles around a 100MHz ARM. Saying something that stupid means your whole argument collapses.
I was a big Amiga fan back in the day, but I would never dream of saying that an Amiga with its 7MHz 68000 could perform faster than a basic Acorn Archimedes with its 8MHz ARM2. Load up any 3D game that was common to both platforms (Zarch/Virus) and watch them side by side. The Amiga loses (the Amiga wins in 2D games though because of its powerf

OK, I got interrupted halfway through the post, and accidentally made it look like I was attributing 100% to the CPU alone. The full argument I intended to make was that if you compare a 16MHz 680x0-ish PalmOS phone circa 2002 to any ~100MHz Windows Mobile phone circa 2003 or 2004, the PalmOS phone will almost always "feel" faster, because the PalmOS phone used separate peripherals for everything, but the nominally-faster WinMo phone tried to use the CPU for everything. It's the combination of CISC (enablin

There really isn't a CISC-vs-RISC endgame. The competition is over. The only thing that CISC buys you with your model is more compact code. But code compactness is not the big deal it used to be except on smaller embedded systems or simple CPU designs. The RISC architectures of ARM and MIPS solve that problem on the embedded side by using optional 16-bit instruction sets, and the embedded world has plenty of 16-bit RISC architectures for when you don't need a 32-bit processor. In the higher end Intel s

The screen display came out of the 1K of RAM, but it only used as much RAM as was needed. There was a special 'end of line' character to mark the end of each screen line. A blank line only needed one byte (the end-of-line char). A line with 'Hello, world!' on it would need 14 bytes. A screenful of text needed 768 bytes.
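That collapsed display file scheme is easy to sketch. (Python here just to show the idea; the 0x76 end-of-line marker is borrowed from the ZX81, which used its own character codes rather than ASCII, so treat the details as illustrative.)

```python
EOL = 0x76  # end-of-line marker byte (illustrative; the ZX81 used its HALT opcode, 0x76)

def encode_screen(lines):
    """Pack text lines into a collapsed display file: each line is stored
    only up to its last character, then terminated with the EOL byte."""
    data = bytearray()
    for line in lines:
        data += line.encode("ascii")  # only the characters actually used
        data.append(EOL)              # one byte ends the line
    return bytes(data)

print(len(encode_screen([""])))               # a blank line: 1 byte
print(len(encode_screen(["Hello, world!"])))  # 13 chars + EOL = 14 bytes
```

So a mostly blank screen costs almost nothing, which is how these machines got away with 1K of RAM.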

Many programs went to extremes to save RAM. There was a 1K chess program which displayed the moves as five chars at top of the screen, eg. 'E2E4+'. You had to use a real chess board to follow the game.

When I started, there were still guys around who had worked on computers in the 1950s. They used to talk about how they thought they'd got the world by the balls when they got their hands on an IBM 1401 with its 4K of RAM.

Once I sat next to an old dude at a professional banquet, and it turned out he'd started working on very early computers during the Korean War. He told me about when his department got an IBM 701 (which would have made it around 1953 or so). He cracked a grin and said (I swear to God), "Yep,

Hey you spoiled kids, I had a VIC20 (you need to use caps, we didn't have no lowers back then ya know!) and ya know what? It was fun! Sure the datasette was flaky, and if you didn't watch it your little sister would take the cassette that you had saved your three days' worth of heavy programming onto and record Culture Club on it (that left scars), but where else could you get new games and programs out of magazines and have everyone who had a computer be as geeky as you?

While everyone waxes on about the old days, there are some things I do NOT miss, like the price of RAM. Back then a 4MB stick cost more than a fricking car! As I sit here with 8GB in my netbook and another 8GB in my desktop, where even my $50 graphics card has 800 stream processors and 512MB of RAM of its own? Yeah, I really don't want to go back to counting each byte and having to PEEK and POKE and GOTO every chance I could just to squeeze a few more bytes in there.

So congrats to all those Brits with their Beebs; I hear it was like us and our VICs. If you want to date yourself, just compare your first machines to what you have now, and boy won't you feel old. I could fit my VIC AND my first FIVE PCs into the speed and memory of my $50 GPU and have cycles left over. And the first 5 of my PCs could have their entire hard drives dumped to my $14 flash stick and still have room for every program I had ever written for the VIC... wow... yay excess?

The BBC was expensive. It cost £400, back in the early '80s. It was an amazingly powerful machine in comparison to most home computers, but also much more expensive, so even the man on the street who did have a computer didn't usually have a BBC. The government gave schools extra funding to buy machines that had a certain feature list, and the BBC was about the only machine that qualified when this was launched, which accounted for a lot of the sales.

Sez you...the cool kids in my neighborhood were programming their Sinclair ZX81 [wikipedia.org] (which apparently was sold in the U.S. as the Timex Sinclair 1000) in Z80 assembly. B registers, C registers, (and pairing them as BC) D, E, F (paired with A as AF if I remember correctly?) H, L, IX, IY, etc.

Then we got the Commodore 64 and I had only three registers to play with (can't remember what their names were). I felt like I'd taken several steps backwards.

What the 6502 in the C64 (and the BBC Micro, under discussion) had which most people who started asm on the Z80 missed was the zero page. Effectively, using the zero page you had 256 registers. Zero page operations on the 6502 were as fast as register operations on the Z80. While I'm much more proficient at Z80 asm than 6502, I really appreciate the very straightforward and uncomplicated - but powerful - design of the 6502.

I used hand-assembled machine code on my Apple ][ Plus until a magazine published an in-place assembler in BASIC for the Commodore. I typed it in, modified the addresses, and stored it to cassette tape. I then implemented "life" cellular automata and went door to door until someone hired me. At that job, I met Lance Leventhal, author of 6502 Assembly Language Programming. I still have the book.

If I wanted to go back, I would burn a soft-core 6502 into an FPGA and run code on it. I had more time to do th

Indeed, you just had to write your ASM inside square brackets as if it were a BASIC program, and it was assembled into memory. (But the [ and ] rendered as arrows in the default text-only graphics mode.)

The most remarkable thing about the BBC is that they're still going, running production code.

I had the good fortune of working with (or rather, near) one of these systems a few years ago. When I asked why they hadn't upgraded the machine in nearly 3 decades, the head of the system simply responded: "It still works."

Ah, someone with brains. If it still works, why would you change it? (Concerns about finding a suitable replacement in a timely way aside, as that's a separate issue.)

BBCs were great for all sorts of things. Working in school IT departments I often find them, and sometimes I find "old" staff there who tell me how they used them for EEPROM reading/programming, and other interfacing that today's school machines can hardly do any more without specialist adaptors.

They even ran the Teletext service in the UK (they actually have a "Teletext" video mode on them) and all sorts. It was a programmable, extendable computer that did what was necessary and no more.

Oh for those days again. Here's hoping that Raspberry Pi thing takes off.

Of course, Teletext is pretty much dead in the UK now, replaced by shiny MHEG-based digital text services. The BBC ended up having to add Teletext-style page numbers to their replacement for it though - there was just such an incredible amount of demand from people that preferred that method of navigation.

With a system that old you can get away with taking that approach - with today's fascination with having everything somehow connected over the Internet, you simply don't have the luxury of being able to say "It works. Why change it?"

The first computer I ever used was a BBC Micro. It was around 1986 in a small private boarding school in the middle of the bush in Zambia. We were over an hour's drive from the nearest telephone. The school got one or two of these computers just before I left, and somehow they got me hooked on computers.

The only command I still remember was that you had to type "CHAIN" to run something. I've been curious about that command ever since, but a quick Google search leads me to believe that it "chained" the LOAD and RUN commands together.

There was a shortcut (control shift escape? Something like that - a few keys all on the left side of the keyboard) that would launch the first program on the disk or tape (depending on which was connected). You only needed to use chain for disks containing multiple programs.

"L" was the game that first got me hooked with computers. I played that game through to completion on one of our school's BBC micros, even though it involved doing so during break times, lunch and after school. I was very fortunate to have a maths teacher that was really into the BBC and knew what could be done with computers. We had an Econet network, fileserver and a computer room that we could spend our breaks in.

The OS and built-in BASIC in the BBC are extremely elegant: functions, procedures, a VDU dri

When I was 7, my school had three BBC Model Bs and one BBC Master. The head teacher gave us one half-hour lesson each week on whatever he felt like teaching at the time. Sometimes it was classics; for a few weeks it was programming. He taught us BASIC and Logo on a BBC B connected to a big TV. In break times and after school, we could reserve one of the machines to use, if we were the first to request it. I spent a lot of time ages 7 to 11 writing little programs on them. At home I got an 8086 PC and learne

Changing the screen mode 3/4 of the way down each screen refresh. Programming while counting every clock cycle - fantastic.
I still wonder where all the resources are wasted in current software.
I still say FRAK! when the need arises. Nobody knows what I'm talking about :(

We used to have a room full of them at school and we soon discovered if you rubbed your feet on the carpet and then pushed your locker key in between the keys to the exposed circuit board... they stopped working.

The irony is I later in life wound up maintaining student labs for a university and had to put up with "dickheads" like I forgot I used to be...

I remember in electronics classes being told the TTL chips could only handle 5 volts. And they gave us power supplies which went up to 25V. I mean seriously, what did they EXPECT a bunch of teenagers would do?? "Hey, nice bang, cool smoke effects! Let's try some capacitors now!"

Elite [wikipedia.org], developed for the BBC Micro and published by the same company that made the Micro, did get a lot of attention here in the U.S. (it was ported to all the major platforms). It was one of the first big universe sandbox games, and modern games like EvE Online are still influenced by it.

It was roughly contemporary to the Commodore 64 and Atari 800/800XL micros. More expensive than both of those, but cheaper than an Apple II, which was very expensive in the UK. The Apple II predated it by about 4 years, if I recall. My impression at the time was that the Sinclair ZX Spectrum (Timex Sinclair in the US) was far more popular in the UK, along with the C64. The BBC was common in schools, but less common at home, mostly due to a dearth of games and pricing. A cheaper version, the Electron, was releas

Cool - I remember something about Atari Jaguar being much more widespread in the UK than the US, but, then, today I can't even put my finger on what exactly the Jaguar was - I had an 800, plus a couple of 400s when they dropped to clearance price ($99 or less..), and then one of the Atari 16 bit machines that finally died when its internal floppy drive belt stretched out (OS was always loaded from floppy, so....)

Was the Beeb available before the Apple ][ ? Was it more or less expensive in the UK?

I get the feeling that the BBC Micro enjoyed a kind of tax protected status, the way American made pickup trucks do in the US.

The BBC came quite a while after the Apple II - if you've been following the 30th Birthday announcements, it's actually younger than the IBM PC (...of course, the IBM was eye-wateringly expensive for a few years, until the Clone Wars began).

I've programmed both and, generally, the BBC was considerably more powerful than the Apple.

It had a (much) better BASIC with 'structured programming' facilities (REPEAT/UNTIL loops, multiline IF/THEN/ELSE, named procedures), a built-in 6502 assembler (so you could use BASIC as a macro language) and neat indirection facilities for working with bytes/words/strings in memory. Unusually for "home" computers of the time it had a 'proper' operating system, quite separate from BASIC - the BASIC ROM lived in a paged memory space alongside applications such as word processors and other utility ROMs such as the disc filing system (popular BBC expansions included extra ROM sockets for applications, or 'sideways RAM' for use as a RAMdisk or to let you develop your own ROMs).

The graphics were much better (but with a caveat) than the competition - 160x256 in 8 colours, 320x256 in 4 colours or a TV-taxing 640x256 in monochrome. Also, those colour modes were fully bit-mapped, cf. the attribute-based solutions on other systems (where you could e.g. only have 2 colours in each 8x8 cell, or on the Apple where you could only plot white by plotting a magenta pixel next to a green pixel). There was a proper palette system (so you could do fast animation by palette switching - only TTL though, so it's always the same 8 colours) and 'hardware' scrolling by tweaking the memory mapping (which could also pull tricks like changing display mode half-way down the screen, as used in Elite). The caveat was that the RAM was shared between data and video - so the higher modes used 20K out of your 32K. Although aftermarket upgrades appeared that added a 20K page to replace the video RAM (which worked seamlessly provided that the application used the correct OS calls rather than poking things directly), Acorn took their own sweet time before building that feature into later models.
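For what it's worth, the "20K out of your 32K" figure checks out for all three bitmapped modes. A quick sketch (note: the 8-colour mode actually stores 4 bits per pixel, to cover the 8 steady colours plus their 8 flashing variants):

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """RAM consumed by a fully bit-mapped screen mode."""
    return width * height * bits_per_pixel // 8

# The three bitmapped modes described above, with their bit depths.
modes = [
    ("640x256 monochrome", 640, 256, 1),
    ("320x256, 4 colours", 320, 256, 2),
    ("160x256, 8 colours", 160, 256, 4),  # 4bpp: 8 steady + 8 flashing
]
for name, w, h, bpp in modes:
    print(name, framebuffer_bytes(w, h, bpp), "bytes")  # each is 20480 = 20K
```

All three come out at exactly 20480 bytes, which is why the machine traded resolution against colour depth rather than RAM.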

It also had a shedload of internal hardware: a Teletext-compatible character generator chip for low-memory, high-quality TV friendly 40 col text & block graphics (without eating your RAM); a 'proper' sound generator chip; analogue inputs (not audio frequency, but great for proper joysticks and school science experiments) and a 'user port' which made about half of a 6522 VIA chip available for digital I/O, a serial port, parallel port, proprietary expansion port & vacant sockets on-board for a floppy controller and 'econet' LAN... Plus a really decent keyboard (the kind with discrete key-switches for each key). Then there was Acorn's 'Tube' interface, which allowed you to hang off a 'second processor': i.e. a headless 6502, Z80 or (later) 32016-based computer that used the BBC as an I/O processor. (Of course, the really interesting one was the ARM second processor, but AFAIK that was never publicly available).

The Apple's advantages were (a) software base (but the BBC accumulated quite a big software base in the UK) and (b) internal expansion (the BBC had lots of expansion potential, but it was either via external interfaces or slightly kludgey piggyback boards). I think there were more options for upgrading an Apple II to a '64K clean' RAM configuration.

However, if you got the BBC 6502 second processor (a 4MHz 6502 with 64K RAM, with the original BBC handling all the I/O) then anything else with 8 bits (and quite a few things with 16) could eat your dust... unfortunately the price of that hampered adoption and, hence, software support (although you could play the definitive version of Elite).

I remember we had three of these on trolleys in my primary school - two had colour monitors, and one had a black and white monitor.
Somehow I managed to network/schmooze/brown-nose my way into becoming a "computer mover" when I was in the 5th year, along with two of my friends. We were tasked with moving the computers first thing on a Monday morning into a new classroom, which would then have them for a week. We'd plug it in, turn it on and load up the correct disk that the teacher wanted to use.
I think that's wher

Aside from Elite, one of the classic games for the BBC was Citadel. I'm still amazed how it managed to fit about 100 screens' worth of platform adventure game into 12K of memory without touching the disk after it had loaded. IIRC it ran in mode 2 - which took 20K out of the available 32K memory. I think they only used part of the screen and used the rest for storage, with some weird trick to make it invisible. The Electron version (see link) couldn't do the hiding trick somehow.

The BBC version also spoke to you when the menu program loaded up, and to this day I think of it as "Seeta-toddle", which gives you some idea of the audio quality.

The current ARM has little to do with the BBC Micro. Apple purchased a stake in Acorn with the goal of getting them to fab a low-powered CPU to power the Newton. While the Newton was never a success as a device, the technology and patents that resulted from the project set ARM on its current trajectory. In a round-about way the Newton did save Apple. At its darkest hour, the sale of Apple's holding in ARM netted $800M in cold hard cash when Apple needed it most. Without the Newton Apple wouldn't be wh

And if it weren't for Acorn, the ARM would never have existed for Apple to turn into $800M in cold hard cash. The ARM project started in 1983, years before Apple or anyone else for that matter even knew of the project. The BBC Micro was the seed; the ARM's original purpose was to go in the next BBC Micro (the Acorn Archimedes).

The current ARM has little to do with the BBC Micro. Apple purchased a stake in Acorn with the goal of getting them to fab a low-powered CPU to power the Newton.

...except that if the 6502 BBC micro hadn't happened, Acorn wouldn't have developed the ARM2/3 to use in the next gen BBC Micro and there wouldn't have been anything for Apple to buy in to. It may have evolved since then, but Apple sure as hell didn't invent the ARM.

The first ARM-based machines were the ARM2-powered Acorn Archimedes range, released in 1987, the entry level model of which was still branded as "BBC Micro". At the time, they kicked sand in the face of 80286-based machines. The Newton didn't appear until much later.

Cheekily, in 1994, Apple touted their new PowerPC-based Macs as the first RISC-based personal computers.

I think you're missing the point. ARM clearly lost to x86 in terms of the PC, and I think it's fair to contend that they would have become a footnote in computing if not for the capabilities they developed while working with Apple. Specifically, the reason ARM continues its success today is because the Newton needed ultra-low (for its time) power consumption. Apple spent almost 5 years working with ARM on that processor before releasing the Newton with the ARM6. I think it's fair to say there are bits of

At least I think they were BBCs. I remember they had this special hard plastic yellow thing that went in the floppy drive (a 3.5" IIRC) to keep it from being damaged when the machine was moved or something.

Great memories with this computer. And it was so far ahead of all competitors: even the predecessor of the Acorn BBC B (the Electron) already had 2MHz and 32KB RAM and was networkable using a thing called Econet. The BBC B+ could be expanded up to 128KB and had a second processor (we're talking 1986!!!!), teletext reader, lightpen that allowed you to draw by using a pen on your screen (think tablet!) and so on. And then the Archimedes with its 32-bit RISC CPU came in 1987 (!), doing 4 MIPS and offering a Wi

Multiplication is done largely the same way on the Z80 and 6502. You need additions and perhaps table lookups. The length of the product can be abstracted out in a loop if you want a generic multiprecision multiplication. If your multiplications are long enough, you will save cycles by using an FFT that uses many shorter multiplies that take less time than a naive long one. The only major difference is that the Z80 has 16-bit add/subtract. That's what you nebulously refer to by saying "Z80 could use 16 bit data words". Well
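The shift-and-add loop both CPUs boil down to looks like this (a sketch of the algorithm only, in Python; in real Z80/6502 code each step is an add or shift instruction, and the table-lookup variants trade this loop for precomputed squares):

```python
def mul8(a, b):
    """Multiply two 8-bit values by shift-and-add, giving a 16-bit product."""
    product = 0
    for bit in range(8):
        if b & 1:                 # low bit of the multiplier set?
            product += a << bit   # add the appropriately shifted multiplicand
        b >>= 1                   # move on to the next multiplier bit
    return product & 0xFFFF

print(mul8(200, 100))  # -> 20000
```

Eight iterations of add/shift per 8-bit multiply, which is exactly why a hardware 16-bit add (or a lookup table) is worth so much on these chips.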

Well, on a 6502, 16-bit addition is a whopping two instructions instead of one, so I don't see the problem, really.

Ah, but it's not as simple as that. You've got to fetch four bytes separately from memory, which takes longer than two word fetches.

The clock speed doesn't tell the whole story, though, because the 6502 used a split-phase clock. So, a single machine cycle takes two clock states - the fetch occurs on the leading phase and the execute on the lagging phase. Theoretically a 2MHz 6502 is rattling through instructions at roughly the same rate as a 4MHz Z80.

So even though the CPU was underpowered by the mid-80s, programs written in its BASIC could still blitz the BASIC programs of a lot of other, faster machines. Also it had procedures, which most (all?) other home computer BASICs lacked. Mind you, Amstrad BASIC had high-level interrupts which allowed a sort of early threading along the lines of

The BBC Micro at 2MHz was considerably faster than the Spectrum at 3.5MHz. The Z80 is a CPU that I like (I still write Z80 assembler, indeed I'm much more proficient at Z80 than 6502 and I've designed and made an ethernet card for the ZX Spectrum fairly recently as a fun retro project). However, we have to consider this. The fastest 6502 instruction executes in 2 T-states, most execute in 3 T-states, and the slowest take 7 T-states. The fastest Z80 instruction takes 4 T-states and the slowest over 20 T-states. The 6502 therefore has better interrupt latency (that monster 23 T-state index register instruction on the Z80 can't be interrupted).
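A back-of-envelope comparison using those T-state figures (the per-instruction averages below are my own illustrative assumptions, not measured instruction mixes):

```python
def instructions_per_second(clock_hz, avg_tstates):
    """Rough instruction throughput: clock rate over average T-states per instruction."""
    return clock_hz / avg_tstates

# 2MHz 6502, assuming ~3 T-states for a typical instruction (range 2-7).
beeb_6502 = instructions_per_second(2_000_000, 3)
# 3.5MHz Z80, assuming ~7 T-states for a typical instruction (range 4-20+).
speccy_z80 = instructions_per_second(3_500_000, 7)

print(f"6502: {beeb_6502:,.0f} instr/s, Z80: {speccy_z80:,.0f} instr/s")
```

Under those assumptions the 2MHz 6502 comes out ahead despite the lower clock, which is the point being made: clock speed alone tells you very little across architectures.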

The other thing the 6502 has going for it is the very fast zero page instructions, which are tantamount to giving you 256 extra registers.

The competing ZX Spectrum also had contended memory. Thanks to the 6502's predictable memory cycle when compared to the Z80, the BBC Micro designers could interleave screen memory access with CPU access, so no memory is contended. The Spectrum has to pause the processor while the ULA accesses the screen memory, meaning anything in the lower 16K of RAM takes a noticeable performance penalty (and you can't use the lowest 16K for anything timing-critical that must run while the ULA is reading the frame buffer).

Don't get me wrong, I love the Speccy, it's probably my favorite 8-bit (and I own several!) - it did an awful lot for very little money, it was immense value for money - but the BBC Micro had excellent performance for its time.

The BBC Micro at 2MHz was considerably faster than the Spectrum at 3.5MHz.

True, but it had far more screen RAM to update so it mostly evened out for games. The Beeb had slightly fancier graphics hardware though (eg. hardware screen scroll) and if you could leverage that you could do things that the Spectrum had no hope of doing.

My chemistry teacher had a BBC Micro sitting in the corner of the lab. I never saw it used, until near the end of the final term when I was 18 (2004). He ran a simulation of the electron cloud round a hydrogen atom, and admitted that he only used the machine once per year for this purpose.

My dad was a school IT teacher in the 1990s and early 2000s, so there were always lots of Acorn machines for me to play on. BBC Micros were old by then, but I remember an Archimedes A310 (A320?) which was borrowed from

My chemistry teacher had a BBC Micro sitting in the corner of the lab. I never saw it used, until near the end of the final term when I was 18 (2004). He ran a simulation of the electron cloud round a hydrogen atom, and admitted that he only used the machine once per year for this purpose.

What is it with chemistry teachers not being able to find elegant demonstrations of ideas outside of the BBC Micro? My own chemistry teacher did something similar, as did a chemistry teacher in a school I worked at for a year.

My mate had one of those Archimedes - red function keys, if I remember rightly?

I still had an Acorn Electron - but I had the Plus-3 disk drive and Plus-1 cartridge interface. Rendering the initial Mandelbrot set took me 8.5 hours. His machine then managed it in 15 seconds. Man was I gutted.

My first computer was an Acorn Atom... in fact I still have it in a box on top of my wardrobe... I might just see if it works.

Crazy thing about it was that my dad built it for me, not like today's computers, where 'building a rig' means slotting cards into slots on a PCB. Dad soldered the chips and other electronics into place directly on the PCB.