Posted
by
by timothy on Friday January 14, 2011 @06:05AM
from the ok-but-let's-only-rebuild-the-allies dept.

nk497 writes with this bit from PCPro: "The first working stored-program computer is set to be rebuilt at Bletchley Park, home to the UK's National Museum of Computing. The Electronic Delay Storage Automatic Calculator ran its first programme in 1949, and was two metres high. Its 3,000 vacuum tubes took up four metres of floor space, and it could perform 650 instructions per second. All data input was via paper tape. The EDSAC used mercury-filled tubes for memory, but in the interests of safety, the replica will use an alternative non-toxic substance. Rebuilding it will take four years, and the public can visit to watch the work as it happens."

Maybe. In theory you might get a scaled-down, bare-bones Linux to run, but even so you would be hard pressed to run any programs with it. Vacuum tubes were replaced by transistors in the '50s and '60s, which in turn were largely replaced by ICs in the '60s and '70s. A single vacuum tube performs the same function as a single transistor. The Z80 CPU chip, which came out in 1976, had 8,000 transistors, more than twice the tube count of this entire computer.

... since LEO [wikipedia.org], the first commercial business computer, was based on the EDSAC design. Amazingly LEO computers were still in use in 1981. Check out the LEO Computers Society [leo-computers.org.uk].

The Z80 processes around 40k instructions per second, compared to EDSAC's 650 IPS; that's about sixty times as fast.

That's pretty unfair on the Z80, too. To get anywhere near that figure, you'd have to take your basic 1 MHz Z80 and have it continually execute the longest possible instructions, which were rarely used and took 23 T-states (clock cycles); that would give you 43,478 IPS.

In practice, most Z80 instructions executed in the range of 4 to 13 clock cycles, and just about every Z80 I ever met back in the day was the 4 MHz part, so you're talking between 300k and a million instructions per second. So more like a thousand times as fast.
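For the curious, the arithmetic in these posts can be checked with a quick sketch (the clock speeds and T-state counts are the ones quoted above; nothing else is assumed):

```python
# Back-of-the-envelope Z80 vs EDSAC throughput comparison.
# T-state figures are from the discussion above: the slowest Z80
# instruction takes 23 clock cycles, typical ones take 4-13.

EDSAC_IPS = 650  # instructions per second

def z80_ips(clock_hz, t_states):
    """Instructions per second at a given clock and cycles-per-instruction."""
    return clock_hz / t_states

# Worst case on a 1 MHz part (the "unfair" figure):
worst = z80_ips(1_000_000, 23)   # ~43,478 IPS

# Typical range on the common 4 MHz part:
slow = z80_ips(4_000_000, 13)    # ~307,692 IPS
fast = z80_ips(4_000_000, 4)     # 1,000,000 IPS

print(f"1 MHz worst case: {worst:,.0f} IPS ({worst / EDSAC_IPS:,.0f}x EDSAC)")
print(f"4 MHz typical:    {slow:,.0f} - {fast:,.0f} IPS "
      f"({slow / EDSAC_IPS:,.0f}x - {fast / EDSAC_IPS:,.0f}x EDSAC)")
```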

Thing is, early computers were immensely different from the computers we have today. Addressing modes weren't fully thought out, instruction sets were esoteric and more suited to hand assembly, and even just getting information to and from memory wasn't quite what you'd expect. Both delay-line and drum memory were delay-based: you had to hit extremely tight timings to get the word you wanted.
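To make the "tight timings" point concrete, here's a toy model of a circulating delay-line store (all constants are made up for illustration, not EDSAC's real figures): a word is only readable during the brief slot when it emerges from the line, so if you miss it you wait out a full recirculation.

```python
# Toy model of a delay-line memory: words circulate continuously and a
# given word can only be read when it comes around to the output end.
# All constants are illustrative, not EDSAC's real figures.

WORDS_PER_LINE = 32   # words circulating in one tank
WORD_TIME = 1.0       # time units for one word to pass the output

def wait_for_word(current_time, word_index):
    """Time units the CPU must wait until word_index next emerges."""
    period = WORDS_PER_LINE * WORD_TIME
    # Where the circulation currently is, in word slots.
    now_slot = current_time % period
    target_slot = word_index * WORD_TIME
    return (target_slot - now_slot) % period

# Best case: the word is just emerging (zero wait).
print(wait_for_word(current_time=5.0, word_index=5))   # 0.0
# Worst case: we just missed it and wait nearly a full circulation.
print(wait_for_word(current_time=5.5, word_index=5))   # 31.5
```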

Linux leverages many modern conveniences and paradigms. Without heavy modification, it cannot run on a machine like this.

You'd need to reduce Linux's total footprint to 1024 instructions+data with no swapping, no hard disk, no networking, and all I/O through punched tape, but within those limitations it should run just fine.

Yet Grace Hopper managed to write compilers on not much better equipment. You can always mimic a complex instruction set with a reduced instruction set, but of course that will slow it down even more. PUT, GET, ADD, and SUB are theoretically the only instructions a CPU needs (did I miss one or two essential instructions?).
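As an aside, the instruction usually said to be missing from such a list is a conditional branch; without one you can't loop or make decisions. Here's a minimal sketch of an interpreter for such a tiny set (the mnemonics, plus the STA store-accumulator helper, are hypothetical, not any real machine's):

```python
# Minimal interpreter for a hypothetical tiny instruction set:
# PUT (store immediate), GET (load), ADD, SUB, plus JNZ (jump if
# nonzero), the conditional branch that makes loops possible.

def run(program, memory):
    """Execute a list of (op, *args) tuples against a dict memory.
    A single accumulator holds intermediate values."""
    acc, pc = 0, 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "PUT":     # PUT addr, value: store an immediate
            memory[args[0]] = args[1]
        elif op == "GET":   # GET addr: load memory into accumulator
            acc = memory[args[0]]
        elif op == "ADD":   # ADD addr: acc += memory[addr]
            acc += memory[args[0]]
        elif op == "SUB":   # SUB addr: acc -= memory[addr]
            acc -= memory[args[0]]
        elif op == "STA":   # STA addr: store accumulator back
            memory[args[0]] = acc
        elif op == "JNZ":   # JNZ target: branch if accumulator nonzero
            if acc != 0:
                pc = args[0]
                continue
        pc += 1
    return memory

# Multiply 6 * 7 by repeated addition, impossible without the branch.
mem = run([
    ("PUT", "result", 0),
    ("PUT", "count", 6),
    ("PUT", "one", 1),
    ("PUT", "x", 7),
    ("GET", "result"),   # index 4: top of loop
    ("ADD", "x"),
    ("STA", "result"),
    ("GET", "count"),
    ("SUB", "one"),
    ("STA", "count"),
    ("JNZ", 4),
], {})
print(mem["result"])  # 42
```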

That's basically in the range of the minimal RAM requirements of Contiki (and it's surely not the only such system).

While Contiki has a minimal RAM requirement of 2K, it also occupies 40K of ROM. The EDSAC had a total architectural maximum of 1024 words (albeit 18 bit words), but only 512 words were actually implemented. Still, I recall when I had a TRS-80 Model I that had a pretty functional version of BASIC implemented in only 4K of ROM and 4K of RAM. It's amazing what can be done in such constrained environments.

I'm at a loss for why it will take so long. I'm guessing it's because they'll have one guy working on it by himself during the weekends so he can avoid his nagging wife. When they asked him how long it would take him, he pulled "ahh! Four years!!" out of thin air.

Not so bad now that we know what a computer is. Requirements creep was of course a big problem in the early days, as a little experience in construction gave you 1,000 new ideas to try to implement.

Actually, EDSAC was originally built pretty quickly for the time (about 2 years) precisely because Wilkes, the project leader, decided to use only proven techniques and methods so as to supply a usable computing facility to Cambridge University, rather than extend the state of the art.

I think you're not far from the truth. The museum is run by volunteers, and depends on donations for income. They operate on a shoestring budget; this particular build will have dedicated funding, though.

I'm not sure that's a genuine stored-program computer. From what I recall, it was a very clever adding machine. Either way, it was a prototype rather than a fully complete, reliable machine that was used for research or commerce.

None of the early Zuse machines were stored-program computers: they had a relay memory for data and got their instructions from punched tape. The table on the Wikipedia page about the Z3 seems about right.

The Manchester Baby was the first stored-program machine, quickly followed by the modified ENIAC (the original used patch panels and cables) and then the EDSAC. Since the Baby was created to explore ideas for the EDSAC rather than as a usable machine on its own, I guess if you squint enough the article is right in an Obi-Wan kind of way. :-)

The Mark I and others should also be noted; by 1949 it was definitely not the first...

The development of computers that have all of the architectural features we consider standard took about 15 years and there were several steps in the process with each one having some sort of bragging rights. And deciding when the process was "done" and we had a fully modern architecture is something of a matter of judgment.

Back in the 1980s I researched exactly this question for a CS course project, and I examined the architectural details of every early computer to MANIAC and IAS or so. EDSAC was the comp

In fact Alan Turing himself pointed out that a mixture of alcohol and water would do the job as well as mercury (he wanted to use gin). Perhaps "mercury delay line" just sounded more techie to the Civil Service.

Mercury in a sealed tube is only as safe as the tube and the seal. There have to be arrangements to fill and empty the tube, and to allow for expansion. These are all potential weak points. I once had to condemn a piece of equipment built by an "electrician" which used 24 large mercury glass relays operated by rotary solenoids, in an open wooden box. The glass elements were rigidly attached, and each time they switched, the point of contact with the frame came under considerable pressure.

One broken switch element and an entire factory would have had to be evacuated.

That's insane. When I was a kid, there was mercury in a hell of a lot of places. Thermostats and thermometers in every home used mercury. If a thermometer broke, we kids would play with the mercury (fascinating metal).

And these thermometers and switches had been in use for a couple of generations by then, yet I saw no evidence that anyone was harmed by it.

Now, if you ingest it, from eating tuna fish or inhaling the dust from a broken fluorescent bulb, that's another matter.

It takes a lot of alcohol to poison anyone, and AFAICT it breaks down organically into harmless stuff, so releases aren't a concern.

AFAICT liquid mercury isn't hugely dangerous, simply because the body won't absorb much, but mercury vapour is worse, and organic compounds of mercury are worse still. That's a good reason for controlling its use.

Maybe they could use a gallium eutectic: you would preserve the ambience of a room-temperature liquid-metal device, with characteristics closer to mercury's than a water/alcohol mix (better acoustic impedance, non-corrosive, etc.). Although you see it for sale at prices of $15/g, the current metal-market price of high-purity gallium is only $0.70/g. An alternative is Cerrolow 117, a reasonably inexpensive commercial alloy used for making mold prototypes, which melts at 117 degrees F. Adding a small heating element would keep it molten.

The method of storage in the Baby - a static charge used to represent 1 or 0 - proved to be the most effective form of storage for RAM (as static and dynamic CMOS) and is becoming more and more of a competitor for hard drives. Though CRT memory was short lived, in the long run Williams proved to be right. The Baby was prescient.

I was at Bletchley Park a couple of months ago, and by chance the National Museum of Computing [tnmoc.org] was open that day. They've got some interesting displays of old computers, and their goal is to get them all running again. They cover everything from the EDSAC era up to modern computers. Their oldest computer is a Harwell WITCH from 1951 (a decimal computer), which is being restored at the moment. Other fun stuff includes a collection of calculators, and a BBC Micro with a working BBC Domesday Project laserdisc installation.

It's a separate museum on the Bletchley Park grounds, and its opening times are a bit limited (esp. in winter), so check before you go.

Also of interest is the 1949 CSIR Mark 1 (CSIRAC), which is held at the Museum of Victoria [museumvictoria.com.au] in Melbourne (unfortunately no longer on display). Because of its historical value, there is no intention to restore it to working order.

I'd love to visit Bletchley Park one day though if I'm ever on that side of the world.

I love the fact that there is a common desire to preserve our historic technological achievements.

Working reproductions of dying / dead machines are a great learning tool -- We are all truly standing on the shoulders of giants today.

I feel that efforts such as rebuilding the EDSAC are in the same vein as those that create emulators [warwick.ac.uk] for our out-of-production computers and video game systems as a cheap way to preserve the past.

What good is the EDSAC or an emulator without a sampling of the programs those systems used to run? Surely different people would attribute different degrees of importance to different programs -- Thankfully, digital storage is abundant and cheap enough that we are capable of preserving entire catalogs of programs.

Notice, however, that the more relevant, beneficial and useful a replica or emulator is, the more illegal it is to produce due to patents and copyrights. I fear that if the current copyright laws could be enforced absolutely, we stand to lose important parts of our history and culture for no other reason but greed. Given the long terms of copyright, it's a safe assumption that much of our digital heritage could decay and be lost before it's legal to reproduce it -- Even under good conditions, CDs, magnetic drives and solid-state drives will all fail long before the author's-life-plus-70-years term has elapsed.

I'm very wary of DRM and the DMCA -- Today we can recreate past works to better understand the significance of the shoulders on which we stand; Tomorrow we may find ourselves searching for footing that has long since crumbled away.

You'll be happy to know that the DMCA exempts the Library of Congress then. :) I believe portions have also been clarified such that archivists are not only allowed to store digital backups, but can reverse the DRM if they can prove it is for archival purposes.

Of course, you'll probably also be happy to know that the DMCA doesn't mean a thing in most of the world, and that somewhere like Belarus will likely be the "digital Iona" of the future....

Sadly, just making it legal to break DRM doesn't mean the DRM will be broken. E.g. my brother's Zune will not sync with his Linux machine because of encryption -- He's now experiencing the vendor lock-in that I warned him of.

The point being: once DRM is perfected, it may not matter whether it's legal for you to break it -- Encryption done right is effectively infeasible to break. And the DMCA does not require the DRMmers to jailbreak your device for you -- What happens if you can't do it any other way?

It's great to see that EDSAC will be rebuilt! I wonder if Maurice Wilkes, the project leader, was told before he passed away just this last November? He was probably the last of the "first generation" computer pioneers to pass away. Several slashdot stories of his passing were submitted, but I don't think it ever made the main page. At least he can get his props here now.

Can no one look up and confirm well-known facts? Heck, this stuff is still within living memory.
The article claims that EDSAC was the "first working stored-program computer" and that is just wrong.

The Manchester Small-Scale Experimental Machine [often known as "Baby"] was the first stored-program computer, not EDSAC. Baby was operational in June 1948;
EDSAC didn't run anything until May 1949. Please don't play semantics with the word "working"; Baby worked, and in any case, all of these early computers were experimental machines to some degree.