A look back at the first decade of technology that led to the software revolution.

On June 21, 1948, at what was then known as the Victoria University of Manchester, software was born. On that day 65 years ago, a proof-of-concept computer called "Baby"—officially designated as the Manchester Small Scale Experimental Machine—ran the first "program" retrieved not from paper tape or hard-set switches, but from random-access memory.

Designed by Frederic C. Williams, Tom Kilburn, and Geoff Tootill, Baby wasn't the first programmable computer. But the technology proven in Baby, with its 1,024 bits of cathode-ray-tube-based RAM, would become the basis of the first commercial computers.

In celebration of Baby's birthday, here's a look back at the first decade of computing and the computers that led to the birth of software and the computing revolution that would follow.

The Z3, designed and built in 1941 by Konrad Zuse, was the first programmable computer. But it performed all of its functions with 2,000 relays of the kind used in the telephone switches of the day, and it stored its programs on external tape. Used for statistical studies in aircraft engineering, the original Z3 was destroyed in 1943 during an Allied bombing raid on Berlin.

Colossus, designed by British engineer Tommy Flowers of the Post Office Research Station as a codebreaking machine, used tubes rather than relays for processing. Its first version, the Mark 1, was completed in 1943 with 1,500 tubes; an improved Mark 2, with 2,400 tubes (or "thermionic valves"), was completed a year later.

A total of 10 Colossus computers were built for use during World War II at Britain's cryptographic center at Bletchley Park, but they were destroyed, along with their blueprints, to preserve the secrecy of the codebreaking program. Enough details survived to allow the construction of a replica at Bletchley Park, now home to The National Museum of Computing.

The first electronic "general purpose" computer, ENIAC was used to calculate ballistic tables for aiming US Army artillery. Designed by J. Presper Eckert and John Mauchly of the University of Pennsylvania's Moore School of Electrical Engineering, the "Giant Brain" was unveiled in Philadelphia in February 1946, too late to play a role in the war.

In 1947, ENIAC was moved to the Army's Aberdeen Proving Ground, where it remained in operation until 1955. It used 17,468 vacuum tubes and was programmed through the use of switches and cables.


"Baby," the Manchester Small-Scale Experimental Machine (SSEM), was the first "stored program" computer. Built at the Victoria University of Manchester, the SSEM was designed by Frederic C. Williams, Tom Kilburn, and Geoff Tootil as a proof-of-concept machine for a new type of computer memory developed by Williams and Kilburn—an early form of dynamic RAM based on a cathode ray tube.

Programs run on Baby were set with a series of red input switches. Since it was intended only as a test bed, it could perform just subtraction and negation; to add two numbers, you subtracted the negative of one from the other.
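
To make that concrete, here is a minimal sketch in Python (not Baby's actual instruction set, though the ldn/sub/sto names are loosely modeled on its instructions) of how addition falls out of a machine that can only negate and subtract:

    # A toy accumulator that, like Baby, can only load a value negated
    # and subtract; there is no add instruction.
    class ToyAccumulator:
        def __init__(self):
            self.acc = 0
            self.store = {}

        def ldn(self, value):      # load negated, roughly Baby's LDN
            self.acc = -value

        def sub(self, value):      # subtract, roughly Baby's SUB
            self.acc -= value

        def sto(self, name):       # store the accumulator
            self.store[name] = self.acc

    # Adding 5 + 7 with only negation and subtraction:
    m = ToyAccumulator()
    m.ldn(5)                # acc = -5
    m.sub(7)                # acc = -12, i.e. -(5 + 7)
    m.sto("t")              # save the intermediate result
    m.ldn(m.store["t"])     # load it negated: acc = 12 = 5 + 7
    print(m.acc)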

The results of programs run on Baby were output as a binary matrix of dots on a cathode ray tube.

The success of Baby led to the Manchester Mark 1, completed in 1949. The Mark 1 was programmed using code punched into paper tape and generated output using a "printer": an International Telegraph Alphabet No. 2 (ITA2) teleprinter that encoded text as holes punched in paper tape. ITA2 is a predecessor of ASCII. Alan Turing devised a scheme that used ITA2's five-bit code to write programs and output in base 32, with the machine converting the base-32 notation to binary in order to run the program.
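
As a rough illustration of the base-32 idea (the symbol table below is a stand-in, not the actual ITA2 character assignment used at Manchester), here is a small Python sketch that splits a binary word into five-bit digits and back:

    # Illustrative only: a binary word read as 5-bit groups, each group
    # (0-31) rendered as one of 32 teleprinter-style symbols.
    DEMO_SYMBOLS = "0123456789ABCDEFGHIJKLMNOPQRSTUV"   # stand-in alphabet

    def to_base32(word, groups=4):
        """Split a binary word into 5-bit digits, low digit first."""
        digits = []
        for _ in range(groups):
            digits.append(word & 0b11111)   # take the low five bits
            word >>= 5
        return digits

    def from_base32(digits):
        """Reassemble the binary word from its 5-bit digits."""
        word = 0
        for d in reversed(digits):
            word = (word << 5) | d
        return word

    value = 0b10110_01101_00010_11111        # a 20-bit example word
    digits = to_base32(value)                # [31, 2, 13, 22]
    print("".join(DEMO_SYMBOLS[d] for d in digits))
    assert from_base32(digits) == value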

The Mark 1 was the basis for the Ferranti Mark 1, the world's first commercial computer, which was released in 1951. The Ferranti also holds the distinction of being the first computer to generate music—it was programmed to play "Baa Baa Black Sheep" and "God Save the King" for a visit to Manchester by the BBC in 1951.

Promoted Comments

It's also notable that the idea of stored programs first became a widely-known concept in 1945 by way of the report "First Draft of a Report on the EDVAC", by John von Neumann. Hence, the concept of stored programs has come to be known as von Neumann architecture. Others on the EDVAC team claimed that von Neumann was simply re-stating ideas that were originally developed by others at UPenn (specifically, the Moore School of Electrical Engineering).

Is it just me, or is the 20th century the most mind-boggling 100 years of human history? It seems like things advanced so slowly up until that time, and then look at everything that was developed and changed between 1900 and 2000. Will this insane trajectory continue unabated, or was it really just a renaissance period like the Industrial Revolution? Or - does the progress only seem so significant because it's so recent, and maybe 500 years from now, the progression during that time will appear similar to any other period?

Holy moly, it must've been some sheer geniuses who invented/programmed these beasts. Truly amazing. And some of us think we're hot stuff when we know how to program a bit. Seeing these ancient computers is really humbling. The pioneers of computing must've been mind-blowingly smart.

Is it just me, or is the 20th century the most mind-boggling 100 years of human history? It seems like things advanced so slowly up until that time, and then look at everything that was developed and changed between 1900 and 2000. Will this insane trajectory continue unabated, or was it really just a renaissance period like the Industrial Revolution? Or - does the progress only seem so significant because it's so recent, and maybe 500 years from now, the progression during that time will appear similar to any other period?

The 19th century seems very similar to the 20th in that regard. Consider the state of society & civilization at the founding of the US in comparison to 1900. There were massive changes, especially with regard to industrialization and the development of electricity. Also, the theoretical and scientific bases for a lot of 20th century inventions were developed in the 19th century.

Going on a century-by-century scale it seems like things really took off in the 19th century and accelerated in the 20th.

Holy moly, it must've been some sheer geniuses who invented/programmed these beasts. Truly amazing. And some of us think we're hot stuff when we know how to program a bit. Seeing these ancient computers is really humbling. The pioneers of computing must've been mind-blowingly smart.

Actually they were quite simple compared to today's computers. Building them was a challenging project given the limitations of technology at the time, but programming early stored-program computers is very easy, if somewhat inconvenient, given the need to enter programs via switches, paper tape, cards, etc.

One of my favorite early computers, if not quite as old as Baby, is the IBM NORC. NORC, like many early computers, was a decimal computer (not binary), and it had amazingly powerful machine instructions for I/O -- like one machine instruction to read a whole record off of tape into memory.

I don't think it is a coincidence that the explosion of tech started soon after the American revolution. Don't forget, the Patent Office wanted to close in the early 1800s because "everything had been invented." But the political revolution of America, and the subsequent spreading of those ideals, is largely responsible for it all.

Consider: America as originally envisioned (and not what it has become) was all about enabling the everyman, as it were. You could get an Edison, a Gatling, and scores, if not hosts more, from any class, not just the landed, noble gentry. The industrial revolution started in Britain, but it took off in America.

Once the Civil War was over (the first modern war), innovation continued at breakneck speed to the electricity revolution. Marconi, Edison, Morrisson, Farnsworth--the concepts of radio, electric appliances, the telegraph, then the television--those were truly worldshaking revelations, bringing the world closer.

Then you get Ford and the Wright brothers. Automobiles and air travel, also bringing the world so much closer.

I would submit that after WWII, we've sort of stagnated a bit, actually. Someone born in 1900 would have been truly astonished to see 1950. Someone born in 1875 in a covered wagon somewhere in the western United States who lived until dying in 1960 (85 years)? Think of what they saw: from the railroads and horse and buggy to catching a flight to Tokyo from New York and watching people orbit the world in space on that box in their front room. If they lived exceptionally long, to 1970, they could watch a man walk on the Moon. What a staggering amount of progress in one human lifespan.

The last half of the 20th century has mostly focused on one thing: computers, and the internet. Is that epochal, fundamentally life transforming, like the automobile or radio? I don't know. I don't see what is coming as the next great wave, either.

Governments are regaining their absolute power of the King over common people. There's really no difference between Obama, Putin, or Cameron and King George II or Louis XIV, is there? They all want absolute power over their citizens. This will most likely stagnate everything once again.

Actually they were quite simple compared to today's computers. Building them was a challenging project given the limitations of technology at the time, but programming early stored-program computers is very easy, if somewhat inconvenient, given the need to enter programs via switches, paper tape, cards, etc.

One of my favorite early computers, if not quite as old as Baby, is the IBM NORC. NORC, like many early computers is a decimal computer (not binary) and it had amazingly powerful machine instructions for I/O -- like one machine instruction to read a whole record off of tape into memory.

Yeah, the innovation has definitely occurred in different areas of computers over the years. I think that, in the 40's and 50's, most of the challenge was simply making a soundly-designed system. Things like data representation, memory, and I/O weren't standardized, so you had to solve minutiae related to all those problems just to make your machine function.

On the other hand, they couldn't make electronic components of high enough quality to start considering multiple levels of cache, pipelining, out-of-order execution, and all the other stuff that really makes a CPU complex these days. They simply wouldn't have been able to budget enough logic gates to implement those features well.

So, as the first stored-program computer, could Baby run self-modifying code? It's interesting how the stored-program concept was eventually extended with the idea that programs and data be treated uniformly, allowing self-modifying code, but recently we seem to be pulling back from that (in most cases) by making it much harder to write self-modifying code (it causes large slowdowns because of cache flushing and so on in modern processors, and many OSes have started to write-protect program memory for enhanced security).

We haven't gotten rid of the idea of "stored program", but now recognize that allowing code modification is a rare and special privilege, granted only when necessary to things like the OS, compilers, and virtual machines.

but programming early stored-program computers is very easy, if somewhat inconvenient, given the need to enter programs via switches, paper tape, cards, etc.

I guess, if you are an assembly-level/machine-code-level programmer anyway. Maybe if you are a good general programmer? I admit I'm more just a scripter... but I think it is very difficult. I did some very, very basic assembly on a simulated 8-bit RISC in school. Sure, it is easy enough to do some basic math or whatever. But I can tell that building a large-scale (relatively speaking) program out of stuff at that low a level would melt my brain.

We haven't gotten rid of the idea of "stored program", but now recognize that allowing code modification is a rare and special privilege, granted only when necessary to things like the OS, compilers, and virtual machines.

In early stored-program computers, self-modifying code was both normal and expected. Limited hardware resources meant a limited number and complexity of instructions available, which meant that self-modifying code was usually the only way to perform subroutine calls, indexed addressing of memory, etc.
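
As a toy illustration of that trick (Python standing in for machine code; EDSAC's "Wheeler jump" worked on roughly this principle), here is a program with no call stack that "calls" a subroutine by first overwriting the subroutine's exit jump with the return address:

    # Toy stored-program machine: instructions and their operands live in
    # ordinary, writable memory, so the program can rewrite itself.
    def run(mem):
        acc, pc = 0, 0
        while True:
            op, arg = mem[pc]
            pc += 1
            if op == "LOAD":
                acc = arg
            elif op == "DOUBLE":              # stand-in for real work
                acc *= 2
            elif op == "PRINT":
                print(acc)
            elif op == "JMP":
                pc = arg
            elif op == "PLANT":
                # Self-modification: write "JMP <return address>" into
                # slot arg so the subroutine knows where to come back to.
                mem[arg] = ("JMP", pc + 1)
            elif op == "HALT":
                return

    mem = [
        ("LOAD", 3),      # 0: acc = 3
        ("PLANT", 8),     # 1: plant "return to 3" in the subroutine's exit slot
        ("JMP", 6),       # 2: first "call"
        ("PLANT", 8),     # 3: plant "return to 5"
        ("JMP", 6),       # 4: second "call"
        ("HALT", None),   # 5: done
        ("DOUBLE", None), # 6: subroutine body: acc *= 2
        ("PRINT", None),  # 7: ...and print it
        ("JMP", 0),       # 8: exit jump, overwritten before every call
    ]
    run(mem)              # prints 6, then 12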

but programming early stored-program computers is very easy, if somewhat inconvenient, given the need to enter programs via switches, paper tape, cards, etc.

I guess, if you are an assembly-level/machine-code-level programmer anyway. Maybe if you are a good general programmer? I admit I'm more just a scripter... but I think it is very difficult. I did some very, very basic assembly on a simulated 8-bit RISC in school. Sure, it is easy enough to do some basic math or whatever. But I can tell that building a large-scale (relatively speaking) program out of stuff at that low a level would melt my brain.

Assembly language programming is no different than any other type of programming. You just have to practice it enough to internalize the virtual machine model it offers and then it becomes natural. Of course even then it still requires more work to do anything, but it doesn't require extraordinary intelligence.

Also, early computers were easier to program at the machine level than today's computers and the extremely small amounts of memory available meant programs didn't get too big.

Actually they were quite simple compared to today's computers. Building them was a challenging project given the limitations of technology at the time, but programming early stored-program computers is very easy, if somewhat inconvenient, given the need to enter programs via switches, paper tape, cards, etc.

Sure, they are simple given what we know NOW, but I couldn't imagine figuring that stuff out from scratch. It took quite a mind to even comprehend the sheer IDEA of programming, let alone the very first implementation of such a thing.

Huh? No, it's not. This article is about Baby, a British computer that was the first to store its programs in memory. Grace Hopper was American and first worked on an American computer, the Harvard Mark I, which was NOT a stored-program computer. It used paper tape.

Import Table

One section of note is the import address table (IAT), which is used as a lookup table when the application is calling a function in a different module. It can be in the form of both import by ordinal and import by name. Because a compiled program cannot know the memory location of the libraries it depends upon, an indirect jump is required whenever an API call is made. As the dynamic linker loads modules and joins them together, it writes actual addresses into the IAT slots, so that they point to the memory locations of the corresponding library functions. Though this adds an extra jump over the cost of an intra-module call, resulting in a performance penalty, it provides a key benefit: the number of memory pages that need to be copy-on-write changed by the loader is minimized, saving memory and disk I/O time. If the compiler knows ahead of time that a call will be inter-module (via a dllimport attribute), it can produce more optimized code that simply results in an indirect call opcode.
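
As a loose analogy (the names and structure here are illustrative, not the real PE layout), the mechanism can be sketched in Python as a table of slots that every inter-module call goes through, with a "loader" filling the slots in at load time:

    import math

    # Stand-in import address table: one slot per imported function,
    # unresolved until the "loader" binds it.
    iat = {"sqrt": None, "log": None}

    def loader_bind_imports():
        """Stand-in for the dynamic linker writing addresses into the IAT."""
        iat["sqrt"] = math.sqrt
        iat["log"] = math.log

    def program():
        # Every inter-module call is an indirect call through the table,
        # one extra lookup compared with calling math.sqrt directly.
        return iat["sqrt"](2.0) + iat["log"](math.e)

    loader_bind_imports()
    print(program())      # ~2.414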

Relocations

PE files do not contain position-independent code. Instead they are compiled to a preferred base address, and all addresses emitted by the compiler/linker are fixed ahead of time. If a PE file cannot be loaded at its preferred address (because it's already taken by something else), the operating system will rebase it. This involves recalculating every absolute address and modifying the code to use the new values. The loader does this by comparing the preferred and actual load addresses and calculating a delta value. This is then added to the preferred address to come up with the new address of the memory location. Base relocations are stored in a list and added, as needed, to an existing memory location. The resulting code is now private to the process and no longer shareable, so many of the memory-saving benefits of DLLs are lost in this scenario. It also slows down loading of the module significantly. For this reason rebasing is to be avoided wherever possible, and the DLLs shipped by Microsoft have base addresses pre-computed so as not to overlap. In the no-rebase case PE therefore has the advantage of very efficient code, but in the presence of rebasing the memory usage hit can be expensive. This contrasts with ELF, which uses fully position-independent code and a global offset table, trading execution time against memory usage in favor of the latter.
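
The rebasing arithmetic described above can be sketched in a few lines of Python (all values made up for illustration): the loader computes a delta between the actual and preferred base addresses and adds it to every absolute address named in the relocation list:

    preferred_base = 0x10000000
    actual_base    = 0x12340000
    delta = actual_base - preferred_base      # 0x02340000

    # "Image" memory holding absolute addresses computed against the
    # preferred base, plus a list of the offsets that contain such addresses.
    image = {0x1000: 0x10002040, 0x2000: 0x10005F00, 0x3000: 42}  # 42 is plain data
    relocations = [0x1000, 0x2000]            # only these offsets get patched

    for offset in relocations:
        image[offset] += delta                # rewrite each absolute address in place

    print(hex(image[0x1000]))                 # 0x12342040, inside the new load range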

I don't think it is a coincidence that the explosion of tech started soon after the American revolution. Don't forget, the Patent Office wanted to close in the early 1800s because "everything had been invented." But the political revolution of America, and the subsequent spreading of those ideals, is largely responsible for it all.

No. You are revising history with an America-only viewpoint. The Industrial Revolution had nothing to do with America nor its patent office. It started in Great Britain (with King and all!) in the late 1700s. http://en.wikipedia.org/wiki/Industrial_Revolution. At that time America, as a continent, was still mostly a remote backwater, though with plenty of room to grow.

Someone born in 1875 in a covered wagon somewhere in the western United States who lived until dying in 1960 (85 years)? Think of what they saw: from the railroads and horse and buggy to catching a flight to Tokyo from New York and watching people orbit the world in space on that box in their front room. If they lived exceptionally long, to 1970, they could watch a man walk on the Moon. What a staggering amount of progress in one human lifespan.

You're talking partly about the amazing progress sometimes known as the Second Industrial Revolution (http://en.wikipedia.org/wiki/Second_Ind ... Revolution). There have been amazing developments since then in every conceivable field. Thanks to several of these you now have access to the world's knowledge from a tiny device carried in your pocket, lifespans are longer, we have not one but two rovers on another planet, and we are probing the depths of space and time and discovering the tiniest of particles. We live in amazing times.

The last half of the 20th century has mostly focused on one thing: computers, and the internet. Is that epochal, fundamentally life transforming, like the automobile or radio? I don't know. I don't see what is coming as the next great wave, either.

You don't think the Internet is a fundamental shift in the trajectory of humanity? You must be the most jaded person who has access to it. Or you were born in the 90s.

I would add the shopping mall: a tangible improvement in material abundance. With internet shopping you can easily research and buy the most suitable items from another part of the planet.

Google, EBSCO, and Wiki have made it possible to quickly gain very specific knowledge. Internet debates also help in building a very specific knowledge base.

On the downside, it is harder to live the 1970s/'80s American dream because of the economic downturn. Manufacturing workers now face worldwide competition in tradable goods...

It's also notable that the idea of stored programs first became a widely-known concept in 1945 by way of the report "First Draft of a Report on the EDVAC", by John von Neumann. Hence, the concept of stored programs has come to be known as von Neumann architecture. Others on the EDVAC team claimed that von Neumann was simply re-stating ideas that were originally developed by others at UPenn (specifically, the Moore School of Electrical Engineering).

Since Eckert and Mauchly were under government contract, they couldn't appear in any articles. Von Neumann was not, so he was free to promote himself.

Eckert and Mauchly were frauds, and their fraud was why Honeywell was able to get Remington Rand's patent thrown out.

John Vincent Atanasoff at Iowa State College constructed a desk-sized, special-purpose computer several years before ENIAC came into being. John Mauchly visited JVA's lab in Ames and spent an afternoon examining the machine and seeing it in operation.

In some ways the ABC was more advanced than the ENIAC despite being special purpose. It was the absolute first computer to use only electronic logic and to perform calculations in base 2. It also implemented the first form of regenerative memory: a rotating drum of capacitors frequency-locked to the computer's logic by the 60 Hz signal of the power supply. The compute elements consisted of banks of five and six vacuum tubes joined together in removable bunches, each implementing a chunk of logic. This made it easier to service and test the tubes, as each group behaved like a logic IC.

The use of regenerative memory meant that the tube count was quite low (roughly 300), as intermediate results could be stored between clock cycles; the entire machine was roughly the size of a large workbench and operated on outlet power.

Compared to other computers of the time it was quite slow, as it had to perform operations over multiple cycles. In essence it was the primogenitor of RISC-style computing.

Its limitation was that its logic was designed for solving linear equations; it would have needed extra operations in order to be Turing complete.

It's not that I don't believe you; it's just that I've read some books on how the two were made to look like frauds, so I'm asking where you got your information and whether you can provide a link if possible. I'd like to read it for myself just to get a better understanding of it. I haven't gone too deep into this topic.

"Eckert and Mauchly did not themselves invent the automatic electronic computer, but instead derived that subject matter from one Dr. John Vincent Atanasoff." —U.S. District Judge Earl R. Larson, ruling on Honeywell v. Sperry Rand, April 1973
