The Cloud Is Much, Much Older Than You Think | Motherboard

The cloud happened fast. That seems obvious, of course, because most things in computing happen fast, especially over the past decade. But the cloud achieved its ubiquity in the background, at least from a consumer’s point of view. Circa 2016, outsourcing computing to an Amazon, Microsoft, or Google cloud service is just obvious—almost unsaid.

As a concept, the cloud is much older than you might imagine—tomorrow happens to be its 76th birthday. That’s right, the first cloud computing demonstration took place in 1940, an event predating by several years the general-purpose digital computers that would inform our very sense of what a computer is. At a meeting of the American Mathematical Society at Dartmouth College in Hanover, New Hampshire, an operator entered calculations devised by often-incredulous attendees into a teletype terminal, which sent them via telephone lines to New York City, where they were received by George Stibitz’s Complex Number Calculator.

George Stibitz was a key figure in the development of the earliest digital computers, to say the least. In a 1942 memo, he suggested that the term “pulse”—then used generally to describe computing with binary-encoded (on/off, 1/0) information—be replaced with “digital.” A few years earlier, in the late 1930s, Stibitz had built the Model K, a device based on electromechanical relays that performed binary addition. It was assembled from components salvaged from the hardware scrap heap at Bell Labs, where he was employed as a research mathematician.

In 1939, Stibitz completed the Complex Number Calculator, which, as one might guess, was designed to do calculations on complex numbers. It contained parallel circuitry for handling both real and imaginary components.

“What George Stibitz realized was, that a relay calculator could perform not just one but a sequence of calculations, with relay circuits directing the order and storing interim results as needed,” explains an entry at History of Computers. “Specifically, it could perform the sequence of operations required to perform multiplication and division of complex numbers: two mathematical operations that researchers elsewhere at the Bell Labs frequently performed in connection with filter and amplifier design for long-distance circuits. At Labs in the 1930s, a roomful of human ‘computers’ figured complex number quotients and products using commercial mechanical calculators.”

Here, all of those operations undertaken by human calculators were replaced by circuits in a single machine. Complex multiplication required about six individual computations, while complex division required about 12. To string them together, the machine needed only to implement a small amount of intermediate memory. The Complex Number Calculator worked well for its given tasks, but it was not programmable.
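Those operation counts fall out of ordinary complex arithmetic. As a rough sketch (this is the textbook algebra, not a reconstruction of Stibitz’s actual relay circuitry), multiplication of two complex numbers reduces to four real multiplications plus two additions—the “about six individual computations” above—while division, done by multiplying through by the conjugate, roughly doubles the work:

```python
def complex_multiply(a, b, c, d):
    """(a + bi) * (c + di): four real multiplications and two
    additions/subtractions -- about six operations in all."""
    real = a * c - b * d   # two multiplications, one subtraction
    imag = a * d + b * c   # two multiplications, one addition
    return real, imag

def complex_divide(a, b, c, d):
    """(a + bi) / (c + di): multiply by the conjugate (c - di), then
    divide by c^2 + d^2 -- roughly twice the work of multiplication."""
    denom = c * c + d * d           # two multiplications, one addition
    real = (a * c + b * d) / denom  # two multiplications, one addition, one division
    imag = (b * c - a * d) / denom  # two multiplications, one subtraction, one division
    return real, imag

print(complex_multiply(1, 2, 3, 4))  # (1+2i)(3+4i) = -5 + 10i
print(complex_divide(-5, 10, 3, 4))  # recovers 1 + 2i
```

Each intermediate product here has to land somewhere before the next step can use it, which is exactly why the machine needed that small amount of relay-based intermediate storage.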

In a 1967 article for Datamation, Stibitz himself described that first cloud connection, stretching hundreds of miles from Dartmouth to New York. “With my usual genius for making things more difficult for myself and others,” he wrote, “I suggested direct telephone operation.” Input panels were installed at the auditorium and arrangements were made for a dedicated telephone circuit between the two locations that would be protected from interruptions.

The crowd of mathematicians was sufficiently wowed by the sight of a teletype machine returning difficult results from a distant digital machine in mere minutes, but the Complex Number Calculator was enormously, prohibitively expensive. Design, construction, and debugging had cost $20,000—“an astronomical sum in those days,” Stibitz wrote. Bell Labs wasn’t going to invest further.

Given the computers of the time—and those for several decades thereafter—cloud computing would seem to make a lot of sense. The computers were ungainly and immobile, yet here was this country newly crisscrossed with media for electronic communications. In a sense, the contemporary cloud seems as much a late-breaking technological revolution as it does an acknowledgement of what was already there.