Posted
by
Unknown Lamer
on Monday December 10, 2012 @11:01PM
from the before-her-time dept.

First time accepted submitter MrBeeudoublez writes "Honored by a Google Doodle, Ada Lovelace is widely regarded as the first computer programmer. From the article: 'Ada's life as a member of British society (first as the daughter of Lord Byron, and later as the wife of the Count of Lovelace), brought her into contact with Charles Babbage, whose concepts for mechanical calculating machines (early computers) she took a great interest in. Ultimately, her work on explaining Babbage's design for the Analytical Engine resulted in her being credited as the first true computer programmer in history, even if the computer she programmed for was not actually built until 2002.'"

According to Wikipedia [wikipedia.org], the ancient Greek mathematician Hero of Alexandria invented "a programmable cart that was powered by a falling weight. The 'program' consisted of strings wrapped around the drive axle."

Perhaps this *will* diminish Ada's contributions:
http://www.answers.com/topic/ada-lovelace#Controversy_over_extent_of_contributions [answers.com]
Choice quote: "Not only is there no evidence that Ada ever prepared a program for the Analytical Engine but her correspondence with Babbage shows that she did not have the knowledge to do so."
The depressing lack of female role models in CS is a real problem, but revisionist history is not a valid solution.

Except that she didn't. Zuse envisioned Plankalkül; he just lacked the technical means at the time to make it work. But I think he's a strong candidate for being the inventor of the notion of a formal, high-level, algebraic language for digital computer programming.

One huge difference.... Hopper actually wrote COBOL and put the concepts into practice by doing it.

Sure, you could read the notes of some of the early theoreticians who played with the idea of computing theory, like Von Neumann and Turing, but I give more cred to those who actually did the work.

Ada Lovelace did precede the work of Konrad Zuse by nearly a century. Nothing against Zuse, as he certainly did a whole lot to advance computing in the 20th century as well. Sometimes you need to go in baby steps before something actually happens; that is simply how scientific progress is made. If you've done something useful in this world with your life, it is that you've explored the frontiers of human knowledge and perhaps contributed a little bit more, so others can go just a little bit further.

One huge difference.... Hopper actually wrote COBOL and put the concepts into practice by doing it.

Yes, and COBOL already came after FORTRAN and LISP (the latter of which was much more powerful than COBOL, BTW), so how is that relevant to this?

Sure, you could read the notes of some of the early theoreticians who played with the idea of computing theory, like Von Neumann and Turing, but I give more cred to those who actually did the work.

Except that was "the actual work". We don't celebrate Pythagoras for discovering the Pythagorean theorem, nor do we non-celebrate him for not having a pocket calculator to use it routinely in practice (who wants to calculate all those pesky square roots by hand?). We celebrate him for proving that it works. This is computer science, not business or politics.

On the contrary. Most incredible ideas are built upon the ideas of others that have contributed in the past. Isaac Newton would not have been able to write Principia Mathematica or even Optics without involving the work of a great many people before him, and Albert Einstein could not have conceived of General Relativity without having studied the work of Newton and other great physicists of the past.

The same thing is certainly true of the mathematicians who participated in the development of mechanical computation.

So? Da Vinci not only envisioned, but drew plans of, airplanes. But I don't see anyone crediting him with their invention. Because, you see, having a notion that something can be done is not the same as doing it. To be credited with inventing something, you have to actually make that something exist.

She was responsible for a lot more than that. She found the first computer bug, when she "...traced an error in the Mark II to a moth trapped in a relay. The bug was carefully removed and taped to a daily log book. Since then, whenever a computer has a problem, it's referred to as a bug." Here it is [navy.mil].

The problem isn't COBOL by itself. The problem is that COBOL is seen by some as the last and greatest language that has ever been invented and that nothing else could improve upon it other than minor refinements of the language. That is what gives us crazy junk like object-oriented COBOL [wikipedia.org].

Rethinking computer programming language design has given us a multitude of languages and conceptual models that we can apply to software design and substantially improve the quality of the software being produced.

"I then suggested that she add some notes to Menabrea’s memoir, an idea which was immediately adopted. We discussed together the various illustrations that might be introduced: I suggested several but the selection was entirely her own. So also was the algebraic working out of the different problems, except, indeed, that relating to the numbers of Bernoulli, which I had offered to do to save Lady Lovelace the trouble. This she sent back to me for an amendment, having detected a grave mistake which I had made in the process."

My architecture textbook summarized the issue like this: "Someone had to be the most overrated person in computer science." As someone else in this thread mentioned, Grace Hopper is a much more impressive example of a computer scientist hero. In any case, calling Ada the first programmer is definitely silly, since Babbage clearly wrote programs for his machine before she did.

I'm not so sure about this. From "The Information" by Gleick: Her exposition took the form of notes lettered A through G, extending to nearly three times the length of Menabrea’s essay. They offered a vision of the future more general and more prescient than any expressed by Babbage himself. How general? The engine did not just calculate; it performed operations, she said, defining an operation as “any process which alters the mutual relation of two or more things,” and declaring: “This is the most general definition, and would include all subjects in the universe.” The science of operations, as she conceived it, "is a science of itself, and has its own abstract truth and value; just as logic has its own peculiar truth and value, independently of the subjects to which we may apply its reasonings and processes. One main reason why the separate nature of the science of operations has been little felt, and in general little dwelt on, is the shifting meaning of many of the symbols used."

Symbols and meaning: she was emphatically not speaking of mathematics alone. The engine “might act upon other things besides number.” Babbage had inscribed numerals on those thousands of dials, but their working could represent symbols more abstractly. The engine might process any meaningful relationships. It might manipulate language. It might create music. “Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.”

It had been an engine of numbers; now it became an engine of information. A.A.L. perceived that more distinctly and more imaginatively than Babbage himself. She explained his prospective, notional, virtual creation as though it already existed: "The Analytical Engine does not occupy common ground with mere 'calculating machines'. It holds a position wholly its own. A new, a vast, and a powerful language is developed in which to wield its truths so that these may become of more speedy and accurate practical application for the purposes of mankind than the means hitherto in our possession have rendered possible. Thus not only the mental and the material, but the theoretical and the practical in the mathematical world, are brought into more intimate and effective connexion with each other."

I think we have to differentiate between a purely sequential program and one that can make decisions. For example in Lovelace's time automatic looms that were programmed with reels of punched paper existed, but they could only produce a fixed pattern from start to finish.

Such looms, along with mechanical pianos, mechanical dolls and Hero's cart are not computers. They are fixed function and their programs cannot respond to inputs. Lovelace was the first computer programmer.
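The distinction drawn here can be made concrete. A loom or Hero's cart executes a fixed sequence that never varies, while a computer program can branch on its own intermediate results. A minimal sketch of that contrast (the names and operations are invented for illustration):

```python
# A loom-style "program": a fixed sequence of operations. It always
# produces the same output and cannot react to anything it computes.
fixed_program = ["lift", "shuttle", "beat", "lift", "shuttle", "beat"]

def run_fixed(program):
    # Every step executes unconditionally, in order.
    return [op for op in program]

# A computer program, by contrast, can make decisions: which operation
# runs depends on the data it is given.
def run_with_decisions(thread_tensions, limit):
    ops = []
    for tension in thread_tensions:
        if tension > limit:   # a conditional branch: behavior depends on input
            ops.append("lift")
        else:
            ops.append("lower")
    return ops

print(run_fixed(fixed_program))
print(run_with_decisions([3, 9, 5], limit=4))  # ['lower', 'lift', 'lift']
```

The first function is what a Jacquard loom does; only the second exhibits the conditional control flow that the Analytical Engine was designed to support.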

Lovelace's contribution lay in her translation and annotation of Menabrea's description of the Analytical Engine, for which she wrote a short program.
Like the Difference Engines, the Analytical Engine was not built during Babbage's (or Lovelace's) lifetime. Unlike the Difference Engine, the Analytical Engine has never been built; the "computer [...] not actually built until 2002" was the Difference Engine No.2, designed by Babbage in the late 1840s, which is a calculator and not a computer. The date of 2002 is also misleading: the engine itself was built by Doron Swade's group at the Science Museum in London between 1989 and 1991, with its printer completed in 2000.
Furthermore, her husband was not the "Count of Lovelace", but rather the 1st Earl of Lovelace (formerly Lord King, Baron of Ockham, and then Viscount Ockham). 'Count' is not a British title of peerage; her title of countess was therefore the result of her marriage to an earl.
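The "short program" referred to above is Lovelace's Note G, which tabulated the Bernoulli numbers. As a rough modern illustration of that task (not her actual method, which was expressed as a table of engine operations), the same numbers can be computed from the standard recurrence B_m = -1/(m+1) * sum over j &lt; m of C(m+1, j) * B_j, with B_0 = 1:

```python
from fractions import Fraction
from math import comb

def bernoulli(m):
    """Return the m-th Bernoulli number (B_1 = -1/2 convention),
    built up via the standard recurrence with exact rational arithmetic."""
    B = [Fraction(1)]  # B_0 = 1
    for n in range(1, m + 1):
        s = sum(comb(n + 1, j) * B[j] for j in range(n))
        B.append(-s / (n + 1))
    return B[m]

# The odd Bernoulli numbers beyond B_1 are all zero.
print([bernoulli(n) for n in range(8)])
# [1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0]
```

Note G computed these values iteratively on the engine's mill and store, which is why it is usually counted as a program rather than a single calculation.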

For those able to get past the IEEE paywall [ieee.org], there is a great summary of Ada's work in the IEEE Annals of the History of Computing. See "Lovelace & Babbage and the Creation of the 1843 ‘Notes'" by John Fuegi and Jo Francis in the Annals issue of October–December 2003. /.'ers may also enjoy the Hollywoodized film version of her life (plus a little sci-fi) in the film Conceiving Ada.

Whatever Ada Lovelace's contributions were, she was not a programmer: programmers write programs to run on actual machines. The actual implementation is the hard part where all the nitty-gritty gets done. At best, she was a systems analyst.

I learned to program in FORTRAN on a time-shared Multics system with punched-card input and a printer operator behind a wall of cubbyholes to return me the compiler's verdict and my output. At times of system backlog (CS grad students monopolizing the system), I sometimes had to wait for hours to see the results of my efforts.

HOWEVER... Ada Lovelace wrote her program in 1843 [wikipedia.org] and it was only actually compiled and tested in 2002 [wikipedia.org], which makes for a latency of 159 years, making my own waits pale in comparison.