Posted
by
samzenpus
on Monday December 09, 2013 @01:43PM
from the original-programmer dept.

SternisheFan writes "Monday's Google Doodle honors computing genius Grace Hopper (remembered as a great pioneer in computing, as well as in women's achievements in science and engineering), on what would have been her 107th birthday, doodling her right where she spent much of her time – at the helm of one of the world's first computers."

Chips Ahoy: Do you think the current popularity of micros is just a fad?

Hopper: No, the big mainframes are going to disappear. In fact, I intend to scuttle them. They have to go. They’ll be too slow. We’ll build systems of computers. It will be a whole bunch of micros, and they’ll all call each other up and talk. If you use a big mainframe, first you have to do inventory and then you do payroll and so on. You might just as well have a micro doing each of those jobs all working in parallel. That’s the way you get the speed. The big pressure is going to be on faster answers. There never was a good reason for putting inventory and payroll on the same machine. The only reason you did it was because you could only afford to own one computer. That’s no longer true. The micros are as big [in terms of processing capacity] as mainframes were only 10 or 12 years ago. Back then a big mainframe had 64K. That’s smaller than today’s micros by a long shot.

Chips Ahoy: Is there a limit of what micros can do for us?

Hopper: They’ll only be limited if our imaginations are limited. It’s all up to us. Remember, there were people who said the airplane couldn’t fly.

She did miss the issue of reliability, though. There's a reason certain mission-critical setups don't run simple micros. And supercomputers that use Xeons and the like are more than just micros in how they link things together. But apart from those situations, yes, she was right: micros now dominate.

Uh, no it wasn't. Because of her rank there were actually problems promoting fleet operations people to Admiral positions. There are only so many "Admiral chairs," and she occupied one while not being in fleet operations. This was one of the main reasons she was retired by the Navy, which she understood when she received the rank of Rear Admiral in 1985; she subsequently retired in 1986.

I think we can blame all the faults of COBOL on the fact that she wanted it to be human readable by business managers. What would your programming language look like if the Pointy-Haired Boss had to be able to understand it?

I knew somebody would bring that up. In defense of COBOL, 1. Look when it was invented. 2. Look how much staying power it has. 3. Look at the train wrecks caused by later efforts to make easier, more readable programming languages.

I think we can blame all the faults of COBOL on the fact that she wanted it to be human readable by business managers. What would your programming language look like if the Pointy-Haired Boss had to be able to understand it?

PL/I pioneered the free-form syntax used by C, C++, PHP, Java, C# and most other modern languages. What other sins has it committed?

PL/1 incorporated all manner of ugly ways of doing things, borrowing some I/O from COBOL or having something else hacked into it. Inexplicably, I had to go back to using PL/1 on one system implementation because its I/O library could handle large I/O buffers where most other compiler libraries could not, and I was reminded what a sloppy mess of a language it was. You could do just about anything, but it didn't do much of it elegantly. Unless you documented heavily, it was difficult to come back to and figure out.

At the time you had... Fortran... and Assembler. COBOL was a godsend to the business community. Because of it, companies actually invested in computer equipment to do things; that investment reduced the cost and increased the capabilities, eventually allowing the creation of the smartphone in your pocket. If it weren't for COBOL, it is doubtful that companies would have made those investments.

Having programmed in both COBOL and Fortran... I'll take COBOL for anything business related.

Yes, it's verbose. But it was a product of its time, and quite an amazing language if you know what you are doing with it.

Anyone who has actually suffered through writing business applications in FORTRAN IV* would rather be disemboweled by a pack of rabid were-weasels than do it again; to them, COBOL appeared to be a gift from Heaven.

I began my education with what I considered a load of dead or dying languages, while object-oriented languages were just on the horizon and Pascal and C were gaining degrees of acceptance. C is still around, but I haven't heard from Pascal in ages - it was fiddly, like Modula-2, and seemed to embrace the wordiness of COBOL over the conciseness of C. I've converted systems written in COBOL, and at least you could read what the coder was doing. FORTRAN business apps are nearly unintelligible.

Ah, but a Real FORTRAN programmer can write FORTRAN in any language. You ain't seen nothing until you've seen a payroll system written in FORTRAN... using COBOL.

There is 'The Assembler', and 'Assembler'. The 'The' (definite article) in 'The Assembler' is the program that assembles Assembler (the language) into object code. That object code is then merged by the linker with a run-time to become an executable. Modern Assembler languages, and by extension their assemblers, contain macro capabilities - very similar in nature to #include in C (and other such languages). But back in the '50s, when COBOL was written,

Sigh yourself. The 'assembler' tool does not magically read your mind and spit out code. You must actually provide input to the assembler. And this input is in, wait for it, Assembler Language! Shocking, I know!

I think we can blame all the faults of COBOL on the fact that she wanted it to be human readable by business managers. What would your programming language look like if the Pointy-Haired Boss had to be able to understand it?

Thank you for that.

You see, Ms. Hopper, being ahead of her time in MANY respects, knew that programming should be easy to do in a human-readable fashion.

Programming computers should be easy. Having difficult-to-learn languages defeats the purpose of these machines. Programming them should be easy for everyone, and the fact that it STILL isn't shows the ineptitude of the computer science world - or its arrogance (dude, computers SHOULD be hard to program because they're for smart people, or some such nonsense).

Computers are a tool. The fact that computer languages haven't evolved much since the 1960s is pretty sad.

Please oh please post a flame that languages have evolved, so that I can spank you heartily - 50 years on and we're still typing esoteric computer code?! Seriously?

Agreed. I'm a non-techie, and today's computers are less fun and useful than in the Win95 era, when I could actually do things with them. Heck, the C64 held far more interest for the average person; you could at least easily learn how to program in BASIC.

Computers are hard to program because computers are stupid. That's why the most deadly words ever spoken in the industry are "All You Have To Do Is..." It's hard enough to get other humans to do things right when you tell them what to do, much less computers.

Some programming languages, such as COBOL, look more or less like English. Some look more or less like mathematical notation, such as FORTRAN or APL. Some are basically mathematical/symbolic notations on drugs. Each has its advantages, but none of them -

To a non-programmer, what exactly about that 'for' statement implies 'loop'? It could just as easily mean 'if i is between 1 and 100 do work here'. On the other hand, the COBOL example seems pretty unambiguous.
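
For what it's worth, here is a minimal sketch of the comparison being made, assuming GnuCOBOL free-format source; the loop body and field names are hypothetical, and the C-style counterpart appears only as a comment:

IDENTIFICATION DIVISION.
PROGRAM-ID. LOOP-DEMO.
DATA DIVISION.
WORKING-STORAGE SECTION.
01 I     PIC 9(3) VALUE 0.
01 TOTAL PIC 9(5) VALUE 0.
PROCEDURE DIVISION.
    *> C-style counterpart: for (i = 1; i <= 100; i++) { total += i; }
    PERFORM VARYING I FROM 1 BY 1 UNTIL I > 100
        ADD I TO TOTAL
    END-PERFORM
    DISPLAY "SUM OF 1 THROUGH 100 IS " TOTAL
    STOP RUN.

Whether PERFORM VARYING ... UNTIL reads more naturally to a non-programmer than for (;;) is exactly the question being argued here.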

Nobody ever claimed COBOL was intended to make life easy for developers. COBOL was designed so the people who actually bear responsibility for the business (and who are certainly not the developers) can verify that their business processes are implemented to their liking. These people include not only bosses, but also financial people, lawyers, auditors, etc.

What would your programming language look like if the Pointy-Haired Boss had to be able to understand it?

Lots of comments, very little actual code.

When I was in school, we had to have over 50% comments or the TA wouldn't even try to grade your program. The habit was a good one, and although I don't always get to the 50% I still put a lot of comments in my code.

Come to think of it, making your code understandable by the PHB is not a bad goal. If the PHB can understand what you are doing, the next poor programmer (which might be you a few months after you have forgotten the project) will have an easier job fixing something.

I was commenting on MY comments and those of the people I work with. In general, our comments are related to the code they document, by design, by policy, and by routinely checking them in code reviews. Do they always match 100%? No, but that is the exception and not the rule, at least where I work. I fully get that my current experience is *NOT* the norm. Out of the 9 places I've worked as a programmer, my current employer is certainly at the top of the list for producing quality code. Only a few have rivaled it.

"Having a conversation with the sketchbook" is a notion in visual design like architecture or construction. For little scripting tasks, I find talking to the comments an exercise in clarifying what I am trying to do and why. The intention, the way it fits the bigger picture. The code is the reality, the comments are the mental intention. Unless it is a very well understood area where to be a programmer you really have to know the domain and the problems very well, so the code is immediately obvious to the t

What would your programming language look like if the Pointy-Haired Boss had to be able to understand it?

Lots of comments, very little actual code.

When I was in school, we had to have over 50% comments or the TA wouldn't even try to grade your program. The habit was a good one, and although I don't always get to the 50% I still put a lot of comments in my code.

That's a very bad habit, and one that you should break. Comments are evil. They are occasionally -- very occasionally -- a necessary evil, but still evil. I'll explain below.

Come to think of it, making your code understandable by the PHB is not a bad goal. If the PHB can understand what you are doing, the next poor programmer (which might be you a few months after you have forgotten the project) will have an easier job fixing something.

Absolutely, you want your code to be extremely easy to understand. In fact, that's the #1 goal, even ahead of doing the correct thing, because bugs are more likely to get fixed than unreadability, and in the long run they cost less. Comments are one way of achieving readability, but they're a crutch. Worse, they're a crutch with a bui

I don't disagree but I'm reading someone's code at the moment, some routines in R, and the word "training" is in the name of a routine, but it doesn't explain in which sense of the word "training". So now I have to try to figure out the meaning of the result. I'm sure it was obvious to the author. Maybe fine grained comments are bad, but an overall story explaining in ordinary words the overall intent and picture would be nice. Anyway, IANAP.

Yes, an overarching narrative is very important. If you're using something like Javadoc or Doxygen, the documentation comments are a fair place for it, but what's often even better is to put it in a design doc, which should be linked from or stored with the code. Design documents also get outdated and become wrong if not maintained, but they should be at a sufficiently high level that this happens very slowly.

I think we can blame all the faults of COBOL on the fact that she wanted it to be human readable by business managers. What would your programming language look like if the Pointy-Haired Boss had to be able to understand it?

How many programmers of that era were expert in modern corporate accounting, law, banking, business practices and procedures, as they had evolved over the past three or four centuries --- and not merely knowledgeable, but credentialed, as a C.P.A., for example?

In turn, how many accountants could have read and validated FORTRAN code for accounts receivable?

What would your programming language look like if the Pointy-Haired Boss had to be able to understand it?

What would it look like? Each line must have a key in the first column:

Increases profits

Cuts costs

The competition is doing it

Helps us meet our ship date

Makes you look good to the VP

or it will fail to compile.

v.2 will have the compiler generate a histogram of keys for a given source file, and the make tool will actually generate graphs. I'm sure a real programming language designer can improve on the design.

Here I am pointing out that what many people consider to be a problem with COBOL, that it is too wordy, in fact has a functional purpose. Although using the word PLUS for '+' did go a bit too far; I have to assume most business managers know what '+' means.
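
As a rough illustration of that trade-off, COBOL lets the same calculation be written either as an English-like verb or with COMPUTE and ordinary operators. A minimal sketch, assuming GnuCOBOL free-format source and hypothetical payroll field names:

IDENTIFICATION DIVISION.
PROGRAM-ID. PAY-DEMO.
DATA DIVISION.
WORKING-STORAGE SECTION.
01 BASE-PAY     PIC 9(5)V99 VALUE 1000.00.
01 OVERTIME-PAY PIC 9(5)V99 VALUE 250.00.
01 GROSS-PAY    PIC 9(6)V99.
PROCEDURE DIVISION.
    *> The wordy, manager-readable style: arithmetic spelled out as a verb.
    ADD OVERTIME-PAY TO BASE-PAY GIVING GROSS-PAY
    DISPLAY "GROSS PAY (ADD):     " GROSS-PAY
    *> The same calculation with COMPUTE and the ordinary '+' operator.
    COMPUTE GROSS-PAY = BASE-PAY + OVERTIME-PAY
    DISPLAY "GROSS PAY (COMPUTE): " GROSS-PAY
    STOP RUN.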

You may scoff at COBOL, but she pioneered the idea of using a more human-friendly notation instead of machine language and its cousin, assembler. Her experiments were the precursor to Algol, which shaped all the imperative block-oriented languages we use today, including C, Java, VB, Pascal, etc.

And it made software more vendor-independent as the languages were not tied to a specific machine architecture, unlike machine code and assembler.

Before that, many scoffed at the idea of "dumbing down" programming with English-like syntax, fearing it would waste resources and invite poorly educated riff-raff into the field. (Well, maybe it did:-)

Perhaps Grace didn't get it quite right on the first try, but she helped spark a computer language revolution that led to better tools down the road. She tested waters others feared.

I got to MEET her. I was a faculty brat at Syracuse where she was a graduation speaker, and through a lot of begging, my dad got me a seat at the speakers table, and she held forth, drinking straight scotch, smoking unfiltered Pall Malls and swearing for two hours. One of the best moments in my life. I'll never forget it, and she's been an inspiration through my career.

I have fond memories of her. On the one occasion I got to see her in person, I was a member of a student ACM chapter, and she was our guest speaker. I remember that she had very strong opinions, particularly about IBM.

At the time, the System 360 was all the rage, and had blue cabinets. She brought an 8080 to the presentation in a small, blue plastic case, commenting that she'd heard computers came in blue boxes. She also commented (again about the 360) that it couldn't be much of a machine, since it spent half of its time talking to itself, a reference to the operating system overhead.

I've often wondered what she'd think of computers and operating systems today, particularly Windows and Linux.

* credited with popularizing the term "debugging" for fixing computer glitches

You left out the story of why it's called debugging! From Wikipedia [wikipedia.org]:

While she was working on a Mark II Computer at Harvard University, her associates discovered a moth stuck in a relay and thereby impeding operation, whereupon she remarked that they were "debugging" the system.

The first version of this Doodle [google.com] got the algorithm to compute age wrong (!). The original version of the Doodle used the COBOL expression

SUBTRACT CurrentYear FROM BirthYear GIVING Age

which actually computes the negative of the age (for most people born after Christ, anyway).

I wondered whether this might be a nod to her pioneering work in software debugging, as also referenced in the flying moth at the end of the animation, but since Google has since corrected the bug [google.com], it seems even the mighty Google still sometimes commits the simplest of programming errors. (Right on their main page and logo, too. Oooops. I suppose there's also the view that the code was wrong because it was a woman doing the coding. You misogynist Google bastards.)
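
For reference, SUBTRACT A FROM B GIVING C computes B minus A, so the fix is simply to swap the operands: SUBTRACT BirthYear FROM CurrentYear GIVING Age. A minimal sketch showing both versions side by side, assuming GnuCOBOL free-format source and hypothetical literal values:

IDENTIFICATION DIVISION.
PROGRAM-ID. AGE-DEMO.
DATA DIVISION.
WORKING-STORAGE SECTION.
01 BirthYear   PIC 9(4) VALUE 1906.
01 CurrentYear PIC 9(4) VALUE 2013.
01 Age         PIC S9(4) SIGN LEADING SEPARATE.
PROCEDURE DIVISION.
    *> The original Doodle's expression: 1906 - 2013, i.e. -107.
    SUBTRACT CurrentYear FROM BirthYear GIVING Age
    DISPLAY "BUGGY AGE:     " Age
    *> The corrected expression: 2013 - 1906, i.e. 107.
    SUBTRACT BirthYear FROM CurrentYear GIVING Age
    DISPLAY "CORRECTED AGE: " Age
    STOP RUN.

Built with something like cobc -x -free age-demo.cob under GnuCOBOL, the first DISPLAY should print -0107 and the second +0107.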

Nope, stopped reading reddit long ago after discovering the mods' penchant for silently censoring comments and entire story threads they didn't like.

That the original Doodle might have accurately depicted poor-but-industry-accepted COBOL coding practices (i.e., approving and committing code where the program logic is wrong but the result of the calculation may still appear correct if an invisible dependency on a separate section of the program happens to work out in the programmer's favor) is either deeply

Anybody on Slashdot who doesn't know who she is... get the fuck out, because you're on the wrong website.

No shame in being a newbie, as long as one is *trying* to be a self-teacher and trying not to be a newbie forever. In this case, the shame is on the one trying to run newbies off. You are going to die a lonely death.

There's nothing wrong with not knowing something important; the sin is not lifting a finger to find the fact out -- e.g. people seemingly incapable of typing a name into wikipedia and reading the first paragraph (and then whining about it in the comments instead in hopes someone will spoon-feed it to them). Those are the people who need to get lost.

The realm of women is whatever they want it to be. There is substantial cultural inertia, especially in places like the Southeast US, that impedes young women from trying to "do computers and tech stuff", and so the lampshading of legitimate achievements made by folks like Hopper is no bad thing. Yes, were she male she wouldn't get quite as many accolades, but so? She was a pioneer, and there is no shame in pointing out to today's young women "want to become a computer scientist? You're in good company."

I have as much distaste for postmodern cultural wankery as you, but informing women that they are welcome in the scientific community ain't that.

I taught computational physics for a couple of years as a grad student. Of the students that I considered absolutely top-notch, about 60% were women (where the difference from 50% is statistical noise). As far as physics went, they were basically the same as the men.

There is a documentary about this, which I saw on Netflix (I don't remember if it was streaming or DVD) called "Top Secret Rosies." I knew about that history from my physics and math background, but my wife was amazed to hear it. Anyway, the film is worth watching.

I had the pleasure of speaking with one of the Rosies some years back. I was trying to line her up as a speaker at my workplace's "Women's Issues" month, but she lived too far away and the company wouldn't buy the plane ticket.

They have an association, naturally...they call their daughters Rosettes and their sons Rivets.

When I was at MIT in the 1970s there was not an official computer science major yet, even though there were several prominent computer science labs that every student wanted to play in. Computer science was a minor within EE, ME, and business. In 1980 MIT recognized a formal CS degree.
I wondered if the procrastination was due to the "taint" of programming being a trade-school craft and not a real scientific discipline - and whether that, in turn, was due to its early female participation.

Not really. Both before and after WWII, “computers” was a female-dominated field, like nursing or teaching.

After WWII, well, I am not sure "displaced" is the right word. We are talking about a rapidly evolving field. Most of the jobs that men took in the computer field just did not exist at the start of the war - virgin ground, so to speak. Not so much displacing as being left behind.