Someday we'll look back at the rigid grid of orthogonal rows and columns of database tables with the same pity with which we look back at the character grids in which we coded COBOL programs.

Practically all of COBOL was replaced by printf(), which is still the ultimate target for most programs written today, even if the printf() is wrapped in some higher-level output function. I'm looking forward to all databases and relations someday residing in a single invocation with a comprehensive yet simple interface. Probably a flowchart.

The university I went to stopped teaching it about 20 years ago, and most COBOL programmers with much in the way of practical, real-world development experience retired long ago. In fact, a lot of them came out of retirement for a few months or a year prior to Y2K because the money offered was so good.

Today, there are still COBOL jobs advertised, and they largely go unfilled. It could have something to do with the fact that there are so few people remaining with the skills, and something to do with the fact that many of them are with banks who are notoriously cheap on IT salaries. The few remaining good COBOL people on the market go into contract positions that usually begin at about $70/hour. I kid you not.

It's a lot of typing, writing COBOL, and the code is at times boringly simple, but if someone is out of work and seriously looking for an IT position, learning it would not hurt. I predict there will still be some call for it 20 years from now.

Designing a language that is simple to use and results in easy-to-read, easy-to-understand code is the right idea. For a first attempt, COBOL wasn't bad. But from a modern perspective, it has lots of problems. Also, COBOL (like Ada) incorrectly assumes that writing more text makes code clearer; it does not.

The best designed language overall is probably still Smalltalk: it's easy to read, easy to learn, and was designed from the ground up with the idea of being used in an interactive programming environment. It also strikes a better balance between verbosity and expressiveness. Just about the only thing that Smalltalk got wrong was to use strict left-to-right evaluation for arithmetic expressions; a better compromise might have been simply to require arithmetic expressions to be fully parenthesized.
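To make the arithmetic complaint concrete: Smalltalk parses binary messages strictly left to right, so 2 + 3 * 4 yields 20 rather than the conventional 14. Here's a small sketch of that evaluation rule in Python (not Smalltalk); the `left_to_right` helper is made up for illustration:

```python
import operator

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def left_to_right(expr):
    """Evaluate a flat, space-separated arithmetic expression strictly
    left to right, ignoring operator precedence (Smalltalk-style)."""
    tokens = expr.split()
    result = float(tokens[0])
    for op, num in zip(tokens[1::2], tokens[2::2]):
        result = OPS[op](result, float(num))
    return result

print(left_to_right("2 + 3 * 4"))  # 20.0 under Smalltalk rules, not 14
```

Requiring full parenthesization, as suggested above, would make the surprise impossible rather than merely documented.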

Be sure to pay homage to the inventors of the other two ur-languages: John Backus and John McCarthy. Without them, we'd still be programming in assembler, and there would probably be a world-wide market for only 5 computers.

After working at a shop that wrote COBOL compilers for machine translation into C, I can tell you it is interesting work, but by no means simple. What a lot of people misunderstand is that COBOL can react slightly differently under each IBM OS that was shipped. Writing a lexer/parser is easy, but the memory mapping and statement convolutions in COBOL were down-to-the-bit tricky.

COBOL was a huge exercise in data massaging, where hundreds of lines were used to map data into a structure which then fed a series of output channels, like a printer, screenmaps or files. Throw in a simple set of arithmetic, but apply it in hacker-esque ways to date bits, for example, and you're scratching your head a lot of the time.

I've read all the bashing here, but one must understand that COBOL's perspective of the world was far narrower than today's. Business data was a simple number-crunching exercise, not much further along than the trajectory calculations of the earliest digital computers. I have one of IBM's computer catalogs from 1971, a longwinded tome filled with secretary-models, low-level circuit specifications, and giant machines that would make a great B-movie these days.

You don't really understand how high-level languages work (even when they're spellchecked). The COBOL forms still mirror the punchcards, even though there are no punchcards anymore.

I learned COBOL a quarter century ago, when there were still punchcards (mainly punched tape, but still plenty of cards). printf() mirrors the punchcards. And C++ and Perl, for example, are high-level languages that still use the grid.

A truly high-level language would present APIs independent of the underlying HW artifacts, not just present a portable union of many common HW artifacts.

I've written high-level (and low-level) languages. I've programmed assembly code, even in hex machine language (hand-compiled on graph paper), starting in the 1970s. All the way up to 4GL IVR menuing. And plenty of - way too much - COBOL. COBOL looks archaic, though we don't notice its legacy in printf(). I'm looking forward to the same convenient nostalgia for databases down the road, because lots of DB programming and DBA work reminds me of the slavery to the machine that COBOL required.

If you want to be stuck in the 1970s, you're welcome to it. Give my regards to the 8-track cassette of The Wiz.

Back in high school I attended a talk by Admiral Hopper where she passed around a wire about 30 centimeters long and explained to us, "this is how far light, or any electromagnetic signal, can travel in one nanosecond." That illustration has always stayed with me; it helps to explain a lot of the limitations inherent in hardware now that CPU speeds have become so fast.

For example, for a 3GHz CPU (0.33 nanoseconds per clock cycle), an electrical signal can travel only about 10cm in one clock cycle. It's amazing that CPUs can do complex arithmetic when electrical signals inside the chip can only travel 10cm in that amount of time. Wonder why the CPU stalls when there's an access to main memory? Just look at your motherboard and gauge how far your memory is from the CPU; distance alone explains 4-5 clock cycles of the total delay.
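The arithmetic behind Hopper's wire checks out as a back-of-the-envelope calculation (using the vacuum speed of light; signals in copper traces actually propagate at roughly half to two-thirds of c, so real distances are shorter still):

```python
C = 299_792_458           # speed of light in vacuum, m/s

ns = 1e-9
per_ns_cm = C * ns * 100  # distance light covers in 1 nanosecond
print(per_ns_cm)          # ~30 cm: the length of Hopper's wire

clock_hz = 3e9            # a 3 GHz CPU
cycle_s = 1 / clock_hz    # ~0.33 ns per clock cycle
per_cycle_cm = C * cycle_s * 100
print(per_cycle_cm)       # ~10 cm per clock cycle
```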

I think I am the only one who ever publicly addressed Admiral Hopper as Admiral Grace. It was 1981, and she was speaking at North Island Naval Air Station. She made a comment about how desktop computers would be more powerful than the then-current IBM 360 mainframe computers (this WAS the big dog in those days for mortals like myself). After the presentation, I had the chance to ask her why she thought mainframes would be on desktops. Her reply was, "Because the variety of computers being developed today are mainly microcomputers, and the minicomputers of today were the mainframes of yesterday." I was so nervous that I replied, "Thank you, Admiral Grace," and left immediately. The implications of her words startled me; I understood them, and I was changed forever. 25+ years later, I look back and remember the nervousness, the times, the bosses who warned against switching to micros, and what the future would be. Her words would guide my life for the next 25+ years.

When I was a sophomore in high school our school dedicated its computer lab to her. Her family had a summer place near where I went to school, and she came to the school for the dedication. As one of the geekiest computer people in the school I was chosen as the token pupil to be with her when pictures were taken, etc. I think she was 80 years old when that happened, and she was still sharp as a tack. Her official title at the time was Commodore, and I remember referring to her that way. I also recall her making some comments about programming, etc. that I think helped push me into a career of computer programming before I even realized it. I really wish I had known more about her at the time I met her since I probably would have paid a lot more attention...

COBOL had one language construct that I really liked: 'Move Corresponding.' Much of the work COBOL did centered around data record structures. The 'Move Corresponding' statement allowed you to move all the fields from one record to another record's fields whose names matched.
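The same idea translates to modern languages easily enough. A rough Python sketch of what MOVE CORRESPONDING does for group items, using hypothetical `OrderIn`/`OrderOut` record types:

```python
from dataclasses import dataclass, fields

@dataclass
class OrderIn:
    cust_id: int = 0
    name: str = ""
    notes: str = ""          # no counterpart in OrderOut; not copied

@dataclass
class OrderOut:
    cust_id: int = 0
    name: str = ""
    total: float = 0.0       # no counterpart in OrderIn; left alone

def move_corresponding(src, dst):
    """Copy every field whose name appears in both records,
    roughly what COBOL's MOVE CORRESPONDING does."""
    dst_names = {f.name for f in fields(dst)}
    for f in fields(src):
        if f.name in dst_names:
            setattr(dst, f.name, getattr(src, f.name))

a = OrderIn(cust_id=42, name="Hopper", notes="rush")
b = OrderOut()
move_corresponding(a, b)
print(b)  # cust_id and name copied; total untouched
```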

> Examine x replacing y with z - is there a current equiv without 2 or more steps?

$x =~ s/y/z/g;
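And for anyone who finds Perl's line noise as opaque as COBOL's verbosity, the same one-step replacement in Python:

```python
import re

x = "xyxy"
# equivalent of Perl's $x =~ s/y/z/g (regex substitution, global)
x_sub = re.sub("y", "z", x)
# or, since y and z here are literal strings, simply:
x_rep = x.replace("y", "z")
print(x_sub, x_rep)
```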

While I have to congratulate you for showing the old dog how it is done today, I feel a little nervous. In the olden days, folks at least talked to the machine in something resembling natural language. And now? We say "Dollar ex equal wiggle ess slash why slash zed slash gee semi-fucking-colon". Is that progress?

I used to live in the same apartment complex as her in Pentagon City. The owner built a small park in her honor, but the memorial plaque does not mention COBOL or bugs. I suppose our heroes cannot be perfect.

You were close to greatness living in Pentagon City near Grace Hopper. And shopping in the same underground center as she did.

I was lucky in school to meet Captain Hopper (Captain COBOL - she was promoted many times later). In 1979 Captain Hopper visited my university, then ETSU and now TAMU Commerce, bringing her loop of microsecond wire. The computer club that night had a drink session, and for my one and only time as bartender I served Ms. Hopper a drink. Also there was Gary Shelly of "Structured COBOL" fame.

I am sure this story of Grace Hopper has been repeated across the USA in numerous colleges and universities in the years before, during, and after. Grace Hopper was a tireless and wonderful advocate for COBOL.

God bless her soul.

Jim Burke

PS: There were other COBOL promoters out there, such as Gerald Weinberg. As far as I know, Gerald Weinberg mostly visited corporations with the message.