October 14, 2011

Those tools were more than inventive bundles of computer code. The C language and Unix reflected a point of view, a different philosophy of computing than what had come before....

Minicomputers represented a step in the democratization of computing, and Unix and C were designed to open up computing to more people and collaborative working styles....

C was designed for systems programmers who wanted to get the fastest performance from operating systems, compilers and other programs. “C is not a big language — it’s clean, simple, elegant,” [said Brian Kernighan, a computer scientist at Princeton University who worked with Mr. Ritchie at Bell Labs.] “It lets you get close to the machine, without getting tied up in the machine.”

62 comments:

Our society has been built by countless little-known people who labored to make things better for all of us. Most of what we daily take for granted is the accretion of the work products of such people from previous generations.

OT: I'm putting the Blue Star image to rest this week. For the uninformed :) The blue star banner was started in the early 1940s to signify that the household had a member serving in the armed forces. At one point we were a two-star household. My wife retired this month, thus we are retiring our last star.

PS: The Gold Star represents a household that has lost a family member in combat; hence, Gold Star Mother....

Actually, C-derived languages comprise better than 90% of the commercial programming languages being used today. However, it was not just the language. Unix, the operating system, is based on many concepts that have since become foundational for modern computer operating systems. It is no stretch to say that without Unix, Windows would probably never have come about. Mac OS X actually is based on a variant of Unix, and Linux is a Unix clone. Further, Unix is the OS that powers the Internet.

It is difficult to overstate Ritchie's impact on programming and, to a somewhat lesser extent, OS evolution.

That we have "the internet" is due to Dennis Ritchie's work. Not only was the internet built (and still run today) around Unix servers, but the foundational communications protocol all internet data uses (TCP/IP) started off in life as the default communications stack of Unix servers.

All the other major vendors had their own network protocols (DEC had DECNET, IBM SNA, Novell IPX, etc), but if you wanted to talk to the backbone servers on the still developing internet for things like URL name resolution, you had to speak TCP/IP.

Now, thanks to the internet, almost everybody runs TCP/IP everywhere. Upon TCP/IP and its now universal standard, I'm now sending out this data on port 80 (http) for you guys to read.

We owe so much to the geniuses at Bell Labs: Ken Thompson and Dennis Ritchie for B and Unix in the late 60s and C in the early 70s, and Bjarne Stroustrup for adding classes to C and creating its successor, C++.

Kernighan and Ritchie were the authors of the bible on the C language. I have at least a couple of copies of K&R. Ritchie defined the unbelievably influential and elegant C language. Ritchie also was a principal developer of the unbelievably influential and elegant unix operating system.

If you have an Android-based phone, the core of that code is from Unix. Unix is everywhere still.

I did. I learned C and UNIX when I worked for AT&T in the '80s and we were selling computers. These were multi-user systems, perfect for both C and UNIX. I use Apple computers today, and they are very reminiscent of those systems.

@YoungHegelian: not really sure what you're getting at. unix predates the internet protocol suite by over a decade. the original unix development was in 1969; v4 of the ip suite wasn't officially active until 1983.

yeah, paper tape storage was awful. i got grant money to help operate oregon's pine mountain observatory in the early 80s. pmo had a 36" cassegrain reflecting telescope that had a computer controlled two-axis sidereal drive. quite accurate when it worked, but one fine day a visitor (that was the story i was told anyway) stepped on the paper tape and bent it enough that it tore. after a couple of days, we couldn't load the program anymore, and it took a very long time to get a replacement, which didn't happen until after i left. so for about 6 months, that telescope was essentially useless for research.

it made one hell of a personal viewing instrument, however. you had to wear sunglasses to look at the moon; saturn's moons and neptune's rings were amazing.

punch cards were also pretty evil. there were long cardboard boxes that would contain thousands of cards (i don't remember how many cards per box. 2500? 5000?). there were also carts that held a number of the boxes. really big programs would have their own cart.

one of the saddest things i've ever personally seen was an accident between two carts that overturned both at a corridor intersection in a computer center. intermingled cards, from multiple programs, were scattered everywhere.

YoungHegelian said... TCP/IP was developed for ARPANET (by Vint Cerf and co, and he was the guy who came up with the 32-bit IP address), which became the Internet as we know it today. ARPANET was predominantly used by the UNIX communities across the universities, in the US and elsewhere, that had access to ARPANET/NSFNet (precursors to the Internet).

All I remember is that in Pascal and Cobol et al., if you had a syntax error, you had to actually look through and understand all the code to figure out where the error was. C was a godsend. The syntax was simple and not in the least ambiguous. That is the reason everyone after has modelled their language syntax after C. Thanks dude. Vaya con Dios.

IIRC, one reason that the computer science guys went to C and Unix is that a C compiler was always available on Unix systems. You could write device drivers in C for Unix, which made it easier to add and control new devices.

Otherwise you had to write the device driver in an assembly language, which varied from computer architecture to computer architecture.

IIRC, the source code for unix was available very early on. Unix could be studied and modified.

“It lets you get close to the machine, without getting tied up in the machine.”

This is the best short answer to the "what the hell is C, and why is it needed?" question. Getting "tied up in the machine" means programming at the CPU level, often called assembly language or just assembler. The most valuable courses I took as an undergrad involved programming in assembler. Learn how to do that and all the mystery of the black box -- the Intel or AMD chip at the heart of that thing on your desk or your lap -- evaporates. Assembly language is a mighty hard way to get anything practical done with a computer because it is so alien to the way we normally think about a task. People apply intelligence, insight and intuition to everything we do. Computers have none of those qualities, in spite of whatever illusions a clever programmer can cook up. Assembler also permits the programmer to force an issue with the system if there's a subtle flaw that evades his analysis. If your program always crashes and dumps at the same point because it must have F0F0 at relative such-and-such, then you can fix it quick and dirty by writing a bit more code that puts that value there when it's needed. This is called hammer-lock programming, or putting a gun to the f*cking thing's head.
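A tiny sketch of what "close to the machine, without getting tied up in the machine" buys you. The function names here are illustrative, not from any real codebase: C can view a value's storage as raw bytes, the way assembler sees it, or compute the same thing portably.

```c
#include <stdint.h>

/* "Close to the machine": reinterpret a 32-bit value's storage as raw
   bytes, the way assembler would see it.  Which byte comes first
   depends on the machine's byte order. */
unsigned low_byte_via_pointer(uint32_t v) {
    unsigned char *p = (unsigned char *)&v;   /* raw view of the same memory */
    return p[0];                              /* low byte on little-endian hosts */
}

/* "Without getting tied up in the machine": the same result written
   portably -- identical on any byte order. */
unsigned low_byte_portable(uint32_t v) {
    return v & 0xFFu;
}
```

On a little-endian machine (x86, most ARM) both calls return the same byte; the first version is the kind of machine-dependent detail assembler forces on you everywhere, while C lets you drop to it only when you choose to.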

Before Ritchie most software was written in some flavor of BASIC, COBOL or FORTRAN, and most programmers never touched a project that required a lower-level approach. They're all pretty similar, with strengths in various fields: COBOL for banking, insurance, etc.; FORTRAN for engineers and scientists; BASIC for everybody else, kinda like the three estates of feudal France. The advantage is that the code looks sorta like English and the logic is pretty clear for another programmer, or even a non-programmer, to follow. The problem is that since loose and fluid human reasoning seldom coincides with the deterministic and thoughtless logic of the computer, the human-like language of COBOL compiles to a huge mess of machine code which is inherently inefficient and hard to debug. When COBOL was invented in the 1950s it went for years with subtle errors in the standard compiler, so one could write a perfectly logical and evidently flawless program and have it fatally corrupted by the compiler. This is the origin of the apocryphal stories of people getting billion-dollar electric bills and such.

The huge size of even simple programs authored in COBOL wasn't an issue when every computer was a building-sized system built by IBM or Univac; they had enough resources (very expensive resources, mind you) to cope with the inefficiencies. But when minicomputers started appearing, the limitations of the high-level languages became critical. If you wanted to do something really useful with a PDP-series system, you needed to program it at the CPU level. Then C appeared, and the mini became a contender. C was the trump of doom to the huge institutional machines like the 360s and the Burroughs 6000s. Suddenly a box the size of a refrigerator could do what had previously needed a machine the size of an average house.

Both C and Unix are elegant. Elegant implies as small and simple as possible, no clutter. Not only is this more beautiful, but the smallness and simplicity of C and Unix make them simpler and cheaper to implement.

Maybe the key concept in making them elegant was defining a small, powerful set of core mechanisms and then reusing those mechanisms over and over and over.
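One concrete instance of that reuse, sketched below (POSIX assumed, helper name invented): Unix funnels files, pipes, terminals and sockets through the same tiny read/write-on-a-file-descriptor interface, so the two calls that move bytes to a file move them through a pipe unchanged.

```c
#include <string.h>
#include <unistd.h>

/* Send a string through a pipe and read it back, using the same
   read()/write() calls that work on ordinary files, terminals and
   sockets -- one small mechanism reused everywhere.  Returns the
   number of bytes read back, or -1 on error. */
int roundtrip_through_pipe(const char *msg, char *out, int outlen) {
    int fds[2];
    if (pipe(fds) != 0)
        return -1;
    (void)write(fds[1], msg, strlen(msg));                /* write end */
    int n = (int)read(fds[0], out, (size_t)(outlen - 1)); /* read end  */
    out[n > 0 ? n : 0] = '\0';
    close(fds[0]);
    close(fds[1]);
    return n;
}
```

Swap the pipe descriptors for an open file or a connected socket and the write/read pair stays exactly the same -- that is the reuse the comment describes.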

Yeah C needed a better assignment operator than "=". Then again COBOL needed a better statement terminator than ".". The smallest possible bit of punctuation. Seriously. I can't remember the number of times I've pored over program listings looking for a misplaced "." or two. Ugh.
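The complaint about "=" is usually about the trap sketched here (function names invented for illustration): assignment is an expression in C, so a mistyped "=" inside a condition compiles and silently does the wrong thing.

```c
/* The classic "=" pitfall: the condition below ASSIGNS zero to x and
   then tests the assigned value (0, i.e. false), so the branch never
   runs -- yet it compiles; most compilers only warn. */
int count_zeros_buggy(const int *a, int n) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        int x = a[i];
        if (x = 0)          /* BUG: meant x == 0 */
            count++;
    }
    return count;
}

/* What was intended. */
int count_zeros(const int *a, int n) {
    int count = 0;
    for (int i = 0; i < n; i++)
        if (a[i] == 0)      /* comparison, not assignment */
            count++;
    return count;
}
```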

Then again my favorite bit of C silliness was, and this was a few years ago so my memory might not be as accurate as I would wish, obtaining a pointer to a function.

I had to help a co-worker debug a particularly ugly program where the guy decided that the best possible design involved an array of pointers to functions.

... Yeah don't ask.

Well the guy forgot that some library functions weren't actually functions but macros. And you cannot obtain a valid pointer to a macro. So occasionally his program would either start doing some crazy stuff or just completely crash.

"the guy decided that the best possible design involved an array of pointers to functions...... Yeah don't ask."

No, this is eminently askable: there is certainly nothing wrong with that sort of design in places where it's really appropriate. And also (regarding "And you cannot obtain a valid pointer to a macro"): shouldn't that simply fail to compile?
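For readers wondering what the design in question looks like when it is appropriate, here is a minimal sketch (the operation names are invented): an array of pointers to functions with a common signature, used as a dispatch table.

```c
/* A dispatch table: an array of pointers to functions that share one
   signature.  Indexing the array selects the operation; the call goes
   through the pointer. */
typedef int (*op_fn)(int, int);

int op_add(int a, int b) { return a + b; }
int op_sub(int a, int b) { return a - b; }
int op_mul(int a, int b) { return a * b; }

op_fn ops[] = { op_add, op_sub, op_mul };

int dispatch(int which, int a, int b) {
    return ops[which](a, b);    /* indirect call through the table */
}
```

On the compile question: in standard C you indeed cannot take the address of a function-like macro, and the attempt fails at compile time; the standard also requires every library function to exist as a real function behind any macro, so `(&getchar)` is legal. Run-time crashes of the kind described would suggest a pre-standard library where some "functions" existed only as macros.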