As a 45-year-old with 30 years of experience in the computer industry, such titles interest me immensely.

I grew up with Unix.

I remember the days when Unix was a seriously advanced topic, available only to the privileged elite (adults) who were granted access to it as a technological resource through machinations that were outside my scope of influence as a 13-year-old learning computing in 1983. It was a vast mystery.

It was only with great effort on my part that access was eventually granted in the form of a login over a slow modem. That was enough for me to move to the next phase.

Then, it was a mystery in terms of taxonomy. I had a near-infinite, seemingly overwhelming number of things to learn - more than what was in my high school curriculum, more than my little home 8-bit computer could teach me, more .. it seemed at the time .. than all the books on my shelf.

Well, it was only a few years - actually probably less than 10 months - until I was able to grasp at least a little bit of the subject, and do something with it. That was enough to propel me into the professional sphere - where I was able to move from a clueless teenager into the clued-up programmer zone.

And then (late 80's) it became a game of "which Unix do you want to know?", and that game was as thorny as any other. I chose Risc/OS (from MIPS, pre-SGI) and it worked out: I gained my own machine (physical hardware) through commercial delivery of products - a few files of .c/.h - which I'd managed to build by applying eager learning to a real application.

And then: Linux. This changed the world completely, at least for me at the time. When I first saw Linus' post on the minix-list, announcing his sharing of work I thought completely inapproachable and beyond the horizon, I immediately applied myself to catching up. And then, through the 90's, the commercial applications of these skills gave me an even deeper understanding of the subject - and Unix became less of a mystery and more of a resource for living. And so it was: for a decade I built Unix software, like no other, which resulted in many great things for many great people.

Then, we flipped the century bit, and Unix - in spite of all the predictions from industry experts - was still with us. It didn't just go away because of pop-culture desires and mores, it was seriously entrenched. That 80's decision seemed all the more wise in 2001, when it became clear to me that Unix was not going away.

So I kept at it. I coded all I could. I got a tiBook - astounded, but content nevertheless, that Apple (of all companies) was producing a portable Unix workstation - something that I'd dreamed of for over a decade. And there it was, the tiBook. So I've upgraded my way from there, to my current extremely satisfying device (MBP Retina) .. and beyond that, even beyond my wildest teenage dreams: here we are today - I carry around a Unix workstation in my pocket.

It's my primary means of communication, just like I always knew it would be. It does all the things that I used to do with the Risc/OS pizza box. It does it in a way that makes total sense - a simple interface, like the embodiment of the holy pipe, but at my fingertips.

It has been a challenge, a fantasy, and a real desire to see Unix become what it is today - but it's also been a privilege. I think that all of my fellow hackers who professed faith in the technology must feel the same - at least those who have watched it happen over decades. Truly, a unique human experience.

And therefore I'm really glad to see this article. It has been very enlightening - and personally satisfying as well - to see that others recognize the power of this technology, which took decades to become the force that it is today.

I took a university course on Unix internals in 1988. The lecturer started by announcing "this is the last year that we will teach this course as Unix is now very out of date and is being left further behind every year". It turned out to be the best course I ever took.

I remember a course I took once with the venerable Peter Denning - the hardest professor I ever had, and the proudest I've ever been of a grade I received. He was giving a lecture on processes in operating systems when one of the students raised his hand over some minor point:

"I don't think that's correct"

"what's not correct?"

"<whatever the minor point was> I've run ps on my linux laptop and it contradicts what you are saying"

"#1, I said no computers during the lectures for this course, #2 I am in fact correct because I invented it, now either close your laptop and pay attention or get out of my class."

Now, many years later, I'm constantly surprised at how pertinent and immediate what I learned in that class is, and how it keeps unfolding in my mind. The kind of simple, composable concepts he lectured us on have penetrated almost all of my work since then. It's amazing what robust ideas our forefathers invented from whole cloth!

He sounds like an arrogant dick. The fact that he invented something doesn't mean it works the same way on every OS... I've worked with a few academics and unfortunately this attitude seems rather common.

This was many years ago, so it's a bit hazy. It was something to do with process ID assignment of some root process by the OS.

It's hard to explain, but he didn't come off as arrogant. He was one of those rare legitimate experts who had a hand in the formation of things we take for granted today, knew it, and didn't take kindly to being challenged on his accomplishments by young whippersnapper know-it-alls with their Linux laptops open, not paying attention during class.

He's one of the fathers of the modern notion of "computer science" as a field; before that it was EE, or Math, or some other field. His PhD thesis introduced core multi-process OS concepts used in every operating system since, and basically invented the modern theory behind virtual memory.

He helped design Multics, which Thompson and Ritchie worked on and which was the template for their sequel project, Unix. He also co-founded CSNET, the academic computer science network that's one of those networks in the "inter" part of the word "Internet" - it was also the network that hosted one of the first experiments in free software distribution (netlib, which is still around).

Challenging him on these kinds of issues is like challenging Neil Armstrong or Buzz Aldrin on actually landing on the moon.

He worked insane hours, put an incredible amount of time and effort into education (something more professors could learn from), had a very well structured course that had been refined down to near perfection, and always went out of his way to make time for students. I remember many meetings with him where he looked absolutely exhausted, but he never rushed us out of his office.

He was very insistent that you put as much effort into his class as he did, and anything less was disrespectful. And that's how he ran it.

He was easily the best professor I had in all of my education, full-stop.

Thank you so much for sharing your experience. I am about 6 years younger than you but followed pretty much the same path. I also ran Linux when the kernel was pre .2 etc. (Red Hat 1.0 was the first commercial Linux product I remember.) I think I remember getting Red Hat Linux on 10 floppy disks.

I was lucky enough to live in a geographic area where there were a few universities, and my father had access that allowed me accounts on several Unix workstations.

I think I'm a little younger than you as well, but my first Unix was Red Hat Linux 7. It came on a CD and was so buggy[0] I gave up on it after 6 months. I wouldn't touch another Unix until my sophomore year at college, when a roommate convinced me to try Gentoo. That worked for two years before I got fed up with it (I wiped out my /etc one time too many). Then I bought a TiBook from said roommate and haven't looked back since.

The most infamous comment in the history of source code, from Unix V6. Now I can search and see at what stage this was committed.

	/*
	 * If the new process paused because it was
	 * swapped out, set the stack level to the last call
	 * to savu(u_ssav). This means that the return
	 * which is executed immediately after the call to aretu
	 * actually returns from the last routine which did
	 * the savu.
	 *
	 * You are not expected to understand this.
	 */
	if(rp->p_flag&SSWAP) {
		rp->p_flag =& ~SSWAP;
		aretu(u.u_ssav);
	}

From a brief glance at the surrounding code, this is part of the context switcher, which has to do stack switching. Also interesting to note that the whole function, swtch(), is only 71 lines. The entire kernel is less than 10kLOC.

I agree that understanding this type of code would be very difficult if you're used to the HLL notion of nesting function calls, because these don't work like regular functions; a call actually "returns" into a different one. You can't easily understand coroutines, threads, processes, and context switching without seeing that functions are really a structuring that HLLs impose, not the CPU itself.

After the speedy download from Github (which weighs in at more than 40 times the capacity of the hard drive which held my first *nix system), I was able to locate a patch I submitted to FreeBSD in less than 12 seconds from running gitk on the file in question, thanks to the wonderful world of Git.

I shudder at the thought of having to manage all of this using CVS, pulling changes using CVSup or whatnot.

The original context in which Dijkstra wrote that was very different; he was primarily referring to old versions of Fortran and BASIC, where gotos were more or less the only (or the predominant) form of branching. In other words, something like

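(for illustration, a hypothetical fragment in that line-numbered style - not Dijkstra's or anyone's actual example)

```
 700 LET T = T + 1
 ...
4840 PRINT T
4850 GOTO 700
```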
and whatever was happening at 4840 would GOTO back to 700 (with predictable catastrophic results if anything before 700 changed) and so on.

Local use of goto in the context of e.g. bailing out of an init function if something failed early in the initialization sequence is useful and not frowned upon by people whose daily work consists of something other than writing coding standards. There are a few (well-meant!) exceptions that have to do with e.g. static analysis, but those tend not to be so important outside the field of high-reliability systems.

Well, he is really close to the IEEE[1], so I think if there was any legalese involved, he must have covered it. I have followed Professor Spinellis since 2008 and he seems to be a really nice guy who pays exceptional attention to detail.

UPDATE: Apparently DSpinellis is the new editor-in-chief of IEEE Software magazine[1].

FreeBSD has supported a git mirror for years, although the primary VCS (the "source of truth") is still Subversion. A number of downstream consumers of FreeBSD use the git mirror though, and we have brought in patches from submitted pull requests.