
It is not surprising that they originate from an RSA paper on public key cryptography. The site covers far more than the early history, though, explaining how the set of notional actors has been expanded and adopted in other example narratives.

I am utterly fascinated by how far back many of the antecedents of modern software go. For those who have read up on the history of computing, Babbage’s failure to realize all of his dreams for his early computing engines means this is hardly surprising. It gives one pause when considering the push for software demos and other pressures that still lead developers to produce vaporware.

A listener sent me a link to this rant, The State of the Art is Terrible, by Zack Morris. If you can wade through the technical humbuggery, I think there is a useful point. Several decades after the advent of high level programming languages and well into the age of ubiquitous computing, it genuinely is time for computing technology to be more focused on outcomes.

I’m on the threshold now of rejecting this false idol, but for at least a little longer I have to cling to it to carry me through. I have a dream of starting some kind of open source movement for evolvable hardware and languages. The core philosophy would be that if your grandparents can’t use it out of the box to do something real (like do their taxes or call 911 when they fall down) then it fails. You should literally be able to tell it what you want it to do and it would do its darnedest to do a good job for you. Computers today are the opposite of that. They own you and bend you to their will. And I don’t think people fully realize how trapped we are within this aging infrastructure.

The post is rife with examples of how the status quo is an abysmal failure for all but those of a very hackish bent. Morris touches on why this is so: the industrialization of software and the subsequent urge to profit. If you can wade through the decidedly downbeat tone, I think there is a kernel of optimism, a call for a sea change in how computers work and work for us.

Morris isn’t alone in this view, keeping company with the likes of Jaron Lanier. This is not likely to be the last rant in this vein. I think he is a bit more pragmatic, though, highlighting PHP as an example of a step in the right direction. His point isn’t that PHP has a natural-language syntax, or semantics that mirror concepts and idioms with which non-programmers are familiar. Rather, he commends it for its more productive failure modes: it makes a best effort on the easy stuff and doesn’t obscure breakages that require more investigation.
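That contrast in failure modes can be sketched abstractly. The snippet below is purely illustrative, in Python rather than PHP, and the function and setting names are hypothetical: the idea is simply to be lenient where intent is obvious and loud where something is genuinely broken.

```python
# Hypothetical illustration of "productive failure modes": make a best
# effort where the user's intent is clear, fail plainly where it isn't.
def read_port(settings):
    """Return the configured port, coercing obvious cases like "8080"."""
    if "port" not in settings:
        # A real breakage: surface it clearly rather than papering over it.
        raise KeyError("no 'port' configured; add one to settings")
    # The easy stuff: accept the string "8080" as readily as the int 8080.
    return int(settings["port"])

print(read_port({"port": "8080"}))  # → 8080
```

The design choice is the point: coercion handles the trivial mismatch silently, while the missing key produces an immediate, human-readable error instead of a silent default.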

Whether you agree about PHP specifically or not, it is worth considering as an example of the model he is proposing: languages and tools that are more focused on outcomes than on abstract design principles or idealized syntax. The emphasis on getting out of the way of doing useful things is ultimately what sets this rant apart from a crowd of voices raising many of the same critiques of the state of the art.

I’ve been meaning to remark on the passing of Dennis Ritchie but have been incredibly busy at work. The irony is that, unlike Steve Jobs, who touched my career and my life only glancingly, Ritchie’s contributions in the form of the C programming language and the co-invention of Unix are critical preconditions for most of what I’ve been doing, professionally and out of enthusiasm, for well over a decade.

When I was in college, access to and usage of the limited number of Unix workstations on campus took on mythic proportions. All of my friends and my co-workers within the school’s Technology Services enjoyed noodling around with PCs of different stripes. Gaining entrance to the access-limited Unix labs and sitting at the quietly humming machines, with their remarkably large, high-resolution displays for the time, was something else altogether.

Those machines in no way felt like toys. To a one, they were networked together and connected via fast links to the Internet. What you had to bash and cobble together on your own PC to get barely functioning was a given, in terms of horsepower and network connectivity, on these machines.

That sense of awe, the invitation to explore that is woven into my earliest experiences of Unix deeply informs my relationship with Linux, its spiritual descendant. I still experience a subtle frisson of delight when exercising root privileges on any of my Linux boxen for the way it takes me back to those almost furtive trips into the Unix labs at school.

The C programming language holds a similar place in my personal pantheon. Almost every programming language with which I have more than a passing fluency can be described as C-like. I have only worked directly with C for limited stints over the years, too few and far between to transform it from the mysterious into the quotidian. Rationally I realize it is a bit silly, but the sheer age and reach of C seem to command a certain veneration that few if any subsequent languages have achieved.

The contrast between the coverage of Jobs’ passing and Ritchie’s is pretty extreme. The temptation to read much into the difference is great, but I think it is easily explained. By all accounts Ritchie was a very quiet and private person. Unlike with Jobs, you don’t need a sense of Ritchie’s personality to appreciate his contribution to modern computing. The technical merits of C, Unix, and his collaboration with Kernighan in the form of The C Programming Language, or simply K&R, speak for themselves.

Slashdot links to an intriguing Technologizer piece about the days when dominance of the graphical desktop on personal computers was far from a foregone conclusion.

The company had announced [Windows] in November of 1983, before most PC users had ever seen a graphical user interface or touched the input device known as a mouse. But by the time Windows finally shipped two years later, after a series of embarrassing delays, it had seemingly blown whatever first-mover advantage it might have had. At least four other major DOS add-ons that let users run multiple programs in “windows” had already arrived.

Just like Microsoft’s late arrival on the Internet, and more specifically the web, the article outlines one of the earliest object lessons in dealing with the Redmond giant. The company’s success has had far less to do with its engineering strength, including the ability to deliver in a timely fashion, than with its business practices. In this case, the premature announcement seriously affected the behavior of competitors already shipping their offerings, in ways that ultimately played to Microsoft’s advantage.

The rest of the article is a nifty nostalgia trip. The Mac and OS/2 are absent from a surprisingly full list of competitors because, unlike the rest, they didn’t run on DOS. Nor were Microsoft’s marketing shenanigans the sole contributor to these failures. Some of the patent and trademark activity is very reminiscent of the sort of litigation that still goes on today over hotly contended software markets, like mobile computing and smart phones.

First glimmerings of holographic video displays
John Timmer at Ars Technica discusses some pretty impressive research, considering how little holography has advanced for anything other than trivial applications. The system these researchers are building may seem crude, but most of the equipment being used, including the network connection, is pretty close to consumer grade. The potential is enormous, though I have to imagine free-standing holography is a horizon further beyond these re-writable but otherwise fairly constrained displays.

History of computing and elections from 1952
Wired has re-printed an article by Randy Alfred from around the time of the last US elections. In it, he explains how Univac, one of the earliest computers, was tasked with predicting the presidential election in 1952. The forecast put together by the machine and its operators was remarkably accurate, but the TV folks its makers initially approached were too skeptical to air it at the time, only admitting to discounting the computer’s results well after they were obviously correct.

Patent database is up and running
Rogue archivist Carl Malamud has the good news at O’Reilly Radar. The joint effort between the USPTO, the White House, and Jon Orwant at Google has resulted in a new, open database that supplants feeds that formerly required substantial subscription fees. As Carl explains, this was no easy chore given the vested interests in the revenue streams from the old, closed system. A huge win for restoring a critical piece of our informational commons here in the US.
