Posted
by
timothy
on Sunday January 27, 2013 @05:40PM
from the cultural-literacy dept.

theodp writes "That his 28-year-old whip-smart, well-educated CS grad friend could be unaware of MacWrite and MacPaint took Dave Winer by surprise. 'They don't, for some reason,' notes Winer, 'study these [types of seminal] products in computer science. They fall between the cracks of "serious" study of algorithms and data structures, and user interface and user experience (which still is not much-studied, but at least is starting). This is more the history of software. Much like the history of film, or the history of rock and roll.' So, Dave asks, what early software was influential and worthy of a Software Hall of Fame?"

Which, it should be emphasized, we do study. While I'm a major advocate for the study of computer history, CS is not about software development, it is a branch of mathematics. The author of the article would be better off pestering computer engineers.

"CS is not about software development, it is a branch of mathematics."

That depends entirely on what college or university you are attending. The definition still varies from school to school, although it has been getting somewhat more consistent.

However: at least in the U.S., computer engineering is definitely NOT a software discipline. It is engineering of the computers themselves, that is to say, hardware (though firmware is involved, naturally).

"CS is not about software development, it is a branch of mathematics."

That depends entirely on what college or university you are attending.

Computer science has a meaning for more than just students, and that meaning lies primarily within the domain of mathematics. What gets taught in the name of computer science depends on the institution doing the teaching.

Indeed. If there's a piece of software that launched the personal computing revolution, it was VisiCalc - the first software that business actually _needed_. I'd also throw in:
* WordStar - which was the PC world's answer to emacs. If you did text processing on DOS systems, it was done with WordStar or another program which emulated it.
* WordPerfect - THE word processor. I imagine that without the Windows hegemony, Microsoft would -never- have been able to kill WordPerfect.
* Bank Street Writer - the first -educational- word processor. I imagine Gen X'ers like myself lived off of this in school.

Once in an interview, Dan Bricklin (IIRC) said that in the early days they personally demonstrated VisiCalc at trade show booths. Sometimes accountants would actually cry, as they realized how many hours they'd spent adding up rows and columns of numbers, and how quickly they'd be able to do it with this new piece of software.

You know you've got a killer app when a demo causes members of your target market to realize how much your software is going to change their lives, and they burst into tears.

Most valuable program(s) ever, from day one and still today. Hands down. C is the best-positioned language in terms of being "to the metal": it changes from tool to uber-tool in the hands of anyone who masters assembler first and arrives at learning C with that under their belt. It can create extremely fast executables if the CPU is really taken into account, or it can be implemented simply if the CPU is treated simplistically -- your code will still work fine, just a bit more slowly. It made portability something achievable instead of merely desired. C is so well positioned that implementing the language's constructs on top of [some random] CPU is a relatively simple exercise, and then you have immediate access to oodles of goodness.

Also the source of a lot of whining and bad programming from poor programmers. But hey, a fine carpentry set doesn't make you a great carpenter, either.

Also a nod out to standard libraries -- also a boon to portability and more.

Before Wordstar, there was Electric Pencil. I also compared Apple Writer II vs. Wordstar for a technical presentation in some college class in 1982; I declared at the time that Apple Writer was far and away the most advanced and user-friendly WP on the market.

I find it amusing that some 30 years later, some of the old Wordstar keyboard shortcuts are still used in some programs today -- notably alt-X, ctrl-Y, and F1 still do essentially the same things they did in Wordstar.

I think someone else mentioned Colossal Cave, and yes indeedy -- CC begat Zork which begat the rest of Infocom's amazing library, which I still play from time to time today. My 20-something daughter just the other day complained about the difficulty of getting the babel fish in your ear! Tell me, Microsoft, what games of YOURS are still being played 20 to 30 years later?

I was visiting a computer store owned by a friend. A man walked in who looked homeless. He wore clothes that everyone else I knew would have thrown away. This was in California before Reagan, before there were a lot of homeless people.

I quietly asked my friend if he would ask the homeless person to leave; maybe there would be a concern about theft. My friend laughed, "That's Michael Shrayer [wikipedia.org], he wrote Electric Pencil, he's a multi-millionaire".

Tell me, Microsoft, what games of YOURS are still being played 20 to 30 years later?

Well, Microsoft Flight Simulator was launched in 1982 - over 30 years ago. Solitaire came with Windows 3.0, in 1990 (and believe me, there are many more people still playing Solitaire than ever played Colossal Cave or Zork). Minesweeper was originally part of the MS Entertainment Pack (also 1990) but was bundled with Windows, I believe starting with Windows for Workgroups. FreeCell came a bit later - can't remember exactly when, but it was there before Win95, which makes it at least 18 years old. I'm sure there are more.

VisiCalc was actually credited by a few business journalists in the 80s for starting the whole corporate raider business. They were now able to plug in all those numbers from SEC filings and other sources into the spreadsheet, run simulations of financing and figure a way to take the company over and make their billions.

They also used it to find out if the pension fund was overfunded. See, back in the old days, companies would invest the pension in very low-risk things like government bonds - at like 3%. The raiders said, "Hey, wait a minute! If we put the money in the stock market, it could make 10% a year - because that's what it averaged for decades! They don't need all that cash in there, and we can use it to finance the deal and pay our 'consulting fees'!"

Flash forward to the '00s, and pensioners are getting their benefits cut left and right or they are just gone.

KKR, Icahn, T Boone, and Bain Capital (of Mitt Romney fame) were and are some of the players.

Now, many of those folks don't have the money that they counted on - their deferred compensation. Another way of putting it is those folks weren't fully paid for their work.

Now, many of those folks don't have the money that they counted on - their deferred compensation. Another way of putting it is those folks weren't fully paid for their work.

I find it amazing how little attention is paid to that. Some like to blame pensions for bankrupting the auto industry, but the fact is, until shenanigans like that, they had the pension funds in reserve like they were supposed to. If they don't have them now, it's only because of greed at the top, not something the union did.

1. Visicalc, of course. It is what changed the Apple ][ from a toy to a valuable business asset.

2. Lotus 1-2-3, the better VisiCalc, and now for DOS machines!

3. The first flight simulator for the Apple ][.

4. WordStar on CP/M (later on DOS), proving that effective word processing could be done without a dedicated word processing machine.

5. Perl - the first truly useful, easy-to-learn (hard-to-master) programming language supporting regular expressions. (Well, awk preceded it, but awk was impossible to work with.)

There were also several raster and vector graphics apps from the 1980s that demonstrated the breadth of possibilities.

I have avoided the software that was originally created on minicomputers and mainframes, then duplicated on the microcomputers. These were great, but they did not have the "Oh wow, nobody saw that coming" impact of VisiCalc, WordStar, or Perl.

Yes, any decent Computer Science program should definitely have some required courses in how and why these apps changed the world.

Xerox Bravo [wikipedia.org] (1974), Xerox Gypsy [wikipedia.org] (1975), and Xerox Markup [toastytech.com] (not sure of exact year, in the vicinity). As a general rule, whatever you can think of, PARC had it ten years earlier. By the late eighties they were working on a PDA/tablet/smart surface, touch-driven ecosystem.

Point being, people put disproportionate weight on programs that they experienced firsthand. It's the same story whenever an amateur writes a computer history article: a few pages of nostalgic bullshit without any real research. Yes, it's significant that the Mac programs (which, oh by the way, already existed on the Lisa too!) were popular, but it's severely erroneous to give them all the scrutiny. As historians we should endeavour to look past our own biases and provide an accurate image of history, not play favourites with specific products.

Autocad & PowerDraw (now PowerCADD) 2D CAD followed a decade later by SolidWorks 3D for turning concepts into executable designs that were within the realm of price and usability for individual designers.


I'm not sure Turbo Pascal's legacy is as large as it should be. Sure, plenty of modern IDEs owe a nod to TP, but what about the compiler? The thing was shockingly fast. I wish it had been more influential in that regard.

Some interesting info about how Turbo Pascal's speed was achieved here [hubpages.com].

Here are a few that were great in the beginning but have become bloated and kind of overbearing since:

* Word 4.0 for Mac (fast, stable, good UI, nearly perfect)
* Photoshop 1.0 and then 3.0 (when they added layers)
* Early versions of Excel (for Mac, then later Win95)
* FreeHand (when it was Aldus)
* PageMaker (when it was Aldus... see a pattern here?)
* Aldus Persuasion (notice I didn't say PowerPoint?)
* iMovie (compare to any version of movie editing software bundled with Windows ever... no contest)
* Honorable mention: GarageBand (too niche to be mainstream)

He mentions Susan Kare but I'd like to give another shout out to her work [plos.org]. We are still using derivatives of her designs, and their spare simplicity really led the way for a lot of the icons we use now.

OK, maybe that's a little harsh. But it's not completely apparent what value such a detailed review of early software programs would add to a computer science curriculum. It's probably sufficient to note the emergence of the GUI as the major defining element here, and let our poor undergrads get back to studying their doubly linked lists.

My opinion: it's not an accident that computer science is a more forward-looking than backward-looking discipline. Students will get more mileage out of downloading the latest version of OpenCV or playing with math in Python than sitting through a boring lecture about primitive computer software apps.

The Mac OS's successful commercialization of the GUI was a huge advance, and students really need to compare it to CP/M and the like to understand its importance. You don't need a detailed comparison, just test runs of the two side by side to show the difference in user experience.
Late in 1983, I walked into a computer store fully intending to buy a CP/M machine, fiddled with the interface for about a half hour, and walked out without buying one. It simply was not worth it, even as a technology writer. I'm a fast typist, but the three-finger command interface was too clumsy, and nobody wanted -- or even knew how to handle -- electronic submissions.
The late Cary Lu introduced me to the Mac in 1984, but what sold me was watching my six-year-old daughter play with one in the Boston Computer Museum. She picked up the MacPaint interface in minutes. MacWrite and file management were similarly intuitive. I wanted a tool for writing, not to be a computer operator. I bought a Mac and got it up and running right out of the box.

Microsoft BASIC and later Visual Basic: Unjustly despised, but introduced many to programming (and the very first ones were marvels of micro-programming too). Also interestingly portable at a time when portability was on nobody's radar.

Spectre GCR, a Mac emulator on Atari ST. A precursor of virtualization in my opinion, and a very smartly done one at that.

VMware for making virtualization available to the masses and enabling the cloud.

AmigaDOS for being the first OS with built-in hardware-accelerated graphics and sound.

The RPL system in the HP28 and HP48 series of calculators. Reverse Polish Lisp and symbolic processing on a 4-bit calculator with 4K of RAM? Seriously?

The Minitel system in France, including nationwide phone directory and dubious innovations such as Minitel Rose (porn in text mode at 1200bps, basically).

Postscript and the whole desktop publishing revolution.

NeXTStep (or whatever the CorRect CapItalizATION is), so far ahead of its time that it took years for it to reach its full potential in the form of iOS.

GeOS (already mentioned by someone else)

Mathematica. Just wow. But also forgotten precursors such as TK! Solver.
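
For anyone who never touched one of those HP calculators: the core trick of reverse Polish notation is just a stack, small enough to sketch in a few lines of Python. This is only the postfix-arithmetic part, nothing like HP's actual RPL, which also treated programs and symbolic objects as first-class data:

```python
# Minimal RPN (postfix) evaluator: operands push onto a stack,
# operators pop their arguments and push the result.
def rpn(tokens):
    stack = []
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    for tok in tokens:
        if tok in ops:
            b = stack.pop()          # second operand is on top
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# "3 4 + 2 *" means (3 + 4) * 2
print(rpn("3 4 + 2 *".split()))  # → 14.0
```

No parentheses, no precedence rules - which is exactly why it fit in a calculator with 4K of RAM.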

Lisp 1.5 [softwarepreservation.org] was the first widely distributed Lisp system (and it included an interpreter AND a compiler). Many people have completely forgotten about it, but among its contributions it pioneered dynamic programming languages (as Ruby, Python, etc. are) AND garbage collection. And many other things. It was staggeringly innovative.
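
Part of why Lisp keeps getting reimplemented is how small the eval/apply core is. Here's a toy sketch in Python - nothing like the real Lisp 1.5, which ran on IBM mainframes and shipped a compiler, but the same basic shape:

```python
# Toy Lisp evaluator: the eval/apply core that Lisp 1.5 made famous.
import operator

GLOBAL_ENV = {
    "+": operator.add, "-": operator.sub,
    "*": operator.mul, "<": operator.lt,
}

def lisp_eval(expr, env=GLOBAL_ENV):
    if isinstance(expr, str):              # symbol: look it up
        return env[expr]
    if not isinstance(expr, list):         # number: self-evaluating
        return expr
    head = expr[0]
    if head == "if":                       # (if test then else)
        _, test, conseq, alt = expr
        return lisp_eval(conseq if lisp_eval(test, env) else alt, env)
    if head == "lambda":                   # (lambda (params) body)
        _, params, body = expr
        return lambda *args: lisp_eval(body, {**env, **dict(zip(params, args))})
    fn = lisp_eval(head, env)              # application: eval then apply
    return fn(*[lisp_eval(arg, env) for arg in expr[1:]])

# ((lambda (n) (* n n)) 7)
print(lisp_eval([["lambda", ["n"], ["*", "n", "n"]], 7]))  # → 49
```

Twenty-odd lines for first-class functions and conditionals; the 1962 original added the self-hosting part on top of essentially this.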

Learn C to learn how things have really worked for the last few decades in the kernel and library spaces, learn the original specs of HTML to understand what hypertext was really for, and learn C-Kermit to learn what configuration and control over a limited interface really means.

Not so much software as a software tool, but if you're looking for the most influential and important thing in software, the clipboard probably wins hands down. Without it, most of the web would not exist, for one thing. It also has the distinction of being invisible - it doesn't even give feedback. Nothing comes close to it for ubiquitous power and influence.

TUTOR [wikipedia.org] (also known as PLATO Author Language) is a programming language developed for use on the PLATO system at the University of Illinois at Urbana-Champaign around 1965. TUTOR was initially designed by Paul Tenczar for use in computer assisted instruction (CAI) and computer managed instruction (CMI) (in computer programs called "lessons") and has many features for that purpose. For example, TUTOR has powerful answer-parsing and answer-judging commands, graphics, and features to simplify handling student records and statistics by instructors. TUTOR's flexibility, in combination with PLATO's computational power (running on what was considered a supercomputer in 1972), also made it suitable for the creation of many non-educational lessons - that is, games - including flight simulators, war games, dungeon style multiplayer role-playing games, card games, word games, and Medical lesson games such as Bugs and Drugs (BND).

1994 Message from CS Prof Daniel Sleator to Tim Berners-Lee [archive.org]: It would be possible for one person to write a new game (such as double bughouse chess) without having to write a half dozen graphics interfaces. Many really cool things change from being impossible to being quite feasible. (The PLATO system developed in the 70s at the University of Illinois had some of these properties: simple graphics available to all users, fast interaction among a large pool of users. The result was the development of a number of very popular and engrossing interactive games.)

Jumpman: set the standard for 'playability' & 'fun'. I remember making fun of it when I saw the underwhelming graphics, but it had me hooked the first time I played it. Truly, one of the best games ever. Decades later, it's STILL playable.

Archon: what can I say? It started where chess left off, hit the ground running, and just *oozed* "epic win" for concept & gameplay.

RUNOFF on CTSS (1964) turned the computer into a document preparation tool. From there we got Multics runoff. The UNIX developers justified their early efforts by promising to bring runoff to AT&T without the expense of Multics. And now RUNOFF has many descendants, both in the form of markup languages and document processing applications. These are arguably a more widespread and important use of computers than actual computation.

Before the internet, computers were a tool and not just a screen to get you to what someone else already had made. You got a computer because you wanted to make things. It could be a document, an image, a song, software that could be used to make more and other things. Computers were mainly purchased by those who wanted to use them as a tool for creative and practical purposes. All you could consume on computers in the pre-internet age were games, and consoles were usually cheaper and better for that, or the few expensive and slow online services that you could reach over dialup.

So this made a huge difference for early software. The windowed GUI interface that is everywhere today was designed for desktop publishing, by Xerox, a company whose business is making documents. The phone and tablet interfaces that are growing now are the first centered around consumption of data instead of creation of data. This is a huge shift, which makes it even more important to remember software history.

Paradox for DOS was a breakthrough program for its time, permitting fairly serious multi-user networked business applications to be built in DOS with a relational database. The PAL (Paradox Application Language) was very powerful. I built a rock-solid and fast multi-user system for a mental health clinic with it. And Commodore 64's Logo was actually HP's graphics language in disguise - a great program for what it was and for its time.

Napster - this is the software that kicked off the idea of music file sharing. Okay, the record companies hated this program but this is the first program that I can think of that really CONNECTED people as a group on the internet for exchanging data.

MS GW Basic - this was the basic that shipped with the IBM PC and was pretty much what much of its early software was written in because it was so simple to use and yet could be used to do quite a bit.

Windows 3.0 - This was the first version of Windows that people really used, and it really brought the GUI desktop with the mouse into the mainstream. Okay, the first Macintosh from Apple did that too, and came before Windows 3.0 by a ways, but it was not nearly as widely used, especially in the workplace.

I started computing with a VIC-20, and grew up with a C-64. I never really used the 'must have' apps that made businesses want computers in the first place, though. I knew about them, and knew my uncle spent a fortune on an Apple II to run them for his store, but knew little about them.

So recently I picked up a Commodore 128D and got some CP/M software: WordStar, dBASE II, and VisiCalc. After some configuration brouhaha (this wasn't easy, without the manuals!) I gave them a go.

What most surprised me was how usable they all are, still. Oh, the interfaces require actual studying, but WordStar's is sensible, and dBASE's total lack of anything resembling user friendliness at least exposes its raw flexibility.

Of course, then my 30-year old Commodore monitor let the blue smoke out of the capacitors, so it's out of commission till I get them replaced.

I think having current compsci people take at least a brief course using these old, old programs might help them understand not all that much has really changed - and maybe inspire them to change things.

The only word-processor I ever really liked. And the reason why I switched to Windows from DOS and my own customized Turbo Pascal editor.

I immediately felt at ease with Ami Pro. Everything felt intuitive for someone who had started using computers mainly to get rid of typewriters. Other word processors at the time seemed like just different typewriters. But Ami Pro almost forced you to use styles instead of manual formatting. And it made the use of styles very obvious and easy, mapping them to the function keys. At last, something smarter and more useful than a typewriter.

I'm using LibreOffice now, but I'm unhappy and still long for the elegant simplicity of Ami Pro.

This reverse Polish language was not a "mainstream" language, but for astronomers, it was perfect for telescope automation. FORTH was also used in other robotic things. I was really surprised that FORTH wasn't included on anyone's list. In fact, how many of you have ever heard of FORTH, let alone did any programming in it?

This was many old-school programmers' first exposure to computers as entertainment. For example, both my wife and I recall playing it on TI SilentWriters (paper output plus an acoustic modem) when we were kids. Even more than Space Wars, which was written at least a year later and only ran on much less common hardware, this was the start of computer gaming.

Byte is kind of the journal of record of the microcomputer era, from 1975 to the early '90s (when it became just a bunch of boring reviews). I'm sure anyone who wanted a list of influential software from the past could spend a couple of weeks digging through them. You can find most of the early years as scanned PDF files if you know where to look.

And don't forget to cover some of the important failures too, like The One[tm], Visi-On, and Lotus Jazz. And the important semi-failures like Smalltalk and OS/2.

It's a technique for simulating variations in product assemblies - usually mechanical, but they could be of other natures as well. You model the assembly and its manufacturing variations, and then "build" some quantity of parts. One can determine how many assemblies will likely meet specifications, the major contributors to out-of-spec assemblies, etc., etc.
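
The counting logic described there is easy to sketch. Here's a hypothetical one-dimensional stack-up in Python - nothing to do with VSAS's actual code, and the dimensions, tolerances, and spec limits are made up for illustration:

```python
# Monte Carlo tolerance stack-up (a toy one-dimensional version of what
# variation-simulation tools do): draw each part dimension from its
# tolerance distribution, "build" many assemblies, and count how many
# land inside spec.
import random

def simulate(n_builds=100_000, seed=42):
    random.seed(seed)
    in_spec = 0
    for _ in range(n_builds):
        # Three stacked parts, nominal 10.0 mm each, +/-0.1 mm at 3 sigma.
        stack = sum(random.gauss(10.0, 0.1 / 3) for _ in range(3))
        if 29.85 <= stack <= 30.15:        # assembly spec: 30.0 +/- 0.15 mm
            in_spec += 1
    return in_spec / n_builds

print(f"{simulate():.1%} of simulated assemblies meet spec")
```

Real tools model full 3-D geometry and report the sensitivity of each contributing dimension, but the "build a lot of them and count" core is the same.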

The technique was developed during WWII at Willow Run Labs, where it was implemented by the classic "banks of women operating calculators", and is one of the reasons we were able to crank out all those airplanes that actually worked.

By the '70s it was implemented in an academic setting on mainframes.

A company I worked for obtained rights to VSAS and we ported it to the IBM PC. I did the initial port to Watcom Fortran (there's another one for you!), and then designed a domain-specific language (VSL) and implemented a compiler in C and an interpreter in Fortran, so that mechanical engineers didn't have to write their models in Fortran any more. The Fortran models were bulky - line after line of function calls with zillions of parameters, passing separate X,Y,Z values in the calls. I'd imagine the engineers wore out the parenthesis keys on their keyboards pretty fast. VSL, on the other hand, had data types for points, lines, vectors, planes, etc. Using an interpreter didn't slow things down, because most of the time was spent in geometric library routines, which were in carefully optimized Fortran.

I insisted on their hiring a mathematician, and between the two of us, we tweaked it to run faster on the PC than it did on the mainframe. (Engineering professors don't write code that is either fast or mathematically-correct, it turned out...)

And that's when its use took off. The company founder had started as a manufacturer's rep for some Finite Element Modeling software, so he had lots of contacts in the auto industry. (And the company was located near Detroit.) They both sold the software and did in-house projects for the auto companies until those companies ramped up their own engineers. This allowed the automakers, for example, to start treating windshields as structural elements (because the hole for the windshield could be manufactured to precise tolerances), and allowed them to eliminate costly alignment operations, such as when fitting hoods.

It's used by every auto and aircraft manufacturer, every hard disk manufacturer, etc. etc. etc. Basically just about any complex mechanical product you touch was touched by VSAS during design.

I'd imagine you couldn't build an iPhone at an affordable cost, or with the quality level of an iPhone, without VSAS (or its equivalent). You wouldn't be able to buy a terabyte hard drive for less than $100.

Maybe not quite what this post was looking for, which I think was more consumer PC software. But it runs on a PC and has from the beginning of PCs, and has had a large but mostly-invisible influence on just about every tech product we use every day.

While the first 4K Microsoft BASIC was significant in many ways [wikipedia.org], the ROM-based Microsoft BASIC included with literally tens of millions of computers shaped the industry in ways no other application ever did.

Its impact was in being the first tool used by an entire generation of programmers; it shaped their thinking in ways that frustrated some.

Lisp was such a good idea that people are still reimplementing it [wikipedia.org] 55 years later.

FORTRAN was such a piece of crap that... almost everyone started using it, it became for most people the only possible way to learn to program, it persisted for decades after alternatives were designed, it was sufficiently flexible to evolve into a very nice and usable modern version, it's still often more efficient than C, and it basically defined the whole procedural style of programming.

Because once we forget how this software worked, someone else comes along and does a research project, thinks that they have invented something new, patents it and/or names it after themselves. Then they'll start sending lawyers after other people. I've seen this happening with something as simple as 3x3 convolution matrices and widget libraries. What was common knowledge in personal computer magazines back in the 1980s now seems to be stuff that leads to patent battles.
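
For reference, a 3x3 convolution really is 1980s-magazine-level material - a dozen lines of plain Python with no libraries (the image and kernel here are made-up toy data):

```python
# 3x3 convolution over a grayscale image (list of lists), skipping the
# one-pixel border -- exactly the kind of filter hobbyist magazines printed.
def convolve3x3(image, kernel):
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = 0
            for ky in range(3):
                for kx in range(3):
                    acc += kernel[ky][kx] * image[y + ky - 1][x + kx - 1]
            out[y][x] = acc
    return out

# An unnormalized box-blur kernel applied to a single bright pixel:
# the 9 spreads to every interior cell whose 3x3 window covers it.
blur = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
img = [[0, 0, 0, 0],
       [0, 9, 0, 0],
       [0, 0, 0, 0],
       [0, 0, 0, 0]]
print(convolve3x3(img, blur))
```

Hard to see what's patentably novel in there, which is rather the parent's point.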

Because once we forget how this software worked, someone else comes along and does a research project, thinks that they have invented something new, patents it and/or names it after themselves.

Historically, that also happened in mathematics. Oh, wait, software IS mathematics. And mathematics just doesn't get obsolete. Sometimes the notation changes (== programs get reimplemented), but the core is still the same.

More to the point, the mathematical systems based on Roman numerals are obsolete, replaced by a system that uses zero as a placeholder and as the common origin of all the number lines. You would have a hard time describing how to do something as simple as reconciling your checkbook using only the concepts behind Roman numerals.
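
A quick Python illustration of the point: even just reading a Roman numeral takes an algorithm, because a symbol's contribution depends on its neighbors rather than on a place value:

```python
# Roman numerals have no place value: a symbol's contribution depends on
# what follows it (IV = 4, VI = 6), so even parsing needs a rule.
VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def from_roman(s):
    total = 0
    for i, ch in enumerate(s):
        v = VALUES[ch]
        # Subtractive notation: a smaller symbol before a larger one negates.
        if i + 1 < len(s) and VALUES[s[i + 1]] > v:
            total -= v
        else:
            total += v
    return total

print(from_roman("MCMXIII"))  # → 1913
```

Column-by-column addition, the thing a checkbook needs, only works once every digit's weight comes from its position.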

For the same reason we have a Baseball Hall of Fame or a Football Hall of Fame, or, even simpler, for the same reason we study world history. Know thy history, learn from your mistakes, and understand what the best things were made of.

We study influential software for the same reason we study the past in any domain: to learn of the forces that shape what is, the human stories that lead to these artifacts, the design decisions and the lessons learned therein. What you see on your desktop today is the current end of a long chain of "obsolete software" that includes MacPaint, and Whirlwind, and any number of earlier systems that bring us to current dominant designs. Economically significant and useful software intensive systems all have such a legacy, and your hubris in so quickly dismissing the value of understanding anything older than your professional lifetime is staggeringly depressing to me. May you never be on any development team that has to grapple with the refactoring of legacy code.

Understanding what made such software good back then might help you produce better software now. Who knows, maybe studying various ancient, obscure GUIs could have averted disasters like Windows 8, Gnome 3, and Unity.

You have not gotten the straight answer yet, but the real-world economic answer is that nothing changes very much, so a well-educated individual knows how the newest PR release about a "new" idea will turn out, given how the exact same idea turned out three times in the '70s, five times in the '80s, and twice in the '90s. Even if the outcome is different for tech or non-tech reasons, the challenges, successes, roadblocks, etc., will be the same this time around as the last ten times.

Ah, so you're saying that this new language will be a silver bullet that will eliminate programming as a profession because business people will write their own programs? Hmm, I wonder if that's ever been claimed before. Naah. If it had been, you'd have language names like "Business Oriented Language" and stuff.

I've got a totally new idea! We can project-manage programming by programmer-hour, because the product of programmers times hours is always a constant for a given problem. You'd think someone in 1960s mainframe development would have had the same idea, but people back then were pretty stupid, so I'm sure my new idea is... new.

Hey guys, I got a new one. We could assign a noob to work with an old timer and see if the noob learns anything by osmosis. This has never been tried in all of human history so I'm gonna patent it and trademark it and I'm gonna be rich and buy a private island.

To be honest it's not as technical as you'd like to think... it's kinda like studying ancient fashion to predict what future fashion will look like, seeing as women's fashion is kinda cyclical. So, you're saying after skirts go down, they tend to go up, and vice versa? Holy cow, Batman! Especially when dealing with trendy high fashion like UI design or PR.

I'll beg to disagree with the idea that history is irrelevant to CS. Protocols and practices did not evolve in a vacuum. Knowledge of how early principles were derived, and why we've migrated to newer approaches, is critical to understanding ongoing changes in a field. Moore's law, for example, led us from extremely limited command line interfaces to today's sophisticated GUIs. But understanding the original command line interfaces is vital to seeing _why_ modern tools aren't all in XML with back-end databases.

In physics, we learn about older theories and the reasoning that led to them, then the new experiment that disproved them and what came next. Physics isn't just a body of current theories; it's a process with a history. Understanding the process is probably more important than understanding the current theories.

CS could stand a bit of that.

I would love to make "the history of software" a mandatory course for anyone who will ever be involved in software patents. While we're at it, a history of science and e

MacPaint and MacWrite were, I believe, the two programs that enabled the original Macintosh 128K to be accepted. They were easy to use and, while not as powerful as some DOS apps, were fully interoperable with one another. I was among the first to receive the Macs at Drexel in 1984. Our curriculum was based around these two apps. And, you know what? It worked. Eventually we received others - and the departments developed their own software as well.

Lightwave and the rest of the Video Toaster studio software was influential in that for the first time, you could have a quality video studio stuffed in a single computer. A lot of UHF and independent stations used 'em.