Well, happy 30th anniversary to them! PARC has given us far more than just the GUI, though that is what it is most notable for. PARC has churned out a lot of innovations, and I hope it continues as long as Xerox is willing to fund it (which is in their best interest, IMO; a lot of IP comes out of it).

Hmm... I thought the PERQ was the first machine to meet this challenge. I think the 3M challenge was put forward by CMU, and Three Rivers (the company that produced the PERQ) was made up mostly of people from CMU. The 3M challenge was supposed to describe what workstations would be like by the mid-80s; I think the Alto was the main inspiration for the PERQ, though...

The 3M challenge asked for a network of distributed workstations, each of which should be able to process 1 MIPS, hold 1 MB of RAM, and display at least a million pixels.

Have you ever used Windows 1.0? I managed to get it running in Virtual PC one day; it was nothing more than a glorified DOS shell with a calculator and a word processing app. The Lisa, on the other hand, actually did some useful things and had a somewhat graceful GUI; nicely shaded grays are much nicer than that 4-color CGA monstrosity that was Windows 1.0.

Actually, I remember using GEOS on my C64 around '85/'86, and unlike Windows 1.0, there were a few decent productivity apps for it. M$ isn't the only company guilty of stealing ideas; it's just that they're the only ones to consistently make bad implementations of what they stole.

Steve Wozniak wrote: Steve Jobs made the case to Xerox PARC execs directly that they had great technology but that Apple knew how to make it affordable enough to change the world. This was very open. In the end, Xerox got a large block of Apple stock for sharing the technology. That's not stealing outright.

Indeed. The common misconception is that Apple "stole" the concept from PARC, when it was more along the lines of: a friend of a PARC researcher (who happened to be a Mac team member) was invited, checked it out, told Jobs, and then Jobs gave Xerox millions in stock for a technology that Xerox thought was useless at best and a piece of shit at worst.

It's like if you had a shitty G.I. Joe missing an arm, a guy buys it from you for $10,000, and then fixes it up to near-mint and eBays it for $100,000.

"The Jobs visit had been infuriating enough, he says. He'd been out of town at the time, which was regrettable, "because if I'd been in town, I would have told him [Jobs] to get out. And if he hadn't, I would have beat the shit out of him. I had no respect for him. Then they [Xerox] would have fired me - and it would have been good for me and for them.""

From an interview with Bob Taylor (who used to be director of the Computer Science Lab at PARC) in M. Mitchell Waldrop's _The Dream Machine_. The exchange happened when Jobs "allowed" (the book makes it sound like this was a privilege) Xerox to buy $1.05 million of Apple stock in a private stock sale (Apple hadn't gone public yet), in exchange for which he would get unlimited access to research at PARC.

Actually, Apple had been planning the Lisa for over a year before Jobs's visit, bit-mapped screen and mouse included. The Apple people mainly wanted to look at Smalltalk (too bad Jobs didn't "steal" that). They weren't particularly impressed with the laser printer or Ethernet (Jobs was supposed to have hated networks with a passion).

The quote above was probably largely motivated by the (realized) fear that the microcomputer manufacturers would bastardize the idea of personal computing (the general view seems to have been that they were bright ignoramuses who completely ignored what the rest of the computer community was doing).

No offense, but Bob Taylor is not the most disinterested source to quote. If you read Dealers of Lightning, you'll get a better view of what was going on at PARC at the time. I've met a bunch of people at PARC at various points, and most understand that the biggest flaw was the disconnect between PARC's goals and Xerox's goals. PARC was very long-term and focused on innovation, whereas Xerox was very focused on what would help them next quarter.

Windows 1.0 in CGA mode was 640x200 black and white; if you had colors at all, it was running in 16-color EGA mode. It also came with Paint, and a very early version of Win 3.1's File Manager, which was the main way to launch apps. And let's not forget Reversi. :)

The Lisa was black and white, not grayscale. And yes, the Lisa 7/7 OS had a brilliant UI and was a much more robust OS than MacOS would be for years to come. The UNIX variant it ran was Xenix (not sure if Microsoft

There were a few valid productivity apps for Windows 1.0. Micrografx In*A*Vision was a pretty nice vector-based drawing program. It evolved into Designer, the techie's preferred alternative to the more flouncy CorelDraw.

Back in that day, Windows 1.0 pretty much had to be given away. Early Windows apps came bundled with a 'runtime' version of Windows that would be installed as part of the process of installing the app. This in effect made the Windows/app bundle into a temporary run-time Windows environment.

I think it was. I know I wanted one very, very badly at that time. Even today those screenshots look very usable. By usable I mean it had a GUI that is similar to what we all use today, and it had the all-purpose application of the day, AppleWorks, among others. It also had the ImageWriter (maybe even the v2 model). It also had a lot of games available, which isn't all that bad either.

I think it was. I know I wanted one very, very badly at that time. Even today those screenshots look very usable. By usable I mean it had a GUI that is similar to what we all use today, and it had the all-purpose application of the day, AppleWorks, among others.

I'm guessing that you're referring to the Lisa... if you were, AppleWorks didn't run on the Lisa. I don't doubt that there was a similar productivity suite for the Lisa, but AppleWorks was an Apple II package. It was derived from an earlier Apple III program.

I thought vi was long unsupported and had since been replaced by alternatives such as Vim [sourceforge.net] (my personal favorite), Nvi [bostic.com], Elvis [the-little...d-girl.org], and Vile [thehutt.org].

JOY: Bruce suggested that. At one point there was an acknowledgment section in the documentation for the editor that mentioned all the people who had helped - I don't know if it's still there in Volume 2.

A lot of the ideas for the screen editing mode were stolen from a Bravo manual I surreptitiously looked at and copied.

It has improved greatly. I only use :q!, :enew, dd, and :w, so my hands never leave the keyboard, and I use the menus and icons for everything else. I am by no means a cryptic command jockey. I find it a lot easier to use than emacs as well.

I prefer to focus on the core vi functionality and avoid any new non-standard bells and whistles. I have too many boxes here at home whose only connection to the outside is the Ethernet cable. The BSD OSes all include vi built in, and emacs only as a package. And at a job not long ago, even the OS/2 boxes, which all had telnet server daemons running on them, had a vi installed.

It's just nuts to use anything else. Bring up many editors in a remote shell and you just get a blank screen (the editor used

Read Dealers of Lightning for a very good look at what happened at Xerox PARC. It does a good blend of the management misfires and the politics, as well as providing a solid appreciation for what these guys did.

The section I found most interesting was the political battle over purchasing a research computer. After selecting the computer that was best suited for the job, they didn't get to buy it, and ended up building their own. A great story about how the pure researchers and deep thinkers both worked together with and battled against the engineers and the suits.

I once plugged a 1 MHz crystal oscillator into an AST 286 machine, which turned it into a 512 kHz 286.

I didn't have the patience to let it boot up all the way, though. I waited and waited and waited. Then I heard the floppy drive start going *bip* *bip* *bip* as it started the POST sequence of s-l-o-w-l-y stepping the head up to the top track and back to home. I said 'forget this' and put the original (12 MHz!) crystal back in.

Heh... doesn't it make you feel old when you recall the days of single- and double-digit MHz computers? :-)

Nah, you don't have to be that old to remember those. Especially double-digit MHz; plenty of 16-year-olds grew up with original Pentium machines.

Now, you gotta feel old if you recall the days when people actually measured clock speed in kHz, not MHz. I mean, I'm only 22 and used to use 1 MHz machines all the time when I was a kid. :P ...and slower!

If they (and the follow-on efforts, such as Apple's machines, Windows, etc.) hadn't created the GUI as we now have it, which in many ways is unchanged (overlapping windows, mouse, etc.), what kind of interface would we have?

I'm willing to accept it was a pretty good jump of thought to create the GUI on a bitmapped display after so much text-only human-computer interaction, but are there other ways of interfacing? Perhaps other GUI ideas that we don't see just because they weren't first, and hence now the most developed?

Well, any user interface starts out as some kind of metaphor. The dominant file system organization, for example, borrows the ideas of files and folders from simple paper filing systems. By the same token, the overlapping windows GUI is just a metaphor for a desk with a lot of papers on it. So your question really devolves into this one: what other good GUI metaphors are there? I can't think of any, but then I'm pretty bad at thinking visually.

Not quite offtopic: back in the late 70s, some workstation designers decided they could do an intuitive user interface without waiting for bitmap displays to become affordable. The result was the form-based user interface of the CTOS operating system [byte.com], which ran on special proprietary hardware [cs.uu.nl]. Of course, like most proprietary systems, it was driven from the marketplace by IBM compatibles. Too bad, really.

A "book" metaphor, where you move between tabbed "chapters" that represent either various tasks or various stages of work.

A "deep box" metaphor, where you have various objects in a 2D+1 space, with the closer objects getting higher priority.

The interesting part is, modern GUIs integrate both the "book" and "channel" metaphors alongside the "papers on a desk" metaphor. I certainly know that I don't use overlapping windows for anything but file sorting; every program I run (excepting IM and Winamp) is maximized, and I switch between tasks with the fundamental Windows keyboard command, Alt+Tab.

Personally, I'm eagerly awaiting a better file system metaphor. Toss the "files and folders" lie, skip the "everything is a file" concept, and hop right into "Hard Drive is a database."

The BeOS file system has a lot of interesting features (I especially like the built-in file typing), but it's hardly a database metaphor. Perhaps you're thinking of the fact that the file system is journalling, a concept you usually associate with databases. But any serious file system has journalling these days.

Never heard of XP2FS. There's the open-source flight simulator, but I don't suppose that's what you meant.

Paging system? Whatever. Let's not trot out the tired old "it works best for me, so anybody

There are a lot of inexperienced users who run every program maximized; at least, such is my experience doing support with a few different groups. Sometimes these people get freaked out when you un-maximize something and proceed to drag data from one window to another. Bound to freak 'em out every time. :)

Good examples. They're not deeply integrated into popular GUIs, but you're right, people do use them.

Also, "channel metaphor" would seem to describe the virtual consoles on most Linux and a lot of Unix systems. I know text-mode diehards who insist that virtual consoles are more practical than any GUI.

One of the big design mistakes in early Windows was not making the book metaphor (I prefer to think of it as tabs that access specific windows in an app) a basic feature of the GUI. Instead, it provides tha

Yay... CTOS... When I worked for the State of Michigan two years ago as network support, they were JUST phasing those systems out for storing data for Child Protective Services... man, that interface sucked.

Sadly, the prophets at PARC were without honor in their own company, so much so that it became a standard joke to describe PARC as a place that specialized in developing brilliant ideas for everyone else.

There was a neat little DOS program that once came with a Logitech mouse called "popdos". It looked very similar to Windows 1.0. The interesting part is that popdos originated from the same place as OpenOffice.

And, let's not forget a TRUE genius and pioneer, Doug Engelbart [ibiblio.org]. He predated the Alto. This guy is what engineering and technology is all about. Not the bunch of clueless kids (and women!) that are sucked into the indoctrination of universities these days....

Ah, my kingdom for a time machine to travel back to the 1960s. Men were men, electrical engineers actually liked electronics way before they went to school and there was no fooling around!

I wrote an <a href="http://www.macedition.com/soup/hotsoup_20020711.php">article</a> on this very topic last summer. In addition to the GUI, the Alto is also largely responsible for the concept of a technical workstation... Sun and SGI both were born on the campus of Stanford University, one of the places where there were plenty of Altos for students to play with.

Those of us who run UNIX on machines like my Toshiba 486 laptop sorta resent you putting down FVWM. It works really well. It's disappointing that big fsking aircraft-carrier bloatware desktops seem to be the de facto standard now.

The only collectible systems that fetch those kinds of prices are Apple I's, as far as I know. Though it wouldn't surprise me to see it get $2000-$3000, I guess; lord knows any of the old IMSAI stuff can get that on eBay.

The last Alto that a friend sold went for $5K about three years ago. Even though the economy has tanked since then, I seriously doubt that an Alto would sell for any less than that now. Although there were more Altos made than Apple Is, there may be fewer Altos left in existence. It was easier for someone with an Apple I to store it in their garage or basement. Also, most Altos were not in private hands, so when they were no longer needed they got scrapped.

A Star would suit me just fine, I think. Thanks for the extra keyword (daybreak); I'm incorporating that into my eBay search as we speak.

As for Altos being rarer than an Apple I, that would mean fewer than 150. I can't think of any computer system that would be rarer, off the top of my head. Shame how corporate disposal policies are killing all sorts of historical computers. :(

Software is always the bitch though, isn't it? An acquaintance of mine has a Cray supercomputer, I think it's going on 2 years now...

As for Altos being rarer than an Apple I, that would mean fewer than 150.

I don't mean rarer in the sense that fewer were made. Several thousand Altos were made, vs. perhaps 200 Apple Is.
But I think there are fewer surviving Altos than surviving Apple Is. There are believed to be under 20 Apple Is left.

Outside shed, climate controlled with a nice big concrete slab floor. I have a few 220v receptacles and such, I store most of my mini-computers out there.

Yes, I do want one. I wouldn't butcher it and see if I could put an Athlon motherboard in it, or any of the other bullshit you see people doing with treasures like this. I'm willing to do the research to restore it, if it's not in working condition. And if it can be networked to a modern computer, I will do so... maybe even letting some respectful individuals

"The concepts prototyped in the Xerox Alto contributed to the development of the Xerox Star, the Apple Lisa, the Apple Macintosh and Microsoft Windows 1.0."

I believe the pedigree should read: "The Xerox Alto and Star pioneered the GUI and mouse navigation in 1973 and 1981. These elements of the operating system, while brought to the business mainstream by the Apple Lisa in 1983 (one year behind schedule), were brought to the common PC user in 1984 with the Macintosh."

Including Windows 1.0 in this company is a joke as Windows 1.0 was nothing more than a shell and not a true OS. In fact, it could be argued that Windows was a shell with DOS being the real OS up until Windows 98.

Even Win98 still had the DOS backbone; I'd say WinXP was the first "home use" Windows that wasn't a DOS shell. Although I know a lot of people not into computers at all who use Win2k, so I guess a line can be drawn somewhere in the NT line.

I wouldn't say that MacOS was really an 'os' anymore than the Windows 1.0 shell running on top of MS-DOS.

Oh? And why not? I would be interested in what your definition of an OS is. It is true that the Classic MacOS (MacOS through System 9.2.2) had some serious problems in terms of its architecture compared with other operating systems (UNIX based), but it most certainly WAS an operating system inclusive of its GUI which was not simply a shell running on top of the OS.

I was responding to your comment: I wouldn't say that MacOS was really an 'os' anymore than the Windows 1.0 shell running on top of MS-DOS. My question still stands, and my point was that the MacOS was very different from Windows 1.0 running on DOS.

What difference does it make if it's a shell running on top of an OS, or an OS that has the shell embedded in it? Either one is an OS, and MacOS (before they gave up and just bought NeXTSTEP, the same way Microsoft bought the first version of MS-DOS, from an

It could be argued that having a GUI interface running on top of the operating system is much less efficient than having the GUI as a fundamental part of the OS.

Or the reverse could be argued. Lots of people here who are bigtime Linux/Unix advocates have made the case that one of the big problems with Windows NT is that the GUI is built in, whereas with Linux/Unix the GUI is separate and not even necessary to the functionality of the whole. When Microsoft went from NT 3.51 to NT 4.0, one of the bad things they did was integrate the graphics into the NT kernel.

have made the case that one of the big problems with Windows NT is that the GUI is built in, whereas with Linux/Unix the GUI is separate and not even necessary to the functionality of the whole. When Microsoft went from NT 3.51 to NT 4.0, one of the bad things they did was integrate the graphics into the NT kernel, which reduced reliability considerably and sabotaged the microkernel design.

Including Windows 1.0 in this company is a joke as Windows 1.0 was nothing more than a shell and not a true OS. In fact, it could be argued that Windows was a shell with DOS being the real OS up until Windows 98.

?

Don't you mean XP/NT (depending on when you move "the OS" away from 9x), or Win95?

All that Win98 did over 95 was IE integration and some small tweaks. DOS was still there, still built in, and still a vital part of the OS through ME.

When I worked at Xerox (not PARC) in the 80s, we had an Alto lab with a dozen or so Altos. They were so cool. Besides all the visible features, what really made them kick was that they had programmable microcode. So you could code up a new high-level instruction set and build your own language. This was how the Smalltalk-72 VM was implemented. They also had removable hard disk platters. Something the size of a pizza that held about 2.5MB. And besides the 3-button mouse, they had a 5-key chord keyboard - right hand mousing, left hand chording, it was a surprisingly fast way to edit.

The other totally fun thing about the Altos was they supported network games. My favorite was Mazewars. This was almost certainly the first multiplayer FPS game in the world. Everyone played an identical looking eyeball. You zipped around a maze and shot each other (with withering glares, I guess). But you really needed to be good on the chord keyset to win.

The thing I love about Xerox is that it reminds us all that Windows didn't just rip off Apple; they ripped off Apple, who was ripping off Xerox. It's interesting to think about what it would have been like if Xerox had been in control of the computer market, since they had everything that we use today, and gave it away when they could have sold it.

You are so dumb... Jobs PAID Xerox for a tour of their research labs. That gave him all he needed to get the idea for the Lisa... Of course, Jobs screwed up and allowed Bill Gates in to see the Lisa before it was released, because he wanted MS to develop some software for the system, and Gates said he needed to see what he would be making the software for. Next thing you know, MS has decided not to take Apple up on its offer, and MS went to their mother (IBM) with the great idea for a new GUI-based OS, later to

Actually, he was offered the tour, and allowed the rights to use the idea of GUI. He never paid rights to use the idea. I'm not sure about paying for the tour.

IBM requested a GUI OS and then allowed MS to use the concepts behind it, the same as they had allowed them to market MS-DOS, as compared to IBM's PC-DOS. And Windows 2.0 was the first to properly implement the GUI idea conceived for OS/2.

Excel was made to compete with Lotus 1-2-3, and could be used on a lot of different platforms, I believe, but the Mac was its primary platform. It was before the OS/2 fiasco, but it was the main justification behind computers entering the business world for day-to-day use.

Try googling "Xerox consent decree" and you will discover that Xerox neither mistakenly gave away nor generously gave up their technology: they were forced by the government. That's government as in of the people, by the people, for the people. Too bad we gave up on that form of government in the US.
The public domain has to be taken by force; it always has been and always will be. There is no room for charity in monopoly plans.

In the early 1980's, I worked for a software spin-off of an engineering company that was going down the tubes rapidly. One Friday I went to work to find:
1) A very polite policeman at the door.
2) No electricity.
3) No management people.
4) Confused employees.
5) An envelope at my desk with a check for 1/2 of my pay.
6) On the memo line, it read: "WYSIWYG"
7) ...
8) No profit.

Last year I did some work on Altogether [brouhaha.com], a microcode-level Alto simulator. It does not yet include simulation of the disk or 3 Mbps Ethernet hardware, which will be necessary in order to boot useful Alto software.

Because almost all of the interesting Alto software used the writable control store, it is important to simulate the Alto at the microcode level. The Alto used horizontal microcode, so several operations are done in each clock cycle, which IIRC was 170 ns. On an Athlon XP 1900+, the CPU simulation runs at about 1/4 real time. In order to obtain better performance, it will be necessary to do quite a bit of optimization, possibly including binary translation of the microcode into native host code.
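To make the "several operations per clock cycle" point concrete, the core of a horizontal-microcode interpreter is a loop that applies every control field of the current microword within the same simulated cycle. Here's a minimal sketch in C; the field layout, register names, and operations are invented for illustration and bear no resemblance to the Alto's actual microinstruction format:

```c
#include <stdint.h>

/* Hypothetical 3-field horizontal microword: each field drives an
 * independent unit, and all of them take effect in the same cycle. */
typedef struct {
    uint8_t  alu_op;  /* 0 = nop, 1 = add, 2 = and */
    uint8_t  load_t;  /* nonzero: latch the ALU result into T */
    uint16_t next;    /* address of the next microinstruction */
} microword;

typedef struct {
    uint16_t l, t;    /* two working registers (greatly simplified) */
    uint16_t mpc;     /* micro program counter */
} cpu_state;

/* Execute one microcycle: every field of the word acts "in parallel",
 * so all inputs are read before any state is written back. */
static void step(cpu_state *s, const microword *ucode)
{
    const microword *w = &ucode[s->mpc];
    uint16_t alu = s->l;
    switch (w->alu_op) {
    case 1: alu = (uint16_t)(s->l + s->t); break;
    case 2: alu = s->l & s->t;             break;
    }
    if (w->load_t)
        s->t = alu;
    s->mpc = w->next;
}
```

The binary translation mentioned above would amount to replacing this per-cycle `switch` dispatch with host-native code generated once per microword, which is where most of the hoped-for speedup would come from.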

There's no packaged release of the Altogether code, but it can be checked out from CVS.

Bob Taylor headed the labs at PARC in those days. They say that at its height he had 76 of the top 100 computer people in the country working for him. His management technique was simple: just bring a lot of brilliant people together and give them enough money and time to carry out whatever research they wanted. And they came up with the mouse, bitmapped screens, and Ethernet. Douglas Engelbart worked there and was (is) one of the great unsung heroes of the multimedia revolution.

Steve Jobs has said that, at the time he visited PARC, they demoed three technologies for him: OO-programming, graphical user interfaces, and LANs.

He said that he was so blown away by just one of the techs (the GUI, of course), that the potential of the other two were completely lost on him.

It boggles my mind how far ahead of the curve the PARC guys were. Imagine going to a demo session and having the demonstrators show you a working quantum computing laptop running from a fuel-cell with a virtual holographic 360-degree 3-D display. It must have been something like that... where each advancement is so groundbreaking that you can only absorb one of them in a sitting.

While many Xerox engineers, and even more techies outside of the company, were sad to see Xerox discontinue GUI efforts beyond the Alto and Star, this was the full intention of the company's executives. At the time, Xerox was a copy machine company; the powers that be had no interest in making any sort of computer. In return for information, cooperation, and to somewhat return the favor, Apple gave Xerox a large amount of Apple stock. Apple didn't "buy" the GUI from Xerox, and neither did they "steal" it. About the only thing they "stole" were some engineers who moved to Apple to continue GUI work (Apple's former chief scientist, Larry Tesler, for example).

The early Lisa and Macintosh machines were less powerful than the last-generation Xerox machines, but had better software support. The Xerox had several impressive demos, but most were incomplete. By the mid-to-late 1980s, the Macintosh had MacWrite, MacPaint, MacDraw, HyperCard, several PostScript-based illustration and DTP applications, and the very first GUI versions of MS Word and Excel.

Search the web for Apple/Xerox myths, you'll find the real story from several credible sources, including Steve Wozniak (Apple co-founder) who was still with the company at the time. www.woz.org may be a good start.

If it makes you feel any better, you may want to think of Apple as getting a taste of their own medicine with the Newton project. Like Xerox, which pioneered a new area of computing but allowed other companies to mass-market smaller/cheaper models, Apple left the PDA market just as it began to take off. The Newtons were impressive technology demos, but were large and expensive and still had some quirks. Two years after Apple discontinued the Newton, everyone had a Palm.

We must always remember this story is written by John Markoff, whose career is based in part on a set of half-truths about Kevin Mitnick (who is by no means a saint) and other spin-based technology reporting. Some of the dotcom frenzy could have been moderated if he'd reported truth instead of illusion from his bully pulpit.

Given the previous mis-reporting (and I was around in the early 70s), I take issue with any one person or organization getting 'credit' for personal computing. It was time, in the industry, to do this. Already, in the back of Scientific American, half a dozen companies were advertising minicomputers targeted at a single researcher. I was on PDP-8s and soon thereafter PDP-11s, which were mostly being used to support single people.

Alan Kay should get credit for bringing to prominence the windowing environments that most of us now use.

Just published: "Open Innovation" by Hank Chesbrough, $24.50 on Amazon at http://www.amazon.com/exec/obidos/ASIN/1578518377/ . It describes what PARC was looking for in its research, the many spin-offs that we've heard of, and proposes a post-PARC theory for tech R&D funding / thinking with research from Intel, IBM, Lucent and others. I've posted a full review at http://www.mironov.com/pb/mar03.html. Strongly recommended!

While this is a little off topic, I'd just like to point out that Steve Jobs saw all that stuff at PARC because his people took him there to see it. They'd already seen it all, in fact some of them came to Apple from PARC. The reason for the effort? Because in order to get Steve Jobs to go along with a good idea it is necessary to make him think he came up with it himself.

The AC has a point. I worked with a guy who used to be a Xerox Star salesman. He said that they used to arrange that the machines be booted ahead of time. If they crashed for any reason, you'd just walk out, because once the customer saw how long they took to boot you'd never sell one.

The Alto and Star had a number of dubious design decisions that led to the incredibly long boot time and low reliability. One was that the filesystem implemented disk allocation via a simple linked list; no file allocation table.
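To make the linked-list point concrete, here's a toy sketch (purely illustrative, not the Alto's actual on-disk layout): with no allocation table, reaching block N of a file costs one disk read per preceding block, just to discover where the next one lives.

```python
# Toy disk: maps block index -> (payload, index of next block in the file).
# Pure linked-list allocation keeps the "next" pointer inside each block,
# so there is no central table to consult.

def read_nth_block(disk, first_block, n):
    """Follow the chain n links from the file's first block: O(n) reads."""
    idx = first_block
    for _ in range(n):
        _, idx = disk[idx]  # a full block read just to learn the next index
        if idx is None:
            raise IndexError("file is shorter than n blocks")
    payload, _ = disk[idx]
    return payload
```

A file allocation table inverts this: the chain lives in one compact structure that can be cached in memory, so the same lookup needs only a single data-block read.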

When I was in kindergarten, my father bought a Lisa for his office. I'm now married with kids, so that gives you some idea of how long ago this was. I remember playing with it when I was a kid.

To this day, my father claims the Lisa was the best machine he has ever used. All the applications were completely integrated in a way that DOS and even Windows apps weren't for many years. You could draw up a diagram in the paint program and paste it into the word processing program easily. It was so solid that, AFA

People forget all too quickly that a machine that takes 10 minutes to boot is a worthless piece of crap.

Well, that depends on how often you boot it, doesn't it? At the time, Lisp machines took a long time to boot, but they stayed up for months at a time. Altos in use as file servers had similar uptimes. You must have had to boot your Lisa a lot if their time to boot was a big concern.

One reason Altos and their kin took a long time to boot was the multiple layers in the OS: boot loaders that load microcode