Posted
by
timothy
on Sunday April 03, 2011 @03:07PM
from the how-far-we've-come-or-not dept.

autospa points out a post (with video) showing off the multi-tasking abilities of the Blit terminal, developed in 1982 by Rob Pike and Bart Locanthi. Before Windows, before X, and before the Mac (but somewhat later than the Xerox Alto), the Blit terminal provided a multitasking, mouse-driven graphical interface; it took a Unix server on the other side to do the heavy lifting, though.

It wasn't useless and did multitask. True, it was via special applications referred to as 'accessories'. However, if you used a wedge you could stick any application in as an accessory, and as long as it didn't need to write to the screen to keep running while backgrounded, it worked rather well.

[The Atari ST's TOS/GEM] wasn't useless and did multitask. True, it was via special applications referred to as 'accessories'. However, if you used a wedge you could stick any application in as an accessory, and as long as it didn't need to write to the screen to keep running while backgrounded, it worked rather well.

Let's put this in context. That stretched, limited, and somewhat kludgey version of "multitasking" might sound passable compared to MS-DOS-based PCs of the same era. Not that big a feat, given that mid-80s PCs were running MS-DOS, an early-1980s ripoff, er... *port* of CP/M, the 1970s 8-bit-microcomputer-era OS.

However, the ST's main rival, the Commodore Amiga (which hit the streets at almost exactly the same time as the ST: mid-1985, not 1984 as you state) featured full pre-emptive multitasking as a standard part of the operating system. No silly restrictions or workarounds for what was basically a single-tasking OS required, because multitasking was an integral part of the OS. You simply ran two or more programs at once and they worked, period.

And this was "proper" pre-emptive multitasking, not the more primitive co-operative multitasking (which relied on well-written programs yielding control themselves) that even Windows 3.x was still using in the early 1990s.
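The distinction is easy to sketch. Under co-operative scheduling, the only context switches happen where a program explicitly yields; a toy round-robin scheduler (hypothetical names, Python generators standing in for programs) makes the weakness obvious, since one task that never yields would starve everything else:

```python
# Toy co-operative scheduler: each "program" is a generator that must
# voluntarily yield control, much as Windows 3.x apps had to keep calling
# back into the system. A task that never yields hangs the whole machine.

def task(name, steps, log):
    for i in range(steps):
        log.append(f"{name}{i}")   # do one slice of work...
        yield                      # ...then voluntarily hand back the CPU

def run(tasks):
    ready = list(tasks)
    while ready:
        current = ready.pop(0)
        try:
            next(current)            # run until the task yields
            ready.append(current)    # then send it to the back of the queue
        except StopIteration:
            pass                     # task finished; drop it

log = []
run([task("A", 2, log), task("B", 3, log)])
print(log)  # ['A0', 'B0', 'A1', 'B1', 'B2'] -- work interleaves only at yields
```

A pre-emptive scheduler like the Amiga's, by contrast, interrupts tasks on a timer whether they co-operate or not.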

Thing is, although the Amiga was generally a more advanced computer than the ST, it had the same basic CPU (the 68000) running at a similar (actually, slightly slower) speed, and to the best of my knowledge its multitasking (and other aspects of the OS) weren't reliant on the Amiga's custom hardware. So I'm pretty sure the 68000-based STs *could* have run a more advanced multitasking OS in theory, even a port of the one the Amiga had(?!)

But the fact was that they didn't, at least not back then, and the "multitasking" you describe was at best a restricted hack that clearly *wasn't* the best that could be done at the time.

Well, I never claimed that the Amiga was perfect by modern standards (we're talking 25 years ago, after all!), only that it was miles ahead of the ST and DOS.

If only it were 25 years ago. One could understand why the Amiga never got virtual memory in its original 68000 form, but when the 68030 and 68040 models turned up with hard drives and high-end specs and prices, they damned well should have. AmigaOS was a very advanced OS when it first appeared, but it squandered its advantage. By the end of its life, the lack of VM and the graphics were major impediments to the platform.

Operating systems like Windows and Mac OS were still stuck in cooperative multitasking, and both had limitations (e.g. all the GDI segment shit in Windows and dirty addresses in Mac OS), but they still managed to get virtual memory on MMU-equipped processors while the Amiga was still extant. It makes one wonder what could have been if Commodore had been a bit more competent at selling their machine and pushing features that kept it ahead of its rivals.

The ST did eventually get multitasking but way too late to make a difference. MiNT [wikipedia.org] was an alternative kernel which allowed multitasking and AES was a multitasking version of GEM. The Atari Falcon offered them up as MultiTOS. By then it really didn't matter much though. Anyone interested in multitasking probably owned an Amiga anyway, and both computer families were ultimately doomed by the incompetence of their respective parent companies.

Multi-tasking certainly existed on the server, but you had a hard time seeing multiple things on your terminal screen. The BLIT allowed you to have multiple active windows open and see stuff going on in all of them. It was such a nice interface that many of us wondered why people got even a little excited about Windows on a PC.

It wasn't that the developers didn't know how to make a multi-tasking environment; it was getting the horsepower to do it. At the time, a dumb (non-multi-tasking) b&w terminal cost over a thousand dollars in 1980 money. The 1983 Lisa cost almost $10k, and Apple engineers were scrambling to get all the parts to fit and be as affordable as possible. Getting Windows on a PC in the late '80s was a big thing, because for $2k you could get a full computer that could use the advanced features too.

For multi-tasking I have seen 3 approaches:
1. Full screen, with a switch to the active session (like hitting Alt-Ctrl-F1, F2... in Linux, or multi-tasking in iOS)
2. Frames, where the applications are split across multiple frames (DESQview, Plan 9)
3. The movable window (Windows, Mac OS X, X Windows)

Computer data has always been represented in terms of rectangular shapes. So the rectangular subset of UI data makes for the most useful representation of data, and the movable window can be set up to emulate frames or full screen.

Much of what makes modern computers more similar to the Mac than to the Alto comes down to two things: the menu bar, and the fact that you can do everything with the left mouse button; and other systems were adopting those features at the same time. Apple was a little quicker to market, but they weren't really ahead of their time.

Nope, the original ones had a Motorola 68000, so forget *BSD or Linux. The WE32000 used in some of the later models had demand paging and was the basis for some Unix systems. But then even later models of the Blit terminal went back to the 68000.

It certainly could, if you could get a hold of the protocol details and wrote a Linux application that used it.

It was a terminal, not a full user-oriented operating system or workstation. Very similar to X Windows but predating it. It was quite usable over a normal serial link or even a modem, since it used a lot less bandwidth than X. Your applications ran on the server, as this was before the days when everyone got their own Unix machine on their desk.

It ran a protocol called Layers. About 10 years ago, I came across a later version of the BLIT, an AT&T 610, in a back corner of a testing lab at the office where I was working at the time. Being curious, I did some searching and found C source for a user-space Layers driver. Basically, it worked like the screen utility works, except that the "driver" simply multiplexed the normal tty I/O over a serial link (which could be a com port, TCP or other) to the terminal, which then de-multiplexed the streams into separate windows on its display. It also had some small capability to draw shapes from commands sent to it. I never got that feature working, just the equivalent of multiple xterm windows.
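The multiplexing idea is simple enough to sketch. This is not the real Layers wire format (the frame layout and names here are invented for illustration); it just shows the channel-tagged framing described above: each window's tty stream is chopped into frames tagged with a channel number, interleaved over one link, and reassembled per window on the far end.

```python
# Hypothetical sketch of Layers-style multiplexing. Frame format
# (invented): one channel byte, one length byte, then the payload.
# The host interleaves frames over a single serial link; the terminal
# routes each frame to the window named by its channel byte.

def mux(channel: int, payload: bytes) -> bytes:
    """Wrap one chunk of a window's output in a frame."""
    assert 0 <= channel < 256 and len(payload) < 256
    return bytes([channel, len(payload)]) + payload

def demux(stream: bytes) -> dict:
    """Split an interleaved byte stream back into per-window streams."""
    windows = {}
    i = 0
    while i < len(stream):
        channel, length = stream[i], stream[i + 1]
        windows[channel] = windows.get(channel, b"") + stream[i + 2 : i + 2 + length]
        i += 2 + length
    return windows

# Two "windows" sharing one link:
link = mux(0, b"ls\n") + mux(1, b"make\n") + mux(0, b"README src\n")
print(demux(link))  # {0: b'ls\nREADME src\n', 1: b'make\n'}
```

The same trick is what screen and tmux do today in user space, with the de-multiplexing happening on the same machine instead of in terminal firmware.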

While I suppose a simple protocol like that could be useful for people who use remote shell access, I think it's easier to just run SSH in a bunch of xterm windows, leaving the multiplexing to TCP.

Back when a VT terminal was my only machine I could have four sessions at a time. I think it was a feature of the terminal server, backed up with keystroke support on the terminal for changing sessions. I suppose the VT terminal could have supported the sessions as overlapping windows, and sent the change session command when focus changed. But that would have undermined the sales of VAXStations, etc.

I don't care so much whether it was the first or not. It was still cool for its time. In my mind, this one being among the first was still quite an achievement, because if you think about it, not much has changed since then. It really hasn't. Sure, the boxes are faster today, and the applications more sophisticated, but the basics of multitasking are more or less the same today. We stand on the shoulders of giants.

On another note, I like the look of the portrait oriented monitor. It looks to be so much better suited to documents, and probably coding, than the mostly landscape orientations that came later.

I have a large 4:3 that can pivot 90 degrees; great for actual productive work, but DirectX doesn't support it and the driver is so crap that it crashes the OS about 1 in every 4 rotation switches. These days monitors are just wide, which is useless for anything besides movies and games. I would love a square monitor as a "best of both worlds". Sadly these don't exist.

Question: Why not just buy two cheap 16:9 widescreens and use a stand to mount them on top of each other? Even cheap graphics cards nowadays support multi-monitor, and the price of two bog-standard 19-inch wides would be cheaper than finding a decent 4:3 monitor now anyway. That wouldn't give you perfectly square, but it would certainly give you more height.

That said, you CAN still get square monitors, you know; they just don't come any bigger than 19 inches, at least I couldn't find any. So why not keep the wide a

Question: Why not just buy two cheap 16:9 widescreens and use a stand to mount them on top of each other?

Because I don't want a big seam in the middle. If I were using the screens to display independent documents, then perhaps that would be a solution, but when working on single documents (i.e. photos), the seam is a big no-no.

Then what about my other solution, which is to use a 19-inch 4:3 for your document and picture work, and then use the widescreen for entertainment? Again, this would be MUCH cheaper than trying to find a big-ass CRT that still works well and has decent resolution, and every GPU except Intel's has supported multi-monitor for at least a couple of years now.

The drivers are solid, Windows supports dual screens quite nicely (especially Windows 7) and you don't have to deal with rotation bugs, which frankly so few p

Yeah, but it sucks that most TN screens look horrible when tilted. Color reproduction seems to be even worse (so I cannot see the highlighted text any more, unless I use a hard pink color or such). And 1080x1920 is awkward as well, that's just a bit too much vertical space. I've now bought 2 LP2065 screens for my coding needs at home.

On another note, I like the look of the portrait oriented monitor. It looks to be so much better suited to documents, and probably coding, than the mostly landscape orientations that came later.

I suspect you can blame the early cinema pioneers for that... they decided on a "landscape" format for movies, which then became the standard for television sets. In the 80s, most home computers (Sinclair Spectrum, Commodore 64, Amstrad 64 and even the Atari ST & Amiga) used the TV as a monitor, so a generation of kids grew up assuming monitors must be in landscape layout.

I've used one of these, and it was kind of nice. The drawback compared to X Windows was that it was not very standardized or common. You could use X on many different Unix machines, whereas the Blit only ran stuff programmed for AT&T servers.

There has been a fundamental change since those days though. Back then it was considered somewhat silly to have a full workstation on every desk. Those workstations that did exist were very expensive and usually intended for specific engineering tasks (such as C

"Now consider the 'cloud' push, and concepts like Google's ChromeOS. The web browser is becoming the modern-day equivalent of an X terminal, in a sense."

The HTTP and HTML page request/response mechanism has been compared, with a fair degree of accuracy, to the IBM 3270 terminal system. The web server is the mainframe, the big unseen untouchable system where all the data lives. The terminal/browser stores no data.

"There is nothing new under the sun." (from Ecclesiastes 1:9-14, reportedly written about 2250 years ago)

It makes you wonder how the software industry would look right now if that project had become a competitor to, or a replacement for, Windows. Just asking: exactly how much did we lose because of the MS monopoly?

Bill Gates wanted a computer on every desk, and like any populist endeavour, it's going to piss off those who think they are superior to the common masses.

The market was penny-wise and pound-foolish; the idiotic software and window system architecture of Windows and Macintosh meant that companies ended up spending enormous amounts of money on rewriting their software again and again.

...And the A500 would do it while rendering a 3D scene of an ST being shat upon and playing a topical tune. The ST, on the other hand, would have returned the favour if it weren't busy "multitasking" something else.

I loved my STs, but let's be realistic here. TOS was a singletasking operating system. The first real multitasking OS on the ST was probably MiNT, which was for a long time really an "experts only" option. Multitasking on the ST line that was usable by the masses didn't really exist until MultiTOS, which was, what, 1992?

I was definitely an ST fanboy back in the day, but you've got to admit, the Amiga was simply a better system.

OS-9 provided true multitasking for microcomputers in 1979. It was a standard option for the Tandy Color Computer starting in '80 or '81. These Radio Shack computers were available and affordable for "Joe Sixpack"... though most instances of Joe didn't seem very interested at the time. Amiga's ability to provide multitasking 5 years later may have had more to do with marketing and the public's receptiveness to computers in general than any technical feat.

True multitasking, yes - on an eight bit computer, which is outstanding. But multitasking is not very useful on a personal computer without a multitasking GUI, and those required a lot more resources than were available on affordable computers in the early 80s.

The Amiga 1000 shipped with 256K of RAM, but it was more or less a toy without 512K. A 16/32-bit MC68000 processor, multitasking kernel and GUI, multichannel digitally sampled sound, and scads of custom hardware support made it a much more attractive

A couple points.. first, OS9 did have a GUI on the Tandy Color Computer, not sure if there was a GUI on the other 6809/OS9 platforms. It used the joystick to move the cursor, and I think they even sold a mouse at one point. However, this GUI was fairly crap, wasted precious RAM (many OS9 systems had 64k or even less) and few people wanted much to do with it. More often, people used the "windowing" system which let you define several areas on the text screen as independent I/O devices, or have multiple v

Of course innovations do not come sequentially, but in clusters as technology matures. In this case we have an evolution and application of the WIMP interface as hardware got cheaper and software techniques developed. So in 1982 this terminal application was developed. In 1983 Apple introduced the Lisa, a personal computer with multitasking, protected memory and a GUI. In 1984 came the Mac, which simplified the OS, just like MS-DOS, to the needs of the emerging PC user. Then the Amiga came in 1985.

The Amiga OS had preemptive multitasking with a "proper" task scheduler from the very beginning. It was designed for it. Perhaps you are thinking of the Mac, to which cooperative multitasking was added via a clever hack about three years after release.

If you bought OS9 shortly after it was released for the CoCo, you'd have been multitasking for five years before the Amiga came out. Could have been six or seven if you'd had a GIMIX or another of the first systems that ran OS9.

I sold both Amiga and Atari ST computers while working through college in the '80s (I still have the first model of the ST, but sadly not the Amiga - It was more popular). The original ST only did "cooperative" multitasking, while the Amiga used preemptive multitasking. However, Amiga had no memory protection so it was prone to a lot of "Guru Meditations" (especially with the early versions of Exec used with the 1000).

The first preemptive multitasking in the "home" was provided by OS-9 that ran on the Colo

The original ST didn't multitask at all. Neither did the original Mac - unless you count desk accessories. Cooperative multitasking wasn't standard on the Mac until System 7 in 1991, although it was available as an option after 1987. Pre-emptive multitasking didn't come to the Mac until 1999.

The Atari ST was nearly obsolete by the time Atari started supporting multitasking, in 1993. The Amiga had a "real" pre-emptive multitasking operating system on rele

Come to think of it you may be right, it's been 26 years so cut me some slack. The more I think about it, I believe only the foreground application actually ran while the background applications just waited. I don't remember a 4 window limit, that seems low.

I'll have to dig the machine out of the garage and boot it up. I doubt the boot disk is still good, but I'm sure there is an image available somewhere.

No problem. I did some limited application programming on the ST and I am pretty sure TOS didn't do multitasking at all (preemptive, cooperative, or otherwise) until 1993 or so.

The ST had decent hardware though, much nicer than most of the Macs at the time, and in some respects better than the Amiga. Better (if smaller) monitors, better (non-interlaced) hi-res monochrome than the Amiga had for several years, more memory standard, and of course built in MIDI ports. Hard drives were more common on the Atar

A blit sounds nearly as capable as a BBN Bitgraph, which in 1982 had a 68000, a bunch of RAM, a mouse, a portrait display (I don't remember the resolution), bitmapped graphics and a windowing system. Nostalgia runs so deep for the BitGraph that it's still supported by gnuplot, dvi drivers, ghostview...

Not that many people ever got a chance to use a blit, but bitgraphs were workhorses of their day. It was hard to get some people to trade them in for Sun 3's.

Well, there's no way to sugar-coat it. For a long time, Rob Pike didn't understand that someday non-geeky people outside of Bell Labs might be interested in using UNIX and things like the BLIT, and the result has been that he designed systems that are, for lack of better words, idiosyncratic and quirky. He takes interesting ideas and inadvertently wraps them in unmarketable UIs. He is intolerant of imperfection, including imperfect users, which includes anyone who uses Windows, X11, the mac, or still us

I had one of these on my desk while it was being tested in the Labs before the commercial product was released. When you got up after an afternoon staring at all that monochrome green, the rest of the world looked slightly pink:^) Test users had to provide regular feedback. One consequence was that every few weeks they came around and replaced the keyboard with an improved version. The last one was easily the best programmer's keyboard I ever used: all keys in the right place, wonderful touch.

... were just the next logical step beyond simple ASCII terminals, and before a decent graphics protocol (X, etc.). For some background/data on how those things fit together, see here: http://www.feyrer.de/NetBSD/ttys.html [feyrer.de]

OS-9 is a family of real-time, process-based, multitasking, multi-user, Unix-like operating systems, developed in the 1980s, originally by Microware Systems Corporation for the Motorola 6809 microprocessor. It is currently owned by RadiSys Corporation.

OS9 was a wickedly cool operating system, which could multitask surprisingly well on the Motorola 6809. While never quite what you might call "mainstream", it was popular with some hobbyists. A friend of mine showed off his TRS-80 running OS9 once, and I was suitably impressed, even though I had an Amiga and an 80386 PC running OS/2, both of which were obviously more advanced. It was a very powerful, sleek system that probably should have caught on more than it did.

Of course, there was also GEOS, the Amiga OS, the Atari ST, and OS/2, but those came a bit later than OS9 (which dates back to 1979!). I still have fond memories of my Amiga, the massive flamewars of Amiga vs Atari, and the poor Apple fanboys with their black and white OS that barely even multitasked.

"poor Apple fanboys with their black and white OS that barely even multitasked..."

Tsk tsk, now don't hurt their poor little feelings. Remember, Apple eventually did acquire some BSD Unix [wikipedia.org] code from NeXT [wikipedia.org] at a going-out-of-business sale and eventually produced "their own" (ahem) preemptive multitasking OS X [wikipedia.org] in 2001, which was only 16 years after Commodore's Amiga [wikipedia.org] and 22 years after OS9 on the Radio Shack computer. Heck, even Microsoft produced a preemptive multitasking OS [wikipedia.org] only 10 years after the Amiga.

The Blit was commercialized into the DMD5620 using a 32 bit WE32000 processor. Sadly, this made it prohibitively expensive. To make it more price competitive, it was redone with a 68000 processor as the 630MTG. This was ironic, as the original Blit was also based on the 68000. Later a network interface, supporting both OSI and TCP/IP, was added along with additional memory and a faster CPU as the 730MTG. The 730MTG could also run X-Windows.

They were remarkably productive over a serial line.

I miss Gebaca!

Another fork off the Blit design was the Not. It was based on a 68020 processor, and was the original graphics workstation for Plan 9. The origin of the name is amusing: it looked like a 630MTG with a DMD5620 keyboard and mouse. When asked if it was a 630, the answer was that it was Not. A name was born.

Doesn't anyone here remember Doug Engelbart and The Mother of All Demos [wikipedia.org]? The year was 1968; it was a technical marvel and was very carefully arranged, but REAL. It was a spare-no-expense demo of what was possible with the technology of the day:

I used a Mentor Graphics Apollo workstation from 1982 for PCB design, software development, word processing, etc. It multi-tasked... compiling, reading errors (I mean *features*) and correcting the code in another window was awesome coming from a DEC environment. Windowing, mouse, etc. I never understood what all the hubbub was about... switched directly to a Mac after I left that company.

Yes, it shows how Apple and MS went down to cheap junk hardware and milked generations to get us back to Unix-like protections.
Twenty years of GUI and networking bug hunts, and we have Unix back and think we're in the future.