Dave Morse, Amiga Computer Co-Founder, Passes Away

Dave Morse (1943-2007), Amiga Computer co-founder, died on Saturday, November 2nd. In 1982 he left Tonka Toys, where he was Vice-President of Marketing, to co-found Hi Toro, Inc. and become its Chief Executive Officer. Hi Toro morphed later that year into Amiga, Inc., which he led through the development of the Lorraine project (a codename inspired by Dave’s wife, Lorraine) – ultimately, the Amiga 1000 computer. Our take: We want to wish all the strength in the world to his family and friends, and I personally would like to thank him for creating a truly visionary computing platform. Forget Apple, forget Microsoft, forget Linux: the Amiga was the real revolutionary device.

65 Comments

You saw the future, and fed it till it blossomed. The world was bright, as were your eyes. You shall be missed, your contribution to the world as a whole more impressive than many of the giants we hold up today. Take care and fast journey to the west. May Ra’s boat lift you up, and see you safely in your journey.

Dave created a computer/platform that showed true innovation. He demonstrated that a supercomputer was not needed to create a system that was graphically powerful and functionally advanced, way before OS X, Windows XP/Vista and KDE/GNOME/Linux. You will be missed, but more importantly, you will also be remembered.

I recently left the IT industry after many years; I had been programming since I was 9. I found it beautiful when I had to get many different INTERESTING systems to interact with each other. Then, in 2007, I suddenly felt the industry had let me down. Why? People like Dave were pushed to the side. True innovation was a thing of the past.

“We want to wish all the strength in the world to his family and friends, and I personally would like to thank him for creating a truly visionary computing platform. Forget Apple, forget Microsoft, forget Linux: the Amiga was the real revolutionary device.”

I can’t add much to that…as a proud former A1000 owner, thank you, Dave, for allowing me to participate in your dream.

One of my prized icons is an original “Amiga makes it possible” button from the original A1000 launch in Toronto. I’m glad to be among a group who understands when I say the suits have taken over IT and it is only mediocre…

I was at work chatting to a mate about the state of IT and the good old days, when there was vicious competition over who could make the best device with the best graphics on limited hardware – who could squeeze every last bit of performance (and more!) out of a very limited machine.

We need to go back to the days of Dave Morse, when innovation meant revolution, change, disruption – companies doing what they said, rather than today, where companies promise the world and fail to deliver. Games that were innovative rather than just ‘better graphics’, with higher-quality AI-based engines instead of relying on the player being connected to the internet for competition – anyone remember the pride of being able to ‘clock a game’? Ah, the good old days.

*sigh* some of us had to do real work back in those good old days. It is soooo easy to do the armchair thing and reminisce through rose-coloured glasses.

Let me tell you, having to deal with the limitations of the hardware and software of yore wasn’t fun. Some of you seem to be under the impression that a lot of developments came to be ‘just because’, and that lack of context can be fairly dangerous.

They were good because of the wild-west nature of it all. We knew it sucked, but that is why the time is so venerated. It was hard because of the limitations imposed by the hardware and software of the time, and all the niceties we take for granted now were created to solve some limitation. There were no instant computer gurus back then. The ones who were gurus were gurus for a good reason: their kung fu was the best.

It was also a more innocent time for our industry. There were more possibilities, and people were able to take chances on some crazy ideas. It seemed like it was more about creating sci-fi dreams than pushing product. No one speaks of redefining the desktop by coding the ultimate operating system to topple Apple or MS; they will just build it off of the Linux kernel using GNOME or KDE. It’s all plug and play now. There is nothing wrong with that, but it’s not quite as fun as seeing the unique solutions that would get created otherwise.

I wouldn’t give up my modern hardware for anything, but I’m not looking forward to x86-and-Unix-everywhere homogeneity.

They were the good old days because of this: people designed software for the hardware of the day. If the commonplace machine was a 7 MHz Motorola 68K processor, then you wrote your applications to target that speed and what most people had installed in their computers.

Then along comes Microsoft – bugger making your applications run on today’s computers; create a big, bloated application, then wait for the hardware to catch up. Who cares about trying to make your code efficient and elegant, who cares about tweaking the software to extract the most speed – let the hardware companies sort it out.

Like the old adage goes, give someone an inch and they’ll take a mile. In the case of modern software, sure, it gave developers a degree of leeway; the problem is that these days this ‘wait for the hardware to catch up’ attitude has simply gone overboard. It’s no longer just “you need the high end of today” – it’s now “the hardware hasn’t even come to market yet!” What about all of us who want to use the application but don’t want to be punished because we’re not on the constant bleeding edge?

There was *real* skill involved. Yes, I did programming back then: AMOS and AmigaBASIC, dabbling in REXX and assembly in places. Before that I used a BBC Micro with 32k of memory, using BASIC. It forced you to think about things before you wrote the code; instead of just firing code at a file, you actually had to logically think out the whole programme. That is part of the problem I see today: there is no discipline among programmers. Very few stick to the fundamentals taught in Systems Analysis and Design, and very few actually lay out their application logically in a non-code representation of how things interact with each other (which would show the parts that are redundant and could possibly be replaced with a single piece of code doing both jobs).

Dave, you and the other Amiga craftsmen were responsible for many, many a sleepless night, as I stayed up to all hours playing, programming, and marveling at all of the amazing things I could do with your wondrous machine. You’ll be missed.

The Amiga 1000 and the Amiga 500. They are two names, along with the Commodore 64, that really bring back some memories from childhood.

“Dave was one of those guys who would sit through a board meeting, say almost nothing, then at the end of the meeting would say just one sentence or two, and they would be a perfectly formed gem of thought and plan of action — the optimal direction for the company, whatever company it was.”

When Steve Wozniak dies, are we going to write “Forget Amiga, forget Microsoft, forget Linux: the Apple ][ was the real revolutionary device.”?

We are all standing on the shoulders of greatness, the Amiga was revolutionary in many ways, but many revolutions had come before it.

Sorry if this sounds disrespectful, I don’t mean to be, I loved my Amiga and thought the world of them when they came out. They were many years ahead of their time in some areas and brought massive power to the masses at a very cheap price.

I guess what I am trying to say is remember Dave and his contribution in the correct context, I’m sure that’s what he would have wanted too…

When Steve Wozniak dies, are we going to write “Forget Amiga, forget Microsoft, forget Linux: the Apple ][ was the real revolutionary device.”?

I don’t know. All I know is that for me, personally, the Amiga showed us, in 1985, what computing would be like 20 years later. The Apple II was right for its time (and impressive because of it) – the Amiga was way ahead of it.

There has never been anything quite like the Amiga after the Amiga itself. Apple, Microsoft, the Linux world: they are just ripping each other off, no one bringing anything really revolutionary to the table. They each contributed significantly to the advancement of computing, but at the same time they hold it in a stranglehold, blocking any really new ideas/interfaces/devices from reaching the market – because they define the market.

The Amiga was the last of its kind – a system designed from the ground up by visionaries. Some might say that my beloved BeOS and its BeBox was the last attempt – but not to me. BeOS wasn’t even nearly as ahead of its time as the Amiga was in 1985.

I do agree, the Amiga “seemed” so far ahead because of the custom sound and graphics it had. I loved programming the blitter (well, the small amount I did).

I think it’s perception and nostalgia that help filter how good something really was. For me, the breakthrough was the Apple ][. In 1980 we had one at our small school for a couple of weeks, so it was my first time seeing a computer of any sort. I learned programming, saw graphics and text on a screen, sound (well, OK, that’s pushing it)…

When the Mac came out in 84 I remember studying screen shots (on the cover of magazines at the time) for hours, just totally blown away by this machine. It gave us GUI, a mouse (yes, I know these were pioneered/created by Douglas Engelbart and the ARC team back in the 60’s), 3.5″ floppies, pure graphics (if only b&w), Word and Excel (yes, on the Mac first), Paint, proportional fonts, WYSIWYG and so on…

Not a lot of people know this, but the Lisa had multitasking, an object-oriented desktop, pull-down menus and scroll bars (a first), and so on… If not for the price, the Lisa would have been a revolution.

Personally, I think the Mac showed me more of what computing would be like in 20 years’ time, and other than colour (which was left out due to price), it pretty much did. I’m sure other readers here would suggest other machines or platforms with equal merit.

I do remember in ’85 when the Amiga came out, and being blown away by that. The graphics were sooooo far ahead of the Mac and everything else at the home level! I remember going to a trade show and seeing HAM mode for the first time, and the painting application for it (remember the painting of the pharaoh?).
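For anyone who never programmed it: HAM6 used six bitplanes, with the top two bits of each pixel either loading one of 16 palette colours or holding two RGB components from the previous pixel while modifying the third, yielding up to 4096 colours on screen. Here is a minimal Python sketch of that decoding rule – the helper name and data layout are illustrative, not actual Amiga code:

```python
def decode_ham6(pixels, palette):
    """Decode one scanline of HAM6 pixel values (0-63) into 12-bit RGB.

    palette: 16 (r, g, b) tuples, each channel 0-15.
    HAM resets to palette colour 0 at the start of each line.
    """
    out = []
    r, g, b = palette[0]
    for p in pixels:
        ctrl, data = p >> 4, p & 0x0F   # top 2 bits: mode; low 4 bits: value
        if ctrl == 0:                   # 00: load a full palette colour
            r, g, b = palette[data]
        elif ctrl == 1:                 # 01: hold red/green, modify blue
            b = data
        elif ctrl == 2:                 # 10: hold green/blue, modify red
            r = data
        else:                           # 11: hold red/blue, modify green
            g = data
        out.append((r, g, b))
    return out
```

Because each pixel can change only one channel relative to its neighbour, sharp colour transitions produced the characteristic HAM fringing that the painting programs of the day had to work around.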

I think we are all stealing from the thieves (as Bono puts it), some more than others 😉

Personally, the biggest thing the Amiga gave us was so much power at such a small price. That was something. It also showed part of the industry how to build graphics hardware.

I wasn’t saying Apple invented the GUI; I actually gave that credit to someone even before Xerox. Xerox actually stood on Douglas Engelbart and his team’s shoulders, but moved the GUI to a point that is recognizable today. And I’m sure Doug got his ideas from someone else too – but maybe not, not all of them anyway. There is always some innovation in amongst the process of development.

My point was that as great as the Amiga was (and it was a great machine), it wasn’t as revolutionary as we may wish it to be, and certainly not worthy of the “Forget Apple, forget MS, forget Linux…” quote given by OSNews. Does that help?

As for the scrollbar, you might be right on that, I’d heard Apple invented them, but don’t mind being wrong… I think I remember (and I am most likely wrong on this too) the original Altos had big enough screens that they didn’t need them (they could fit a page of a document on the screen), but the prototypes of the Lisa needed them as their screen resolution was much smaller. I got that from a video of the Lisa and Xerox guys talking at some anniversary a few years back… I should find those videos again. That was a very minor point and I should have left that out, sorry.

Anyway, I wasn’t trying to push Apple as the only guys that invent anything, far from it, they stood on the shoulders of many many people. I didn’t mention the Alto as the GUI preceded it.

Apple did have the GUI novelty, but it was well into the 1990s before the Mac surpassed the Amiga in either hardware or software capabilities. The Mac didn’t get pre-emptive multitasking until Mac OS X. In fact, the Amiga came onto the market so shortly after the Macintosh that claims the Amiga took the GUI from the Mac are doubtful.

Therefore, the Amiga was way ahead of its time. While the Mac was a remarkable machine for its time, GUIs were soon commonplace, and therefore one cannot say it was ahead of its time.

All those features you mentioned about the Amiga had been invented elsewhere.

In the realm of technology, in the big scheme of things, the Amiga was not revolutionary. What was revolutionary about the machine was its pricing. Alas, Commodore did not really know what to do with the machine…

There is a biiiig difference between clever systems/solutions – and the Amiga was one of those – and truly revolutionary systems.

You surely can name truly revolutionary systems, right? Gee, if it was not the Amiga and the impact it had on personal computing, then what was it?

And of course you simply lie. Well, someone may have pioneered something, but Carl Sassenrath brought us a multitasking operating system usable on a 7 MHz machine. Dynamic libraries, dynamic devices, and later on system-wide scripting and system-wide localisation.

You also show total disrespect for what the machine provided hardware-wise at the time, in comparison to anything else available on the market back then.

Kudos to Jay Miner, Dave Morse, Carl Sassenrath and the other people who followed their visions and brought this world real innovation.

Oh, yes… let us remember the Amiga’s GUI, which was more primitive and ugly than the Atari ST’s GEM (from DRI). The Mac 128 and the Atari ST looked a lot better. It wasn’t until 2.5 (3.0?), I believe, that the Amiga’s GUI started looking decent. And let us not forget that the Amiga, TO THIS DAY (AmigaOS 4), still does NOT have protected memory… or something like that.

No doubt the Amiga was better at what it did than the Atari ST, but it looked uglier while doing it.

Whatever you say is just plain words. There is no comparison between what the Amiga provided software-wise and hardware-wise. Why do you capitalise “to this day”? Where is your Atari then? 🙂 What does “to this day” have to do with whether the Amiga was innovative in the ’80s?

Well, typical Atarian reaction 🙂 You guys will probably never get over it 🙂

The ugly version was 1.x; 2.x already looked pretty decent and supported graphics cards. Home versions of Windows got memory protection in 2000(!), years after the Amiga had even ceased to exist, and that was also the year Windows got preemptive multitasking, 15 years after the Amiga already had it. Both are true for Mac OS as well, since it neither was truly multitasking nor had virtual memory before OS X.

Mac OS was prettier at that time, but it ran on *much* more expensive hardware, so that isn’t comparable at all.

There is ALWAYS someone else. If it wasn’t Woz or Steve Jobs or Bill Gates (the three biggest “personalities” I can think of at the moment)… Someone else would have been there, sooner or later, to do what they did. Innovation and inspiration are always waiting to be exploited. It’s just a matter of “the right person at the right time”.

Steve Wozniak wasn’t the ONLY one who could have created the Apple I. He was simply the RIGHT person at the RIGHT time, who KNEW the RIGHT person and was willing to work with (or be goaded on by) him… which started Apple.

And no wonder – Steve Jobs single-handedly SAVED Apple from ruin. I still think he sold his soul to the devil to put Apple where it is and make the revolutionary products they make, that EVERYONE loves!

Like it or not, Steve Jobs *IS* Apple. The day he dies (or is kicked out; you know he’ll never leave on his own, except for (maybe) extreme loss of health), is the day Apple will start into a tailspin once again.

I second that. Is there a need to disrespect other efforts and initiatives, just to make the point of how innovative the Amiga was in its time? The GNU project may not have been very innovative in a technical sense, but it was innovative in many other ways (social, ethical, political) and I think the “forget Linux” part is uncalled for, especially in an obituary, which is not the right place to debate such matters.

Amiga was revolutionary, no doubt. Loved my A500. For me the Atari 8-bit was also the most revolutionary hardware of its day – Jay Miner, who was behind the Amiga, was involved with it. 3dfx also started a revolution. So did Torvalds, and so did MS. Apple’s 8-bits, too. The C64 was also a revolution in pure unit numbers, though not in hardware as the Amiga was. Win95 was a bit of a revolution. Doom (id) was too. Each had its impact.

“Some might say that my beloved BeOS and its BeBox was the last attempt – but not to me. BeOS wasn’t even nearly as ahead of its time as the Amiga was in 1985.”

I believe for every computing decade there is a “Paradigm Shift” to the way people believe things can be done. When the “envelope” is pushed a little further, or a new way of thinking is presented.

BeOS had plenty of “Wow!”, but there wasn’t enough substance after that to keep people. I believe Be, Inc. was a little out of sync with their offering. They presented a product that excited people beyond the OS’s capacity to deliver on expectations at the time. I think they should have really matured the OS and established its software/driver foundations before presenting it to the public, so that when the “Wow!” wore off, the user could get down to actual productive work!

But that never happened. People saw the gaping chasms of lack (drivers/software, etc.) and eventually chose to go back to the “security” of their main OS. Once the decline started, it was impossible to recover from the tailspin…

Be, Inc. made a GREAT first impression, but it was found to be rather superficial in the end.

Haiku is going to be what BeOS could have been. It won’t be a paradigm shift, but Haiku will enjoy its own “shining moment”, to be sure.

Sad to hear of Dave’s passing. Both Dave and Jay Miner did so much to make the Amiga what it became. It was SO AHEAD of its time it was not funny. Thanks to everyone who made computing what it is today: Doug Engelbart, the people at Xerox PARC, Jobs and Woz, and many others. Dave and Jay were as far ahead of their time as Xerox and others were in theirs, and as has been said already, the Amiga was so advanced when it came out – years ahead of anyone else in certain areas.

I still have my 500 and 3000 and will always remember the Amiga. It stood out for years above the rest of the computing world.

Maybe he submitted it first, or his article link was better, or the title was more concise. I wouldn’t take offense; it’s just the luck of the draw. I dare not count the number of times I have just missed out or simply been ignored!! :-p

I submitted the same title, with the info directly from the REBOL Tech. CEO, as well as the link to it. I think I know the reason anything I submit gets attributed to others… it’s because I called Eugenia a Forum Nazi a few years ago; and I am still sticking to that.

You guys are obsessing over preemption. LoseThos has a feature where you can turn it off and on. I’ll assume you’re too stupid to understand why this is cool… think about it for a while. LoseThos empowers programmers – that’s the whole point – fun with programming… all instructions available. To put this in terms you can understand, since y’all seem to be administrators: most operating systems are like being a user without root access. Kernel mode is off-limits. In LoseThos, you have kernel mode in all the programs you write! There is nothing less than kernel mode. You pussies are already complaining, right? Oh, no, it might crash. BFD, it boots in 2 seconds. It has no network. There is absolutely no risk of malware. You’re supposed to use it to WRITE programs, not run them… if you write the program and there’s no network, how the hell are you going to have security problems? Screw security! Have a blast programming.

You guys are all brainwashed by 1970s mainframe operating system ideas… the kind they teach in school. Let me tell you something: a home computer is not a mainframe. Huge difference, which nobody seems to grasp – on a mainframe a crash inconveniences hundreds of people. Get out of the ’70s and don’t be so brainwashed.

LoseThos doesn’t use virtual memory. I have 2 Gig RAM. Virtual memory was necessary when you had a hundred users on a mainframe. Mostly, it’s not needed on a home computer unless you’re doing science stuff maybe.

LoseThos is simple, on purpose. Linux might as well not be open source. I laugh at all these open source advocates who don’t understand the code.

Linux is a false promise – people think they can understand the code and tinker. It’s got so many #ifdefs and stuff – a pile of shit. Have you ever tried to trace operations down to the actual hardware IN or OUT instructions? A nightmare. I seriously thought the government told them to obscure it for homeland security or something.

Dave, yours was the computer whose existence I couldn’t believe back in my childhood, during the days of the Spectrums and Commodore 64s, the Apples and the Ataris and the Nintendos. It was legend. A sort of mythical holy grail, by virtue of the fact that I was never able to get my parents to buy me one.

Now that there are all these emulators (thanks to the UAE project), I still intend to sit down at a real machine someday for the first time, and when there’s a significant breakthrough in computing paradigms (quantum, photonic), I hope they call it an Amiga.