May 22, 1990 saw the release of Microsoft Windows 3.0. In many ways it was the turning point for Windows, the version at which you could see Windows being generally useful. With Windows 3.0 the Windows “ecosystem,” as Bill Gates termed it, was born, an ecosystem which caused the extinction of many other operating systems.

Not that Windows was really an operating system at this point. It was a graphical environment with many application services, but users started it by running the ‘win’ command from a DOS prompt. It relied on DOS not only for booting the computer, but for many basic services like file I/O.

And while it could run multiple Windows programs simultaneously, this was cooperative, or “non-preemptive,” multitasking, in which a program had to call back into the operating system before any other program could get time to execute. A Windows program that didn’t do this would monopolize the CPU, starving every other Windows program.
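The cooperative model is easy to illustrate outside Windows itself. The sketch below is not Win16 code; it is a purely illustrative Python scheduler (all names invented here) that treats generators as cooperatively scheduled tasks. Each task runs only until it explicitly yields, much as a Windows 3.0 program gave up the CPU only when it called back into the system through its message loop.

```python
# Illustrative cooperative scheduler, NOT the Windows API: tasks are
# generators that run until they yield control, analogous to a Win16
# program returning to its message loop.
from collections import deque

def scheduler(tasks):
    """Round-robin over tasks; each runs only until its next yield."""
    ready = deque(tasks)
    order = []                        # record of who ran, for inspection
    while ready:
        task = ready.popleft()
        try:
            order.append(next(task))  # run the task up to its next yield
            ready.append(task)        # well-behaved task: requeue it
        except StopIteration:
            pass                      # task finished; drop it
    return order

def well_behaved(name, steps):
    for i in range(steps):
        yield f"{name}:{i}"           # yielding = 'calling into the OS'

order = scheduler([well_behaved("A", 2), well_behaved("B", 2)])
print(order)  # tasks interleave: ['A:0', 'B:0', 'A:1', 'B:1']
```

A task written as an infinite loop that never yields would never return control to the scheduler, which is exactly the failure mode of a misbehaving Windows 3.0 program.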

But Windows 3.0 could run multiple DOS sessions preemptively, an important feature at the time. Market muscle then belonged to DOS applications sold by vendors other than Microsoft: WordPerfect, Lotus, Borland, Ashton-Tate, and a host of smaller companies. Microsoft’s application market share was not nothing, but the company wasn’t a leader in any core business application. Windows 3.0 changed all that.

This was an era in which the basic architecture of the PC was still evolving and trends were highly controversial. Competitors were beginning to make x86-compatible processors; RISC processors were showing impressive performance potential; IBM was still pushing its Micro Channel bus architecture to compete with the old AT or ISA bus, and the newer EISA and VESA extensions were popular. The x86 architecture itself was only beginning to reach maturity as 386-based systems became popular. That 386 architecture, specifically its virtual 8086 mode, was what made the preemptive DOS multitasking in Windows 3.0 possible.

Even before the release, the buzz was significant both inside and outside Microsoft. At the time I was doing contract software testing for the industry, and it was easy to see the excitement about Windows among many companies. Clearly Microsoft was moving all of its applications to it; on the PC, Excel began as a Windows app, replacing Microsoft’s atrocious DOS-based Multiplan.

It was also obvious that the main application companies, named just above, weren’t so eager to play along with Microsoft’s game plan. Just as the PC hardware architecture was in flux at this time, so was the software architecture. Microsoft was still working with IBM on OS/2, at that point still in its limited 1.x versions. In broad architectural terms, programming for Windows and for OS/2 had a lot in common, but the two differed in all the details.

Many users and developers weren’t convinced that DOS was obsolete; perhaps DOS-extended programs could do anything users needed, and all these GUIs did was to slow the system down. Microsoft liked to claim in this period that Windows programming was somehow a step for application programmers on the road to OS/2 programs, but it was an empty claim. Developers basically had to choose one or the other.

Another reason for skepticism was that little in Windows 3.0 was, from a Windows programming perspective, new. Charles Petzold, author of the definitive Windows programming book and a PC Magazine Contributing Editor at the time, notes that the only big additions to the 3.0 edition of his popular Programming Windows were the chapters on DDE and MDI, relatively minor features. Even the newer memory-management features of Windows predated version 3: “Tucked between the releases Windows 2 and Windows 3 were Windows/286 and Windows/386, which did some of the same stuff.” I had forgotten about these versions, even though I worked with them at the time. They didn’t make much of a splash in the market.

The internals in Windows 3.0 weren’t the big deal. A Microsoft developer I know who was there at the time says “Windows was a punch line before version 3. It [3.0] was a huge step forward in terms of aesthetics in computing – I remember being impressed by the color, fonts, icons, etc. The big irony of course is that it was the backup strategy to OS/2 and it isn’t clear upper management knew the full extent of what was being developed (and if they had, whether they would have approved it).”

But once the Windows 3.0 juggernaut got moving it wouldn’t stop. Not only were new application opportunities created, but the markets for certain hardware devices exploded: Suddenly the graphics card you ran was much more important. Suddenly everyone needed a mouse, not to mention a color monitor. And suddenly the OEM market became much more complex: it launched the era in which users bought a system preloaded with a lot of stuff, including Windows, from an intermediary like Dell.

With Windows going gangbusters, the IBM relationship quickly grew strained and uncomfortable. IBM was still a major PC company and saw OS/2 as a way to differentiate its hardware; besides, OS/2 was a much better product than Windows. They had a point.

Windows 3.0 may have been most users’ introduction to graphical environments, but it was also their introduction to the Blue Screen of Death. Windows systems in those days, and pretty much all days until the NT kernel was mature, were commonly unstable. OS/2 at this point had a lot of problems, but stability wasn’t one of them.

It was also easy to see future growth for the OS/2 kernel and services, but the Windows architecture, as it existed in the 3.0 era, was a major kludge. It wasn’t scalable to very large applications, very large sets of applications, or multiple processors. Only a fool would run a server on it.

Marginal but noticeable improvements were made in Windows 3.1, which was released at about the same time as OS/2 2.0. The Windows juggernaut didn’t stop for a moment.

Around this point Microsoft announced their Windows NT architecture, which would, around 2002, finally supplant the hack that was the old Windows kernel. It may be an ugly sight by modern standards, but Windows 3.0 was really something in its day, and it shaped the PC market we still live in.