From Win32 to Cocoa: a Windows user's conversion to Mac OS X, Part II

In part two of our special three-part feature, Peter Bright digs deep into …

The coming of .NET

This is the second part of a three-part series describing how one developer became disillusioned with the Windows platform and was reinvigorated by the bright lights of Mac OS X.

In part one, I described how Apple turned its failure to develop a modern OS into a great success. The purchase of NeXT gave Apple a buzzword-compliant OS with a healthy ecosystem of high-quality third-party applications. Meanwhile, Microsoft was lumbering along with Windows XP: technically sound, but shot through with design decisions made more than a decade earlier for 16-bit Windows.

In 2001, when XP was released, this was not such a big deal. The first two or three versions of Mac OS X were troublesome, to say the least. Performance was weak, there were stability issues, and version 10.0 arguably wasn't even feature complete. It wasn't until early 2002 that Apple even made Mac OS X the default OS on new Macs; for the first few months of its life, XP was up against "Classic" Mac OS 9.

But OS X didn't stand still. Apple released a series of updates in quick succession, strengthening the platform with new features like Core Audio, Core Image, Core Data, and Quartz Extreme, and providing high-quality applications that exploited these capabilities. All this time, XP stood still. The core Windows platform didn't change between 2001 and late 2006.

Although XP itself was essentially unchanged, Microsoft did try to produce a modern, appealing platform for future development. That platform was, of course, .NET, and observant readers will have noticed that I didn't mention it in part one. This was no accident; the whole .NET story deserves a more thorough examination.

Microsoft attempts modernity

In 2002, Microsoft released the .NET Framework. The .NET Framework was brand spanking new, designed and implemented from the ground up. It could have been clean, consistent, and orthogonal, with a clear design and powerful concepts. It could have been a way out of the quagmire that is Win32. It could have provided salvation: an environment free of 16-bit legacy decisions, with powerful APIs on a par with what Apple had developed.

It was certainly promoted as such: .NET was pushed as the way all future Windows development would occur. The plans became quite aggressive; in the OS that was to succeed Windows XP (codenamed Longhorn), new functionality would be accessed not through Win32 but through .NET, meaning that any developer wanting to exploit the latest and greatest OS features would have to venture into this brave new world.

So .NET could have been a step into the 21st century. It could have been, but it wasn't. Technically, .NET was fine. The virtual machine infrastructure was pretty sound, performance was reasonable, and C# was an adequate (if not exactly ground-breaking) language. But the library, the .NET "API" used for such diverse tasks as writing files, reading data from databases, sending information over a network, parsing XML, and creating GUIs, is another story altogether.
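To make that scope concrete, here is a minimal C# sketch of two of those everyday tasks, writing a file and parsing XML, using classes from the base class library as it shipped in .NET 1.x. StreamWriter and XmlDocument are real library types; the file name and XML content here are invented purely for illustration.

    using System;
    using System.IO;
    using System.Xml;

    class LibraryTour
    {
        static void Main()
        {
            // Write a text file with System.IO
            using (StreamWriter writer = new StreamWriter("notes.txt"))
            {
                writer.WriteLine("Hello from the base class library.");
            }

            // Parse a scrap of XML with System.Xml
            XmlDocument doc = new XmlDocument();
            doc.LoadXml("<settings><theme>classic</theme></settings>");
            Console.WriteLine(doc.SelectSingleNode("/settings/theme").InnerText);
        }
    }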

The library is extremely bad. It is simplistic, inflexible, and in many ways quite limited. See, .NET has a big problem: its target audience. .NET was meant to be a unified platform that all developers would use; after all, if new OS features required .NET, a broad cross-section of developers would have to use it. The problem is that not all developers are created equal. By looking at the different kinds of developers out there, we can understand why .NET is the way it is. What follows is not an exhaustive catalog of all the weird and wonderful breeds of programmer, but rather a rough taxonomy of some of the key species.