This comparison of the Unix and Microsoft NT/Win2K/XP operating systems is
written from the point of view of one technical user - someone who spends some
time with system administration, some with scientific and database application
programming, and the remainder doing documentation, e-mail, web browsing,
and following a small number of news groups.

Like many over the last few years, I've watched Microsoft's push from the
desktop into the server market in particular with some misgivings. Probably
the greatest fear of all has been the implicit threat to open standards.
The networking aspects of the commercial Unixes (SunOS, Solaris, HP-UX, AIX,
etc) have always rested on a solid foundation of the freely available RFCs
(Requests For Comment). And because the free 'nixes such as Linux (GNU),
FreeBSD, and so on, are available in source form, they're completely open
by definition.

The effect of closed, proprietary standards such as those employed
by Microsoft and other vendors is to automatically close off software
development in the affected areas to anyone except the companies concerned.
And where such companies can manage to gain a monopoly market share, the
incentive to improve a product in real terms all but disappears.

A more serious problem, as we're now seeing, is that Microsoft have
established such a fierce hold on the PC Operating Systems market (with
Windows 98, XP, 2000, and so on) that they're now in a
position to kill off any MS Windows applications which are
built on open standards. These include network products such as
Netscape, and the various E-Mailers, which Microsoft have all but destroyed
by simply shipping their own, "free" versions
(Internet Explorer and Outlook)
as part of their OS distributions. Attempts through the U.S. court system
to break up this patently obvious Microsoft monopoly have, unfortunately,
failed so far.

One bright light has continued to shine. Microsoft's original attempt
to destroy the open-standards-based Internet in the mid-1990s by offering
their users a proprietary replacement called MSN (the Microsoft Network)
was greeted, if not with laughter, then at least with a very loud yawn.
This forced a hasty and very costly back-track, finally resulting in the
inclusion of the standards-based (and U.S. DoD developed) TCP/IP Internet
protocol into Windows 95 and NT 4. So for once - some sanity
from the management team at Microsoft ... the use of a well established,
robust open standard.

Their attempt at a proprietary E-Mail system (Microsoft Mail) in the early
1990s was similarly unsuccessful, although not before many large corporations
(such as Telstra in Australia) invested hundreds of millions of taxpayers'
dollars implementing it across the country. The product proved to be a
complete lemon, and probably set Australia's communications networks back by
at least 5 years.

Leaving such historical issues aside, the comparison which follows is generally
restricted to those aspects which are currently the most obvious to me in my
own day-to-day work. And I won't be delving particularly into the internals
of either MS Windows or the 'nixes unless it's absolutely essential.
Bearing such caveats in mind, let's now wade in.

In fact, from mid-2003, public awareness of various fundamental deficiencies
in Microsoft operating systems and other products (Outlook, Office, etc)
rose sharply and has continued to rise, driven by serious and ongoing
virus and spam attacks - see below.
And Linux user-friendliness and versatility continues to improve rapidly
at the same time. Even Redhat 9, for example, although designed more for
rugged server deployment than desktop use, is very good. I'm using it
for both at the moment - one as a server, and several others as desktops.
Installation and updating is now very slick, and with KDE, you
get a selection of GUI styles ... including one called "Redmond" :-)

Please don't construe this as a recommendation for Redhat for desktop
use. RH is certainly very good for server use (as is FreeBSD), but there
are probably better versions of Linux for desktop deployment. You should
look for reviews on the web and/or discuss it with people who are already
using Linux.

For any server (assuming, of course, that one has a choice): even though
I dislike Unix/Linux for its terse and often complex aloofness, and I
readily admit that XP/W2K servers are great to administer while they're
working, when the chips are down and I've got a major problem on my hands,
give me Unix (Solaris, Linux, HP-UX, whatever) any day.

As just discussed, having all configuration information for each application
readily accessible in the form of a text file is great for disaster recovery.
But such an arrangement is also great for running the occasional "what if"
scenario with the configuration and tuning of an application. In my case,
I begin by editing the config of interest using my
vib script
(to ensure that I have a snapshot of the current working setup before I start).
Then I can play around as much as I like, because when something breaks (as
it sometimes does when one doesn't fully understand the entire package),
I can always restore the original setup in a matter of seconds.
And that is critical to me when that 'phone starts ringing!
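The snapshot-then-edit idea can be sketched as a small shell function. To be clear, this is only a sketch of the approach, not the actual vib script, and the file name and editor stand-in are illustrative:

```shell
# Sketch of "snapshot before edit" (not the actual vib script):
# keep a timestamped copy of the working config, then edit the live one.
backup_edit() {
    f="$1"
    cp -p "$f" "$f.$(date +%Y%m%d%H%M%S).bak"   # snapshot to restore from
    ${EDITOR:-vi} "$f"                          # now play as much as you like
}

# Example run (EDITOR=true stands in for an interactive editor here):
printf 'MaxClients 150\n' > /tmp/demo.conf
EDITOR=true
backup_edit /tmp/demo.conf
```

Restoring the original setup is then just a matter of copying the most recent `.bak` file back over the broken config.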

In trying similar stunts on Windows servers for which I'm also responsible,
I have occasionally come unstuck badly. You cannot back up a "configuration"
as such, and on more than one occasion, when carefully altering parameters
for our Project and Finance control database system (MS SQL), I've discovered
to my horror that it's gone completely troppo. One such experiment broke
things so badly that I ended up having to reinstall the SQL server
and the Windows OS ... around 5 days' work by the time I finally
got everything reconfigured even half reasonably.

I have plenty of other reasons for preferring unix systems for "mission
critical" applications. Unix systems are reliable - they hardly ever
need rebooting, whereas the Windows systems (typically running just the one
miserable application) need to be rebooted every month or two. Then there's
the ease of secure remote administration, consistent user profiles, and so
on.

A slight digression: Re learning unix, in the R&D lab where we set up our first
pair of Unix boxes to get some initial exposure, we just played around a bit
as we found time. We read the hardcopy manuals and man entries 
at length, wrote lots of shell scripts (good fun), and (being all superusers,
of course!) managed to thoroughly stuff it once or twice and even had to
reload the OS on one occasion.

We also followed various comp.unix newsgroups for a year or two, and tried our
hand at some elementary C programming out of K&R's classic C text. All in all,
just getting our feet wet was really good fun! Be warned that Unix isn't something
you can "learn" in a week or even a month - it's just too broad and too deep.
Just play around with it a bit as you find time and do some
reading and you'll get there in a year or so without going to
any courses. All you need is interest, enthusiasm, and 3 or 4 good
reference texts on your shelf such as a good shell book, a vi book, and a
suitable unix admin book - see
the O'Reilly site
for some good titles.

As a last resort (provided you can get your employer to pay for it), if you
find that you just can't make enough time during working hours, find a good
unix/linux introductory course and go along to that.

Unix man entries can be infuriating for the uninitiated - they
often seem to be written on the assumption that you're already a unix "wizard"
or that you can read the author's mind. But some are quite good - in fact,
some are even downright funny (esp the bug descriptions). Unfortunately,
with GNU/Linux, many man entries point you to some pathetic doco system called
"info", which I personally detest (but then, it's free, so one can't really
complain).
(Hint for reading man entries: set "PAGER=less" in your profile - the default
pager for man is usually "more", which often makes scrolling backwards
difficult.)
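That PAGER hint amounts to two lines in your shell startup file (~/.profile is the traditional Bourne-shell location; adjust for your own shell):

```shell
# Make man(1) page through less instead of more, so you can
# scroll backwards (arrow keys, or 'b' for back a page)
PAGER=less
export PAGER
```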

Anyway, that's quite enough digression. Moving on to the user's desktop
machines ... well, that's a harder one. As mentioned earlier, it's often
still difficult to avoid Microsoft. One still needs a degree of political
bravery to move a company from Microsoft PCs to Unix, Linux, or Apple (OS X).
But it can be done - and many smarter organisations are doing it following
Microsoft's arrogant decision to double their prices as of Aug 1, 2002.

The second thing I did was to install PERL in the form of
ActivePerl.
Perl is an acronym for Practical Extraction and Report Language (and also,
unofficially, Pathologically Eclectic Rubbish Lister), but in reality it's
a very powerful scripting and control language (albeit with a somewhat
cryptic syntax).

Some of my PERL scripts are called from within MKS shell scripts, some
run alone, and some even run via the web (using Microsoft's IIS!). Altogether,
this whole system has given me back control of the various NT and Win 98
boxes that I need to look after.

PERL in particular is virtually unlimited in what it can do. I have PERL
scripts running on Win 98 and NT boxes that collect data from Sun Unix systems
via network sockets, reformat it suitably, and then pipe it out again
through the PC's serial port to external scientific equipment, cameras or
video switching boxes, and so on.

Other simpler scripts merely keep an eye on disc usage and email me weekly
summaries, or synchronise a PC's time and date to a reference Unix system -
whatever I need.
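As a flavour of the simpler end of that list, here's a hedged sketch of the disc-usage watcher in plain shell (my actual versions are Perl, and the cron line and mail address below are purely illustrative):

```shell
# Shell sketch of a disc-usage watcher (the real scripts are Perl):
# flag any filesystem at or above a percent-full threshold.
usage_report() {
    # $1 = threshold; expects `df -P` style output on stdin
    awk -v t="$1" 'NR > 1 { sub(/%/, "", $5)
                            if ($5 + 0 >= t) print $6, $5 "% full" }'
}

# From cron you might pipe the result to mail(1), e.g. (illustrative):
#   df -P | usage_report 80 | mail -s "disc usage" admin@example.com

# Demo on canned df -P output:
printf '%s\n' \
  'Filesystem 1K-blocks  Used  Avail Use% Mounted' \
  '/dev/sda1    1000000 950000  50000  95% /' \
  '/dev/sda2    1000000 100000 900000  10% /home' |
usage_report 80          # prints: / 95% full
```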

PERL is a bit like Linux and GNU in its background history. Originally
written by Larry Wall, it's subsequently been expanded and further developed
by a cast of thousands. And all sorts of powerful add-on modules to do
just about anything you can imagine are free for the asking from the PERL
CPAN (Comprehensive Perl Archive Network)
site. (The Win32 (ActivePerl) package gives you access to
this archive via an automated command-line tool called PPM.)

So if you are forced to use Microsoft Windows (usually because of
the availability of a particular application that your organisation has
decreed "it must have", or even just because your organisation has Microsoft
tunnel-vision), all is still not entirely lost. You can
still get the job done with a professional, robust result - in spite of
Windows.

Microsoft, too, are always adding proprietary extensions to Outlook, along
with hooks that tie it into their other packages, specifically to cause
maximum havoc for non-Outlook users and so further maximise their
market-place advantage. Their assumption here (perfectly correct) is that
most organisations will then opt for the easy fix and force their users to
move over to Outlook for "compatibility".

Such organisations are, however, quietly ignoring the fact that they've
just multiplied their virus and security exposure by an order of magnitude.

Microsoft Exchange

Similar comments apply to the use of
Microsoft Exchange
as a main, central
mail-exchange server. In contrast with the more established, open-source
"mission critical" mail exchange packages such as
Sendmail
(usually on Unix), MS Exchange is again promoted as the "easy, safe option"
- especially for organisations with no in-house unix or programming expertise.

The pay-back with Exchange occurs when (eg) the company finds that it needs
to make large organisational changes but needs the flexibility to allow
the old email addressing and forwarding structure to "run in parallel"
for a while. Or when users request rather complicated intersecting and
nested mail-group definitions with very specific handling requirements.

Even just trying to use an E-Mail client other than Microsoft Outlook with
Exchange can cause problems. Although Exchange theoretically supports SMTP
(the open standards protocol as used by other E-Mail clients), it does so in
a sufficiently complicated and convoluted way that many Systems Administrators
just take the easy road and stick with Microsoft's proprietary RPC protocol.
Fine for running Outlook - but nothing else.

One also needs to be aware that Exchange doesn't maintain user mailboxes
and folders in the traditional flat ASCII file form, but keeps them instead
in a large, proprietary database. This can cause further havoc if and when
a decision is made to move to another E-Mail system. It also complicates
backups unless the backup system is "MS Exchange aware". (A case of the
age-old "proprietary database" trick so favoured by the old mainframe
application manufacturers.)

MS Exchange falls flat on its face in many such scenarios, and your local
Microsoft Certified IT professional is often forced to present management
with sad news such as
"Sorry, guys - can't be done".

Sendmail,
on the other hand, handles most such requirements with ease -
provided of course that a competent unix administrator and/or programmer
is employed to do the requisite configuring. Even in those rare instances
where Sendmail alone is insufficient, there are 'hooks' there to allow any
desired extra functionality to be added from outside.

Mind you, Sendmail admittedly has one of the most complex configuration
interfaces known to the free world. Sendmail is immensely powerful,
flexible, reliable - and a snap to back up. And the use of open standards
such as SMTP, POP and IMAP allows the use of any convenient E-Mail client.

But simple to configure? Nup - no way!

In reality, this is actually not a major problem provided that you just want
to run it with its "out of the box" defaults, but for most organisations
the need will eventually arise to block this domain or prevent
relaying for that domain etc.
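For the "block this domain" and "prevent relaying" cases just mentioned, the usual route is Sendmail's access database (enabled with the access_db FEATURE in your .mc file). A minimal illustrative fragment - the domain names are of course only examples:

```
# /etc/mail/access - rebuild the db after editing:
#   makemap hash /etc/mail/access < /etc/mail/access
# Refuse all mail from this domain:
spam-haven.example.com      REJECT
# Allow this domain to relay through us:
branch-office.example.org   RELAY
```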

In such cases, the only option is to print out Sendmail's (quite
comprehensive) documentation and actually sit down and spend a day reading
it (known in the industry as
RTFM
- as in ...
Read The F______ Manual :-).

It's also not a bad idea to buy a copy of O'Reilly's
Sendmail book,
although I must admit that I've found the doco files which come
with the Sendmail source distribution and the
FAQs at
the Sendmail web site
quite adequate for my needs so far. (But then again, I also do a bit of
programming from time to time - I'm not just a Systems Administrator.)

In particular, the
Sendmail Installation and Operation Guide (PDF format)
which is supplied with the source distribution is well worth reading (and
printing out). For some reason known only to Eric Allman (the original author
of sendmail), this is only distributed in troff macro format or as PostScript.
So this PDF version is one I've just converted using ps2pdf. (The rest of
the doco supplied with the source is in the form of plain text or as unix
man entries, both of which are fine by me!)

Of course, there are other MTAs (Mail Transport Agents) one can use
apart from Sendmail. It may have been the first kid on the block, but as
Stefano Rivera put it to me after reading this page: "Sendmail isn't
the only mailer; qmail (for example) is a doddle to set up and maintain,
almost infinitely secure, and lightning fast. The same applies to all
D.J. Bernstein's other programmes (djbdns, daemontools, etc)."

I have to admit that I haven't tried anything other than Sendmail to date.
If I had, maybe I'd be mentioning one of those instead. I'm not really too
fussed either way. As long as it's configurable, secure and open-source,
I'm always happy to try anything. Sendmail just happens to be bundled by
default with so many Unix-type systems, and when you boot up for the first
time, it's there and it's running. So it almost seems a bit of a pity not to
use it!

In light of the above, it's probably now apparent why Sun Microsystems were
so furious when Microsoft tried to hijack
Java
in the 1990s.
Java as a browser-embeddable, platform-independent computer language has
always had security and platform independence as its prime focus - security
and platform independence first, functionality and speed second. All Java
implementations are therefore required to pass Sun's Java Compatibility
Tests before a company is legally entitled to use Sun's JAVA COMPATIBLE
trademark.

Microsoft asserted that they did not accept this compatibility requirement as
binding on them, and eventually left Sun with no choice except litigation
for an injunction against Microsoft if future Java language portability was to
be maintained. And so began the famous court case.

Anyway, I am digressing in the extreme now (you can go and trawl through
Sun Microsystems
Sun v Microsoft Case History
if you want more information on that one).

When designing their new server-strength protected-kernel OS, Microsoft Windows
NT, in the early 1990s, Microsoft decided they could do it better than Unix
by employing some guys from DEC and getting them to create something called
NTFS. Nothing so simple as the established Unix file system was considered -
Microsoft were going to do it much better.

Yeh?

Well, no ... (duh)

In fact, standing right back from all this ... I've never really been able
to quite figure out why Microsoft didn't just do the sensible thing and
clone Unix in its entirety as the backbone of NT, 2000 and XP. Plenty of
others have. After all, operating systems are unbelievably complex, and
no-one ever manages to release one which is totally bullet-proof and bug-free.
The various Unix implementations (and Linux) generally come very close though,
so why not take advantage of the world of talented open source developers when
it's all out there? It simply isn't worth "doing your own thing", as Apple
have discovered in recent times. (Much better to have "yet another port of
Unix" than these dreadful Microsoft systems with their woeful toolsets.)

Anyway, this is water under the bridge. Microsoft have gone their own
way - and it's clearly not a good way for users. So it's better and
ultimately much safer, simpler and less expensive to step off the MS
juggernaut entirely if you can manage it.

Unix and Linux have always been designed for maximum usability, reliability, and
security. And because so much of it has been designed by academics and
so-called computer science "boffins", you actually feel much safer.
And why? Because in that world, professional reputation is everything.
It is that simple.

Scott's article
on the relative virus susceptibility of the two competing operating systems
again raises an issue which has really gone "critical" since September 2003.
Any business which is in any way connected to the Internet and continues
to use Microsoft Windows as their platform of choice would need to have
irrefutable logic for doing so, considering the security implications
of the most recent virus attacks.

Making the change to Linux or Unix may be costly in the first 12 months, but
it nevertheless makes excellent business sense for any company that values
its data and intellectual property. (In any case, "total cost of
ownership" will still end up being lower in the medium to long term.)

The bottom line is that with all sorts of viruses continuing to attack
virus-prone Microsoft operating systems and applications, you generally
won't even know when your data has been stolen.

For a Windows system administrator, a course in Linux and/or Unix may thus
be prudent. A management which discovers the sobering realities of
"security" under Microsoft before the organisation's IT staff have bothered
taking appropriate action is unlikely to be pleased.