In 1997 Linus moved from Finland to a California startup,
Transmeta, in order to experience the "extreme other side" of computer
programming -- that of a high-tech start-up -- and, as a side effect, to become
rich (his explicitly stated goal, one he definitely managed to achieve
in less than three years). Later, in his September
28, 2003
interview with the NYT, he tried to avoid answering the question of how the "make
money fast" craze of Silicon Valley had affected him:

Since you moved to Silicon Valley
from Finland in 1997, how has the region's aggressive approach to money-making
affected you?

Oh, how I hate that question. I've actually
found the image of Silicon Valley as a hotbed of money-grubbing tech
people to be pretty false, but maybe that's because the people I hang
out with are all really engineers. They came here because this is where
the action is. You go out for dinner, and all the tables are filled
with engineers talking about things that won't be available to ''normal
people'' for a few years. If ever.

In reality Linus decided to strike while the iron was hot, getting
into the US "Internet IPO" boom before the bubble burst. Aided by mass "Internet
boom" hysteria, Linus and many others -- from book authors to
snake-oil salesmen -- were happy to become rich from the greatest
stock bubble of the twentieth century. Bob Metcalfe was the first to recognize this transformation
of the former revolutionary into a "make-money-fast", personal-enrichment-comes-first
type (pigs, in Orwell's terminology ;-). In his Feb
2000 column
If open-source software is so much cooler, why isn't Transmeta getting it?:

...Am I the only one to see that Torvalds
and other open-source software revolutionaries are acting out the finale
of George Orwell's Animal Farm?

Orwell's farmhouse is full of open-source pigs, which are now almost
indistinguishable from the proprietary humans they recently overthrew.

It's true that I have been unkind to the "open sores" movement. But
to be clear, anyone is welcome to beat Microsoft with better software,
even a utopian community of volunteer programmers.

May the best software win.

And don't get me wrong, even if he disappoints Richard Stallman by
not always referring to GNU/Linux, Torvalds is a genuine hero of the
open-source revolution.

But with Torvalds saying some animals
are more equal than others, why is the sanctimonious open-source press
still cheering him on? Are the likes of Slashdot.org, just gobbled by
VA Linux, also porking out in Orwell's farmhouse?

Torvalds wrote and now controls Linux, the open-source operating
system, due this summer in Version 2.4. By day, he is a programmer at
Transmeta. Transmeta just announced Crusoe, its low-power microprocessors
for mobile computers.

The architecture of Crusoe chips is based on VLIW (very long instruction
words). It has "code morphing" to convert and cache software in speedy
VLIW codes. And it comes with Mobile Linux, with Linux extensions for
power management. According to Transmeta, Crusoe is two-thirds software
and one-third hardware.

So what I want to know is, if open-source software is so cool, and
if Torvalds "gets it," why isn't Crusoe open source? For a start, why
aren't the Crusoe chip's mask sources published for modification and
manufacture by anyone?

And yes, Mobile Linux is open source, but not the "code morphing"
software Torvalds helped write. Transmeta has taken the phrase Code
Morphing as its proprietary trademark. And
what the code does, according to Transmeta, has been ... patented.

Worse, Crusoe is touted for running Intel X86 software, and in particular,
Microsoft Windows. Doesn't the open-source community say Windows is
beneath contempt?

Torvalds showed up at LinuxWorld Expo touting open source, of course,
but then went on to revise two of its bedrock principles.

Torvalds talked at LinuxWorld about fragmentation -- the emergence
of too many Linux versions. Being old enough to have watched Unix fragment
during the 1980s, I worry. But instead of holding to the party line
that Linux will not fragment, Torvalds now says there is bad fragmentation
and good. One can assume, because he's in charge of both, Transmeta's
Mobile Linux will fragment Linux 2.4, but in a good way.

Then Torvalds talked about commercial companies, which aren't so
bad after all: Take for example Transmeta. His audience, packed with
employees, friends, and family of newly public Linux companies, did
not boo him back out into the barnyard.

Where is the outrage?

So just to keep Torvalds honest, I'm thinking that Crusoe chips,
which are mostly software, should be open source and basically free.
Chips have to be manufactured -- with white coats, ovens, and stuff
-- so maybe it should be OK to sell open-source Crusoe for the cost
of its silicon, trace metals, media, and manuals...

Beginning in early 1996, shameless opportunists sprang up across the country
and across the Internet, ready to take advantage of America's newfound
spirit of "webalization". While making money by fooling others is
reprehensible, we can assume that many of these snake-oil salesmen, like
Bob Young and Larry Augustin, were just gifted Ponzi scheme manipulators
who were waiting for the opportunity to strike gold from fools and
for whom the Internet boom was the last and only opportunity to become
rich. They were not the real sharks. The real sharks here were the investment
banks and venture funds. And this period was the first in which regulations
adopted during the New Deal were substantially rolled back, and the agencies
formally responsible for policing the financial oligarchy were weakened
and emasculated. The Fed was in the hands of the shameless opportunist Greenspan,
who, under the pretext of free markets and deregulation, was selling the country
to the financial oligarchy. For the latter the dot-com boom was a perfect opportunity
to redistribute the country's wealth in their own favor. "Greed is good" was
the slogan of the day.

This wealth redistribution mechanism worked via the numerous venture and
hedge funds that were created to attract money and extract rent for the
casino owners, which were the investment banks. With skillful propaganda, money
just flowed like water into all sorts of "Internet funds" as well
as into small and mostly unprofitable Internet start-ups. In a 2011
NYT article, Evelyn Rusli and Verne Kopytoff gave the following assessment
of this giant Ponzi scheme (Is
It a New Tech Bubble? Let's See if It Pops). Note that Goldman Sachs,
JPMorgan Chase and Morgan Stanley were active players in creating and milking
this Ponzi scheme:

In 1998, Goldman Sachs Capital Partners, the bank's private equity
arm, began a new, $2.8 billion fund largely geared toward Internet stocks.
Before that fund, the group had made fewer than three dozen investments
in the technology and communications sectors from 1992 to mid-1998,
according to Goldman Sachs documents about the fund.

But between 1999 and 2000, the new fund made 56 technology-related
investments, of about $27 million on average. In aggregate, the fund
made $1.7 billion in technology investments -- and lost about 40 percent
of that after the bubble burst. (The group, which manages the money
of pensions, sovereign wealth funds and other prominent clients, declined
the opportunity to invest in Facebook early this year.)

Philip A. Cooper, who in 1999 was head of a separate Goldman Sachs
group that managed fund of funds and other investments, recalled that
investors were clamoring, "We want more tech, we want more." Bowing
to pressure, he created a $900 million technology-centric fund in 1999,
and within eight weeks he had nearly $2 billion in orders. Despite the
frenzy, he kept the cap at $900 million.

"There was a lot of demand, but we couldn't see any way we could
prudently put that much capital to work," said Mr. Cooper, who has since
left Goldman.

Other Wall Street firms, including JPMorgan Chase and Morgan Stanley,
also made a number of small to midsize investments during the period.
In 1999, for instance, Morgan Stanley joined Goldman Sachs and others
in a $280 million investment in CarsDirect.com, which scrapped its initial
plans to go public when the market deteriorated.

"We thought we were going to double our money in just a couple of
weeks," said Howard Lindzon, a hedge fund manager of Lindzon Capital
Partners and former CarsDirect.com investor. "No one did any due diligence."
Mr. Lindzon lost more than $200,000 on his investment.

Also in 1999, Chase Capital Partners (which would later become part
of JPMorgan Chase) invested in Kozmo.com -- an online delivery service
that raised hundreds of millions in venture funding. JPMorgan Chase,
which just recently raised $1.2 billion for a new technology fund, at
the time called Kozmo.com "an essential resource to consumers." At its
height, the company's sprawling network of orange bike messengers employed
more than a thousand people. Less than two years later, it ceased operations.

An online grocer, Webvan, was one of the most highly anticipated
I.P.O.'s of the dot-com era. The business had raised nearly $1 billion
in start-up capital from institutions like Softbank of Japan, Sequoia
Capital and Goldman Sachs. Goldman, its lead underwriter, invested about
$100 million.

On its first day, investors cheered as Webvan's market value soared,
rising 65 percent to about $8 billion at the close. Less than two years
later, Webvan was bankrupt.

About the same time, Internet-centric mutual funds burst onto the
scene. From just a handful in early 1999, there were more than 40 by
the following year. One fund, the Merrill Lynch Internet Strategies
fund, made its debut in late March 2000 -- near the market's peak --
with $1.1 billion in assets. About one year
later, the fund, with returns down about 70 percent, was closed and
folded into another fund.

"We all piled into things that were considered hot and sexy," said
Paul Meeks, who was the fund's portfolio manager. Mr. Meeks started
six tech funds for Merrill Lynch from 1998 to 2000.

This was the period of explosive commercialization of Linux startups, and Torvalds
can probably be considered the figurehead, the banner of all the Linux
"get money fast" IPO schemes. As we mentioned before, Torvalds never
assigned his rights to the FSF -- a practice that Stallman usually rigorously
enforced. Therefore, in some sense, he was from the beginning a dissident
in the camp of fanatical pro-GPL zealots, the leader of his own cult.
He definitely did not like being a follower of Stallman or anybody else
and wanted to play his own game in order to become rich and famous.

And he probably was right about the "extreme other side" that he wanted
to experience, though in a quite different sense of the phrase. One thing
that he experienced very soon was that he was no longer a free software developer.
Actually, he became more of a pawn in the hands of executives of major investment
banks and venture firms, as well as suits from Intel, IBM, and Oracle.
And those people were able to explain to Linus what is good and what is
bad for his beloved OS with pretty convincing arguments like stock options
(BTW Intel & IBM were major early investors in Red Hat and
VA Linux), speaking engagements and other valuable perks.

And there were, of course, pretty funny things waiting for him as
Linux became a kind of "theater of the absurd". For example, Oracle's Ellison
spent an hour in front of 1,000 programmers in San Francisco talking about
the merits of free software. That's right. A man who became as rich as Croesus
by selling his database software for thousands of dollars per CPU told this
standing-room-only audience that their practice of sharing code for Linux
(and charging nothing for it) was a good thing.

Moreover, Linus Torvalds' beloved Linux (and he himself) would soon
be associated with the worst charlatans of the Internet bubble. The sad
fact is that Linus did a lot of PR work to help ensure that the record
for an IPO's opening-day gain belongs to VA Linux. The latter shot up 700%,
to $239.25 a share, when it hit the market on Dec. 9, 1999. Three years
later, renamed VA Software, it traded for about $1. Eliot Spitzer, the New
York attorney general, later exposed the way Wall Street analysts promoted
lousy "Linux" companies to win additional investment banking business. And
Linus Torvalds' participation in this game of deception is an undeniable fact:
he willingly adopted the role of cheerleader.

First of all, the "corporate sponsors" made it clear that they needed Linux
on servers (IBM wanted to fight Sun and Microsoft, but probably more Sun
than Microsoft; the same was true for Intel). Therefore, despite Linus's previous
convictions, Linux on the desktop became the No. 2 task, and from then on Linux would
be developed mainly as a server OS. Talk about Linux on the desktop would continue
for the next several years, and interesting applications for Linux would emerge,
but there is not much money on the desktop, and it is money that (in a subtle
way, via Linux startups financed by Intel, IBM and other large companies)
determines the actual direction of Linux development. To reuse the old cliché
of the Clinton campaign: "it's money, stupid". Volunteers are welcome,
but they do not matter.

Second, they made him a poster boy both for the commercial promotion of Linux
and for the anti-Microsoft campaign. Much later, in an
interview with BBC World, Linus said that he was pretty happy to play this
role:

"In a way it is fun. I'm pleased to be
a poster boy. It gives me some self-importance," he said.

He would not, however, want to become personally involved in the
dispute with Microsoft.

"I've tried to stay out of the Microsoft debate. If you start doing
things because you hate others and want to screw them over the end result
is bad," he said.

If the period when Linus controlled the direction of Linux development
ever existed, it definitely came to an end. Not only did he become "a developer
for hire" who barely understood the code of some subsystems that he incorporated
into the kernel (see more about this in the discussion of v.2.4 problems), he
became "a hired gun" in an ultra-secret and ultra-closed organization. Yes,
he could still make important decisions, and yes, he could still optimize the kernel
the way he liked (sometimes cutting chunks of code that he
considered "raw" but that were necessary for the future development of a
particular subsystem). But what should be included in the kernel, and when,
was now a different story. In a sense he became a prisoner of his own
project.

The selection of features from which he could choose was no longer controlled
by him. And the first such feature, one that he had previously objected to
but that now became almost the No. 1 priority, was SMP. Another thing that he probably
could not predict at this time was that he and his beloved OS would become the
core of several crazy Linux IPOs that enriched a few members of the Linux community
(including Linus Torvalds himself) but were a huge rip-off for many naive
open source supporters. As we will see, in 2001 he looks more and more like
a guy at his former girlfriend's wedding party.

As for the Transmeta experience, he can probably attest that the company was
able to transform a former "free developer" into some mixture of a developer,
a marketer and a ceremonial figure suitable for keynote addresses and product
launches. And Transmeta was probably interested in Linus as a PR figure
much more than in his technical abilities as a kernel developer.

Formally, Linus became just a staff engineer at Transmeta, leading the
development of Linux for Transmeta's (at the time super-secret) Intel-compatible
chip. But a far more important role that he was assigned was that of
chief PR person for Transmeta, and here his journalistic upbringing was a
huge asset for a startup that had nothing to do with Linux per se and
from a technical standpoint was more in the camp of VM developers than monolithic
kernel developers. Transmeta probably gave him an offer that he could not refuse,
as at this time he seems to have rejected several other excellent offers, including
one from Steve Jobs:

A lot of folks in Silicon Valley are
so drunk on their own bath water that they simply don't get Linus. Take
Steve Jobs. After Linus moved to the States in 1997, the acting Apple
Computer CEO got in touch with him. Jobs wanted to persuade Linus
to get involved in making the MacOS an open source code project. "He tried to get the Linux movement going more into the Apple area.
I think he was surprised that his arguments, which were the Apple market
share arguments--which would have made an impression on people who did
this for commercial reasons--had absolutely no impact on me,'' Linus
says.

BTW in his autobiography he mentions this episode, which casts some light
on his supersized ego -- he really will do whatever it takes to
avoid being in the shadow of Steve Jobs or any other prominent developer:

According to Torvalds, Jobs assumed that
he would be interested in joining Apple's mission to capture more of
the personal computer market from Microsoft, rather than continue concentrating
on Linux. "I don't think Jobs realized that Linux would potentially
have more users than Apple, although it's a very different user base."

At Transmeta he was the one and only Linus Torvalds. The start-up was
headed by Dave Ditzel, former chief scientist of the chip development project
at Sun that produced the SPARC processors, one of the most successful RISC
CPUs. But Ditzel was known only in the very narrow circles of CPU developers.
Microsoft cofounder Paul Allen was a major investor in Transmeta -- so
here Linux symbolically returned to its MS-DOS roots and took an important
step toward becoming the Microsoft of Unix :-)

Linus's starting salary at Transmeta, as well as the number of shares
he would get in the eventual IPO, were closely guarded secrets. Linus
claims that he negotiated enough time to continue to supervise
the development of the Linux kernel, but again the reality was that he was
working as Transmeta's PR person, drawing attention to the company.
Just think about the number of interviews and speeches he gave during
1998 and 1999 before the launch of the first Transmeta chips. PR work requires
a lot of time. Moreover, in any case, the size of the project had already
outgrown his capacity to manage it.

In 1998, when I wrote the first draft of this chapter, the Transmeta
move to hire Torvalds looked like a very clever (I would like to call it
innovative) marketing plot, one that can be called "cloak-and-dagger marketing"
-- officially Torvalds worked on some "top secret" project there. Transmeta's
windows were blacked out, and its
Web site contained
one sentence ("This Web page is not here yet"). All they needed was one celebrity
to provoke heated speculation about just what Transmeta was doing and what
this particular celebrity was doing there. In 2003, as the SCO lawsuit heated up
and started threatening Transmeta, he was promptly shipped off to the OSDL (as
Schiller aptly formulated this idea in Fiesco, "The Moor has done his duty,
the Moor can go"):

The non-profit Open Source Development
Lab (OSDL) and chip-maker Transmeta Corp. jointly announced today that
Linus Torvalds, the creator of Linux, will join OSDL as the first OSDL
Fellow. Torvalds will join OSDL on leave from Transmeta Corporation,
where he is currently a Transmeta Fellow.

Most of the talk before the Transmeta IPO was focused on the next-generation
secret chip that Transmeta was developing. The main ingredient of this hugely
successful PR trick was of course Torvalds, who played the role of the celebrity,
fueled huge interest in Transmeta in the "Linux media" and even
managed to get substantial attention from the mainstream media. This PR trick
probably would not have worked for Transmeta without him; that's why in 1998
I, like many other observers, (incorrectly) assumed that there was some
gentleman's agreement that Torvalds would work just as a PR person for the company
by promoting Linux, and that other than that he was free to do whatever he wished.
This proved to be partially wrong -- actually, along with his PR duties, Linus
did the Linux port to the Transmeta chip.

In any case, for Linus this was not only the official end of the volunteer
period but also definitely the "end of fun". Welcome to the Hotel California.
Now it's all about money.

While he got an unusually high salary at Transmeta, strategically this
move was a very risky step from several points of view. First of all, leaving
university grounds cut the implicit but extremely important support from friends
and academic staff that he had enjoyed at the University of Helsinki, making
his job as the "chief configuration management specialist" of the kernel
code much more difficult and forcing him to make more bad, "intuitive" decisions
that badly affected Linux development. It also made him more conservative:
without a university he was essentially on his own, and please remember that
from the point of view of academic training he never managed to accomplish
much. Just an MS degree. No other research projects, no articles, no participation
in another important project like Multics was for the Unix creators -- nothing that
might help him in the future to navigate the development of the kernel.
Note that Ken Thompson and the other core members of the Unix team had an
excellent school in the Multics project before launching Unix. So deciding which
technical path to choose became more difficult, and that dramatically increased
his conservatism, the concentration on polishing the existing codebase,
and the inclusion of only those features that he could not reject. That also made him very
vulnerable to corporate pressure.

Generally speaking, leaving university grounds is a very dangerous step
for any freeware/open source project leader, and I attribute much of
Stallman's success to the fact that he managed to stay at MIT, even when
MIT got tired of him and wanted him out ;-).

Gradually the question of the level of synergy between Linux kernel development
and Linus's leadership became an open one. In this sense the SCO lawsuit really helped Linux, as it
was a good time for Linus to divorce Transmeta and to join an independent
body for the development of the Linux kernel, one which could survive even if Linus
himself were run over by a truck. Probably the fact that Linus was quickly
shipped to the OSDL to protect Transmeta from excessive liability
was the most positive development that came out of this lawsuit.

I can only say that I had
the benefit of having been exposed to Raymondism and other
silly ideas ten years ago, and of having thought
them through before thinking got clouded by pundits and free
IPO money. For most people who have discarded orthodox Raymondism,
the idea seems so childish and shallow that there isn't much
perceived value in convincing the people who are sold on it.

In late 1997 the Linux gold rush started, and in early 1998 Linus found himself
on the front pages of magazines such as
Forbes. In one year he probably gave more interviews and speeches than
in all his previous life. It's difficult to say how he reacted to this
stream of media hype and exaggeration, but it definitely increased the
troubling "cult of personality" problems in the movement that I mentioned
before. It's not an easy job to be a media darling. Despite his usual
"political correctness", the burden of fame was probably too much for him,
and Linus managed to make some statements revealing his personality (see
Linus 1998 for a more complete context
of the statements; full interviews are also available online).

One of the main negative factors was that, although usually cautious and realistic,
he gradually was also affected by the "Linux uber alles" movement headed
by Eric Raymond. I discussed this movement (Raymondism) in more detail
in my Bad Linux advocacy
FAQ:

...first he proclaimed that he had no competitors in the
Unix market anymore (assuming that Linux worked better than Solaris
7 on 64-CPU systems). BTW in late 1998 Linus abruptly changed
his technical priorities and, despite the proclaimed desktop orientation,
started working hard on adding SMP support to the Linux kernel.
In other words, Linus adopted a "One Microsoft Way" technical policy --
much like Microsoft with NT, he is trying to make Linux a universal
OS that dominates both desktop and server. It's really funny that, despite
the fact that both Solaris and Free/OpenBSD were still superior in certain
aspects of kernel design, he claimed just the opposite:

"I'm no longer looking at the
Unix market when I'm looking for competitors," he said,
adding that when looking for new features to add to the free OS,
he is looking more to Microsoft than to high-end Unixes like Solaris
or Digital Unix. "I've been much more focused on the Windows NT
and 98 target group as a market"

...in another interview he predicted the disappearance
of Apple from the face of the Earth within a couple of years (also misunderstanding
the power of the Microsoft legal machine -- the danger of copyright
infringement lawsuits -- whatever you think about it, realistically this
is a very powerful threat to the free software movement, as the SCO lawsuit
later showed). In fact, Apple essentially killed Linux on the desktop
with the advent of OS X (see below):

I actually think that within a few
years, Apple will cease to exist simply because, my personal opinion
is, it's too hard to compete against Microsoft in the commercial
marketplace. And yet I feel that Linux can actually succeed
because we aren't really competing against Microsoft. Linux doesn't
have the same commercial pressures that Apple does have.

...in yet another interview he gives his final and categorical
judgment on Java ("I think everybody hates Java as a desktop
thing"), forgetting the fact that he never programmed in the language --
and anyway, after programming a kernel in C it's very difficult to switch to something
else ;-) :

SW: What are your thoughts
on Java?

LT: I think everybody
hates Java as a desktop thing. I see Java mentioned a lot lately,
but all of the mentions within the last year have been of Java as
a server language, not as a desktop language. If you go back a year
and a half, everybody was talking about Java on the desktop. They
aren't anymore. It's dead. And once you're dead on the desktop,
my personal opinion is you're dead. If servers are everything
you have, just forget it. Why do you think Sun, HP -- everybody
-- is nervous about Microsoft? It's not because they make great
servers. It's because they control the desktop. Once you control
the desktop, you control the servers.

It's no longer something that will
revolutionize the industry. It could have revolutionized the industry
if it was on the desktop, but I don't see that happening anymore.
I hope I'm wrong. Really. I just don't think I am.

...and in yet another he admits that he "really
liked Unix as a philosophy" (not the fact that he just implemented
a clone of the Unix kernel, which already had standards, books and
even complete source code available) and, ironically (referring to Windows),
gave a very good characterization of Linux development itself: "nobody tried
to design Windows -- it just grew in random directions without any kind
of thought behind it."

boot: Linux is based on UNIX,
right?

Torvalds: Well it's based on UNIX in the sense that I was used
to UNIX and I really liked it. UNIX has a philosophy,
it has 25 years of history behind it, and most importantly, it has
a clean core. It strives for something—some kind of beauty. And
that's really what struck me as a programmer. Operating systems
that normal home users are used to, such as DOS and Windows, didn't
have any way of life. Nobody tried to design Windows—it just grew
in random directions without any kind of thought behind it.

...in an early 1999 interview
Linus for the first time applied the word "invented" to Linux -- a word completely
inapplicable to a clone OS and very close to the Microsoft marketing
style (as everybody knows, Microsoft "invented Windows" by a blatant
rip-off, and then intelligent enhancement and independent development,
of the Mac GUI ;-). As always, he failed to mention the GNU project (moreover,
this time he even forgot to mention that the adoption of the GPL license
was the smartest thing he ever did; probably at this time he really
had some reservations -- see KDE jihad
):

PC Week: Give us the short history
of Linux's development.

Torvalds: Basically, I invented it eight years ago, almost exactly eight years
ago. It started small, not even an operating system. It was just
a personal project. I just was doing something fun with my new machine.
It kind of evolved through luck and happenstance into an OS, simply
because there was very much a void where there wasn't much choice
for someone like me. I couldn't afford some of the commercial OSes
and I didn't want to run DOS or Windows -- I don't even know, did
Windows really exist then?

PC Week: You could have copyrighted
Linux and made a fortune. Why did you make it an open source code
operating system, and will that model work in the future as Linux
acceptance grows?

Torvalds: It started out as
a personal belief that, yes, open source was
needed. Then, when it got large enough, I encouraged people to license
their own development, their own parts. Now there are multiple owners
sharing all these licenses...

"Frankly, I think it's a piece
of crap," Torvalds says of Mach, the microkernel on which Apple's
new operating system is based. "It contains all the design mistakes
you can make, and manages to even make up a few of its own."

Torvalds' comments promise to upset not
just Apple fanatics, but also some quarters of the free software movement.
The Mach microkernel is also being used as the core of Hurd, a kernel
project from the Free Software Foundation that will be an alternative
to Linux as the heart of the GNU (Gnu's Not Unix) operating system,
originally devised by free software advocate Richard Stallman.

The criticism comes in a chapter where
Torvalds tells that, on arrival in Silicon Valley in early 1997, Apple's
charismatic chief executive Steve Jobs invited him to join Apple and
help develop OS X. He says that Jobs was also keen for him to help attract
open source developers to the project.

The remarks will particularly sting Apple,
because the company has made great play of the fact that the core of
its new operating system is, like Linux, based on the Unix operating
system and was developed on open source software.

Qualified criticism is a gift. If Linus says that a particular approach
has its problems, great. The Mach 3 microkernel has a lot of problems, but there
is no perfect approach to kernel design. For each and every Mach 3 problem
one can find one in the monolithic kernel design advocated by Linus. And if
you are a specialist who defends the monolithic kernel as the better alternative,
you had better provide solid arguments, not the all-encompassing word "crap"
in the best "Father Linus, the greatest kernel designer of all times and
nations" style. Actually, even among monolithic kernels the Linux kernel is not
the best in many respects. BSD was and still is a better system in the networking
department. If you need a firewall, a router, maybe even a web server, it will
probably run better on BSD. That makes me wonder about his motivations, and
it lowers my respect for Linus. It definitely reminds me of
the old Tanenbaum vs. Torvalds debate right back at the beginning of
'92:

While I could go into a long story here about the relative merits
of the two designs, suffice it to say that among the people who actually
design operating systems, the debate is essentially over. Microkernels
have won. The only real argument for monolithic systems was performance,
and there is now enough evidence showing that microkernel systems can
be just as fast as monolithic systems ... that it is now all over but
the shoutin`.

Although microkernels did not win, they did not disappear either; the
design has its merits and to a certain extent found its way into the best
commercial Unixes like AIX and embedded kernels like QNX. Maybe some
sort of compromise is the optimal solution. The problem here is that he extended
his (primitive) knowledge of the field circa 1992 to the situation in 2001.
Linus made his opinion of microkernels clear in the past, when he
was just a beginner kernel developer, a guy who could barely program in
C.

But both the monolithic kernels of the early XXI century and the Mach kernels
are quite different from the state of the art of 1992. And he should
know that in the current Linux kernel a bad driver equals a kernel panic,
not to mention the hassle for driver developers, who are forced to port
their code after each of the kernel's endless changes. My (limited) understanding
is that QNX has been a quite successful microkernel architecture for years.
Considering the critical real-time applications that QNX has to run, I am
glad they got their microkernel right. Let's see... nuclear reactors, nuclear
subs, NASA uses it for some things, traffic control systems, etc. A very impressive
list for such a crappy design.

And as the last piece of my collection I will reproduce his famous letter
"Because I'm a bastard, and
proud of it!" from
Kernel Traffic #87, where he stated his position on kernel debuggers
as follows:

I don't like debuggers.
Never have, probably never will. I use gdb all the time, but I tend
to use it not as a debugger, but as a disassembler on steroids that
you can program.

None of the arguments
for a kernel debugger has touched me in the least. And trust me, over
the years I've heard quite a lot of them. In the end, they tend to boil
down to basically:

it would be so much
easier to do development, and we'd be able to add new things faster.

And quite frankly, I
don't care. I don't think kernel development should be "easy". I do
not condone single-stepping through code to find the bug. I do not think
that extra visibility into the system is necessarily a good thing.

Apparently, if you follow
the arguments, not having a kernel debugger leads to various maladies:

you crash when something
goes wrong, and you fsck and it takes forever and you get frustrated.

people have given
up on Linux kernel programming because it's too hard and too time-consuming

it takes longer
to create new features.

And nobody has
explained to me why these are _bad_ things.

To me, it's not a bug,
it's a feature. Not only is it documented, but it's _good_, so it obviously
cannot be a bug.

"Takes longer to create
new features" - this one in particular is not a very strong argument
for having a debugger. It's not as if lack of features or new code would
be a problem for Linux, or, in fact, for the software industry as a
whole. Quite the reverse. My biggest job is to say "no" to new features,
not trying to find them.

Oh. And sure, when things
crash and you fsck and you didn't even get a clue about what went wrong,
you get frustrated. Tough. There are two kinds of reactions to that:
you start being careful, or you start whining about a kernel debugger.

Quite frankly, I'd rather
weed out the people who don't start being careful early rather than
late. That sounds callous, and by God, it _is_ callous. But it's not
the kind of "if you can't stand the heat, get out of the kitchen" kind
of remark that some people take it for. No, it's something much more
deeper: I'd rather not work with people who aren't careful. It's
darwinism in software development.

It's a cold, callous
argument that says that there are two kinds of people, and I'd rather
not work with the second kind. Live with it.

I'm a bastard.
I have absolutely no clue why people can ever think otherwise. Yet they
do. People think I'm a nice guy, and the fact is that I'm a scheming,
conniving bastard who doesn't care for any hurt feelings or lost hours
of work if it just results in what I consider to be a better system.

And I'm not just saying
that. I'm really not a very nice person. I can say "I don't care" with
a straight face, and really mean it.

I happen to believe that
not having a kernel debugger forces people to think about their problem
on a different level than with a debugger. I think that without a debugger,
you don't get into that mindset where you know how it behaves, and then
you fix it from there. Without a debugger, you tend to think about problems
another way. You want to understand things on a different _level_.

It's partly "source vs
binary", but it's more than that. It's not that you have to look at
the sources (of course you have to - and any good debugger will make
that _easy_). It's that you have to look at the level _above_ sources.
At the meaning of things. Without a debugger, you basically have to
go the next step: understand what the program does. Not just that particular
line.

And quite frankly,
for most of the real problems (as opposed to the stupid bugs - of which
there are many, as the latest crap with "truncate()" has shown us) a
debugger doesn't much help. And the real problems are what I worry about.
The rest is just details. It will get fixed eventually.

I do realize that others
disagree. And I'm not your Mom. You can use a kernel debugger if you
want to, and I won't give you the cold shoulder because you have "sullied"
yourself. But I'm not going to help you use one, and I would frankly
prefer people not to use kernel debuggers that much. So I don't make
it part of the standard distribution, and if the existing debuggers
aren't very well known I won't shed a tear over it.

See, you not only have to be a good coder
to create a system like Linux,
you have to be a sneaky bastard too ;-)
[Linus Torvalds in <4rikft$7g5@linux.cs.Helsinki.FI>]

The other interesting thing is that Raymondism equates Linux and open
source with innovation, and Linus often tries to play this card too. In fact
Linux can be considered a neo-conservative revolution (a counterrevolution)
against Microsoft. There is nothing very special about the Linux kernel. It
is absolutely, 100% true that Linus has complete control over the present
development. He has nixed in the past, and will nix in the future, excellent ideas
that do not conform to his rather limited views. It has taken 10 years for
Linux to get a semi-decent VM, which is still weak compared to FreeBSD's
VM. Linux kernel version 2.2 was worse than 2.0 at handling high demand.
Until 2.2.18 or so, if a process used all the virtual memory, the kernel
would randomly pick a process and kill it to free some memory. I think it
at least checked that PID != 1, thank goodness. But this still means that Netscape or
Midnight Commander going nuts and eating all the memory could leave your
box in a "fuzzy" state, where the number and names of surviving applications
and daemons cannot be predicted with any certainty.

In no way can the Linux kernel be considered an advancement of the state
of the art in operating systems design; it's a reimplementation of the preexisting
(and really innovative) ideas of Unix. Yes, a good reimplementation has
its own value, and nobody can deny that Linux was an important part of the Unix
Renaissance. But is it an innovative OS? Or, in a narrower
sense, is it an innovative kernel? That's a joke. Unix introduced at
least seven major innovations in OS design: C as a system programming
language, the hierarchical filesystem, pipes and a set of pipe-friendly utilities/filters,
regular expressions, devices as files, the shell as the mother of all modern
scripting languages, and the first built-in TCP/IP stack. Linux introduced none --
zero, to be exact (if we do not count the method of development praised so
highly in CatB ;-). The most important parts of open source development --
the development of scripting languages such as Perl, PHP, Python and Ruby,
as well as the LAMP stack, the two things that were really innovative about
open source -- just happened to run on Linux (and not only on Linux; for example,
Perl predates Linux, as the initial version was developed by Larry Wall in
1987, Perl 3 was released in 1989, and Perl 4 in 1991).

If one compares Linux with BeOS, Inferno, or even with OS/2 and the Amiga,
one can see that in major design decisions Linux is a very conservative,
backward OS. As Rob Pike noted in his "Systems Software Research is Irrelevant"
(http://plan9.bell-labs.com/cm/cs/who/rob/utah2000.pdf), Linux can be
considered a sign that computer science research has become irrelevant, and
he claimed that the whole situation in OS design is generally bad and requires
action.

In a sense it's more like a powerful social movement with political overtones,
very similar to Newt Gingrich's "Contract with America" (fighting corruption and
waste in government spending, tax reform, a balanced budget, etc.), than
a technological advance. And there are other things that make the analogy between
Newt Gingrich and Linus Torvalds much less superficial than it looks at
first sight.

For example, both understand that the principal task of the leader is
to define a positive "vision" around which followers can cohere, define
strategies to achieve this vision, and then delegate authority to others to
carry out operations and tactics, preserving the key functions of center
of communication and final decision maker. That's why Torvalds so easily
delegates the programming of various parts of the kernel to other people.
As long as he is in the controlling position of leader and configuration manager
of the whole kernel, with the final word on what will be included in a
particular version, it does not matter much who produces a particular
module or driver. Enjoy your ride, as long as you agree that I am the driver.

Prior to becoming the House speaker, Newt Gingrich had spent over a decade
writing and speaking about, and organizing around, an eclectic neo-conservative
model, which he argued should take the place of Great Society liberalism.
In Gingrich's view, his conservative predecessor, Ronald Reagan, had ultimately
come up short as a transformative leader because his horizon as a leader
had been defined by opposition to liberalism rather than a positive vision
of a new order in American politics. "The great failure of the modern Republican
party, including much of the Reagan administration," Gingrich wrote in 1984,
"has been in its effort to solve problems operationally or tactically within
the framework of the welfare state . . . . Real change occurs at the levels
of vision and strategy." That's somewhat similar of Torvalds attitude
to Stallman. Gingrich reiterated this theme in a 1995 interview:

"You have
to blow down the old order in order to create the new order.
Reagan understood how to blow down the old order, but wasn't exactly
sure what the new order would be."

When Gingrich became speaker in 1995, his overriding goal was to succeed
where President Ronald Reagan had failed, by creating a "new political order"
in the United States. Sounds pretty similar to the "world domination"
rhetoric of Linus Torvalds. Both share extraordinary ambition, an executive-style
understanding of political leadership, and an acute ability to work well
with the press. Both have repeatedly demonstrated a willingness to challenge
conventional wisdom and take political risks to advance their goals (Linus's
decision to bless commercial distributors was very unconventional, to say
the least, for a "member of the GNU generation").

Being the chief configuration manager of a large and complex project is
a very challenging, exhausting and thankless job. I suspect that by now
Linus really hates his job. But surviving under relentless pressure for
a decade attests that Linus Torvalds has a really rare combination of top
programming talent along with traits typical of gifted managers and skilled
politicians. That's a very rare combination. In this respect he is a truly
outstanding person, outstanding without any reservations, because most people
would quit such a job after the first several years.

His job requires not only talent and understanding of the internals (that's
easier for Linus Torvalds than for anybody else, because he has watched the whole
development from the very beginning, so the complexity for him increased gradually
from a pretty low level), but also the ability to work long, long hours almost
without any vacations. And if you add Hollywood-level popularity and the corresponding
time spent on PR events, this is not a very enviable position. And unlike
Hollywood stars he cannot resort to alcohol, drugs or womanizing,
or any combination of those three, to reduce the pressure ;-).

Working under constant stress and constant overload does not make
life easy for anybody. The first problem is that, due to overload, some decisions
now have to be made on intuition, as there is no time (and sometimes no
technical background) to understand the consequences of the solutions proposed.
The second and major problem is that managing a project of the
current size of the Linux kernel is probably just too much for any single
person, no matter how talented he or she is. The first signs of this overload
were already evident in 1998, a year after he was welcomed to the "Hotel
California".

In April 1998 the birth of Linus's second daughter caused great joy,
and substantial disruption in kernel development, as all work stopped and
many patches got lost. Some grumbling resulted as it became clear just how
dependent the entire process is on Linus.

In October 1998 tensions exploded again on the Linux kernel list after Linus dropped
too many patches. Linus walked out in a huff and took a vacation. Things
returned to normal, of course, but some people got talking. It became clear
once again that the Linux kernel had grown too big for one person even
to serve just as its configuration management specialist. Some ways of reducing
the load on Linus were discussed, but nothing was really resolved.

Understandably, kernel development also slowed down considerably due
to the complexity of the current version of the kernel and the SMP stuff (and here
it is really difficult for a student with just an MS to compete with IBM,
HP or Sun laboratories staffed with a lot of PhD holders ;-). That inevitable
slowdown was first predicted in the now-famous
Halloween I memo, which provided
one of the first objective (an objective attitude is our attitude toward people
we do not like ;-) evaluations of Linux. For example, there was
no other major open source implementation of SMP. And commercial implementations
were complex and, at that time, closed. So while there was some room for
Linux to catch up with Solaris and NT in the SMP area, very little can be achieved by
plainly following somebody else's tail lights. Here I would like
to stress the importance of the academic environment for OSS projects,
an environment that was lost.

Not that I respect PhD holders for the diploma they have (a lot of PhD
dissertations are weak to the point of being a scam and IMHO worth much
less than a single decent OSS program), but I still think that it's extremely
beneficial for an OSS software developer to work for an academic institution,
or at least be close to a university atmosphere. That was true for Dennis
Ritchie and Ken Thompson, who did their groundbreaking work on Unix at Bell
Labs (which is as close to an academic institution as one can get), and
later at Berkeley (and please remember that in some areas Berkeley is a more
important contributor to Unix than Bell Labs). The same is true for Richard
Stallman. Only Larry Wall looks like a good counterexample here (although
later he joined the publisher O'Reilly as a free researcher, getting into
an essentially semi-academic atmosphere), but one can always switch from Perl
to Tcl or Python :-). And it's common knowledge that large bureaucracies
can have niches that provide almost academic freedom for several
years.

Anyway, kernel development in 1998 slowed down quite visibly. There was
no new kernel version in 1998 -- a situation that would have been unimaginable
to the Linux crowd in 1992 or 1994, or even in 1996. Version 2.2 slipped
to January 1999.

In late January 1999 kernel version 2.2 was at last shipped --
somewhat buggy because of the pressure to release it. All of a sudden
Torvalds declared the source code final, noting that "enough is enough"
and that "Every program has bugs, and I'm sure there are still bugs in this.
Get over it -- we've done our best." Calling it a stable kernel amounts to
drawing an arbitrary line, but market pressure is market pressure, and Torvalds'
ability to withstand it under the current level of market hype is extremely
limited. Essentially, as in the past, Linus Torvalds had no other choice
but to give people what they want...

Version 2.2 was in some important areas a significant improvement
over the 2.0.x kernel, but it was larger and needed time to stabilize. In retrospect
the improvements also help to show what was wrong with the previous version.
For example, among other advantages, FreeBSD had a big speed advantage over
Linux 2.0.x when it came to NFS throughput, because kernel 2.0.x did not
yet have kernel-space support for it (it was all user space, with redundant copying
-- a lot of overhead, which was removed for Beowulf clustering). With Linux
2.2 and kNFS, I wonder whether FreeBSD is still faster in this area.

Still, the kernel itself was far from meeting enterprise standards. As
Mark Russinovich wrote in his article
Linux and the Enterprise (April 1999):

Let me state clearly at the outset that
I don't intend to bash Linux in this article, nor do I intend to proclaim
NT's superiority to Linux. I base the information I present on a thorough
study of Linux 2.2, including its source code. I hope that by revealing
these problems, I will encourage Linux developers to focus on making
Linux ready for the enterprise. I also want to dispel some of the hype
that many Linux users have uncritically accepted, which has given them
the false impression that Linux is ready for enterprise prime time.

What Does
"Enterprise-Ready" Mean?
Before an OS can compete in the enterprise, it must deliver performance
levels on network server applications that rival or exceed the levels
that other OSs achieve. Examples of network server applications include
Web servers, database servers, and mail servers. OS and hardware vendors
typically use results from industry-standard benchmarks such as Transaction
Processing Council (TPC)-C, TPC-D, and Standard Performance Evaluation
Corporation (SPEC) SpecWeb to measure proprietary OSs or hardware against
other vendors' products. Vendors spend millions of dollars on benchmark
laboratories in which they tune, tweak, and otherwise push state-of-the-art
hardware and software technology to the limit. Enterprise customers,
in search of a computing infrastructure that provides maximum capacity,
often turn to benchmark results for guidance, and OS and hardware vendors
holding benchmark leads take great pride in being king of the hill at
any given time.

Thus, competing in the enterprise means
removing every performance impediment possible. Overlooking even the
smallest drag will create an opening that a competitor can drive through
to obtain a valuable lead on a benchmark. What complicates the science
of engineering an OS for the enterprise is that an OS might have subtle
design or implementation problems that don't adversely affect performance
in casual desktop or workgroup environments. Yet these problems can
keep the OS from achieving competitive results in an enterprise-class
benchmarking environment. A typical enterprise-application benchmarking
environment includes dozens of powerful multiprocessor computers sending
requests as fast as they can over gigabit Ethernet to an 8-way server
with 4GB of memory and hundreds of gigabytes of disk space.

Efficient
Request Processing
Network server applications typically communicate with clients via TCP
or UDP. The server application has either a published or well-known
port address on which it waits for incoming client requests. When the
server establishes a connection with a client or receives a client request,
the server must then process the request. When the server application
is a Web server, the Web server has to parse the HTTP information in
the request and send requested file data back to the client. A database
server application must parse the client's database query and obtain
the desired information from the database.

For a network server application to scale,
the application must use multiple kernel-mode threads to process client
requests simultaneously on a multiprocessor's CPUs. The obvious way
to make a server application take advantage of multiple threads is to
program the application to create a large pool of threads when it initializes.
A thread from the pool will process each incoming request or series
of requests issued over the same client connection, so that each client
request has a dedicated thread. This approach is easy to implement but
suffers from several drawbacks. First, the server must know at the time
it initializes what kind of client load it will be subject to, so that
it can create the appropriate number of threads. Another drawback is
that large numbers of threads (an enterprise environment can produce
thousands of simultaneous client requests) can drain server resources
significantly. Sometimes, resources might not be adequate to create
all the threads the application wants to create. Furthermore, many threads
actively processing requests force the server to divide CPU time among
the threads. Managing the threads will consume precious processor time,
and switching between competing threads introduces significant overhead.
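As an editorial aside to Russinovich's text: the pool model he describes can be sketched in a few lines of Python. This is a minimal illustration under assumed names (`serve_with_pool`, uppercasing as a stand-in for request handling), not production server code -- a small fixed pool of worker threads drains a shared queue of client requests instead of dedicating one thread per request:

```python
import queue
import threading

def serve_with_pool(requests, n_workers=4):
    """Handle many client requests with a small fixed pool of threads,
    instead of dedicating one thread to every request."""
    work = queue.Queue()
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            req = work.get()
            if req is None:                    # sentinel: shut this worker down
                return
            with lock:
                results.append(req.upper())    # stand-in for real request handling

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for req in requests:                       # requests multiplexed over the pool
        work.put(req)
    for _ in threads:
        work.put(None)                         # one sentinel per worker
    for t in threads:
        t.join()
    return results

print(sorted(serve_with_pool(["get /a", "get /b", "get /c"])))
# → ['GET /A', 'GET /B', 'GET /C']
```

Note that the pool size is fixed at initialization, which is exactly the drawback the paragraph above points out: the server must guess its client load in advance.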

Because a one-thread-to-one-client-request
model is inefficient in enterprise environments, server applications
must be able to specify a small number of threads in order to divide
among themselves the processing for a large number of client requests.
Where this client-multiplexing capability is present, no one-to-one
correspondence between a thread and a client request occurs. Neither
does a one-to-one correspondence between a client request and a thread
occur—one thread might share a client request's processing with several
other threads. Several OS requirements are necessary for a client-multiplexing
server design to be feasible. The first requirement is that a thread
must be able to simultaneously wait for multiple events: the arrival
of a new client request on a new client connection, and a new client
request occurring on an existing client connection. For example, a Web
server will keep multiple browser connections open and active while
accepting new browser connections as multiple users access a Web site
the server manages. Connections between a browser and the server can
stay open for several seconds while large files transmit over a connection,
or while the browser requests multiple files over the connection.

The second requirement is that the threads
must be able to issue asynchronous I/O requests. Asynchronous
I/O is an OS-provided feature whereby a thread can initiate I/O and
perform other work while the I/O is in progress—the thread can check
the I/O result at a later time. For example, if a server thread wants
to asynchronously read a file for a client request, the thread can start
the read operation and wait for other client requests while the read
is in progress. When the read completes, the system notifies a thread
(not necessarily the thread that began the read operation) so that the
thread can check the I/O's status (i.e., success or failure) and whether
the I/O is complete.

Without asynchronous I/O, a thread initiating
an I/O operation must wait while the operation takes place. This
synchronous I/O causes multiple-client-per-thread server designs
to perform poorly. Because such server designs designate limited thread
pools, taking threads out of commission to perform I/O can lead to a
situation in which no threads are available to accept new client requests
or connections. In such a case, a multiprocessor's CPUs might remain
idle while client requests sit backlogged. Worse, the server might never
have a chance to service client requests, because the client might stop
waiting for the server.
Figure 1 contrasts asynchronous and synchronous I/O.
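As an editorial aside: the contrast can be simulated in Python with a non-blocking pipe. Non-blocking I/O is not full asynchronous I/O (there is no completion notification), but it shows the key difference from a synchronous call, which would stall the thread until data arrives. The helper name `read_without_waiting` is hypothetical:

```python
import os

def read_without_waiting(fd, n=100):
    """Attempt a read, but return None instead of stalling when no data
    is available -- the immediate return an async design relies on."""
    try:
        return os.read(fd, n)
    except BlockingIOError:
        return None

r, w = os.pipe()
os.set_blocking(r, False)          # a blocking read() here would hang the thread

print(read_without_waiting(r))     # → None (no data yet; call returns at once)
os.write(w, b"reply")
print(read_without_waiting(r))     # → b'reply'
```

With the synchronous default (`os.set_blocking(r, True)`), the first `os.read` would simply block, taking the thread out of commission exactly as described above.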

Linux
and Request Processing
Unfortunately, Linux 2.2 doesn't satisfy either client-multiplexing
server-design requirement: Linux 2.2 cannot efficiently wait for multiple
events, and it doesn't support asynchronous I/O. Let's look more closely
at each of these concerns.

Linux provides only one general API to
server applications that want to wait on multiple requests—the select
API. Select is a UNIX system call that has been present in every
UNIX release since the OS's initial development. Select is one of the
OS interface functions that has become part of the POSIX standard for
UNIX API compatibility. One reason that the Linux select implementation
is not an acceptable function for waiting on multiple events is that
the system uses select to notify all threads that are waiting on the
same event whenever the event occurs (e.g., the arrival of a request
from a new client). Notifying multiple threads in this way degrades
server performance: Only one thread can handle the new request or connection,
and the other notified threads must return to a state of waiting. In
addition, synchronization causes overhead as the threads agree among
themselves which one will service the request. Other secondary overhead
results when the OS divides CPU time among the threads it has needlessly
notified. This kind of limitation forces a network server application
to designate only one thread to wait for new incoming client requests.
This thread can either process the new request itself, waking up another
thread to take over the role of waiting for new requests, or the original
thread can hand the request off to a waiting thread. Both alternatives
add overhead, because every time a new client request arrives, the waiting
thread receives notification and must then notify another thread.

If Linux provided some additional application
support, the OS could wake up only one thread. For example, an application
could specify that even though multiple threads are waiting for a particular
event to occur, the application wants only one of the threads to receive
notification for each occurrence of the event. NT provides such support
for its waiting functions (NT server applications do not typically use
select, although NT implements the select call for compatibility with
the POSIX standard) to allow multiple threads to efficiently wait for
incoming client requests.

Select suffers another serious problem:
It doesn't scale. A Linux application can use select to wait for up
to 1024 client connections or request endpoints. However, when an application
receives notification of an event, the select call must determine which
event occurred, before reporting the event to the application. Select
uses a linear search to determine the first triggered event in the set
the application is waiting for. In a linear search, select checks events
sequentially until it arrives at the event responsible for the notification.
Furthermore, the network server application must go through a similar
search to determine which event select reports. As the number of events
a thread waits for grows, so does the overhead of these two searches.
The resulting CPU cost can significantly degrade a server's performance
in an enterprise environment.
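As an editorial aside, the application-side half of that scan is easy to reproduce with Python's wrapper around the same select() call. In this hypothetical sketch, pipes stand in for client sockets:

```python
import os
import select

# Pipes stand in for client sockets; a real server would wait on connections.
pipes = [os.pipe() for _ in range(8)]
read_fds = [r for r, _ in pipes]

os.write(pipes[5][1], b"request")          # one "client" sends a request

# select() reports readiness, but the application must still scan its own
# descriptor list to find which one fired -- the linear search described above.
ready, _, _ = select.select(read_fds, [], [], 1.0)
fired = [i for i, r in enumerate(read_fds) if r in ready]
print(fired)    # → [5]
```

With 8 descriptors the scan is trivial; with the 1024 that select() allows, both the kernel's pass over the fd set and this application-level loop become measurable per-event costs.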

NT incorporates a unique feature known
as completion ports to avoid the overhead of searching. A completion
port represents a group of events. To wait on multiple events, a server
associates the events with a completion port and waits for the completion
port event. No hard upper limit exists on the number of events a server
can associate with a completion port, and the server application need
not search for which event occurred—when the server receives notification
of a completion port event, the server also receives information about
which event occurred. Similarly, the kernel doesn't perform searches,
because the kernel knows which events the system associates with specific
completion ports. Completion ports simplify the design and implementation
of highly scalable server applications, and most enterprise-class NT
network server applications use completion ports.
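As an editorial aside: the same "tell me which event fired" idea later became available on Linux through readiness APIs that carry per-registration data. Python's selectors module (backed by epoll on Linux, which was added to the kernel well after 2.2) hands back the registration key of the descriptor that became ready, so the application does no searching. This is a rough analogue of completion-port notification, not an implementation of it:

```python
import os
import selectors

# Register many descriptors once; each notification carries the data we
# attached at registration, so no search over the full list is needed.
sel = selectors.DefaultSelector()
pipes = [os.pipe() for _ in range(8)]
for i, (r, _) in enumerate(pipes):
    sel.register(r, selectors.EVENT_READ, data=i)   # data identifies the "client"

os.write(pipes[3][1], b"request")                   # one "client" sends a request
events = sel.select(timeout=1.0)
fired = [key.data for key, _ in events]
print(fired)    # → [3]
```

The design point is the same one Russinovich credits to completion ports: the notification itself identifies the event, so per-event cost stays flat as the number of registered clients grows.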

Even on the desktop, 2.2 still had a long way to go: it still demonstrated
sound stutter, jerkiness when moving windows and slow window manager performance
(which is connected with the weak support for interactive applications in the
kernel).

Another consequence of this overload is that polishing the existing codebase
became a top priority. Linux needed to compete with commercial systems in
benchmarks, and that requires polishing and tuning, polishing and tuning...
That became pretty clear during the so-called
Mindcraft fiasco that took place in
early 1999. This was a Microsoft-sponsored (and Mindcraft-executed)
test which showed that, despite Raymondism's claims, Linux 2.2
still had problems in the application area where it is most widely used --
as a web server. Here are some important results from the test:

It was natural to expect that Eric Raymond would fiercely defend his beloved
operating system and his CatB fairy tale against any enemies, including
the truth. And the desire to see a rebuttal was instantly gratified (see
Linux
Today: Eric S. Raymond -- Latest FUD tactic may be backfiring; the readers'
postings are also very telling and say much more about Raymondism as a phenomenon
than my writings):

A 21 April ITWeb story [1] reported results
by a benchmarking shop called Mindcraft that supposedly showed NT to
be faster than Linux at SMB and Web service. The story also claimed
that technical support for tuning the Linux system had been impossible
to find.

Previous independent benchmarks
(such as [2]) have found Linux and other Unixes to be dramatically faster
and more efficient than NT, and independent observers (beginning
with a celebrated InfoWorld article in 1998 [3]) have lauded the Linux
community's responsiveness to support problems. Linux fans smelled
a rat somewhere (uttering responses typified by [4]), and amidst the
ensuing storm of protest some interesting facts came to light.

1. The benchmark had been paid for by
Microsoft. The Mindcraft press release failed to mention this fact.

2. Mindcraft did in fact get a useful
answer [5] to its request for help tuning the Linux system. But they
did not answer the request for more information, neither did they follow
the tuning suggestions given. Also, they forged the reply email address
to conceal themselves -- the connection was made after the fact by a
Usenetter who noticed that the unusual machine configuration described
in the request exactly matched that of the test system in the Mindcraft
results.

3. Red Hat, the Linux distributor Mindcraft
says it asked for help, reports that it got one phone call from them
on the installation-help line, which isn't supposed to answer post-installation
questions about things like advanced server tuning. Evidently
Mindcraft's efforts to get help tuning the system were feeble -- at
best incompetent, at worst cynical gestures.

4. An entertainingly-written article
[6] by the head of the development team for Samba (one of the key pieces
of Linux software involved in the benchmark) described how Mindcraft
could have done a better job of tuning. The article revealed that one
of Mindcraft's Samba tweaks had the effect of slowing their Linux down
quite drastically.

5. Another Usenet article [7] independently
pointed out that Mindcraft had deliberately chosen a logging format
that imposed a lot of overhead on Apache (the web server used for the
Linux tests).

So far, so sordid -- a fairly standard
tale of Microsoft paying to get exactly the FUD it wants from a nominally
independent third party. But the story took a strange turn today (22
Apr) when Microsoft spokesperson Ian Hatton effectively admitted [8]
that the test had been rigged! "A very highly-tuned NT server"
Mr. Hatton said "was pitted against a very poorly tuned Linux server".

He then attempted to spin the whole episode
around by complaining that Microsoft and its PR company had received
"malicious and obscene" email from Linux fans and slamming this supposed
"unprofessionalism". One wonders if Hatton believes it would be "unprofessional"
to address strong language to a burglar caught in the act of nipping
the family silver.

In any case, Microsoft's underhanded
tactics seem (as with its clumsy "astroturf" campaign against the DOJ
lawsuit) likely to come back to haunt it. The trade press had largely
greeted the Mindcraft results with yawns and skepticism even before
Hatton's admission. And it's hard to see how Microsoft will be able
to credibly quote anti-Linux benchmarks in the future after this fiasco.

Despite numerous attempts by Raymondists (and some Linux developers, including
Alan Cox) to dismiss the finding as a one-sided report and to destroy
Mindcraft's credibility by painting the test as a PR exercise performed
by paid Microsoft puppets (attempts that most Linux developers later regretted),
Linus was faced with the urgent necessity of raising his own game in TCP/IP
stack and SMP improvement. He understood better than anybody else
that even if Linux had been properly tuned, it still might not have matched NT
on that hardware. Part of the test was run on a high-end SMP box, and
at that time NT may well have outperformed Linux 2.2 on that sort of platform:
SMP in the 2.2 kernel was not as good as in Windows NT, to say nothing of
AIX or Solaris. This put him in the extremely difficult position of deciding
how to proceed without a real kernel development lab and
without any development plan. That probably added to an already substantial
overload and played a role in the one-year delay of the 2.4 kernel. Here is how
he recollected the events later in his December 2000
Linux Magazine interview:

...the MindCraft thing [MindCraft is an independent
research laboratory that last year reported test results -- paid for
by Microsoft -- indicating that Windows NT outperformed Linux in certain
basic server tasks. -- Ed.]

LM: How hard was that to deal with?

LT: It was really personal for a few
months. I took it fairly personal, especially the way they did it.

LM: What happened?

LT: Well, it was a panel discussion in
Chicago and it was the first time I'd been on the floor at the same
time as people from Microsoft. Five minutes before the panel started,
the Microsoft guy handed out this paper that contained the results from
the MindCraft study, and I didn't even have time to really see what
it meant. So, when he actually took this up in the panel, it was hard
for me to say anything.

LM: But in the end, Microsoft was right,
don't you think?

LT: Microsoft was right. The point was
that it actually gave us a much better baseline to compare what we were
bad at. We'd probably been naive and thought that we were doing some
things really well. Then having somebody do that comparison was very
motivational. That was quite important.

Everybody expected some kind of attack
from Microsoft, so I think we'd been a bit arrogant in believing that
there were so many benchmarks that we were so much better at than NT.
It took a lot of people by surprise, including me. We really lost badly
in that one. There was certainly that kind of naivete.

The well-oiled machine of Linux startup PR professionals, distributors, and
(perhaps, in the pre-IPO period) even high-caliber PR people from the largest
investment banks created a very interesting public image of Linus Torvalds.
The zeal to create a "positive, sellable image" of the Father of Linux might
be studied as an example of a highly successful PR campaign in the communication
departments of Ivy League universities. Even Linus Torvalds' appearance
on the front page of Forbes was probably not accidental.

All in all, it created a cult-of-personality effect similar to that surrounding
the typical leader of a totalitarian party. Several journalists who tried to
exploit the "make money fast" angle contributed to this trend too.
The results were pretty impressive even for Eastern Europeans, who had suffered
through such melodramas for most of their lives. As in many religious
cults and radical movements, Linus became especially attractive as a role
model for teenagers. Here are typical examples:

Name: Kenneth Kowalsky

Location: Victoria, BC, Canada

Occupation: Student

Linus is my idol.

I am a 15 year old student. Programming
and computer science is my life. I have used Linux for the past
3 years and found the subject of Linux to be highly interesting
- not because it is Unix or the fact that it is highly stable
- because it has started a revolution. If you compare Linux
to MS Windows NT, Linux comes out on top. The reason is the
open source factor, while windows is only worked on by a 1000
programmers (at most), Linux is worked on by millions. This is
because Linux is open source, which allows anyone to patch up
or add to the current design. I mean anyone, including you who
are reading this note could go to kernel.org and get the tarball
and within the hour be knee deep in source code.

This gives Linux an advantage,
the kind of advantage that only can be achieved with open source.
I don’t mean to say that every instance of Linux on the planet
will kick butt with windows. When Linux is poorly configured
it can crash more than windows, but when properly configured
it has ten times the power that any windows copy could ever
dream of.

Linus is the person that I hope
to be one day, don’t get me wrong. I don’t want to be the guy
in charge of Linux, I don’t even really want to start a movement
– but I do want one thing in common with Linus – I want to be
able to give away something that will better the computing world.
If you think about it, Linus gave away something that could
have brought him millions – or even billions. Linux could have
been the next Microsoft Windows.

It takes an incredible person
to do something like that – that is why I look up to Him.

Name: CVRadhakrishna

Location: Trivandrum, India

Occupation: Scientist

Linus is to the Computer
Field as MK Gandhi was to public life. Mahatma Gandhi gave exemplary
concepts like Satyagraha (passive resistance?) against oppression,
and Linus has given the Computer world Linux against monopolies,
trademarks and patent dominance. Both gave freedom to the people,
one at the political and social level and the other in the realm of
Computers.

Name: Faisal Halim

Location: Abu Dhabi, U.A.E.

Occupation: Student

A MODEL worth EMULATING!

Mr. Linus Torvalds has proved
to students a model worth emulating. Not only does he develop
software based on the principles of software engineering, he
actually does not bear a fanatic attitude towards software licences.

Here, at Islamia English School,
we, the boys of A Level I, are asked to be models for the younger
boys to emulate. I think Mr. Linus is a man we can look up to
emulate, besides our teachers.

He has upheld the principles
of the hacker culture, been well mannered in the face of companies
violating the norms of the computer industry, a premise where
many people would just start to flame, he gives importance to
his appearance, he is capable of making presentations, and he
has broken the dark mist that has gathered on hackerdom by bringing
up a family.

What is really sad is that the initial reaction of Linus (and Alan Cox) to the
Mindcraft fiasco might be an early warning of a tendency common
to many other open source projects: the initial developer, instead of being part
of the solution, eventually becomes part of the problem. Identification
with the project and the desire to keep control of it at some
point start to hurt the project's future. Remember that revolutions
usually eat their own children, and big software projects are not so different
from revolutions.

As for Linus' "world dominance" goal: the attempt to play (and win)
both the desktop and the server game was far from an encouraging sign. It was a
continuation of the tremendous overload, pure and simple. That's why version 2.2 lacked
a journaling file system, which means that if fsck couldn't fix a disk after
a power failure or system crash, you were pretty much hosed and the
most recent backup was your only chance, if any. That's not good for availability,
but a project leader is just a mortal human...

But the question of succession and the organization of a Linux kernel lab has
another dimension. If you are the leader of a big project, you need
to be open about succession, because the project will definitely outlive
its initial leader. Here Torvalds' behavior looks really ambivalent and
demonstrates his supersized ego. And that spells trouble for Linux
as a project, and especially for Linux companies like Red Hat or Caldera.
As Russ Mitchell pointed out in his
Business 2.0 article:

It would be
natural to expect Torvalds to respond to questions about succession
issues. But he's not talking. Busy developing software at computer chip
maker Transmeta in Santa Clara, Calif., even as he continues to manage
the Linux project, Torvalds has been keeping a low profile of late.
His handlers rebuffed several invitations from Business 2.0
to talk about the Linux organization, the question of succession, and
his own plans for the future.

Although the media tend to trumpet Linux as a global collective of
hundreds or thousands of programmers headed by Torvalds, the truth is
a bit more complicated. In fact, Torvalds and a tight cabal of top-notch
programmers that numbers fewer than a dozen do most of the heavy lifting.
Torvalds and his small team are responsible for developing new iterations
of Linux operating system software and for fixing bugs and otherwise
maintaining the current iteration. While the group is barely known within
the software industry, Torvalds' successor would emerge from this group.

More than two dozen Linux insiders were polled to name the one individual
most likely to take the baton from Torvalds, and one name consistently
turned up: Alan Cox. Cox is a bush-bearded, long-haired Brit who lives
in England and works for Red Hat. Cox is one of the "few people [Torvalds]
trusts to make important decisions about future directions," says Peter
Wayner, author of Free For All, a recent book on the history
of Linux. Cox, he says, is "responsible for making sure that most of
the new ideas that people suggest for Linux are considered carefully
and integrated correctly."

When Torvalds took his first vacation in eight years last summer
(a mere two weeks), Cox ran the operation -- and he did more than keep
the wheels on the tracks: "While Linus was gone, Linux development became
unblocked in certain ways," says Michael Tiemann, an open source pioneer
who founded Cygnus Systems and now is chief technology officer at Red
Hat. Tiemann says Cox made a number of decisions about Linux code that
"broke some logjams" that had been hampering Linux development.

Cox and Torvalds could hardly look less alike. Torvalds is a cherub-cheeked
boy next door. Cox is the wild and wooly hippie freak. Don't fall for
stereotypes, though -- it turns out that Torvalds is the more loosey-goosey
manager, according to Tiemann. "Linus is adamant that people do their
own thing," he says. "His message [to the programmer] is that you should
do what works for you." Cox says to them: "This is how it works best
together, this is how your contribution can become more of a building
block." Linus seems more of an anarchist, Alan more of a constructionist.

[...]

When a corporation is headed by
a charismatic leader who won't spell out a clear path of succession,
investors usually get spooked. Are Linux customers worried?
There is little evidence at the moment. But they may be unaware
of how loose and uncertain is the structure that Torvalds sits upon.
They may be unaware that Linux's benevolent dictator has chosen not
to reveal his succession plans -- or even to say whether he has crafted
any succession plan at all.

As I already mentioned in the "Raymondism" section, there were warning
signs of the common "cult of personality" disease in Linux development
starting from early 1998. Work on Linux standards was (and is) marginal
and is not supported by Linus, as it undermines his power as the supreme
technical guru. Documentation of internals was in really bad shape, and Linus
politely explained that "he is bad at documentation" :-).

Starting with the kernel 2.2-based distributions, interoperability
problems began to cause concern, and papers started to appear about
problems in this area. The speed of development and the overload made any
fix of the situation problematic, though in late 2000 market conditions
seemed to take care of the minor Linux distribution players without any lab.
Still, there are questions to be answered about the maturity and rationality of
the decision to go for the desktop, low-end servers, and high-end
computing simultaneously, with a single person as the supreme coordinator of all
development efforts. This particular question is very difficult to answer
because of the personality issues involved.

For example, it's not clear how long Linus will be able to contribute positively
to kernel development or, more importantly, to coordinate productively the
increasingly complex and demanding configuration management job on both
ends -- desktop and high-end server. That to a certain extent deprives
Alan Cox and several other major developers of opportunities to influence
strategic decisions in the development.

Currently Linus is still the focal point of all kernel development efforts,
but with three daughters he is definitely not able to work 10 hours or more
a day on the problem (10 hours a day, six days a week, looks like the bare
minimum for the leader of such a project).

Yes, I understand that he wants to achieve his financial goals, but in
such a situation the extent to which he is still a positive force is unclear.
Maybe it is time for him to move his responsibilities as kernel development
coordinator to some independent body, like the proposed Linux Kernel Development
Lab financed by all commercial Linux distributors. For example, Linus
is now trying to coordinate symmetric multiprocessing support in the kernel
without a proper laboratory to test proposed solutions, and actually without
any supporting staff (unless there is some shadow support staff at Transmeta).
He essentially controls very few areas of the kernel (probably less than
10%, including the memory management code and the scheduler, both at a level far
behind the competition and both constant sources of problems), so his main
contribution now is management of the development. It remains to be seen
whether he will be up to the task and will continue to play a constructive
role in a movement that has largely outgrown the model of a loosely coupled
mob of Internet-connected developers.

Here his Transmeta affiliation looks far from the best solution.
And I am not only talking about a possible conflict of interest in the style of
"what is good for Transmeta is good for Linux." I believe that
Larry Wall made a better decision when
he went to work for O'Reilly. First, publishers are closer to academic institutions
than most commercial organizations. Second, at a certain point you need a
real organization to help you in your efforts. From that point of view, organizations
like FreeBSD or even GNU, with several core developers able to
contact each other physically, have an advantage over the current Linux infrastructure.

In any case, in some 1999 interviews Linus looks exhausted to the level
of burnout, and it is evident that he hated having to produce another version
of the kernel in 1999. In
one interview
he even suggested that it would probably be better if Linux returned to
the status of his home hobby. This, of course, is now completely impossible,
no matter how much he wants it or how exhausted he feels. And he was forced
into another rat race to get kernel 2.4 out in 1999 -- a race that
was lost.

The question of
burnout is also important for other volunteer kernel developers, as excessive
load is the best way of killing any interest in participating in a
volunteer project. I think this is one of the main dangers, as
talented volunteer developers are the most valuable capital Linux has. With
hired developers only (say, Red Hat people exclusively), Linux is not much
different from closed commercial OSes like Solaris. It's just more difficult
to sell profitably because of the "once GNU, forever GNU" factor.

From another point of view, if a program has a couple hundred thousand
lines, does it matter much whether it is open source or not? Open source at
this scale is just an intermediate representation, and other intellectual assets --
internals documents, special tools for dealing with this level of complexity,
etc. -- are significant barriers to entry. That actually explains Linus' unwillingness
to spend any time documenting development. Netscape's failure with opening
Communicator showed that large-scale projects are special beasts, and that
the complexity of a project at a mature stage is a powerful barrier to entry
even for the most motivated and skilled developers. And please remember that
before the mid-seventies almost all IBM mainframe software products were open
sourced...

Since approximately 1997 Linus has probably been the best-known Finn
on the planet. He became a media darling -- a very dangerous and
rather time-consuming occupation, leaving not much time to devote to the kernel
:-). And the media are pretty cruel, serving as a powerful amplifier of the "cult
of personality" disease. He now has dozens of pages and hundreds of papers
devoted to him, including this chapter ;-). Among the most active builders
of the Linus cult I would like to mention Eric Raymond, who lionized Linus
in his famous "The Cathedral and the Bazaar" paper (here Eric Raymond's motives
are pretty clear -- he wanted to benefit from the cult; see my
OSS page for details). The Linux movement
instantly became the Open Source movement, and where Linux and Open Source are
concerned,
hyperbole from the digerati hype meisters proliferates nearly as
quickly as the
hyperlinks. Here is one quote from
LinuxWorld Today that is as close to the style of Soviet media depicting
party functionaries as one can ever get:

Throughout the entire keynote the
audience listened almost in awe to this man with the good humor, the technical genius, and the incredible people skills
that have been able to harness the brilliance of the bazaar,
those hundreds, thousands of developers on the Internet, so that they
all pull together to move Linux forward, release by release, to the
point where it is today. An entirely unlikely object of affection
of nearly the entire computing industry.

And he himself proved to be as susceptible to this disease as communist
bureaucrats. In an early 1999 interview
Linus for the first time applied the word "invented" to Linux -- a word completely
inapplicable to a clone OS and very close to Microsoft's marketing style
(as everybody knows, Microsoft "invented Windows" by a blatant rip-off
of the Mac GUI ;-). As always, he failed to mention GNU (moreover, this time
he even forgot to mention that the adoption of the GPL license was the smartest
thing he ever did; probably by this time he already had some reservations
-- see the KDE jihad):

PC Week: Give us the short history
of Linux's development.

Torvalds: Basically,
I invented it eight years ago, almost exactly eight years
ago. It started small, not even an operating system. It was just
a personal project. I just was doing something fun with my new machine.
It kind of evolved through luck and happenstance into an OS, simply
because there was very much a void where there wasn't much choice
for someone like me. I couldn't afford some of the commercial OSes
and I didn't want to run DOS or Windows -- I don't even know, did
Windows really exist then?

PC Week: You could have copyrighted
Linux and made a fortune. Why did you make it an open source code
operating system, and will that model work in the future as Linux
acceptance grows?

Torvalds: It started out as
a personal belief that, yes, open source was
needed. Then, when it got large enough, I encouraged people to license
their own development, their own parts. Now there are multiple owners
sharing all these licenses...

"Father" Linus Torvalds is now the holy cardinal of the church of Linux.
You can find more relevant quotes in the
Webliography to this chapter. Anyway, this
"superhero" status of Linus Torvalds reminds me of communist empires, with
their First Secretaries determined to die "serving the country" :-).
And, yes, the kernel can be a kind of mausoleum...

What I would like to stress is that Open Source is a very important avenue
of software development that needs to be supported, but it is unrealistic
to consider it a panacea -- software development is really hard, open
sourced or not. So anyone who wants to participate should
have no illusions about open source/free software. There is no royal
road to accomplishing big software projects, and even with luck one needs
to struggle and suffer a lot. That also means the heroes of OSS development
deserve their fame, despite all the in-fighting and problems that OSS faces.
At the same time it's important to understand that none of them is, or ever
was, an angel; they were driven by (always) complex and (often)
contradictory motives. We also need to understand that usually one
or two central figures catch all the fame that should belong to many more
people, some never mentioned in the mainstream press. And
we do not need a communist-style cult of personality for OSS leaders,
who IMHO are to a certain extent "accidental leaders" of the Linux cult.

Pressure was mounting against Linus' exclusive control of the kernel.
For example, in the story
Is Linus Killing Linux, published on Jan 26, 2001 at
Techweb, Paula Rooney questioned the status quo:

... some solution providers, vendors,
and industry observers are beginning to question how long one man can
steer the evolution of
Linux, and whether Torvalds' sole oversight of the kernel,
now at version 2.4, is slowing its corporate adoption.

While he's not driven by profit motive,
the engineer has significant power over the kernel: Linux is a registered
trademark of Linus Torvalds himself.

"We need a full-time leader and a nonprofit
organization that can be funded by IBM, Compaq, and Dell and the [Linux]
distributors," said Hal Davison, owner and president of Davison Consulting,
Sarasota, Fla.

Some Linux solution providers view the
constantly evolving process of the posting of Linux libraries, patches,
and updates to the Internet as inefficient and cumbersome, Davison said.

"VARs are reluctant because they don't
see a clear channel. They don't see a Microsoft or strong corporate
company saying, 'We're going to be here forever,'" he said.

Torvalds opposes the notion of corporate
interests controlling the destiny of the Linux kernel.

However, experts say he'll face pressure
from big OEMs and ISVs that are bankrolling the transformation of the
technology into a lucrative industry.

The Linux market stands to double this
year to $4 billion, according to Deutsche Banc Alex. Brown, a Wall Street
investment firm. OEMs are hopeful but leery about Torvalds' casual indifference
to market needs and capitalist concerns.

IBM's recent pledge to spend $1 billion
to advance Linux commercially in 2001 comes with a no-strings-attached
promise today, but observers say that won't last if Linux doesn't pick
up steam in the form of revenue and profits.

For example, at the LinuxWorld conference
in New York, IBM plans to unveil new Linux initiatives and clients,
including Shell Oil.

"In the early stages of open source,
it was more of a charitable affair and developers didn't attach a fee,"
said George Weiss, an analyst at Gartner. "But the vendors are in it
for financial success, and they'll think twice about being charitable
while answering to their stockholders."

Publicly, blue-chip vendors recognize
Torvalds as the lead Linux developer, but note that they aren't beholden
to his final nod to carry out their product plans, as they are with
Microsoft's Bill Gates.

Still, insiders say Torvalds' casual
e-mail sign-offs on the kernel carry tremendous weight in the commercial
market and down the food chain from OEMs to ISVs, solution providers,
and customers.

For instance, when Torvalds declared
Linux 2.4 finished several weeks ago, only Red Hat opted to ship an
upgrade based on the "preproduction" Linux 2.4 kernel. Since then, Linux
distributors have begun detailing their product deliverables based on
the new kernel.

"[Torvalds'] decisions are not ones you'd
quickly throw out the window," said Bob Shimp, senior director of database
marketing at Oracle, Redwood Shores, Calif., which contributed to Linux
2.4 development.

"When he's ready to release the final
version, that's when distributors package it up," Shimp said. "Having
a little bit of control like that is a good thing. It all boils down
to market forces."

Despite Torvalds' technical reign over
Linux, IBM and Compaq have quickly become the industry's de facto Linux
leaders, and tensions over the kernel's direction will heighten as market
forces intensify, experts say.

"I don't believe open source works well
for commercial companies because they can't control schedules," said
Michael Cusumano, a professor at the Massachusetts Institute of Technology's
Sloan School of Management who sits on the board of solution provider
NetNumina Solutions. "Software companies try to have regular development
cycles. That's how you build a rhythm for a company."

I think a large part of the business models that Linux companies tried
to adopt, and for which they eventually paid a huge price, is attributable
to venture capitalists' greed. With the explosive combination of venture
capital and investment banks, you simply throw money at something
fashionable in order to build something akin to a pyramid scheme, in which
you hope to sell off shares to gullible investors during the IPO and get out
before the pyramid collapses. The collapse is partially caused by
excessive valuations and by the excessive zeal with which new public companies
tried to expand their markets to justify the unrealistic valuations they got
during the IPO thanks to subtle manipulations by the underwriter banks. This
can be considered a new genre of Greek tragedy played out on sunny
California soil :-). The ancient gods were bloodthirsty...

Venture capitalists expect you to do your best to inflate the stock value
in the post-IPO period, as this is when they plan to get a (huge)
return on their investment. To please them you need to hire 250 or even
500 people, even if you understand perfectly well that the business might be
sustainable with fewer than 50. This is why so many Linux companies
made huge losses in their first year. Imagine a company with a limited
market and small revenue which, like an athlete on steroids, has millions
of dollars injected into it to produce quick results no matter what. How
do you spend the money without making a loss? Your revenue will not increase
overnight, and it might be impossible to grow it beyond a very modest sum.
But you spend and spend -- on hiring extra (and unnecessary) people,
advertising, marketing, partnerships, acquisitions... Venture capitalists
want you to make a quick buck so that they can get out with a profit.
How the stock behaves after that point is not very interesting to them.

As Linux became fashionable, there was a big temptation to make a quick
buck on it, and that temptation started a wave of very questionable IPOs.
A business can only survive if it makes a profit. But the IPO itself
can be a (questionable) business model, although not a sustainable one.
And for some time the motto of the movement changed to "stop whining
about values and grab some cash." Boosting the pre-IPO share price took
precedence over creating real value and real software. Analysts shamelessly
hyped the stock of companies with which their investment banks did business.
It would be simply myopic not to see the Linux IPO story as part of the
tech bubble of the late 1990s, with all its accompanying hype, greed, and immorality.
As we will see, the post-IPO scandals and lawsuits mark the end of
an era and, later, with the Enron fiasco, raised the question of the strength of
the fundamentals of the US markets: transparency, accountability, and
trust.

Actually, a company is only able to direct its own course if it remains
free from the excessive influence of the capital markets. That excessive
influence is essentially one of the most important factors dooming many
developing-country economies, to say nothing of small startups. It
means that in a healthy financial climate the financing of a business is provided
via growth of revenue from sales, not from selling shares (for simplicity we will
set aside tax-avoidance schemes in which shares sold at a discount constitute a major
part of employee salaries). How much good can a company do for the open source
software movement, and for its owners and employees, within those constraints?

The history of Linux startups in general, and of Red Hat in particular, suggests some
answers to this question. First of all, let's state frankly that from a
business perspective each additional Linux distributor dilutes the value
of the other distributors' franchises by diverting some potential customers.
So there is not much love lost among the creators of the different commercial Linux
distributions.

The first, largely unknown, event in the commercialization
of Linux was the sale of the
www.linux.com domain
by its owner Fred van Kempen for an undisclosed sum (rumored to
be up to five million dollars) to VA Research (of future VA Linux IPO fame).
That was a nice return on investment, proving that Linux was able to
produce millionaires quickly. Fred van Kempen soon became just an interesting
footnote in the history of the crazy Linux IPOs.

In 1998 Red Hat became the undisputed leader among Linux distributors.
It got several important investments, including one from IBM (which also invested
in other Linux distributors), one from Intel, and one from Novell
(which was essentially a blow to Caldera).

Community developers are supposed to build Red Hat's product, while
the certifications and vendor endorsements are held back for the high-priced
"Red Hat Enterprise Linux" brand. With almost a hundred developers (including
Alan Cox), it looked as though Red Hat might implicitly control the direction
of Linux kernel development by the mere share of brainpower it employs.
In this sense, by the mere fact that Red Hat developers were involved
in solving the complex problem of running enterprise-style applications
like Oracle on the Linux kernel, the role of Linus Torvalds was shrinking.
Linus Torvalds cannot pay too much attention to Oracle's problems. One
sign of this is that he no longer controls production versions of the kernel,
and in some cases it's possible to outsmart Linus by putting into a production
version features to which he might object.

But Linus Torvalds is much less democratic than he appears; he still
drives the configuration management of the kernel with an iron fist, and that
pretty much limits Red Hat's possibilities, although Red Hat unquestionably has
its own proprietary kernel in its distributions. In late 2000, after Linus
sold all his Red Hat shares, he actually slammed Red Hat over the version of the GNU
compiler included in v7.0, but this was just an accident or, as somebody
cynically suggested in Linux Today, "an implicit assumption of guilt for
his Midas touch ;-)".

Red Hat's business model is by and large an outsourcing model (with "shared"
free developers or "not so shared" part-time developers producing the code
that Red Hat sells). As there are less expensive distributions, the revenue
stream from selling the software is not reliable; Red Hat
depends almost completely on its subscription-based revenue stream,
so it needs to close its distribution as much as possible. Clones like Mandrake
(which outsells Red Hat on the desktop) are (from a pure business perspective)
annoying enough to justify some counteractions.

It's interesting to note that, given the complexity of the latest versions
of Linux, Red Hat managed to lock people into its distribution without
breaking the GPL license; the key is in how they put it all together. If you
compile pine on a Red Hat 5.2 machine, the binary won't run on Debian 2.1,
although both distributions use glibc6. But if you compile it on the Debian
machine, it will run on both platforms. Something about Red Hat's development
environment prevents the Red Hat-compiled binary from running on Debian.
This power over little things, like where files should be placed and which
libc to use, is considerable power, independently of whether the product is
GPLed or not ;-). It demonstrates that if a company develops a product
on Red Hat, users may have trouble running that product on other platforms.
At the same time, many companies feel that they must develop on Red Hat
-- most of their customers will be using Red Hat.
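The kind of incompatibility described above can be spotted mechanically by comparing the shared-library dependencies of the two builds, as reported by a tool like `ldd`. Below is a toy sketch of such a comparison; the `ldd`-style output strings are hypothetical illustrations, not captured from actual Red Hat 5.2 or Debian 2.1 machines:

```python
# Toy sketch: diff the shared-library dependencies of two builds of the
# same program.  The sample outputs below are hypothetical, not taken
# from real Red Hat 5.2 / Debian 2.1 systems.

def parse_ldd(output):
    """Map library name -> resolved path from ldd-style output lines."""
    deps = {}
    for line in output.strip().splitlines():
        line = line.strip()
        if "=>" in line:
            name, _, rest = line.partition("=>")
            deps[name.strip()] = rest.split()[0] if rest.split() else None
    return deps

def dependency_diff(ldd_a, ldd_b):
    """Return library names whose resolved paths differ between builds."""
    a, b = parse_ldd(ldd_a), parse_ldd(ldd_b)
    return sorted(k for k in a.keys() | b.keys() if a.get(k) != b.get(k))

redhat_build = """
    libncurses.so.4 => /usr/lib/libncurses.so.4
    libc.so.6 => /lib/libc.so.6
"""
debian_build = """
    libncurses.so.3.4 => /lib/libncurses.so.3.4
    libc.so.6 => /lib/libc.so.6
"""

# Both builds link the same glibc, but the curses dependency differs,
# which is enough to make one distribution's binary fail on the other.
print(dependency_diff(redhat_build, debian_build))
```

The point of the sketch is that "both distributions use glibc6" is not the whole story: the rest of the library set, and where the distribution puts it, decides whether a binary travels.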

Anyway, in late 1998 it looked like Red Hat was close to being perceived
as the de facto owner of Linux. This perception created solid preconditions
for a successful IPO pyramid scheme. After Red Hat filed for its IPO, many
expected that it would do well, but nobody expected that its stock would
go that high. After all, this was a tiny company with no sound business model
and several well-funded competitors like Caldera, who sold exactly the same
stuff. But Robert Young proved to be an excellent marketer... In an
interesting move he offered 800,000 shares to members of the open source community.
On August 11, 1999 the Red Hat IPO went off with a bang. As DowJones.com
reported:

"But three other stocks making debuts
Wednesday came nowhere near Red Hat's showing, reflecting the recent
trend of underwhelming first-day performances that has prompted several
companies to delay or cancel impending initial public offerings."

"Based on the 66.8 million shares outstanding
after the offering, Red Hat has an estimated market value of about $2.96
billion. Underwriters were led by IPO heavyweight Goldman Sachs & Co."

"Regardless of the degree of enthusiasm, all agree that Red Hat
has garnered attention by positioning itself as the company best able
to capitalize on the growth of Linux, the open-source operating
system that has been developed and updated by a community of independent
programmers, who, in turn, make all their changes widely available on
the Internet."

Almost three billion for almost nothing special -- a thing that you can
buy from at least a dozen other sources or download free from the Internet.
Here is the list of major shareholders (numbers reflect the split):

LINUX. LINUX. LINUX. Linux. Chant
the word enough times — or at least release enough press releases with
the word in it — and watch your stock price rise.

It's magic. Company after company has
associated itself, somehow or other, with the popular open-source operating
system begun by Finnish programmer
Linus Torvalds, a practice that is often followed by a big — if
short-lived — upward stock move for the announcing company.

Many companies have felt especially compelled
to disclose their Linux news since the raucous debut of VA Linux (LNUX).
In the eight days following that Linux-services firm's Dec. 9
IPO, about 400 press releases with mentions of Linux found their way
onto the public relations newswires that feed into
Dow Jones
News Retrieval. No wonder one publicist we interviewed referred
to the process as "Linux spamming."

Admittedly, some have good reason to
release Linux news; like companies from Hewlett-Packard (HWP)
to eShare (ESHR),
they may be understandably eager to tout a legitimate marketing agreement
or a Linux-friendly product. In an investing world where shares
of Red Hat (RHAT),
which were priced at $14 at the August IPO, now trade around $250 and
where shares of VA Linux rose 698% at their Dec. 9 IPO, who wouldn't
want to bask in a little collateral love?

But there are those Linux releases that
are so tangentially related to actual business performance as
to bring into question the motives of the companies that release them
and the prudence of investors who buy in. Adding to the ambiguous nature
of these releases, many of the companies making such announcements possess
stocks that are, to put it politely, a little languid.

Take Minneapolis-based K-tel International
(KTEL),
whose stock slipped from a high of $13.50 a year ago to $4.50 before
a recent mild rebound to the $8 range. On Thursday, the nostalgia-music
vendor released a Linux-riddled statement announcing that as of Dec.
1 the company had moved its e-commerce technology onto Linux-based systems,
aided by Red Hat. The
release devoted one paragraph — all of 90 words — to K-tel itself
before launching into a four-paragraph, 237-word discussion of Red Hat.

...The market loved the announcement.
On the day of the announcement, K-tel shot up as much as $2, or 24.2%,
to $10.25, before settling at $8.81, for a 56-cent gain. Over 4.6 million
K-tel shares were traded Thursday, compared to an average of around
650,000. Friday, the stock fell back 97 cents, or 11.3%, to $7.63. It
is unknowable whether investors who jumped into the stock were traders
playing momentum or naive buyers responding to the tantric repetition
of the L-word, but those who bought at $10.25 certainly saw their investment
dwindle.

...For their part, the people at Red
Hat and other actual Linux companies couldn't be happier with the situation.
There's nothing like free publicity, especially after months — if not
years — in the darkness of anonymity.

"From our standpoint, it's phenomenal
news. It was six, seven, eight months ago that we couldn't get
a company to 'come out of the closet' on their Linux deployment," says
Red Hat spokeswoman Melissa London. "Whatever the motivation
may be for dropping the releases, for the Linux operating system in
general I think that it's a good thing."

Coming out of the closet on Linux evidently
isn't so frightening anymore now that investors have dubbed the operating
system the Next Big Thing. And for as long as investors react with enthusiasm
to press releases that chant Linux, companies will no doubt continue
to put them out. The headline of a Dec. 10 release from Los Gatos, Calif.-based
"Pre-IPO start-up"
Wish.com
encapsulated these wistful Linux dreams perfectly.

"Santa Delivers With Open Source Linux."

The Red Hat IPO produced a market value of almost $3 billion -- an amazing amount
if you compare it with other, much more technically interesting IPOs. And
the real thing behind this amazing market success was the hours of volunteer
work by several thousand GNU and Linux developers.

As I already mentioned, in an attempt to reward this group (and limit
early sellout to the owners of the company, as many enthusiasts might keep
their shares for ideological reasons), Red Hat made about 800,000 shares
of its stock available to them at the offering price of $14 through E*Trade.
However, some of these programmers either didn't have the $1,000 initial
deposit necessary to open an account, or didn't meet E*Trade's other
requirements for account holders. This generated a lot of excitement in the
open source community, as soon after the IPO those 800,000 shares were worth
about $60 million, before sinking to a level below the IPO price.

Linus Torvalds's dream to become rich, which was a major motivation for
his move to California, came true in 1999. Before the August 11 IPO Torvalds
received a considerable number of options on Red Hat shares. Initially
he declined to disclose how many, but later he mentioned that he got
approximately half a million dollars' worth (in his May 2001 CNet interview
Linus Torvalds, father of Linux: The shy renegade). Before the
IPO, RH shares were priced at about $10-$14 per share, which means he
got around 50,000 options. After the stock went up, he managed to sell all of
them at a price close to the top ($250) and became a real millionaire.
He bought a nice house. See
Free Software
Communities Encourage Community Reinvestment for details about the wealth
of selected shareholders.
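The back-of-the-envelope arithmetic behind that option estimate is easy to check; a small sketch, using only the figures quoted above (the half-million-dollar grant value, the $10-$14 pre-IPO price band, and the roughly $250 peak):

```python
# Rough arithmetic behind the claim that ~$500,000 worth of pre-IPO
# options priced at $10-$14 per share implies roughly 36,000-50,000 options.
grant_value = 500_000          # approximate value Torvalds cited
low, high = 10, 14             # pre-IPO price band, dollars per share

options_high = grant_value / low    # 50,000 options at the low end of the band
options_low = grant_value / high    # ~35,700 options at the high end

peak_price = 250               # price near the top at which he reportedly sold
proceeds_if_50k = 50_000 * peak_price

print(f"{options_low:,.0f} - {options_high:,.0f} options")
print(f"proceeds if 50,000 shares sold near ${peak_price}: ${proceeds_if_50k:,}")
```

So the "around 50,000 options" figure corresponds to the $10 end of the band; at $14 the grant would be closer to 36,000 options, and either way a sale near the peak comfortably clears the millionaire bar.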

For the next year Linux speculation became a big thing on Wall Street,
especially in the IPO market. And it really started (or refined, as an
Internet-startup invention) a new business model, quite different from
what Raymondism preached -- it can be called the IPO pyramid scheme as a business
model. You don't need any sound business to make money. All you need is
to be among the first in the Linux IPO gold rush and be quick to get out
before the pyramid collapses, as Linus Torvalds did. Actually, in late 1999
Linux became the center of one of the greatest financial speculations, with
a lot of freshly minted millionaires, including Linus himself. After seeing
several Linux-related IPOs, I started to suspect that a claim by Linus Torvalds
that he invented an alternative to Las Vegas casinos would be considerably more
realistic than his claim that he invented Linux ;-)

In any case, Linus Torvalds can now be considered a pioneer in commercializing
the open source movement, and his name is associated not only with the Linux kernel,
but with the Linux Gold Rush and such figures as Robert Young and Larry Augustin,
as well as the shady financial dealings surrounding the Red Hat and VA Linux IPOs.
The latter led to a number of lawsuits against the companies and their underwriters.

As with any revolution, Tim O'Reilly warned, "the open-source
pioneers will be shouldered aside by someone who understands precisely where
the current opportunity lies." And this opportunity lies in outsourcing
the development work and the possibility of running "make money fast" IPOs. It looks
like a lot of naive investors became proud owners of Linux-related companies,
enriching a few shrewd types like Bob Young, who invented the "business by IPO"
model and understood the mechanics of this pyramid scheme from the very
beginning. For example, in early 2000 Bob Young came out with a neat $30 million
in cash from selling Red Hat stock. Later he repeated the same trick
again and again, before eventually bailing out of the company.

Anyway, media personality and millionaire Torvalds is a different penguin
than the Linus Torvalds who was struggling with debugging the 0.9x kernels on
a very basic 386sx PC with a 40 MB hard drive. It takes a naive view of human
nature to think that Linus won't strive to maximize his personal wealth.
In the ZDNN article Is the Linux revolution over, the author noticed an interesting
change after the RH IPO:

Once was a time when reporters could
call the inventor of the Linux operating system at his office at cloak-and-dagger
marketed Transmeta Corp., punch in his extension and receive a familiar
declarative "Torvalds" from the man himself on the other end. He was
patient and he answered your questions. He told you when he had no time.
Sometimes he told you when you asked useless neophyte programmer questions.
But he answered the phone.

Today, when you call Transmeta and punch
in his extension, a pleasant female voice greets you. "Thank you for
calling Linus Torvalds. This voice mail does not accept messages. To
contact him, please send a fax to..."

What? And it starts to sink in. He's
not getting back to you. He's had enough. He's a celebrity and getting
a quick interview with him now will be like getting a quick interview
with that other big computer industry celebrity. The woman rattles
off a fax number and you're already thinking of hitting the old 0-#
combo for a receptionist...

"Our receptionists do not take messages
for him, nor do they keep his calendar." D'oh. She's not pleasant, she's
faux pleasant. PR pleasant. The worst. "But they will gladly get your
fax to him." Uh-huh.

And sometimes, due to his desire to make money, he really got into strange
engagements. In one case, despite his repeated claims that he is not openly
anti-Microsoft like members of the Slashdot crowd, he accepted the role of "technical
consultant" for a film that was openly designed to cash in on the antipathy
toward "the evil empire" --
MGM's thriller Antitrust. Of course it's essentially another
PR job, but this one leaves some spots on his reputation:

"Despite the obvious parody of Microsoft's
leader, the film oddly has little to do with lawsuits. The title is
more a play on the fact that Winston can't be trusted; when programmers
start to die violent deaths, Milo suspects that Winston is not merely
ruthless, but bloodthirsty. However, in a coy nod to Microsoft's real-life
enemies, Miguel de Icaza, who created the open-source operating system
Gnome, and Sun Microsystems CEO Scott McNealy make cameos as themselves.
Linux creator Linus Torvalds served as a 'technical consultant.'"

After the Red Hat stock crashed, a lot of members of the Linux community lost
money. And that created some bad feelings not only toward Red Hat, but personally
toward the freshly minted "Linux millionaires".

1. Whatever goes upon two legs is an enemy.
2. Whatever goes upon four legs, or has wings, is a friend.
3. No animal shall wear clothes.
4. No animal shall sleep in a bed.
5. No animal shall drink alcohol.
6. No animal shall kill any other animal.
7. All animals are equal.

George Orwell

The significance of the Red Hat IPO is that it signified the end of the volunteer (or
semi-volunteer ;-) period of Linux history. For some developers, like Linus
Torvalds, this period ended much earlier, but by-and-large the qualitative
change came with the Red Hat IPO. As
Joab Jackson recorded in his paper
Linux now a corporate beast, since 1999 there was no such illusion
among top Linux developers:

Dispelling the perception
that Linux is cobbled together by a large cadre of lone hackers working
in isolation, the individual in charge of managing the Linux kernel
said that most Linux improvements now come from corporations.

“People’s stereotype [of the typical Linux developer] is of a male
computer geek working in his basement writing code in his spare time,
purely for the love of his craft. Such people were a significant force
up until about five years ago,” said Andrew Morton, whose role is maintaining
the Linux kernel in its stable form.

Morton said the contribution from such enthusiasts “is waning.” Instead,
most code is generated by programmers punching the corporate time clock.

About 1,000 developers contribute changes to Linux on a regular basis,
Morton said. Of those 1,000 developers, about 100 are paid to work on
Linux by their employers. And those 100 have contributed about 37,000
of the last 38,000 changes made to the operating system. Morton spoke
last week at a meeting sponsored by the Forum on Technology and Innovation,
a semi-regular meeting to address technology-related issues held by
Sen. John Ensign (R-Nev.), Sen. Ron Wyden (D- Ore.) and the Council
on Competitiveness.

Along with UberPenguin, other key developers entered the corporate world,
using their status in the open source community as the key to highly paid
positions. But Linux is increasingly developed by a
small pool of corporate developers. Many of them are former volunteers
hired by companies such as IBM Corp., Red Hat Inc., and SGI. In most cases,
they split their time between in-house projects and time devoted to developing
the Linux kernel.

Since these companies base some of their products and services on Linux,
it is beneficial for them to have someone who is familiar with the operating
system, Morton said. Also, if these companies modify Linux for their own
products, it is in their best interest to see that those changes are incorporated
back into the public kernel. If a company-developed feature is incorporated
into Linux, that feature will be maintained by all of the Linux community,
rather than just being the responsibility of the vendor. Characterizing
Linux’s volunteer development team, Morton said open source “does tend to
attract the very best developers,” individuals who are “miles ahead of regular
developers.” Most of the developers live in the United States, Europe and
Australia. The Eastern European countries are increasingly involved, though
participation from Asian and Latin American developers remains rare, he
said. Morton’s observations were verified by other panelists. “The person
in the basement is a rare bird now,” said Morgan Reed, vice president of
public affairs for the Association for Competitive Technology. I can
add that a person developing for a US company while living on another
continent is not.

In 2003 Nicholas Carr, in his Harvard Business Review article "IT Doesn't
Matter", suggested that IT doesn't matter because it has effectively become
a commodity. Does this mean that this situation in turn creates a market
where low value-added commodities are produced outside the USA and freely
traded? While Carr was widely off the mark with his cloud fantasies,
the claim that Linux accelerated the commoditization of software looks plausible.
That also means that the job of creating commodity-type Linux applications
will move from the industrialized countries to third-world countries.
I doubt that the computer industry as a whole is becoming anything close to
a commodity (it's actually a strategic asset for most firms: look at investment
banks), but some segments of it might be. And the OS level is definitely one
such segment, as Linux can be considered a new "super BIOS", a free base
on which everybody can build the next layers of complexity. Good software
systems are always a combination of build and buy, and it is the proprietary/custom
part that creates the comparative advantage. As Sun's Jonathan Schwartz
mentioned, high-bandwidth Internet access is a more plausible commodity item
than high-end software.

" ...It's simply a critical note of the get-rich-quick-scheme
that the market has become. In no way does this note reflect on
the quality of the Linux OS, Bob Young's character, the open source
community's willingness to contribute to the wealth of a select
group of investment bankers, venture capitalists, and multi-billionaires,
or the sex-appeal of TuX tHe PiNguIn."

Disclaimer from the Slashdot post

The next big thing was, of course, the VA Linux IPO, with Eric Raymond as
a board member; it looked much more suspicious than Red Hat's,
and from the beginning it had more attributes of a financial scam.
The VA founders, as well as Eric Raymond, understood that it was time to "stop whining
about values and grab some cash". "Internet mania reached new levels
of frenzy Thursday as investors paid huge multiples on an initial public
offering, giving a market value of almost $10 billion to a tiny company
with powerful competitors, little revenue and no expectation of earnings
in the foreseeable future," wrote the NYT. A quick reality check of VA, for
example, gives us net revenues to October 29 of just under $15 million,
and an operating loss of over $10 million [Register]. As one Slashdot reader
put it:

"They don't actually do anything
other than sell Linux boxes, right? Or is there something else they
do that I'm missing? Looks to me like Linux is becoming far too much
of a buzzword for its own good. Perhaps the reason it did so well is
because VA Linux is the only Linux IPO recently to actually have the
word "Linux" in the company name..."

The VA Linux IPO pyramid scheme showed that the millions spent on buying
the domain www.linux.com
were really well spent -- those expenses were just small change after the multibillion
IPO. It not only led to huge profits for directors (see also the Eric
S. Raymond Wealth Clock), but, due to the artificial inflation of the
share price, also led to financial problems for naive believers in Linux
who decided to support the cause by buying some shares. Here is
a
message posted on the Linuxtoday forum about "absurdly rich" Eric Raymond
and the VA Linux IPO:

I must be insane, because I placed a market
order for LNUX. I now must liquidate all my retirement account
to settled the trade...

500 shares X $277 = 138,500
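The numbers in that post, and VA Linux's notorious first-day pop, are easy to verify; a quick check using only the prices cited in this article (the $277 trade, the $30 offering price, and the $239.25 first-day close):

```python
# Check the forum poster's trade and VA Linux's first-day gain,
# using only figures quoted in this article.
shares, price = 500, 277
trade_cost = shares * price            # the $138,500 in the post

ipo_price, first_close = 30.0, 239.25  # Dec. 9, 1999 offering and close
gain_pct = (first_close - ipo_price) / ipo_price * 100

print(f"trade cost: ${trade_cost:,}")
print(f"first-day gain: {gain_pct:.1f}%")
```

The gain comes out at 697.5%, which the contemporary press rounded to the famous "698%"; the trade cost matches the poster's $138,500 exactly.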

I think some of these rip-offs were later settled by court action, but
they still left a very bad taste. The stratospheric valuations of companies
like RH and VA Linux turned them into a speculators' heaven -- one of the most
profitable opportunities to short stock (sometimes also called "playing
Linux lunacy"). A lot of speculators understood that the companies were
largely fakes. And that means that you do not need a product, you do not need
a business model, you do not need profits -- all you need to become rich is
to sell short VA Linux or Red Hat stock. If the timing was right, in a couple
of months you might well become a millionaire. Actually, how many people
lost money by buying VA Linux stock at $320 or so is unclear. In the
Salon Technology paper Dissecting the VA Linux IPO a different possibility
was suggested:

Conventional wisdom holds that these
stocks are bought by small investors who are willing to buy into a good
story at any price. According to this theory, the buyers for VA Linux
are true believers who think that Linux-based computers will eventually
displace the Microsoft monopoly.

In this case, the conventional wisdom
seems wrong. In scouring hundreds of messages on investor bulletin boards,
I did not find one person who admitted to buying VA Linux at $320. This
is significant, because investors who like a stock tend to be vocal
about it. Many will happily go on Internet message boards to defend
their purchases - in part because the buzz helps keep up the price of
their holdings. If individual investors are not out in force defending
the companies prospects, you have to figure they're not holding the
stock.

There is another possible theory, and
that is that the investors who bought VA Linux the moment it opened
did so largely by accident. Something like that happened in November
1998 with TheGlobe.com. Naive individual investors put in orders to
buy the stock before it opened without specifying a "limit," or top
price. The next day some of them were
shocked that instead of buying the stock at a little bit over the
IPO price, their trades had been executed at $90 a share.

In this case, I don't think that happened,
either. After TheGlobe.com went public, many brokerages instituted tighter
control to make sure this didn't happen again. Most will now not let
investors ask to buy a new issue without putting in a limiting price.

So who was it who bought VA Linux at
these astonishing prices?

There's one theory that has gotten a
lot less notice than it should. Earlier this year, TheStreet.com's Cory
Johnson reported that investment banks were asking institutional
investors who got shares in hot IPOs to buy additional shares after
trading opened. In other words, a large mutual fund might get 30,000
shares of a hot stock at the low offering price - but only if fund managers
indicated they would buy 60,000 more shares after the stock opened for
trading.

"It might not be stated explicitly,"
says Johnson, "But investment banks will give shares to funds
that will buy more shares in the after-market and support the stock."

...I can't prove that's what happened.
But right now it sounds like a better explanation than the idea that
there are tens of thousands of small investors putting lots of money
into a stock that has jumped above $200 -- if only because I can't find
those naive investors.

I like this theory, also, on epistemological
grounds. The market is a lot more sophisticated than it is given credit
for being. It is generally a bad idea to assume that investors who buy
a stock do so because they are simply silly or inexperienced. Sometimes,
of course, that is the case, but more often it turns out that the market
behaves quite rationally, and it is only the observers who are naive.

VA Linux went public at $30 a share on Dec. 9, 1999, and closed that day
at $239.25, a 698% increase. The stock closed on Mar 25, 2001 at $3.44. This
interesting curve became a symbol of the IPO craze of a year earlier. In spring
2001 VA Linux became a poster child for everything that went wrong with the
1999-2000 Linux IPO craze. Later I found the following assessment
of this crazy situation on one of the financial blog forums:

The VA Linux IPO confirmed my suspicions that the dotcom bubble
was exactly that. There were already signs….Anyone else remember
the dotcom layoffs?? You found out you were axed when your passcard
wouldn’t work to get in the door.

I had been lucky enough, blind luck actually, to be in NYC in
the IT biz during the last five years of the century. I started
up a small contracting company. At one point I was contracting out
HS kids to do HTML at $125/hr….Insanity.

VA Linux big hype was that they were a hardware company specifically
designed for Linux OS.
I mean really….how fucking stupid is that?? LOL
But there it was….closing at $239….up some 700% from the issue price….

That little voice in my head commenced to screaming. GET OUT
NOW!!

I floated my resume and got a killer offer from a STL based brokerage
firm to do database work for them.
So I bailed. Took my NYC money and bought some rental properties
out here.

Hindsight being 20/20 I probably should have stayed and milked
the cow for a few more years. But then I’d probably be under water
on the RE…so it evens out in the end I figure.

The STL brokerage got bought by Wachovia. They were offering
parachutes and I was sick of IT so I bailed and went back to school.
Now I do NucMed/PET scans for a living.

Life is a funny thing.

In January 2001, at least 18 federal lawsuits were filed against VA
Linux, claiming that the company conspired with its investment bankers in
a kickback scheme to drive up the price of its shares in what was the largest
first-day percentage gain for a stock in at least a decade.
Those class action suits named the lead underwriter of the VA Linux IPO,
Credit Suisse First Boston, as a defendant. In March 2001 VA Linux was drawn
into a larger and potentially more dangerous investigation by the Securities
and Exchange Commission into the practices of investment banking firms in
a number of IPOs, centering on how the banks allocated the prized IPO shares
and what they may have demanded from the recipients of the shares in return.

In late 1999 the number of people who became paper Linux millionaires
was probably close to a hundred. For some time there were even a couple of
Linux billionaires. Among Linux millionaires the most prominent is of
course Linus Torvalds himself (as the holder of an undisclosed amount of
Transmeta stock and VA Linux stock, in addition to a known amount
of RHAT stock). Just as Transmeta was secretive about its products, Linus Torvalds
is very secretive about his holdings in these companies. That really distinguishes
him from Eric Raymond, who managed to declare his wealth the day after the
VA Linux IPO ;-). But still, the fact of participating in this variant of a
Ponzi scheme, in which naive supporters of Linux lost a lot of money (some
lost everything), will remain one of the most problematic pages of his biography.

Other suspicious figures that greatly benefited from Linux IPOs include
the owners of Red Hat (Marc Ewing and Robert Young -- for some time paper billionaires),
the very shady figure of former VA Linux CEO Larry Augustin, several owners
of other distributions, one writer/evangelist (Eric Raymond, who was also
a member of the board of VA Linux -- see his letter
Linux
Today Eric S. Raymond -- Surprised By Wealth, where he claimed a
paper worth of $38 million and shared with readers his plans to buy
himself some small gifts), and some Linux media persons (the Slashdot founders, who
became millionaires after the Andover.net IPO as front figures for the owners
of Andover.net).

Does anybody else find irony in the fact
that Bill Gates says he's trying to write a stable OS,
and Linus claims he's trying to take over the world?

Anonymous

As the blonde, blue-eyed icon for millions of teenage
girls and more than a few boys everywhere, Leonardo DiCaprio emerged
from relative television obscurity to become perhaps the hottest under-30
actor of the 1990s. After leading roles in William Shakespeare's Romeo
+ Juliet and James Cameron's Titanic, the actor became a phenomenon,
spawning legions of websites and an entire industry built around his
name.

The controversial Transmeta employment partially demonstrated that Linus
Torvalds does not feel strong enough to try to reshape the world and
will be satisfied with a few million dollars of Transmeta stock.
Although he would deny this, he actually became a chief marketer for a
company that would be better served technologically by quite a different
OS (a VM-based one, for example VMware).

Moreover, his popularity is not a result of his technical achievements
-- it is to a large extent the result of the press making a soap opera out of
Linux, and of his being the informal leader of an influential social movement
that split from the movement initially organized by RMS. Yes, I would agree
that the humble personality that Torvalds sometimes projects might be just
a mask (he actually confirmed this suspicion in the famous "I am a bastard"
interview), but his decision to join Transmeta was in no way a decision
based on courage. I see it more as an attempt to preserve and extend the status
quo (along with making a nice profit from his growing popularity).

Actually, it looks like Linux is a more conservative movement than the Microsoft
PC revolution, from a technical point of view as well. Torvalds, paradoxically,
is to a large extent a conservative figure, far from the accidental revolutionary
he pretends to be in his recent ghost-written autobiography. Several
authors have raised the question of a key tendency of Linux: following the
technological innovations of the proprietary Unix and Windows platforms.
Here I don't mean Microsoft's "innovation" as a marketing slogan, but real
Microsoft innovations in the area of the software component model, which do exist,
with COM as a really good example. See also:

Actually, the question "Is commercial Linux development, exemplified by
Linus' Transmeta saga, a follower and emulator of mainstream market tendencies?"
can be answered "Yes". Linus is, generally speaking, reacting
to developments in the proprietary world rather than leading it by developing
new technologies. Moreover, his efforts sometimes look more like an attempt to
eliminate the barrier to entry by lowering the price of the clone to zero
than an attempt to develop something new. As I already mentioned, the Linux movement
often uses the Microsoft strategy (replicate, embrace, extend). For example,
just as Windows was to a certain extent an attempt to duplicate the Mac, Gnome
and KDE are attempts to duplicate the Windows interface. While some innovation
actually occurred along the way, people should be aware that Linux is not
primarily about innovation, but rather about having a free (in both senses),
robust, standards-compliant operating system. A new super BIOS that others
can use.

But there is something interesting in the strong anti-Microsoft feelings
of the open source movement that I discussed in my
First Monday paper. And that suggests that some similarities should
really exist. Actually, on the personality level, the strong drive, workaholism,
desire for domination, and desire for control of the operating system
they created are definite similarities between Gates and Torvalds. The successes
of Microsoft and Linux are different, but both are similar in one important
aspect -- both are inextricably linked to a power-hungry individual,
"the fearless leader", mass media, and the corresponding cult of personality.
I would argue that despite his really unique success with the Linux kernel, Torvalds
is still more a "mediaboy" than a real business or technological leader.

Of course one might claim that Linus tried to work for the benefit of
mankind, but in fact Gates is more active in charity than Linus Torvalds
(other than the kernel itself, I do not know of any Linus Torvalds charitable
contributions). It was noted by several authors that even giving away
products and/or money can be a religious experience for strong
personality types with an intense view of the self that borders on megalomania.
An individual can give away because of the desire to make a difference
in the world and the sense of confidence to do so. Giving away
software is probably not that different from giving away money.

A "fearless leader" is a man who takes a proactive approach to the world.
Most of us are reactive -- we search for our place in the
world. Proactive people try to modify the world to make a place for themselves. But to
what extent Linux is an innovation is an open question. Microsoft
at least legitimized a lot of other people's innovations. Technically, Linux
is as conservative as Ronald Reagan was conservative politically. This
analogy extends to the role of the media in the success of both Linux and Reagan
:-). Rob Pike, in his famous paper
Systems Software Research is Irrelevant, was the first to suggest that
the success of Linux is a symptom of a deep crisis in systems software research.
Pike, well known for his appearances on ''Late Night with David Letterman,''
at this time headed the Computing Concepts Research Department at Bell Laboratories
in Murray Hill, New Jersey, where he had been since 1980, the same year
he won the Olympic silver medal in archery. He stated that the excitement,
the art, of systems research is gone. The current status quo is stagnation.
Linux the kernel, nice as it is, is certainly a derivative of tried and
true old Unix. And people are content with the same old crap, when they
should be out there innovating.

Linux Innovation? New? No, it's just
another copy of the same old stuff. OLD stuff. Compare program development
on Linux with Microsoft Visual Studio or one of the IBM Java/web toolkits.
Linux's success may indeed be the single strongest argument for my thesis:
The excitement generated by a clone of a decades-old operating system
demonstrates the void that the systems software research community has
failed to fill. Besides, Linux's cleverness is not in the software,
but in the development model, hardly a triumph of academic CS (especially
software engineering) by any measure.

Although Linus Torvalds is driven by the same motives that drove
Gates, Carnegie, and Rockefeller to build their famous empires,
Torvalds to a much greater extent was and is a media boy, a nice subject for media
fairy tales in the style of the biblical "David vs. Goliath" story.

From the point of view of charity, my point is that the charity of Linus
Torvalds, with his ten-year marathon of creating a replica of FreeBSD, is
still far from the charity of Gates, Carnegie, and Rockefeller. In
effect, such charity was a continuation of their notion that they are
changing the world. Another interesting question is: "What is the real difference
between reshaping the world by creating a (monopolistic) software empire
and then giving money in the areas that you consider critical, and monopolistically
running a charitable development of the kernel?"

But there the commonality ends. Gates is a businessman who created the
largest software empire the world ever saw and managed to keep it expanding
for several decades. And his ruthless fighting with competitors is almost
a legend (see
Hard Drive, one of the best books about early Microsoft). Linus Torvalds
is bound by the rules of the "media darling" game as well as the GPL game he played,
and he was mostly driven by events beyond his control.

In May 1999 IEEE Computer interviewed Ken Thompson to learn about
Thompson's early work on Unix and his more recent work in distributed computing.
In this interview Ken Thompson made several interesting observations. One
of them was about Linux:

Computer: In a sense, Linux
is following in this tradition. Any thoughts on this phenomenon?

Thompson: I view Linux as something
that's not Microsoft—a backlash against Microsoft, no more and no less.
I don't think it will be very successful in the long run. I've looked
at the source and there are pieces that are good and pieces that are
not. A whole bunch of random people have contributed to this source,
and the quality varies drastically.

My experience and some of my friends'
experience is that Linux is quite unreliable. Microsoft is really
unreliable but Linux is worse. In a non-PC environment, it just
won't hold up. If you're using it on a single box, that's one thing.
But if you want to use Linux in firewalls, gateways, embedded systems,
and so on, it has a long way to go.

In other parts of the interview he outlined some of his views on the software
development process and on his work with Inferno and Plan 9, which are contemporaries
of Linux, and that broad perspective makes this comment much more credible.
He especially stressed the value of simplicity and clean interfaces in
design. Simple and powerful paradigms are what made Unix such a powerful and
long-lasting operating system. As Ken Thompson explained:

Computer.
In an earlier interview you were asked what you might do differently
if you had to do Unix over again, and you said that you would add an
"e" to the creat
system call. Seriously, in hindsight, can you give us an assessment
of the problems you overcame, the elegant solutions, and the things
you would have done differently.

Thompson.
I think the major good idea in Unix was its clean and simple interface:
open, close, read, and write. This enabled the implementation
of the shell as well as Unix's portability. In earlier systems, I/O
had different entry points, but with Unix you could abstract them away:
You open a file, and if the file happens to be a tape, you could write
to it. Pipes allowed tools and filters that could accommodate
classical monster programs like sort.

Probably the glaring error in Unix was
that it underevaluated the concept of remoteness. The open-close-read-write
interface should have been encapsulated together as something for remoteness;
something that brought a group of interfaces together as a single thing—a
remote file system as opposed to a local file system.

Unix lacked that concept; there
was just one group of open-close-read-write interfaces. It was a glaring
omission and was the reason that some of the awful things came into
Unix like ptrace
and some of the system calls. Every time I looked at later versions
of Unix there were 15 new system calls, which tells you something's
wrong. I just didn't see it at the time. This was fixed in a
fairly nice way in Plan 9.
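Thompson's point about the uniform open-close-read-write interface can be sketched with Python's os module, which exposes these Unix system calls more or less directly. This is only an illustrative sketch, not anything from the interview; the scratch path /tmp/demo.txt is an arbitrary choice:

```python
import os

# The same four calls -- open, close, read, write -- work on any kind
# of file descriptor. First, a regular file:
fd = os.open("/tmp/demo.txt", os.O_CREAT | os.O_WRONLY | os.O_TRUNC, 0o644)
os.write(fd, b"hello, file\n")
os.close(fd)

fd = os.open("/tmp/demo.txt", os.O_RDONLY)
data = os.read(fd, 1024)
os.close(fd)
print(data)  # b'hello, file\n'

# Now a pipe: no separate I/O entry points are needed -- the very same
# read() and write() calls apply to the pipe's descriptors.
r, w = os.pipe()
os.write(w, b"hello, pipe\n")
os.close(w)
print(os.read(r, 1024))  # b'hello, pipe\n'
os.close(r)
```

The point of the sketch is exactly the one Thompson makes: the caller does not care whether the descriptor denotes a disk file, a pipe, or a tape.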

As we can see, Linux does not address this real or perceived deficiency
of Unix. In a certain sense Linux preserved all the most obscure things that
accumulated in Unix without any attempt to find a simplifying
paradigm for this complexity. The value of a really great system is that
it simplifies things that were previously complex.
At the beginning of the interview Ken Thompson expressed this idea very clearly:

Computer. What makes
Plan 9 and the Inferno network operating system very striking is the
consistent and aggressive use of a small number of abstractions. It
seems clear that there's a coherent vision and team assembled here working
on these projects. Could you give us further insight into how the process
works?

Thompson. The aggressive
use of a small number of abstractions is, I think, the direct
result of a very small number of people who interact closely during
the implementation. It's not a committee where everyone is trying to
introduce their favorite thing. Essentially, if you have a technical
argument or question, you have to sway two or three other people who
are very savvy. They know what is going on, and you can't put anything
over on them.

As for the process, it's hard to describe.
It's chaotic, but somehow something comes out of it. There is a structure
that comes out of it. I am a member of the Computing Sciences Research
Center, which consists of a bunch of individuals—no teams, no leaders.
It's the old Bell Labs model of research; these people just interact
every day.

At different times you have nothing to
do. You've stopped working for some reason—you finished a project or
got tired of it—and you sit around and look for something to do. You
latch on to somebody else, almost like water molecules interacting.

You get together and say, "I have an
idea for a language," and somebody gets interested. Somebody else asks
how we put networking in it. Well, so-and-so has a model for networking,
and somebody else comes in. So you have these teams that rarely get
above five or six, and usually hover around two or three. They each
bring in whatever they did previously.

So that's the way it works. There are
no projects per se in the Computing Sciences Research Center. There
are projects near it of various sorts that will draw on our research
as a resource. But they have to deal with our style. If people get stuck,
they come to us but usually don't want to deal with the management style—which
means none—that comes along with it.

Computer. You
mentioned technical arguments and how you build your case. How are technical
arguments resolved?

Thompson. When you know something
is systemically wrong despite all the parts being correct, you say there
has to be something better. You argue back and forth. You may
sway or not sway, but mostly what you do is come up with an alternative.
Try it. Many of the arguments end up that way.

You say, "I am right, the hell with you." And,
of course the person who has been "to helled with" wants to prove his
point, and so he goes off and does it. That's ultimately the way you
prove a point. So that is the way most of the arguments are done—simply
by trying them.

I don't think there are many people up in research
who have strong ideas about things that they haven't really had experience
with. They won't argue about the theory of something that's never been
done. Instead, they'll say, "Let's try this." Also, there's not that
much ego up there either, so if it's a failure you come back and say,
"Do you have another idea? That one didn't work." I have certainly generated
as many bad ideas as I have good ones.

Computer. What advice do you
have for developers who are out there now to improve their designs so
that they could be viewed as producing simple yet powerful systems?

Thompson.
That is very hard; that is a very difficult question. There are
very few people in my position who can really do a design and implement
it. Most people are a smaller peg in a big organization where
the design is done, or they do the design but can't implement it, or
they don't understand the entire system. They are just part of a design
team. There are very few people I could give advice to.

It's hard to give advice in a product
kind of world when what I do, I guess, is some form of computer Darwinism:
Try it, and if it doesn't work throw it out and do it again. You just
can't do that in a product-development environment.

Plus I am not sure there are real principles
involved as opposed to serendipity: You happened to require this as
a function before someone else saw the need for it. The way you happen
upon what you think about is just very lucky. My advice to you is just
be lucky. Go out there and buy low and sell high, and everything will
be fine.

And in another part of the interview he expressed the same thing in a
different way:

Computer. Your nominators
and endorsers for the Kanai Award consistently characterized your work
as simple yet powerful. How do you discover such powerful abstractions?

Thompson. It is the way I think.
I am a very bottom-up thinker. If you give me the right kind of Tinker
Toys, I can imagine the building. I can sit there and see primitives
and recognize their power to build structures a half mile high, if only
I had just one more to make it functionally complete. I can see those
kinds of things.

The converse is true, too, I think. I
can't from the building imagine the Tinker Toys. When I see a
top-down description of a system or language that has infinite libraries
described by layers and layers, all I just see is a morass. I can't
get a feel for it. I can't understand how the pieces fit; I can't understand
something presented to me that's very complex. Maybe I do what
I do because if I built anything more complicated, I couldn't understand
it. I really must break it down into little pieces.

Computer. In your group you probably have both the bottom-up
thinker and the top-down thinker. How do you interact with both?

Thompson. I think there's room
for both, but it makes for some interesting conversations, where two
people think they are talking to each other but they're not. They just
miss, like two ships in the night, except that they are using words,
and the words mean different things to both sides. I don't know how
to answer that really. It takes both; it takes all kinds.

Occasionally—maybe once every five years—I
will read a paper and I'll say, "Boy, this person just doesn't think
like normal people. This person thinks at an orthogonal angle." When
I see people like that, my impulse is to try to meet them, read their
work, hire them. It's always good to take an orthogonal view of something.
It develops ideas.

I think that computer science in its
middle age has become incestuous: people are trained by people who think
one way. As a result, these so-called orthogonal thinkers are becoming
rarer and rarer. Of course, many of their ideas have become mainstream–like
message passing, which I thought was something interesting when I first
saw it. But occasionally you still see some very strange stuff.

At the same time, many ideas that were incorporated in Unix were ideas
first developed on other systems. In a sense Unix was a salad of preexisting
ideas that proved to be much better than the competition:

Computer. Going back a little
bit further, what were the good and not so good aspects of Multics that
were the major drivers in the Unix design rationale?

Thompson.
The one thing I stole was the hierarchical file system because
it was a really good idea—the difference being that Multics was a virtual
memory system and these "files" weren't files but naming conventions
for segments. After you walk one of these hierarchical name
spaces, which were tacked onto the side and weren't really part of the
system, you touch it and it would be part of your address space and
then you use machine instructions to store the data in that segment.
I just plain lifted this.

By the same token, Multics was a virtual
memory system with page faults, and it didn't differentiate between
data and programs. You'd jump to a segment as it was faulted in, whether
it was faulted in as data or instructions. There were no files to read
or write—nothing you could remote—which I thought was a bad idea. This huge virtual memory space was the unifying concept behind Multics—and
it had to be tried in an era when everyone was looking for the grand
unification theory of programming—but I thought it was a big mistake.

I wanted to separate data from
programs, because data and instructions are very different.
When you're reading a file, you're almost always certain that the data
will be read sequentially, and you're not surprised when you fault
a and read a + 1. Moreover, it's much harder to excise
instructions from caches than to excise data. So I added the
exec system call
that says "invoke this thing as a program," whereas in Multics you would
fault in an instruction and jump to it.
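Thompson's distinction -- "invoke this thing as a program" via exec, rather than faulting in a segment and jumping to it -- is the classic fork/exec/wait pattern. A minimal sketch through Python's os module follows; the run helper is invented here purely for illustration:

```python
import os

# exec* replaces the current process image with a new program; fork
# first gives us a child whose image we can safely replace. The parent
# collects the child's output through a pipe.
def run(argv):
    r, w = os.pipe()
    pid = os.fork()
    if pid == 0:                      # child process
        os.close(r)
        os.dup2(w, 1)                 # route stdout into the pipe
        os.execvp(argv[0], argv)      # replaces the image; never returns on success
        os._exit(127)                 # only reached if exec failed
    os.close(w)                       # parent process
    chunks = []
    while True:
        chunk = os.read(r, 4096)
        if not chunk:                 # EOF: child exited, pipe closed
            break
        chunks.append(chunk)
    os.close(r)
    os.waitpid(pid, 0)
    return b"".join(chunks)

print(run(["echo", "hello"]))  # b'hello\n'
```

Note how data and program stay separate: the pipe carries bytes through read/write, while exec alone turns a file into a running program.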

He also stressed one thing that is probably applicable to Linux as well:
Unix was a democratic operating system that helped people get rid
of the power of corporate bureaucracy.

Computer:
What accounted for the success of Unix ultimately?

Thompson: I mostly view it as
serendipitous. It was a massive change in the way people used computers,
from mainframes to minis; we crossed a monetary threshold where computers
became cheaper. People used them in smaller groups, and it was the beginning
of the demise of the monster comp center, where the bureaucracy hidden
behind the guise of a multimillion dollar machine would dictate the
way computing ran. People rejected the idea of accepting the OS from
the manufacturer and these machines would never talk to anything but
the manufacturer's machine.

I view the fact that we were caught up
in that—where we were glommed onto as the only solution to maintaining
open computing—as the main driving force for the revolution in the way
computers were used at the time.

There were other smaller things. Unix was a very small, understandable OS, so people
could change it at their will. It would run itself—you could
type "go" and in a few minutes it would recompile itself. You had total
control over the whole system. So it was very beneficial to a lot of
people, especially at universities, because it was very hard to teach
computing from an IBM end-user point of view. Unix was small, and you
could go through it line by line and understand exactly how it worked.
That was the origin of the so-called Unix culture.

"Twelve voices were shouting in anger, and they were all
alike. No question, now, what had happened to the faces of the pigs. The
creatures outside looked from pig to man, and from man to pig, and from
pig to man again; but already it was impossible to say which was which."

George Orwell

If open-source software is so much cooler,
why isn't Transmeta getting it?

Bob Metcalfe

Maybe the savviest of marketers are no longer working for Redmond. Linus
Torvalds proved that he has a second (and probably more profitable) talent.
In January 2000 it would have been hard to find a company with more flair for publicity
than Transmeta, with Linus Torvalds starring as its chief marketer.
Transmeta gave Linus an incredible amount of leeway to promote his image
of a superhero subtly connected to the image of Transmeta as the "superhero's
home", including frequent travel to conferences on corporate time and
expense, and received a nice return on the investment. As Alexander
Wolfe put it in his Feb. 14, 2000 Byte column
Will
Transmeta Make It With Crusoe:

While a company with such a pedigree
might be expected to garner some publicity, the coverage
accorded Transmeta has been all out of proportion. True, Crusoe
is an interesting implementation of very long instruction word (VLIW)
computing technology in single chip form.

However, its raison d'etre -- low power
-- isn't something that can't be achieved in a whole bunch of other
embedded processors and cores already available (say, MIPS and ARM).

What Transmeta became was the biggest
beneficiary of accidental -- or circumstantial -- spin since Bill Clinton
decided in 1992 that he felt America's pain.

Or, to paraphrase Winston Churchill,
never have so many who have said so little received so much in return.
I'm referring to Transmeta, of course. Given every opportunity -- harangued,
in fact -- it politely declined to say anything about what it was up
to. I characterized it as a kind of benign secrecy. And what did it
get for it? As far as I can tell, it got a slew of positive press. I'm
not talking trade journals, either. Every medium from Salon to the vaunted
New York Times itself...

...So now Transmeta, without any real
OEMs shipping products, is reaping the wind, and Intel, with a complex
infrastructure of in-the-work IA-64 chips and servers, is doing its
best to avoid reaping the whirlwind.

...

Be certain of one thing: I'm not saying
there's no technical meat on Transmeta's well-greased publicity skids.
However, on the technical front, there remain a number of key, unanswered
questions. Most important is: What exactly are the instruction sets
of the first two Crusoe models: the TM3120 and the TM5400?

All Transmeta will say is it will run x86 applications -- and a
slew of operating systems -- transparently. As an old-time assembly
language programmer, I have to say that's not good enough for me. I
want to see a tech manual with instruction, flags, bits, and the whole
nine yards. Transparency may be good enough for Fortran programmers,
but it doesn't wash with me.

Torvalds will probably be appalled to hear this, but he has been the best front
man for Transmeta for the past three years. How else to explain the fact
that Transmeta's rollout of the controversial and unproven Crusoe chip
merited coverage on national TV? That's the kind of attention usually reserved
for Bill Gates or Steve Jobs -- and for a chip that needs to compete with
existing chips like StrongARM. But Torvalds' marketing job worked wonders,
and news about the chip was on all major and minor news channels.
This contradiction did not go unnoticed.

Still, "Viva Torvalds, viva Transmeta" cries were heard even on Slashdot, where
the major "open source, closed minds" cult site ignored the skeleton in the
closet -- the fact that the secret chip and the secretive company were on
the opposite side of the barricades from open source. Association with Linus
Torvalds is all that matters. But in the more skeptical part of the community,
the suspicion that Linus Torvalds was a chief marketer for Transmeta who
used his reputation to promote the chip grew much stronger after the
rollout:

If it's not about marketing, why would Transmeta, a hardware
company obsessed with secrecy, want to bring onboard Linus Torvalds,
a software programmer who promotes open source and has almost
Hollywood-scale name recognition?

Conversely, what was the amount of money that seduced Torvalds,
a man who achieved that recognition by giving kernel code
away under the GPL license, to cast his lot with an ultra-proprietary
chip company?

All in all, the obvious success of Transmeta's PR and marketing can be attributed
to Linus Torvalds. With all its unorthodox technology, the current marketing
position of Crusoe is yet another (lower-power) Intel Pentium clone. And
this success was achieved despite huge technological difficulties (Crusoe
Chip's Rocky Origins), so the technology might not be very scalable
and thus belongs to the lower end (cheap laptops):

Like a character in a Graham Greene spy
novel, David Taylor sometimes labored under the pressure of constantly
guarding Transmeta's secrets.

"It made me kind of psychotic," said
the talkative software engineer, one of the original programmers of
ID Software's Quake and Doom, who was hired away by the hush-hush chip
design firm about a year ago. "I had my wife and brother sign NDAs,
just so I'd have someone to talk to."

The obsessively secretive company demanded
that employees reveal no more than four pieces of information to anyone:
the name of the company, its location in Santa Clara, that its CEO was
David Ditzel, and that Linus Torvalds worked for the company.

"You couldn't tell your friends or family
anything," Taylor said. "You wanted to tell them about your life, your
shitty day at work, or the cool people you worked with. But you couldn't
do any of that."

What made things worse for Taylor was
that he'd come from a company that divulged everything to anyone who
cared.

"This was a complete culture shock to
me," said Taylor, who was dressed for the launch in a black morning
coat with long tails, black leather pants, and a cream waistcoat. "But
today I'm free. I'm a free man."

...Speaking on condition of strict
anonymity, engineers with a major computer manufacturer said that after years of painstaking design work, the earliest Transmeta chips
were a big performance disappointment. The engineers would neither
confirm nor deny their company's plans to build Crusoe devices.

"They were dogs," said a Transmeta engineer,
who also spoke on condition of anonymity. "We were in trouble."

I was always puzzled why Torvalds chose to work for Transmeta rather than
an academic or research institution. As interesting as Crusoe is, it
needs to compete with StrongARM, which is also more economical (apparently
less than one watt in any mode), and with other chips that are already field-tested.
And Crusoe is not a suitable platform for Linux at all: it has advantages
only with VMware-style OSes. There are some skeptical voices (Transmeta
round-up: Technology worth the hype):

Richard Gordon, senior analyst with Gartner
Group research remains unconvinced. He says, "I think you can safely
say that their marketing and PR campaign has been very successful.
There's been lots of hype but they've yet to prove the technological
innovation."

Gordon also has reservations about the
specific technology involved. He continues, "If you read between the
lines, it's not quite as quick and as clever as some think it
is. When you have the ability to run different OSes it has to
be an emulation technique. Then there has to be a performance penalty.
From what I understand the headline speed is 700 MHz but emulation reduces
this to 500."

How good is Crusoe against
StrongARM? The latter is an established player with a lot of backers in
the Linux community (Corel is one example). Intel stated that:

Next-generation StrongARM processors
are based on a high-performance, ultra-low power 32-bit RISC architecture
implementation and Intel's new 0.18m process. These chips will provide
a number of significant advantages for developers:

High processing power up to
750 MIPS at less than 500 mW.

Scalable combinations of performance
and power consumption -- from 150 MHz to 600 MHz at 40 mW to
450 mW.

Scalable voltage from 0.75 Vdd
to 1.3 Vdd.

Compatibility with current ARM
architecture.

With all this hype about Crusoe, it looks like StrongARM is at least
as powerful as Crusoe (750 MIPS at ultra-low power) and more economical, and
Transmeta did not bring much to the technological table (although x86 compatibility
is a huge plus). As for Linux, I remember that Corel supported
StrongARM and ported Linux to it. StrongARM is not Intel 386 compatible,
but why should that be an issue for Linux? Intel 386 emulation is not a winning
card for a cheap CPU, because Crusoe is not that cheap. Moreover, how good
the emulation layer in Crusoe is remains to be seen -- it's a very complex solution
that might have some holes. Whether it is patent-wise clean, and how good its
floating-point compatibility is, are obvious questions to ask; this is a
non-issue as long as IBM is the producer of the chip (IBM has cross-patent
arrangements with Intel), but it prevents taking production anywhere else. Also,
StrongARM is field-tested. In his Salon paper
The Transmeta energizer, Andrew Leonard noted:

Your average Internet start-up promises
a revolution, but usually just delivers a clever Web site and a stunning
IPO. Hype rules, and the successful manufacturing of buzz is sometimes
all you need to pull off the "big win."

The Transmeta IPO was hugely successful, but the assessment
of the company's chip in the press became much more realistic. Many called
Crusoe a flash in the pan. Investors seemed unfazed by the huge debt of the
company and the fact that IBM abandoned the chip just before the IPO. Despite
all these facts, Transmeta's stock price shot as high as $50 before
closing at $42.25 on the first day of trading (Oct 09, 2000). The company
even managed to bump up the asking price from $11 to $21 before trading opened on Tuesday.
But due to the IBM pre-IPO poison pill, the evaluation of Transmeta in the mainstream
press after the IPO became less rosy. In the paper
Crusoe problems blamed on software emulation technique, InfoWorld put
it straight -- performance sucks, and this is not a bug but a feature:

... ... ...

Compaq on Tuesday was
the latest computer maker to adopt a wait-and-see strategy concerning
the Crusoe chip. Officials for Houston-based Compaq, which invested
$5 million in Transmeta earlier this year, said they would continue
to follow Crusoe's progress and consider it for use in future products.

While Compaq officials
wouldn't delve into specifics as to why the company wanted more time
before committing to a Crusoe-powered system, one Compaq source said
that certain "software interactions" having to do with the Crusoe chip
did play a part in Compaq's decision.

... ... ...

The advantage to software
emulation, according to Transmeta, is a smarter processor that company
officials agree operates more efficiently on repeated commands that
the chip has already translated.

But it is the initial
translation that creates an extra step for the Crusoe processor, and
this step apparently robs Crusoe of needed performance.

A source close to IBM said that
Crusoe's need to translate incoming command lines created an unacceptable
degradation in performance.

Dean McCarron, an analyst with
Mercury Research in Scottsdale, Ariz., agreed.

"Essentially, [software emulation]
can be likened to a memory cache. It takes longer when you first access
it, but it's quicker the next time, and there is no way around that,"
McCarron said.

McCarron said that in the past,
other companies such as Digital, which is now owned by Compaq, tried
the software emulation approach with its Alpha processors but encountered
the same performance degradation while the processor translated incoming
code.

McCarron added that for Crusoe
to regain the performance lost in translating code, Transmeta would
have no choice but to turn up the clock speed of the chip, negating
any power savings the chip might offer.

"Turning the clock up is the only
way to speed [Crusoe] up, but if they turn it up, then the power consumption
goes out the window," McCarron said.

How then does Transmeta increase
the performance of Crusoe without sacrificing its primary selling point,
which is low power consumption?

"The question may not be answerable,"
said Joe Jones, the CEO of Bridgepoint, an Austin, Texas-based semiconductor
testing company with clients such as Texas Instruments and Philips.
"How can you take low power and turn it into a permanent market position?
Because the other guy has to simply figure out how to get his power
lower."

Both Intel and Austin-based Advanced
Micro Devices currently ship mobile processors designed to operate with
a minimal amount of power.

The news from Compaq
came as the second disappointment in a week for Transmeta, which began
its IPO on Monday; seven days ago, Armonk, N.Y.-based IBM cancelled
its plans to ship the Crusoe processor in its ThinkPad 240 model laptop
computers.

IBM officials made
little comment concerning the reasons for the company's decision, outside
of the fact that Big Blue would continue to look for low-power performance
options for its laptop systems.

IBM, which last June
began publicly testing the Crusoe processor in only three ThinkPads,
never really committed to Transmeta, according to officials. IBM's plan
for Crusoe was to showcase the Crusoe-powered ThinkPads to its corporate
customers and make its decision based on customer response.

Hewlett-Packard, which
has been considering the Crusoe processor for its portable computers,
has also been less than satisfied with the performance and battery-saving
potential of Crusoe, according to sources close to Palo Alto, Calif.-based
HP.

HP's corporate customers
are more comfortable with processors manufactured by Santa Clara-based
Intel than with chips from competing companies such as Transmeta, a
source said.
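McCarron's analogy quoted above -- software translation behaving like a memory cache, slow on the first access and fast thereafter -- can be sketched as a memoized translation step. Everything here is invented for illustration; the translate_block function and its cost counter are a toy model, not Transmeta's actual code-morphing software:

```python
# A toy model of translation caching: the first time a block of x86
# code is seen it pays the (counted) translation cost; cached blocks
# are reused for free afterwards.
translation_cache = {}
translation_count = 0

def translate_block(x86_block):
    """Pretend to translate an x86 code block into native VLIW code."""
    global translation_count
    translation_count += 1            # the expensive first-time step
    return f"vliw({x86_block})"

def execute(x86_block):
    native = translation_cache.get(x86_block)
    if native is None:                # cache miss: translate, then store
        native = translate_block(x86_block)
        translation_cache[x86_block] = native
    return native

# A hot loop body is translated once, then replayed from the cache.
for _ in range(1000):
    execute("loop_body")
print(translation_count)  # 1
```

The sketch also shows why the approach hurts one-shot code: a block executed only once still pays the full translation cost, which is exactly the startup penalty the article describes.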

The latest stage in Torvalds' "media boy / chief marketer"
role was of course his agreement to publish a ghost-written autobiography,
which happened around this time:

The Linux creator says it was purely
accidental.

NEW YORK -- Finnish software revolutionary
Linus Torvalds, the freewheeling inventor of the Linux operating system
popular in Internet circles, has agreed to write an insider's account
of the grassroots crusade he began.

HarperBusiness, a division of HarperCollins
Publishers, said in a statement Monday that it plans to publish the
book, entitled "Just for Fun: The Story of an Accidental Revolutionary"
in the spring of 2001. The book will be co-authored by David
Diamond.

A spokeswoman for the publisher of popular
business titles said the book will reflect Torvalds' quirky and irreverent
personality. "Just for Fun" will combine autobiography, technological
insight, and business philosophy in an account of the phenomenal growth
of Linux.

"It's as much about Linux as it is about
Linus," said Lisa Berkowitz, HarperCollins associate publishing director.

Torvalds first developed Linux as a Finnish
undergraduate nearly a decade ago while seeking a low-cost alternative
to business software programs.

Now used by over 12 million people, Linux
has challenged the dominance of commercial operating systems such as
Microsoft Corp.'s (Nasdaq:
MSFT) Windows operating system used on personal computers and the
Unix system favored for big business computer systems.

That definitely reinforced his image as yet another
"media boy":

Are we really as memoryless as placing
this (albeit good and kind) student on the level of Churchill,
Eisenhower, Fermi, Wilbur and Orville Wright,
Howard Hughes, Henry Ford, Louis Lumière,
de Gaulle, Maria Callas, Benjamin Britten,
and the - surely - hundreds other people of similar importance in that
century? I bet Linus himself would be offended.

Linus, by rewriting its kernel and calling
to write new extensions in its old scheme, revived an old idea (UNIX)
that had been stuck by too greedy vendors. Gates, OTOH, dropped the
old ways or omitted to join them (TRS, PARC, UNIX, OS/2, Fortran, ...)
and jumped to launch new ones (MS-Basic, DOS, Windows NT, Visual Basic,
ASP,...); Linus just rewrote software, while Gates did all, including
building a company and leading it to success.

Yes the market (and Microsoft itself!)
need new and good competitors. But I think Linux isn't the good one.
UNIX neither. OS/2 neither. As usual, the good one is still unknown.
Remember: in 1980, some were afraid that IBM would be challenged by
then powerful DR (Digital Research), and cited this fear as a reason
why IBM chose instead a 2-student unknown company, called Microsoft.

Who knew Microsoft then? Who knows the
next Microsoft today?

And this "yet another accidental revolutionary" theme sounds like an amusing
plagiarism of the title of Eric Raymond's book, and it became an
instant source of jokes: the word "revolutionary" was a typical version
1.0 bug, and the correct spelling should be "an accidental millionaire".
ZDNet eWEEK even published a Spencer Katt story,
For Torvalds, the revolution was fun:

El gato winced when he heard Harpercollins was
publishing a celebrity's autobiography titled "Just for Fun: The Story
of an Accidental Revolutionary." "I bet it's the inspiring story of
how lil' Britney Spears left Kentwood, La., to become a teen music phenomenon,"
assumed the remedial-reading Ragamuffin.

Imagine the Katt's chagrin when a pal informed
him it was the title for a tome chronicling the life of fun-loving Finnish
Linux inventor Linus Torvalds. "Next you'll tell me George W. Bush is
really the president," cackled the unconvinced Kitty.

The movement is as important as the leader, and IMHO the
leader himself should be humble enough to avoid being presented as a "savior
of mankind" in the best Marxist/Hollywood fashion. "The Kingdom of
Illusion" is actually closer to Marxist or even national-socialist aesthetics
(the cult of the superman) than one might think ;-). Hollywood decided to
make a quick buck on open source, and here the flexible Linus Torvalds could be
a pretty handy consultant. And Torvalds duly jumped on the opportunity,
providing technical advice. As a Washington Post correspondent
wrote in the article
'Antitrust' the Movie; Hollywood's Thinly Veiled Take on Microsoft:

On Friday -- coincidentally the day the
government files its brief in the Microsoft case -- a new movie opens
nationwide. According to Metro-Goldwyn-Mayer Pictures, the film is a
"relentless suspense thriller" about "the high-speed, high-stakes computer
industry." It is centered on "a corporate behemoth whose power, greed
and paranoia know no bounds," and stars Tim Robbins as the company's
CEO -- who just happens to be "the world's richest man" and is out to
monopolize the technology that will unite all electronic communications
devices "into one superpowerful feed, encompassing television, the Internet,
radio and telephone."

Try guessing the name of this new movie
that features intrigue, sexy liaisons and a catchphrase, "In the real
world, when you kill people, they die. This isn't a game!" Give up?
The new movie is called "Antitrust," of course. We're sure that, to
Hearsay's loyal readers, the word "antitrust" evokes images of a high-tech
suspense thriller with sexy side plots, double-crossing, intrigue and
danger.

The MGM press packet is careful not to
name names. It's just coincidence that Robbins is dressed to look like
a familiar billionaire CEO. Just coincidence that when the studios picked
a model for the sets that they "wanted to capture the look and feel
of the wired world on the campus of a powerhouse Pacific Northwest software
company." Hmmm. Which one would that be? RealNetworks?

What does Microsoft think about the new
movie? "We deeply resent this unfair personal attack," said Microsoft
spokeswoman Ginny Terzano, "against Steve Case."

While Case, America Online's chief, could
not be reached for comment, apparently others in the software industry
had some fun with this movie too: Microsoft nemesis Scott McNealy, chief
executive of Sun Microsystems, appears in the film giving an award to
a software programmer. And Linus Torvalds, the creator of Linux, the
open-source code operating system, provided technical advice. A major
theme of the movie supports the largely anti-Microsoft open-source code
movement that Torvalds personifies.

"...Thank goodness there is one
segment of American society that can't be bought and will not be silenced.
That is Hollywood. The great cause for which "Antitrust" sacrifices
the lives of brilliant young software developers is open-source code.
Open-source crusaders believe that software should not be copyrighted.
They believe that universal freedom to use and tinker with existing
programs is the best way to promote future innovations. But more than
that: They believe the very concept of intellectual property rights
-- legal ownership of information in any form -- is downright immoral."

"'Linus, we love you. Please come to Brazil'. With these
words, hundreds of young people, technology specialists,
businessmen, executives and members of the Brazilian government
wound up their participation in the 5th International Free
Software Forum (FISL), which took place in Porto Alegre,
in Brazil's far south, at the beginning of June. Recorded
on video by John 'Maddog' Hall, president of Linux Internacional,
this affectionate plea was aimed at bringing Linus Torvalds,
probably the only great figure in the history of free software
who has not yet visited the country, to Brazil."

The Linux community was never especially technically savvy. The most technically
savvy specialists, especially in Eastern Europe, traditionally preferred
Free/Open/NetBSD, and ISPs still prefer it for its stability despite the
hundreds of millions of dollars spent on Linux development by Intel, IBM, and
other large supporters. But what Linux had from the very beginning were the
distinctive features of a small, vocal technical cult. And like any cult
leader, Linus needed to become a hypocrite. As Bob Metcalfe sarcastically
noted about Torvalds' keynote speech at LinuxWorld Expo in January 2000:

Am I
the only one to see that Torvalds and other open-source software revolutionaries
are acting out the finale of George Orwell's Animal Farm?

Orwell's farmhouse
is full of open-source pigs, which are now almost indistinguishable
from the proprietary humans they recently overthrew.

It's true that I have
been unkind to the "open sores" movement. But to be clear, anyone is
welcome to beat Microsoft with better software, even a utopian community
of volunteer programmers.

May the best software
win.

And don't get me wrong,
even if he disappoints Richard Stallman by not always referring to GNU/Linux,
Torvalds is a genuine hero of the open-source revolution.

But with Torvalds
saying some animals are more equal than others, why is the sanctimonious
open-source press still cheering him on? Are the likes of Slashdot.org,
just gobbled by VA Linux, also porking out in Orwell's farmhouse?

Torvalds wrote and
now controls Linux, the open-source operating system, due this summer
in Version 2.4. By day, he is a programmer at Transmeta. Transmeta just
announced Crusoe, its low-power microprocessors for mobile computers.

The architecture of
Crusoe chips is based on VLIW (very long instruction words). It has
"code morphing" to convert and cache software in speedy VLIW codes.
And it comes with Mobile Linux, with Linus extensions for power management.
According to Transmeta, Crusoe is two-thirds software and one-third
hardware.

So what I want
to know is, if open-source software is so cool, and if Torvalds "gets
it," why isn't Crusoe open source? For a start, why aren't the Crusoe
chip's mask sources published for modification and manufacture by anyone?

And yes, Mobile Linux
is open source, but not the "code morphing" software Torvalds helped
write. Transmeta has taken the phrase Code Morphing as its proprietary
trademark. And what the code does, according to Transmeta, has been
... patented.

Worse, Crusoe
is touted for running Intel X86 software, and in particular, Microsoft
Windows. Doesn't the open-source community say Windows is beneath contempt?

Torvalds showed up
at LinuxWorld Expo touting open source, of course, but then went on
to revise two of its bedrock principles.

Torvalds talked at
LinuxWorld about fragmentation -- the emergence of too many Linux versions.
Being old enough to have watched Unix fragment during the 1980s, I worry.

But instead of
holding to the party line that Linux will not fragment, Torvalds now
says there is bad fragmentation and good. One can assume, because he's
in charge of both, Transmeta's Mobile Linux will fragment Linux 2.4,
but in a good way.

Then Torvalds talked
about commercial companies, which aren't so bad after all: Take for
example Transmeta. His audience, packed with employees, friends, and
family of newly public Linux companies, did not boo him back out into
the barnyard.

Where is the outrage?

So just to keep
Torvalds honest, I'm thinking that Crusoe chips, which are mostly software,
should be open source and basically free. Chips have to be manufactured
-- with white coats, ovens, and stuff -- so maybe it should be OK to
sell open-source Crusoe for the cost of its silicon, trace metals, media,
and manuals.

And how about this?
To keep pigs out of the farmhouse, how about bundling Crusoe chips with
Transmeta shares? This would cement commitment to Transmeta products
and its inevitable IPO.

Until the Internet
stock bubble bursts, this would provide Transmeta with funds to pay
to make the chips and to pay for Super Bowl ads. This would be breaking
new ground in The New Economy.
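The "code morphing" layer Metcalfe mentions translates blocks of x86 code into native VLIW instructions and caches the result so later executions reuse the translation. The general idea (dynamic binary translation with a translation cache) can be sketched as a toy model; the block format, opcodes, and names below are invented for illustration and have nothing to do with Transmeta's proprietary Code Morphing software:

```python
# Toy sketch of the translate-and-cache idea behind dynamic binary
# translation: each "guest" code block is translated into "host" code
# once, cached by its address, and re-executed from the cache on
# subsequent hits. Purely illustrative; not Transmeta's implementation.

# A fake guest program: blocks of simple (op, operand) instructions.
GUEST_BLOCKS = {
    0x1000: [("add", 5), ("mul", 2)],
    0x2000: [("add", 1)],
}

translation_cache = {}   # guest address -> compiled host function
translation_count = 0    # how many blocks we actually translated

def translate(block):
    """'Compile' a guest block into a host-level Python function."""
    def compiled(acc):
        for op, arg in block:
            if op == "add":
                acc += arg
            elif op == "mul":
                acc *= arg
        return acc
    return compiled

def execute(addr, acc):
    """Run the block at addr, translating only on a cache miss."""
    global translation_count
    if addr not in translation_cache:      # miss: translate once
        translation_cache[addr] = translate(GUEST_BLOCKS[addr])
        translation_count += 1
    return translation_cache[addr](acc)    # hit: reuse host code

acc = execute(0x1000, 0)    # (0 + 5) * 2 = 10
acc = execute(0x2000, acc)  # 10 + 1 = 11
acc = execute(0x1000, acc)  # (11 + 5) * 2 = 32, served from cache
print(acc, translation_count)  # -> 32 2
```

The third call reuses the cached translation of block 0x1000, which is the whole point: translation cost is paid once, and hot code runs at native speed thereafter.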

In 1999 the Linux cult became an important part of the computing landscape.
Important because of Microsoft's existence. Important because it gave a large
number of people a kind of religion to belong to -- complete with
a nice cult leader, Linus Torvalds.

But nothing lasts forever. Without any real technical edge, Linux's
position remained shaky both against Microsoft (with its new Windows 2000
codebase) and established Unix players like IBM and Sun. Linux was cheaper,
but technically it was not better. Even its struggle against FreeBSD was not
that successful technically, despite huge gains in user base and the hundreds
of millions spent by Linux distributors on the development of the system.

Followers of a small cult are well known to go to extremes to promote
their point of view, and a good number of Linux evangelists and a huge number
of Linux zealots made fools of themselves by declaring
Linux the ultimate solution for all the world's ills. At some point they started
to annoy people and tended to be ignored. Troubles with the GPL, and especially
with RMS' interpretation of the GPL, started to surface in the KDE vs. Gnome
case. Events started to turn partially against Linux, especially against Linux
startups, and the situation in computing took a direction that was probably
impossible to predict in early 1999. Actually, even at this time it became
clear that Linus' dream of "world domination" was fake and would remain
largely a dream:

Due to corporate pressure, Linux ceased to be a small, compact
desktop OS yet still failed to become a reliable server. Volunteer work
was gradually replaced with hired labor, and the problems of hired
guns started to surface, with Linus himself being a primary example.

The artificial infusion of a lot of money during the Linux bubble helped
create impressive Linux distributions that are usable as an alternative
desktop, but they still lagged far behind Microsoft in usability and
developer support.

Microsoft noticed Linux, and that revitalized part of Microsoft's developers:
both Windows 2000 and Office XP were important steps forward.
Linux was enough of a threat (and a useful Trojan horse in
defeating the government suit against Microsoft) to make Microsoft take notice,
and that was a really good thing for Microsoft customers. For some time
Microsoft was on the defensive, at least a little, and that gave customers
substantial leverage they wouldn't enjoy otherwise. Actually, Linux
proved to be a bigger threat to other Unix flavors, especially Intel-based
commercial Unixes (SCO), than to Microsoft. Many people in the
industry started to realize that you should not make important decisions
based on anti-Microsoft feelings. The government suit against Microsoft
also helped Microsoft regain some of the PR support it had lost.

There were signs that the "decentralized commercialization" of Linux
adopted by Linus Torvalds created a situation in which the OS exists in
dozens of variants: multiple, semi-compatible
versions with incompatibility used as a marketing advantage (as in
the statement "this software is supported only on Red Hat"), with Red
Hat as the dominant brand and a dozen second-level brands (Debian,
Mandrake, SuSE, Slackware, Gentoo, etc.). In some ways Linux repeats
the old Unix history. Linus Torvalds failed to promote a single, strong
standard based on superior technical decisions (for example, built
around Perl instead of older shells).

People talk about how wonderful it is that Linux is free. But the
cost of the desktop OS is not the total cost of ownership, and here you can
save money with Linux only if you are very technically savvy.

Big hardware vendors like IBM decided to co-opt Linux as a part of their
strategy. And that made Linux developers hostages to the big money of Big
Blue and other players.

Fewer and fewer people define their technical priorities solely
by the operating system kernel they use. Pluralism became the style in the
technical world. Scripting languages and powerful GUIs to a certain extent
relegated kernel issues to the BIOS level. New players like BeOS
and Mac OS X emerged and took (for BeOS, unfortunately, only temporarily)
some clout from Linux, especially on the desktop.

FAIR USE NOTICE This site contains
copyrighted material the use of which has not always been specifically
authorized by the copyright owner. We are making such material available
in our efforts to advance understanding of environmental, political,
human rights, economic, democracy, scientific, and social justice
issues, etc. We believe this constitutes a 'fair use' of any such
copyrighted material as provided for in section 107 of the US Copyright
Law. In accordance with Title 17 U.S.C. Section 107, the material on
this site is distributed without profit exclusively for research and educational purposes. If you wish to use
copyrighted material from this site for purposes of your own that go
beyond 'fair use', you must obtain permission from the copyright owner.
