Actually, you can run other applications, as it's just Debian with some small adaptations, like launching Steam at boot. You have the option to turn on a GNOME desktop in the menus, and if you want to tinker more, pretty much anything you can do on Debian you can do on SteamOS.

Even if the only people using the OS were the developers (and they weren't), development still wouldn't have stopped. They weren't driven by economic motivations; they were driven by the lack of a truly F/OSS OS on the market. While you can't get the OS on the machine out of the factory, that doesn't matter--you can just install it later (do note, however, that this has recently become a problem in the form of UEFI Secure Boot--but that was developed long after Linux had a commanding server market share).

Yes, but Linux and the BSDs were never under threat of extinction through lack of market share, as they are open source and their development isn't directly driven by companies trying to make money (although many web companies and hardware vendors do contribute code to the kernel community nowadays). Even if the F/OSS movement hadn't been around when Microsoft was big, other companies (like Sun, now Oracle) would've capitalized on the huge demand for a stable server OS that arose during the dotcom boom. Windows' directory structure and bloat don't make for a very good server OS, which is why it doesn't have a larger share of that market. Windows was never really in a position to create a complete monopoly, and the rise of client-server, mobile, and embedded computing has simply accelerated the rate of its decline. Even at the time, one of the major reasons Microsoft couldn't create a monopoly was that software has the lowest barrier to entry of any market--the only capital you need to compete is a computer, meaning the truly innovative ideas will always make it to the top. Microsoft never could have had a true monopoly, as the nature of software ensured they would always have competitors.

Linux (and other UNIX-likes) would have become major competitors regardless of that decision, due to their lightweight (and often open source) nature. If you look here, you can see that UNIX-likes make up nearly 65% of the server market, about 97% of the high-end supercomputer market, 91% of the mobile market, and (one would assume, though it isn't on the page) near-as-makes-no-difference 100% of the embedded systems market. Though they aren't the most used desktop OS, UNIX-likes (mostly Linux, but also others such as the BSDs and the various Apple OSes) make up a VAST majority of the computers in use today. Desktops are a tiny part of the computer market.

Well, building an inadequate seawall was a poor long-term business decision, as people dying in a nuclear accident = fewer customers = less profit. Regulation is only necessary because many companies fail to realize how short-term profit can lead to long-term loss.

GMO foods (especially such a small modification as stopping the inhibition of EPSP synthase by glyphosate, which is what "RoundUp Ready" does) have no actual negative effects on the people or animals who consume them. Neither does "RoundUp" (brand-name glyphosate with a few additives to stop it from flowing out of fields and into rivers, where it would kill the algae in the ecosystem). I could drink a cup of pure glyphosate right now and experience no negative effects, as animals do not synthesize aromatic amino acids (namely phenylalanine, tyrosine, and tryptophan) through the shikimate pathway, the process that requires EPSP synthase and would be inhibited by the presence of glyphosate. The claim that such genetic modifications would alter how your intestines process the food is utterly foolish and misguided, as the way your intestines function is completely independent of what food happens to be passing through. Finally, the claim that you buy "organic" is utterly ridiculous, considering how hard it would be not to. Organic simply means being related to a living thing in some way. The modifications to the genetic structure of plants associated with GMOs have no effect on whether the plant is a living thing--it is either way.

No, it's more secure because you can audit all the code on your system (and modify it if you choose), and because it has a sane permissions system, being based on an operating system designed to run on mainframes with hundreds to thousands of users. To do any real damage to your system, a malicious program would have to:

Be running [and you can easily control every program that is running--this really does mean every program: if I wanted to, I could terminate the windowing system in just a couple of commands, leaving me with a text terminal]

Obtain the password of somebody with system administrator status (which you shouldn't be handing out willy-nilly)

Of course, if you're careless enough, it's impossible to be safe from malicious programs. However, if you only type your password into prompts from programs you know you invoked (and regularly audit your init scripts), you should be fairly safe.
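To make the "you control every program" point concrete, here's a small sketch you can run as an ordinary (non-root) user. The background `sleep` just stands in for any running program, and the file name under /etc is made up for the demo:

```shell
# Start a long-running background job to stand in for "a running program":
sleep 300 &
bgpid=$!

# Every process you own is visible and killable from the shell,
# no special tools required:
ps -p "$bgpid" -o pid,comm
kill "$bgpid"

# And without an administrator's password, system files are off limits
# (as a normal user this prints "permission denied"):
touch /etc/demo-file 2>/dev/null && echo "wrote it" || echo "permission denied"
```

The same two commands (`ps` and `kill`) scale all the way up to killing the windowing system itself; the permissions check is what forces malware to go after an administrator password in the first place.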

Have you ever tried navigating the Windows filesystem? That shit is a mess--they spend so much time trying to obscure stuff from you that trying to do anything is nigh on impossible. I'd rather have an operating system with a reasonable file tree that makes at least some sense than deal with Windows. They don't even mount drives on the tree--each drive is a separate file system.

First off, for recent (within the past few years) versions of tar, to extract any file you just have to type tar -xf (filename)--the x stands for extract, the f for file, and it automatically detects the format--and if that's too difficult to remember you could always use one of the many graphical helpers. However, I get the sense that you aren't just talking about tar, but about command line applications in general. Well, while those pretty graphical tools might make extracting files easy, a complex operation (say I wanted to copy a bunch of files to a separate directory, but only if it's the third Thursday of the month and there's a high tide in Singapore) would be almost impossible in them, while on the command line it would be a short one-liner. The command line makes the hard tasks possible. And the beauty of it is, you don't need to remember any of it, because commands ship with a manual: type "man (command)" and you get a full page of options, examples, and explanations. While it's a bit hard to get used to, it is an INCREDIBLY powerful tool.
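Here's the whole tar round trip as a sketch, plus a taste of the "hard task made easy" point. The directory names are made up for the demo, and I've only wired up the third-Thursday half of the condition (the Singapore tide feed is left as an exercise):

```shell
# Make some demo files and a destination directory:
mkdir -p demo backup && echo "hello" > demo/note.txt

# Create an archive (c = create, z = gzip, f = file):
tar -czf demo.tar.gz demo

# Extract it -- a recent tar autodetects the compression, so -xf is enough:
rm -r demo
tar -xf demo.tar.gz
cat demo/note.txt    # prints "hello"

# The "complex operation" really is a one-liner: the third Thursday
# is the only Thursday falling on day 15 through 21 of the month.
day=$(date +%d)
[ "$(date +%a)" = "Thu" ] && [ "$day" -ge 15 ] && [ "$day" -le 21 ] \
    && cp demo/note.txt backup/ || echo "not the third Thursday"
```

Try expressing that last condition in a drag-and-drop archive manager.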

As for the second part of your post, let me explain one thing: the *nix file structure is not unusual. It is the standard for computing. *nix systems are not some niche group of operating systems; they fill EVERY market--phones, supercomputers, embedded systems (that's your internet router, or the little TV in the back of airplane seats), web servers, and desktops. They are the market standard (just so you know, Mac OS X is a *nix system as well). The "unusual" file system was created long ago, and is the best way we've found to date to provide a coherent "tree" of files.

As to configuration files, most of those are application specific, but they are almost always kept in your home directory, either in the ".config" folder or as standalone files starting with a dot in your home directory (they're called "dotfiles"). This system, while seeming unusual and chaotic to a user used to a "control panel", allows for a clean multiuser system where different users' configuration files don't interfere with each other. The differences between distros are caused by a select few distros (Ubuntu) modifying upstream packages to try and "centralize" these dotfiles.
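You can see the whole dotfile scheme from the shell. The editor dotfile below is made up for the demo; everything else is just standard ls:

```shell
# Dotfiles are hidden by default; ls needs -a to show them:
ls -a ~ | head

# Newer programs follow the XDG convention and nest under ~/.config instead:
ls ~/.config 2>/dev/null | head

# Creating one is nothing special -- it's just a file in your home directory,
# which is exactly why one user's settings can never stomp on another's:
echo "set number" > ~/.demo_editor_rc
cat ~/.demo_editor_rc    # prints "set number"
```

Since every user gets their own home, the same program reads a different config file for each person logged in--no registry, no control panel, just files.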