Windows 7 is better than Vista. Great. But saying that is like saying you'd rather catch the common cold instead of swine flu. I've demoed the release candidate for Windows 7, and I can safely say that I still don't like it. Aside from the default options being obnoxious and hard to use (the icons for running applications are identical to the directly-adjacent Quick Launch icons; running programs have no text to show you what they are; unless you have the hardware to back up the Aero interface, you can't get the window previews to help you, either), there are several things I need to see in a Windows operating system before I'll even contemplate switching back.

Multiple virtual desktops — Windows is pretty much the only significant operating system that does not support this. Mac OS X's desktops may not be implemented very well, but they're there all the same. My cell phone has multiple desktops. Why can't Windows get with the program on this? It's an invaluable feature which reduces clutter. I think you'll find that clutter reduction is central to many of my needs.

Application organization — When I click on the Start Menu in Windows, I have a list of programs to sort through which aren't even alphabetized until I tell them to be. The list is huge, presenting me with a different "folder" for each program I have installed. When I have to go looking for a program, I want to be able to look in a "folder" that tells me what type of program it is. Is it an Internet program? A productivity program? Is it a minor accessory? One of my programming applications? Organizing programs this way keeps the list short, which would be a blessing considering the tiny, half-height, scrolling list of applications that holds six times as many programs as will fit in its frame. Microsoft tried implementing something like this with games when they launched Vista, but it doesn't work automatically for everything because it's layered on top of the existing system rather than integrated into it. Their implementation required you to open a new window just to see your shortcuts. First of all, that's counterintuitive. Secondly, it clutters my desktop.

Useful window management — In Linux, I can click and drag windows across my multiple desktops by dragging to the edge of the screen in the appropriate direction. I can move a window by holding the Alt key and clicking and dragging anywhere at all on the window. With a single keypress, I can move a window to the best location and resize it as big as it can get without overlapping any windows it wasn't already overlapping. In Windows 7, they have added some window management features where dragging a window to an edge of the screen resizes it to fill half of the screen along that edge. Whoopee. What if I don't want it at exactly half size? What if I just want my window on the right-hand side of the screen? There's no customization here, only an assumption that I want my windows exactly where Microsoft wants them. I sincerely hope this feature can be turned off.

Installation across drives — As it stands, I get a small speed boost and a major OS installation advantage by being able to install my home directory on a different drive or partition than the rest of my OS. This is great for home users because it means they can reinstall the operating system without damaging any of their personal data or application settings. It's also great for server users because MySQL databases can sit on a raw partition, which is often faster because the database doesn't have to follow the rules of a filesystem. The best I can manage in Windows is to create a separate partition and manually save and copy files to that partition after the fact. Nothing is automatic, and there's a large gap in functionality between the two. Unlike Unix OSes, Windows does not mount all filesystems fluidly together.
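For the curious, on the Linux side that split amounts to two lines in /etc/fstab. Here's a minimal sketch, assuming one disk with the OS on one partition and /home on another (the device names and filesystem type are illustrative, not prescriptive):

```
# /etc/fstab - OS and /home on separate partitions
# <device>    <mount point>  <type>  <options>  <dump>  <pass>
/dev/sda1     /              ext3    defaults   0       1
/dev/sda2     /home          ext3    defaults   0       2
```

A reinstall then formats only /dev/sda1; the installer just mounts the existing /home partition back into place, data and settings intact.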

Security built in — With Vista, Microsoft attached "User Account Control" to Windows, and that turned out to be a major annoyance that did little to aid security. It prevented nearly every program from running because Windows required administrative privileges to run nearly every program. When all users have instant administrative control, that's a bad thing, and a security problem. That's why they pushed UAC through. But UAC popped up for everything, and most users just turned it off so they could be allowed to use their computer. Again, this is a bad thing, creating even more of a security problem. With Windows 7, not much has changed. Users can now select how many UAC warnings they receive. What will be the effect of this? Just like last time, users will either be annoyed or turn it off. Still a bad thing. Still a security problem. When Microsoft manages to write an OS that has security layered into its core, when they can sort out what should and should not require administrative privileges, they might have a chance at winning me over.

Fragmentation-free file system — I don't want to have to spend hours every month defragging my hard drive and slowing my computer to a crawl because my operating system allows fragmentation to happen. I certainly don't want my computer to do this in the background on a schedule that I'm unaware of, slowing my computer down when I need to use it. The file systems used by Mac and Linux are designed to keep fragmentation from happening in the first place. A defragging program is not the proper solution. The NTFS file system is over fifteen years old now. It's no longer "New Technology." I never wanted to work for my computer in the first place, and it's time to ditch this abhorrent system.

Singular application installer and updater — In Windows, when I want software, I go to the Internet and either search Google or go to a website that I know carries that software. I install it using a six-page installation wizard that probably only needs to be a one-pager. I install software one program at a time. And then a week later, when the fifteen programs I took the time to install last week have been updated, I have to either download the software from the individual websites again and reinstall them all separately through more wizards, or run fifteen separate updater programs in the background constantly, each just waiting for an update to happen. Neither of these is a viable option. Linux uses central, customizable repositories to pipe software through a single installer/updater/uninstaller program that lets me install, update, and uninstall as many programs as I want simultaneously, in one fell swoop. Again, even my cell phone does this. And as with the fragmentation problem, Microsoft should fix the core problem instead of adding layer after layer of faux-solution to bandage it.
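For a sense of what that looks like in practice, here's a Debian-style session (the package names are only examples). One command refreshes the index, one updates everything installed at once, and one installs several programs together, dependencies included:

```shell
# Refresh the repository index, then update every installed program at once:
sudo apt-get update && sudo apt-get upgrade
# Install three programs in one shot, dependencies resolved automatically:
sudo apt-get install firefox pidgin abiword
```

No wizards, no fifteen background updaters; the same tool also uninstalls cleanly with apt-get remove.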

Let me customize! — I don't want the ugly Aero interface, and I don't want the even uglier atrocity that I get when I turn off Aero. I want something that I like and that I choose. I want my colors in front of me. I want my style, my appearance, my everything. Please, Microsoft, let me do this without paying for third-party software that only adds a separate layer to the problem. The software linked above uses the Windows API to accomplish this, which means that the functionality exists deep within the Windows system files. If Stardock can do it, so can you. You can implement it straight into the software. Do it, already! Let me use my computer the way I want to.

Saturday, July 25, 2009

Microsoft has been running ads telling me to upgrade my browser. Their claims? That my slow browser is probably "several generations old." They tell me that IE8 is "a huge improvement on the speed scale."

These statements are true if I ordinarily run Mosaic. Of course, IE8 is one more generation of the same old software, and to say that IE8 is faster than IE7 is kinda like saying that Windows 7 will be faster than Vista. They'd have to try hard to make it the other way around.

Of course, they completely fail to mention that IE8 fails almost as hard as IE7 when it comes to meeting web standards. IE7 turned the Acid2 test into an unrecognizable mess; IE8 finally renders the smiley face it's supposed to.

Congratulations, Microsoft. You finally passed the Acid2 test. But every other browser's been breezing through for quite some time now. Find something else to impress me with.

And how about Acid3? IE7's rendering is a debacle of mangled distortion. And IE8's major improvement? Wow! Now it says "FAIL" in giant letters just to let you know that it does, in fact, fail. It probably says that in the IE7 rendering as well, but it's difficult to tell what with all the mangled distortion of crap.

This test takes about two to three seconds to run in Firefox and Chrome (with Chrome running about half a second faster). In Internet Explorer, it's about nine seconds. How's that for fast!

One generation older and seven seconds slower.

On an unrelated note, after Microsoft's failed ad campaign starring Jerry Seinfeld, you'd think that other has-been performers from the 90's would realize that perhaps Microsoft commercials are not the best way to make a career comeback.

GoboLinux is a Linux distribution I heard about from a friend who said it looked interesting for its flagship feature: a simpler file structure. I decided to check it out.

I downloaded the distribution ISO from their website, which was easy enough, and booted up VirtualBox with that ISO mounted as a drive.

Installation

Installation was simple enough for an intermediate user to do. Instead of Ubiquity or another similar installer, the Gobo folks have opted for a custom piece of software.

I booted to the live CD's desktop environment. You must manually run startx to kick on the X server from the live CD, which is a good or a bad thing depending on who you are. I believe this distribution is designed for users with mid-to-high levels of Linux experience, and if you have that, this is no big deal, especially since the first console output tells you exactly what to type to get things done. For novices, it would be confusing and unappealing, though not insurmountable. I doubt that newbies are the target audience for Gobo.

I had to boot to the desktop environment because the command line installer will not run without having already partitioned your drive. While I'm sure there was a command line partitioner available, I am not familiar with any of them, so I opted for the desktop install. Even from that environment, ease of install is hindered by the fact that the installer errors out if your drive is not already partitioned. Partitioning must be done separately, then the installer can be run.

The installer itself is easy to understand, if a bit unsightly. Again, this won't be attracting novice users, but anybody who knows a bit about computers will be able to figure it out in no time.

Installation was quick, even for the full package. My VM took care of it in 10 minutes or so on low-grade hardware, once I got through the install configuration sections.

Trying to reboot from the desktop environment was not possible without using the terminal. I exited my session and was spat out into a terminal still executing from the live CD image. I gave the reboot command and finally got to boot into the OS, ready for action.

First Boot

The bootloader allows for three options, as shown here:

So far, I have only booted to the Graphic Desktop option, but it's nice to have those other options available. Yet another sign that this is not for your average Joe the Plumber, this screen requires the user to make a selection before it will boot. I waited for quite some time, and it never timed out.

The boot is noisy until it hits runlevel 2, then it starts a slightly quieter boot process.

Desktop Environment

The default desktop environment for Gobo is KDE 3.5. I've never personally been a fan of KDE, but I could install Gnome on here if I wanted to.

The login process is more or less the same as any other KDE login:

The default desktop wallpaper is pretty lame, but most Linuxes tend to use their own logo by default. Fortunately, a collection of good photographs and textures is available, and you can, of course, add more as desired.

Three icons greet you: Home, Trash, and Manager. Manager is the applications management frontend for Gobo. It's recipe-based, and is supposed to provide users with a list of available versions of various software packages. Software can be enabled, disabled, and linked (more on linking later). The distro's website contains a decent collection of recipes for various products. It's poorly organized, but it's there all the same. The search feature helps to find what you're looking for.

Manager itself executes in a terminal, then creates an additional window for the GUI frontend. Naturally, closing the terminal closes the GUI as well. That's a bit unrefined, but not a killer problem.

The control panel is the standard KDE one, and it works well enough for changing your settings.

As with other KDE-based distros, the default text editor is kate. Text editors are a big thing for me since I hand-code for the web. If anybody else cares, kate does a decent job with syntax highlighting. It's not as good as gedit in Gnome, but it handles JavaScript embedded within HTML fairly well.

The web browser, as you may have guessed, is Konqueror. Again, I've never been too impressed with KDE and its suite of software, but at least it renders basic HTML, CSS, and JavaScript properly, which is more than Microsoft can say for Internet Explorer.

Software Installation

As mentioned before, Manager is the GUI frontend to Gobo's software installation system, and that system is a good one that solves some real problems. There are tons of different ways to install software in the Linux world. Gobo starts from the perspective that compiling from source is the best way to obtain the most recent version of software, and addresses it with a custom program called Compile.

Compile combines the functionality of various other compilation tools (the likes of CompileProgram, makefiles, and xmkmf) so that source code always compiles correctly. It also adapts the file paths used by those tools to match the specialized file structure of Gobo.

Compile uses recipes to know where to get the source code, what version it is, what the checksum is for download verification, and how the compiler should react to the download. This turns the terminal command for installation into something as simple as "Compile foo" or "Compile bar 1.1" for a specific version. If properly managed, this is a great and easy way to get the job done.

The Flagship

So far, Gobo is a pretty mediocre distribution of Linux, but it does one thing extraordinarily well, and the devs flaunt it rightly. The file structure has been completely reworked. This is the root directory:

Seven directories. That's it. All of the traditional Unix directories are still there, but only as symlinks that point into the new locations. All the details about how they do it are on the GoboLinux website, but here are the important parts:

"GoboLinux is a modular Linux distribution: it organizes the programs in your system in a new, logical way. Instead of having parts of a program thrown at /usr/bin, other parts at /etc and yet more parts thrown at /usr/share/something/or/another, each program gets its own directory tree, keeping them all neatly separated and allowing you to see everything that's installed in the system and which files belong to which programs in a simple and obvious way."

Here's the breakdown alphabetically:

Depot - This is a shared directory for all users. The devs call it a "community area" or an all-users home folder.

Files - This is a shared directory for applications. Files that are needed by multiple programs but are not necessarily owned by any one of them are kept here, such as fonts and wallpapers.

Programs - Each program gets its own directory here wherein it can install all of its files and folders.

System - Here are the critical system files, the ones that make Gobo work. These include the Linux kernel, boot software, initialization scripts, settings for various software (/etc content, in other words), and all the symlinks that make the new file structure work (and compatible with other Unix-like file structures).

Users - This is the /home replacement, and that's more or less all it is.

Really, the system is quite intuitive. Hisham Muhammad, the ringleader of the GoboLinux development community, makes some very convincing arguments for the new file structure on the GoboLinux website. He makes a good case for change and explains how, despite the deviation from the standard Unix/Linux file architecture, it remains compatible with the old Unix format, if not more so.

"Through a mapping of traditional paths into their GoboLinux counterparts, we transparently retain compatibility with the Unix legacy. There is no rocket science to this: /bin is a link to /System/Links/Executables. And as a matter of fact, so is /usr/bin. And /usr/sbin... all "binaries" directories map to the same place. Amusingly, this makes us even more compatible than some more standard-looking distributions. In GoboLinux, all standard paths work for all files, while other distros may struggle with incompatibilites such as scripts breaking when they refer to /usr/bin/foo when the file is actually in /usr/local/bin/foo."

Conclusion

After many years of Linux use, I have come to find what I like and do not like about Linux-based operating systems. I want a system that I can install quickly and easily, without much hassle. I love the repository-centric installation and updating of software packages, but I could probably adapt to a recipe-based installation system. I much prefer the Gnome desktop environment to KDE, XFCE, Enlightenment, or any other that I have come across. In these respects, GoboLinux fails to meet most of my expectations of a Linux OS.

But don't let that fool you. This is a perfectly stable OS with a lot of benefits. Most importantly, Mr. Muhammad has a really great idea going with this retooling of the file structure. It's user-friendly, system-friendly, and Unix-friendly. It's simple and clean. It goes a long way toward simplifying Linux for average computer users. While many aspects of the OS are more difficult to use and while I believe the new structure could use some further improvement, Gobo makes a great point with this. Many Linux users will disagree, and that's fine. That's why we have communities and a multitude of distributions. I personally wish that Ubuntu or Fedora would adopt a system like this. It could take those other distros a long way.

I won't be using GoboLinux on a regular basis, but I'll be checking back in on it in a year or two to see how far it's come. This is a distro with major potential.

Saturday, July 11, 2009

I, like most Linux advocates, have demonstrated the power of Linux to people who had no idea such power existed in technology, much less in free technology.

I have a friend who was at my house one day while I was at work. For some reason, he needed me to reboot my computer. So I did, via SSH. When I'm at work, I stream music off of my desktop at home for my listening pleasure via the web. Sometimes if I need to do something from my desktop when I'm working, I'll X-forward an application to my work computer and do it that way.
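None of those tricks requires anything exotic; they're one-liners with OpenSSH (the hostname and the application are placeholders for whatever you actually run):

```shell
# Reboot the home box from anywhere:
ssh me@home.example.org 'sudo reboot'
# Run a graphical program on the home box, displayed on the local screen:
ssh -X me@home.example.org rhythmbox
```

The -X flag enables X11 forwarding, which is what puts the remote application's window on the local desktop.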

People see all of these things, not to mention the various desktop effects that Compiz provides, and marvel at how awesome and powerful it is. They ask what it is. When I help them set it up, they get frustrated and quit. These are the problems faced by many a Linux demonstrator. The complaints that I hear are consistent. Always.

But I'm not here to complain. I'm here to solve problems. So I will offer a solution to each complaint that I've heard a dozen times as I go.

The first complaint:

"I put a DVD in my player and it won't play."

This problem, like most of them on this list, is the result of user ignorance brought on by some external force or another. This particular problem comes about when terms like DVD and MP3 become marketing buzzwords. Their definitions are reduced to the most distilled, basic descriptions of what they should be because products sell better that way. Here's how these words look in a Best Buy salesman's dictionary:

DVD - n. A movie.

MP3 - n. Music.

But they're both so much more than that. They're patented formats, which makes all the difference. Any time you purchase a DVD, a DVD player, or a DVD-ROM drive, any time you buy any device that plays MP3 files, part of the money you spend goes to the owners of the DVD and MP3 licenses. Some company turns a profit every time you buy a DVD or MP3 player, whether or not they manufactured or even designed the device. When you pay Microsoft for Windows Media Center Edition, which comes out of the box ready to play DVDs, part of what you're buying is the right to watch DVDs on that device.

Most Linuxes are free of cost, but more to the point, they're about freedom of usage. How free are you to play a DVD anywhere? In other words, why should you have to buy the right to watch DVDs that you have bought from a store, rented, or made yourself? That's not freedom, and it's why most Linuxes won't give you the ability to decode DVDs immediately upon install.

But people shouldn't be discouraged by this. The guys over at VideoLAN have begun distributing a library of code called libdvdcss that decrypts DVDs just fine. And your music, those MP3s and AACs and WMAs and WAVs and whatever else you may have can be listened to by installing the appropriate libraries from the amazing folks developing gstreamer. Installation of these packages is simple enough, either by direct download from those websites or by using whatever repositories your distribution makes available to you.
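On a Debian-based distro of this era, for example, the whole fix is a couple of installs (exact package names vary by distribution and release):

```shell
# DVD decryption support via VideoLAN's library:
sudo apt-get install libdvdcss2
# gstreamer plugin packs that cover MP3, WMA, and friends:
sudo apt-get install gstreamer0.10-plugins-ugly gstreamer0.10-plugins-bad
```

Two commands, and the "it won't play" complaint is gone for good.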

The second complaint:

"Some piece of my hardware doesn't work."

This is a more legitimate complaint than the first, but I still must object to the reason why the complaint is valid. Microsoft has a long history of signing contracts with hardware manufacturers forbidding them to write drivers for non-Windows operating systems, or even to disclose the workings of the hardware to non-Microsoft developers, in exchange for promotional deals. This effectively prevents Linux drivers from being written.

Unless, that is, a truly dedicated Linux dev decides to reverse engineer the driver from the activities and data transmissions of the hardware. This is difficult to do, and time consuming. These devs are a benefit to us all.

Even understanding that the situation with Linux hardware drivers is dismal, I still can't justify giving up entirely on such a problem. Rarely is a piece of hardware rendered completely unusable by a driver issue. Whole forums exist for the sole purpose of spreading working hardware drivers for Linux.

The main complaint used to be that wireless cards never worked. Back in the day, you might have had to run the card's Windows driver in Linux through NDISwrapper, or rely on a reverse-engineered native driver like MadWifi. Those days are gone. Ubuntu now has built-in drivers for Intel, Atheros, and Broadcom wireless chipsets. These are automatically detected and installed in roughly three clicks of the mouse. Linux is no longer in the sorry state it used to be in, and I'm not sure I'd even have called it that. Let's not forget that Linux initially grew and spread when people passed around the OS in its sapling form, writing drivers to match their own hardware and creating, in the process, a massive driver base.

The third complaint:

"Installing software is too hard"

This is an almost unforgivable combination of terminology confusion and the false belief that Linux is just like Windows, but better.

A Windows user wants to install a program by putting a disk in their computer and waiting for an installer to pop up onscreen. Then they want to go through seven wizard pages, ignoring all settings, just clicking "Next" a bunch of times. Then they want to wait while it installs, then have to dig through thirty unorganized program listings to find the software, then dodge some links to various readme text files and Internet links to corporate sponsorship websites to finally launch their program. And they want to do this for every single program they install. It's simply too much of a hassle to learn that Linux does things differently.

In Linux, you run a program that was installed when you first put the OS on the system. You search there for the software you want. You check a box suggesting that you want this installed. You do this for all programs that you want installed. Then you click an Okay button and wait while all software is automatically downloaded, all dependencies taken into account, then installed and configured to a predetermined best practice set of settings. Then they get organized by type - Internet, Games, etc. - so you don't have to pick through so much, and then they're alphabetized. The process is simpler, faster, and more organized. It's not harder. It's just not what they're used to. Uninstallation is just as easy.

The fourth complaint:

"Linux is not good for gaming."

Again, we see a confusion of terminology. Linux is actually great for gaming, especially since you don't have so many of your resources tied up in the OS alone. (Side note: generally, I speak of Windows XP here since that's still the version that's most common in the world, but it gets worse with Vista and Windows 7.)

The problem is that Linux isn't popular enough for commercial game companies to bother porting all their games over. It's not financially viable. This, I will admit, is a legitimate reason to keep a Windows installation around. I myself dual-boot between WinXP and the most recent release of Ubuntu.

That field is changing, however. id Software is porting more and more games to Linux. Some older games are being ported over as well, which proves that it's doable and, in the process, lays out a pipeline to take care of future ports.

This is not to say that some great games do not already exist for Linux. Take a look at Savage 2 or Sauerbraten or Nexuiz. The graphics are phenomenal, the gameplay is great. The only thing these games lack is a storyline, but the tremendous success of the Unreal Tournament series, Team Fortress, and the Quake Tournament games prove that a storyline is not an absolute requirement of a good game. If you're looking for something with a plot, you can check out the Penumbra series of games.

Linux is great for gaming; some might even argue that it's better for gaming than Windows. The games are just less available. Just clearing that up.

The fifth complaint:

"I have this program that runs fine in Windows, but it doesn't run at all in Linux."

The people making this claim apparently never looked at the system requirements for the software, and therefore never saw the part where it required Windows. This problem is to be expected. Linux has an entirely different software architecture beneath it because, well, it's a different operating system.

Giving up at this point is a display of short patience and of ignorance of the equivalent software out there for Linux. Need a word processor? Try AbiWord. Need a full productivity suite? Try OpenOffice.org. Need a Visio-style diagramming program? Try Dia.

Do none of those match your specifications? Install WINE and try running your Windows software that way. The fact that Linux can execute a large chunk of Windows code is more than Windows can claim in reverse.
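Giving WINE a shot is about as simple as software gets (the installer filename here is hypothetical):

```shell
sudo apt-get install wine   # on a Debian-based distro
wine setup.exe              # run a Windows installer under WINE
wine notepad                # WINE even bundles its own Notepad clone
```

Installed Windows programs then show up as ordinary launchable applications.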

If all else fails and that software is an absolute necessity, then, yes, there's always Windows. Feel free to use it.

The sixth complaint:

"I asked how to do something, and they told me to type commands. That's not intuitive!"

Chances are this user was told to do it in the terminal because it was either the easiest way to describe the solution in a message board or chat room, or the fastest way to get the job done.

Probably, there's a way to do it through the GUI, but they asked how to do something that's been bugging them, so they were shown the fastest method, if perhaps not the method that they would have liked.

Besides, the terminal's not the scariest thing in the world. Running an apt-get command isn't the same as compiling software from source. Being told to use the terminal once, and being told exactly what to type, is not the most complicated thing one can accomplish on a Linux system, and it's not a constant requirement. Most full-time, die-hard Linuxers will use the CLI very regularly. I know I do. That's because we know it's the fastest way to get things done and we've taken the time to learn it. That's not really expected of a first-time user, especially one who isn't technologically inclined. Next time, the user should specifically ask how to do the task using the mouse instead of the keyboard.

The seventh complaint:

"Linux is not ready for the desktop."

This isn't so much a complaint as it is the culmination of all the others. People fear new and different things. Human beings don't always react well to change. The natural reaction is to call the change inferior, even in the face of obvious evidence that it isn't.

There are several million Linux users worldwide who would fundamentally disagree with this statement, and they're proving it wrong on a day-to-day basis. Millions of people use some form of Linux as their primary desktop OS. In fact, there are millions of people who use Linux every day and don't even realize it. The system is so versatile that it can be tweaked to run on everything from iPods to TiVos to cell phones. It is backed and assisted by multi-million dollar corporations like IBM and Google. Believe it or not, even Microsoft has been reported to serve some of its web content from Linux machines, which goes a long way toward showing their faith in their own product.

But if people give it a try, adjust to changes for the better, and most of all contribute in some way to the community that keeps advancing Linux and its derivative software far beyond the scope of Microsoft Windows, it just might work out for them. It already has for millions of users.

Google has announced an operating system of its own, Chrome OS, and naturally, the tech world goes crazy. I've seen people say such polarized things as "Today marks the knee of the great curve of Microsoft's decline" and "Google's vision is meaningless to consumers." I've seen people say that this is just what the world needs to push Linux into the spotlight and spread its market share, but alongside those, I've seen people criticize the proposed operating system as a direct affront to Linux. In truth, the tech world is full of zealots and fanboys who will say pretty much anything to draw attention to their OS/browser/etc. of choice. Let's look at this logically, with level heads.

Before we can begin to decide how this affects the computing community at large, we have to define what it is. So here are some basic facts about the project, according to what Google has published. It is:

Open-source

Lightweight

Targeted at netbooks

Focused on "speed, simplicity, and security"

Capable of booting and being on the net "in seconds"

Running Linux at the core

Running a new window system (not X)

Running the webkit-based browser Chrome on top of that

Not Android

Let's tackle the two obvious questions.

What does this mean for Microsoft?

Quick answer: who knows? Certainly, Microsoft's long-standing campaign of deceit, condescension, and failure to meet promises is beginning to fail them, especially in light of Apple's advertising campaigns of the past few years, which have been an unabashed, straightforward attack on Microsoft, even if they were largely a heap of fibs themselves. At the end of 2006, it would appear that Microsoft had roughly 94% of the market depending on how you measure it (and if Wikipedia's sources are accurate), while Mac claimed 5-6% and the various Linux distributions carried less than 1% of the pie. At the end of June 2009, those same sources show various Windows OSes occupying a little less than 88% of the market, Mac's share increasing to roughly 9-11%, and Linux still hanging in at a little less than 1%, though a larger share than before.

Chrome OS fits a niche market: the netbook. These are those (almost annoyingly) small computers that are meant to have only enough processing power to put a person on the Internet and do some basic word processing. They are definitely helpful for the technologically disinclined. A person wanting to actually accomplish something impressive will look elsewhere, but netbooks definitely have a good reason to exist. I'm sure that Chrome OS will run quite nicely there. It's a niche that Microsoft will have a hard time filling, even with Windows 7, which is intended to do just that.

Win7 requires 16-20 GB of free space on the hard drive, and that's before you set aside any room for other software. Microsoft Office will then chew up another 1.5 to 3 GB depending on how much of it you need. Tack on the high video card requirements and you have a stodgy, slow, and ugly operating system that devours more of the disk than necessary.

Most of the affordable netbooks have much larger disks than that, so the wasted space may not mean as much as it used to. However, the fact that Windows 7 sets up automatic defrags, even on hardware such as solid-state drives where defragmentation is both unnecessary and physically damaging, makes it a dangerous operating system to run on a netbook without doing some major tweaking first -- tweaking that the kind of people who make up the netbook market probably won't know to do.

So what's worse for Microsoft, really? Is it really Chrome OS that will do them in, or is it Microsoft's own horrendous software? Besides, what Chrome OS proposes to be is something even less than what many Linux distributions already are. Ubuntu's most recent bit of awesomeness -- Ubuntu Netbook Remix -- offers a full desktop experience with a complete productivity suite, an IM client that works across practically all chat protocols, and other common net-based software (blogging clients, microblogging clients, casual games, etc.) about three clicks away, all in less than 1 GB. Chrome OS will offer only a browser that isn't very extensible yet. The Chrome browser may execute JavaScript faster than anything else right now, but there are tons of standards it still doesn't support, and a person's web browsing experience will be less "correct" there than with Firefox 3.5 on any OS at all. Windows 7 can't fit any of those bills, either.

It looks like it could affect Microsoft in a couple of ways, though.

First, it might expose Linux to more people. Don't let that statement confuse you. What I mean is that it will succeed. Trust me. Google has enough money to create muscle in the market. It'll be popular to some degree or another, even if only within this niche netbook market. But even though it's Linux-based, and even though it's open source, I sincerely doubt that the terms "Linux" and "open source" will have anything to do with Google's advertising plan. Those terms would only confuse customers who might otherwise be willing to buy into the concept.

Second, Google's tactic of late has been simply to undercut Microsoft's pricing, which is easy to do given the outrageous dollar value Microsoft places on its operating systems. Since Chrome OS (COS) will be Linux-based and therefore open source, undercutting that cost is almost expected. Linux already does this, but it has little to no advertising behind it, and telling a potential user that there are hundreds of Linux OSes to choose from doesn't exactly help win over the hearts of the masses. COS will provide a solution to both of these problems, and that will definitely help reduce Microsoft's monopolistic, anti-competitive hold on the computing world as a whole.

So where does that leave Linux?

Linux has, up until recently, been a "geek" thing, a mildly unfair stigma that has only been slightly loosened by more recent, "user-friendly" distros like Ubuntu and its derivatives, Linux Mint among them. This will not change. I keep hearing people make very bold claims that this is either really, really good or really, really bad for Linux on the whole. But one of the founding principles of Linux is freedom of choice. If you don't wish to run COS, there is nothing stopping you from using Ubuntu or Gentoo or Fedora or Sabayon or PCLOS or any other distribution. Hell, you could even run Windows 7 if you so choose. One's decision about what OS to run is exactly that -- one's own decision. The introduction of COS to your miles-long list of options only changes things inasmuch as your list becomes miles long plus one.

In fact, Linux has a chance to improve greatly. Having a popular Linux OS on the market, backed by an unbelievably filthy rich corporation, provides incentive for hardware manufacturers to release more official and better hardware drivers for the Linux platform. I'm not complaining about the quality of existing drivers, only the lack of drivers for a great deal of hardware. It's one of the main reasons people quit Linux before they really begin: they can't get audio working, or their video card's driver can't drive X at the right screen resolution, or they really need better Bluetooth support. Granted, Linux has better hardware support than Windows in most cases, at least as far as having it ready at install time is concerned, but overall it's lacking in this department. That's not Linux's fault. You can legitimately blame Microsoft for this. Their long-time practice of colluding with hardware manufacturers is so deeply entrenched in the technology community that special words have been coined to describe it -- consider "Wintel," for instance, describing the Brangelina-style marriage of Microsoft to Intel. Having a Linux-based OS in the mainstream could help facilitate a long-needed divorce.

In short, Chrome OS isn't going to change the world. It barely even has a home in the small portion of the market it intends to occupy, though perhaps it will occupy it, maybe completely. If you consider that the only other major OS suitable for netbooks is not as suitable as Microsoft would lead you to believe, you've got a clear win for Google. But if you think Windows 7 is more capable on a netbook than Google's offering, then there's a clear win for Microsoft.

Only one thing's for certain: Microsoft now has real competition, and it's been a long time since they've had to react to that. I guarantee that Microsoft cannot simply purchase Google and continue the Assimilate-and-Deprecate routine they've become so adept at when facing opposition. And that is definitely a good thing.

Saturday, May 23, 2009

...but it doesn't?
Here's the article.
I really don't need to say much here, but I feel it necessary to point out that Microsoft has been saying for a very long time now that Windows 7 will be faster, smaller, and lighter than Vista. And then they go and release specs saying your video card will need to be just as powerful, but you'll need twice the RAM and an extra 1-5 GB of hard drive space for the OS alone.
Sure, the OS may be faster, but it's definitely not smaller or lighter. As for me, I discovered the other day that I can boot Ubuntu 9.04 "Jaunty" from POST to desktop in less time than it takes to launch Microsoft Outlook. And Jaunty consumes less than 4 GB of hard drive space, less than 256 MB of RAM, and my desktop effects run fantastically on 64MB of video RAM.
I'm not gonna go into a rant here because the numbers speak for themselves. Just saying.

Thursday, May 14, 2009

There are a slew of lawsuits being filed back and forth between the major computer and video processor manufacturers. Intel's suing NVidia, claiming that NVidia doesn't have the right to make VPUs that integrate with specific, Intel-patented processor-assist technologies. A company called Techsearch is suing Intel because Intel's hardware is based on an archaic model that was patented years ago by another company called International Meta Systems (IMS), which has since gone under and whose patents were purchased by Techsearch. AMD is suing Intel on anti-trust charges, trying to convince the courts that Intel is being anti-competitive and trying to own the world's computing platform design.
A few years back, Intel finally manufactured a processor that marginally outperformed AMD's equivalent, and since then, rumors of AMD's demise have circled the Internet and geek culture. Buzz has also been passed around saying that AMD has a real knock-out product in its future, a set of integrated processor-based technologies that will blow Intel back out of the water, but that they do not currently possess the R&D budget to complete it. Waiting for AMD's ultimate failure is like waiting for a bomb to drop, because it would mean Intel's monopolization of the PC hardware platform.
I, like many others, do not want to see AMD go out of business, and not just because of what that means for Intel. AMD has consistently produced absolutely amazing processors. I've used AMD for years and would not go back by choice. Even with the Phenom product line, advertised as a failure by Intel and consistently posting lower benchmark scores than the Intel Core 2 lineup, I chose the Phenom X4 to build my latest computer with. I did this because it was a full $100 cheaper than the Intel Core 2 Quad I had been eyeballing with distaste, and the benchmark disparity is really quite minor, if consistent. AMD has always offered their processors at a lower price than the Intel equivalent, despite having benchmarked better right up until the Phenom line, so the "more for your dollar" principle has always applied.
But what would happen if AMD crashed? Could it actually benefit the computer market in the long run? I know this sounds crazy, but hear me out.
In this dystopian fantasy world, AMD goes under and Intel monopolizes the computer processor market. Intel will inevitably have anti-trust charges brought against them. After litigation, some amount of reparation will happen, potentially benefiting the owner(s) of the AMD name and its operational facilities. But Intel cannot necessarily be forced into outright paying AMD to continue operation, so it's very likely that they would instead be forced to release the x86 architecture from its licensing fees.
Long story short: AMD has always licensed this architecture from Intel, paying them regularly for the right to produce processors that run the same operating systems that have sat atop the Wintel *cough* I mean Intel platform all this time. In this manner, Intel has always profited from AMD's success. But if the license no longer cost money, AMD's total cost of operations would be severely reduced, and the market would also be opened up to other, more independent developers. That means competition (which was the point of antitrust law in the first place), and that's always good. I would venture to say that I would stop buying AMD hardware if another competitor produced better hardware at the same cost.
This new processor market would benefit everybody, since competition becomes more possible, and competition is the predecessor of innovation: hardware continues to get better at a faster rate. But do we really want to spend two or three years under the reign of a monopolistic Intel to get there?
On the other hand, Intel would have to play their cards well during this proposed AMD downtime. The fear goes that Intel would own the market and jack up their prices. Having no alternative, the market would be forced to pay more and more for the same crappy products they've always produced, which might end up crappier than ever before thanks to the lack of competition. Intel would be safe from consumer brand-switching. However, if Intel knows or thinks in advance about the possible court-driven resurgence of AMD (or other competitors), they would be wise not to do this. AMD, despite their license payments, have always managed to keep their prices lower, so even if their return to the market greets us at the same prices as before their failure, AMD will appear even better for the dollar than they do now. Any other competitors would have the same advantage. This might even lead to a cycle where Intel is doused and bounces back.
Who knows, though? This is all just speculation on my part. In all reality, I want Intel to succeed. But only if AMD does as well. The competition is what keeps me able to build a system for $1000 that Dell sells for $4000. My prices stay low and I have a satisfying option. Live long and prosper, AMD.

Wednesday, April 15, 2009

My CS instructor is a believer in cloud computing, and after reading an article he sent me singing its praises, I wrote this rebuttal. For the record, I really like my CS teacher. He's intelligent and funny and he seems to know what he's doing. I just disagree on this one point. So I thought I'd post what I wrote here, just to continue the argument I started a while back.
In truth, this article only goes to further my dislike of the concept. When we centralize processing power, when it becomes a utility that we buy into, a few things happen:

The center becomes an obvious and frequent target for malicious crackers who want my data.

Centralized processing providers become a corporate entity, and they become exactly as inefficient and careless as other corporations providing any other product or service (more detail in a moment).

The economy on the whole takes a downturn.

The experience of the individual computer user is unfairly diminished, as is the value of the non-cloud system we currently have.

In fuller detail...
Data Mining
I work for a company called Multimedia Games. We develop video slot machines, mostly for Indian casinos on reservations. When you play these video slots, you're really playing Bingo cleverly disguised to look like video slots. Since we are bound by a lot of legal restrictions, Bingo must be played exactly by the rules, which means that multiple people need to be playing at once to get a game result (winners/losers), which means that a network is involved. We have a rack of servers installed at each casino that plays the games and sends the win/loss data to the playerstations, then records financial data, gameplay data, marketing data, and a ton of other data related to casino-specific needs in a database. This data is crucial to the continued operation of my company and the individual casinos. Also due to the heavy regulation, we go through tons of measures to make sure this data remains secure. To host it on shared devices or virtual machines in the public domain (meaning the Internet, not in the copyright sense) would be an extremely dangerous move for our business.
This is a specific case, but it's a bad idea in general. Any time you gather data onto one set of centralized servers, those servers become an immediate target for crackers who want that data for malicious purposes. Hardly a month (or a week) goes by without some kind of attack or security failure on an email server or MySpace or Facebook or an operating system in general becoming public knowledge. Technology is not infallible. In fact, when it comes to security, the best we can do is make things difficult for attackers. No amount of protection will keep out a person with enough knowledge and determination, and that alone is reason enough to distrust a centralized, utility-style data center.
Corporate Inefficiency
I'll use Dell as an example here. They used to be a small company renowned for quality computers at affordable prices, backed by a solid warranty and service center. This, of course, attracted more and more customers, and these days almost 50% of the PC market is populated with Dell desktops and laptops. Dell has infiltrated the corporate workplace and constantly battles Hewlett-Packard for workplace market share domination.
Dell became huge and decided, as corporations inevitably will, to cut costs. Their hiring policy is now one where most entry-level positions last six months before that wave of employees is fired and a new team is brought in; it keeps them from having to give raises to the largest segment of their workforce. Their computers are no longer stable pieces of hardware, often shipping with 130-watt power supplies that bust every time there's a mild surge and frequently pass excess power to motherboards built with cheap capacitors that have a tendency to explode, necessitating the purchase of an entirely new computer with surprising regularity. As an employee of a consumer electronics repair facility for two years, I can safely say that 75% of the computers we repaired were Dells, which is inconsistent with their market share, further proving their inefficiency. In addition, Dell has failed to conform to well-defined hardware standards as a form of lock-in. The power supplies that fail all the time don't meet the ATX form factor (or mini-ATX, micro-ATX, BTX, or any other standard), which means that when one breaks, you have to pay over $100 to buy another low-wattage PSU straight from Dell. It's a way of monopolizing one's own product, despite the fact that better products exist through a million other channels. It's unfair to the consumer, especially when you consider that most of their technical support is outsourced to foreign countries, which decreases customer satisfaction but increases profit.
The point is that once people adopted Dell as a trusted name, Dell grew and grew and grew until they were too large to sufficiently support their operation. This will undoubtedly happen to cloud computing within a few years of its widespread adoption. By allowing corporations to run our operations, we open ourselves up to this kind of reduced functionality and constant problems, which in turn affects our own ventures. Once we're locked into contracts, or, more likely, once it becomes too expensive to move our data back onto our own private server operation, the company providing the cloud service essentially owns our business operation and controls it fully. The only goal they need to meet is ensuring that the cost of keeping data and load-balanced processing with them stays cheaper than the cost of returning operations to our own servers and network.
In addition, a corporation must focus its attention on the customers that generate the most profit and ignore customers who use the service but belong to niche markets. This can be seen in Netflix's on-demand digital viewing service. Until recently, it required an Internet Explorer ActiveX control to work. People who know that this is a major security flaw complained about it; they wanted it to work in Firefox. Netflix has now accommodated them by using Microsoft Silverlight instead, but nobody likes Silverlight, and this is an obvious example of instituting the DRM that nobody likes via a corporate partnership with a company that nobody likes. Still, Mac users cannot use this service, and neither can Linux users, so both of those niche markets are ignored. A centralized cloud computing service would only end up doing the same thing with more devastating results. "What? You don't like the way this all works because you're a Mac/Linux/Firefox user? Tough! You don't pay this company enough for us to care!"
Economic Downturn
Because of this country's corporate, enterprising outlook on literally everything, we have come to the conclusion that spending less money is the equivalent of doing better business, but this is sometimes untrue. One example cited in the article is the expense of IT professionals and the cost of running a data center. The suggestion is that if we unify our data centers into one center serving thousands of companies, we can pay one group of IT professionals instead of a thousand, thus lessening expense overall.
The problem with this line of thinking is that it puts 999 companies worth of IT professionals out of a job. If a company needs to cut expenses, it should cut expenses related to product and physical material. It should use less paper or turn off its workstations at night or settle for cheaper workstations (since, as the article says, we're only utilizing 10-30% of their processing power anyway). It should not dispose of human capital. When citizens get paid, citizens then have money to put back into the economy. Reducing jobs is the last thing we want to do. Have we learned nothing from the current recession?
When we outsource to other countries, the United States as a whole loses money because we're paying US money to foreigners, removing the cash from our economy entirely. It's only a natural Step Two, if you will. Once we've centralized a ton of data centers into one in America, a company will realize that they can just move that data center overseas, pay people less to maintain it, and not have to worry about environmental stipulations that cost money in the States. Everything about the cloud computing concept screams, "Failed economy!" Don't you think it possible (and very ironic) that the very things that allow us to be a developed country could turn us into a third-world country?
Diminished User Experience
When something new happens technologically, it doesn't take long for it to reach the entire community of tech users. It extends into both the workplace and the home. Currently, less than 25% of Internet users have broadband access, and this statistic does not include the nearly 50% of computer users who do not have Internet access at all. If the cloud computing concept ever hit home en masse, it would be detrimental to the vast, vast majority of computer users.
Also, no matter how you cut it, there is simply no way to make centralized processing work for gamers, who would demand that 120 frames at 1920x1080 resolution be passed to their computers over the Internet, decompressed, and displayed on their monitors every second. No broadband connection exists that can carry that much data that quickly, and even then, the user would still be burning most of his CPU's power decompressing and displaying it. Not to mention that Time Warner is about to impose download caps to make money off of people who don't know any better. Cloud computing will likely screw over anybody on that type of connection plan.
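To put a number on that, here's a back-of-the-envelope sketch in shell arithmetic, assuming 24-bit color and no compression at all (real systems compress, but the raw figure shows the scale of the problem):

```shell
# Raw bandwidth for uncompressed 1920x1080 video at 120 frames/sec,
# assuming 24-bit (3-byte) color. Illustrative figures, not measurements.
BYTES_PER_FRAME=$((1920 * 1080 * 3))      # one full frame
BYTES_PER_SEC=$((BYTES_PER_FRAME * 120))  # 120 frames every second
BITS_PER_SEC=$((BYTES_PER_SEC * 8))
echo "$BYTES_PER_SEC bytes/s ($BITS_PER_SEC bits/s)"
```

That works out to roughly 746 million bytes -- about 6 gigabits -- every second, orders of magnitude beyond any consumer broadband connection, and that's before input latency even enters the picture.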
Since cloud computing in the home means that a computer would require a broadband network connection, laptops would lose their primary purpose - portability. If I can't use my laptop without an Internet connection, then why even bother with one? If I can't go to a job site in Middle of Nowhere, Texas and access my data, what's the point in a laptop?
There are a ton of reasons that cloud computing is a really bad idea, but the funny thing about it all is that cloud computing already exists on a much smaller and safer scale. I use it myself. I run a Linux server out of my home with a dynamic DNS address that always points to my server, despite the dynamic IP address my provider supplies me with. I run various forms of connection back to that server - it runs Apache, an FTP server, a Samba file share, and an SSH server - but the most important is X window forwarding over SSH. It's fast enough to use, if a little slower than actually being in the presence of my computer, and I can always access my hard drive and every program on that server as long as I have an Internet connection. But even when I don't, my client computer can still run software and save data until such time as I can transfer the data back to the server. There's little reason for an attacker to attack my computer, because I am only one person, and I don't store sensitive info on my hard drive. It's much better to attack, say, Google, to gather user data from millions of accounts and years of usage. I can use my server from anywhere, and I definitely benefit from this in ways that I could never benefit from an actual cloud computing service like the one defined in the article.

Monday, March 30, 2009

Ramsey replied via Twitter regarding my post on cloud computing:
I think the new in cloud computing is the idea of a cloud OS that keeps most of the data intensive parts on a server.
this would in theory allow someone with an old computer or a netbook to run a fast effiecent OS. The cool part comes when they
starts shareing the load, as in all of the computers running that os make up the cloud in some small way. Speeds unimaginable!
My response follows:
When the data-intensive parts of computer operation reside on the Internet, everybody's speed slows down because...

Even if there's a remotely operating OS, my netbook still needs a client OS to connect to what that server OS is outputting. My netbook runs perfectly quickly with a decent Linux OS on it. Windows 7 supposedly runs pretty quickly, too. But I don't want to have to pay for licenses to two operating systems just to use one computer. Windows is too pricey already, and that's a huge reason why Linux netbooks sell. A Windows license makes up 20% of the cost of a Windows netbook, and more than 50% of the cost of a cheap-ass E-Machine. And you'd like to raise that even higher just so I can make my data vulnerable and out of my control?
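For rough scale, here's the arithmetic with illustrative prices; the specific dollar figures (a $70 OEM license, a $350 netbook, a $130 budget desktop) are my own assumptions, not published numbers:

```shell
# Hypothetical prices (my own assumptions, for illustration only):
LICENSE=70    # assumed OEM Windows license cost in dollars
NETBOOK=350   # assumed netbook price
DESKTOP=130   # assumed E-Machine-class budget desktop price
echo "netbook: $((LICENSE * 100 / NETBOOK))%  desktop: $((LICENSE * 100 / DESKTOP))%"
```

Under those assumed prices, the license is a fifth of the netbook and over half the budget desktop, which is the shape of the claim above.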

If the data resides locally on my computer, then all of it must be transferred across the net to a server where it can be processed. Considering the generally atrocious upload speeds offered by most ISPs (and hopefully I'm not doing this over some kind of cellular network), my computing speed would decrease drastically, which is the opposite of what you propose, and the opposite of the direction technology should be heading.

If the data resides on the net, on some server in the cloud, and it gets processed on the same server handling the data of hundreds of other people, then my data must wait its turn in the processing queue. It would take some pretty powerful equipment to make that happen. I'm not saying it's impossible, just that I don't want to have to pay for that service when I'm perfectly capable of making it happen on my own.

And there is a plethora of reasons why cloud computing is either bad or not viable.

In 2008, 61.5 million users in the United States were connecting to the Internet via some type of broadband access. That sounds like a big number until you consider that 248.2 million users are online. That means that the rest of these computers connect via dial-up or some other slow connection. Translation: less than 25% of Internet users are connecting via broadband, and that's not even considering the number of people who are not online at all, but still use a computer at home. Generalized cloud computing is not viable, at least not in the sense that it's mandatory.
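The percentage follows directly from those two figures:

```shell
# 61.5 million broadband users out of 248.2 million total users online.
awk 'BEGIN { printf "%.1f%%\n", 61.5 / 248.2 * 100 }'
```

About 24.8 percent -- under one in four.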

In the setup I described in my previous post, my data still passes through the cloud of the Internet, but it resides on hardware that I own and control. In the popular cloud computing concept, it sits on hardware I have zero control over, alongside other folks' data. That makes it an immediate target for data miners who want my data. Nothing technological is impermeable, not even my setup. But with my setup, there is less likelihood of an attack because I am only one person. This is the same reason why Mac OS X, while proven to be less secure technologically than Windows, is still a safe alternative to it: there are fewer people to attack. A successful general attack on Windows will yield more victims than the same attack on Mac. If, say, Google were to run a cloud computing service with millions of users, that's a much better place to attack than my single-user setup on my personal hardware. Someone would practically need a personal vendetta to even bother with mine.

Even with broadband access, cloud computing is still slow. X forwarding is fast enough to use, but not fast enough to be my ideal way of using a computer. I use it because it's convenient: I can manage a web server from my cell phone or from work, and I can write documents or stream music or move my files around on my home computer even when I'm not there. But do I dare try to browse the web using a remotely operating browser? Do I try to do any amount of image editing remotely? Do I game? Of course not! Because even though my server has a 2 Mbps upload speed (roughly 250 KBps), that's still not fast enough to transfer that much visual data back and forth at a constant rate, much less all of my input. Cloud computing will take a very long time to catch up to that.
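To see why, here's a quick sketch assuming 24-bit color and zero compression (purely illustrative; compression helps, but not by the factor real-time remote rendering would need):

```shell
# A 2 Mbps uplink expressed in bytes per second, and how long one
# uncompressed 1920x1080 frame (3 bytes per pixel) takes at that rate.
UPLOAD=$((2000000 / 8))       # 250000 bytes per second
FRAME=$((1920 * 1080 * 3))    # 6220800 bytes in one frame
echo "one frame takes about $((FRAME / UPLOAD)) seconds at $UPLOAD bytes/s"
```

Around 25 seconds to push a single raw frame, against a target of dozens of frames per second.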

We have a long way to go before cloud computing becomes a viable option. I'd rather just stick to a configuration where it's a possibility for me when I need it, where my data is as secure as I make it, where I have total control over it, and where I don't try to exceed its limitations. I can't let my computer usage slow me down. When I'm thinking faster than my computer is, isn't my computer made unnecessary?

Friday, March 27, 2009

Cloud Computing - It's the new buzzword that profiteering hopefuls like Google and Microsoft have been using lately to sell products and services on the Internet that they have not fully developed. Cloud computing is the concept that your data will be kept on some server, or set thereof, somewhere in the "cloud." The "cloud" means "somewhere on the Internet where you have restricted access, but whose services you can use from your personal computer, provided you can supply the proper credentials."
An example of this that is already implemented: Google Docs. Also: Zoho. These are web-based pieces of software that let you work on various types of documents that you might ordinarily use MS Office or OpenOffice to work on. Your data is stored on their server, and when you log in, you can access the stuff that belongs to you. It can be edited, printed, exported locally, or imported from a local file.
This is pretty cool, and I've used both of these services before. Still, I have my reservations about agreeing to malleable terms of service and then storing personal information online. It just seems like a really good opportunity for data miners. Nothing is uncrackable. You need only look as far as a decent computer security blog to see that what we perceive as impenetrable is really completely insecure in the hands of the right people.
The other thing that bothers me is that, like "Web 2.0," the terminology has become a marketing buzzword meant to indicate something like "the way of the future." It's intended to arouse and excite. I have no doubt that some people will jump onto this concept like nothing else once it becomes more popular, simply because it fills in a definition for "cloud computing" that they never had to begin with. And the truth of the matter is that cloud computing is not a new idea. It's been around for years. Offhand, I couldn't give you a number of years, but it's been in use for longer than I've been using it.
That's right. I've been using cloud computing for some time now, and it's a pretty cool thing. A friend of mine turned me on to a little thing called X forwarding. Perhaps this terminology requires some explanation.
I run Linux at home as my primary operating system. My particular distro uses the X window system, which is pretty common. X is what allows windows to appear on my desktop in various forms, in accordance with the description provided by whatever software owns those windows. X treats the display and the programs drawing to it as a server/client relationship: the server owns the screen, and each program is a client that tells the server what its windows should look like. And it supports forwarding. That is, a program can send the graphical definition of what its window is supposed to look like to any other X server with the proper credentials.
What this means is that I can be on my laptop, or on the computer at work, or over at a friend's house, or at the library (assuming I can install some basic software), and run a free and open source X server. On Windows, that's a little program called Xming (Linux has an X server built in, and Macs ship with X11). Then I establish a connection between that computer and my server through SSH (easily done on Windows with another FOSS app called PuTTY, with X11 forwarding turned on), and I type the name of the program that I want to bring across the Internet. Voila! I am running a program on a remote processor, accessing my remote drives, and working with my own remote data. I have cloud computing.
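The whole recipe is only a couple of commands. Here's a minimal sketch; the hostname, username, and program name are placeholders, and on Windows you'd start Xming first and let PuTTY's X11 forwarding option handle the tunnel instead of using the -X flag directly:

```shell
# From a machine that already has an X server running,
# -X asks ssh to tunnel the X11 protocol back over the connection:
ssh -X myuser@myserver.example.com

# Then, in the remote shell, launch any graphical program;
# its window opens on the local display:
firefox &

# Or do it as a one-shot command:
ssh -X myuser@myserver.example.com firefox
```

Everything the program computes happens on the remote machine; only the drawing instructions cross the wire.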
The best thing? All FOSS. Doesn't cost me a thing. Microsoft is talking about introducing a paid-for service like this called Azure, and while I haven't done extensive research into it, it seems likely that you'll be locked into their software when using it. That is to say, the whole reason "cloud computing" has become a marketing buzzword is that it really does have profit potential. If Microsoft or Google can get you paying for their specialized "cloud" service, then you get stuck using their software, fearing the transition to another format or service.
I'm glad I already operate more or less on the fringe of this stuff.
Oh, and I know what most people will think when they read this. It's too difficult for Joe the Plumber to use. Well, it might be too difficult for Joe to set up himself the first time, but if someone were to set it up properly for him (for a small fee, I'm sure), he could use it just fine.

Tuesday, February 3, 2009

I'm pissed. And not just kinda. Like, really really pissed.
Microsoft has recently put forth a series of advertisements promoting Windows Vista, their latest stodgy, crippled, overpriced operating system, in which they trick innocent people (possibly actors) into thinking that the operating system they're using on a demo computer is the "brand new Windows operating system, Windows Mojave." They've set up a website as well (MojaveExperiment.com) to spread the word. Then comes the big reveal, where they tell their victims that it's not really this "Mojave" thing, but Windows Vista. It's all very carefully planned and misleading. Some evidence of that follows. For the record, I had to watch several videos to gather all of the information here; not all of it is presented in any one place, and I imagine that's part of their need to trick folks into trying the operating system.
One video on the site says that the laptops they're using for the Mojave tests are "brand new, straight out of the box and into the hands of the users." Another video says that the laptops are actually these two Microsoft employees' work computers, and are at least a year old. A third agrees that they're a year old, but claims that the laptops are really the employees' personal computers. It seems the only consistency here is inconsistency. And don't tell me that two computer geeks who have owned computers running Vista for over a year and use them for work haven't modified the software in all that time. Assuming the machines came preinstalled with Vista (and they did), I should hope that they've at least installed Service Pack 1 since then. That's miles better than it used to be. Also, I see a strange lack of desktop icons that should be there on a brand new HP computer, bloated as hell and crammed to the brim with advertisements. The desktop background is different, too. And I'm sure they've installed MS Office and a few other programs, since Vista comes with absolutely no software that anybody could use to do any kind of job, especially a job within Microsoft. So there's no way you're going to get me to believe that these computers have not had any customization over the past full year. Hell, you can barely convince me that they've been running for a full year without needing an OS reinstall.
They also say that they're not "special" laptops, just HP dv2000 series models running Intel Core 2 Duos at 2.2 GHz with 2 GB of RAM and an NVIDIA GeForce 8400 video processor. This laptop cost about $1700 when new (and there aren't any new ones left, since they are, in fact, over a year old). That's a pretty special laptop, especially considering that most affordable laptops still don't even have dual-core or 64-bit processors in them, and usually have 1 GB of RAM or less. I would go as far as to say that the laptop they use in these videos is at least twice as powerful as your average affordable laptop. The video card in that puppy is one model number lower than what I have in my desktop to play some pretty high-end video games. It's not the best thing on the market for desktops, but it's pretty close to the best thing on the market for laptops. There's nothing this laptop shouldn't be able to do. I have run web and file servers on less.
Another video talks about security. If we are to believe the video, Windows Defender makes Vista "60% less likely to be infected by a virus," which, in my personal experience, is untrue. In my personal experience, Windows Defender does nothing at all. The official Microsoft statistics for it show that 22 million pieces of spyware were detected by Defender during its trial run under Windows XP, and that 14 million of those were removed. That's where the 60% statistic comes from (it's actually about 63%). However, these statistics cannot tell us how many pieces of spyware it did not detect at all. Also, we're told that you're 60% less likely to get infected, which is not the truth. What the numbers show is that you are just as likely to get infected; it's just that 63% of what's detected will be removed. A 63% removal ratio is not a good ratio. Not good at all. If I have one piece of spyware on my computer, I want it gone. I don't want to rely on a roughly sixty/forty chance of it actually being removed.
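The arithmetic behind that figure is easy to check for yourself. A quick sketch using the detection and removal counts above:

```shell
detected=22000000   # pieces of spyware Defender detected in the XP trial
removed=14000000    # pieces it actually removed

# Integer percentage removed -- this is the source of the "60%" claim:
echo $(( removed * 100 / detected ))   # prints 63
```

Note that nothing in this calculation says anything about infections prevented; it only measures cleanup after the fact.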
Another demo shows us that programs can be run in a Compatibility Mode. This is a counter to users' complaints that Vista is not compatible with a great deal of software and drivers. However, we all remember XP's Compatibility Mode and how it never worked. In my personal experience, the one in Vista is no better. I am at a loss to find anything other than how-to articles on the subject, so I have no third-party opinion to share with you on Vista's Compatibility Mode. The demo in the video shows the experimenter running what appears to be a Bluetooth application in Compatibility Mode, though we are never shown whether it failed to run in normal mode first. We have no proof from Microsoft that this is a valid test.
The "organization" video comes with a notice from the gentleman on the right-hand side that these computers are "definitely, definitely not top-of-the-line." Then the guy on the left lists the amazing specs of the computer. Then the guy on the right reiterates that it's "definitely not top-of-the-line." They tell you that this computer can be purchased for $650 to $700. This, of course, is not a lie. You can buy that computer at that price because that computer is now a year old and has no warranty left on it. A new one, as I have said, will run you well over $1000. This video emphasizes the Start menu search function, which I'll admit is a pretty cool feature. Too bad they stole it straight off Mac, but that's not the point. The point is they've actually implemented a cool feature. Not to belittle it or anything, but their method of demonstrating it here is somewhat flawed. It works like this --

LEFT clicks on the Start menu, types "calc," and up comes the Calculator at the top of the Start menu.

LEFT: See? You don't even have to type the whole thing!

The trouble with this exchange is that the filename for the calculator application is, in fact, calc.exe. So when you type "calc" into the search box, it's finding calc.exe, not necessarily matching the term "Calculator." The same effect can be achieved by clicking Start -> Run and typing "calc" in Windows 95, 98, 2000, NT, XP, and even Vista. Though this demonstration is flawed, I'll still admit to the usefulness of the Start menu search function. It really does work pretty well, but only if you have the Windows file indexing service on all the time, which can be taxing on the proc and memory of your computer. It's something I ordinarily turn off because, well, I know where I keep my files. I have organization and don't rely on my computer for such.
There are other new features involved, and you can watch all these demos at the website. There are some really cool things that Vista can do, but the long and short of it is that it's way too heavy on your hardware to be considered a useful operating system. For instance, when Vista creates thumbnail images for pictures on your hard drive or thumb drive or whatever storage medium you've chosen, it loads the full-size image into memory, shrinks it, then displays the shrunken version in the Explorer window while keeping the full-size image in memory. This created a problem for me when I was looking through photos taken with a professional camera. Each image occupied 15-20 MB depending on the color range in the picture. Instead of taking the images one at a time, shrinking them, and discarding the originals, it tried to hold several hundred 15-20 MB images in memory at once. 1 GB of memory couldn't hold it all, so it fell back on the page file. Because of the interconnected design of the OS, I had to force-reboot the PC just to get out of that window, and I ended up doubling my computer's memory just to be able to browse my own photos.
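A bit of back-of-the-envelope math shows why this falls over. The counts here are illustrative, in the ranges described above, not exact figures from my session:

```shell
photos=300        # hypothetical number of photos in the folder
mb_each=18        # mid-range size per image, in MB
ram_mb=1024       # 1 GB of RAM

needed=$(( photos * mb_each ))
echo "Holding all originals at once: ${needed} MB"   # 5400 MB
echo "Shortfall beyond RAM: $(( needed - ram_mb )) MB"
```

Processing one image at a time would only ever need about 18 MB of working memory; holding every original simultaneously needs gigabytes, and the difference comes out of your page file.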
My point is that you should go ahead and try Vista if you want, but for God's sake don't pay for it first. Microsoft is running an extremely dishonest advertising campaign to paper over a lot of completely valid criticisms of their software. They're not fixing much, because to do so would involve writing an entirely new OS, and they'll be damned if they'll do that and not ask for another $200. It's bad enough that they're asking for that right now for an OS that is inferior in many ways to a number of free-of-charge operating systems. Just see my link list at the top-right for more information on this type of stuff.

Friday, January 16, 2009

Fernando and I discussed operating system market share the other day, and I brought up something mentioned in this article: that Linux users don't care if the Linux operating system (speaking collectively of all distributions thereof) becomes widespread. To quote Dominic Humphreys, the author of the article:

Linux is not interested in market share. Linux does not have customers. Linux does not have shareholders, or a responsibility to the bottom line. Linux was not created to make money. Linux does not have the goal of being the most popular and widespread OS on the planet.
All the Linux community wants is to create a really good, fully-featured, free operating system. If that results in Linux becoming a hugely popular OS, then that's great. If that results in Linux having the most intuitive, user-friendly interface ever created, then that's great. If that results in Linux becoming the basis of a multi-billion dollar industry, then that's great.
It's great, but it's not the point. The point is to make Linux the best OS that the community is capable of making. Not for other people: For itself. The oh-so-common threats of "Linux will never take over the desktop unless it does such-and-such" are simply irrelevant: The Linux community isn't trying to take over the desktop. They really don't care if it gets good enough to make it onto your desktop, so long as it stays good enough to remain on theirs. The highly-vocal MS-haters, pro-Linux zealots, and money-making FOSS purveyors might be loud, but they're still minorities.

There's some truth in what he says, but Nando, like always, made me see it from a different perspective. Truth be told, ask any one of us Linux users why we use Linux instead of Windows, and we'll get all high and mighty. What? Me use Windows? Use a stodgy piece of software with almost no security where I have to pay hundreds of dollars every few years in order to keep my computer from being a useless pile of garbage and end up buying a new computer every time I do just because Microsoft's operating system can't run on medium-end hardware? Pay oodles of money for expensive software that matches no standard and for which I have to pay extra money for basic support? No way! I'm tired of dealing with it!
But it's this last sentiment that proves Mr. Humphreys' statements inaccurate. I'm not trying to belittle the article or call it wrong or anything. In fact, this article is usually the first thing I give to somebody who shows any interest in Linux, because it's great preparation for someone about to make the switch. It's just that we like to complain about Microsoft and how they're all over the place, and how those of us who are the official family and neighborhood computer technicians end up supporting broken software constantly.
The hypocrisy here is that we'll complain about the forced ubiquity of Windows (and enough's been said about the Microsoft tax, so I won't go into that rant here), and we'll make the argument that Linux is the ultimate fix for Windows, but (at least in Dominic Humphreys' world) we supposedly don't care about Linux's market share.
If we do not care about market share, then we are content to sit in the background screaming about the Linux solution and doing nothing to enact it while Microsoft continues to write shitty software and install it across the globe, laughing all the way to the bank. And the reason they're laughing?
Because they didn't even have to try.
Competition breeds innovation. The closest thing to competition that Microsoft has is Apple, but Apple does not count as true competition. Apple does not sell a run-anywhere-on-anything, just-add-drivers operating system. They sell hardware that looks pretty and happens to have an operating system on it. And if that operating system happens to be so easy to use that my grandma can use it, then let Apple sell five-hundred-dollar hardware to my grandma for two thousand dollars. But Mac OS X won't run on anything but the hardware it comes on. Windows will run on a lot of different hardware, and is therefore a fundamentally different product, a product much like Linux. Linux is perhaps the only hope for creating competition with Microsoft, and in many ways, Linux has the advantage.
Linux is usually free of cost. It doesn't have to be, not by its own philosophy, but you certainly don't have to look far to find a free version. If it's something for your average Joe to use, you really only need to look at something like Ubuntu or one of its derivatives like Linux Mint. So it's not just a matter of how much cheaper it is than Windows, it's a matter of it not costing anything at all.
Basic support is free. Forums are all over the Internet. IRC chat rooms are available. It's a community effort and within the community is where you'll find the best help. You're getting responses from the same people who wrote the software in the first place, not some schmoe in India whose name is most definitely not Steve, plus you're not getting charged a single penny for any of it.
Desktop effects, security, and tons of free software make it an appealing choice. Every time I show somebody my desktop cube or my rain of fire when windows close or my ability to remotely control my computer in extremely granular detail, they are blown away. When I show them that my laptop can hop onto the closest wireless network in two clicks instead of countless menus, from a single drop-down box in my tray instead of several enormous windows, they say, "I wish it were that easy on my computer." My computer also doesn't slow down over time, fragment files, become unstable, crash, or catch viruses or spyware, and I never have to perform regular maintenance to be able to make these claims. People like that. They like that they don't have to work for their computer.
So we should care about market share, because if we do, we can force Microsoft's hand. I would be stunned, but not offended, if Microsoft produced a quality operating system, or a browser that doesn't fail miserably when trying to conform to web standards, or a mail client that uses less than 1.5 gigs of hard drive space to store 137 megs of email. In fact, I would be thrilled were that the case. But Microsoft has no real competition, so they have no real reason to do these things. If Microsoft lost significant profit due to the disintegration of their market share, that would mean that enough people are using Linux to make any kind of FUD marketing Microsoft could do useless, and they would be forced into writing better software. I might be tempted to use a Microsoft operating system primarily if it could provide a significant benefit over my current Linux configuration. I'm one of many customers Microsoft could attempt to win back.
Linux also shows potential. It's been around since 1991, which is pretty impressive when you consider how small its user base is; it means that user base is strong and emphatic. The market share for desktop (non-server) versions of Linux has doubled in the past year, which means it's growing. Granted, it's grown from about one and a half percent to about three percent, but it's expected to approach ten percent soon. This is largely because of Linux's ability to run quickly on low-end hardware. Linux has become a popular choice for those "netbook" things, which are insanely popular. Already, Microsoft has been forced to react by making the upcoming Windows 7 run on lighter hardware. Linux still has the advantage here, though, because the customer buying the netbook (typically designed to be inexpensive) isn't paying for the operating system. I'd say that Windows has the advantage of popular software, except that it doesn't. My suspicion is that Windows 7 will have just as many operational and backward-compatibility problems as Vista had, while Linux is, and will remain, less vulnerable to those.
Contrary to popular Linux belief, Linux users do need to help popularize the operating system if they care about the future of technology. Like I said before, competition begets innovation, and Microsoft is apparently already feeling the heat.