Posted
by
timothy
on Saturday September 05, 2009 @03:26PM
from the only-barely-worth-refuting dept.

DesiVideoGamer writes "Over at Overclock.net, a user has posted screenshots from Microsoft's 'ExpertZone' training course entitled 'Linux vs. Windows 7.' This course is available to BestBuy employees and will make them eligible for a $10 copy of Windows 7 upon completion." The screenshots linked show at least some creative interpretations of the state of Linux vs. Windows on a wide range of things, from media playback and video conferencing to ease of updates to (of all things) keeping your PCs "safer." Most of the claims, though, aren't concrete enough to be perfectly refuted. Writes DesiVideoGamer, "I think I now know why, when I enter BestBuy, the employees say the odd lies that they do."

No profit in free. What's the cheapest way of getting Windows 7? Buy a new computer. Who sells new computers? BestBuy. How do you get Linux? You download it for free and install it on your existing computer. Who doesn't sell you a new computer? BestBuy.

Who is to say which OS is safer, for example? It entirely depends on what metric you use to measure it.

Like, say, which is more prone to being part of a trojan-infected zombie botnet scamming info for identity fraud and/or spreading spam?

I don't blame Microsoft for selling their products. That is what a software company SHOULD do.

If they can't sell their product without bullshitting (or at least keeping it to a tasteful minimum), isn't that a condemnation of their own product?

The only reason these are "stories" is because people [incorrectly] feel Linux is a community effort...

Actually, they are stories because this is an attempt to bullshit people, and people hate being bullshitted. People on slashdot especially hate seeing people who might not know any better being bullshitted by a cynical, self-serving marketing group. I don't mean to absolve other tech companies (most, if not all, do the same or similar), but Microsoft has long occupied a special place in tech history as one of the most blatant bullshit-marketing organizations ever. I personally have been involved in tech distribution for about 15 years, and no other vendor comes close to their level of arrogance or deceit. I've been to an RSA conference where Microsoft astroturfed a whole session that was promoted as a balanced and impartial hack-off, but instead was a scripted Windows lovefest. I've seen Microsoft flat-out lie to people's faces. I've seen them ship free product to people who didn't order it to inflate their "install base" of a particular item.

These are stories because in an industry saturated with kool-aid and known for marketing gross exaggerations and lies, Microsoft stands out as the worst.

That's pretty clear. I've noticed it on Reddit a good bit too. At least there though, I don't think the administration has been bought out. After that Blizzard fanboy piece they ran the other week, Slashdot has me more than a little concerned.

"Wait until you have a problem with a library when you're making/installing an app, and it's a library that generally needs to be an older version (why wasn't it backwards compatible? who knows?), which is impossible because you have other apps that require the newer version."

I'm not about to wait until hell freezes over. You see, in Linux, it is entirely common to have multiple different versions of shared libraries, and they all coexist just fine. Every single point you made in every post in this thread was blatantly wrong, and shows that you are either a complete moron, haven't tried to use a decent Linux distribution in years, or are a straight M$ shill. Since you are offering up blatant lies about both Linux and Windows, and they all favor the virusOS, I'm betting on the last.

It's a nice idea, and a bit better than floating rpms or debs, but I don't know if I'm entirely sold on it. It's been around for a few years and I haven't seen any of the other distros picking it up, so that might say something about it. There is apturl [ubuntu.com], but I've never used it or even had the occasion to.

Is it really any different clicking on an apt-url from some unfamiliar website?

Yes.

An apt-url is a URL to a package that is already in the repository, which means at least there's some minimal assurance that someone associated with Ubuntu or Canonical has looked at it, and some strong assurance that it hasn't been modified since then.

What about legitimate authors who didn't jump through the hoops to get their software into all the various repositories and packaging systems? Should they not be trusted simply because they didn't sign up with Canonical?

Pretty much. Ubuntu is community-maintained, and it's really not that difficult to at least get into the "universe" repository.

Of course, unlike the iPhone, it's always possible for the user to go around this, and I agree, it could be made easier than it is -- but it is pretty easy. I gave an example somewhere here involving Chrome -- it's a deb which sets up a repository as it's installed.

Now, you could make the case that downloading a random deb from the Internet is just as bad as downloading a random exe. But the point is, the capability is there if it's needed -- and most often, it makes much more sense to use something from an already-trusted repository.

All those -dev packages? Those are libraries that need to be on your Linux box in order to make the program. If, for some reason, the program you are trying to install cannot use a newer version of a library and you need that library for other software, you're quite thoroughly fucked.

Well, no, it's just gotten harder. Generally, you can pass an argument to ./configure to tell it where to find headers, and there can easily be multiple versions of a library installed -- that's why the .so files have version numbers on them.

I also can't remember the last time that happened to me.

MSI and DMG files have dependencies built in.

This has two major disadvantages:

First, it's just extra space, bandwidth, RAM, and cache. That last one is still not cheap.

And second, it makes it difficult to globally patch a vulnerability in such a library. Example: OpenSSL is likely to be used in a number of packages. The only way to patch it in all packages on, say, OS X, is to not bundle it and pray Apple includes it in the OS -- then it'll be handled by Software Update.
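The single-copy advantage is visible with ldd, which lists the shared libraries a binary resolves at load time. Patch the one on-disk copy and every program listed picks up the fix on its next launch (output paths vary by distro; this is illustrative):

```shell
# Show the shared libraries a binary is dynamically linked against.
# Each entry points at one system-wide copy, e.g.:
#   libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (...)
ldd /bin/ls
```

A bundled copy, by contrast, is invisible to this mechanism: every app carrying its own OpenSSL has to be rebuilt and reshipped separately when a hole is found.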

Updates are a single extension file (with MSI files anyway).

What does "extension file" mean, in this case?

And where is the option for me to tell my system to update all MSI-based apps I have installed?

I'm fairly sure this does not exist for OS X apps.

What would you suggest for me now? Got any more commands that will magically fix a retarded install method?

Contact the maintainer?

Your complaint here is that it doesn't work with newer libraries. But this is, again, similar to expecting a Win98 app to work in Windows 7. It often will, but sometimes it doesn't. Even when Microsoft pours blood, sweat, and tears into backwards compatibility, it's often working around a bug in the original app.

It's also worth mentioning (again) that the blame for this falls squarely on the shoulders of the app developer. Just as you would scream bloody murder if a Windows app required you to download Visual Studio Express and compile from source -- but you wouldn't be blaming Microsoft.

Of course it's not The One Thing(TM) holding Linux back, you'd have to be an idiot to think that. But you've got to have your blinders on pretty tight not to see that it would definitely improve the situation for Linux on the desktop.

I agree that the situation could be improved.

However, the methods most people suggest amount to "Copy Windows" or "Copy OS X", which would be a serious regression in many ways.

No piece of hardware ever comes with Linux drivers. Maybe a few barely-supported things a decade ago, but not any recent stuff.

This is because, unlike Windows, Linux doesn't expect hardware manufacturers to make their own shitty drivers that crash all the time because they're a hardware company and don't know how to write software.

Something like 90% of Windows blue screens post-Windows 98 are because of third-party hardware drivers. XP onwards stopped applications from being able to crash Windows, but there's not a damn thing it can do about shitty drivers. Now they have this 'certification' thing that works somewhat, but hardware companies are not software companies, and still cannot write good software.

This is why Linux drivers come with the kernel, and why kernel developers write them. Of course, the company is free to write their own and submit it to the kernel devs, but that's the distribution point, not some driver CD.

There is nothing stopping hardware manufacturers from saying, in the requirements, 'Linux kernel 2.6.4 or greater', and many, of course, actually do.

In fact, Linux is basically the only OS where you can be sure hardware that worked on a version of it in 2000 still works on a modern version, which makes your entire premise absurdly idiotic. Linux may sometimes suffer by not having the absolutely newest hardware support, but it has about 10x the backwards compatibility that Windows has. The devices that used to be supported under Linux but are not anymore are probably countable on two hands, whereas there's plenty of XP stuff out there that will never get signed Vista drivers, just like there was plenty of stuff under 98 that never got XP drivers.

This is because the company is in charge of updating them, and they don't give a flying fuck about supporting hardware they don't sell anymore. In fact, they'd rather the old hardware didn't work, because they've got some new stuff to sell you. Whereas with Linux, the kernel people are in charge of keeping the driver updated, and hardware only stops working if some kernel API changes enough to break it and no one bothers to fix it, so it gets removed. (Recently, as it redid its entire IDE/PATA/SATA/SCSI support into one unified driver, Linux lost the ability to read MFM hard drives. Aka, pre-IDE. No one seemed to mind.)

It's somewhat hilarious to hear anyone talk about a 'kernel ABI' on Linux. Man, the Windows kernel ABI and API changes every release, making all hardware manufacturers update, or not, their drivers. Whereas 99% of Linux drivers are already in the kernel, and just change along with it and keep working. It's only the companies that insist on releasing their own drivers that have problems.

Now, WRT software ABI, there's a valid concern. Or, at least, there was, a long time ago. Nowadays it's trivially easy to release commercial software for Linux that works fine. You put an install script on a CD and have it either use the package manager (either dpkg or rpm; you can include both on the CD and use whichever one the OS has), or you don't bother with that and just put everything in its own /opt/ directory. Then you stick icons in the right place for Gnome and KDE to pick them up.

If the libraries it needs aren't found, you can install your own, either as compat libs for the entire OS, or just in your own directory.

Anyone who can't package software for Linux and have it work on any full-fledged Linux distro made in the last five years shouldn't be writing software.
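The install-script approach described above can be sketched in a few lines. All file, package, and directory names here are hypothetical; the point is just detecting the native package system and falling back to a self-contained /opt install:

```shell
#!/bin/sh
# Sketch of a CD install script that serves both deb- and rpm-based
# distros. Package names (myapp_*) are made up for illustration.
pick_pkg_cmd() {
    if command -v dpkg >/dev/null 2>&1; then
        echo "dpkg -i myapp_1.0_amd64.deb"
    elif command -v rpm >/dev/null 2>&1; then
        echo "rpm -i myapp-1.0.x86_64.rpm"
    else
        # No native package manager found: self-contained /opt fallback.
        echo "cp -r myapp /opt/"
    fi
}
# A real script would run this; here we just show which command applies.
pick_pkg_cmd
```

From there, the distro's own machinery handles uninstall, menu entries, and updates -- the stuff a Windows installer would have to ship stub programs for.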

Linux ABI is stable enough that Quake 3 runs on any current x86 Linux box

Quake 3 is a game. It is an application. It is not affected by the unstable Linux ABI.

The unstable Linux ABI means that if a hardware company is going to make a driver for Linux, then it has to update it frequently, and that new drivers may not work in old kernels, and old drivers may not work in new kernels. It is a serious problem, and is partially responsible for generally poor hardware support from device manufacturers.

http://ixnotes.wordpress.com/2009/09/06/microsoft-propaganda-handed-out-to-staples-employees/ [wordpress.com]
I thought while we're talking about it I'd post these images of Microsoft's propaganda they've been distributing to Staples employees.
Numerous lies, like claiming greater compatibility than GNU/Linux, when most older hardware won't work with MS Windows Vista. GNU/Linux is compatible with more hardware than any operating system in history. It may not work with some of the latest and greatest, but for the most part it works better. I don't spend 3 hours fiddling with printer drivers. I plug the printer in, and it just appears as an option in whatever program I need to print from.
The learning curve for GNU/Linux is generally not as high as it is for MS Windows Vista. Contrary to what they claim, MS Vista and MS Office 2007 (the software customers would buy along with Vista) are more cumbersome, slower, have a reduced feature set, lack important features like PDF export, and so on.
GNU/Linux generally has better support than MS Windows. GNU/Linux supports hardware out of the box, whereas with MS Windows users have to install lots of bloated software and drivers, and waste time figuring out how to use it all. GNU/Linux, on the other hand, can generally be had without such support headaches. Once you're introduced to shutting down, the applications menu, saving in different formats, and exporting to PDF, it is just simpler.
Getting devices to work in MS Windows can require modification and/or troubleshooting. Hardware rarely works out of the box.
Microsoft wants you to believe that GNU/Linux netbooks have a higher return rate. The fact is that some manufacturers screwed up their GNU/Linux introductions to customers, and their particular return rates were higher. Overall, GNU/Linux is on par with MS Windows.

Sorry, but I have to laugh. You really should take off those glasses with one lens tinted rosy and the other completely blacked out. Real Linux software doesn't "just work", especially hardware.

The subset of 'software' that is 'hardware' is pretty small. In fact, no software is hardware.

Actual software, however, works fine. Companies don't have any problems producing commercial software that will work on any distribution.

And as for actual hardware (Not software that is hardware, whatever that is.), it works just fine. As I said, Linux sometimes doesn't support the newest hardware. You have to check before you buy.

OTOH, it supports a heck of a lot older hardware, stuff there will never be a Vista or Windows 7 or any 64-bit Windows driver for. And Linux will eventually get drivers for new stuff, whereas those OSes won't for the old stuff. (As the hardware manufacturers have no incentive to make drivers for hardware they don't sell anymore.)

Again, pretty funny. Sorry, but even open source companies can't get stuff to install consistently across even the most popular versions of Linux without resorting to custom install work.

Erm...did you just argue that companies need to use installers to install things? Also, Linux companies need to resort to CD burners to burn CDs! Why didn't I mention that?!

Um, yeah. You need to write an 'installer', or set up a third-party one, to, you know, install. As opposed to Windows, where you need to, um, do the same thing. Hrm.

A lot of the stuff you'd have to set up in the installer for a Windows program, you simply set up in the package, and it's all handled as part of the OS, instead of the installer installing stub programs to, for example, uninstall the program. So all you really need is a fancy dialog box saying 'Install blah? Yes/No' and, if Yes, run a single command based on the distribution.

I.e., instead of the myriad third-party installers on Windows that each handle a bunch of stuff automatically, Linux distros come with all that built in, for free, and with more features, like automatic updates.

Granted, there are two competing ones of those, the 'rpm' system and the 'deb' system, but it is trivially easy to convert a package back and forth, or make it for both, or, heck, there are tools that let you directly install one sort of package under the other system.

Of course, who knows what's happening in your imaginary universe, where the instructions to install Linux software probably start with 'mount your CD', which you probably think Linux users still do.

I can't recall where I found this (in a forum somewhere), but this is the entirety of the "script" I use to launch Skype (it would be easy enough from the command line, but I like a desktop icon to click):

#!/bin/sh
# Preload the v4l1 compatibility shim so Skype (which only speaks V4L1)
# can talk to a V4L2 webcam driver.
LD_PRELOAD=/usr/lib/libv4l/v4l1compat.so skype

This is the only way I can get Skype to work right, and it does the job for my cheapo EZonics III webcam.

Your point seems to be that you can't be sure particular hardware will work with Linux. I haven't used Windows for several years, so I can't comment on hardware issues with XP/Vista/W7, but I do know that on the 5 laptop/desktop computers in my household, every one "just works" with Ubuntu. Not a single hardware issue: not with a just-released printer/scanner from a supplier not known for Linux support; not with the no-name PCMCIA wifi card one older laptop uses; nor with any of the built-in wifi adaptors.

I don't have access to unbiased datasets on this issue (I suspect that no-one does), but from my personal experience, this is a non-issue.

Sure they do. When the driver is distributed separately and not integrated into the kernel, it's a PAIN to keep the driver working when 'yum updating' your kernel: you have to go grab the kernel source again and recompile the driver.

It takes 6+ hours of extra work, just because you can't simply use the existing driver binary with the new kernel.

Moreover, most major hardware vendors aren't willing to distribute drivers as open source, for various reasons: mainly support concerns, third-party licensed code, OEM'ed parts in the hardware, and the proprietary hardware details the source would reveal.

Many of them deliver binary drivers that only work with specific kernels.

Or they deliver a driver, and you have to compile a special 'wrapper' kernel module to load the driver.

Again, you've got to spend the 3-6 hours of extra work every time you 'yum update' your kernel: manually go get the kernel sources, prepare a build environment, and compile the module against the new kernel, before it will even be willing to load the driver.

Btw, all this reflects extremely negatively on the Linux kernel and strongly discourages hardware vendors from trying to support it.

I like how it is currently. Once a driver goes in the kernel, it remains there. As long as anybody is interested, it remains maintained. My cheap, ancient Quickcam Pro remains supported in the latest Linux kernel. In comparison, support is nonexistent in 64-bit Windows versions, and there's nothing to do but buy a new webcam.

This is again why I prefer the Linux way. In Linux the user's interest drives the development. In Windows, it's the manufacturer, who has an economic incentive not to support hardware from 10 years ago, so that you need to buy a new product.

Although the first sentence is true, the last is certainly not true - even if the more expensive product has a lower markup in proportion to cost, it may still have a higher margin.

It may have more gross profit dollars, but the margin will be lower. Margin is a ratio. Specifically, it is gross profits divided by revenue. The gross profit dollars on a large item may be larger than the gross profit dollars on a small item with a larger markup, but the margin itself will be lower.

I might make more GP dollars on the big-ticket items, but for every dollar I invest in big-ticket inventory, I make less profit back as a percentage of my investment. Items with lower cost but higher markup (like cables and service plans) help bring the overall margin% on the deal up.
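The arithmetic, with made-up numbers (a $500-cost laptop sold at $550, a $2-cost cable sold at $20):

```shell
# Margin = gross profit / revenue. The laptop earns more dollars ($50
# vs. $18) but a far worse margin than the cable.
awk 'BEGIN {
    printf "laptop margin: %.1f%%\n", (550-500)/550*100
    printf "cable margin:  %.1f%%\n", (20-2)/20*100
}'
# laptop margin: 9.1%
# cable margin:  90.0%
```

Which is exactly why the add-ons, not the big-ticket box, carry the deal.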

If Company A needs overall margins of 6% to survive, but competition on its big-ticket items drives margins down below that, the only way they can afford to justify selling that item and stay in business is if the sale of the big-ticket item drags along enough add-on sales to bring the overall margin to an acceptable level.

In computer distribution, I've often seen Windows sold at or even below cost, with back-end rebates making the disty partner almost whole at the end. The Windows 95 launch had some shenanigans like that.

That is only true if you ignore the product as a whole and analyze each individual component. Taken as a whole, if the cost of providing the laptop or netbook's operating system is $0 instead of $99, then the inverse is true.

We aren't buying components off the shelf and assembling the devices at home. We shouldn't look at the costs that way when it's offered as a complete package.