Posted
by
kdawson
on Tuesday January 19, 2010 @11:03PM
from the of-by-and-for dept.

crimperman writes "The Open-PC project has announced that its first PC will be available at the end of February for €359. They claim the mini-ITX desktop machine is energy efficient, consumer ready, easy to upgrade, and — significantly — uses only hardware that has free software drivers available. As you'd expect, it comes with GNU/Linux running KDE (a €10 donation to the KDE project is included in the price). Interestingly, all the key decisions on design, pricing, etc. have been made by the community via online polls. The spec of the machine is pretty reasonable for the price: Atom 1.6GHz dual-core processor, 3GB RAM, 160GB HDD, Intel 950 graphics."

Or run Linux natively. I have a slightly dated 24" iMac with an ATI Radeon GPU. I ran OS X for a few days and then got frustrated with the limited and over-intrusive UI, and with the tediousness of dealing with the various software ports projects. (The latter aren't awful, and I don't mean to disparage the people working on them, but it's nothing like just having yum or apt-get already there, waiting to install thousands of excellent free packages.)

So I installed rEFIt [sourceforge.net] and shrank OS X down to a tiny partition I never boot into. Instead, I run Fedora 12 [fedoraproject.org] with all open source / free software drivers, including sound and 3D-accelerated video. (I think maybe the webcam doesn't work, but I don't really care.) Definitely the nicest Linux workstation I've ever had.

So for €190 more, you get OS X; a much faster, 64-bit, virtualisation-capable CPU; and a real GPU with dual-display support, but you lose 1GB of RAM. I see no mention of I/O on the Open-PC, either - the Mac Mini has USB ports for days, plus FireWire 800.

Nice idea, and I keep wondering why Ubuntu doesn't do this as an "it's up to you" option. Either take the normal distro and take your chances on whatever hardware you've got, or (and this is something they could make money at) buy from a small range of competitively priced machines - netbook, notebook, desktop, server - that they sell themselves, and that their main devs test exhaustively, at least for the long-term releases, so that everything "just works", 100% guaranteed, along with recommended peripherals.

Sort of like the Apple model of matched software and hardware, *but* with the distinction that there are no hissy fits from the company about using other hardware, either. Buy their gear with their software preinstalled, and you get priority warranty and usability support. Buy or build your own choice of hardware, and you get the support that exists today, which is hit or miss: go lurk on the forums if you have any problems.

This raises an interesting question - whether a PC like this, which purports to use hardware that is fully documented, is sufficiently "free" for every possible scenario. A "more free" approach would be to use "open source hardware" (insofar as is legally possible; I believe things like GPS hardware have disclosure limits imposed by the legal system). By "open source", I'm referring to hardware that includes not only API documentation but hardware descriptions usable for chip production - things like OpenSPARC and the OpenGraphics card. I doubt there are enough such pieces to form a fully functional PC (particularly when it comes to things like monitors), but for the sake of argument let's assume there are.

In theory, of course, the fewer restrictions on any IP related to making the computer work the better, but in practice modern PC hardware is not something that can be realistically produced (at least today) by any hobbyist. The physical hardware also doesn't benefit from the "cheap copy" properties of software, so the in-depth knowledge of how to make the hardware is hard to apply even when present. Also, such designs are (to my knowledge without exception, at least in the PC hardware arena) well behind the maximally performing hardware developed in non-open contexts. So the price to pay for full hardware knowledge is quite steep in terms of performance. The only real end-user applicable argument is that full hardware knowledge means the potential for better software support.

So a question for those in the open hardware community - is there potential for driver development using information of the kind available from OpenSPARC and OpenGraphics to produce better-performing drivers than can be achieved with the information (say) considered sufficient to permit inclusion of hardware in a product like the one in this article? If not, are there any other benefits (aside from the admittedly non-trivial one of being able to learn anything you want to about your computer) to an "open source" hardware platform?

I have to wonder why the 950 rather than something a little newer... My laptop's a year old and has a GMA 4500MHD, which was equivalent to a low-end nVidia or ATI card from a year prior, and can do H.264 hardware decoding.

All the good folk who say they can "get a better machine, for less, and it's even got Windoze installed!" just totally miss the point.

There are many people out there like me who'd happily pay EXTRA to get a machine that is completely free of Micro$oft or Apple, and doesn't count as a sale for either of them. I will not contribute to either of these organisations in any way.

Only on /. would an above comment [slashdot.org] be marked Flamebait (when it is essentially correct about one of the major problems in Linux) while this comment can slide.

As far as difficulty and problems go, Windows has been at the bottom of the list. Unless the hardware or CD/DVD is damaged, it works fine. Windows 7 was the fastest, cleanest install I've ever done on a friend's computer, and it worked great compared to stock installs of older Windows versions.

I don't have a spare rig to try to learn how to toy around with any of the major flavors of Linux, but even assuming the install is as easy as installing Windows, there are the issues we all know about. This particular model of video card has glitchy drivers, this particular printer doesn't work at all, etc. At least you don't see that happen as often with Windows.

I think you should try installing Windows more often, then. It is not exactly "click-click-done" either. After you install the "Operative System", you have to install all the drivers (IF they exist at all; I remind you that a lot of 5-year-old hardware doesn't have drivers for Vista).

The FIRST thing Linux guys say when you mention you are having problems with a distro is "You should install the latest release." The EXACT same thing is true of Windows. Windows 7 IS click-click-done. In fact, I'll step through it here for you:

That's it. On almost all hardware built in the past 5 years. I used to manage corporate deployments of Windows, so I'm the last person to say that Windows is easy to deploy/configure (it wasn't), but Windows 7 is a massive leap forward, even over Vista.

And barring Office, Windows 7 ships with everything the average user needs (Internet/Photos/Media/Burning).

I'm all for people promoting their preferences of OS, but let's at least start with a level playing field, shall we?

I personally think you are mixing up task friendly with user friendly, which are two completely different things. Linux is task friendly: it doesn't care whether you use a GUI or a CLI, it doesn't care if you string together huge numbers of commands from separate programs, or whether the desktop is here, there, or even exists at all. Just like the old batch-running mainframes, Linux will happily run whatever. Which makes sense, as it was based on Unix, the classic big-iron OS.
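The "stringing together commands" style the parent is describing is the classic Unix pipeline: each small tool does one job, and the shell wires them into a single task. A trivial sketch (the sample words are invented purely for illustration):

```shell
# Compose small, single-purpose tools into one task:
# find the most frequent word in some sample input.
printf 'apple\nbanana\napple\ncherry\n' |
  sort |        # group identical lines together
  uniq -c |     # count each distinct line
  sort -rn |    # biggest count first
  head -n 1     # keep only the winner; prints "2 apple"
```

None of the individual tools knows anything about word frequencies; the task emerges entirely from how they are glued together, which is exactly the "task friendly" point.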

Windows and Mac OS X are user friendly, in that you can spend your entire life never seeing a CLI; the GUI is designed for the "hunt and peck" type of exploration, with lots of icons and wizards and menus. Linux guys don't care for it because it is NOT designed to be task friendly. With PowerShell, Windows is getting better in that regard, but in the end, expecting Windows and OS X to be task friendly is like expecting a laptop to behave like a mainframe: they are simply two different beasts.

In the end it comes down to taste and experience. The problem, IMHO, that Linux guys have with adoption is that because they like task friendly and find it a more powerful way to work, they assume that others will too, and that simply isn't the case. The average Windows user, whom I come into contact with in my little shop every day, won't even go near Control Panel because they find it "too powerful" and are scared they will "break something". To that type of person, which is the vast majority of modern computer users, ANY CLI is simply too much. Task friendly will simply never work for them, because they don't think or behave in a task-oriented manner.

But since Linux is written BY geeks and FOR geeks, with a user base holding far more IT degrees than average, I just don't foresee Linux ever changing from a task-friendly to a user-friendly design. It would simply change too much of the underpinnings, and Linux users like their CLIs too much to give them up. Which is fine (hammers and screwdrivers and all that), but don't think that because YOU like task-oriented work and are comfortable in a task-oriented mindset, you can convert the majority away from a user-oriented mindset, because it just ain't happening.

I've got news for all of you: we like our OSes because they're simple and functional,

That's news to me. I kicked Windows at home because I started getting migraines from clenching my jaw too hard every time I worked on my computer for extended periods. I clench my jaw when I get frustrated at things not working. So far, I've learned an important lesson in patience with Ubuntu, but I've yet to get the literal headaches that Microsoft gave me. Incidentally, my Xbox is starting to move towards that jaw-clenching experience as well.