Open source 3D graphics drivers for ATI R600 graphics cards have been submitted to the drm-next tree for possible inclusion in the Linux kernel 2.6.32. "David Airlie has pushed a horde of new code into his drm-next Git tree, which is what will get pulled into the Linux 2.6.32 kernel once the merge window is open. Most prominently, this new DRM code brings support for kernel mode-setting with R600 class hardware as well as 3D support."
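For anyone wanting to check whether their kernel already carries the new code, here is a rough sketch (assuming the upstream module name `radeon`; the exact parameter name may differ between kernel versions):

```shell
# Print the running kernel version; the new DRM code is slated for
# 2.6.32, so anything older won't have it.
uname -r

# If the radeon module is available, list its parameters and look for
# the kernel mode-setting switch (expected to be called "modeset").
modinfo -p radeon 2>/dev/null | grep -i modeset \
    || echo "no radeon module with a modeset parameter found here"
```

KMS could then be switched on at boot by adding `radeon.modeset=1` to the kernel command line.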

The R6xx driver also covers the newer R7xx cards, so they are also supported.

Some things are still buggy (there are some screen corruption bugs, and the driver isn't optimised for speed yet), but hopefully those will be ironed out by the release of Fedora 12 (and maybe even Ubuntu 9.10, if they also pick up the patches). That would mean that, up until the next generation of graphics cards from AMD, all of them should be supported to some degree.

Once this is stable, the major laggard will be nVidia. I wonder if they will decide to take part, or even if these devs can help reverse engineer the drivers for nVidia.

PS: from what I have read, most of the drivers were written using provided documentation, not reverse engineered (some of the r300 work may have been RE'd, though), so kudos to AMD too for playing ball.


The new code (just being released now) for R600 (and R700 apparently) has been written using provided documentation from ATI. Kudos indeed.

The earlier drivers, for R500 and older GPUs, were reverse-engineered. These are for an older architecture of GPU, and those cards are no longer supported by ATI's proprietary driver.

Correct. And they did so in Feb 2008, almost a full year before they released the R6xx/R7xx documentation.

There was a pre-existing (reverse-engineered) open source driver that already achieved basic 3D support for R500 and earlier. I think some effort went into improving that after the R500 documentation was released by ATI.

However, about a year later (i.e. earlier this year), ATI released the R6xx/R7xx documentation. The architecture was significantly different.

AFAIK this required a re-write of the driver. I think the project was even called radeon-rewrite (Google for it).

ATI actually did release some specs (I think it was under NDA only) way back in the R200 days (Radeon 8xxx). That card was actually pretty well supported.

When the 9700 came out, there were no more specs, and the architecture was different enough that a new r300 driver was created. Some of the code there came from the old r200 driver, and the rest had to be reverse engineered. Later, r400 and r500 support was added by building on top of the r300 driver with reverse-engineered knowledge. Good r500 support in particular didn't arrive for a long time.

AMD did eventually drop some r500 docs as you noted, although it was much less than what we got for the new cards. The developers did use it to quickly finish up the half-done state of the existing r500 support, and it now works pretty well. I think AMD has hinted at providing more docs for other old cards as well, but it's very low priority and probably won't happen until the current cards are working well.

Meanwhile, r600/700 came out with a completely new architecture and the driver was started from scratch. This is the driver that is just now becoming ready for use, and although it's still rough it's working remarkably well.

The radeon-rewrite project wasn't a separate driver itself, but a port of the existing codebases to take advantage of the new KMS, kernel memory manager (TTM) and DRI2 systems. The rewrite added wrappers so that the same codebase could support both the new DRI2 features and the old classic DRI environment. As a result, the new code that is landing now supports both, just like the r300-r500 driver does. This is the change that screwed up the Intel drivers so badly, but radeon seems to have done much better, most likely because they didn't release it immediately while it still sucked, and because Intel worked out a bunch of the kinks for them.

The current code only supports OpenGL 1.4, although 1.5 support should come soon (it's already in the r300-r500 driver), along with Xv video acceleration, and it's quite snappy for desktop use. Things like Google Earth will run great on this driver. The newest games won't (although I think you're pretty silly for trying to run those in Linux anyway).

So OpenGL 3 / GLSL and fast 3D support will only come with the Gallium drivers, which are still a ways off. I don't think anyone honestly knows when they will be ready. I've heard that the r300-r500 Gallium driver is in pretty decent shape and might come out within a few months, at least as an alpha-type release. I'm thinking we could have a decent 3D driver for the newer cards by Christmas 2010, but looking that far ahead there's just no way to know for certain what will happen.

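For those wanting to verify what a given driver actually exposes, `glxinfo` (from the mesa-utils package on most distributions) reports the advertised OpenGL version and whether direct rendering is active. A sketch, assuming it is installed and an X display is available:

```shell
# Pick out the interesting lines: direct rendering status, and the
# OpenGL renderer/version strings the driver advertises.
glxinfo 2>/dev/null | grep -E 'direct rendering|OpenGL (renderer|version)' \
    || echo "glxinfo unavailable (install mesa-utils) or no X display"
```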

Hopefully this will help lead to nice, solid ATI/AMD video drivers. Personally, I'm getting sick of buggy drivers (both for ATI/AMD and nvidia cards). I long for the day that I'm able to buy a video card, plug it in, and have it simply work.

I am eagerly awaiting this batch of drivers. I am currently running the open source drivers because ATI's proprietary ones cause hard locks and all kinds of buggy behavior.

Here's hoping that these drivers finally deliver decent and stable 3D acceleration in KWin and Compiz.

I have been moving away from Nvidia and purchasing ATI cards for all of the computers I build, because I bought into ATI's promise of delivering enough documentation for real open source drivers to emerge, and it looks like that is finally materializing.

Intel's current line of cards is simply too weak, and the drivers for the most recent ones are not completely open. This gives ATI a great opportunity to really shine. I just wish they would focus all internal development on one true open source driver and renounce their proprietary fglrx crap.

Kernel Mode Setting is important because having that level of support in the kernel apparently means that the rest of the X graphics stack can be written to run in userland (i.e. it can run as a normal user-mode program and no longer requires root privileges). This is a great improvement from a security perspective.
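As a rough illustration, whether KMS is active on a given box can be checked from userland via sysfs (the module names `radeon`, `i915` and `nouveau` are the usual in-kernel ones; not every kernel exposes the parameter this way):

```shell
# KMS-capable drivers expose a "modeset" parameter under /sys;
# a value of 1 means kernel mode-setting is enabled for that driver.
for m in radeon i915 nouveau; do
    p="/sys/module/$m/parameters/modeset"
    [ -r "$p" ] && echo "$m: modeset=$(cat "$p")" || true
done
echo "kms check complete"
```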

In turn this may also allow X and/or the driver to crash and be restarted without crashing any application program.

Typically, it is only X that crashes. And when X crashes, it will always take your whole X session with it. And programs connected to the X server from elsewhere will happily die with SIGPIPE or whatever as they lose their connection.

Apart from that, yes, this is a great and important step for the Linux based desktop.

Ah, SIGPIPE, bane of my job. My boss likes to solve all problems by popening onto external applications (from compiled C code)... which means that I get to spend a lot of time worrying about catching and handling SIGPIPE.
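For anyone who hasn't had the pleasure: SIGPIPE is delivered when a process writes to a pipe whose read end has been closed, and its default disposition kills the writer. A small demonstration in plain shell:

```shell
# 'yes' writes "y" forever; 'head' reads one line and exits, closing
# the read end of the pipe. The next write by 'yes' then raises
# SIGPIPE, which kills it (signal 13; bash reports such a death as
# status 141 = 128 + 13 in PIPESTATUS).
yes | head -n 1
```

In C, the usual fix when using popen() is to set SIGPIPE's handler to SIG_IGN (or install one with sigaction) and handle the resulting EPIPE error from write() instead.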

That and, if the graphics server or graphics driver dies, I think it's reasonable for the whole session and all running apps to go down. It's certainly not surprising. At least for me, the problem is that X goes down a lot, and isn't really stable, high-performance or bug- and glitch-free while it's up!

Until ATI/AMD commits to going down this development route for all of their chipsets, we're still going to get a disconnected mess with fglrx - which simply shouldn't exist, in my view. The trade-secret NDA card has been played for years as an excuse for why a driver isn't open sourced and in the kernel; it has been shown to be false, and AMD, and especially Intel, are in the process of showing that it is bogus for graphics.

We always seem to be saying "Oh, it's just around the corner" with graphics support on Linux - permanently. We've still got lots of drivers doing their own thing and even reimplementing a ton of the Xorg stack (yes, you nvidia) and still a load of differences between what a device supports and what a driver can actually do with it. If only graphics drivers were cajoled into being open sourced and using a shared codebase like those in the kernel.


I'm sorry ... what exactly will we need fglrx for?

For the next kernel, 2.6.32 or later, only nvidia cards will still require a binary blob driver for 3D hardware accelerated graphics and compositing.

There is a reverse-engineering project (nouveau?) to write a driver even for nvidia cards, but AFAIK it isn't ready yet for 3D acceleration or KMS.
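Which of these kernel drivers is actually in use on a given machine can be checked from the loaded-module list (a sketch; it assumes the standard module names, and note that fglrx and the nvidia blob load modules under their own names):

```shell
# List loaded kernel modules and pick out the common graphics drivers.
lsmod 2>/dev/null | grep -E '^(nvidia|nouveau|radeon|fglrx|i915)' \
    || echo "none of the usual graphics modules appear to be loaded"
```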

Excluding the open source argument, I wouldn't use Nvidia simply because their products are poor quality and have been so for many years. The only people who don't seem to care about stability and quality are ricers and gamers, who seem to change their hardware configurations more often than they change their undies.

This is one of the reasons I have resisted getting a new MacBook - I don't want Nvidia in my laptop or desktop; they've screwed the pooch far too many times, with customers whose MacBook Pros are loaded with 8400 GPUs still experiencing GPU failures, some having had their boards replaced four times.

Quite frankly, I'd sooner go for an ATI-powered laptop running Windows than a MacBook with an Nvidia chipset from Apple. Yes, I loathe Nvidia that much.

"I'm sorry ... what exactly will we need fglrx for? For the next kernel, 2.6.32 or later, only nvidia cards will still require a binary blob driver for 3D hardware accelerated graphics and compositing. There is a reverse-engineering project (nouveau?) to write a driver for even nvidia cards, but AFAIK it isn't ready yet for 3D acceleration or KMS. http://www.osnews.com/story/21033/Nouveau_Becomes_Default_Driver_in...

Excluding the open source argument; I wouldn't use Nvidia simply their products are poor quality and have been so for many years. The only people who don't seem to care about stability and quality are ricers and gamers who seem to change their hardware configurations more times than they change their undies. This is one of the reasons I have resisted getting a new MacBook - I don't want Nvidia in my laptop or desktop; they've screwed the pooch far too many times with customers that have MacBook Pro loaded with 8400 GPU's still experiencing GPU failures with some having had their boards replaced 4 times. Quite frankly, I'd sooner go for a ATI powered laptop running Windows than having a MacBook with an Nvidia chipset from Apple. Yes, I loath Nvidia that much. "

That is all fine from a personal point of view. I too bought my last hardware with ATI graphics specifically because ATI published the specifications for open source developers. Excellent. Kudos to AMD/ATI.

However, having said that, it is still important that open source doesn't abandon those people who have nvidia hardware, IMO.

For that reason, even though I wouldn't get nvidia hardware myself, I still applaud the efforts of the Nouveau project.

Apparently they have Xrender hardware acceleration working (so KDE4 should be good to go), and they have made strides towards (though they still have some way to go on) KMS, Gallium3D support, 3D support in general, video support, etc.

Still, it works well enough for desktop use that Fedora has been able to adopt it as the default driver. Even if you don't like nvidia, this is still a good thing for users.

That is all fine from a personal point of view. I too bought my last hardware with ATI graphics specifically because ATI published the specifications for open source developers. Excellent. Kudos to AMD/ATI.

However, having said that, it is still important that open source doesn't abandon those people who have nvidia hardware, IMO.

For that reason, even though I wouldn't get nvidia hardware myself, I still applaud the efforts of the Nouveau project.

But the problem with Nouveau is that the impression I get from it is the same one I get from Wine. It sounds very nice to do it for compatibility reasons, but it can be a double-edged sword. Through the continued development of Nouveau, Nvidia can easily keep the status quo and claim they don't have to cooperate because the OSS world is doing fine and dandy.

Apparently they have Xrender hardware acceleration working (so KDE4 should be good to go), and they have made strides towards (though they still have some way to go on) KMS, Gallium3D support, 3D support in general, video support, etc.

Still, it works well enough for desktop use that Fedora has been able to adopt it as the default driver. Even if you don't like nvidia, this is still a good thing for users.

Like I said, Nouveau is a double edged sword.

Then again, I question how many end users have Nvidia GPUs, given that most of the time I come across people with an Intel X3100 or X4500 in their laptops.

Mind you, I might be proven wrong, and because of the additional infrastructure put in place within Linux, more companies may be willing to open up specifications.

My choice was based on the 8800 GPU being the best value at my time of upgrade. Sadly, the 9600 and later GPUs seem to be tweaked 8800 boards until you hit the 260+ GPU chips. Nvidia's drivers have been good, though, and if Nvidia can keep competitive with the open source development rate then I'm OK with that. If they drop support for the 8800 boards without providing the community driver project with specs to keep going, I'll have an issue.

I also hear that many of the Nvidia developers work on the community driver in their spare time. It'd be nice if the company officially backed the project, but having the same devs on both drivers is still a benefit.

Now, when my next GPU upgrade comes around, I'll be reconsidering ATI, Nvidia and any competitive boards, but it'll still come down to who has the best performance and support across platforms. Maybe it'll be ATI by then; this is the first GPU I've purchased that wasn't from them, but the flaky-as-crap drivers and add-on apps under both Windows and Linux did me in with my last AIW board. We'll see how AMD's new open policy does before I give them my money again.

That is all fine from a personal point of view. I too bought my last hardware with ATI graphics specifically because ATI published the specifications for open source developers. Excellent. Kudos to AMD/ATI.

I bought an ATI card because they released specs; all I got was a barely working and slow card, which I quickly replaced with an Nvidia card.

Maybe Linux kernel 2.6.32 will give me a card I can actually use.

Actually, quality was the reason why I dropped ATI and replaced it with an Nvidia card. The ATI drivers sucked majorly for Linux while the Windows drivers were working excellently.
Getting Compiz up and running without locking X was a trial-and-error test (whatever didn't lock up X could be enabled; the rest had to be disabled).
The revision before that even crashed X on video window resize.
You can tell me many things about nvidia, but with that card I just had to enable the binary drivers and suddenly everything worked flawlessly - no X crashes anymore.
ATI has always been like that: good hardware, really shoddy drivers. At least under Windows they have finally gotten their act together driver-wise; Linux land is business as usual. And BTW, where are the BSD drivers?

The ATI drivers sucked majorly for Linux while the Windows drivers were working excellently.

This is exactly why the fact that there are now open source drivers (coming soon in the mainline Linux kernel) is so important.

We now have the documentation for how to drive the GPUs, and we have open source code to drive them. Having both also means that when new bugs are discovered, they can be fixed. This is now true for very capable, competitive GPU hardware (ATI hardware outperforms Intel hardware). These drivers and graphics cards will quickly become the top line for performance, stability and supportability on Linux.

It will no longer be possible for an OEM to hinder Linux (unintentionally or not) by providing sub-standard binary graphics drivers.

A few years ago ATI was nothing but trouble for me and some of my friends in *Windows*. In Linux, it was much worse. I haven't tried ATI lately, but now that they have OSS drivers, I do hope that the quality of the hardware and drivers has gone up.


Yeah, I had the same experience. I had an ATI card on an old desktop and basically considered it to be "VESA-only" under Linux. Fglrx was awful and the open-source 2D driver barely worked.

Well, you wouldn't believe the improvements in the last couple of years. I have an HD3200 IGP (R700), an R400-series 128 MB PCIe card, and a laptop with an R500-series IGP, and they all run *flawlessly* today under Ubuntu 9.04, including compositing and 2D acceleration. Until a few months ago I had to use the awful fglrx proprietary driver for the R700 IGP, but now that AMD/ATI have opened up the documentation, progress has been astonishingly swift.

My hat's off to AMD/ATI for doing the Right Thing. They'll get better drivers and loyal users. I'm not buying or recommending NVidia graphics to anyone unless and until they open up as well.

NVidia only produces drivers for desktop systems based on their reference design boards.

The drivers that come with your Mac are partially developed by Apple. One area where I was disappointed with Snow Leopard is that it still lives in an OpenGL 2.1 world.

Macs are good for many things, but not for Graphics workstations.

Comparing ATI and NVidia, the latter gives much better support to developers making use of their products. Just look at the amount of tools and developer documentation that each vendor provides.

What I find positive is that ATI is providing GLSL support on their tools, while NVidia only provides Cg and HLSL.

OpenGL 3.2 drivers are in beta for Nvidia and AMD. When they are released I'd expect 10.6.2+ to have them.

Excluding the open source argument, I wouldn't use Nvidia simply because their products are poor quality and have been so for many years. The only people who don't seem to care about stability and quality are ricers and gamers, who seem to change their hardware configurations more often than they change their undies.

My experience is exactly the opposite: I've never had anything except trouble with ATi cards, and I've found nVidia cards not only perform very well but are also stable as a rock.

As for driver side.. well, nVidia drivers may be binary but they've ALWAYS worked like a dream for me and support all the functionality of the card in question, even old cards are still supported. But my old ATi cards... well, the last ATi driver that works for them doesn't work with Compiz, it's unstable as heck and somehow oddly slow. The open source one otherwise works okayish, except I still can't make TV-out work, and it doesn't support pixel shaders. The lack of pixel shader support totally blows.

As for driver side.. well, nVidia drivers may be binary but they've ALWAYS worked like a dream for me and support all the functionality of the card in question, even old cards are still supported.

Ditto. I'd still recommend nvidia to Linux users, entirely because of their binary driver. We should understand that their driver codebase is their "crown jewel" (they share the codebase with the windows driver) and they are not giving that up lightly. But, in exchange we get a good (stable and fast) driver that receives much of the love dedicated to their money-maker (windows users).

There is no real need to get worked up about device drivers and open source. Hardware is expendable. When intel and ati get their acts together regarding the driver quality, we'll have more choices, but nvidia is currently the safe bet.

"As for driver side.. well, nVidia drivers may be binary but they've ALWAYS worked like a dream for me and support all the functionality of the card in question, even old cards are still supported.

Ditto. I'd still recommend nvidia to Linux users, entirely because of their binary driver. "

A binary driver fails with the first kernel update.

If there IS a problem, a binary driver is impossible to fix (so one is reliant on the goodwill of the OEM).

If the OEM no longer sells the hardware, binary drivers for it will no longer be forthcoming from the OEM. "Planned obsolescence".
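This breakage on kernel updates is exactly why distributions started wrapping the blob's kernel glue in DKMS, which rebuilds the out-of-tree module automatically for each newly installed kernel. A sketch of checking its state (assuming the dkms tool is installed):

```shell
# "dkms status" lists registered out-of-tree modules (e.g. the nvidia
# glue layer) and whether they have been built for the running kernel.
command -v dkms >/dev/null 2>&1 && dkms status \
    || echo "dkms is not installed on this system"
```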

We should understand that their driver codebase is their "crown jewel" (they share the codebase with the windows driver) and they are not giving that up lightly.

That doesn't make any sense. They could give out the source code of their driver to every single person on the planet, and it still wouldn't run on an ATI card.

But, in exchange we get a good (stable and fast) driver that receives much of the love dedicated to their money-maker (windows users).

That doesn't hold on Linux. Nvidia have refused to fix a 2D performance bug for over two years, for example. Because the driver is secret, for all we know they could even be being paid to keep it poor on Linux.

There is no real need to get worked up about device drivers and open source. Hardware is expendable. When intel and ati get their acts together regarding the driver quality, we'll have more choices, but nvidia is currently the safe bet.

And STILL the open-source drivers for older ATi cards lack all kinds of features, whereas nVidia's binary-only drivers for similarly old hardware support all their features and work just peachy.

You can blather all you want about open-source superiority, but I have only been let down by the open-source ATi and nVidia drivers.

How can you have been let down by the open source ATI driver (the one built from the specifications) when it hasn't been released yet? It won't be generally available until distributions include kernel 2.6.32. Only this week kernel 2.6.31 was released.

Any open source driver for ATI cards you have seen so far has been built by reverse engineering, not from specs. The same is, and remains, true of any open source nvidia driver. That is always going to be disappointing.

Fixed now, though, for Intel cards and for recent (R600 or later) ATI cards. These are both built from specs. The Intel drivers are written by Intel and released as open source. The ATI drivers are not, however, written by ATI ... ATI instead released the specs to open source developers. When they become available, the drivers for these cards will be fully functional. And fixable. And debuggable. And the ATI cards perform far better than the Intel cards.

If the OEM no longer sells the hardware, binary drivers for it will no longer be forthcoming from the OEM. "Planned obsolescence".

GPUs pretty much have the concept of obsolescence built in anyway, though I don't think nvidia buyers have been suffering from this (the binary driver supports pretty obsolete cards).

They could give out the source code of their driver to every single person on the planet, and it still wouldn't run on an ATI card.

It might still contain some "secret sauce" they don't want ATI to see. I guess they value that sauce higher than perception among linux community, and that's ok for me. It's just a GPU, something mostly used for closed source stuff anyway (gaming).

Before the end of this year, people who are fortunate enough to have ATI cards and have Linux installed will enjoy by far the best-performing bang-for-buck desktop systems on the planet.

I'm waiting with bated breath for good drivers and cards from ATI - if they make the cut, my next GPU will definitely come from ATI. Hopefully, we will see a change from the situation where "if you don't have nvidia, you are on your own". Until now, if you had bought ATI and wanted to go Linux, the general advice was to try to sell the ATI card and get an Nvidia ;-).

Sorry, but I have been using Linux for a _long_ time (about 15 years now), and the safest option for me was ATI, not NVIDIA. I fear the phone ringing every 6 months when people start to upgrade their Ubuntu installation and the binary drivers fail _AGAIN_.

Some people got smart and started to ask for a new machine, because the last one kept failing. They found out that it is now safe for them to follow the regular updates and upgrades. You can have a faster video card, but when it fails it isn't that fast anymore. Also, for everyday usage you don't need it; it doesn't make your browser or spreadsheet any nicer.

This is also where the whole picture comes into view. Both AMD and Intel are gearing up to deliver a complete "inexpensive" and "efficient" platform, and they need every customer they can get on board. The best way to get normal people on board is to get their geek friends on board.

The current mATX/mITX boards are good examples of this. Looking at the next Atom platform or the upcoming Fusion platform, you can see which direction everything is going. Keep an eye, also, on how many Atom boards are now shipped without any fans. This was different a year ago, when fans were needed and software was not tuned for the new platform.

And I said it more than a year ago on this site as well: more and more people don't care about 5 fps extra as long as their computer just works. For the next 5 to 10 years, computers are here to mature and nothing else. Microsoft already found that out with XP, which does basically what people want, and Linux has caught up over the years.

So mark my words: the moment companies start to sell an inexpensive Mac Mini look-alike with Ubuntu preinstalled, it won't matter which video card is in it. Internet and computers are slowly becoming something like electricity, water and gas. It will always be there, and that makes it the business case. You sell complete units in high volume.

This is also why companies are screening their code and specs to check for any legal issues. The last big dump of legally vetted code was OpenSolaris, as it's much easier and cheaper to maintain. In one year they got certified support for over 2500 (!!!) laptops with only a handful of developers. Companies like Intel and AMD can see a business case in this, as they sell the chips and can now reduce the cost of driver development and support.

So if you want to continue buying hardware with only binary drivers, that is your choice. I stick to the policy that I have had for a long time now: I prefer to exchange my money for goods from suppliers that respect my freedom. And what I have seen with people around me is that it takes between 12 and 18 months before they curse vendors for not respecting their freedom.

Before the end of this year, people who are fortunate enough to have ATI cards and have Linux installed will enjoy by far the best-performing bang-for-buck desktop systems on the planet.

You realize that the end of this year is in under 4 months? I think it may be difficult to take the software from the current state to "best performing" in that time. Do you truly feel this is a reasonable expectation, or are you just marketing?

I installed Mandriva and it said "oh, an Nvidia.. shall I install the proprietary drivers?" and graphics wasn't a concern again.

I installed Debian and grabbed the Nvidia drivers from their site. The downloaded binary looked at the system, said "cool, shall I download and install the correct drivers?", and graphics wasn't a concern.

I added the Debian non-free repository and grabbed the repository provided nvidia binary with the same result; 3d GPU happy out of the box.

My old ATI was never that smooth, even under Windows. Install the drivers and get mostly stable performance; upgrading your drivers meant going through a long song and dance of uninstalling and reinstalling. Under Mandriva, the last time I saw TV-in supported was with a first-generation Radeon AIW board, and I had two after that with partial 3D support at best.

I may have got lucky if the Nvidia hardware is hit or miss but the driver support has been a dream. Three different sources for install managed painlessly.. I'll take it.

I wouldn't use Nvidia simply because their products are poor quality and have been so for many years.

I've worked in several companies and industries where the stability and quality of the GPU is paramount, and it's been many years since I worked in a place that didn't use Nvidia GPUs. These people are neither "ricers" nor "gamers", but professionals who need to be able to depend on their graphics cards, and they're all happy with Nvidia (or at least happier than they'd be with any other brand).

Wow... what an "objective" BS you just wrote there, backed by absolutely no evidence other than personal bias.

Given that NVIDIA literally defined 3D graphics on Linux. And the fact that until recently ATi had mediocre support for the OS at best, and there are still plenty of features missing from the ATi drivers (which until recently were just plain awful, in both Windows and Linux land, BTW). Then yeah, it sounds like "you know what you are talking about". NOT.

Seriously, I never thought I would see the day when someone would make a serious attempt at claiming with a straight face that ATI products work better under Linux than NVIDIA's.


A lot of people just don't seem to get this.

I'll try again.

Up until a few years ago, there were no open source graphics drivers for Linux - only proprietary ones. This presented a big problem, because the stability and performance of the entire system depended on the OEM's proprietary, secret, binary-only code. The Linux experts who coded the kernel, and who would be the best people to debug driver problems, had no visibility at all into the graphics drivers.

So open source developers tried to write their own graphics drivers for Linux: in the dark, using reverse engineering, without specs. That is always going to be a slow, laborious process, only minimally effective. The amount of functionality that was achieved is surprising.

The critical points here are these: (1) specs were NOT available, (2) the code was NOT written by the card manufacturers, (3) but the code could be debugged, and (4) there was no danger of obsolescence through support being dropped.

OK, some while ago, this situation changed. Intel released their graphics drivers for Linux as open source.

The critical points here are these: (1) specs were available (to Intel staff), (2) the code WAS written by the card manufacturer, (3) the code could be debugged, and (4) there was a danger of obsolescence through support being dropped.

This was a vast improvement, but still there were no public specs; users were still at the mercy of the OEM (Intel in this case). It is also a pity that Intel graphics are significantly inferior, performance-wise, to ATI or NVIDIA cards.

OK, early this year, ATI finally released the specs for R600 and later GPUs. Open source developers have been working on drivers since then (about eight months now).

The critical points here are these: (1) specs were available (to open source developers), (2) the code was NOT written by the card manufacturer, (3) the code could be debugged, and (4) there is no danger of obsolescence through support being dropped.

We are just now seeing the fruits of that coming through. These drivers represent an entirely new class of graphics driver, which has not been available in Linux before now. ATI cards are entirely competitive in hardware performance, and finally they are going to enjoy a well-integrated graphics driver, written by people who know the Linux kernel and graphics systems inside out.

"I never thought I would see the day when someone would make a serious attempt at claiming with a straight face that ATI products work better under Linux than NVIDIA's."

Well, now you have: that day is finally almost here. The new code has been committed to linux-next. "Working better" is precisely what this new class of graphics drivers will deliver. This is why people are excited about it.

PS: Obsolescence works to the advantage of a graphics card manufacturer. Card manufacturers enjoy a new round of sales (to 'serious' gamers) every time Microsoft helps them out with a new version of DirectX. Think about what that fact means (to most end users) for a second.

Does anybody know how the ATI FOSS drivers compare to fglrx in performance? Last I heard the open drivers did better at 2D but really did poorly at 3D tasks. Has the situation improved at all? If it has, then this is good news. If the FOSS drivers still have worse 3D performance than fglrx, then this won't be much of a big deal.

While the 3D support was being developed, I believe that initially they were using a software memcpy path for transfers. One of the "todo" tasks was to write proper memory management and to use DMA.

"Alex mentions in a blog post that the performance is currently (August 03, 2009) slow, but it should be improved soon. It looks like we may see a decent level of 3D support in time for Mesa 7.6 and the round of distribution refreshes this fall."

Now the only problem for me is the lack of suspend. Other than that it works well: stable and fast 2D and 3D.

Probably too optimistically, but it does seem to be stabilizing very quickly. Then again, if you count the lack of compiz corruption... hmmm.

The 3D code is mostly common between DRI1 and DRI2 -- this is the upside from porting the 3D driver over to the radeon-rewrite code base a few months ago. Doing that port delayed the 3D driver by a couple of months, but it meant that the move to KMS and DRI2 would be simplified.

You can talk about how great open specifications are all you want, but there's simply no alternative to NVidia on Linux if you want to run 3D applications without stability or performance problems, today - for the rest of 2009, and almost certainly for 2010.

So ATI/AMD have (finally) released the specs. Gee, they've only been promising to do that since, I dunno, 2007?

So now we're going to see horrible API churn, breakage and multi-level instability like with the Intel/Xorg/KMS work that has been going on recently.

And I'd estimate a driver with full and stable support for all the features of the ATI R600 cards, with average 3D performance optimisation, will be here by, say, Christmas 2011.

Wake me when that happens.

Until then, at least you have a choice of graphics card with Mac or Windows - under Linux, NVidia is the only currently usable option if you want reasonable 3D performance.

I've got an Asus Z96J with an ATI x1800M card in it. I get pretty good performance both out of Compiz and out of Dawn of War: SoulStorm running under WINE, using the fglrx driver. I won't say there are no problems, but it certainly provides "reasonable performance" under Linux.
I was using an x2400 at work, too, for that matter, and it was working just fine, until an RHEL4 kernel update went badly sideways. The card got replaced with an nVidia card; my (not exotic!) GL code no longer renders properly.

This is unnecessarily pessimistic.

Please note that the Intel graphics drivers for Linux are written by Intel. They are not written by the people who are intimately familiar with the Linux kernel or the xorg graphics stack.

That is not the case with these ATI drivers. They should be as integrated with the rest of the system, and as in step with it, as any other in-kernel driver. This is a first for Linux graphics drivers; we have not seen this before.

Unnecessarily pessimistic, huh? I call it the plain truth. You can make all the pep-talk posts brimming with enthusiasm to OSNews you like, but it's not going to change the fact that ATI drivers have been 'coming soon' for years, that the 'about to be released' drivers are only basically functional, and that it's taken years for Intel drivers, with Intel support, to get from 'basically functional' to 'fully functional' and now back to 'basically functional'.

I'll believe it when I see it. I'll believe that open source ATI drivers will support full-speed XRender, GLSL and OpenGL 2+ functionality, crash-free and glitch-free, along with accelerated video playback on Linux, when I see it working.

So why don't you calm down. When that day comes, I'll be able to forget my pessimism, and you'll actually have something to crow about.

Pffft.

You can't dismiss code until you have seen it, run it and measured it.

This driver is entirely new code. It is not fglrx. It is not a revamped version of the older reverse-engineered open source drivers. It is the first 2D/3D accelerated graphics driver for Linux written by Linux developers with the aid of specifications.

Lets wait and see how this entirely new code performs when it is made available. It is in the kernel staging area, but that means it has a lot of hardening and stability testing to get through yet.

Once we actually have a released driver, and we can objectively measure its performance, then and only then can we talk about the "plain truth" about it.

PS: ATI open source drivers have not been in the works "for years". The specs were only released to open source developers in January of this year.

From David Airlie, the guy who committed the code: "It may not be 100% stable yet and I'm sure we can make things a lot faster, but the basics all work for me here." So these drivers are an initial release - basically functional, but may crash, and are slow. But sure, ignore the plain truth all you like, you're way out in irrational fanboy territory here.

Cute.

I'm supposed to be the one who is "in irrational territory" ... even though I posted this:

Lets wait and see how this entirely new code performs when it is made available. It is in the kernel staging area, but that means it has a lot of hardening and stability testing to get through yet.

"From David Airlie, the guy who committed the code: "It may not be 100% stable yet and I'm sure we can make things a lot faster, but the basics all work for me here." So these drivers are an initial release - basically functional, but may crash, and are slow. But sure, ignore the plain truth all you like, you're way out in irrational fanboy territory here."

"May not be 100% stable yet" means that there is still testing to be done, it doesn't mean that it is necessarily buggy.

"I'm sure we can make things a lot faster" means it probably hasn't been profiled and optimised, it doesn't mean that it is slow.

I posted earlier quoting other testers on the Phoronix radeon forum ... it was described as "fast and stable" for them.

So your prediction that these drivers will be "basically functional, but may crash, and are slow" is purely wishful thinking on your part.

"ignore the plain truth all you like"

I gave direct quotes from independent people who have run David's code. They found it to be QUOTE: "fast and stable". QUOTE: "stabilizing quickly". QUOTE: "I am *still* surprised how well it all came together." You are just guessing and speculating about it, and badmouthing it before it has even been through staging.

"Please note that the Intel graphics drivers for Linux are written by Intel. They are not written by the people who are intimately familiar with the Linux kernel or the xorg graphics stack."

"What the heck? The Intel graphics drivers for Linux are written by none other than Keith Packard, the project leader of xorg. All the new stuff like DRI2 and KMS is also developed by Keith and co. Please get your facts straight."

OK, fair enough. How is it, then, that the Intel drivers for Linux have gotten themselves into such a horrible tangle recently? Performance regressions and dropped functionality all over the place.

'Cos you're not saying "let's wait for the benchmarks" - you're writing post after post saying this is some kind of enormously significant milestone and that we'll all have top-notch ATI performance in the near future, without a shred of evidence to back it up.

And you're calling for Keith Packard to step down because of the Intel graphics situation.

You trumpet the supposed superiority of the OSS driver development model, and in the next breath rail against its outcomes and insult one of the most important contributors to the effort.

To everyone asking when the code will actually be available (and not happy with Git repositories): this code will be backported into Fedora 12. I'm not sure about the other fall distros, but I think it could potentially be in some others as well.

I think F12 will probably enable it by default, although I'm not sure about that either.