"AMD is on the heels of releasing the next set of GPU programming documentation to aide in the development of the open-source R500/600 drivers (xf86-video-ati and xf86-video-radeonhd). It's already been discussed what this NDA-free documentation release will have, but one of the questions that have repeatedly come up is if/when AMD will release information on accelerated video playback. AMD's John Bridgman has now stated what they plan to release in the video realm as well as a new requirement for their future graphics processors: being open-source friendly while avoiding DRM."

The Open Graphics Project aims for a completely documented graphics card with 3D acceleration.

Of course the drivers will be open source, as well as the board layout and the hardware definition (Verilog code).

Maybe AMD sees the demand for open source drivers, and sees no real downside to making these drivers possible by releasing the specs of their hardware.

Maybe they figured out that all of their real inventions are patented, and whatever is not patented is no real danger to them in case Nvidia finds out about it.
So their management finally figured out what the first in-house techie they could have asked would have told them.

Heh, as much as I appreciate the vision of projects like OGP, they're never going to be commercially viable, let alone on the same radar screen as the big three. Not gonna happen.

AMD beginning to head down the long road toward open graphics specifications has much more to do with convergence and globalization than it does about their current and future competitors in the PC market.

From subnotebooks and smartphones to set-top boxes and grid computing, free software will play a more influential role on post-PC platforms than it did on the PC. Intel is pushing aggressively into these environments in part by embracing Linux and open graphics drivers. AMD has to open up in order to win contracts in this evolving competitive landscape where free software is crucial to the ongoing incursion of PC hardware into the consumer electronics industry.

I think that most hardware vendors now realize that free software is going to play a major role in emerging technologies and emerging markets. Smart CEOs look at transformational effects of free trade on the global economy and realize that demand-side stimulus from emerging middle classes in the developing world will be the predominant engine of growth in the 21st century.

Vendors that attempt to sell the same old stuff into the same old markets will have trouble competing with those that are positioned to exploit the incomparable economic force of hundreds of millions of people being pulled out of poverty. Free software is helping to flatten the world, empower society, and strengthen the global economy.

Not OGP, but Intel. Among the Linux users I know, most have bought, or are considering buying, Intel PCs because the drivers are open source. This is less for political reasons than for practical ones: closed-source drivers are not maintained forever, and after a few years you might find yourself stuck with an older OS, unable to upgrade without replacing hardware, because the old driver simply doesn't work with newer OS versions.

I don't know how many users are really affected by that problem and for how many it's just a psychological thing. Maybe it's enterprises that consider that they might switch to Linux on the desktop in the future and, better safe than sorry, bet on Intel chipsets.

Maybe it's even simpler: after the PR disasters with the Phenom CPUs, AMD just wants good publicity.

I would say this could definitely be a PR move, and they need it more than ever. While the Radeon 3870 looks to be a great card at a much lower price than the 8800, the Phenom release was an utter bust, and Intel has a whole new wave of chips waiting to strike.

At any rate, Intel has contributed a lot to the open-source community, and not only graphics drivers. If you have an Intel motherboard, the Intel site also has sound drivers and some other chipset goodies for Linux. They also have applications like PowerTOP and the website lesswatts.org to help Linux users maximize battery life on laptops. The OLPC, by contrast, is only arguably a contribution from AMD to the open-source world: AMD did not develop the Geode, they bought it from National Semiconductor, who had basically given it up for dead, and AMD made out on the deal by getting it into the OLPC.

With AMD looking to be hurting in the CPU wars in the near future, the ATI branch is what really needs to shine to keep the company afloat. Thus we are seeing a wave of new hardware that is going to compete on par with Nvidia's best, most likely at a lower price, and an outreach by AMD/ATI to users they have alienated in the past, i.e. Linux users. Hopefully they find a way to skirt the DRM issue and win back the Linux fanbase. Linux users are not a large group (relatively speaking), but they are passionate and very, very vocal, and getting their support means tons of good PR on the web. Best of luck to ATI on this one.

Actually, I think this is not such good news. They say that most likely the r600 and r700 cards (the current and upcoming series) won't have accelerated video with the open-source drivers, due to it being entangled with DRM.

That is, you either get an old card now or wait a couple of years to buy a new one if you want to have accelerated video with open source drivers.

I really hope they find a solution to release information about accelerated video without touching DRM, because now that most people were willing to jump on the AMD bandwagon for its open-source drivers, it's not easy to recommend buying an ATI card that will be crippled with open-source drivers and that has worse closed-source drivers than NVIDIA's.

I can't blame AMD completely for this, since it's all just legal problems, but God do I hate DRM...

"""
I can't blame AMD completely for this, since it's all just legal problems, but God do I hate DRM...
"""

Agreed. But well, _if_ they can't find a way around DRM, there are always the proprietary drivers for those users who really want the accelerated video playback functionality. And even with accelerated video playback, we still won't get accelerated H.264 or VC-1 decoding, since there is no standard way of doing that under Linux yet; XvMC != accelerated decoding of those formats. There have been some discussions about that, though, and various suggestions. For example, not all hardware supports decoding H.264 natively, but it could be implemented as a shader program. All that is needed is to define some standard way of accelerating video decoding, so that all drivers can use it, not only drivers/cards from a specific vendor.

"""
But well, _if_ they can't find a way around DRM, there are always the proprietary drivers for those users who really want the accelerated video playback functionality.
"""

Incidentally, video playback is where the closed-source driver is at its worst. The videos are horribly pixelated and the diagonal tearing is gruesome. They have made progress, though: a few driver versions back, it used to crash the X server when playing back videos.

Between the last three driver versions, one could choose between an incredible memory leak (hundreds of megabytes in a few minutes when running an OpenGL app) and a driver that is not capable of driving common resolutions like 1400x1050 or 1680x1050. That is how bad the state of the closed-source AMD driver is.

"""
Incidentally, video playback is where the closed-source driver is at its worst. The videos are horribly pixelated and the diagonal tearing is gruesome. They have made progress, though: a few driver versions back, it used to crash the X server when playing back videos.
"""

I don't own any recent ATI/AMD card so I can't confirm, but XvMC isn't really that useful anyway. It can only be used for MPEG-2 streams, not for e.g. DivX or other MPEG-4 variants. It's Xv which matters more, and I imagine the open-source drivers will have a working Xv implementation. But Xv doesn't mean hardware-aided decoding... :/ Nowadays, when graphics hardware is so fast, powerful, and even programmable, it is quite amazing that there is still no support for hardware-aided decoding of video under Linux, at least not without some very specific drivers and proprietary software (and I don't personally know of any such combination).

Windows and Mac OS X have had support for hardware decoding of video streams for... umm, God knows how long :O It's unfortunately one of those areas where Linux is still lagging behind.

True, but this standard will be replaced by the VA (Video Acceleration) API, which does expose the modern features that GPUs have (and not only for MPEG-2, but also MPEG-4 ASP, MPEG-4 AVC, and VC-1). That's why it is really a pity that most likely all r600 and r700 AMD GPUs won't be able to take advantage of it with the new open-source driver.

Thanks, that's exactly what I had in mind but couldn't remember the link/name! I'm sure that once the libraries are in place and drivers start to implement those features, Linux will become quite a capable multimedia platform indeed. And the best thing is that it will benefit older cards too, as they can implement at least some of the functionality as shaders and as such still provide some boost in performance ^^

"""
Incidentally, video playback is where the closed-source driver is at its worst. The videos are horribly pixelated and the diagonal tearing is gruesome.
"""

With the fglrx driver (and maybe the same with other ATI drivers), video output has to be set to OpenGL in the video player you're using, or else it will look horribly pixelated. I suffered under this for a while before I somehow found out about it.
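For what it's worth, here is a minimal illustration assuming MPlayer as the player (other players have an equivalent video-output setting in their preferences). Forcing the OpenGL output driver in MPlayer's config file avoids the pixelation described above:

```ini
# ~/.mplayer/config -- use the OpenGL video output driver instead of the default
vo=gl
```

The same can be done per-invocation with `mplayer -vo gl file.avi`.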

"""
That is, you either get an old card now or wait a couple of years to buy a new one if you want to have accelerated video with open source drivers.
"""

Isn't that ATI's middle name? Still sitting on my Radeon 9200 and waiting. They talk the talk. But their walk still looks like a shaky DVD of someone else's toddler trying to get past the crawling stage.

I don't really care how a company decides to deliver a product; I only care about the final thing and how it works (if it works).

So with that, I will only choose what functions best, and not choose a solution that will not fully function, or that is slower, simply because it's 'open'.

I've also been following and dealing with 3D cards since the Voodoo 1, all the way back in the Linux 2.0.x days, and to this day the 'open' solutions have never been as good in features or performance... and GPUs these days are vastly more complex than they were in 2000.

For me all that really matters is a working computer, and of course how to get there.

And that is where open source comes into play. I like Linux and related applications and environments, therefore I want to install a Linux distribution on my computer. Whenever I update the kernel or another component, I want to know that the distribution can also handle the effect this update may have on the drivers. Remember, Linux is not necessarily binary-compatible, at least not toward drivers which hook into the kernel.

With an open source driver, I can be confident that the hardware will be recognized and configured correctly, AND that updates will not break the system. In that regard open source is a feature, and a usability advantage to the company which offers the free drivers. I therefore think AMD is doing the right thing here (or at least they are on the right trail).

Don't get me wrong, I use closed drivers when I must; but I would rather spend twice the money, get half the performance, and use open drivers. Nobody is going to force an upgrade down -my- throat.

That's part of the reason I was hoping that one day we'll see a video card vendor come out with a dedicated PCIe video card loaded with dedicated memory using an Intel GPU.

I had a look at the Nvidia drivers, and basically: got a problem? Tough luck. The same goes for the ATI drivers; look how long it took them to fix issues with their drivers, and they're still crappy. So bad, in fact, that I've decided to stick with the open-source ATI driver over the proprietary one; the performance is that bad.

At the moment, the Intel Core 2 Duos pretty much spank the AMD X2s on most everything I've tested (I participate in some ~80 distributed computing projects of various types), but prior to that, the AMD Athlon series easily clobbered the Pentium 4s and Pentium Ds... Both the AMD X2s and Intel C2Ds are very energy-efficient (compared to prior technologies), with the C2Ds slightly better at the moment, IMO.

I'm guessing it's just a matter of time before AMD unleashes some new innovation. Intel may have the dual and quad-core battle won for the moment, and they have a lot more fab locations under their belt - but that doesn't mean AMD is permanently out of the picture.

I agree, but please don't call chip designs "technologies", it's pretty wrong. Major things like, perhaps, digital computers, BJT transistors, self aligned gates, or even PMOS/NMOS/CMOS, could all pass as "technologies". A particular chip (family) could not however - not even a micro architecture would be a "technology".

What exactly is it that you think is "optimised" to run in 64-bit mode?

If you stop and think for a moment, you'll realize how silly that idea actually is. If 64-bit mode is so much faster, then why wouldn't the 32-bit mode simply add 32 zeroes to the front of every number, thereby automatically transforming it into 64-bit code and gaining the extra performance?

In reality, the 64-bit code often tends to be slightly slower in a straight comparison with 32-bit because the larger pointers fill up the caches a bit faster. The extra registers available in the AMD64 architecture tend to more than make up that difference, and of course some programs are able to take advantage of using all 64 bits at once. But that tends to be program specific rather than cpu specific.

The AMD Athlon 5200 X2 EE is ~£76 (~104 euros) at one of the most competitive UK online retailers. In terms of price, this seems to compete with the Intel Core 2 Duo E4500, which is a ridiculously overclockable chip that most likely kicks the X2's arse on any playing field.

No matter which way you cut it, Intel's current core architecture is more efficient than AMD's and Intel have a good line up across the entire budget range.

Technically speaking, of course you are right.
But it doesn't always matter which is 'the better option'.
Sometimes all that matters is, is it sufficient.
For most things most people end up doing on their PCs (disregarding the people with eight gigs of RAM), mere CPU speed doesn't matter anymore. RAM matters, and so does hard drive speed.
My brother got himself a new, cheap Athlon X2 PC which he thought was really fast. I said that these days Core 2s outperform them. Guess what: he doesn't care. He's used AMD for years and it works for him.

Then there's the energy efficiency thing.
Sure, very important.
Then again, a guy at work bought a Core 2 machine to replace his single-core Athlon, for "a little extra power". I asked him why he didn't just upgrade his CPU; it would be cheaper and would work with his particular motherboard (Socket 939; the Athlon 64 X2 is still available for it).

He said it was, besides faster, also an investment in energy savings..

He also got himself a new 20" screen that draws 25 watts more than his old 17", which immediately kills the savings from his new PC.
And wireless mouse and keyboard.

Not to mention the energy that goes into the production of all the new things he got himself, which some (most?) people tend to forget. A huge amount of energy and resources goes into producing CPUs and the like.
Sure, that's how the economy works, but it makes me a bit more sceptical than I used to be, when it comes to this energy efficiency thing.

That's why we have competition, you know. Sometimes one option is better, other times another option. It tends to create innovation, higher quality and lower prices.

"""

Agreed. It is also best if the competition is fairly evenly balanced. I worry about AMD. They've done well in the past, but Chipzilla has tremendous resources and influence. I hope we really do see them leapfrogging each other technologically. I felt more comfortable when the smaller vendor had the better chips.

Hmm, there's an AMD Geode in the OLPC, but that's just hardware; AMD did not donate the Geode chips used in OLPCs, nor have they done much else. Intel, on the other hand, is working on several open-source projects and has released open-source drivers for their graphics cards, some WLAN cards, etc. So, what were you trying to say?

I was merely wondering what arguments Gilboa has for claiming that AMD is somehow friendlier with Microsoft than Intel is.
I'm not saying it's not the case; maybe it is. I was just wondering what reasons there are to consider this a fact.
My guess is that both Wintel chipmakers in some way need friendly relations with Microsoft, because its operating system should run nicely on their chips (and of course, MS needs them too).
What makes this need bigger for AMD?

AMD is a bit closer to them just by default. MS is dealing with them from a position of strength, which makes them happy, and AMD can't afford to piss them off the way Intel can. Meanwhile, Intel has tons of cash to throw around trying to expand into the emerging open-source market, while AMD has to get the most out of the limited money they have, which currently means going to where most customers are.

I think both companies are fairly friendly with the open source community, Intel just has a little more ability to do stuff on their own.

Video acceleration reminds me of the early MPEG-2 decoder PCI cards that were needed because the CPUs of the day were so weak that software DVD players could never decode at reasonable quality. Zoom to today, and software DVD players are commonplace.

The video acceleration of today is focused on H.264, which is an incredibly complex codec; even gigahertz machines struggle at times to decode it at reasonable quality. But, like the evolution of DVD playback, the need to offload decoding onto a dedicated video card will go away.

Then there is the new AMD GPU/CPU hybrid, which will have video extensions accessible to all, without needing access to the patent-protected Macrovision technology that current AMD GPUs use for their content protection.

That said, there will still be OpenGL GPU acceleration, and hopefully interesting things will develop with it. Don't expect games to suddenly appear, though; then again, as a Mac user, I prefer having my Wii to play games and keeping my MacBook for everything else.

"""
Video acceleration reminds me of the early MPEG-2 decoder PCI cards that were needed because the CPUs of the day were so weak that software DVD players could never decode at reasonable quality. Zoom to today, and software DVD players are commonplace.
"""

Too true. I remember the DXR2 and DXR3 cards I used to have to support while I was working for Creative Labs. They were great cards, but as processors advanced they became unnecessary.

"""
Then there is the new AMD GPU/CPU hybrid, which will have video extensions accessible to all, without needing access to the patent-protected Macrovision technology that current AMD GPUs use for their content protection.
"""

Frankly, this type of solution is only of interest to me if the tech filters back to just the GPU. Don't get me wrong, I'm all for CPU/GPU integration for those who want it; it's just that I like being able to pick and choose a graphics upgrade without having to splash out on a processor while I'm at it.

"""
That said, there will still be OpenGL GPU acceleration, and hopefully interesting things will develop with it. Don't expect games to suddenly appear, though; then again, as a Mac user, I prefer having my Wii to play games and keeping my MacBook for everything else.
"""

More and more of my Windows-only friends are starting to just use consoles for their gaming. Considering the number of high-quality titles that are either console-only or take much longer to be released on the PC, methinks I should start looking into it!

"""
More and more of my Windows-only friends are starting to just use consoles for their gaming. Considering the number of high-quality titles that are either console-only or take much longer to be released on the PC, methinks I should start looking into it!
"""

Somewhat off-topic, but count me in that camp, for several reasons: 1) I don't have to worry about buying a new graphics card at practically the price of a new console every 6-12 months just to be able to play the latest games; 2) there's no need to fiddle with buggy driver revisions; and 3) the PC game industry has pretty much stopped making all the game genres that I actually cared about. It seems as if very little besides FPS and RTS games comes out for PC any more, and neither of those holds my interest.

If you don't game, then the Intel GPU is the way to go. My only problem with them is that you can't get dual-monitor setups very easily at all. Supposedly there is an add-in card, but I have never been able to find one for purchase.