
Open ATI R600/700 3D Graphics For Christmas?

11-18-2008, 07:20 PM

Phoronix: Open ATI R600/700 3D Graphics For Christmas?

It's taken an eternity (well, to the Linux community that has been eagerly awaiting code and documentation since very early this year, when it was first promised), but it looks like open-source 3D support for the ATI R600/700 (Radeon HD 2000 through Radeon HD 4000 series) graphics processors may finally be coming soon! Back in February, around the time of FOSDEM, AMD released the R500 3D programming documentation, which allowed the open-source community to achieve initial OpenGL R500 acceleration just a month later. At that time we were told the R600 3D documentation should be out in roughly a month, but that never materialized. Since then we've been told that open-source R600 3D is their leading focus and that they're working to provide the needed documentation or source code (either a straight-up Mesa implementation or internal AMD software projects like TCore and KGrids), but due to their limited staff and the burden of intellectual property issues, it's been a slow process, to say the least. In late June there was an initial R600 Direct Rendering Manager, but it came without any 3D support, and earlier this month we found out that they're close with their R600 DRI support...

Comment

Reading Alex's blog leaves the impression that they are reverse engineering much of the 6XX/7XX code they've written, rather than implementing it from documentation provided by ATI, which was the impression given to date.

Comment

How bad would the worst case be? Presumably that AMD got sued, which would mean they won't be releasing any more specs. That's pretty bad.

If the worst case were as minor as not being able to release any more specs, we wouldn't be worrying so much. The kinds of risks we are worrying about are much larger, i.e. things which would either kill or cripple our graphics business.

Worst case is that we lose the ability to sell our products into the Windows market as a result of releasing info which results in our DRM implementation no longer being considered sufficiently robust. Without the Windows market (which is >90% of our revenues) we would, for all practical purposes, cease to exist as a GPU manufacturer, especially since we would probably lose the Mac market at the same time.

Next worst case is that we find a way to continue shipping into the Windows market but get sued under one or more of the DRM-related agreements we have signed. These all have high dollar-value penalties, again enough to significantly impact our ability to continue operating.

There are a bunch of smaller risks but we spend proportionally less time worrying about them. What makes all this complicated is that we have to consider not just the information we release but the information which is likely to be reverse-engineered and published. Each time we release information we simply raise the bar for where reverse engineering starts, and it's the combination of released plus "likely to be reverse-engineered" info that we need to consider.

If we tripped any of these risks, then the impact would affect not only the GPUs we are shipping today but anything we have in the pipe. Best guess is that we would lose the next generation (i.e. the one after 7xx) and see significant delays in the one after that.

Since we don't want that to happen (right?), the alternative is to trim back the information we release until it appears safe, going through the review process each time until we find an appropriate level. What "tricked" us with the 6xx was that the internal docs contained information on a lot of functionality we are not using in any drivers today, so the amount of information we had to cut out was much larger than in previous generations, and it took a few attempts to find the right balance. The sheer complexity of the chip doesn't help either, since every review involves more people and more time than previous generations.

This is very typical risk management for a large company, but I don't think anyone has talked about it much before.

If you are talking about "worst case" in terms of "what if they can never release 6xx/7xx 3D info?", I think we're already past that. There was a scary period back in August when that looked like a possibility, but I think we're OK in that area now.

Comment

It was all a lot easier 10 years ago. You still had to worry about competitive advantage but the penalties for "getting it wrong" were a lot less drastic. We used to provide a lot of documentation for our GPUs back then -- it was really the integration of DRM functionality into the rest of the graphics pipeline that chased us out of the open source business last time.

It is a tough call. We wouldn't have HTPCs and DVD/BD playback on PCs without the integration of DRM into typical PC hardware, but we do pay a high price in terms of being able to open up programming info.

Comment

Now that you mention it, I seem to remember that one of the worries about going open source was that competitors could find bugs in the hardware and create comparison tests which triggered those bugs.

Digital Restrictions Management is weird! Now that AACS and BD+ have been broken, HDCP serves no purpose any longer. No one would be interested in hacking HDCP. But try explaining that to the movie studios.