Posted by samzenpus on Thursday January 17, 2013 @10:59PM
from the forget-the-clips dept.

crookedvulture writes "AMD has begun addressing the Radeon frame latency spikes covered previously on Slashdot. A new beta driver is due out next week, and it dramatically smooths the uneven frame times exhibited by certain Radeon graphics processors. The driver only tackles performance issues in a few games, but more fixes are on the way. In the games that have been addressed, the new driver delivers more consistent frame times and smoother gameplay without having much of an impact on the minimum or average FPS numbers. Those traditional FPS metrics clearly do a poor job of quantifying the fluidity of in-game action. Surprisingly, it seems AMD was largely relying on those metrics when testing drivers internally. The company has now pledged to pay more attention to frame latencies to ensure that these kinds of issues don't crop up again."
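The summary's point about FPS numbers hiding stutter is easy to demonstrate. Here's a small sketch (with made-up frame-time data, not AMD's) showing two traces with identical average FPS where one stutters badly; the average can't tell them apart, but a high-percentile frame time can:

```python
# Two hypothetical frame-time traces in milliseconds, same total time.
smooth = [16.7] * 100
# Every 10th frame spikes to 60 ms; the others are faster so the average matches.
spiky = [60.0 if i % 10 == 0 else (16.7 * 100 - 60.0 * 10) / 90 for i in range(100)]

def avg_fps(frame_times_ms):
    """Traditional average-FPS metric: frames rendered per second of wall time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def percentile(frame_times_ms, pct):
    """Nearest-rank percentile of frame times: how slow the worst pct% get."""
    s = sorted(frame_times_ms)
    idx = min(len(s) - 1, int(round(pct / 100.0 * (len(s) - 1))))
    return s[idx]

print(avg_fps(smooth), avg_fps(spiky))               # both ~59.9 FPS
print(percentile(smooth, 99), percentile(spiky, 99)) # 16.7 ms vs 60.0 ms
```

Both traces report the same average FPS, but the 99th-percentile frame time immediately exposes the 60 ms hitches that a player would feel as stutter.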

When will Nvidia and ATI release proper open source drivers instead of making us install a binary blob to get our hardware working? It would really help if there were drivers that could ship in the kernel to handle ATI hardware instead of the closed source options.

Probably one of the most important divides among engineers (and the world?) is the ability to read the data, acknowledge your mistakes, and fix them. It seems like most companies spend more time doing damage control than damage remediation. Reality must take precedence over public relations, for Mother Slashdot cannot be fooled (with apologies to Richard Feynman).

When they get all the holders of the patents they are violating to execute license agreements and hold-harmless agreements. You think it's only software where patents are stifling things? Everyone in the industry kind of willfully looks the other way, as long as the other guy also willfully looks away, and as long as there's no source code for anyone to drag anyone else into court over.

Realize also that the interfaces used on the bottom end to connect the software to the hardware disclose substantial information about the hardware itself. Imagine the following question in the press: "if you say you support 'B' in hardware, but are actually doing 'Q' and 'R' in hardware when you are asked to accomplish 'B', isn't the claim that you have 'hardware accelerated B' only technically true in order to have that marketing checkbox checked?" There are similar uncomfortable questions.

Apart from those, the interfaces to the hardware can disclose additional hardware patent violations, which would normally be covered by the "willfully looks away" already in progress.

And if you actually did come up with something clever, but which wasn't patentable for whatever reason, your competitor could just copy it, and you would have lost your market advantage.

Finally, most hardware codec decoding, e.g. for H.264, is partially looped through software so that the license can be tied to the software instead of the hardware, and therefore be optional, and not add to the unit cost as a hardware royalty item to Sorenson (and others). By this fiction, it becomes an optional software royalty item: the company using the hardware in their design can choose whether or not to use the capability, and only owes the royalty if they do. If it became easy to use the hardware capabilities from Open Source software, then this fiction disappears. You can argue that standards should all be royalty free or not be standards until you are blue in the face, but you are looking at approximately 100,000,000,000 DVDs total in the world, all expecting H.264 to decode them, and that requires a royalty payment.

No, these drivers are never going to be fully Open Source at the same time they give access to all the hardware capabilities.

You're assuming it's about ego. More likely they still have code that was produced by a contractor where they don't have the rights to redistribute or where they have a license which is only for their use, not for 3rd parties. Anybody who cares about it is likely to be more interested in the code being released, the people that wrote it likely don't get a say in it. Plus, I doubt most people really care one way or the other and are unlikely to even look. Assuming they can even program.

AMD has been releasing what they can, and the ingrates around here bitch about them failing to release all the code, regardless of whether or not AMD owned ATI at the time the software was written, assuming that AMD can just wave a magic wand and own all of it.

And it took them YEARS to work this out? And only a matter of weeks to "fix"?

Just because it is obvious to you doesn't mean that it is obvious to others. Really.

If there's a problem, kick up a fuss, complain, let someone know who can do something about it. This is true whether it is software, hardware, real life services, etc. There's always plenty for the people doing support to do, so if you want YOUR issues to be the ones fixed then you'd better speak up about them so that they get some priority. If you say nothing, everyone else will assume you're doing fine with no problems at all. That's the way the world works.

The Frame latencies by percentile [techreport.com] graph they create now is the right way to look at this data. It's a sort of probability distribution function for slow frames. Nothing simpler will capture the complexity of the problem. You can't usefully boil the universe of rendering latency issues into any single number.
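The percentile idea is straightforward to sketch. Here's a rough illustration (my own synthetic data and function names, not Tech Report's actual tooling): sort the frame times and read off values at a few percentiles; a curve that stays flat until the high percentiles points to isolated spikes rather than uniformly slow rendering.

```python
import random

def latency_by_percentile(frame_times_ms, points=(50, 75, 90, 95, 99)):
    """Nearest-rank percentiles of frame times, in the spirit of a
    'frame latencies by percentile' plot."""
    s = sorted(frame_times_ms)
    return {p: s[min(len(s) - 1, int(p / 100.0 * len(s)))] for p in points}

# Synthetic trace: mostly ~16 ms frames plus a handful of 45 ms hitches.
random.seed(0)
trace = [16.0 + random.random() for _ in range(990)] + [45.0] * 10
curve = latency_by_percentile(trace)
print(curve)  # flat near 16 ms until the 99th percentile jumps to 45 ms
```

Because it is a whole curve rather than one number, this view distinguishes a card that is slightly slow everywhere from one that is fast most of the time but hitches badly, which is exactly the distinction average FPS erases.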

The worst frames will vary by card and game, and the tools available to reviewers aren't really practical for finding them. What this debacle has shown is that, limited as they are, the tools being used by reviewers are sometimes better than internal QA at the manufacturers.