
OpenGL Frame Latency / Jitter Testing On Linux

07-18-2013, 01:00 PM

Phoronix: OpenGL Frame Latency / Jitter Testing On Linux

Beyond there finally being Team Fortress 2 benchmarks on Linux, Phoronix now also supports OpenGL frame latency benchmarks! It's another much-sought-after feature for graphics hardware and driver testing...

Comment

On the Windows side, we've been told it's impossible to capture this information reliably without a piece of dedicated hardware that NVIDIA recently released to some review sites. This seems to be done completely in software, though.

Is this going to run into the same issues that Windows games had? Is it the same less-accurate testing that came out on Windows a while back, which can show some obvious problems but not the real details of what's going on? Or is OpenGL somehow different from D3D in this regard?

Comment


Hmm, the documentation doesn't really seem to say anything about how it calculates that number. I guess I'll need to dive into the source code if I really want to know. Most likely it does the same thing the old Windows tests did, and just adds a callback event before the frame is displayed.

As such, I think we need to take these tests with a grain of salt, although they're still very useful to see.

Comment


Why? If it just calls gettimeofday() after every draw call and reports the difference from the last frame, then it is showing frame latency. Sure, double or triple buffering will smooth things out for the user, but if you have peaks at the level of 30 ms, there's no way you won't notice it while gaming. So it does show how smooth the game is, and that's what matters most.

Comment


Although Windows-related, this video was interesting.

Comment


Because there are various queues, buffers, and schedulers that can distort things.

However, I will note that much of the issue has to do with numbering frames precisely and detecting tearing, which is tough to do in an overlay but easy if you modify the actual game code. It seems like that may be exactly what the Doom3 code does, so maybe that's not an issue after all; it would only become one if Michael tried to make this a generic feature of PTS applied to other applications that don't directly support it.

Comment


PTS only supports it if the data can be queried from the actual application/game directly. Only once that data gets exposed to PTS is it then handled in a generic way.