I only got as far as the method, but the method is flawed: it tests the entire lag from the press of a button to the update on the screen. It's also a primitive method, and just from that point I don't like it.

The idea has been proven worthy, yes, and nVidia deserves credit for developing it, manufacturing the hardware, and bringing it to market well before anybody else. FreeSync, at this point, is a complete hoax of a marketing ploy, designed to undercut G-Sync's rollout during CES. It still requires a new hardware development effort, which means it's not something everyone will be getting for free with DP 1.3. It's a complete falsehood.

Just to add... NVIDIA did the same with CUDA. NVIDIA evangelized and built the toolsets for GPGPU. They saw the need and pushed the market... so they reaped the first-mover rewards.

That's not saying proprietary implementations are good. It's saying that niche markets sometime require proprietary implementation to demonstrate the need or quickly adapt to the market.

Exactly. nVidia's proprietary approach may be annoying, but damn if it doesn't get results when they really want it to.

I must be rare, because I don't care about G-Sync or FreeSync right now. I have UltraSharps, and I'm not trading them in for anything. I care that it eventually becomes a standard feature, improving the industry overall. That's how most things happen anyway; very little in the tech world is done overnight.

I am with you. I have a Korean IPS, and the color quality and viewing angles make up for the slight blur and input delay.

With DP 1.3 around the corner, G-Sync is nothing but an early adopter tax, but this time it's pretty huge.

The biggest problem with G-Sync is that I could see Nvidia not implementing the standards in DisplayPort and forcing people to use G-Sync over the less proprietary solution. I kind of expect Nvidia to do this; they did something similar by deprecating the Ageia PhysX add-in cards and forcing PhysX to work only on Nvidia GPUs.

It really would be a shame if Nvidia dropped the features of DP 1.3 to force people onto G-Sync, because it would divide everything up: instead of everyone getting this technology, monitor vendors would have to decide whether to support G-Sync or all of DP 1.3.

I haven't heard any proof that Nvidia would drop the DP 1.3 features that would enable sync without G-Sync, but given Nvidia's history, it seems very likely. They have always been rather fond of making things proprietary and locking out other solutions. Look how well the open source Bullet Physics runs on OpenCL on Nvidia and compare that to PhysX. Same sort of deal.

Given the high price of G-Sync monitors and the module, I have a feeling that 2014 is going to be a year where you choose between 1440p monitors with G-Sync or 4K monitors with DP 1.3 at the same price. And I might sound a little tin-foil hat here, but by only allowing G-Sync on specific monitors, Nvidia can deal with its entire line-up's issues at higher resolutions: it can essentially block G-Sync from going onto any higher-resolution monitors to keep them from looking bad. Then it can put reviewers in a position where they have to choose between G-Sync and higher resolution, with no clear winner. It would be one of those "Well, the 4K looks very nice, but there's input lag, so it might just be worth it to go G-Sync on a 1440p monitor!" situations.

Why in the world would they block G-Sync on 4K monitors? At CES, they actually showed the Asus 32-inch 4K PQ321Q with a retrofitted G-Sync module just to prove that the technology works at 4K just as well as at 1080p. It's the manufacturers who don't want to push 4K and G-Sync at the same time. I'm waiting to pick up the first 4K G-Sync monitor that comes out, nothing short of that, and I expect they're coming in Q3 or Q4.

Another crystal-ball-having wizard. Please tell me, how saggy are my moobs in 2025?

Quote:

I only got as far as the method, but the method is flawed: it tests the entire lag from the press of a button to the update on the screen.

That's correct, but the method is not flawed.

I actually wrote about this -- further down the article, I explain why this test method is used and why it's more honest than other lag tests. It measures the entire human-chain lag, which makes it easy to mathematically calculate the differential between VSYNC OFF latency and GSYNC latency via the honest, whole-chain method.

It was necessary to do the whole chain:
- to include whatever the display is doing (hardware-based GSYNC latency)
- to include whatever the driver is doing (software-based GSYNC latencies)
- to include however the game reacts to GSYNC (software-based GSYNC latencies)
This makes the full-chain method more honest and less flawed.
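
To illustrate the reasoning, here is a minimal Python sketch (the stage names and millisecond values are invented for illustration, not measurements): if GSYNC can touch the game loop, the driver, and the display all at once, only the whole-chain sum captures the net effect.

Code:

# Hypothetical model of the button-to-pixel lag chain (all numbers made up).
# G-SYNC can plausibly change several stages at once -- game timing, driver,
# and display -- so measuring any single stage in isolation can miss part of
# the effect. Only the whole-chain sum captures everything.
CHAIN_VSYNC_OFF = {"usb_poll": 1.0, "game_loop": 20.0, "driver": 5.0,
                   "scanout": 8.0, "pixel_response": 4.0}   # milliseconds
CHAIN_GSYNC     = {"usb_poll": 1.0, "game_loop": 21.0, "driver": 6.0,
                   "scanout": 9.0, "pixel_response": 4.0}   # milliseconds

total_off   = sum(CHAIN_VSYNC_OFF.values())
total_gsync = sum(CHAIN_GSYNC.values())
print(f"VSYNC OFF whole chain: {total_off:.1f} ms")
print(f"G-SYNC whole chain:    {total_gsync:.1f} ms")
print(f"Net G-SYNC difference: {total_gsync - total_off:.1f} ms")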

I do not trust most sites' hard numbers on input lag without additional data on what part of the chain is measured, and with what.
A lot of sites have not always been specific about which part of the input lag chain they measure:

- Most input lag tests use test equipment rather than real-world games.
- Lag relative to scanout? (Remember, the top edge of the screen has less lag than the bottom edge; see the scanout sketch after this list.)
- Average lag?
- Whole button-to-pixels lag?
- Display-specific lag?
- CRT measurement method?
- SMTT measurement method?
- Leo Bodnar measurement method? (A veritable black box: Does it include HDMI transceiver delay too? Does it account for the ~0.5ms-1.0ms latency of the Vertical Back Porch in the signal timings? Does it measure to the 50% midpoint of the pixel transition, or to "first light" of a pixel?)
- Are you measuring lag from signal input to pixel illumination?
- Including or excluding cable lag?
- Lag to first faint visibility of an LCD pixel (early in the pixel transition cycle)? Pixels don't react instantly, y'know.
- Lag to final full visibility of an LCD pixel (late in the pixel transition cycle)?
- And even CRTs have input lag if you're measuring bottom-edge input lag with a Leo Bodnar, because of the scanout delay (for framebuffered architectures).
- Sometimes it's not easily possible to measure certain parts of the chain.
- I also have a photodiode oscilloscope (same as PRAD and TFTCentral), but it's not suitable for real-world game testing.
- Most of the above lag tests do not measure real-world gaming.
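
To make the scanout point concrete, here is a minimal sketch assuming typical 1080p60 signal timings (the line counts are illustrative; real monitors vary):

Code:

# Hypothetical scanout-lag arithmetic for a 60 Hz panel.
# Real timings differ per monitor; these numbers are illustrative only.
REFRESH_HZ  = 60.0
TOTAL_LINES = 1125   # total scanlines per frame incl. blanking (typical 1080p timing)
VISIBLE     = 1080   # visible scanlines
BACK_PORCH  = 36     # vertical back porch lines before the visible area

line_time_ms = 1000.0 / REFRESH_HZ / TOTAL_LINES

def scanout_lag_ms(row: int) -> float:
    """Delay from the start of frame transmission until scanout reaches `row`."""
    return (BACK_PORCH + row) * line_time_ms

print(f"Top edge lag:    {scanout_lag_ms(0):.2f} ms")           # back porch only
print(f"Bottom edge lag: {scanout_lag_ms(VISIBLE - 1):.2f} ms") # nearly a full frame later

With these assumed numbers, the top edge lags only ~0.5ms (the back porch), while the bottom edge lags ~16.5ms -- which is why a single "input lag" number is ambiguous unless you say where on the screen it was measured.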

They actually end up measuring different parts of the input lag chain, and differential measurements are often inconsistent with non-differential measurements.

Therefore, the method is not flawed, at least when compared to a lot of other lag measurement methodologies.
Or, if you prefer: every lag test method ever done in human history is flawed to varying degrees -- even mine is flawed -- but it is less flawed than other lag tests for this specific purpose: real-world, real-game input lag tests. The procedure is as simple as:
1. Run the full-chain test with VSYNC OFF.
2. Run the full-chain test with GSYNC.
3. Compute the difference.
This doesn't invalidate other lag test methods -- they are all useful for different reasons -- but this is one of the most useful lag tests because it's real-world. Future tests will include LightBoost, ULMB, VSYNC ON, and others.
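
A minimal sketch of those three steps (the sample values below are hypothetical placeholders, not Blur Busters data):

Code:

import statistics

# Hypothetical button-to-pixel samples (milliseconds) from two whole-chain
# passes of the same game scenario -- placeholder values, not real data.
vsync_off_ms = [68, 75, 71, 77, 70, 73, 69, 74]
gsync_ms     = [72, 79, 74, 80, 75, 76, 73, 78]

# Step 3: the differential isolates whatever G-SYNC adds or removes,
# since everything else in the chain is common to both passes.
diff = statistics.mean(gsync_ms) - statistics.mean(vsync_off_ms)
print(f"G-SYNC differential vs VSYNC OFF: {diff:+.1f} ms")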

I do personally believe this is one of the best and most honest input lag tests (real-world human, real-world game, "where it matters") that any review site on the Internet can do to reliably measure the input lag of any brand-new, exciting "miracle voodoo technology" (such as GSYNC). You include whatever the display is doing (hardware-based input latency), whatever the driver is doing (software-based GSYNC latencies), and however the game reacts to GSYNC (software-based GSYNC latencies). The test therefore doesn't miss anything in the chain that GSYNC might be influencing one way or another, which makes the whole-chain method one of the best, most honest ways of measuring input lag from a human perspective.

Um, just like I was saying, there is only (G-Sync vs V-Sync OFF) on those graphs.

But I'm left guessing what the difference would be between (G-Sync vs V-Sync ON) or (V-Sync ON vs V-Sync OFF).

I know there is a huge difference between (V-Sync ON vs V-Sync OFF) because I can feel it; I just always wondered what the measurable difference was.

Edit: just realized the comments say there will be more testing later for what I was pointing out.

Jabbadab says:

Quote:

More interesting would be to see how lag differs with normal vsync on vs g-sync. And maybe even forced triple buffering + vsync on from Nvidia Inspector. And how lag differs between g-sync and adaptive vsync.
Interesting test nonetheless, thanks for that!

Chief Blur Buster says:

Quote:

Agreed. Eventually, we’ll do these tests in additional articles.

These tests were time consuming, so I ran out of time to do VSYNC ON passes — but we already know VSYNC ON almost never has less input lag than VSYNC OFF. However, we definitely want to do more input lag tests, in additional situations where we are interested in seeing lag results (LightBoost, ULMB, new fps_max values, VSYNC ON, other games, etc).

The VSYNC OFF averages of 72ms and 74ms are very similar to the G-SYNC averages of 77ms and 74ms, respectively. The differences between the averages fall well below the noise floor created by Battlefield 4's high variability, so Blur Busters considers them statistically insignificant. During gameplay, we were unable to feel an input lag difference between VSYNC OFF and G-SYNC.
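
As a rough illustration of that noise-floor argument, here is a sketch (the per-sample values are hypothetical, chosen only to match the quoted 72ms/77ms averages with a large spread; the article's raw samples are not reproduced here). With variability this high, a simple Welch's t-test won't flag a ~5ms difference in means as significant.

Code:

import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

# Hypothetical samples centred on the quoted 72 ms / 77 ms averages, with
# the large spread typical of Battlefield 4 (values are made up).
vsync_off = [52, 95, 60, 88, 70, 74, 55, 82]
gsync     = [58, 99, 66, 92, 75, 80, 61, 85]

t = welch_t(gsync, vsync_off)
print(f"Welch t = {t:.2f}  (|t| below ~2 suggests no significant difference)")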

Quote:

Another crystal-ball-having wizard. Please tell me, how saggy are my moobs in 2025?

2025 here. They are quite saggy. Might need a push-up brassiere.

OT though, I'm curious what HDD/SSD was used in these tests, because game code is fallible. Access times to assets can vary by milliseconds, all of which can contribute to skewing the actual results.