The Waiting Game

NVIDIA G-Sync was announced at a media event in Montreal back in October 2013, promising to revolutionize the way the display and graphics card work together to present images on the screen. It was designed to remove hitching, stutter, and tearing almost completely. Since that fateful day we have been waiting. Patiently waiting. Waiting for NVIDIA and its partners to actually release a monitor that utilizes the technology and that can, you know, be purchased.

In December of 2013 we took a look at the ASUS VG248QE, the display for which NVIDIA released a mod kit allowing owners to upgrade to G-Sync compatibility. It worked, and I came away impressed. I noted in my conclusion that "there isn't a single doubt that I want a G-Sync monitor on my desk" and that "my short time with the NVIDIA G-Sync prototype display has been truly impressive". That was nearly 7 months ago, and I don't think anyone at the time really believed it would be THIS LONG before real monitors began to show up in the hands of gamers around the world.

Since NVIDIA’s October announcement, AMD has been on a marketing path with a technology they call “FreeSync” that claims to be a cheaper, standards-based alternative to NVIDIA G-Sync. They first previewed the idea of FreeSync on a notebook device during CES in January and then showed off a prototype monitor in June during Computex. Even more recently, AMD has posted a public FAQ that gives more details on the FreeSync technology and how it differs from NVIDIA’s creation; it has raised something of a stir with its claims on performance and cost advantages.

That doesn't change the product we are reviewing today, of course. The ASUS ROG Swift PG278Q, a 27-in WQHD display with a 144 Hz refresh rate, is truly an awesome monitor. What has changed is the landscape between NVIDIA's original announcement and now.

Last year, I spent a lot of time learning about the technology behind NVIDIA G-Sync and even spoke with several game developers in the build-up to the announcement about its potential impact on PC gaming. I wrote an article, NVIDIA G-Sync: Death of the Refresh Rate, that looked at the historical background of refresh rates and how they were tied to archaic standards no longer needed in the world of LCDs. We also have a very in-depth interview with NVIDIA's Tom Petersen that walks through the technology step by step in an easy-to-understand way; I would encourage readers to watch it for background on the game-changing feature in this display.

The idea of G-Sync is pretty easy to understand, though the implementation can get a bit more hairy. G-Sync introduces a variable refresh rate to a monitor, allowing the display to refresh across a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating the refresh rate to the PC, in a properly configured G-Sync setup the graphics card tells the monitor when to refresh. This allows a monitor to match its refresh rate to the draw rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.

Gamers today are likely very familiar with V-Sync, short for vertical sync, an option in your graphics card's control panel and in your game options menu. When enabled, it forces the monitor to draw a new image on the screen at a fixed interval. In theory this works well, and the image is presented to the gamer without artifacts. The problem is that games rendered in real time rarely hold a specific frame rate. With only a couple of exceptions, a game's frame rate will fluctuate based on the activity happening on the screen: a rush of enemies, a changed camera angle, an explosion or falling building. Instantaneous frame rates can vary drastically, from 30 to 60 to 90 FPS, yet V-Sync forces each frame to be displayed for a whole multiple of the monitor's refresh interval, and that mismatch causes problems.

If a frame takes more than the monitor's standard refresh time to draw (if the frame rate drops low), you will see a stutter, or hitch, in the gameplay caused by the display having to re-draw the same frame for a second consecutive interval. Any movement tracking across the screen would suddenly appear to stop, then quickly “jump” to the next location faster than your mind expects. V-Sync also inherently adds input latency, which compounds the problem when these stutters occur.
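The arithmetic behind that hitch is simple to sketch. Here is a minimal illustration of my own (not from the review): rounding each frame's render time up to the next 60 Hz refresh boundary shows how a frame that just misses one interval gets held for two.

```python
# Sketch: how V-Sync quantizes frame display times to fixed refresh intervals.
import math

REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh interval at 60 Hz

def vsync_display_time(render_ms):
    """How long a frame stays on screen under V-Sync: the render time
    rounded UP to a whole number of refresh intervals."""
    intervals = math.ceil(render_ms / REFRESH_MS)
    return intervals * REFRESH_MS

for render_ms in (10.0, 16.0, 17.0, 25.0):
    shown = vsync_display_time(render_ms)
    print(f"rendered in {render_ms:5.1f} ms -> shown for {shown:5.1f} ms")
```

A frame that takes 17 ms, just over one interval, is held for ~33.3 ms, twice as long as a 16 ms frame; that doubled hold is the visible hitch.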

The common alternative for gamers worried about latency and stutter is to disable V-Sync in the control panel or in the game. This solves the stutter and latency issues (kind of) but causes another, much more noticeable issue called "tearing". With V-Sync disabled, the graphics card sends a new frame to the monitor at any point in the monitor's refresh cycle, even if the LCD is mid-draw. The result is that, at some point down the screen, the user sees the previous frame above and the current frame below, split at a sharp line. You are literally seeing an image whose geometry no longer lines up, which, depending on the game and scene, can be incredibly distracting.

Monitors with refresh rates higher than 60 Hz reduce this tearing by having more frequent screen refreshes, and thus a tear is less likely to occur in any single refresh cycle, but the tearing is impossible to remove completely.

NVIDIA G-Sync switches things up by having the monitor refresh its screen only when a new frame is ready from the GPU. As soon as the next frame is drawn, it can be passed to the display and put on the screen without tearing. If the next frame is ready in 16 ms, it can be sent immediately. If it takes 25 ms, or only 10 ms, it doesn't matter; the monitor waits for information from the GPU before drawing the new frame. The result is incredibly smooth, fluid animation that doesn't stutter and doesn't tear.

There are a couple of fringe cases that NVIDIA has needed to build for, including frame times above 33 ms (under 30 FPS), where the image on the panel might visibly darken or decay if it isn't refreshed automatically, even when a new frame isn't ready. Also, some games have issues with G-Sync (Diablo III, for example, doesn't have a true full screen mode), and the feature has to be disabled for them, either through a driver profile or manually, to avoid artifacts.
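To illustrate that sub-30 FPS fringe case, here is a small sketch. The ~33 ms panel floor is the figure above; the re-scan behavior is my simplified assumption for illustration, not NVIDIA's actual algorithm.

```python
# Sketch: a panel that must be refreshed at least every ~33 ms (a 30 Hz
# floor) has to re-scan the PREVIOUS frame when the GPU is slower than that.
# Simplified assumption for illustration, not NVIDIA's actual logic.
import math

PANEL_MAX_HOLD_MS = 1000 / 30  # assumed maximum time a frame can be held

def scans_per_frame(frame_time_ms):
    """How many times the same frame is scanned out before the next arrives."""
    return max(1, math.ceil(frame_time_ms / PANEL_MAX_HOLD_MS))

for ft in (16, 33, 50, 100):
    print(f"{ft:4d} ms frame time -> scanned {scans_per_frame(ft)}x")
```

Below 30 FPS every frame gets at least one extra scan-out, which is why the module has to handle this case specially rather than waiting indefinitely.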

Ultra Low Motion Blur Technology

Another feature present on the ASUS PG278Q monitor is ULMB, or Ultra Low Motion Blur. Originally built as part of the NVIDIA 3D Vision infrastructure, ULMB decreases motion blur on the screen and removes or reduces ghosting of fast-moving images. It does this by turning on the backlight in time with the screen refresh and then quickly darkening it after the pixels have been “strobed”. The effect is that, with ULMB enabled, images are sharper and appear to have less motion blur from frame to frame.

This sounds great! But the side effect is a much lower total brightness perceived by the gamer. Just as we saw with 3D Vision throughout its development, enabling this mode effectively cuts the light output of the screen in half. For some gamers, and in some situations, the trade-off will be worth it. Certain genres, like RTS games that feature lots of small text and units scrolling across the scene very quickly, can see dramatic sharpness increases.

It's difficult to capture with stills, but with ULMB enabled, animations are darker yet sharper

It's important to note that ULMB can only be used when G-Sync is disabled, and it only works at 85 Hz, 100 Hz, and 120 Hz. Most games, at least in my experience thus far, will see much more benefit from the variable refresh rate technology of G-Sync than from ULMB. If brightness is a concern (say, playing in a well-lit room), ULMB could be a non-starter, as the halved light output is very noticeable.

Enabling ULMB is as easy as navigating the monitor's menu and selecting it; you can also adjust the strobe pulse width there. I tested the capability through the fantastic website testufo.com, which offers users a host of options to test the motion blur of their displays. It was easy to find instances in which the ULMB feature allowed for sharper animations, but the brightness difference was also very apparent.
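The brightness penalty scales with that pulse width. A rough model of my own (the 300-nit full brightness and 4 ms pulse are hypothetical numbers, not ASUS specs): perceived brightness of a strobed backlight is roughly the full output times the duty cycle.

```python
# Sketch: time-averaged brightness of a strobed backlight.
# All figures are hypothetical, for illustration only.
def perceived_brightness(full_brightness_nits, pulse_width_ms, refresh_hz):
    """Approximate perceived brightness: full output scaled by the fraction
    of each refresh interval the backlight is actually on (the duty cycle)."""
    interval_ms = 1000 / refresh_hz
    duty_cycle = min(1.0, pulse_width_ms / interval_ms)
    return full_brightness_nits * duty_cycle

# A hypothetical 300-nit panel at 120 Hz with a ~4 ms strobe pulse:
print(perceived_brightness(300, 4.0, 120))  # 144.0 nits, roughly half
```

At 120 Hz the refresh interval is ~8.3 ms, so a 4 ms pulse gives a duty cycle near 50%, consistent with the halved light output described above.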

Question on resolution: how does the desktop look at, say, 1080p if you didn't want to scale Windows settings to 125% (since some apps don't like that)?
I see comments all the time that anything but native gives blurry text.
I have an old Dell 2007FP at 1600x1200 and run a custom res and don't really notice any degradation.

Thanks for quoting me, but I was talking about people saying one card 'felt smoother' than another card, where frame pacing had been fixed on both cards. My point was: how could people 'feel' that particular difference when the actual pacing difference between the two platforms is negligible?

...now in this context, pretty much anyone you sit down in front of a GSYNC display is definitely going to see the difference. It pretty much smacks you in the face, especially at low FPS.

Editor's Choice would be higher than Gold, yes. And you had it right - the exceptionally high price is what led me to not dive 100% in with this product. And also we have a 4K G-Sync option coming from Acer and a 1080p AOC model coming soon as well.

Hey Ryan, longtime… I don’t know if I’d be so generous with even Gold. Would you please explain the technical reasons why the G-Sync module appears to have limited manufacturers to one DisplayPort, not even bypassing or rerouting for the monitor to switch to other sources?

Also, of the several monitors I have found being released shortly, none uses anything better than color-challenged, off-axis-fading TN; are there legitimate technical reasons for this limitation? Are there no high-quality panels capable of meeting G-Sync's or FreeSync's requirements? Relatively speaking, anything TN is not high-quality. I would really appreciate knowing and cannot find a legitimate answer; thank you for your time!

ULMB still has workarounds for the VG248QE that do work, but ULMB on something like the ROG Swift will be very difficult to crack because it's a custom scaler designed by Nvidia to drive the monitor. Even Nvidia has been paying attention to how people are using ULMB, disabling it for GSync.

If you'd like any indication of how much Nvidia hates supporting ULMB, they appear to have gotten ASUS to disable it entirely for any Radeon GPUs, even though the option is in the monitor's OSD menu and not part of GSync at all.

I believe the dip Linus spoke of is not really a dip (or gap). When we made that second chart (showing the gap/judder is gone), I made sure I was watching it run. The thing is that when FPS drops into the 20's, the delay between frames really becomes noticeable; it's really a matter of your brain no longer seeing motion and instead seeing individual frames. The transitions into and out of such low FPS figures were smooth, even though watching video at such a low frame rate is jarring to the eye.

I mean, it does for the majority of games. But even then, SLI by nature adds more input lag into the mix and will ALWAYS add SOME frame pacing issues (NVIDIA is WAY ahead of AMD on this one, but still).

If your goal is G-Sync and MAX possible smoothness, then more than one video card IS a mistake, since it's counterproductive to that goal.

You have no idea what you're talking about... almost every online review and video showcasing this monitor has used SLI and all have said it runs smoothly, which is the whole purpose of G-Sync. *sigh* People and their (F*ck Logic!) statements...

I'd be more interested in this if they offered a stand-less option. I use a triple monitor setup and, as long as it has VESA, I would happily forgo the fancy lights and adjustments for a price reduction.

Read "as long as it has VESA" as a statement about monitors in general and "I would happily forgo the fancy lights and adjustments for a price reduction" as emphasis on price reduction. I have this prejudice against paying for things I know I'm not going to use.

Holy cow, get this reviewer a glass of water! Hearing him struggle to swallow was enough to make me register to make a comment! Guys, we're totally OK with you having a drink of water while doing a review!

I think that once this sync tech has been out for quite awhile and prices drop a lot(for both versions of it), then I might consider upgrading my monitor and gpu. But in the mean time, my current setup provides a very smooth gaming experience.

I've got the Asus VG278HE 144Hz 1080p strobing monitor and this is the sort of monitor I want to upgrade it to.

It's not enough though. I'm gonna wait until we see a 2160p version of this Asus ROG monitor that still works at 144Hz and has G-Sync. I will then upgrade my graphics card to maintain a solid 120/144Hz framerate. It might take up to two years for the monitor and NVIDIA graphics cards to become available, but I'm in no hurry and money isn't too much of a barrier, either.

Are you the kind of idiot who wants a brand new Ferrari for $20K and expects 100 MPG from it and the ability to ferry 5 kids around?

* Poor Panel Choice: a TN monitor; colors fade with the movement of your head

TN is TN. It's used for this monitor because of its vastly superior refresh rates and response times, which are among the key selling points of this monitor. If they had used an IPS panel, the monitor wouldn't have this level of performance (speed).

If you want to edit photos etc., this is not a good monitor, I agree.

* Ultra Low Motion Blur (ULMB) does not work in conjunction with G-Sync

No one said it did, Asus clearly state as much. It's a shame it doesn't - I assume there is a technical reason?

* One connector, no legacy video connections, one video input, DisplayPort only

Doesn't bother me personally; if you wanted VGA (on a 2560x1440? LOL) then fair enough, check the specs. I believe it needs a DisplayPort input because HDMI doesn't have the bandwidth to run this resolution at 144 Hz.
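For what it's worth, the raw numbers back that up; a quick back-of-the-envelope sketch (my arithmetic; the effective link rates come from the HDMI 1.4 and DisplayPort 1.2 specs):

```python
# Back-of-the-envelope check: does 2560x1440 @ 144 Hz fit on each link?
def video_data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw active-pixel data rate in Gbps, ignoring blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

rate = video_data_rate_gbps(2560, 1440, 144)
print(f"needed: ~{rate:.1f} Gbps of pixel data (plus blanking overhead)")
print("HDMI 1.4 effective video rate: ~8.2 Gbps  -> not enough")
print("DP 1.2 (HBR2) effective rate: ~17.3 Gbps -> fits")
```

About 12.7 Gbps of pixel data alone, before blanking overhead, is well past what HDMI 1.4 could carry, so DisplayPort was the only practical choice at the time.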

* Inconsistent Bezel Thickness: bottom is much thicker than the sides and top

Agreed; this is annoying if you want to put monitors above/below each other or side by side in a vertical orientation.

This is your only valid point as far as I'm concerned, as I don't believe they make this clear before purchase.

* 2560 x 1440, 16:9; should be 2560 x 1600, which is 16:10

No, they should use 4K, 8K, 1080p, 1920x1200... why SHOULD it be 16:10 rather than 16:9? Do you just WANT it to be 16:10, and therefore you shall decree it?

So far it seems like the new driver released today (344.11) solved the out of range bug. Installed it this AM and I've been running at 144 Hz all day heavily gaming with no problems.

I am more than happy with the monitor. Yeah, it's overpriced (hopefully it'll come down a bit in price later), but G-Sync is just a total game changer.

The panel, despite being TN, is stunning.

As a previous poster noted, it's odd that they chose to make the bottom bezel wider than the other three sides. Not a real problem for me, or most people, but it would ruin the effect if you wanted to do a 4- or 6-display multi-monitor setup in a rectangle.