
For the past few years, both AMD and Nvidia have been talking up their respective solutions for improving gaming display quality. Nvidia calls its proprietary implementation G-Sync, and has been shipping G-Sync displays with partner manufacturers for over a year, while AMD worked with VESA (the Video Electronics Standards Association) to build support for Adaptive Sync (aka FreeSync) into the DisplayPort 1.2a standard. Now, with FreeSync displays finally shipping as of today, it’ll soon be possible to do a head-to-head comparison between the two.
Introducing FreeSync

AMD first demonstrated what it calls FreeSync back in 2013. Modern Graphics Core Next (GCN) graphics cards from AMD already had the ability to control display refresh rates — that technology is baked into the embedded DisplayPort (eDP) standard. Bringing it over to the desktop, however, was a trickier proposition. It’s taken a long time for monitors that support DisplayPort 1.2a to come to market, which gave Nvidia a significant time-to-market advantage. Now that FreeSync displays are finally shipping, let’s review how the technology works.

Traditional 3D displays suffer from two broad problems — stuttering and tearing. Tearing occurs when Vertical Sync (V-Sync) is disabled — the monitor draws each frame as soon as it arrives, but this can leave the screen looking fractured, as two different frames of animation are displayed simultaneously.
Turning on V-Sync solves the tearing problem, but can lead to stuttering. Because the GPU and monitor don’t communicate directly, the GPU doesn’t “know” when the monitor is ready to display its next frame. If the GPU finishes a frame that the monitor isn’t ready to draw, that frame has to wait for the next refresh cycle, and the result is an animation stutter, as shown below:
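To see why the stutter happens, consider the arithmetic of a fixed 60 Hz panel. The sketch below is a deliberately idealized timing model (a hypothetical illustration, not any real driver’s behavior): with V-Sync on, a finished frame can only appear at the next refresh tick, so a GPU rendering at a perfectly steady 20 ms per frame (50 fps) still produces an uneven on-screen cadence.

```python
REFRESH_US = 16667  # one 60 Hz refresh interval, in microseconds (~16.7 ms)

def presented_at(render_times_us):
    """Idealized model: the refresh tick (in microseconds) at which each
    finished frame actually appears on a fixed-rate panel with V-Sync on.
    A frame that misses a refresh tick waits for the next one."""
    shown, done = [], 0
    for rt in render_times_us:
        done += rt                      # time at which the GPU finishes the frame
        ticks = -(-done // REFRESH_US)  # ceiling division: the next refresh tick
        shown.append(ticks * REFRESH_US)
    return shown

# Six frames rendered at a steady 20 ms (50 fps) on a 60 Hz panel:
shown = presented_at([20_000] * 6)
gaps = [b - a for a, b in zip(shown, shown[1:])]
print(gaps)  # -> [16667, 16667, 16667, 16667, 33334]
```

Even though the GPU never varies its pace, the on-screen intervals come out as four smooth 16.7 ms steps followed by a 33.3 ms hitch, because 50 fps content doesn’t divide evenly into 60 Hz refresh ticks. That periodic double-length frame is the stutter described above.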

FreeSync solves this problem by allowing the GPU and monitor to communicate directly with each other, adjusting the display’s refresh rate on the fly to match the frame rate being rendered.
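As a hypothetical illustration (not AMD’s actual implementation), the adaptive approach can be modeled very simply: the panel refreshes whenever a frame is ready, clamped to whatever refresh range the monitor supports. The range below is invented for the example; real FreeSync panels vary by model.

```python
# Example panel range (made-up numbers; real panels vary by model).
MIN_HZ, MAX_HZ = 40, 144
MIN_US = 1_000_000 // MAX_HZ  # shortest refresh interval the panel allows
MAX_US = 1_000_000 // MIN_HZ  # longest refresh interval the panel allows

def adaptive_gaps(render_times_us):
    """Idealized adaptive-sync model: each on-screen interval simply equals
    the frame's render time, clamped to the panel's supported range."""
    return [min(max(rt, MIN_US), MAX_US) for rt in render_times_us]

# Frame times that vary from frame to frame map 1:1 to screen updates:
print(adaptive_gaps([20_000, 22_000, 18_000, 11_000, 25_000]))
# -> [20000, 22000, 18000, 11000, 25000]
```

Within the supported range, nothing is quantized to a fixed tick: every frame is shown for exactly as long as it took to render, which is why neither tearing nor the periodic double-length frame occurs.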

The result is a smoother, faster gaming experience. We haven’t had a chance to play with FreeSync yet, since the monitors have only just started shipping. But if the experience is analogous to what G-Sync offers, the end result will be gorgeous.

As of today, multiple displays from Acer, BenQ, LG, Nixeus, Samsung, and ViewSonic are shipping with support for FreeSync (aka DisplayPort 1.2a). A chart of those displays and their various capabilities is shown below. AMD is claiming that FreeSync monitors are coming in at least $50 cheaper than their G-Sync counterparts, which we’ll verify as soon as these monitors are widely available in-market.
Which is better — FreeSync or G-Sync?

One of the things that’s genuinely surprised me over the past year has been how ardently fans of AMD and Nvidia have defended or attacked FreeSync and G-Sync, despite the fact that it was literally impossible to compare the two standards, because nobody had hardware yet. Now that shipping hardware does exist, we’ll be taking the hardware head-to-head.

AMD, of course, is claiming that FreeSync has multiple advantages over G-Sync, including its status as an open standard as opposed to a proprietary solution, and the fact that G-Sync can incur a small performance penalty of 3-5%. (Nvidia has previously stated the 3-5% figure; AMD’s own graph actually shows a much smaller performance hit.)

AMD is claiming that FreeSync has no such issues, and again, we’ll check that once we have displays in hand.

There’s one broader issue of “better” we can address immediately, however, which is this: Which standard is better for consumers as a whole? Right now, FreeSync is practically a closed standard, even if AMD and VESA don’t intend that to be the case. If you want the smooth frame delivery that Nvidia offers, you buy G-Sync. If you want the AMD flavor, you buy FreeSync. There’s currently no overlap between the two. To be clear, that lack of overlap only applies to the G-Sync and FreeSync technologies themselves. A FreeSync display will function perfectly as a standard monitor if you hook it up to an Nvidia GPU, and a G-Sync monitor works just fine when attached to an AMD graphics card.

The best outcome for consumers is for AMD, Nvidia, and Intel to collectively standardize on a single specification that delivers all of these capabilities. For now, that seems unlikely. Adaptive Sync has been defined as an optional feature of both DisplayPort 1.2a and 1.3, which means manufacturers won’t be forced to integrate support, and may treat the capability as a value-added luxury feature for the foreseeable future.

How this situation evolves will depend on which standard enthusiasts and mainstream customers embrace, and on whether Intel chooses to add support for DP 1.2a or stay out of the fight. For now, if you buy a display with either technology, you’re locking yourself into the corresponding GPU family.