At CES, various display makers exhibited gaming-grade monitors featuring NVIDIA G-Sync, a display-fluidity technology that's an evolution of V-Sync, and one we've seen with our own eyes make a tangible difference. AMD, at the back room of its CES booth, demoed what various sources are calling "FreeSync," a technology competitive with G-Sync, but one that doesn't require specialized hardware, or licenses for display makers. AMD didn't give out too many details on the finer workings of FreeSync, but here's what we make of it.

FreeSync taps into a lesser-known feature that AMD Radeon GPUs have had for the past three generations (i.e., since the Radeon HD 5000 series), called dynamic refresh rates. The feature allows GPUs to spool down refresh rates to save power, without entailing a display re-initialization (the flicker that happens when a digital display is sent a signal with a new resolution and refresh rate), on supported displays. Dynamic refresh is reportedly also a proposed addition to the VESA specifications, and some (if not most) display makers have implemented it. On displays that do, AMD Catalyst drivers already run dynamic refresh rates. For display makers, supporting the technology won't require buying licenses or integrating specialized hardware into the displays.
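To illustrate the power-saving side of dynamic refresh, here is a minimal sketch of what such a driver-side policy amounts to: pick a lower panel refresh rate when the content doesn't need the full one, without a flicker-inducing mode-set. All names here are made up for illustration; no real driver API is implied.

```python
# Hypothetical sketch (names invented; not AMD's actual driver code) of a
# dynamic-refresh policy: choose the lowest supported refresh rate that
# still covers the content's frame rate, saving power on static screens.

def pick_refresh_hz(content_fps, supported_hz=(30, 48, 60)):
    """Return the lowest supported refresh rate >= the content frame
    rate; fall back to the panel's maximum if nothing covers it."""
    for hz in sorted(supported_hz):
        if hz >= content_fps:
            return hz
    return max(supported_hz)

print(pick_refresh_hz(24))   # 30 -- 24 fps video playback can drop to 30 Hz
print(pick_refresh_hz(60))   # 60 -- gaming keeps the full rate
```

The key point the article makes is that switching between these rates on a supporting display happens without re-initialization, so the user never sees the switch.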

According to AMD's Raja Koduri, the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, and hence NVIDIA had to deploy external hardware. Although the results of FreeSync will be close to those of G-Sync, NVIDIA's technology will have an edge in output quality, because the two are implemented differently, and by that we don't just mean how the hardware is laid out on a flow-chart. The goal of both technologies is the same: to make a display's refresh rate slave to the GPU's frame-rate, rather than the other way around (as with V-Sync).

In AMD's implementation, the VBLANK length (the interval between two refresh cycles, during which the GPU isn't putting out "new" frames) is variable, and the driver has to speculate what VBLANK length to set for the next frame; in NVIDIA's implementation, the display holds onto a VBLANK until the next frame is received. With NVIDIA, the GPU sends out whatever frame-rate the hardware can manage, while the monitor handles the "sync" part. With AMD, the speculation involved in setting the right VBLANK length for the next frame could cause some software overhead on the host system; in NVIDIA's implementation, that overhead is transferred to the display. We're looking forward to AMD's whitepaper on FreeSync. AMD holds the advantage when it comes to keeping implementation costs down: display makers simply have to implement something VESA is already deliberating over. The Toshiba laptops AMD used in its FreeSync demo at CES already support it.
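The core difference described above can be sketched as a toy simulation. This is purely illustrative (the prediction function and frame times are invented, and neither vendor's actual logic is public): the speculative approach must guess each refresh interval before the frame finishes, while the hold-until-frame approach matches it by construction.

```python
# Illustrative sketch, not either vendor's real implementation: comparing a
# driver that must guess the next VBLANK length against a monitor that
# simply holds VBLANK until the next frame arrives.

def speculative_vblank(frame_times_ms, predict):
    """Driver chooses the next refresh interval *before* the frame is
    done rendering; the misprediction per frame is returned in ms."""
    errors = []
    prev = frame_times_ms[0]
    for t in frame_times_ms[1:]:
        guess = predict(prev)          # speculative VBLANK length
        errors.append(abs(guess - t))  # wrong guess = waiting or judder
        prev = t
    return errors

def hold_until_frame(frame_times_ms):
    """Monitor holds VBLANK until the frame arrives, so the refresh
    interval always matches the render time exactly (zero error)."""
    return [0.0 for _ in frame_times_ms[1:]]

# Hypothetical render times (ms) for a game fluctuating around 45 fps.
times = [20.0, 23.5, 19.0, 30.2, 21.1]
naive = lambda prev: prev  # assume next frame takes as long as the last
print(speculative_vblank(times, naive))  # per-frame misprediction in ms
print(hold_until_frame(times))           # zero by construction
```

A smarter predictor narrows the gap, which is presumably where the "software overhead" the article mentions comes in.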

Past 3 generations eh? So they waited until now to tell us this? I call BS, typical AMD PR stunt.


Why so surprised? AMD had something useful and they didn't market it, because their marketing skills are abysmal, and they assumed nobody would care about it. Kinda like that entire section they sold off that's now incredibly successful. They're only quietly marketing it now because NVidia is set to make a killing on GSync monitors.

I honestly don't see how people can play without it. Just seeing one screen tear sends me into inis, control panels, and scrounging the web to find solutions to enable vsync in a game that forgot to add it.

Barely could get through Uncharted and a few other PS3 games because vsync wasn't enabled due to my HDfury adapter breaking it. No, I don't have an HDCP monitor. I bought the gen before everyone suddenly made it standard. Lousy DRM companies.

Lucid Virtu makes almost all my games run with horrible stuttering, even on a single monitor. It's only good at producing a wonderful benchmark score. I guess no one uses Lucid Virtu while actually playing games.


Exactly, which is why I said a software solution to a hardware problem genuinely doesn't work, unless this is some massive breakthrough that we missed a few years ago and nobody knew about. I won't be convinced until I see some real results.


I never said that vsync was creating the tearing - it removes it, yes. I simply summarized the issue in one sentence - I thought people would get the point, but now I have to type this entire paragraph just to explain it again. Vsync was traditionally known for stuttering/lag on some video cards/games; it was never the perfect solution, as gamers had to choose between tearing and lag.

Also, vsync does not get rid of stuttering, it creates it, because stuttering is caused by having vsync enabled when your frame rate drops under the monitor's refresh rate.
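The arithmetic behind that complaint is worth spelling out. A minimal sketch, assuming classic double-buffered vsync (no triple buffering): frames can only be shown on refresh boundaries, so a frame that misses one 60 Hz interval waits for the next, and the effective rate snaps down to an integer divisor of the refresh rate.

```python
# Minimal sketch of double-buffered vsync quantization: each frame is held
# until the next refresh boundary after it finishes rendering, so the
# effective frame rate is refresh_hz divided by a whole number of intervals.

import math

def effective_fps(render_fps, refresh_hz=60):
    """Effective displayed frame rate under double-buffered vsync."""
    render_ms = 1000.0 / render_fps
    refresh_ms = 1000.0 / refresh_hz
    # Whole refresh intervals each frame occupies before it can flip.
    intervals = math.ceil(render_ms / refresh_ms)
    return refresh_hz / intervals

print(effective_fps(60))  # 60.0 -- GPU keeps pace, no penalty
print(effective_fps(50))  # 30.0 -- just misses an interval, halves the rate
print(effective_fps(25))  # 20.0 -- each frame occupies three intervals
```

That jump from 60 straight down to 30 the moment the GPU falls slightly behind is exactly the stutter being argued about here, and it's the problem both G-Sync and FreeSync set out to remove.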


Having vsync off caused tearing.

Having vsync turned on with a badly coded game engine COULD cause stuttering. That's the game's fault, since developers assumed no one had hardware fast enough to run past 60 FPS, and thus never looked into the issue.

Man, that is the worst analysis ever. It doesn't provide any concrete examples or data, it just trashes AMD for free. Plus there are some really immature statements in that "article".


My thoughts exactly, that guy seems like a douche, and provided no information other than "I think GSync looks better on a demo after seeing FreeSync in a backroom on a tiny laptop NOT DESIGNED FOR GAMES".

AMD are just clueless. NVidia demoed their polished GSync, probably on some really nice hardware. AMD clearly rushed out a demo on a crappy little laptop, showing off technology they haven't even bothered to spit-shine. They would have been better off waiting, polishing it up, and then showcasing it on a nice HD monitor with a 290X or something, you know, REALLY marketing it hard, showing how awesome it is on high-end hardware in an AAA title.


Way to miss the point - they wanted to show it CAN work on existing hardware, including where it matters most - crap low-end hardware that can't push 60 FPS.


FreeSync is supposed to be competing against GSync, right? I'm fairly certain people buying GSync monitors have enough cash to splash on a good system, considering they're spending so much on a GSync monitor. If this isn't for the high-end market like GSync monitors, then it isn't competing at all, just another bit of free stuff for everyone.


Patented 7-11 years ago by ATi and implemented 3 years ago in AMD GPUs, along with something they're pushing for in the VESA standard, but didn't improve or capitalize on because it conflicts with their marketing budget and open-standard mentality?

So very, very, very typical of ATi... history repeats itself for the nth time. It sounds like pretty much every baseline hardware block/GPU-use implementation outside hardware T&L for the last decade. ATi takes an idea, implements it, and pushes for it to become a standard in DX/OGL while it goes unused (because, by definition of invention, it's initially proprietary). Nvidia makes a version much later that's based on that initial idea but developed further and pushed harder (because of marketing, or newer fab processes affording them the space to implement it), usually at that point in a needlessly proprietary manner, and then eventually it becomes a standard.

Another entry into the forward-thinking but badly capitalized technology of ati. May they forever be the guinea pigs that break the ice that allows nvidia to publicize it so we all eventually benefit. Hopefully the open version of the tech, now that it is in fashion, is further realized and adopted.

I'mma file this right next to TRUFORM and CTM, and hope this turns out as well as those eventually did and will.