AMD claims it can offer the benefits of Nvidia’s G-Sync with a free driver update, Nvidia rebuts – fight!

At CES this week, AMD made an unusual announcement about Nvidia’s new G-Sync technology. According to the company’s senior engineers, AMD can replicate many of the advantages of Nvidia’s G-Sync tech through the use of what are called dynamic refresh rates. Multiple generations of AMD video cards have the ability to alter refresh rates on the fly, with the goal of saving power on mobile displays. Some panel makers offer support for this option, though the implementation isn’t standardized. AMD engineers demoed their own implementation, dubbed “FreeSync,” on a laptop at the show.

Dynamic refresh rates would theoretically work like G-Sync by specifying how long the display remained blank on a frame-by-frame basis, providing for smoother total movement. AMD has stated that the reason the feature didn’t catch on was a lack of demand — but if gamers want to see G-Sync-like technology, AMD believes it can offer an equivalent. AMD also told Tech Report that it believes triple buffering can offer a solution to many of the same problems G-Sync addresses. AMD’s theory as to why Nvidia built an expensive hardware solution for this problem is that Nvidia wasn’t capable of supporting G-Sync in any other fashion.
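To make the mechanism concrete, here is a conceptual sketch of how a variable-refresh presentation loop works in principle. It is illustration only, written in Python; the function names are invented placeholders, not any real driver or panel API.

# Conceptual sketch only -- the functions below are hypothetical placeholders,
# not a real driver or panel API. The idea behind a dynamic refresh rate:
# instead of redrawing on a fixed clock, the GPU holds the panel in its
# vertical blanking interval until the next frame is actually finished, then
# scans it out immediately (forcing a redraw once a minimum rate is reached).

MIN_REFRESH_HZ = 30                        # assumed floor before a forced redraw
MAX_BLANK_MS = 1000.0 / MIN_REFRESH_HZ     # longest the panel may stay blanked

def present_loop(render_next_frame, scan_out, wait_for_frame_or_timeout):
    while True:
        frame = render_next_frame()        # the GPU renders at its own pace
        scan_out(frame)                    # the panel draws the finished frame once
        # A fixed-refresh panel would redraw every 16.7 ms whether or not a new
        # frame exists. Here the panel simply stays blanked until the next
        # frame is ready, or until the minimum-refresh timeout expires.
        wait_for_frame_or_timeout(MAX_BLANK_MS)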

Nvidia rebuts

Nvidia, unsurprisingly, has a different view of the situation. Tech Report spoke to Tom Peterson, who stated that the difference between a laptop and a desktop running a software equivalent to G-Sync is that laptop displays are typically connected using embedded DisplayPort or the older LVDS standard. Standalone monitors, in contrast, have their own internal scaler chips, and those typically don’t support a variable refresh rate.

I think Nvidia is probably being honest on that score. The G-Sync module is fairly hefty, pairing an FPGA with 768MB of onboard memory, and it currently fits only a limited number of monitors. Nvidia has a long interest in keeping its technology proprietary, but it also has reasons to extend G-Sync as widely as possible for as little up-front cost as possible. A G-Sync upgrade kit for $50 that fits any modern monitor would sell more units than a $100 or $150 kit that only fits a limited number of displays or requires a new LCD purchase.

Nvidia’s G-Sync includes a 768MB buffer combined with a custom FPGA.

It’s entirely possible that both companies are telling the truth on this one. AMD may be able to implement a G-Sync-like technology on supported panels, and it could work with the manufacturers of scaler ASICs if G-Sync starts catching on for Nvidia. Nvidia, meanwhile, is probably telling the truth when it says it had to build its own hardware solution because existing chips for desktop displays weren’t doing the job.

Whether this works out to a significant halo for Nvidia in the long run or not will come down to price and time-to-market. In the past, Nvidia took the lead on computing initiatives like PhysX and CUDA, getting out in front on technical capability, while industry-wide standards followed along at a slower pace. The impact on the consumer market has been mixed — PhysX definitely delivered some special effects that AMD didn’t match, but CUDA’s impact on the consumer space has been small (its HPC success is another story altogether).

The difference between these technologies and G-Sync is that monitors are fairly long-lived. Buy a G-Sync monitor today, and you have the benefits for five years or more. Some games benefit from G-Sync more than others, but once Nvidia smooths out the development pipeline, we should see a consistent stream of titles that run better in that mode. It’s not like hardware PhysX, which was never supported by more than a handful of major games in any given year. In the long run, if panel makers start building variable refresh rates into their own displays, then the need for Nvidia-specific G-Sync technology may fade out — but that doesn’t mean the company can’t make a pretty penny off the concept while it lasts. And since it’ll take time for panel manufacturers to adopt the capability if they choose to do so, Nvidia has a definite window of opportunity on the technology.

Comments

LtMatt

Nice article Joel.

IKROWNI

This is going to be great for the gaming community. Get it free now and then grab the hardware later if you want, at a driven-down price. Good job AMD. Looks like they are really pushing to dethrone Nvidia this year.

Heath Parsons

They dethrone each other every other release of products.

Naipier

Great article! I love how, even though Joel stated his opinion, he backed it up with concrete details. James Plafke could learn a lot from Joel.

Joel Hruska

Well, in this case, the opinion section is the impact of G-Sync on the monitor market — and I don’t think we know enough yet to predict whether customers will or won’t go for it. The benefit has to be significant, easy to demonstrate, and broad.

When it comes to the technical capabilities of the respective products, there’s no reason to doubt AMD’s claim that it can drive a monitor display using variable refresh rates in the GCN chip — and no reason to doubt NV’s claim that it built its own scaler solution because existing chips in desktop monitors can’t offer the capability. We know that desktop monitors have their own scalers, and if they *could* offer the capability, Nvidia wouldn’t have needed its own chip.

Naipier

There was more embedded opinion than that, but it was well supported. Either way, my comment was more of a compare and contrast than a nit-pick.

Phobos

So AMD can offer something similar with just a driver update, then? If so, that will be awesome.

Daniel Moreno

It can offer something similar with a driver update only IF the monitor has a variable refresh rate, which most don’t (laptops aside).

davidcianorris

DP 1.3 is the Answer… AMD has just replied to nVIDIA’s claim. I see a dark future for G-sync to be honest…

Daniel Moreno

DisplayPort 1.3 is only the answer if the display’s controller also supports a variable refresh rate, which would seem logical, but would also drive up costs. Unfortunately, in your scenario, people would still be required to buy a new monitor.

davidcianorris

A new monitor way less expensive than a “G-sync ready” one…

Shadow

How do I know if my monitor has a variable refresh rate? I checked in the settings and it can go from 120Hz down to 23Hz, 24, 25, 29, 30, 50, 59, and 60Hz, all selectable.

Does that mean it could work on it? It’s a Samsung screen.

standard

And that, ladies and gentlemen, is why competition is a good thing.

ephemeris

Yeah, think of separate refresh rates in disparate open windows at the same time on the desktop. Or even in different objects within a window. Possible? A problem with the monitor? An operating system problem?

That is real computing.

Sorry if this has nothing to do with G-Sync, or if I’m being erroneous. Wishful thinking, maybe?

200380051

It’s a hardware-related matter. It has to do with how and when the display and graphics subsystem present something to the user – it’s all about synchronizing both gracefully, so no more power is used than necessary, all while the user gets content displayed right when it’s available.

It goes no further than the panel itself; per-window or per-component refresh, or any further subdivision of your screen estate, isn’t what these features are about.

Jason

If AMD’s method can produce results that are comparable to G-Sync, then that certainly sounds like a better method. The less proprietary hardware, the better. I can picture a market where most monitors support variable refresh rate. I can’t picture a market where most monitors have an nVidia chip in them.

VirtualMark

“Dynamic refresh rates would theoretically work like G-Sync by specifying how long the display remained blank on a frame-by-frame basis, providing for smoother total movement”

– how would the total movement be smoother than having a constant high refresh rate? Sure, it can save power, but I don’t see how it can make things smoother.

Zylvur

It can ‘reduce’ tearing and stuttering because the panel can harmonize with the data feed from the GPU.

VirtualMark

V-sync at 60fps eliminates tearing… so I don’t see any advantage here?

Joel Detrow

V-sync On eliminates tearing but can introduce stutter and a tiny bit of input lag. V-sync Off eliminates stutter but introduces tearing.

Monitors typically operate at a constant refresh rate, refreshing the frame presented by the video source every (for example) 1/60th of a second. With V-sync on, the video driver will wait to present a new frame until the monitor has finished drawing the last one. If the source rendered the next frame extremely quickly, it will have to wait almost an entire refresh before showing the next frame, which introduces a tiny bit of lag between user input and the visual feedback.

If frame render time varies while still falling within 1/60th of a second, the user may notice their input occasionally lagging slightly. Stutter comes when the next frame isn’t quite ready by the time the next refresh comes around, which results in the video card presenting the same frame for -two- refreshes, which is visible as a stutter.

With G-sync and its equivalents, that last scenario would allow the monitor to wait up to an entire refresh cycle for the next frame to become available before refreshing. Because, in most cases, the frame will only need a small fraction of the next 1/60th of a second to finish being rendered, the delay is much less noticeable than if it were to have to wait an entire frame refresh.

Put another way, the variation between frame times under V-sync is normally either invisible (always rendering above 60 FPS) or clearly visible as ugly stuttering if the frame rate drops below 60 for even a single frame render. With G-sync, the frames are displayed by the monitor as quickly as they can be rendered, up to the monitor’s maximum refresh rate, which means the variation of the time between two frames will always be very small, even unnoticeable.
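To put rough numbers on that, here is a quick back-of-the-envelope sketch in Python. It assumes a 60Hz panel and frames that each take 18 ms to render, and it simplifies away buffering details; the figures are illustrative only.

import math

REFRESH_MS = 1000.0 / 60.0             # one refresh interval on a 60Hz panel

def vsync_display_times(render_times_ms):
    # V-sync on: a finished frame waits for the next fixed refresh tick, so an
    # 18 ms frame ends up occupying two full refreshes (33.3 ms on screen).
    shown, t = [], 0.0
    for render in render_times_ms:
        ready = t + render
        tick = math.ceil(ready / REFRESH_MS) * REFRESH_MS
        shown.append(round(tick, 1))
        t = tick
    return shown

def variable_refresh_display_times(render_times_ms):
    # G-Sync-style variable refresh: the panel scans out as soon as a frame is
    # done, limited only by its maximum rate (16.7 ms here, which is below the
    # 18 ms render time, so the cap never kicks in).
    shown, t = [], 0.0
    for render in render_times_ms:
        t += max(render, REFRESH_MS)
        shown.append(round(t, 1))
    return shown

frames = [18.0] * 4
print(vsync_display_times(frames))             # [33.3, 66.7, 100.0, 133.3]
print(variable_refresh_display_times(frames))  # [18.0, 36.0, 54.0, 72.0]

Under V-sync, every 18 ms frame spends a full extra refresh on screen, which is the stutter described above; with variable refresh the display cadence simply tracks the render time.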

It’s essentially the benefits of both V-sync on and off without the shortcomings of either. You’d be able to turn your settings up to operate in an optimal range without worrying as much about the frame rate. 120 Hz monitors are looking quite attractive now, eh?

Joel Hruska

Thank you Joel, for the excellent explanation.

I don’t normally cross link between employers, but I’m going to make an exception in this case.

There’s an actual video you can watch that’s recorded at a high enough frame rate to illustrate the G-Sync vs. V-Sync difference.

Joel Detrow

Thanks for the link, it’s definitely noticeable in that video. Here’s hoping the solution AMD’s pushing takes off so that everyone, not just Nvidia’s customers, can enjoy that kind of experience.

D.B.

NVIDIA did not invent PhysX; Ageia invented the PhysX engine, and NVIDIA later bought it. Anyway, it seems to me that NVIDIA is trying very hard to patent something new, some new technology. I have both cards, but I tend to prefer my ATI over Nvidia. Not because of performance but for options. AMD is the king of customizing your multiple displays, and competition is good for us, their customers :).

SirGCal

It never stated that NVIDIA invented PhysX… It simply stated that they took the initiative with it, meaning they pushed it harder than the competition and eventually bought the company, etc. So I’m not sure what that statement is doing in your retort…

I used AMD for many, many years but gave away my pair of 7970s for 690s for multiple reasons. Mostly, though, they had issues going above 60Hz effectively; I constantly got the ‘snow’ crash. Never a problem with the NVIDIA cards. But the frame delay in multi-card setups got me too. So my last few systems have been NVIDIA-focused until AMD cleans that up, which it seems to be doing on the multi-card issue anyhow.

Joel Hruska

That’s very odd that you have >60Hz problems. I’ve used multiple Asus VG278 displays at 120-144Hz on both AMD and NV hardware without ever having a problem from any vendor.

Were you using HDMI, DP, or dual-link DVI?

Angel Ham

“Nvidia has a long interest in keeping its technology proprietary”

Looks like the folks at nVidia haven’t learned anything from Sony or Rambus.

SirGCal

Not sure they mean the same thing. They can keep the tech proprietary and still give it out, like Blu-ray. Sony gets a cut of every Blu-ray setup. NVIDIA could simply get a cut of every monitor that uses its setup (and this cut is tiny, but pays off big time in bulk). They could also license it to AMD for a ‘small’ fee. Whether AMD will pay for it is the question. And it actually does make a nice difference, so AMD might pick it up for its best cards, perhaps. That is the possibility while still keeping it proprietary.

Angel Ham

The thing is, no matter how superior the technology is or how cheap it gets, it is still no match for “free” if all it takes for OEMs to make it work is an AMD software update.

Joel Hruska

There’s nothing intrinsically wrong with proprietary standards. CUDA has done great things for NV in the HPC space. PhysX didn’t really catch on as a major driver of the GPU market, but it offered its own advantages.

Rambus didn’t turn itself into a hated caricature of a company because it had proprietary tech. It turned into a punching bag because it attended JEDEC meetings while secretly filing patents on the improvements discussed therein.

I did a great deal of research on that situation back in the day. Intel absolutely wanted to corner a new high-speed memory market for itself, but it made no secret of that fact. Even the preferential stock deal was the icing on the cake, not a secret revelation of intent. The irony was, the major DRAM manufacturers actively worked against Rambus and kept right on using its IP even after they knew about the patents. Rambus actively worked against the DRAM manufacturers, threatening terrible licensing deals if cases went to court. The only company that actually seemed to be working in good faith (albeit towards its own self interest) was Intel.

Nvidia hasn’t really done anything like this. And as for Sony, they persistently created great tech standards, then attempted to charge ludicrous amounts of money for them, or made them very user-hostile. PhysX isn’t user-hostile. Neither is CUDA.

Angel Ham

But how does it benefit the OEMs? I mean, every little cost adds up eventually, and it sounds like AMD’s solution is going to be cheaper for them unless nVidia steps up its game and brings an integrated solution rather than an expansion slot (which I wouldn’t be surprised to see happen in the future). I guess it’s going to be up to the early adopters to decide whether the option floats or sinks. But my bet is on the “inferior but free” solution that AMD is offering.

I find it laughable whenever there’s news about Rambus showing off new RAM technology that promises superior performance to what is out there on the market, but it always ends up that NO ONE wants to manufacture it. I mean, Nintendo has burned a lot of bridges, but the people at Rambus have burned the bridges, slaughtered the horses, poisoned the well, and spread salt all over the fields.

Joel Hruska

I don’t think this helps or hurts the OEMs. It’s like saying that 4K hurts OEMs because it’s more expensive. If an OEM bets the farm on 4K, and 4K flops, then yeah, hey, 4K flops and that OEM goes out of business. That’s why you don’t bet the farm.

Nvidia built the ASIC to support this capability. If the cost of supporting that ASIC is relatively low, then the risk to OEMs in doing so will be correspondingly low. Sure, it’s always possible that the OEMs will put in the effort and get no reward, but that could happen even with an open standard. If AMD works with Broadcom to create a TV ASIC with variable refresh, and then Broadcom doesn’t sell the numbers it expected, that still hurts them, too.

Angel Ham

4K resolutions are not owned by one company. If anything, 4K is a must for OEMs if they want to get back to higher-margin displays now that HDTV prices have come down so far and profits are so small.

nVidia’s G-sync is owned by one company, and your screen can live perfectly well without it. The same goes for AMD’s “free” option.

Ugh, feels like a format war is coming over the horizon..

Joel Hruska

It’s not about who owns something — just how much it costs an OEM to take a risk on it.

AMD’s solution won’t work unless OEMs adopt certain panels or build with different ASICs. Those ASICs will need to be implemented. We can safely assume that will impact cost, because swapping ASICs out for more advanced parts that can handle variable refresh *will* impact cost.

So there’s going to be a risk on both sides. If NV brings its solution to market first (and it has), then maybe NV makes significant money on small numbers of luxury displays, as does the OEM. Then cheaper volume production comes later on an open standard.

Angel Ham

I think I’m gonna stick with “normal” displays until these format wars are over. Let the hard-core crowd and the early adopters decide the winner and then go back to business as usual.

zapper

WTF
I dunno why such a fuss is being made over such a small issue. This appears to be as simple as synchronization. Say the monitor refreshes at 60Hz and the GPU puts out 100 FPS in a game; let the GPU slow down and output 60 FPS, watching when the scene needs to be updated in sync with the monitor. So simple, and if these guys have not done this homework, I dunno what to say about their IQ.

wuzelwazel

Great Scott! You’ve single-handedly solved the greatest problem of realtime computer graphics! We simply force the GPU to render at 60 frames per second! We can call it V-Sync and everyone will love it!

JD Rahman

Alright, so the Oculus Rift is making progress by drawing blank frames and letting our brains fill in the gaps, in order to synchronize head movement with content without latency.

I’m calling absolutely flaming horseshit here. If AMD knew what they had, they would’ve been capitalizing on it long ago. I’ll bet any money that, when pitted against G-Sync, there will be no comparison. This is just a ploy to try and salvage whatever crumbs AMD can scoop up.

Joel Hruska

“If AMD knew what they had.”

AMD says the technology to support variable refresh rates is already baked into GCN. Clearly that didn’t happen by accident. No one blinks, looks around, and says: “By George, we baked a non-standard alternative into our graphics core!”

” I’ll bet any money that, when pitted against G-Sync, there will be no comparison.”

You don’t seem to understand the technological issue in question. Nvidia built an FPGA + additional memory to provide a functionality that isn’t baked into a standard monitor scaling chip. Nvidia does not contest AMD’s claim that it can provide a G-Sync-equivalent experience using a panel that supports variable refresh rates. Nvidia’s point is that such panels depend on display standards that are not prevalent in the desktop world.

So AMD is, so far as we know, being honest that it can provide a G-Sync equivalent experience in certain circumstances, and that those circumstances are mostly confined to the mobile space and limited right now, even there. And Nvidia is being honest when it says it built G-Sync precisely because most monitors use fixed refresh rate ASICs.

The one thing Nvidia *isn’t* doing is saying: “AMD can’t provide this functionality in these use-cases.”

John Mellinger

Comparing G-Sync to AMD’s FreeSync is like comparing NVAPI to Mantle… they are two different things.

If it is as good as reported, VFR may be the next revolution in the gaming market. But if I understand it well, it’s only useful for the latest games, in which GPUs aren’t able to deliver 60fps. If the GPU is able to keep up with a display’s max refresh rate, then VFR is not needed at all. It may even make things worse, slowing the frame rate below what it should be. My humble prediction: in the next few years we’ll see displays with max refresh rates of 120Hz-240Hz, capable of refreshing as soon as the GPU finishes building a frame.

Carlos Diz

What Nvidia is doing is getting display manufacturers off their asses to keep up with development!
Why aren’t screens capable of this already?
Why, after all this time of talking about tearing and stuttering, have screen manufacturers just ignored their responsibility in all of this?
They can make a screen with all the Hz they want, but they haven’t made it read what refresh rate to use depending on the device connected.
If my Blu-ray player says this video only shows 25fps, then the optimal refresh rate should be activated. If my console is holding 59 (ish) fps, the screen should match it.
And when I eventually connect my PC to it, I should also have it reading what the GPU says. I am sure this will not go the way Nvidia wants, but it will surely push screen manufacturers in the right direction.
AMD is just buying time with the whole “driver update” thing. How is a driver update going to let you lose tearing and stuttering at frame rates that range from 80 to above 150 on screens with 120Hz and 144Hz? I don’t want to lose the better frame rates to compensate for occasional drops. I will gladly pay for a screen that keeps up with my GTX 780 at all times, regardless of what game I play and whatever the FPS or Hz. In this day and age, all of that should not need my intervention.
If you hate making profiles for all your 100+ games, Nvidia gave us “Experience”; AMD will probably have an update lol, eventually pfffft…
Now Nvidia takes Experience to the next level and makes yet another move that will make my PC ready to play any title without even having to look at the settings.

TLDR: I love you Nvidia!
(Lol at anyone who rages over this) =P

Shorty20122012 .

Out of all the companies making monitors and TVs today, it took someone this long to come up with this idea? I’d say way to go, Nvidia, for capitalizing on a great idea that should have been implemented in TVs and monitors 7+ years ago.
