Nvidia responds to AMD FreeSync

Well, it was bound to happen sooner rather than later: Nvidia has spoken out about AMD's FreeSync. The interview is with Tom Petersen, who spoke with the guys from The Tech Report, so credit for everything related to this post goes to them. Basically, Petersen says he's excited to see a competitor take an interest in this, but he claims that, as things now stand, it will be close to impossible to implement FreeSync on a desktop display, because desktops use a different display architecture than laptops.

Here's that snippet from the guys at The Tech Report:

CES — On the show floor here at CES today, I spoke briefly with Nvidia's Tom Petersen, the executive instrumental in the development of G-Sync technology, about the AMD "free sync" demo we reported on yesterday. Alongside the demo, a senior AMD engineering executive asserted that a variable refresh rate capability like G-Sync ought to be possible essentially for free, without adding any extra costs to a display or a PC system. Petersen had several things to say in response to AMD's demo and claims.

He first said, of course, that he was excited to see his competitor taking an interest in dynamic refresh rates and thinking that the technology could offer benefits for gamers. In his view, AMD's interest was validation of Nvidia's work in this area.

However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.
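As a rough sketch of the architectural difference described above (the stage labels are illustrative, not taken from any spec): laptops drive the timing controller (TCON) almost directly, while desktop monitors put a scaler ASIC between the link and the TCON.

```python
# Illustrative sketch of the two display chains Petersen contrasts.
laptop_chain = ["GPU", "eDP/LVDS link", "TCON", "LCD panel"]
desktop_chain = ["GPU", "DP/HDMI link", "scaler ASIC", "TCON", "LCD panel"]

# The stages unique to the desktop path are where variable refresh gets
# blocked -- and the scaler is the part the G-Sync module replaces.
extra_stages = [s for s in desktop_chain if s not in laptop_chain]
```

The extra hop through the scaler, rather than anything about the panel itself, is Petersen's argument for why a module swap was needed on desktops.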

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.
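To make the vblank mechanism concrete, here is a hypothetical timing sketch (my simplification, not Nvidia's implementation): with a fixed refresh rate, a finished frame waits for the next scan-out tick, while stretching the vblank interval lets the panel scan out as soon as the frame is ready, clamped to the panel's supported refresh range.

```python
import math

def fixed_refresh_display_times(frame_ready, period=1000 / 60):
    """Each frame appears at the next fixed 60 Hz tick after it is ready (ms)."""
    return [math.ceil(t / period) * period for t in frame_ready]

def variable_refresh_display_times(frame_ready, min_interval=1000 / 144,
                                   max_interval=1000 / 30):
    """Vblank is stretched so each frame appears as soon as it is ready,
    clamped to an assumed 30-144 Hz panel refresh range (ms)."""
    times = []
    last = 0.0
    for t in frame_ready:
        earliest = last + min_interval  # panel can't refresh faster than this
        latest = last + max_interval    # panel must refresh before this
        shown = min(max(t, earliest), latest)
        times.append(shown)
        last = shown
    return times

frames = [5.0, 30.0, 55.0, 90.0]  # irregular frame-completion times (ms)
fixed = fixed_refresh_display_times(frames)
variable = variable_refresh_display_times(frames)
```

With irregular frame times, the variable-refresh schedule tracks frame completion closely, while the fixed schedule rounds every frame up to the next tick, which is the stutter G-Sync is meant to remove.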

That said, Nvidia won't enable G-Sync for competing graphics chips because it has invested real time and effort in building a good solution and doesn't intend to "do the work for everyone." If the competition wants to have a similar feature in its products, Petersen said, "They have to do the work. They have to hire the guys to figure it out."



JonasBeckman
Senior Member

Posts: 13589
Joined: 2009-02-25

#4739819 Posted on: 01/08/2014 06:17 PM
Again, that's eDP, not DisplayPort, if we want to be accurate. It most likely ran on eDP because DP is the smallest video connector, so embedding a DP interface is a lot easier in laptops (where components have to be smaller).

I still haven't seen anything to say that this can't be done on DVI

It includes a new Panel Self-Refresh (PSR) feature developed to save system power and further extend battery life in portable PC systems. PSR mode allows the GPU to enter a power-saving state in between frame updates by including framebuffer memory in the display panel controller.

Something about these features not being supported under DVI, HDMI or even regular DisplayPort 1.2 currently, thus AMD demoing this using eDP on a laptop, or so I understood it. (Apparently DP 1.3 will have something like that; unsure about HDMI 2.0.)

Someone else can likely give a better explanation. (I believe G-Sync also requires DP for now, although that could likely be adapted to DVI and HDMI since it's an external controller module.)
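The PSR feature quoted above can be sketched roughly like this (a toy model, not the eDP protocol): the panel controller keeps its own copy of the framebuffer, so the GPU link only has to wake up when the image actually changes.

```python
# Toy model of eDP Panel Self-Refresh (PSR): the panel controller holds a
# framebuffer copy, so unchanged frames need no GPU transmission at all.
class PanelWithPSR:
    def __init__(self):
        self.remote_buffer = None  # framebuffer copy inside the panel controller

    def refresh(self, frame):
        if frame == self.remote_buffer:
            return "self-refresh"  # panel repeats its own copy; link stays idle
        self.remote_buffer = frame  # image changed: GPU must transmit it
        return "transmit"

panel = PanelWithPSR()
activity = [panel.refresh(f) for f in ["A", "A", "A", "B", "B"]]
# only the frames that actually changed required the GPU link to be active
```

The same panel-side framebuffer is why PSR is power-saving on laptops, and why a desktop monitor without comparable logic in its scaler can't do anything similar.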

Asgardi
Senior Member

Posts: 198
Joined: 2010-11-13

#4739824 Posted on: 01/08/2014 06:23 PM
"I would let AMD do their part with VESA, as always. And all of us will benefit in the end: AMD users with a nice feature to have, and nVidia users with add-in cards cheaper than now, as nVidia will be forced to lower prices."

In the end it's good to have cheap/free tech for everybody, but I will support companies that actually innovate to get something more. If we always supported copycats, there would not be innovations to share. I don't want companies to make money from the same stuff forever, but for some time it is just healthy and motivating.

toivonen
Junior Member

Posts: 15
Joined: 2002-02-18

#4739840 Posted on: 01/08/2014 06:52 PM
lol... and that is why I lately moved to the "green" side! Sure, they're very far from being saints, but when we look at the last 5 years, they really have had a much better attitude, both in technology and in posture... AMD, despair gets you nowhere, you should know that...

H83
Senior Member

Posts: 2026
Joined: 2009-09-08

#4739844 Posted on: 01/08/2014 07:02 PM
I like Nvidia's GPUs, but their stupid attitude is really hard to ignore. Saying that they won't share G-Sync with AMD no matter what, and that AMD should develop their own solution, is really being a huge ass!!!

This way G-Sync is gonna flop like PhysX, and the biggest loser is going to be Nvidia, because someone will release a non-proprietary solution that is gonna work for everyone...

It's not a "port" limitation at all. NVidia has intentionally built in limitations to prevent G-Sync from working on non-NVidia hardware. It really wouldn't be that difficult for, say, Samsung to design a controller with a built-in display buffer and logic to do exactly what nVidia's G-Sync module does, without being bound to a single GPU vendor.

"I would let AMD do their part with VESA, as always. And all of us will benefit in the end: AMD users with a nice feature to have, and nVidia users with add-in cards cheaper than now, as nVidia will be forced to lower prices."

In the end it's good to have cheap/free tech for everybody, but I will support companies that actually innovate to get something more. If we always supported copycats, there would not be innovations to share. I don't want companies to make money from the same stuff forever, but for some time it is just healthy and motivating.

AMD applied for a patent on this back in 2006. It would appear, based on the patent filing, that nVidia is actually the copycat.

Lane
Senior Member

Posts: 6358
Joined: 2005-02-25

#4739867 Posted on: 01/08/2014 07:20 PM
It's funny because, on the Asus monitor that supports G-Sync, G-Sync takes the scaler's place (in reality, it replaces it).

G-Sync is a hardware solution, and in this case the hardware resides inside a G-Sync enabled display. NVIDIA swaps out the display’s scaler for a G-Sync board, leaving the panel and timing controller (TCON) untouched. Despite its physical location in the display chain, the current G-Sync board doesn’t actually feature a hardware scaler. For its intended purpose, the lack of any scaling hardware isn’t a big deal since you’ll have a more than capable GPU driving the panel and handling all scaling duties.

Then: The first G-Sync module only supports output over DisplayPort 1.2, though there is nothing technically stopping NVIDIA from adding support for HDMI/DVI in future versions.

Definitely, some explanations are wrong somewhere.

AMD applied for a patent on this back in 2006. It would appear, based on the patent filing, that nVidia is actually the copycat.

In reality, the first patent was filed by ATI Technologies in 2002; in 2006, AMD transferred the licenses to its own name (as it had bought ATI by then).

Keesberenburg
Senior Member

Posts: 705
Joined: 2013-01-11

#4739869 Posted on: 01/08/2014 07:21 PM
I like Nvidia's GPUs, but their stupid attitude is really hard to ignore. Saying that they won't share G-Sync with AMD no matter what, and that AMD should develop their own solution, is really being a huge ass!!!

This way G-Sync is gonna flop like PhysX, and the biggest loser is going to be Nvidia, because someone will release a non-proprietary solution that is gonna work for everyone...

Some guys just don't learn...

It is just a personal opinion with a red tint. Don't do this to yourself.
AMD does the same things, bro; open your eyes or get transparent glasses.

nhlkoho
Senior Member

Posts: 7181
Joined: 2005-12-06

#4739873 Posted on: 01/08/2014 07:32 PM
It is just a personal opinion with a red tint. Don't do this to yourself.
AMD does the same things, bro; open your eyes or get transparent glasses.

All companies do this. Imagine if Intel spent all that time and money researching new technologies for CPUs and just gave them away to everyone else to use for free.
Nvidia keeping this for themselves may be bad for consumers, but giving it away would be a stupid business practice. And the main purpose of a business is to make a profit.

H83
Senior Member

Posts: 2026
Joined: 2009-09-08

#4739883 Posted on: 01/08/2014 07:42 PM
All companies do this. Imagine if Intel spent all that time and money researching new technologies for CPUs and just gave them away to everyone else to use for free.
Nvidia keeping this for themselves may be bad for consumers, but giving it away would be a stupid business practice. And the main purpose of a business is to make a profit.

But they don't need to give it away! They can license it, or ask for 50% of the development costs!!! This way everyone wins! If they keep it just for themselves, they are almost guaranteed to lose money on it, because someone else will develop something similar without hardware-maker restrictions. Not to mention people who buy AMD aren't going to buy GPUs from Nvidia because of G-Sync, but okay, this is just my opinion.

Denial
Senior Member

Posts: 11094
Joined: 2004-05-16

#4739884 Posted on: 01/08/2014 07:43 PM

Not to mention people who buy AMD aren't going to buy GPUs from Nvidia because of G-Sync, but okay, this is just my opinion.

You'd be surprised. I don't think most people know much about G-Sync, but two of my friends chose 780s due to ShadowPlay, which completely blew my mind. I mean, I find the feature nice, but I didn't expect something so minor to draw people to switch.

AcidSnow
Senior Member

Posts: 362
Joined: 2013-08-09

#4739917 Posted on: 01/08/2014 08:54 PM
Proprietary G-Sync is really disappointing, but obviously expected.
However, I can't help but feel somewhat upset, because it's like inventing "fire" as a caveman and not helping the entire human race by saying "here, I realize you're out in the cold, take this and live a better life."

Screen tearing is a horrible issue that requires V-Sync on all non-CRT monitors... And variable refresh is a brilliant idea that would benefit basically every gamer on the planet.

It's no wonder NVIDIA is keeping it as proprietary tech; it's just that powerful :\

oOEvil1Oo
Member

Posts: 70
Joined: 2013-02-21

#4740026 Posted on: 01/09/2014 12:19 AM
I am glad for all the advancements we have in technology for us to play games on! Too bad the damn game developers can't figure out how to release working titles to make use of these modern marvels!!