On Friday NVIDIA announced G-Sync, and considering how little detail is available out there I wanted to write a quick follow-up on this new technology, as it really is a big announcement - a really big ...

Not a lot really, but sure, low FPS could be a drag: at, say, 20 FPS (and thus 20Hz on an LCD panel) it will look like crap - you'd literally see the screen refresh. Meaning low-FPS moments could potentially be horrible, with refreshes you could see live on your screen. So in an optimal situation you will need a graphics card that can stay above 30 FPS as a minimum.

20fps doesn't mean that your panel will flicker at 20Hz. LCDs do not flicker. Their backlight does, but not like CRTs, which have a physical refresh rate. And the backlight is not related to screen updates at all.
Even if your video card gives 3 frames per second, it will be a slideshow, but a perfect one. When a new frame arrives, it will be drawn in 5ms (or 2ms, or 1ms) - according to the monitor's specs.
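To put some numbers on that point (illustrative arithmetic only; the 5ms response time is just an example spec, not from any particular panel): even at a slideshow-like 3 fps, each new frame is drawn far faster than the gap between frames, so the image is simply held steady rather than flickering.

```python
# Illustrative arithmetic: frame interval at a given fps vs. an assumed
# panel pixel response time. The 5 ms response figure is an example spec.

def frame_interval_ms(fps):
    """Time between new frames arriving from the GPU, in milliseconds."""
    return 1000.0 / fps

response_ms = 5.0  # assumed panel pixel response time

for fps in (60, 20, 3):
    interval = frame_interval_ms(fps)
    held = interval - response_ms  # time the finished frame just sits there
    print(f"{fps:>2} fps: new frame every {interval:.0f} ms, "
          f"drawn in {response_ms:.0f} ms, then held ~{held:.0f} ms")
```

At 3 fps the frame is held static for roughly 330ms between updates - a slideshow, but a clean one.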

Agreed, and the whole sentence doesn't make that much sense either. Is it missing a few commas or what?

They also stated that the minimum variable refresh rate is 30Hz. Anything below that and it will duplicate frames. And yep, obviously it will not flicker.

They are saying this can be retrofitted to certain monitors.
They give a diagram and everything to hard-wire it to monitors.

Mmmm, I could probably do it no bother, but do I really want to force open my monitor to solder this module to it?

My last monitor's OSD button broke/stuck, and the monitor was a total nightmare to get into. Front bezels are usually locked in with really weak, small pieces of plastic.
I broke a few of them on that last monitor.

There's a lower bound of 30Hz as well, since below that you'd begin to run into issues with flickering. If the frame rate drops below 30 fps, the display will present duplicates of each frame.
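The frame-duplication behaviour described above can be sketched roughly like this (a toy illustration of the idea, not NVIDIA's actual logic; the 30Hz floor is the figure quoted in the thread):

```python
# Illustrative sketch, not NVIDIA's actual algorithm: below the panel's
# minimum variable refresh rate (assumed 30 Hz here), each rendered frame
# is scanned out multiple times so the effective rate stays at or above it.

MIN_HZ = 30.0  # assumed lower bound of the variable-refresh window

def duplication_factor(fps):
    """How many times to scan out each rendered frame so that
    fps * factor >= MIN_HZ."""
    factor = 1
    while fps * factor < MIN_HZ:
        factor += 1
    return factor

for fps in (60, 20, 10, 3):
    f = duplication_factor(fps)
    print(f"{fps:>2} fps -> each frame shown {f}x, "
          f"panel refreshes at {fps * f:.0f} Hz")
```

At 20 fps each frame would be shown twice, so the panel effectively refreshes at 40Hz and never dips into the flicker-prone range.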

This looks so awesome. It should improve the gaming experience by a lot for everyone whose rig can't sustain feeding a v-synced monitor at 60fps with no drops. Even then the input lag could be nasty. It should also make gaming @ 4K much more comfortable.

Now, when will we be able to get a 4K monitor with fallback to FullHD @ 120Hz, and G-Sync, for under $1.5k?

It does look awesome, but I just got a VG248QE a while back. Now do I need to turn around and sell it if I want yet another proprietary Nvidia feature, or the feature in general? I take it you won't be able to add this piece of hardware to the panel - you'll have to buy a new one. Though IIRC I saw them showcasing it with the VG248QE.

What makes you think I care about Mantle? I just stated a fact and
what needs to be there so I can buy it and make use of a great thing.
And I hope this is a little more open than what I'm used to seeing Nvidia do
in the past with their technology.

At the end of the day this has nothing to do with Mantle. Sometimes fantards really do make me facepalm.

This will be great when it's on IPS panels; otherwise it can go f*ck off like the rest of the proprietary Nvidia crap, no offence.

Spoken like a typical AMD owner. Much like with PhysX, AMD owners wanted it, then found out it's proprietary, and immediately they all seemed to have received talking points saying it does nothing, is a gimmick, and nV is greedy and should give it away for free. lol.

I recall Nvidia being in the news: they did offer PhysX to ATI/AMD in a licensing deal, but they point-blank refused. Who knows, if they had agreed it would probably have been available in the PS4/Xbone and implemented in lots of games by now. Them refusing all those years ago basically killed it from being widely adopted.

What goes around comes around. Really think Nvidia will license Mantle? Nope, I can see a similar fate for it.

I can see the outcome for G-Sync being totally different, simply because it's something everyone wants... it will be a must-have feature.

As far as I remember, that talk about AMD being "offered" PhysX has been brought up a few times. Basically there were some terms that AMD didn't like, so they declined - or so it's rumored at least; I don't think we'll ever get an official explanation.

Similarly, I can fully see AMD wanting some terms before letting Nvidia use Mantle; they aren't going to just give it away to the competition.

EDIT: That said, it would of course be awesome for us end-users if AMD and Nvidia could somehow get along and implement CUDA, PhysX, Mantle, G-Sync and whatever else (3D viewing?) together, but I doubt that will happen anytime soon.
(I guess it's similar between AMD CPUs and Intel and their x86/x64 stuff and extensions, but I don't have much insight into these things, so what would I know.)

IIRC Nvidia offered the license for free or close to nothing, but updates would be handled by Nvidia alone.
Which means Nvidia would have needed access to some of AMD's proprietary libraries in their drivers in order to provide proper PhysX support... and every time Catalyst was updated, AMD would have had to give Nvidia the details of what had changed.

In short, had AMD accepted Nvidia's deal, they'd have agreed to pimp out (parts of) Catalyst to Nvidia; that's probably what made AMD walk away from it.
CMIIW though.

No, spoken like someone who loves open and non-proprietary stuff. And if you check my sig, I have a GTX 570, so how does having an AMD card have anything to do with what I'm saying?

I said I hope this isn't like their other crap and will be on IPS panels.
And I hope it's more widely used and really well adopted by everyone,
and is open compared to their other stuff.