Except the Oculus Rift probably won't have it. They love non-proprietary stuff and G-Sync lands firmly in the proprietary category.

Make it a standard, make it cost about $10 more to implement rather than $120, and this will take off. I don't see this happening, though. NVidia just doesn't operate in that manner, unfortunately. It would make gaming so much better for the people who really need it--those with sub-par video cards.

No display maker is going to make a key component (the scaler) beholden only to a single manufacturer (nVidia). The technology needs to be licensed so it becomes an industry standard, so that manufacturers can put it into their displays without having to rely on a single OEM.

Carmack himself mentioned at the panel after the G-Sync reveal that the first consumer release of the Oculus would NOT contain G-Sync, but that it is definitely something they want to incorporate. My guess is the reason is the use of LVDS as the sole panel interface. There simply AREN'T any decent 5.6"-6" panels using LVDS. Nobody makes them. The relatively bad (6-bit FRC, crummy colours compared to modern panels, low fill factor, low resolution, too big to be used efficiently) panel was a compromise in that it was the only one readily available in volume and compatible with the existing LVDS board. Phone/tablet panels in the correct size, resolution and quality range are all MIPI DSI, with the exception of the Retina iPad Mini, which uses an eDP panel like the iPad 3 onwards. Except that panel is still too large, and will be unavailable in volume until Apple decide to reduce their orders in 6 months or so. The current 1080p prototype uses one of the early DVI-to-MIPI chips (probably on an in-house board) because it's the only way to actually drive the panels available.

As useful as G-Sync would be for something like Oculus (especially for reducing motion sickness), it's still far too expensive to implement. Oculus itself wants to hold the line at $300. There's simply no way to fit a $120 module into that price.

Then there's the fact that Oculus would benefit far more from 120hz panels than from G-Sync. Honestly, I can't imagine Carmack or Oculus ever bad-mouthing a new technology they could benefit from in the future, but the fact remains there are so many other things that would be more cost-effective for Oculus to do first: higher resolution, 1920x2160 say; higher refresh rates, 120hz. Personally I hope they think about using some of the projector panels. They're smaller, lighter, and already have both the color depth and refresh rates. The only problem, of course, is they're probably too small and may be too expensive.

They specifically avoid making a microdisplay-based HMD, because of the tradeoffs that every previous microdisplay HMD has had to make. Because the displays are small, you need some hefty optics to view the image, and these must be complicated in order to correct for distortion (unlike the large-panel, software-corrected approach the Rift uses, distortion with a much smaller display would be so great it could not be effectively corrected). This means the optics are bulky, heavy and expensive. And that goes doubly so if you want a large field of view (compare the Oculus 90° horizontal FoV to the HMD-1/2/3's 45° hFoV, and the HMD series were praised for their unusually large FoV compared to competing models). In fact, the only large-FoV HMD I know of using microdisplays is the Sensics Pisight (http://sensics.com/head-mounted-displays/technolog...), a huge 24-display monster that costs well in excess of $20,000.

Ahh, so I guess my fears about using projector panels are true. Damn. I guess we're going to be stuck with 60hz on the Oculus for a while. I just don't see phone displays moving up in refresh rates anytime soon.

I really want the Pisight now, but, unfortunately, I need to do things like eat and have shelter. :P


G-Sync seems to live in a very small niche. How many people both: A) need better performance *and* B) need a new monitor as well?

Absent those two conditions, aren't people simply better off investing the ~$400 a G-Sync monitor would cost in, you know, a better video card instead? I experience neither tearing nor stuttering, because my absurd triple-slot, factory-overclocked R7970 has no problem pushing any game I play well beyond 60FPS. A special monitor would cost 80% of what that card did at launch, so G-Sync seems like a bit of a non-starter to me, unless there's something I'm missing here.

I'm interested in G-Sync at 4K, where the need for AA is reduced and you're battling 30-60 FPS numbers. And for users in the mid-range GPU market, a good monitor that will last 5-10 years might be cheaper than a large GPU or system upgrade.

It's just another piece in the puzzle toward what will hopefully become standard. Think about it - in an ideal world, shouldn't this have been implemented from the start?

Same here. I'm not going to rush out and buy a 1080p G-Sync monitor; but even in a year or two, an extra $120 on a 4K monitor isn't going to be a large hit relatively speaking, and gaming at <60 FPS will be a lot more common there than at 1080p.

Here is the thing, if you have it all. Then a SLI/CF or Titan / 290X setup will mean that you will more than likely able to max out the graphics, and G-sync and V-Sync becomes more or less the same.

The target market is people on a budget playing with mid-range cards that can't push 60/120/144 fps (or 30 fps, if that is your thing...) consistently at 1080p or 4K. Which means price becomes an issue: if you are going to buy a mid-range card, you are likely going to reuse your existing monitor, or maybe get a nice cheap one, unless the G-Sync-enabled models (and cards) are not so much more expensive that you could instead step up to a better card that can run everything nicely at full speed via V-Sync.

So if they can price it so that a new monitor + new NV GPU costs the same as a new monitor of the same size and speed + new AMD GPU + say 20 dollars, then that is fine. But if they can't, then for a mid-range GPU, dropping 20 or 30 dollars more can mean a lot more performance for the buck; unless you are already at the upper end of mid-range, where going from upper mid-range to high end is a large jump in cost. And even then, if people want to keep the monitor they have, there will likely be NO way this will take off, because even a cheap 1080p monitor is ~100 dollars, and that money buys a huge jump in GPU quality if you spend it on the card itself rather than on the monitor.

The killer app would be if G-Sync worked with any bog-standard monitor (or if every future monitor were sold with this module enabled). Then it would become a nice new feature that is good for many people.

"Here is the thing, if you have it all. Then a SLI/CF or Titan / 290X setup will mean that you will more than likely able to max out the graphics, and G-sync and V-Sync becomes more or less the same."

This is just flat out wrong...

I play BF4 on a 290X on a 120hz monitor, and there are very few maps that maintain a consistent framerate. As soon as the framerate dips below 120 I start seeing stuttering - and that's on the smooth maps. There are maps, like "Siege of Shanghai", where the framerate hovers from 80 all the way down to 30-40 FPS... G-Sync would be a HUGE deal in situations like that.

The gamer that has it all certainly won't invest in a 1080p TN panel. Here's the problem right now: a bunch of things that will get implemented later. Isn't the solution in hardware? Will I have to replace my panel next year? And then, will my panel be tied to NVIDIA? Not just yet. AMD will surely come up with an open solution next year, as usual.

And I'll counter with saying you're nuts for overlooking 120/144hz TN panels *if* the main use for your machine is gaming. I have the VG248QE, have enabled LightBoost on it, and I would *never* use my wife's 27" QHD IPS for gaming because I would lose the butter smoothness a 120hz LB monitor gets me. Yes her display has better color reproduction, but it is a mess in BF4 with all its ghosting and 60hz choppiness. Until you've seen what LightBoost and 120hz is like in a first person shooter, you can't call us "nuts".

And those of you on 120 or 144hz monitors who aren't using LightBoost, do yourself a favor and check it out. There is a substantial difference between LB 120hz and plain 144hz.

I think a lot of this is leftover garbage from the way CRT displays needed to be implemented. It seems like we should be removing hardware here, not adding it. Flat panels, when introduced into that ecosystem, needed to output on a frame-by-frame basis even though the only real limitation seems to be the pixel color-to-color refresh. Since LCD pixels are more like a switch, wouldn't a video card and display system that updated on an independent per-pixel basis be more efficient and better suited to modern displays? I understand games have frame buffers you would need to interpret, but that can all be addressed in the video card hardware. If you have a card capable of drawing to the screen in such a way, wouldn't that make this additional hardware unnecessary and eliminate the tearing problem?

I think you are on to something here - something like a modification of the packet-based DP protocol. Right now, the speed of the packets in DP depends on the resolution and refresh rate. Why not make the packets come whenever they are ready (in 3D) and at a regular rate (in 2D)? Then have a monitor with an LCD panel expecting a signal like that.

I mean, I think the way nVidia did it here is just a way to make it work right now within the constraints of existing LCD panels and DP protocols. In the future, I could easily see this sort of tech being built into future protocols, video cards, and monitors, and probably all done without needing an expensive FPGA and additional RAM.

Yes, and if you aren't drawing frame by frame on the monitor there is no issue :D I'm saying remove the frame by frame drawing on the display side completely. This would create an effective "refresh" rate that is only limited by how many pixels the card can push and how fast the pixels can change on the display side. You could also take it a step further and move away from the frame by frame output on the software side. Fantastic example is the desktop display where 50% of the time the only change on the display is the movement of the mouse cursor. Even in a fps game where most of the screen changes constantly it would still benefit from the lack of a frame holdup since what we call refresh rate would no longer exist.

Truthfully, this is all speculative. I don't have enough experience to tell if there are shortcomings I'm overlooking, but nothing blatant stands out at the moment.

Here's one problem with the video card pushing out data on a per pixel basis. It would take a LOT more data.

When a scene is rendered, the instructions essentially target a set of pixels and apply an effect to them (colour, saturation, brightness, etc.), and when all calculations are done, the final product is sent to the screen to be rendered (tearing is when not all of the new image has been copied into the monitor's frame buffer).

The problem is that the GPU is blind to upcoming draw calls, in that it does not know which pixels will be affected until the calculations are done. This means there is no way for the GPU to know when a particular pixel is fully computed or "rendered" and ready to be sent to the monitor's frame buffer.

A better solution would be to, for 2D applications, check for pixel changes from one frame to another and simply send the change.

For 3D I see this as impossible, since any small change (camera movement or otherwise) will require a complete re-render due to the nature of 3D and how the calculations are done (the GPU has no idea how to render a scene; it simply follows instructions laid out by the developer, so it can't figure out which pixels it can skip).
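To make the 2D idea concrete, here's a toy sketch of "check for pixel changes and send only the change" - a hypothetical illustration of the concept, not any real display protocol:

```python
def frame_delta(prev, curr):
    """Return [(row, col, new_value)] for pixels that differ between
    two frames (each frame is a list of rows of packed RGB ints).
    Only these pixels would need to cross the link, not the full frame."""
    return [(y, x, curr[y][x])
            for y in range(len(curr))
            for x in range(len(curr[y]))
            if curr[y][x] != prev[y][x]]

# A 4x4 "desktop" where only the mouse cursor pixel changed:
prev = [[0] * 4 for _ in range(4)]
curr = [row[:] for row in prev]
curr[1][2] = 0xFFFFFF  # cursor drawn at row 1, col 2

delta = frame_delta(prev, curr)
print(delta)  # [(1, 2, 16777215)] -- one pixel to send instead of 16
```

For the mostly-static desktop case this is a huge win; for a 3D game where nearly every pixel changes every frame, the delta degenerates into the full frame, which is the objection above.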

Does this have to do with the developer or the established rendering API (OpenGL or DirectX)? If the API were designed to output that way, wouldn't that make both development and implementation easier? I get what you are saying about camera movement, but there is a speed limitation caused by rendering frame by frame as well. If you are dealing with something like a TN panel that has a quick color-to-color change, the effective fps on a per-pixel basis becomes closer to 300 fps (or we could call it pixels per second here), even if you are drawing full screens.

I also am wrapping my head around what you are saying about 3D applications. Ultimately they are still outputting to a 2D display. I understand there are shaders and other effects that may require a full-screen write, and it sounds like the graphics card, OS, APIs and display are all set up that way. It may take some serious effort to really take a step back and take a more efficient approach based on current display technology. Ultimately, though, if the changes could be made to allow this to take place, I do feel it would end up being a much more efficient and faster approach. It may just not be as easy as it first seemed to me.

In the future, we're all going to need a new monitor. Depending on the price premium, and assuming these make their way into IPS displays, it might be hard to justify buying a non-G-Sync monitor over a G-Sync monitor. I doubt many are going to run out to buy a new $400 display right now, but this will have a powerful effect on consumer behavior down the line.

I'm a bit torn on G-Sync. On the one hand, it removes some glaring issues that have plagued gamers for years. On the other, it's basically a beard. 15 years ago, you could play a game at insane FPS and refresh rates on a CRT. Games were simple, with small textures and almost no particle effects. 10 years ago, LCDs became affordable and suddenly everyone was capped at 60Hz, and consoles were locked at 30fps or 60fps. Games were more complex, requiring faster hardware, but the slow LCDs made it less noticeable. Now we're moving on to LCDs that operate at 144Hz and 4K displays capped at 60Hz. G-Sync is a band-aid. The REAL problem is that GPU makers (NVIDIA/AMD) have not kept up with the pace of resolution requirements and game complexity.

Like most reviews point out, it all comes down to what you're used to. I'm still using a CRT, 1920x1200@96Hz (sometimes lower, sometimes higher). I have all my games set up to maximize FPS for the target resolution and usually don't use vsync. Screen tearing is not as noticeable due to the high frame rate, instant response time, and the nonexistent lag that comes from CRT tech. G-Sync appeals to me because it would allow me to avoid the most glaring pitfalls of LCD tech and my inability to turn up eye candy to the max without buying all the highest-end hardware. But like I said, this is really just a band-aid and I'm not sure I want to reward this laziness.

G-Sync hasn't earned my dollar yet. I know my next display purchase will be 4K, but I'm not content with 60Hz LCD. DP 1.3 is on the way, bringing with it 4K and 8K support at significantly higher refresh rates along with 3D and all that jazz. Will AMD have a response to G-Sync, or will they be able to license it for Hawaii 2.0? Will someone develop an open spec that requires minimal hardware to implement for broader adoption? Will GPU makers push performance enough to make G-Sync obsolete? My CRT hopefully has a couple years left in her, so I hope I can weather the oncoming storm (not a DW reference).

Obviously there's a limit to how good a video card you can get. This pushes the upper bound on the experience by making frame rates down to 35fps acceptable, instead of only down to 60fps. As for the cost analysis of buying a faster card for those not in the market for top-tier cards, remember that most users upgrade video cards far more often than monitors. Over the life of a monitor, one must keep buying more expensive video cards at each upgrade to match the experience a G-Sync-enabled monitor delivers with less expensive cards.

It's kind of a stopgap device for those who don't want to shell out the extra cash for a better/second GPU. But even then, the cost of getting a new monitor would seem to offset the cost of a better/second GPU.

Anand hit the nail on the head when he pointed out that if you are getting a minimum FPS of 60, then V-Sync should be fine for you. At 1440p+ resolutions, even dual GPUs will start to encounter slowdowns, so it makes sense to invest in G-Sync, because minimum frame rates will be lower. Also, as your hardware ages relative to the games you play, having G-Sync will be good because you'll get a smooth experience without having to buy a new GPU/CPU. Old hardware will retain its relevance longer.

C) Have, or are willing to go buy, an nvidia GPU to use with the screen.

It's always a little disappointing when a manufacturer spends a lot of time and money making some cool new feature, only to have it die because it's proprietary. If this were part of DirectX or some other industry standard, maybe it would take off.

Got to lol there - DirectX is proprietary. From the manufacturer's point of view (which in this case is nvidia): if they have spent all that time and money, how do they get it back if they just give the tech away for free?

As a person who outgrew 1080p a while back and now has a 3x1 setup, I'm certainly hungry for more features in my monitors if they're going to continue to cost the same (and they appear to have no intention of starting a race to the bottom).

It may be niche for the next couple years, but it seems like a technology which is destined to eventually become ubiquitous. It's common sense and has real world results. As the industry matures, the holes in the experience will be filled in.

For what probably costs $20 in hardware, they can charge a $50-$100 price premium on a high end display for G-Sync. It's definitely niche, but so are video cards costing over $200 and they sell quite well.

That article was completely wrong as well. They failed to mention that it was the Asus monitor used in this article that was going to receive the G-Sync module first - not that Nvidia had an exclusivity agreement with Asus. No need to try to spread false information.

Actually, it doesn't necessarily rule them out. G-Sync drives the panel with LVDS, so the panel needs LVDS. As long as the panel in those Korean monitors is LVDS, then theoretically it should work, even if the monitor did not originally have DP. Your video card does need DP, but all of the cards that actually support G-Sync have DP anyway, so that's not an issue.

I haven't seen any such monitors, but DP 1.2 should have enough bandwidth to handle 2560x1600@120Hz - the bandwidth requirements are virtually identical to 4K@60Hz. Though I'm not sure whether that configuration is fully defined in the standard or not.
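A quick back-of-the-envelope check of that claim (raw pixel rates only; blanking overhead is ignored, so these are lower bounds):

```python
# Raw pixel rates for the two modes mentioned above, at 24 bits/pixel.
modes = {
    "2560x1600 @ 120 Hz": 2560 * 1600 * 120,
    "3840x2160 @ 60 Hz":  3840 * 2160 * 60,
}
for name, px in modes.items():
    print(f"{name}: {px * 24 / 1e9:.1f} Gbit/s")

# Both land near 12 Gbit/s -- inside DP 1.2's ~17.28 Gbit/s of
# usable HBR2 bandwidth (21.6 Gbit/s raw, minus 8b/10b overhead).
```

So the two modes really are within ~1% of each other in bandwidth, which is why the claim holds.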

It's an... interesting display. Somewhat disappointing that they don't guarantee 120Hz, though - it just says "Up to 120Hz". That, and I am not impressed by a $600 DVI-only screen. Considering HDMI carries the same signal as DVI, why wouldn't they make it HDMI? I guess you could just get a DVI-to-HDMI adapter at Newegg.

I came here to post this, and am quite surprised that there was no mention of triple buffering in the article. It seems very disingenuous on the part of both nVidia (assuming there were no slides mentioning triple buffering) and Anandtech to omit this issue.

In fact, it was Anandtech who did an excellent (IMHO) job of informing me about the advantages of triple buffering back in 2009, in this article: http://www.anandtech.com/show/2794/2

Triple buffering is a type of V-Sync. The first page of this review explains the issue: the buffers hold on to what the GPU rendered for anywhere from 0ms to 17ms, so there is no way for the rate of motion of objects in the frames to match up with when the frames are displayed.
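The buffer-age mismatch described above can be shown with a toy simulation. The ~75 fps render times here are made-up numbers, just to illustrate the effect:

```python
# Frames finish rendering at irregular times but are only displayed
# on fixed 60 Hz scanouts (~16.7 ms apart). The "age" of each
# displayed frame -- how long it sat in a back buffer -- drifts
# between near 0 and a full refresh, so on-screen motion is uneven.

REFRESH = 1000.0 / 60.0  # ms between scanouts

# Hypothetical render completion times (ms), one frame every 13.3 ms:
render_done = [x * 13.3 for x in range(1, 10)]

ages = []
for k in range(1, 9):
    scanout = k * REFRESH
    # triple buffering shows the most recent completed frame
    newest = max(t for t in render_done if t <= scanout)
    ages.append(round(scanout - newest, 1))

print(ages)  # age varies refresh to refresh -> uneven motion
```

Even though the GPU renders at a perfectly steady rate here, the displayed frame ages cycle up and down, which is exactly the motion mismatch the comment describes.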

My thoughts exactly. Triple buffering is a software solution, while this is a hardware-based solution. If I understand correctly, the input lag for both should be very similar. The distinction comes from the memory requirements; on already-taxed hardware (think 4K), G-Sync would work better.

Ultimately, unless Nvidia can get G-Sync-compatible LCDs under a ~$30 premium, it (G-Sync) will only make sense for ultra-high-end 4K monitors; if you have middle-of-the-road stuff, your money is better spent on a better GPU imho.

Triple buffering causes lag, since you never get to see anything that happens in the game until the third frame is scanned onto the screen. That's a pretty huge deal when you are playing games where reflexes affect the outcomes. If you get a chance to react up to 33ms faster than the next player, all else being equal, you win.

Of course triple buffering has drawbacks (input lag), just like G-Sync, but it's a perfectly viable solution to the same problem this hardware+software combination tries to solve. I doubt competitive twitch-FPS players are going to jump on G-Sync anyway, as it may cause input lag too - they will keep playing with double buffering + no VSync.

With triple-buffering, as long as your actual frame rate is over 60 FPS, there should always be a frame ready in one of the back buffers when the screen redraws. I've always found that in the few games that support it, the frame rate is very even and smooth.

So does G-Sync - isn't telling the screen to wait until the next frame is ready lag? The only difference is that G-Sync waits for the next frame, while buffering grabs the latest finished frame. Both create extra lag.

Triple-buffering solves the issue when you have performance to spare (i.e. spend most of your time rendering at above the display refresh rate) and a very high tolerance for update delay (lag). When performance constrained, triple-buffering offers little to no benefit over double-buffering (as you're never filling that other buffer before display update), and you still get that frame-by-frame variance between render time and display time when performance varies.

I never suggested that it wasn't worth an article, and I am in fact a longtime reader who appreciates Anand's articles on niche subjects. I merely wanted to understand if there is or might one day be a non-niche application for the technology.

With about 60K 248 monitors sold, and over 5K sold a month on average, team green is going after the mod-kit circuit. Once other options, i.e. other manufacturers, release their G-Sync monitors, this will become much less "niche" than you think.

This will be something only hardcore tech enthusiasts look for. Unless Nvidia is willing to let go of the proprietary nature of the tech, it will be stuck as a niche feature like PhysX is.

I'm looking at that screenshot with "Set up stereoscopic 3D" and wondering how well it works with 3D Vision.

Any chance of testing with 3D Vision? S3D typically has the problems seen here, with half the frame rate, and locked vSync to synchronize with shutter glasses, so G-Sync has the potential to dramatically improve S3D.

Could you revisit this with multiple monitors? Knowing that going to three monitors triples the workload, and adding 3D doubles it, I see this having a large impact when trying to push 3 1080p monitors running 3D. You should be able to finally crank the settings up.

I think nVidia/AMD need to do a round of cross-licensing. AMD would get G-Sync + PhysX, while nVidia would get Mantle and an AMD x86 CPU core to build an nVidia-branded SoC. An enthusiast can dream, can't they?

Well, AMD has opened up to using other IP in a custom SoC. So actually it wouldn't be an nVidia x86 core; rather, AMD would also be responsible for the overall layout of the SoC. Of course, individual components in the SoC (mainly graphics) would be nVidia IP.

Of course this would never happen given that AMD and nVidia are absolutely fierce competitors with each other along with Intel (a three way stand off of hate). Like I said though, one can dream.

Any chance you might compare this to LucidLogix VirtuMVP VirtualVsync?

I've been using VirtuMVP for some time, and although it might not be as perfect as G-Sync, it does offer a similar feature of syncing the display while letting the GPU run like it's VSync-off = better gaming response at a 60hz refresh rate...

It would be interesting to see if the same approach can be used in a media player to perfectly sync screen refresh to video frame rate. You'd get perfect frame timing on a media center box, without worrying about display timing, and it could adapt to different frame rates as needed.

So long as something used full screen exclusive mode (remember, windowed mode doesn't work), that should work. However keep in mind that G-sync has a minimum refresh rate of 30Hz, so you'd have to double up sub-30fps framerates to stay above the minimum.
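The doubling arithmetic is simple enough to sketch. The 30Hz floor is from the comment above; everything else is just the math:

```python
import math

MIN_HZ = 30  # G-Sync's minimum refresh rate, per the comment above

def drive_rate(content_fps):
    """Return (repeats, panel_hz): how many times each frame must be
    shown so the panel stays at or above the 30 Hz floor."""
    repeats = math.ceil(MIN_HZ / content_fps)
    return repeats, content_fps * repeats

print(drive_rate(24))   # (2, 48) -- each film frame shown twice, panel at 48 Hz
print(drive_rate(25))   # (2, 50)
print(drive_rate(48))   # (1, 48) -- already above the floor, no doubling
```

Because the panel rate is an exact multiple of the content rate, every frame is still held for exactly the same duration - no pulldown judder.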

It blows my mind no input lag testing was done. If the game looks smooth, that's nice. If G-Sync adds even 10ms to current input lag numbers then it's useless to competitive FPS gamers.

I would rather play games with NO AA, lower smoke settings, etc to get a 100FPS @100hz. My Qnix 2560x1440 runs at 110hz refresh rate (After overclock).

To enable G-Sync I'd have to downgrade my monitor dramatically. I'd have to go back to a shitty TN panel that "officially" supports 100hz+.

Thank you Korea for shipping us A- 2560x1440 PLS & IPS panels with 100hz capable refresh rates for $300! These monitors are the best deal I've found in PC gaming since the 9500pro to 9700pro unlock. The Qnix qx2710 is awesome.

I hate the idea that 30 to 60 fps "is enough"... It's not. It never will be.

I have a Qnix and an ASUS VG248QE. To be fair, while it is a TN with terrible colors, the VG248QE shows much less blur when playing FPS games. Blur Busters did a lot of analysis on that with their LightBoost hack. Even at 110hz+ on the Qnix, it still looks much worse than the Asus when turning around or aiming quickly in an FPS. For general use, of course, the Qnix has been the best bang for the buck in recent years, but it's still not the same. It's not only about being officially supported at 100hz+; there is some value in the TN panels for now.

In theory it should actually improve your reaction times, even if just slightly, if you are using a 144hz panel. I'll compare a 60hz panel with V-Sync on and off vs G-Sync just to show the advantage.

Say for instance you have a 60 hz panel with v-sync and triple buffering on and you are doing a test where a pixel appears on screen and you have to click the mouse button in response.

When the machine makes the pixel appear, it is in frame 3 of the buffer. On average you will be halfway through the currently synced frame, so you have to wait 8ms for that to finish, then 16ms for the second frame to finish, and then the third frame will draw as fast as your display can flip pixels - and then you click.

With V-Sync off you won't have triple buffering, so on average you'd have to wait half a screen refresh to see the pixels flip, and then click.

With G-Sync, as soon as the frame is done the video card tells the monitor to draw, so all you wait for is the pixels to flip... and it will be the same every time. Not sometimes 16.2ms and sometimes 1ms before the pixels - just pixels, every time.

I haven't read anything saying it causes any different input lag than a regular monitor, and it still should be a net advantage to user experience and reaction ability in almost any scenario.
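The average-case arithmetic above can be written out as a rough model. It ignores pixel response time, since that is the same in all three cases:

```python
REFRESH = 1000.0 / 60.0  # ms per refresh interval on a 60 Hz panel

# V-sync + triple buffering: on average you are halfway through the
# current refresh (~8 ms), then wait one more full refresh (~17 ms)
# before the buffered frame containing the event is scanned out.
vsync_tb = REFRESH / 2 + REFRESH

# V-sync off: the new frame is sent immediately; on average the event
# lands half a refresh away from where the scanout currently is.
no_vsync = REFRESH / 2

# G-Sync: the monitor refreshes the moment the frame is done.
gsync = 0.0

for name, ms in [("v-sync + triple buffering", vsync_tb),
                 ("v-sync off", no_vsync),
                 ("g-sync", gsync)]:
    print(f"{name}: ~{ms:.1f} ms average added wait")
```

So on this simple model, triple-buffered V-Sync adds roughly 25ms of average wait and G-Sync adds none, matching the 8ms + 16ms walk-through above.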

I've learned these aren't actually meant to fool commenters but to increase the Google page rank of the linked site, since a bunch of sites will link to it if this spam comment is placed all over the place.

I'm super happy that someone solved this problem, but I feel it was solved the wrong way. Why isn't this technology just a revision to current display standards? Why isn't vertical blank sync just removed and a new 'frame sync' introduced to the standard? Then every display would have it and every GPU vendor could support it.

I totally agree. This could seemingly be solved with a much simpler solution that included a new display protocol/standard. Hopefully this is just the tip of the iceberg, and a more sensible solution will come in the future.

Solid points... If someone gave me a LB monitor I wouldn't use it though, because I love 1440p. I was close to buying a 27" LightBoost monitor, but when I saw my friend's Qnix I changed my mind instantly. The tradeoffs to get LB are too drastic. G-Sync looks to make lower fps feel better; I'd rather have higher FPS.

Shouldn't Adaptive V-Sync be thrown into the mix as well? I thought this was also supposed to be a way to improve the user experience in situations where framerates drop below your display's refresh rate. G-Sync seems to be a better and more direct solution to the problem, but it requires one to buy new specialized (a.k.a. more expensive) hardware and also (currently) limits connectivity options.

Ideally, I would much rather that this pushes development of an open standard that leverages DVI/HDMI/DP, which will likely require "smarter" displays, but doesn't discriminate on the GPU and connectivity side. Further fragmenting the market by implementing yet another proprietary solution to an otherwise universal problem will severely limit the adoption. I assume there are patents, etc. that likely prevent anyone from implementing similar solutions without having to license this "novel" idea from nVidia.

On the first page, the second diagram illustrates "V-Sync off causes 'Tearing'". But in the image, why does the GPU wait to start drawing the new frame until the monitor hits refresh? I thought the point would be that the GPU waits to display the new frame until the monitor refreshes; there's no reason to wait for the refresh to start drawing. And in the given example there would be no lag, because the three frames could be drawn in time if the GPU didn't wait.

Another question: If your card can't push 60Hz, why not just run the game at 40Hz, and v-sync to that? If the gpu can push at least 40hz, there will be no stuttering.Reply

Because monitors generally have fixed refresh rates, and you can't easily just sync to 40hz. 40fps on a 60hz panel would mean judder, since some frames would be displayed for different lengths of time, which you then perceive as unnatural movement even if your frame rate is pretty high. A better option would be to sync to evenly divisible numbers: on a 144hz panel you draw every refresh for 144fps, every 2nd for 72fps, every 3rd for 48fps, etc.

That or you could just use G-Sync and tell the monitor when the frame is ready so you don't ever have to wait around for the monitor.
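The judder from non-divisible rates can be demonstrated by counting how many fixed 60hz refreshes each source frame covers - a toy calculation, not tied to any real driver behavior:

```python
# A 40 fps source on a fixed 60 Hz panel: each source frame should be
# visible for 25 ms, but the panel only swaps every ~16.7 ms, so frames
# alternate between lasting 2 refreshes and 1 refresh -- visible judder.

def refreshes_per_frame(fps, hz, frames):
    counts = []
    for n in range(frames):
        start, end = n / fps, (n + 1) / fps
        # count refreshes whose scanout time falls in this frame's window
        counts.append(sum(1 for k in range(hz * frames)
                          if start <= k / hz < end))
    return counts

print(refreshes_per_frame(40, 60, 6))  # [2, 1, 2, 1, 2, 1] -- uneven: judder
print(refreshes_per_frame(30, 60, 6))  # [2, 2, 2, 2, 2, 2] -- divisible: smooth
```

The evenly divisible case gives every frame identical screen time, which is exactly why syncing to 144/72/48 on a 144hz panel works.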

It highly depends on the monitor, and is rarely mentioned in the technical specs. My ASUS PA238Q accepts any refresh rate between 40 and 70Hz, but anything other than 60Hz indeed results in judder (skipped or missing frames...) because the panel always runs at 60Hz. But some other screens behave differently.

You guys need to lay off the IPS fandom. It's stupid and vastly overstated. The advantage is viewing angle; that's it, it stops there. For gamers IPS is actually stupid given the increased response time. Seriously, stop harping on it constantly. The only people who need IPS are professional image creators/editors. NO ONE else outside of mobile.

You have never played on an IPS running at 120Hz at 1440p, then. Once you do, you will eat your words. Lag is not an issue with the Tempest, and the color/clarity/brightness plus screen size and high refresh rate kill, absolutely KILL, any other panel on the market. Get one, then comment. Reply

He's talking about the overclockable IPS (and PLS) 2560x1440 monitors that Anandtech has mysteriously failed to cover in their articles, despite the very significant enthusiast fanbase for these monitors. Reply

Does G-Sync have any potential benefit for film/video playback, especially media encoded at sub-30Hz framerates? This seems conspicuously absent from the discussions I have seen thus far. My HDTV can already sync with a 24p source to display the native film framerate by essentially rendering each frame 5 times @ 120Hz. G-Sync should be able to trivially match video framerates (using frame duplication if needed) and eliminate any need for 3:2 pulldown or similar approximations. This, of course, speaks to a niche (A/V-philes) within a niche market (PC enthusiasts) who would currently be limited to relatively small desktop displays unless this sort of tech works its way into projectors and larger TVs. Reply

On TV there is a constant 24 frames per second standard, so the display is always in sync if it supports the 24p standard. The problem with games is that there is variance in frame timing; in movie and TV material the rate is always the same, so the problem isn't there. The problem with movies is that 24 FPS is too little. Even the "super smooth" 48 FPS used in The Hobbit isn't so great, and some people think it destroys the "movie feel". I think Peter Jackson should have shot it at 96 FPS so it would be on par with good ole CRT screens! IMHO Reply

Based on this, it seems like we'd all be better served by 240Hz 1080p displays than G-Sync. They'd be more affordable, allow us to use more affordable GPUs (instead of attempting to power a 4K display), and in TVs they would allow for that AMAZING feature that apparently no one uses, where you can have split-screen multiplayer with each person getting the full screen. Reply

A higher static refresh rate requires more interface bandwidth. Abrash talks about 1000Hz panels, which would be great, but the current interfaces aren't anywhere near that in bandwidth yet.

You can have split-screen multiplayer with two people at 120Hz (with glasses, of course). At 240Hz you could have four players, but then each player would have the same V-Sync issues a regular 60Hz monitor has today. Reply

This sounds like an attempt to recover some $$ from system builders, overcharging for mid-range hardware. It's a separate 'piece' and perceived as a separate sale. If this tech were licensed and not 'made by NVIDIA', I could see an industry standard here. And since no one else was going to do it, I'll give credit for market making. However, my inner geek is appalled: I'd rather just get the horsepower into the computer and enable 120Hz or 144Hz V-Sync in the driver, if your monitor can handle it. In addition to the polling overhead, I see this being more of a distraction from a 'build your own' standpoint.

I also have to ask: why now? None of the consoles have NVIDIA chips, so none of the next-gen TVs will use it (probably).

I can only guess that this is a first step toward delivering data-center-rendered graphics to whatever device, at whatever refresh rate and resolution.

Considering that since Kepler the upgrades have only been incremental for gamers (not so sure about data center cards), just keeping up the AMD tit-for-tat, is G-Sync someone's attempt to convince executives that there is still a market in PC gaming? A 'get it out the door before it's over' product?

Nothing here makes me want to buy this instead of a water block, pump, reservoir, etc. for a similar price. So we're left with non-builder PC gamers. Does anyone like that even exist? Reply

Great. Another way to get morons to spend extra money. Another way to trick people into buying a new monitor or a video card.

First of all, tearing and stuttering are not that big an issue. Certainly not big enough to get a reasonable person to shell out $100+ for a new monitor. It barely ever happens, and when it does, I barely notice it, because I'm used to it from the old days of gaming. Honestly, with my current setup (670 GTX + 120Hz Acer monitor) I don't have any tearing or stuttering. I just don't know what these people are talking about! But hey, if they keep telling us that we NEED a new monitor for "smoother" gameplay, we just might believe it if they keep forcing it on us. Great way to create hordes of mindless consumers who buy a new product they don't really need. Reply

"G-Sync also lowered my minimum frame rate requirement to not be distracted by stuttering. Dropping below 30 fps is still bothersome, but in all of the games I tested as long as I could keep frame rates north of 35 fps the overall experience was great. "

Anand, what are you talking about?! Are you really that spoiled? Why is it that when I play at 30 fps the experience is butter smooth for me, yet you need to be north of 35 fps? Seriously... Reply

After 3 days on a 120Hz monitor I could not go back to 60Hz, and in fact started demanding a 90fps minimum in-game for it to feel 'smooth.' 30fps feels butter smooth to you because you have only ever experienced one single, disgusting, Soviet Republic brand of butter. Reply

Hmm, interesting article. I actually have the ASUS VG248QE. While G-Sync sounds intriguing, what I find even more promising is the use of LightBoost to give CRT-like quality to the panel. With my current setup I have a GTX 680 with the max framerate limited to 120fps via EVGA's OC tool. On a 1080p screen, with a 120Hz refresh rate and 2D LightBoost enabled, you get absolutely no motion blur, very little tearing, and overall an amazing gaming experience. Since you have the hardware already, I'd be interested in hearing your opinion on 2D LightBoost + G-Sync (at 120Hz), and whether that makes any difference. Also, I'd love it if Anandtech did an article on LightBoost monitors as well! My ideal monitor would be something like a 27in 2560x1600 IPS panel with 120Hz LightBoost support... of course I'd need something like dual 780s to get the most out of it, but it'd be well worth it to me, heh. Reply

LightBoost doesn't work well at low framerates, since you'd see the backlight flicker. If you flicker it more than once per frame you introduce retina blur again. It works best at high, stable framerates. G-Sync would still be useful with LightBoost if your framerate hovers between 60 and 120, though. Reply

Some of them darken a lot, and others darken less. Some have better contrast ratios, and much better colors. Some of them (BENQ Z-series) can strobe at 75Hz and 85Hz, if you want zero motion blur with a bit less GPU horsepower. Some of them are zero-ghost (no double-image effect). But you can't "get it all" simultaneously.

From my experience playing on the EIZO FG2421 (warmed up after 30 mins to reduce VA ghosting on cold panels), it's lovely to have a bright and colorful picture, something that LightBoost has difficulty with. The VA panel ghosts a bit more (until it warms up), but when I sustain 120fps@120Hz (Bioshock Infinite, VSYNC ON on a GeForce Titan), it produces spectacular motion quality, the most CRT-like quality I have ever seen.

Now, if I fall below 100fps a lot, like Battlefield 4, I prefer G-SYNC because it does an amazing job of eliminating stutters during fluctuating framerates.Reply

All of those articles focus on the variable-Hz function of G-Sync and not the supposedly "superior to LightBoost" backlight-strobing option. The articles say 30 to 40 fps is "fine", with 40 being the sweet spot. I would disagree. These same people complain about marginal milliseconds of input lag, yet accept long "freeze-frame" milliseconds with open arms in order to get more eye candy. I think people will crank up their graphics settings and get 30-40fps. At 30fps you are frozen on the same frame of world action for 33.3ms, while the 120Hz+120fps user sees 4 game-world action-update "slices". At 40fps you see the same frozen slice of game-world action for 25ms, while the 120Hz+120fps user sees 3 action-slice updates. This makes you see new action later and gives you fewer opportunities to initiate action (fewer "dots per dotted line length"), and then you add input lag to the already out-of-date game-world state you are acting on. Additionally, higher Hz + higher frame rates provide aesthetically smoother control and higher motion and animation definition. Of course 120Hz also cuts the continual FoV-movement blur of the entire viewport by 50% (vs the 60Hz baseline's full "outside the lines" smearing), and backlight strobing at high Hz essentially eliminates FoV blur (Eizo FG2421 now, the supposedly "superior to LightBoost" strobing mode of G-Sync monitors in the future). Reply
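The freeze-frame arithmetic in that comment checks out, as a quick sketch shows (illustrative function names, not from any real API):

```python
def frame_hold_ms(fps):
    """How long each rendered frame stays the newest available image."""
    return 1000.0 / fps

def slices_seen_by_120fps_user(fps):
    """120Hz+120fps updates delivered while the low-fps user holds one frame."""
    return round(120 / fps)

print(round(frame_hold_ms(30), 1))      # 33.3 (ms frozen per frame at 30fps)
print(slices_seen_by_120fps_user(30))   # 4
print(round(frame_hold_ms(40), 1))      # 25.0 (ms frozen per frame at 40fps)
print(slices_seen_by_120fps_user(40))   # 3
```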

60Hz vs 120Hz vs backlight strobing. Note that newer monitors like the Eizo FG2421, and the future "superior to LightBoost" backlight functionality of the G-Sync strobe mode (unfortunately mutually exclusive with the variable-Hz mode), do not / will not suffer the lowered brightness and muted colors of the LightBoost "hack" shown in these examples, but will still eliminate the blur shown: http://www.blurbusters.com/faq/60vs120vslb/ Now remember that in reality it's not just a single cel-shaded cartoon object moving across your screen; rather, your entire 1st/3rd-person viewport of high-detail textures, depth via bump mapping, "geography"/terrain, architecture and creatures is smeared "outside of the lines" or "shadow masks" of everything on screen every time you move your FoV at 60Hz, more within the "shadow masks" of on-screen objects at 120Hz but still losing all detail, textures and bump mapping, and with essentially zero blur when using backlight strobing above 100Hz. Reply

I'm more interested in high fps and zero blur, obviously, even if I have to turn down the ever-higher, *arbitrarily set by devs* graphics-ceiling "carrot" that people keep chasing (that ceiling could be magnitudes higher if they wanted). I still play some "dated" games too... the fps is high.

You are seeing multiple frames skipped and are behind a 120Hz+120fps user, watching "freeze-frames" for 25ms to 33.3ms at 40fps and 30fps, and every time you move your FoV you smear the entire viewport into what can't even be defined as a solid-grid resolution to your eyes/brain. So much for high rez. I think people are sacrificing a lot of motion, animation and control aesthetics, as well as seeing action later and getting fewer, later opportunities to initiate actions, all to reach for higher still-frame eye candy. You don't play a screenshot :b Reply

The "biggest issue with what we have here today" is not that it's nVidia only. That's a big issue, to be sure.

The biggest issue is that there are a LOT of us who have fantastic displays that we paid top dollar for and will not go down to 16:9 or TN panels. Hell, a lot of us won't even spend the same money we just spent on our incredibly expensive and incredibly hard-to-resell monitors to get this technology, which should have 1) been included from the start in the LCD spec and 2) had a way of being implemented that involves something other than tossing our old monitor in the bin.

They need to make an adapter box for monitors without built-in scalers that translates what they're doing to DVI-D. Otherwise, there are a LOT of people who won't see any use from this technology until they get around to making 4K monitors that include it with IPS, at an even semi-reasonable price.

Really, the biggest problem is they didn't find a way to adapt it for all monitors.Reply

In regard to the backlight-strobing functionality, the Eizo FG2421 is a high-Hz VA panel whose backlight-strobing "zero blur" capability is independent of GPU camps.

We are talking about gaming usage. Practically all 1st/3rd-person games use HOR+ / virtual cinematography, which means you see more of a scene in 16:9 mode, even if you have to run 16:9 on a 16:10 monitor. 16:10 mode basically cuts the sides off: http://www.web-cyb.org/images/lcds/HOR-plus_scenes...

GPU upgrades can run $500-$1000 for the high end now too, and somewhere in between (or double) for dual GPUs. 16:10 vs 16:9 is really a bigger deal at 1200 vs 1080, even for desktop use. A 16:10 30" isn't as much of a real-estate difference over a 2560x1440 27" as the size suggests; the 30" pixels are a lot larger. Here is a graphic I made comparing three common resolutions at the same ppi, or equivalent perceived ppi at viewing distance: http://www.web-cyb.org/images/lcds/4k_vs_27in_vs_3...

Imo, for the time being you are better off using two different monitors, one for gaming and one for desktop/apps, instead of trying to get both in one monitor and accepting the combined trade-offs (i.e. 60Hz vs 120Hz, lack of backlight strobing or G-Sync, resolutions too high to maintain high fps at high/ultra graphics settings relative to your GPU budget, resolutions too low for quality desktop/app usage, etc.).

Upgrades to display and GPU technology are the nature of the beast, really. Up until now you were better off getting a Korean knock-off 2560x1440 IPS, or the American-branded versions, for $400 or less, and putting a good 120Hz gaming monitor next to it. The Eizo FG2421 24" VA backlight-strobing model is around $500, so for $900+ (and a good GPU, of course) you could pretty much have the better of both worlds for the time being. Going forward, we know G-Sync will have backlight-strobing functionality, but we don't know if any of the higher-resolution monitors due to come out with G-Sync will have the 100Hz+ required to support strobing adequately. If they don't, we are back to major trade-offs between gaming and desktop use again (low Hz means low motion/animation definition, far fewer game-action updates shown per second, lower control definition, and the full 60Hz-baseline smear blurring out all detail and textures during continual FoV movement). Reply

The G-Sync module is a daughter card for another custom NVIDIA board which replaces the inputs and scaler (if present) on whatever monitor they decide to build one for. Theoretically, any panel with a suitable LVDS connection for the TCON (i.e. most of them) can be supported. Also, NVIDIA only has to provide the specs for the daughter card socket and/or a reference design for their scaler replacement board in order for any display manufacturer to implement it and create a "G-Sync ready" product.Reply

I'll leave it at this for now because I've been posting too much. To review: every time you move your FoV faster than a snail's pace on a sub-100Hz, non-backlight-strobing monitor, you drop to such a low resolution that it isn't even definably a solid grid to your eyes and brain. So you get continual bursts/path-flow of more or less the worst resolution possible, with the entire viewport dropping all high-detail geometry and textures (including depth via bump mapping) into a blur. So much for high rez.

- 1080p shows the same exact scene at 16:9 no matter what in HOR+, and HOR+/virtual cinematography is used in essentially every 1st/3rd-person-perspective game and every virtual-camera render. All on-screen objects and the perspective are the same size on a 27" 1080p and a 27" 1440p, for example (in a 1st/3rd-person game). The difference is the number of pixels in the scene, obviously providing greater detail. That is a big difference, but a much bigger difference for desktop/app real estate, where the usable area and display-object sizes change, especially considering GPU budget limits and fps in games.

- At low Hz and low fps, you get greatly reduced motion definition and control definition: far fewer new action/animation/world-state slices shown, and longer "freeze frame" periods during which a high-Hz + high-fps player sees up to several newer updates. 1/2 the motion+control definition and opportunities to initiate actions in response at 60Hz/60fps, 1/3 at 40fps, 1/4 at 30fps.

- You need at least 100Hz to support backlight strobing for essentially zero blur (120Hz is better).

- You can upscale 1080p x4 fairly cleanly on higher-rez 3840x2160 (aka "quad HD") monitors if you have to; it's not optimal, but it can work (so you can game at higher fps / lower rez in demanding games, yet still use a high-rez monitor for desktop/apps, for example).

- The Eizo FG2421 is a high-Hz 1080p VA panel that uses backlight strobing; it isn't TN.

- We know that NVIDIA is still supposed to support a backlight-strobing function as part of G-Sync monitors, just that it won't work with the dynamic-Hz function (at least not for now). So "the industry" is still addressing backlight strobing for zero blur, in both the Eizo and the G-Sync strobe option (which, again, requires higher Hz to make strobing viable).

- We know there are higher-rez, and likely IPS, G-Sync monitors due out, but we do not know if they will have the max Hz bumped up, which is necessary to utilize the backlight-strobe function adequately.

There is more to a game than a screenshot's resolution/definition. There is continual FoV-movement blur (an undefinable "non"-definition resolution, unless perhaps you were to equate it to an extremely bad visual-acuity number, "out of focus"). There is otherwise essentially zero blur using high Hz and backlight strobing. And there is high or low action-update rate, motion definition, animation definition, and control definition. Reply

Great reply, just a minor clarification: not necessarily, if you can strobe at rates below 100Hz. Some strobe backlights (e.g. the BenQ Z-series, such as the XL2720Z) can strobe lower, like 75Hz or 85Hz.

You only need one strobe per refresh, since framerate = refresh rate on strobed displays leads to zero motion blur. (This is also why a CRT at 60fps@60Hz has less motion blur than a non-strobed LCD at 120fps@120Hz.) 100Hz is recommended simply because of reduced flicker, and because of LightBoost's rate limitation. But nothing prevents zero motion blur at 85Hz, if you have an 85Hz strobe backlight (with low persistence, i.e. brief backlight flash times).

Yes, I didn't mean that it was impossible; I meant that for people like me with "fast eyesight" / visual acuity, 100Hz sounds like a good minimum against seeing flicker. I know from your posts on other forums that there are even 60Hz Sony TVs with some form of strobing, but that would drive me crazy personally. Thanks for the input, though, so everyone reading knows the rest. Reply
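The persistence argument in this exchange can be sketched numerically. Perceived motion blur on an eye-tracked moving object is roughly tracking speed times how long each frame stays lit, which is why a brief strobe flash beats continuous sample-and-hold (illustrative numbers and helper name; the ~1.4 ms flash time is an assumed typical strobe persistence):

```python
# Rough persistence model: blur width in pixels is the eye-tracking speed
# (pixels per second) multiplied by how long each frame is visible.
def blur_px(speed_px_per_s, persistence_ms):
    return speed_px_per_s * persistence_ms / 1000.0

# An object crossing the screen at 960 px/s:
print(round(blur_px(960, 1000.0 / 60), 1))  # 16.0 px smear (60Hz sample-and-hold)
print(round(blur_px(960, 1000.0 / 120), 1)) # 8.0 px (120Hz: half the blur)
print(round(blur_px(960, 1.4), 2))          # 1.34 px (~1.4 ms strobe flash)
```

This also shows why one strobe per refresh at 85Hz still gives near-zero blur: only the flash duration matters, not the refresh rate itself.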

That $120 could instead be invested in a better GPU, which can easily hit 60 FPS. $120 is just way too pricey for this. I never play FPS games with V-Sync on. Especially in BC2 the effect is terrible and I can't hit anyone; the difference is night and day. However, I never notice tearing... Reply

Technically unnecessary from whose perspective? Anand, I'm sure you meant the monitor's perspective, but this otherwise benign comment on VBLANK is misleading at best and dangerous at worst. The last thing we need is folks going around stirring the pot saying things aren't needed. Some bean counter might actually try to muck things up.

VBLANK most certainly IS "technically" needed on the other end ---- every device from your Atari VCS to your GDDR5 graphics card!

VBLANK is the only precious time you have to do anything behind the display's back. On the Atari VCS, that was the only CPU time you had to run the game program. On most consoles (NES on up), that was the only time you had to copy finished settings for the next frame to the GPU. (And you use HBLANK for per-line effects, too. Amiga or SNES, anyone?)

On most MODERN consoles (GameCube through XBox One), you need to copy the rendered frame from internal memory to some external memory for display. And while you can have as many such external buffers as you need (meaning the copy could happen any time), you can bet some enterprising programmers use only one (to save RAM footprint). In that case VBLANK is the ONLY time you have to perform the copy without tearing.

On any modern graphics card, VBLANK is the precious time you have to hide nondeterministic duration mode changes which might cause display corruption otherwise. Notably GDDR5 retraining operations. Or getting out of any crazy power saving modes. Of course it's true all GPU display controllers have FIFOs and special priority access to avoid display corruption due to memory bandwidth starvation, but some things you just cannot predict well enough, and proper scheduling is a must.Reply
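To put a number on how little time "behind the display's back" that actually is, here is a rough budget using the standard 1080p60 video timing (1125 total scanlines per frame, 1080 of them active, per CEA-861; the calculation is a sketch, not tied to any particular hardware):

```python
# VBLANK budget at 1080p60: the frame period is split proportionally
# between active scanlines and blanking scanlines.
TOTAL_LINES, ACTIVE_LINES, REFRESH_HZ = 1125, 1080, 60

frame_ms = 1000.0 / REFRESH_HZ                               # ~16.67 ms per frame
vblank_ms = frame_ms * (TOTAL_LINES - ACTIVE_LINES) / TOTAL_LINES
print(round(vblank_ms, 3))  # ~0.667 ms of blanking to hide work in
```

Under a millisecond per frame, which is why things like GDDR5 retraining have to be scheduled so carefully.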

G-Sync isn't V-blank, so, yeah, if you have G-Sync you don't need V-Blank. You can take your time rendering, not worried about what the monitor is doing, and push your updated frame once the frame is ready. This moves the timing responsibility from monitor to the GPU, which obviously is a lot more flexible.

If you need time to do GPU configuration or other low level stuff as you mention, then just do them and push the next frame when it's done. None of it will result in display corruption, because you are not writing to the display. You really can rethink the whole setup from bottom up with this. Comparing to systems that are not this is kinda meaningless.Reply

Although true -- especially for DisplayPort packet data and LCD panels -- this is not the only way to interpret GSYNC.

Scientifically, GSYNC can be interpreted as a variable-length VBLANK.

Remember the old analog TVs: the rolling picture when VHOLD was bad. That black bar is VBLANK (also called VSYNC). With variable refresh rates, that black bar now becomes variable-height, padding time between refreshes. This is one way to conceptually understand GSYNC, if you're an old-timer television engineer. You could theoretically do GSYNC over an analog cable this way, via the variable-length blanking interval technique. Reply
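That "variable-height black bar" view can be sketched in a few lines: scan-out of the visible picture takes a fixed time, and everything the frame delivery adds on top of that is absorbed by stretching the blanking interval (a toy model, with a hypothetical helper name and the 144Hz panel's scan-out time assumed):

```python
# Toy model of G-Sync as variable-length VBLANK: the panel's scan-out time
# is fixed, so a slower frame simply gets a longer blanking interval.
def blanking_ms(frame_time_ms, scanout_ms=1000.0 / 144):
    """Blanking padding needed after a frame that took frame_time_ms."""
    return max(0.0, frame_time_ms - scanout_ms)

print(round(blanking_ms(20.0), 2))  # a 50fps frame -> ~13.06 ms of "black bar"
print(round(blanking_ms(5.0), 2))   # faster than scan-out -> 0.0 (panel limit)
```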

Yeah, put this into perspective: "refresh rate" is an artificial invention, and "frame rate" is an artificial invention.

We had to invent them about a century ago, when the first movies came out (19th century), and then the first televisions (early 20th century). There was no other way to display recorded motion to framerateless human eyes, so we came up with the invention of a discrete series of images, which necessitates an interval between them. Continuous, real-life motion has no "interval", no "refresh rate", no "frame rate". Reply

While you can still see the effects at 120Hz, tearing and lag are barely visible at all at 144Hz. At this point, I can easily do without G-Sync. G-Sync is certainly a nice technology if you use a 60Hz monitor and an nVidia card, but nearly the same effect can be achieved with lots of Hz. A 27" 1440p@144Hz monitor might be quite expensive, though. ;) Reply

The article states that at any framerate below 30fps G-Sync doesn't work, since a panel refresh is forced at a minimum of 30Hz. That conclusion doesn't make sense: unlike a "true" 30Hz refresh rate, after every forced refresh G-Sync can allow a quicker refresh again.

Since the refresh takes 1s/144 ≈ 7ms on this panel, and a 30Hz refresh interval is ~33ms, a frame whose rendering takes longer than 33ms but shorter than 40ms will finish during the forced refresh and have to wait for it. Translated, that means you only get stuttering when the instantaneous frame rate is between 25 and 30fps. In practice I'd expect frame rates to rarely be that consistent; you'll get some >30fps and some <25fps moments, so even in a bad case I'd expect the stuttering to be reduced somewhat; at least, were it not for the additional overhead due to polling. Reply
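That 25-30fps window can be sketched directly from the numbers in the comment (an illustrative model with a made-up function name, using the panel's ~7ms scan-out and the 30Hz forced-refresh floor):

```python
# Below 30fps the panel is force-refreshed every ~33.3 ms, and that refresh
# takes ~6.9 ms to scan out on a 144Hz panel. A new frame finishing inside
# that scan-out window has to wait, producing the stutter described above.
FORCED_REFRESH_MS = 1000.0 / 30   # ~33.3 ms floor between refreshes
SCANOUT_MS = 1000.0 / 144         # ~6.9 ms to redraw this panel

def frame_must_wait(frame_time_ms):
    return FORCED_REFRESH_MS < frame_time_ms < FORCED_REFRESH_MS + SCANOUT_MS

print(frame_must_wait(1000.0 / 28))  # ~35.7 ms frame lands mid-refresh -> True
print(frame_must_wait(1000.0 / 45))  # ~22.2 ms frame, above 30fps -> False
print(frame_must_wait(50.0))         # below 25fps, refresh already done -> False
```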

Just a thought, but one thing you really didn't take into account is NVIDIA's next GPU series, Maxwell, which supposedly will have the VRAM right next to the die and will share stored data with direct access from the CPU. If you take that into account along with G-Sync, you can see what will happen to framerates if they can pull off a die shrink as well. At this point I think monitor technologies are so far behind, and profit-milked to the point of stifling the industry, that 1080p has had a far longer cycle than we should expect, partially due to mainstream HDTVs. We should have had the next jump in resolution as a standard over a year ago in the $200-$300 price range, and standard IPS technology with lower ms response in that range as well, or been introduced to other display types. LED backlighting carried a heavy price premium when in reality it is ridiculously cheaper to produce, because the hazardous-waste CFL bulbs, which cost more and carry certain disposal fees, are taken out of the equation. Reply

I just don't see why they can't integrate this into the video card completely and work out a method with display manufacturers to bypass the scaler. My closest guess is that NVIDIA doesn't want to add that $120 (if it's even that much) to their cards. Reply

Thanks for that very interesting review, Anand! Glad you're not only doing iWhatever by now ;)

Some people have brought up the point that one could simply get more raw GPU horsepower and push for high frame rates with V-Sync on. I think G-Sync is superior; in fact, I'd formulate it the other way around: it could let you get away with a smaller GPU, since 30-60 fps is fine with G-Sync on. Apart from the GPU purchase, this also saves on power consumption, cooling requirements, noise, etc.

And this could go really well with mechanisms built into the game engines to ensure a certain minimum frame rate by dynamically skipping or reducing the complexity of less important stuff.

And decoupling of the AI and interface from the display refresh, of course.Reply

When will you guys talk about G-Sync's alternative modes? I was curious about what improvements they made to LightBoost with their Low Persistence Mode. I was a little shocked to see you say G-Sync won't have much benefit for those who can push out a lot of FPS, which is correct, but LPM is supposed to address exactly those people, and it wasn't mentioned. Reply

Stuttering? Stuttering? What? I have never experienced stuttering with V-Sync on, except when I turn pre-rendered frames up too high. It should always be 0 or 1, never more, or you GET stuttering and input lag. Also, LCDs have far too much motion blur to notice slight stuttering. Reply

Unfortunately, with this being a proprietary hardware/software 'gimmick', I don't see it taking off. A standard needs to be created for this to really take off and be viable for everyone. If it can be adopted by everyone (ATI, Intel, Matrox, VIA, etc.) it will no longer be a gimmick. Reply

This is hardly a gimmick. This is absolutely a game changer that gives Nvidia all the cards for capturing high-end gamers.

Of course it makes sense for ALL high-end gamers (people willing to buy Nvidia's and ATI's flagship GPUs around this time of year annually) for it to be licensed out eventually, but it makes sense for Nvidia not to do that for a year or so.

The benefits are a no-brainer for a solved problem that isn't necessarily critical unless you're already in the niche part of the audience that values high-quality entertainment.

It's not too different from Apple's retina displays early on, if you ignore the closed access to this technology: you have to SEE it to believe it, but the problem it solves is a no-brainer to capture if you're already willing to pay that much for a laptop/desktop.

If you're deciding between the best tier of Nvidia card or ATI card right now for gaming, this technology, along with PhysX, makes Nvidia the rational choice to side with.

They undisputedly have the best cards this year, the better proprietary technology that enhances games in a progressive way (if you don't have it, you don't notice it; if you do have it, it's very satisfying), and their SLI is more consistent in what it delivers compared to Crossfire, which has had issues ATI is still solving to bring parity back to that discussion.

G-Sync being exclusive to Nvidia graphics cards for a year isn't too much of a big deal.

I wouldn't be surprised nonetheless if they license it after a year or so to offset any attempt for someone to research their own answer to it.

My only concern is the component that replaces the monitor scaler: would it pose a problem with non-NVIDIA cards in the future? Reply

This seems perfect for mobile gaming: only one display, and a constant lack of performance. And I don't think the cost of implementation has to be $120; that's such a random figure taken out of context. Reply

I'd love to see a report on how this behaves with a title like Deus Ex:HR-DC, which has an egregious stuttering issue in hub areas. It's not clear this stuttering is a rendering issue, so how would g-sync respond?Reply

No mention of mouse lag in the article. Somehow it goes unnoticed by some folks, while for others (me) it downright makes a game unplayable, at least competitively.

With V-Sync on and the system capped at 60fps, mouse lag in some games is just terrible. Forget stuttering and screen tearing; for me this is much worse. That's why I bought a 120Hz LCD. I can either turn V-Sync off and enjoy zero mouse lag and relatively no tearing in modern games that are FPS-bound, or I can turn V-Sync on and experience considerably less lag than at 60Hz and a beautiful artifact-free picture.

But if G-Sync eliminates mouse lag entirely, as well as any visual issues, that seems like an awesome tech that I wish were implemented in all setups. Although it wouldn't be as useful for someone with a 120Hz+ display as for someone at 60Hz, it'd still be a better overall experience. Reply

One of the BIGGEST FAILS at introducing a new tech I have ever witnessed. 1080p = automatic fail. As the author mentioned, the entire point of the technology would be better suited to a higher res. Investing $400 in a 1080p monitor? No thanks. This is a complete mismatch of technology. If you can't run over 60fps on a 1080p monitor, you don't spend $400 on a proprietary monitor that can look good at 30Hz; you start with a better video card. Reply

Hi, everyone. I have a gtx 670 paired with an ASUS VG248qe. I've always thought that I've been quite sensitive to screen tearing, and once I started playing games with vsync off at 120/144hz refresh rates, tearing seemed to have disappeared altogether. Whether it's bioshock or spelunky, the buttery smooth motion flits past my monitor with no tearing. Of course, when I can't match my output FPS to the high refresh rate of my monitor, there's some motion blur (with LB on or off), but no tearing.

So it seems like G-Sync is unnecessary for me, because my graphics card doesn't need to throttle itself to match my monitor's high refresh rate. I probably won't see any difference; in fact, as the article stated, I'll see a slight decrease in fps. Anand was using a 760, which seems like a better fit for G-Sync. But would someone with a GTX 670 and a high-refresh-rate monitor (120Hz+) really need this technology? Reply

Somatzu - I just recently completed a DIY on this myself and I have a GTX Titan. There are still some pretty demanding games like Metro: Last Light and BF4 multiplayer that occasionally bring frame dips down to the 40-60 fps range. For those who have higher end PCs, the G-SYNC completely removes the occasional stutter that comes from big frame drops during high-intensity games. And for those games where you tend to hold back on extra features like AntiAliasing, you can now crank them up a little bit and still get really smooth performance. Is it 100% necessary if you've already got a high end rig? Not necessarily, but it's definitely more pleasing to the eye when playing games where frame drops are common (i.e. BF4 multiplayer) and stuttering when you need to twitch is the difference between life and death.

Also, the GSYNC module comes equipped with a toggle-able improved "Lightboost" mode called ULMB, so no more ToastyX hack is required for Lightboost.Reply