It looks like the release of the Nvidia GTX 680 is imminent. All current reports point to it being based on GK104 and priced at $500 or $550, with about a 10% performance increase over AMD's 7970. It seems we will have to wait for a more powerful card based on GK110.

Also rumored is a new, much more effective anti-aliasing technique and support for three displays on one card.

Leaked photos show a smallish card given its supposed performance, and it uses just two 6-pin power connectors. It could be a boon for high-end HTPC gaming applications where significant power and low noise are wanted.

TXAA looks good; the rest is fluff. Clock speeds and throughput... that's all we care about, Nvidia. Decreased noise is also important, however, because, let's face it, you jam a case up with a couple of these things and you don't want to hear the hairdryers running while you're trying to immerse yourself in, say, Dead Space.

Leaked Tom's Hardware review makes it look like the base unit is averaging 20+ FPS higher than 580s at stock speeds... wonder what the OC potential on the 680s will be (and Skyrim barely budges, obviously, since it's CPU-bottlenecked)... hmmm, wonder how it runs Crysis.

Xbox Live / PS3 / Steam: HeadRusch1
Keeping the world safe from the evil antics of Bernie Tanaka and Mel Fujitsu since 1986

Leaked Tom's Hardware review makes it look like the base unit is averaging 20+ FPS higher than 580s at stock speeds... wonder what the OC potential on the 680s will be (and Skyrim barely budges, obviously, since it's CPU-bottlenecked)... hmmm, wonder how it runs Crysis.

I was just reading that. Not bad at all, especially at 1920x1080.

I hope a site does PhysX benchmarks in games like Arkham City, and compares it to the 580.

The next couple of months will be interesting. I was about to pull the trigger on a 7870 (and probably a new PSU, since my 460W might be a bottleneck for it), but now I'm thinking about Kepler. Am I normal, doctor?

Since I'm already using an AMD card, and I'm reading a lot of horror stories about driver issues when switching from AMD to Nvidia... not really interested. Also, if the new Kepler is priced around $500+, then adding a PSU gets expensive. The 7870 should probably suffice for gaming on a single monitor at 1080p. We'll see; I'm not settled yet.

Those who bought a 7970 should not feel bad at all when it comes to performance.

I just wish competition had dictated that this card be priced as the mid-range card it probably should have been, with the more powerful GK110 looming. Now people will have to pay through the nose for the GK110.

Some review links have been added to the original post.

It's as powerful as or more powerful than the 7970 at a lower price, and will probably drive 79xx prices down. A win for the consumer.

GTX 680 is impressive, outperforming the 7970 at triple monitor resolutions with less RAM.

I hope Kepler will in fact drive AMD to cut 79xx and 78xx prices.

For a single monitor, I don't know if it's worth having such a powerful card except to be future-proof... 130 FPS in Skyrim at Ultra... giggydy giggydy ya!

I'm torn; I don't know what to buy.

Huh, where are you seeing 130 FPS in Skyrim? Skyrim has been scaling kind of crappily on higher-end GPUs, has it not? Regardless... what is the 110 Kepler you guys are talking about? Is that their single-PCB/dual-GPU solution?

The GK110 is a single-GPU solution. It is rumored to have a massive die size of about 550 mm², and is expected to be released later this year.

Thanks, I didn't realize there were "two stops" on the Nvidia roadmap this year... looks like this is a more efficient 580 (yes, I know, different architecture): phase out the Fermis and bring Kepler to bear... but I guess the 110 is the one folks are waiting for?

I'm still tempted to upgrade, if only to see what that adaptive vsync is all about.

OK, snagged my two for SLI! Two of the EVGA cards are on their way overnight for my new i7-3930K gaming box.

Reviews look good out there.

This has some new features compared to Fermi, too: it changed the way texture shaders are built and used (compensating by going with way more CUDA cores), added new anti-aliasing modes like TXAA, and changed how PhysX is handled, eliminating the old PhysX processor.

I think I'm going to skip the generic reference boards and wait for a Twin Frozr overclocked or superclocked version, then push that a little further. It looks like everything being released today and tomorrow falls under the realm of "stock speeds." Nothing wrong with that; I'm sure the OC potential is there. Can't wait to hear you guys' experiences, ESPECIALLY if you are upping from a 570 or 580 like I would be.

Thanks, I didn't realize there were "two stops" on the Nvidia roadmap this year... looks like this is a more efficient 580 (yes, I know, different architecture): phase out the Fermis and bring Kepler to bear... but I guess the 110 is the one folks are waiting for?

I'm still tempted to upgrade, if only to see what that adaptive vsync is all about.

I would imagine that it's mostly owners of multi-monitor setups who are waiting to see what the GK110 is about.

As for adaptive vsync, I think (I could be wrong) the idea is for vsync to be engaged when the frame-rate gets higher than the display's refresh rate, and then disengaged when the frame-rate is lower than the display's refresh rate. The problem with the latter point is that tearing will still occur (albeit to a lesser degree) as the output will not be perfectly sync'd to the display's refresh cycles. It is a nice option to have, but I'll stick with regular vsync and enable triple buffering when lag (or more specifically, frame-rate halving) is an issue.

The problem with the latter point is that tearing will still occur (albeit to a lesser degree) as the output will not be perfectly sync'd to the display's refresh cycles. It is a nice option to have, but I'll stick with regular vsync and enable triple buffering when lag (or more specifically, frame-rate halving) is an issue.

I *thought* they said it would match the framerate to the monitor's refresh rate, so you wouldn't get the immediate drop in frames when you got less than 60 FPS on your 60 Hz monitor... almost like putting some kind of hardware triple buffer in there, so if you are at 45 FPS you'll get 45 FPS progressive without tearing.

Obviously, there are MATHS at play. Skyrim is the last game where tearing drove me insane, due to both high framerates and low ones...

And it's not really tearing, more like microstuttering; then you engage vsync and have to deal with the input lag (even with forced double or triple buffering, you still get the mouse lag).

And it's not really tearing, more like microstuttering; then you engage vsync and have to deal with the input lag (even with forced double or triple buffering, you still get the mouse lag).

In my experience with Direct3D games, forcing triple buffering almost always eliminates noticeable mouse lag in games that have it with vsync on. This revelation first came about when I tried it with the game FEAR.

In my experience with Direct3D games, forcing triple buffering almost always eliminates mouse lag in games that have it with vsync on. This revelation first came about when I tried it with the game FEAR.

I was using D3Doptimizer or whatever to try to force triple buffering with vsync on in Skyrim, and always got "mediocre" results. With vsync turned off, you get horrific microstuttering. I tried setting the read-ahead frames to 1, but that made things worse. Vsync on is perfect, and you just deal with the mouse lag AND the painful drop to 30 FPS when things slow down a bit...

But honestly, Skyrim is a bad offender... many other games don't give me grief about vsync. In fact, I almost always ran with vsync off and never noticed stuttering or tearing... *shrug*.

I am very interested to hear how the 680 changes that equation.
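That "painful drop to 30 FPS" is what double-buffered vsync does by design: a finished frame has to wait for the next refresh tick, so the output rate snaps to whole-number divisors of the refresh rate. A rough illustrative model (my own sketch, not anything from the drivers):

```python
import math

def vsynced_fps(frame_time_ms, refresh_hz=60):
    """Effective framerate under double-buffered vsync (simplified model).

    Each frame occupies a whole number of refresh intervals, so output
    snaps to refresh_hz, refresh_hz/2, refresh_hz/3, and so on.
    """
    interval_ms = 1000.0 / refresh_hz  # ~16.7 ms at 60 Hz
    # Number of whole refresh intervals each frame ends up occupying.
    intervals = math.ceil(frame_time_ms / interval_ms)
    return refresh_hz / intervals

# A 10 ms frame fits in one interval: full 60 fps.
# A 17 ms frame just misses the budget, spans two intervals: straight to 30 fps.
print(vsynced_fps(10.0))  # 60.0
print(vsynced_fps(17.0))  # 30.0
```

Triple buffering sidesteps this because the GPU can keep rendering into a third buffer instead of stalling until the swap, which is why forcing it recovers the in-between framerates.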

Xbox Live / PS3 / Steam: HeadRusch1Keeping the world safe from the evil antics of Bernie Tanaka and Mel Fujitsu since 1986

I was using D3Doptimizer or whatever to try to force triple buffering with vsync on in Skyrim, and always got "mediocre" results. With vsync turned off, you get horrific microstuttering. I tried setting the read-ahead frames to 1, but that made things worse. Vsync on is perfect, and you just deal with the mouse lag AND the painful drop to 30 FPS when things slow down a bit...

But honestly, Skyrim is a bad offender... many other games don't give me grief about vsync. In fact, I almost always ran with vsync off and never noticed stuttering or tearing... *shrug*.

I am very interested to hear how the 680 changes that equation.

Skyrim does have all kinds of oddities going on.

Forcing triple buffering on top of vsync has made for perfect gameplay with a number of games that were at the time taxing my GPU. Some games use triple buffering by default, and some others give you the option to enable it via menu (such as some older Tomb Raider games). Other Direct3D games, like FEAR and Just Cause 2, require use of D3DOverrider for this function.

I am definitely interested in knowing more about adaptive vsync. The chart I saw made it look like a simple algorithm that enables vsync when the frame-rate would otherwise be above the display's refresh rate, and then disables vsync when the frame-rate goes below the display's refresh rate.
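If that chart is right, the per-frame decision would be something as simple as this (a hypothetical sketch of the idea, not Nvidia's actual driver logic):

```python
def adaptive_vsync_on(frame_time_ms, refresh_hz):
    """Per-frame adaptive vsync decision, as described above (hypothetical).

    Sync when the renderer can keep up with the display (avoids tearing);
    otherwise present immediately (avoids the drop to half the refresh rate,
    at the cost of some tearing).
    """
    refresh_interval_ms = 1000.0 / refresh_hz
    return frame_time_ms <= refresh_interval_ms

# At 60 Hz the budget is ~16.7 ms: a 10 ms frame gets synced,
# a 22 ms frame is presented unsynced.
print(adaptive_vsync_on(10.0, 60))  # True
print(adaptive_vsync_on(22.0, 60))  # False
```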