
132 Comments

Since mid-range cards aren't really that strong yet for 2560x1440(/1600), I'd say 2-3 years waiting for it to be strong enough for 4K is wishful thinking and at the same time, games will be even more demanding. Still, very nice to see the numbers though.

Yeah, there's no way even dual mid-range cards will handle games at 4K in just 2-3 years.

I'm also expecting quite a graphics jump, and especially a big VRAM usage jump, because of the new consoles and the good amount of memory they have. We're going to start seeing very high-res textures. So mid-range cards will also start needing 6GB+ for PC games, especially at 4K res in 2-3 years' time.

Next year, video cards will double in density. Three years from now, that will be doubled again. In other words, the systems in three years will be hugely more powerful than what we have now, much less the nearly year-old Titan, which, again, you are comparing to a new video card that will be twice as dense as a video card that is twice as dense as the Titan, approximately. Think of what card was released four years before the 680, and look at how they stack up performance-wise. Also keep in mind that VRAM is a big part of all of this, and all signs point to the next generation of cards having more VRAM as standard.

Um, the GTX Titan isn't anywhere near a year old. I don't think that this meteoric rise in GPU power will happen quite as fast as you're expecting/hoping for, especially in reasonably priced consumer products.

The main problem with the future of Moore's Law is the actual physical boundaries. Eventually we will get to the atomic size limit of transistors, which means that some serious innovation will have to occur to tiptoe around that boundary. We don't know when we'll hit that limit, but it'll happen. Until then MordeaniisChaos will be correct.

I'd think that mid-range cards would actually do a lot better at 4K if they simply all had their VRAM doubled, especially the Nvidia cards with 2GB/GPU. An 8GB GTX 690 should be pretty potent at 4K if settings are balanced out a little.

Note that I appreciate Ian pushing the sliders all the way to the right, to set a good performance baseline for 4K. I'm sure they'll have much more reasonable reviews when 4K60 panels in desktop form factors start dipping into the enthusiast-friendly <$1000 bracket in the next year or so.

Even a decent card like a 7950 has trouble at high reses. I run triple 2560x1440 27s. For one display it's fine, but for three, we're talking medium settings in Crysis 3 and BF3 with the cards being badly VRAM capped. That was one thing I felt was missing in this review: VRAM usage.

This is an excellent question. Isn't the point of these rendering features to eliminate artifacts that are a result of the pixellation and line breaking at lower resolutions? Shouldn't these artifacts be inherently eliminated by the higher resolution?

You need some AA at any resolution for a game to look its best. There are patterns that the human eye can pick out relatively easily that are still present on high-PPI/DPI monitors.

Example: go take a look at Galaxy on Fire 2 running on the iPad 4. I know for a fact it runs at native resolution, but there are times when you notice aliasing going on, or at the very least some weird jagged edges, especially when looking at a ship against the black background of space.

That said, SSAA is COMPLETELY unnecessary for this resolution. It'd be nice to get some analysis of what kind of settings we can expect out of games for the near-ish future at 4K. That's dependent on getting one of the reviewers a 4K TV to test on, which should be easy with those Seiki sets available.

Ian, just a thought: given the level of GPU power we're talking about here, and what future displays might require, perhaps European power outlets will just about be able to cope given their 3kW+ typical limit, but what about US power outlets? Despite the inevitable advances in lower voltage GPUs, etc., is it possible we'll see ever more heat & power problems in the future at the top end of PC gaming? Even if it's not bad enough to be a problem at the wall socket (at least not for Europeans), what about venting the heat from out of a case? Makes me wonder whether the ATX standard itself is out of date; perhaps it needs to move up a size level.

"Shouldn't these artifacts should be inherently eliminated by the higher resolution?"

Not really, no.

The issue is that you're trying to render three-dimensional objects on a two-dimensional plane: that means that things get stretched and contorted. Also, these features are useful for blurring the edges between different rendered objects.

That's what I was wondering too. At such a high resolution and a high enough pixel pitch, any form of anti-aliasing is just pointless. 2x SSAA is practically equivalent to running the cards at 8K, a resolution which is still a decade or two away from being seen outside the lab.
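A quick sample-count sketch of the comparison above. This assumes "2x SSAA" here means supersampling 2x along each axis (4 shaded samples per pixel), which is the interpretation that lands exactly on the 8K grid; the figures are my arithmetic, not from the article:

```python
base_w, base_h = 3840, 2160          # 4K UHD output resolution
ss_w, ss_h = base_w * 2, base_h * 2  # internal supersampled grid

print(ss_w, ss_h)    # the 8K UHD grid: 7680 4320
print(ss_w * ss_h)   # samples shaded per frame: 33,177,600
```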

While we might have to wait a little longer in the USA, NHK Japan is on track to begin 8K (Super Hi-Vision) satellite broadcasts in 2016. The codec and compression methods have been finalized, commercial equipment is being prepared for launch... 8K will be here sooner than you think. I never would have thought we'd have 4K TV/monitors for $699 already, but we do.

I've said it before, but GPU makers really need to get on the ball quick. Hopefully mainstream Volta and Pirates will be up to the task...

Error: in 2016 Japan will start the trials for 8K, but it's not going to be rolled out until 2020 (if they get the Olympics). You can read the official paper from the Ministry of Internal Affairs and Communications.

Most movies shot on film (35/70MM) pre-2000 are scanned at 2K or 4K (a select few have been scanned at 8K), then sampled down to 1080p for Blu-ray with some to 4K for the new 4K formats rolling out right now. Those movies were edited on film using traditional methods. Any CGI was composited on the film. Those movies have an advantage of always being future proof as one can always scan at a higher resolution to get more fidelity. 35MM tops out at about 8K in terms of grain "resolution", while 65/70MM is good for 16K. Results vary by film stock and quality of elements.

Most movies shot on film after the DI (digital intermediate) became mainstream had ALL of their editing (final color grading, contrast, CGI, etc.) completed in a 2K environment no matter what resolution the film was scanned at. "O Brother, Where Art Thou?" and the "Lord of the Rings" trilogy are examples of films that will never be higher native resolution than 2K. The same goes for digitally shot movies using the RED cameras. Even though they shot at 4K or 5K, the final editing and mastering was a 2K DI. "Attack of the Clones" was shot entirely in digital 1080p and will never look better than it does on Blu-ray. Only in the past couple of years has a 4K DI become feasible for production.

10 years from now, the only movies that will fully realize 8K will be older movies shot and edited entirely on film or movies shot and edited in 8K digital. We'll have this gap of films between about 1998 and 2016 that will all be forever stuck at 1080p, 2K, or 4K.

nathanddrews, thank you for your comment. I asked a relative who helps design movie theater projectors about that very issue. The answer is the industry has no answer, which you confirm. Most of the early digital films will depend on upscaling. On the plus side, in 20 years we should be ready for another LOTR remake anyway :)

That was the same thought that occurred to me as well. I'd love to see how the results would have been without antialiasing. As it stands I barely use it on my 1200p monitor (though admittedly a big reason for that is because of the glorified fullscreen blur filter they generally try to sell you as "antialiasing" nowadays).

I wish I did as well! Perhaps SSAA was properly overkill at that resolution; with more testing I would obviously like to see the differences. Though it may be a while before I can get a 4K in to test.

I game on a 1440p monitor, and antialiasing is still useful at that resolution. It's just not as important as at lower resolutions - on my old 1600x900, anything short of 8x AA seemed jaggy, but at 2560x1440, 2xMSAA is generally fine, maybe 4xMSAA if there's horsepower for it. In some games I actually prefer to use just FXAA, instead putting the horsepower behind some other setting. At 2160p, I would expect 2xMSAA to still be somewhat useful, although more than that is probably overkill (and 8xSSAA is already overkill at 1440p).

Of course, overkill is fun sometimes. I actually run some older games (like UT3) with 8xSSAA, just because I can still get 80+ FPS with everything maxed out.

I was thinking the same thing. At that PPI, don't you essentially get AA for free? Any chance you collected FPS with AA disabled? I would happily suffer through some very minor artifacts to get a smoother 4K experience. Not unlike when I used to play games at 1920x1200 without AA on my old Dell 17" laptop (Core Duo, 7950Go).

While I know it's a different scenario than with fixed pixel displays, I currently do that with my FW900. I run it at 2560x1600@75Hz with some games, but I run them without AA and can't detect (most) artifacts.

I know AT wants to push the bleeding edge here and that this is just a quick look, but $700-1200 for a 4K monitor (Seiki) isn't exactly in the same ultra-enthusiast territory as 4x Titans. That said, it would be nice to see some benchmarks at 4K with medium-to-high settings without AA for the less-wealthy enthusiasts.

My initial thought was that Ian was trying to set a long-term baseline; using these settings in the future, he'd be able to show how far future cards have come relative to the (future) old beasts.

Because it's there, and it clearly shows that at only 4K these GFX cores need a lot more work to come up to standard by the time people migrate to it. And note I said "at only 4K"...

The reasoning is simple: the UK, Japanese and Chinese public broadcasters have all been working to implement real 8K over-the-air broadcasting for several years now, with expected commercial retail availability by 2020. So this near-4K/2K standard is only the start of the ultra HD transition, and there's not much relative time to get computer GFX up to speed at an affordable price point to make it a generic mid-range option by 2020, never mind 2023.

Considering this is testing 4K resolutions, it's not really judging "real world" environments. So I think the SSAA and running it with everything pumped up is a good way to go. Especially considering we are on the cusp of games getting much better looking, so in a way it could very well be appropriate. We won't be playing games in 4K for quite some time just because the screens aren't really out there for the every man. By then the games will look much better, much denser, with lots of fancy new rendering techniques.

Actually, some sort of comparison of "jaggedness" on a 4K screen vs a 1080p screen would be really nice to see. I'm curious just how much better the PPI will actually make edges look. I'm sure it won't eliminate jagged edges, but surely no more than 2xSomethingAA would be enough. Only logical considering a lot of games don't benefit from 8x over 4xMSAA in any meaningful way. And something like 2xTXAA would probably look fantastic. The resolution helps with the blurring issue (which I never minded or really noticed too much, I was happy to have clean, smooth, non-crawling edges), and 2xTXAA should be pretty easy to run. Hell, and I can't believe I'm saying this, but maybe even... FXAA won't look terrible at those resolu-... *barfs* nope, can't say it.

A 7990 is effectively somewhere between 2 x 7970 and 2 x 7970 GHz Edition. So a single 7990 would be much faster than a Titan. They are, however, in no way in competition, as AMD doesn't have a Titan-class single GPU. Its 7990 goes head-to-head with a GTX 690 (which itself is effectively 2 x 680s).

It isn't a 7990 6GB. Each GPU has 3GB, so that's the max VRAM the host system can use. Adding the VRAM of each GPU is misleading marketing. Find a gaming test that runs on a Titan using (say) 4GB VRAM (e.g. heavily modded Skyrim at mega res and AA), try it on a 7990, and it'll tank.

Interesting, but how much difference to the frame rate would it have made running the CPU at stock on the 4-GPU setup and the single-GPU setup? That way we would all know to what degree clocking your CPU to 5GHz really benefits your frame rates at these resolutions, given it's only the price of 4 x 1080p monitors.

I think at this high resolution, we won't (or shouldn't) be seeing a CPU bottleneck, but rather a GPU bottleneck. So, running the CPU at stock speeds or overclocked won't (shouldn't) make a difference. If there is one, it's probably very minuscule.

I'd expect to see a difference between PCI-E 2.0 and 3.0 as well as 8x and 16x lane configurations, since SLI/CrossFire is really helpful at 4K. This would place socket 2011 chips at an advantage here.

I've said it before, and I'll say it again... 4k resolution is total bollocks and offers no improvements over the current 1080p standard.

High-density screens are for mobile devices, where you look at them from 30cm distance. And in this area it's already hitting the same wall as the 1080p resolution for TVs.

Usually you sit some 70cm away from your PC screen, or, more ergonomically, the length of your arm. At this distance, though, you won't really see single pixels anymore on a 22-24" 1080p screen, especially when talking about movies or games instead of still images. For a TV in the living room, the rule of thumb is 1m distance for every 15". So you'll sit at some 3m distance when looking at a 40" TV. At this distance you won't notice too much difference between 1080p and 4K. So the only thing a 4K resolution really does is unnecessarily increase bandwidth or storage space.

For games... yeah... look at the results here. A quad-Titan setup to get 60FPS in Metro 2033 or Sleeping Dogs. This is totally the wrong direction imho. We want hardware that can drive 4K resolutions, but at the same time we all know that energy isn't getting any cheaper. And to have a PC that sucks 1kW while playing a video game :cough: sorry, but that's just stupid.

Leave it at 1080p and get me hardware that allows me to play games like Metro at max settings with only 100-150W for the complete system, i.e. a 35W TDP CPU + 75W TDP GPU (75W is the maximum power draw over PCIe x16, so no extra PCIe power connector).

Interesting perspective. Until about two months ago I would have supported your opinion without reservation. But ever since I bought my HTC One, I'm not so sure. Going from 720p screens to 1080p screens on mobile was supposed to be a pointless exercise as well, only increasing the compute requirements and not giving anything back in return. But it DOES genuinely look a LOT better than 720p screens of the same size, so I'll reserve my opinion until I get to see a 4K desktop panel, around 27-30", myself.

For me, the ideal screen would be a 27-inch 4K IPS/OLED panel at 120Hz.

Just because you don't see the benefit of higher resolutions and high PPI doesn't mean the rest of us have to follow. A few years from now, there's a chance you'll get what you want (a 75W GPU that can max Metro). Of course there's an even higher chance that at the same time, games released will have graphics and physics that make Metro look more like Super Mario (okay, maybe that's going a bit overboard, but you get the point).

It's not about personal preferences, but about physiology of the human eyes.

Higher PPI is only beneficial at closer distances, hence why mobile devices benefit from it. The farther you get from the screen (PC usually ~80cm, TV 3 or more meters), however, the less difference you'll notice, and the only thing that increases is the energy needed to drive the hardware.

Needed resolution has nothing to do with distance (at least not directly), and everything to do with what field of view the screen covers in your vision, and that field of view is huge for a 30-inch monitor in regular settings. We can see details down to 8-10 arc-seconds in some circumstances, even down to 0.5 arc-seconds in extreme cases. We need well above 1080p for that, even above 4K. Even 0.5 to 1 arc-minute for pure pixel resolving in non-special cases needs more than 1080p, and that's with 20/20 vision, which large portions of the public exceed.
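A back-of-envelope sketch of the argument above: the horizontal pixel count needed so that one pixel subtends a given visual acuity across a given horizontal field of view. It assumes a flat screen and uniform acuity, and the 50-degree figure for a 30-inch monitor at typical desk distance is my own estimate, not from the comment:

```python
def pixels_needed(fov_deg, acuity_arcmin=1.0):
    # One degree is 60 arcminutes; one pixel per acuity-sized step.
    return fov_deg * 60 / acuity_arcmin

# A 30" 16:9 monitor (~26" wide) at ~28" spans roughly 50 degrees.
print(pixels_needed(50))        # 3000.0 -> already beyond 1920
print(pixels_needed(50, 0.5))   # 6000.0 -> beyond even 4K's 3840
```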

Adding to ATWindsor's comment: as we move to larger and larger displays, either as a "TV set" in the HEC or a computer display on our desk, the "Retina" effect will go away.

Right now a 50" 1080p "TV" and 2560x1440 27" display may place the viewer far enough away that they can't see the pixels… but display will continue to get larger.

For those aforementioned resolutions and display sizes, it's 6.5 and 2.6 feet to get the "Retina" effect with 20/20 (6/6) vision. That sounds to me like we are pretty much on the cusp of this, especially with the HEC display. You can buy a 65" 1080p set for under $1500, and yet those doing so to replace a smaller 1080p HDTV might not be getting a better overall experience: you have to sit at least 8.45 feet away to get the effect.
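The quoted distances check out with the usual "Retina" criterion: the range at which one pixel subtends one arcminute for 20/20 vision. A small sketch, assuming 16:9 panels and square pixels:

```python
import math

def retina_distance_ft(diagonal_in, horizontal_px, aspect=16 / 9):
    # Panel width from the diagonal, then pixel pitch, then the
    # distance at which that pitch subtends one arcminute.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / horizontal_px
    return pixel_pitch_in / math.tan(math.radians(1 / 60)) / 12

print(round(retina_distance_ft(50, 1920), 1))  # 50" 1080p  -> ~6.5 ft
print(round(retina_distance_ft(27, 2560), 1))  # 27" 1440p  -> ~2.6 ft
print(round(retina_distance_ft(65, 1920), 2))  # 65" 1080p  -> ~8.45 ft
```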

I'm glad that display technology, GPUs, and even codecs (H.265) are all lining up nicely to tackle this issue.

jrs77, I regret to say you're wrong wrt human eye visual fidelity. Humans can resolve far greater detail than the examples you give as being supposedly the most required for anyone, though of course it varies between individuals. However, just because one person can't tell the difference, I can guarantee others could.

Rough approximation of human vision is about 8000 pixels across at near distance, 115K pixels across at the far horizon. We can also resolve a much greater number of shades of each colour than are currently shown on today's screens (which is why SGI's old high-end systems supported 12 bits/channel as far back as the early 1990s, because it was regarded as essential for high-fidelity visual simulation, especially for night-time scenarios, sunrise/sunset, etc.)

Indeed, some TV companies have suggested the industry should skip 4K technology completely and just jump straight to 8K once the tech is ready, though it seems plenty are willing to hop on the 4K bandwagon as soon as possible whatever happens.

NB: Caveat to the above: how the eye resolves detail (including brightness, contrast, movement, etc.) is of course not uniform across one's field of vision. Imagine a future GPU with eye tracking which in real time can adjust the detail across the screen based on where one is looking, reducing the load in parts of the screen that are feeding our peripheral vision, focusing on movement & contrast in those areas instead of colour and pixel density. That would more closely match how our eyes work, reducing the GPU power required to render each frame. Some ways off though, I expect. Even without eye tracking (which is already available in other products), one could by default focus the most detail in the central portion of the display.

If you search for the famous "Viewing Distance When Resolution Becomes Noticeable" chart, you will see that at a 30-31" screen size, you have to be sitting about 4 feet back for 4K not to matter vs 1080p for someone with good vision. The full benefit of 4K on a 30" screen is visible at approximately 2 feet, which is pretty much exactly how far back I sit from the screen, so I think a 30" 4K panel would be just about perfect.

That's what I found out too. I could use a 30"/4K monitor, but my 60"/1080p TV is fine for my couch distance. Now if they sold 100-120" 4K TVs or 4K projectors priced for mortals it would be different; otherwise I'd have to get a lot closer.

In the end it'll be worth it, but res is only one part of the package for a good gaming monitor - you need low input lag, 120hz refresh, good colours, etc. Are there any 4K monitors with 60hz refresh even, let alone 120hz (most are 30hz right now)?

So right now you have to spend a fortune on something that ticks the resolution box lots of times, but has a lot of x's elsewhere.

You'd be best with a 3x1080p @ 120Hz surround setup with LightBoost for fast-paced games.

I can see pixels on a 27" 2560x1440 monitor at around a metre. Full stops are still made up of just six pixels at my preferred text size and hence look blocky.

Those rules of thumb are silly. Why would we have movie theaters that clearly break those rules unless people enjoyed massive screens?

Next, these tests are at max settings, including max AA. Dual 770s can average 44fps in Sleeping Dogs at 7680x1440 (33% more pixels than 4K) with 2xAA and everything else maxed. The GPUs + CPU are rated for around 550W. So yes, 1kW for 4K is stupid.
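A quick arithmetic check on the resolution comparison above (my own working for the two modes, not taken from the article):

```python
surround = 7680 * 1440   # triple 2560x1440 in landscape surround
uhd_4k = 3840 * 2160     # single 4K UHD panel

print(surround, uhd_4k)                 # 11059200 8294400
print(round(surround / uhd_4k - 1, 2))  # 0.33 -> ~33% more pixels
```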

Lastly, if you want a 35W CPU and 75W GPU, try a laptop. Intel has been selling 35W quad cores that turbo to around 3GHz for the last three generations. ATI and Nvidia both have compelling products that will easily push 1080p at max if you go easy on the AA. Best part is you get an extra screen, a UPS and a small form factor.

Of course you think 1080p is fine sitting 3m from a 40" screen. YOU CAN'T EVEN SEE YOUR TV!!! :P I've got less than 10' between me and a 65", and I'm installing a 120" drop-down screen in front of that. I have a 30" on my desktop and would go bigger if I could. 4K is not BS, but you need 4K content, discs that can store it uncompressed, players and screens. No upsampling etc.

The delta for true 4K is almost as big as DVD to Blu-ray, or cable to DVD.

Meh... 1080p is for people who sit far away from their screen and/or for people with lousy eyesight. I sit about 3 feet away from my 27" 1440p monitor and I can see the pixels quite easily. I'd love a higher resolution screen!

For TVs, it's pointless because there's no 4K content for a TV and there won't be for a long time. But for a monitor attached to a high-end PC, it's great!

Well, if you insist on using a small 22-24" monitor, then I would have to agree with you that 4K would be overkill. But nobody is going to buy a 4K 22" monitor for their computer (though in time I expect 4K 10" tablets as an extension of 5" 1080p phones). We are going to be buying 4K monitors for our computers in the 35-45" range, and still sitting just as close to the monitor as we do currently. At those sizes and distances a 45" 4K monitor is going to have almost exactly the same pixel density as your 22" 1080p screen. But the 45" screen will be huge and immersive, while your 22" screen is on the small side even by today's standards.

I am currently staring at a 45" cardboard cutout which is sitting right behind my 28" monitor and it fits my field of vision quite nicely. It is big, and I am probably going to get a tan just from turning it on, but someday in the next few years that cardboard cutout will be a 4K monitor, and I am going to be a very happy nerd.

For the living room 4K is going to be huge. As you mentioned, 3m distance equals a 45-50" 1080p TV. 4K has a similar rule, and you just double the size of the TV. At 3m you would technically want a 90-100" TV. The pixel density is the same, but the TV fills more of your vision out of sheer size. 90" is very large... but it is not so large that it is not going to fit in a house (though transporting it there may be a trick).

But when you start talking about 8K, the size doubles again, meaning that the optimal size for an 8K set at 3m would be 180"... which is enormous! That is 9 feet away, but with a diagonal size of 15 feet! We are talking about a 7.5-foot-tall screen that is 13 feet wide! That would not even fit through my front door, and the screen would be as tall as the walls in my home before you even add height for a stand and bezel. So when you start talking about 8K not being practical, then I will believe you, because it plainly isn't practical. I can even say with some certainty that I will probably never own a TV larger than 140" even if it were affordable, simply due to size constraints in my home. I may at some point own an 8K TV or monitor, but I am under no illusion that I am going to see any great improvement between 4K and 8K for screen sizes that fit my field of vision. But if it becomes standard and affordable, you are not going to hear me bellyaching about how "4K is good enough, and 8K brings nothing to the table but pain and misery". Instead I will get my eyes augmented so that I can appreciate the glory of 16K screens...
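The 180" figures above hold up; a quick sanity check, assuming a 16:9 diagonal:

```python
import math

def width_height_ft(diagonal_in, aspect=16 / 9):
    # Width from the diagonal via the aspect ratio, then height.
    width = diagonal_in * aspect / math.hypot(aspect, 1)
    return width / 12, width / aspect / 12

w, h = width_height_ft(180)
print(round(w, 1), round(h, 1))  # ~13.1 ft wide, ~7.4 ft tall
```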

Lastly, for the game results, keep in mind that these games were played at max settings, including max AA/AF turned way up, and DiRT was already playable with a single high-end GPU. At these resolutions AA and AF are essentially not needed (or maybe at a 2x setting?). This is not going to make these games all of a sudden playable on my GTX 570... but a GTX 1070 may be able to play more intense games at these resolutions with low AA at decent settings without requiring me to take out a 2nd mortgage.

Or put another way: 10 years ago we were playing on the PS2, which could not even play full-res standard-def games at 30fps. GTA Vice City and Tony Hawk's Underground were cutting-edge games of the time, and they look absolutely terrible by today's standards! Back then nobody was imagining us playing games at 1080p at 120fps in 3D with passable physics and realistic textures while being on the verge of realistic lighting... But today you can do all of that, and while it requires decent hardware, it does not require a 4-Titan setup to achieve.

Point is that we are still a year and a half away from general availability and a wide selection of 4K screens, and another two years after that before the price hits a point where they start selling in real volume and a decent amount of 4K content becomes available. That puts us 3.5 years in the future, which will be right on track for high-end setups to be playing maxed-out settings at nearly 60fps on these screens. Another two years after that (five years from now) and mainstream cards will be able to manage these resolutions just fine. After that it is all gravy.

Rome was not built in a day, and moving to a new resolution standard does not happen overnight. If you still like your 1080p screen five years from now, then buy a next-gen console when they come out and enjoy it! They will not be playing 4K games for another nine years. But the PC will be playing at 4K resolution in 3-5 years, and we will pay extra to do it, but we will enjoy it. If nothing else, hitting the 4K 'maximum' will finally put an end to the chasing of graphics at the cost of all else, and we will start to see a real focus on storytelling, plot, and acting.

The ASUS 4K60 monitor will not work with Nvidia cards for the time being. The monitor is being driven by two rx chips via two hdmi cables or a single DisplayPort cable using MST. This means GPUs must support a 2x1 monitor setup to drive the ASUS 4K60. While Eyefinity supports 2x1, Nvidia Surround only supports 1x1 and 3x1 making this monitor useless for gaming.

Nvidia has indicated that they will be including such support in R325+, but it still has not appeared in the R326 drivers that are available.

Weird. I didn't think it'd have such a handicap but looked it up in the manual. And yes, the Asus display will appear as two 1920 x 2160 displays using DP 1.2 with a 60 Hz refresh rate. This could allow for some funky things like independent rotation, resolution and scaling in windows even though both logical monitors are part of the same physical display.

It does appear as one 3840 x 2160 display when the refresh rate is set to 30 Hz over DP 1.2.

First time AnandTech tests at 4K; you should frame the article. The good thing about GPUs is that 20nm is about to arrive, then the pseudo 16/14nm soon after, and if they start to put the RAM on a silicon interposer they gain a lot of memory bandwidth too. Also, the 39-inch Seiki is not just announced; it can be pre-ordered at Sears, and they list the release date as 08/05/13 (fingers crossed for a review, and maybe them commenting more on where they are on using 2x HDMI inputs).

Yeah, of course, even if the GPUs most people have won't be enough to push 60FPS. But it remains to be seen if 4K monitors will be the preferred gaming display. I just want a bigger screen with decent DPI and price for everything, and 4K will push prices down, but for gaming I do wonder if the Oculus Rift and other similar products won't rule the market soon. Even for non-gaming, glasses could take over; they can be a few times cheaper than big screens.

Having seen a Seiki 4K30 display, I definitely agree that 4K30 isn't going to cut it. It looked awful. For now I'll be waiting for a reasonably priced 55-60" 4K60 (or higher) screen before I replace my existing 1080P display.

Ian, just a thought: did you use the default SLI mode when testing 3/4-way Titan? When testing 2/3-way 580s, I found the default was in some cases nowhere near as fast as selecting some other mode manually, e.g. AFR2 boosted one of the 3DMark13 tests by 35% (though at the same time it dropped another by 10%).

"The good thing about GPUs is that 20nm is about to arrive,then the pseudo 16/14nm soon after and if they start to put the RAM on a silicon interposer they gain a lot of memory bw too."

You don't really want an old silicon interposer for the next generations though; you really need the current and later Wide I/O (2) with its generic 512-bit bus in a 4x128 configuration and terabits/s of interconnect throughput, especially for the real 4K+/8K stuff in the pipeline now...

Ask yourself this: how long is the real development timescale for one generation of CPU and GFX chip, given the expected real 8K over-the-air retail devices by 2020ish...

That's only two GFX generations away, maybe three at best... time's running out if they don't already have silicon in the lab right now capable of acceptable 8K throughput plus many audio channels (was it 128 channels or some such, I forget now).

Good to know my two overclocked 780s are ready for this... I don't need AA at such a high resolution, and I could always add a third 780 down the road. Now I just hope someone puts out a bare-bones 4K panel with DisplayPort for $1K or so.

Use 3x X-Star or QNIX monitors with VESA stands, mount them in portrait, and bingo: 4320x2560 @ 100-135Hz for around $1K.

If you want to minimise bezels, remove the panel and electronics from the case and build a new wooden one for them. The bezel is about twice as thick as the metal edge around the panel, so you can get them a fair bit closer.

Metro 2033 is an awfully programmed game and I really don't know why it is used in benchmarks. The developers admitted themselves that the engine is horribly optimized. I'd like to see something like Battlefield 3 or Tomb Raider or Skyrim + mods.

Games that have separate interfaces to run benchmarks are a godsend to testers. Normally while I have a benchmark running I can continue writing another article or set up the next test system / install an OS. Metro2033 does this better than most, and is a strenuous enough benchmark.

Despite this, a lot of games and engines are not optimized. Plus it doesn't really matter if the engine is optimised for benchmarks - users who play the game are going to experience the same results as we do with the same settings. So an argument that 'this benchmark is not optimised' is quite a large non-sequitur in the grand scheme of things - the games are what they are and it's not for reviewers to optimise them. If I had had more time and preparation, I would have perhaps included Tomb Raider / Bioshock Infinite in there as well.

Well, since that game hasn't been finished yet, any performance tests would probably lose meaning over time, as they optimize the engine and whatnot. Maybe they'll include it after the game gets released.

I have to wonder how much improvement might be made with better engine programming. I realize that a lot of work has already gone into making game engines very efficient, but with the growth of GPGPU computing over the last few years there is a lot more work being done on GPU computing in general. Perhaps some creative programming may get us playable framerates at 4k sooner than we expect.Reply

I asked a friend of mine at a movie studio about this last year. I hope he won't mind my quoting his very informative reply...

"The differences between "HD" and "2K" are not just a matter of spatial dimensions. Although modern (i.e. digital) HD is always either 1920x1080 or 1280x720, there are a multitude of both frame and field rates - in fact over a dozen for each format (although the most widely used presently are 50Hz and 59.94Hz interlaced). The colour space for HD video is invariably 10-bit per component; linear.

By contrast, 2K is generally regarded as a nebulous resolution - it can mean 2048x1556, 2048x1536, 2048x3072, 1828x1556, 1828x1332, or at least six additional dimensions... The reason for such confusion is that when Kodak released their Cineon film scanner / workstation / printer system back in 1992 (and thus single-handedly invented both the concept and technology of the digital film intermediate), these figures represented the quarter resolutions of various 16mm / 35mm / 65mm film formats.

For example, when scanning a full-aperture 4-perforation 35mm film frame at 6µm (i.e. the noise level, at which differences are indistinguishable to the human eye), the 24.892mm x 18.669mm frame is sampled into a 4096x3112 pixel image (4K); a quarter of which is 2048x1556. Additionally, in scanned film data the colour space is almost always 10-bit per component logarithmic (roughly equivalent to 14-bit linear). Interestingly, new Kodak Vision3 film stocks are capable of recording two additional stops (in the shoulder section of the sensitometric curve), which require 16-bit per component linear scans to record highlights without clipping!

Have you launched Cineon on any of your IRIX hosts? Kodak were so clever they practically spawned an entire industry (it was years ahead of its time - hence the quarter resolutions)..."
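The quarter-resolution arithmetic in the quote is easy to verify; here's a quick sketch (frame dimensions and scan resolution taken from the quote above, variable names mine):

```python
# Check the film-scan arithmetic quoted above: a full-aperture 35mm frame
# (24.892mm x 18.669mm) sampled into a 4096x3112 "4K" image implies a
# ~6 micron sampling pitch, and "2K" is the quarter resolution
# (half in each dimension).

FRAME_W_MM = 24.892
SCAN_W, SCAN_H = 4096, 3112          # "4K" full-resolution scan

pitch_um = FRAME_W_MM * 1000 / SCAN_W  # sampling pitch in microns
quarter = (SCAN_W // 2, SCAN_H // 2)   # quarter-resolution "2K"

print(round(pitch_um, 2))  # 6.08 (microns)
print(quarter)             # (2048, 1556)
```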

Pure insanity. I feel that gaming at 1920x1080 or 2560x1440 should be the norm for the foreseeable future. I probably won't be able to be bothered to even think about gaming beyond that, and I'm not much of a fan of 3D gaming or multi-monitor gaming. Reply

It's a shame that the next-gen consoles are coming into the world at the same time as 4K. This means that consumers will have to wait 5-10 years before they can buy a console with enough power to run 4K games. Reply

Timing's not that bad really, it'll probably take about that long for these newer displays to come down in price and become the norm... I doubt pushing a 4k-capable console in two year's time would really accelerate that process, but I'm not really into consoles these days so maybe I'm underestimating their impact. Hopefully PC GPUs move at a much faster pace, am I the only one that's thinking multiple 4k displays would make a badass setup? :p C'mon now, there's already a lot of people running 3x1440... I'd be one if I had an unlimited budget and space, 3x1200 will have to do for now!Reply

I think it's important to note the various typos in this article. This is not 4K resolution, this is UHD resolution. For an analogy, this is like calling 1080p '2K'. 4K is 4096 x 2160. UHD is 3840 x 2160. I find it annoying this is still an issue people keep making.Reply

Just imagine people calling 1920 x 1200 "1080p" in an article similar to this one. It seems people are either ignorant to what I'm talking about or they just don't care enough to make the differentiation between these two as they do others. Seems a little stupid, especially when talking about this on a technology forum where things are very specific to begin with.Reply
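For anyone curious, the actual pixel-count gap between the two standards is small; a quick back-of-the-envelope check (labels mine):

```python
# Pixel counts for DCI 4K (4096x2160) vs. consumer UHD (3840x2160).
dci_4k = 4096 * 2160  # cinema "4K"
uhd = 3840 * 2160     # consumer "UHD"

print(dci_4k)                      # 8847360
print(uhd)                         # 8294400
print(round(dci_4k / uhd - 1, 4))  # 0.0667 -> DCI 4K is ~6.7% wider
```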

Yeah I noticed that too. I'm guessing the GK104s are becoming severely RAM or bandwidth limited at those high resolutions. Even in older benchmarks the Tahitis tended to claw back some ground, showing off their 384-bit bus to good effect at the highest resolutions (2560x1440/1600), so I guess with UHD the effect is even more pronounced.

Kinda sad that the cheapest 384-bit part Nvidia has now starts at $650.Reply

I'd love to see a future AnandTech investigation into the practical implications of 4K for the average enthusiast. 4 Titans are likely out of reach for most, but are there ways that 4K can make life better on realistic budgets? As many have suggested, lower amounts of *AA are an easy start; if necessary could gamers even let the display upscale from 1080p while still taking advantage of the much higher resolution for non-gaming tasks?Reply

Ok, so practically speaking, current-generation GPU hardware is not quite ready for 4K... Nice! So there is a really good reason to push GPU technology forward. 4K screens will be near $1500 in smaller sizes in 3 to 4 years, so Nvidia and AMD have that much time to make it happen. Until then, 3- and 4-card combinations are the solution for those who can really afford them. The need for GPU upgrades has been stagnant for such a long time that this is actually refreshing! I would take a 4K screen now if I could afford it: run all desktop applications in 4K mode and games in 1080p until I saw enough GPU power for it with a single card or a two-card combination.Reply

You can't be serious. I can't even use my 2560x1440 monitor without large screen fonts, which look awful in Windows. I have 20/20 vision too, so it has nothing to do with eyesight. The fact that Android scales so beautifully is a huge advantage. For the bulk of North Americans, the smallest 4K monitor with comfortable font sizes without scaling the UI is over 70 inches. Eyesight is NOT improving. Reply
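To put numbers on the density complaint: pixel density is just diagonal pixels over diagonal inches. A rough sketch (the `ppi` helper is mine, and the panel sizes are illustrative):

```python
# Pixel density (PPI) for a panel of given resolution and diagonal size.
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27), 1))  # 108.8 - a 27" 1440p monitor
print(round(ppi(3840, 2160, 24), 1))  # 183.6 - 24" UHD: tiny fonts without scaling
print(round(ppi(3840, 2160, 55), 1))  # 80.1  - UHD needs a big panel for relaxed density
```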

What I want to know is, since this is as near as makes no difference to 4x27" 1080 monitors, do I get the same effect of viewing 4 times the content, or do I just get 4 times the detail?

Let's say in your game with the 27" the aspect ratio lets you see a 5 foot radius (hor) in front of you (It's an example), with this will I see a 10 foot radius, or will I see the same 5 feet just bigger.

did anyone notice the perfect scaling of Sleeping Dogs with the Titan and the 7950? not sure why the 680 performed so completely and utterly horrendously. Ian: any insight?

Titan: 13.63 -> 27.4 -> 41.07 -> 57.78 fps (scaling: 1 -> 2.01 -> 3.01 -> 4.24)

7950: 11.1 -> n/a -> 34.58 fps (scaling: 1 -> n/a -> 3.12)

680: 6.25 -> 8.55 fps (scaling: 1 -> 1.37) (ouch?)
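those scaling figures are just each multi-card FPS divided by the single-card FPS; a quick sketch using the numbers above (missing data points marked `None`, helper name mine):

```python
# Multi-GPU scaling = FPS with n cards / FPS with one card.
def scaling(fps):
    base = fps[0]
    return [round(f / base, 2) if f is not None else None for f in fps]

titan = [13.63, 27.4, 41.07, 57.78]  # 1-4 cards, Sleeping Dogs at 4K
hd7950 = [11.1, None, 34.58]         # middle data point missing
gtx680 = [6.25, 8.55]

print(scaling(titan))   # [1.0, 2.01, 3.01, 4.24]
print(scaling(hd7950))  # [1.0, None, 3.12]
print(scaling(gtx680))  # [1.0, 1.37]
```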

what allows the Titan and the 7950 to scale so perfectly (other than the obvious: truly GPU-limited graphics vs resolution limiting)? and why does the 680 suck so badly with this title and scale so poorly (i'm thinking the driver simply isn't optimized)? do you think this is indicative of 680 scaling in general? (this affects me personally because i'm looking to get a 2nd 690 specifically for 4k/UHD gaming!!)

maybe 4k (UHD for those who care) can shed some light on architectural differences not only between AMD and nVidia, but generationally between makers as well. i look forward to any insight you can offer (now, and down the road)

note: the new 55/65/73 in. Sony Wega line is advertised as both 4k and UHD - for those who are offended by this lack of distinction, blame the manufacturers :) for those who are interested, the costs are $5k, $7k, and a whopping $25k for the 73 (ouch!)Reply

The numbers provide ZERO evidence of a VRAM limitation at 4K resolutions. In Metro, the 6GB Titan is much slower than the 2x3GB 7990. In Dirt, the 6GB Titan is slower than the 2x2GB 690. In Sleeping Dogs, the 6GB Titan has less than half the performance of the 2x3GB 7990, or a third of the performance of 3x3GB 7950s. As tested before, it's only at 3x1440p/3x1600p resolutions that you start to see some VRAM limitation in a few games. It's amazing how confirmation bias can work the human mind and distort reality: there's not a single number in the article that could remotely speak of a VRAM limitation, but we have dozens of comments saying such. As xkcd once joked, "dear god, I would like to file a bug report."Reply

Someone needs to beat the HDMI 2.0 group with their own cables until they release... I'm pretty sure they promised Q2-Q3 2013. It's crazy we can't get 4K60 right now (or more than 8-bit color) because a bunch of EEs can't get it together and roll out a long-overdue cable spec. (And yeah, I know it's the content providers hassling them about security, but it's definitely time to tell them to STFU.)Reply
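For context on why HDMI 1.4 can't carry it: the raw pixel data rate for UHD at 60Hz with 24-bit color already exceeds HDMI 1.4's roughly 10.2 Gbit/s TMDS ceiling (HDMI 2.0 raises it to 18 Gbit/s). A rough back-of-the-envelope, ignoring blanking intervals and encoding overhead:

```python
# Raw data rate for UHD (3840x2160) at 60Hz, 24 bits per pixel.
W, H, FPS, BPP = 3840, 2160, 60, 24

gbps = W * H * FPS * BPP / 1e9  # gigabits per second, payload only
print(round(gbps, 1))           # 11.9 - already above HDMI 1.4's ~10.2 Gbit/s
```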

Why do we let them call this 4K? We've always called our monitors by vertical resolution. 720p, 1080p. 2560x1600 is not known as 2K or 2.5K. Now we have a 2160p monitor and they try to call it 4K. Marketing shouldn't win. We need to fight back!

You guys apparently forgot that the Oculus will make the resolution race obsolete, if you ask me - you can just move your head to view more desktop real estate, and the price of the panels will closely track the price of cellphone and small form-factor tablet displays.

I predict that precisely because of the graphics horsepower required to drive 8 million pixels - 4 Titans! - something like the Oculus will dominate this market, at least until the power of a Titan can be had for ~$300... which, looking at the history of performance improvement per generation, could be far longer than the 2-3 years hypothesized by the author.Reply

/em counts TVs on Best Buy's site and notices 4K is the vast majority of what's buyable.

Only off by a little under a decade /shrug

Yeah, game developers need to take it easy. Spent $10k building my quad-Titan beast and it's already a turd because I chose to go surround (roughly 5k reso). I figure the benchmarks at 4K are close to what I get at 5k due to my overclocks.

Ironically, I've tried none of the aforementioned games due to a predisposition for MMORPGs.