78 Comments

If I wanted huge screen real estate, I'd definitely go for a 1080p projector that can do anywhere from 100" to 20'. Of course, a top-of-the-line one would cost upwards of $10,000, but a really nice one would only be a bit over $1,000. Give me this over the "jail bars" of bezels anytime!

I'm a bit puzzled as to why ATI is doing a 2GB version to counter the GTX 480, and not a slightly faster version. Right now is AMD/ATI's real chance to seize the bull by the horns with a death grip. By all means they should release a 950-1000MHz version of the 5870, named the 5890! Even if the power consumption were 25-50W more, it would still be considerably lower than the GTX 480's, while actually pwning it in nearly all game benchmarks. Even better would be to release a 512-bit version, just like they did 4 generations ago with the HD2900XT. With up to 100% greater memory bandwidth, there would be roughly 20% more performance at a 1000MHz core clock across all benchmarks, if not more.

I say this with mercy.. if AMD does not truly seize the moment with a death grip by the horns, AMD will regret it for a long time, if not forever.

Why not go with 3 cheaper projectors and use them with Eyefinity? One of the oft-neglected advantages of Eyefinity is that a properly supported game can actually give a player an FOV advantage - they can see more of the game world than other players without distorting the image.

This was never a counter to the GTX 480; the E6 edition card had been planned long before we knew anything concrete about Fermi. And considering the benches, it's quite obvious that 2GB is not needed for today's games. If ATI were going to introduce a counter to Fermi it would simply be a higher-clocked 5870, but even that's not necessary save for bragging rights.

And a 512-bit memory interface is the last thing I'd expect. It's actually bizarre that you bring up the HD 2900XT as if it were something ATI should look back on for inspiration. If anything, the HD 2900XT was ATI's own GTX 480 debacle.

I hadn't thought about it that way, but the 2900XT situation was very similar to nVidia's GTX 480 situation now. Like you said, definitely something ATI doesn't need to look back on for inspiration. That's why (I believe) ATI switched to GDDR5 as quickly as possible, to get as much throughput as possible through that 256-bit memory interface.

On the other hand though, I have a 2600XT with GDDR3 that makes a perfectly satisfying backup card. It definitely wouldn't have enough power to drive 6 displays though.

Also, what's up with AnandTech? I don't check back for two days, and the site disappears, only to be replaced by this sexy tech website. ;)

Here's a thought: get a theater room with 6 hi-def projectors, and set them up in the Eyefinity 6 configuration. If you spent a little bit of time with it, you would be able to perfectly line up the edges of the projections from each projector, and you'd then have the Eyefinity 6 setup without the need for bezel correction (no bezels!), and therefore no crosshair problem. The only problem would be the cost....

I think that the other problem would be space. If a 1080p projector can comfortably drive a 100" screen, finding a large enough wall to put 3x2 surfaces on would become problematic, I'd think. I don't know too many people who have a 21'-wide by 8'-tall room onto which they could reasonably project...

Plus the screen for that would be ... pricey.

However, some cheaper 720p projectors would be an interesting proposition, particularly projecting on a smaller wall - maybe 1/2 the size? So about 11' wide by 4' tall?

True.. 3 cheaper projectors with Eyefinity would be an ideal solution.. and the screen could be a bit curved, like at many movie theaters today!

On the same day Nvidia released GTX 480, AMD released this 2GB version to counter Nvidia's offering. Of course, AMD promised this 2GB version a long while ago, so it's about time. Perhaps it won't be long before AMD releases the faster 5890.

About the 512-bit bus: it is certainly doable on a 40nm process, compared to when it was done on an 80nm process with a 1024-bit ringbus a while ago on the HD2900XT (I will agree with you here that it was overkill for the 2900XT)..

____""Does a 512-bit bus require a die size that's going to be in the neighbourhood (or bigger) of R600 going forward?"

No, through multiple layers of pads, or through distributed pads or even through stacked dies, large memory bit widths are certainly possible. Certainly a certain size and a minimum number of consumers is required to enable this technology, but it's not required to have a large die."-(Sir Eric Demers, architecture lead on R600 which is the still the basis of 5870's today)http://www.beyond3d.com/content/interviews/39/5

If a 4890 simply performs around 19% better overall than a 5770 in all games except when using DX11, what shall we point to as the cause of the difference? The GPU cores are nearly identical in terms of clock speed, shaders, ROPs, etc., with perhaps slightly better optimization in the R800 architecture and better drivers. The main "obvious" difference is a 62.5% increase in memory bandwidth over the 5770. A 5870 is basically 2x 5770s in one GPU with everything doubled. It has been shown that a 5870 certainly does benefit from greater memory bandwidth.. let's say about a 0.2% increase in performance per 1% increase in bandwidth.
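As a sanity check on those numbers, here is the implied scaling worked out explicitly. All figures are the claims from this comment, not measured data:

```python
# Back-of-envelope check of the bandwidth-scaling claim.
# All input figures are taken from the comment itself, not measurements.

perf_gain = 0.19   # HD 4890 vs HD 5770 overall performance gain (claimed)
bw_gain = 0.625    # 62.5% more memory bandwidth (claimed)

# Implied sensitivity of performance to bandwidth for that pair of cards
sensitivity = perf_gain / bw_gain
print(f"implied sensitivity: ~{sensitivity:.2f} (% perf per % bandwidth)")

# Applying the comment's 0.2 figure for the HD 5870 to a hypothetical
# doubling of bandwidth (256-bit -> 512-bit at the same memory clock):
est_gain = 0.2 * 1.00   # sensitivity * 100% bandwidth increase
print(f"estimated gain from doubled bandwidth: ~{est_gain:.0%}")
```

Note that the 4890/5770 pair actually implies a sensitivity of roughly 0.3, a bit higher than the 0.2 figure used for the 5870 estimate.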

By the way, Nvidia made quite an interesting statement on the memory bus a short while ago:

"With 3-D interconnects, it can vertically connect two much smaller die. Graphics performance depends in part on the bandwidth for uploading from a buffer to a DRAM. "If we could put the DRAM on top of the GPU, that would be wonderful," Chen said. "Instead of by-32 or by-64 bandwidth, we could increase the bandwidth to more than a thousand and load the buffer in one shot."

Based on any defect density model, yield is a strong function of die size for a complicated manufacturing process, Chen said. A larger die normally yields much worse than the combined yield of two die with each at one-half of the large die size. "Assuming a 3-D die stacking process can yield reasonably well, the net yield and the associated cost can be a significant advantage," he said. "This is particularly true in the case of hybrid integration of different chips such as DRAM and logic, which are manufactured by very different processes."" http://www.semiconductor.net/article/print/438968-...

Nvidia's own John Chen mentioned increasing the bandwidth from "by-32 or by-64" per chip to "more than a thousand". That translates to 8x1024, which is an 8192-bit bus. Hopefully vertically stacked dies are the future. They would effectively reduce the need for ever-larger buffer sizes, and act just like embedded RAM that can load the buffer in one shot.. a bit like SSDs today (small, but "instant"), and thought to be a pipe dream a few years ago.
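The arithmetic behind that 8192-bit figure is simple enough to sketch. The chip count of 8 is an assumption based on a typical 256-bit card, not something stated in the article:

```python
# Aggregate bus width = per-chip width x number of DRAM devices.
# Eight devices is an assumed, typical count for a 256-bit card.
chips = 8
conventional_bus = chips * 32    # "by-32" chips -> 256-bit bus
stacked_bus = chips * 1024       # "more than a thousand" per chip
print(conventional_bus, stacked_bus)  # 256 8192
```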

The 2900XT used a dual 512-bit ring bus topology. The fact that neither ATI nor Nvidia uses this technology today is a hint that it was not efficient enough, or too complex to be commercially viable. In that sense, it was not a classic 512-bit-wide bus, as used by Nvidia's previous generation, or the 256/384/448-bit buses in use today.

A 512-bit bus would be impossible to implement on the 5000 series simply because the memory controller is physically limited in hardware to "talk" to a 256-bit bus. You need twice the traces on the PCB to go from 256-bit to 512-bit, and those traces must, one way or the other, be physically linked to the GPU. The only way to speed up memory access on the 5000 series would be to use faster GDDR5 chips.

Unfortunately, a 1080p projector just won't get you the pixels that this thing will.

I use a 2x2 setup on my desk at work and it has far more pixels (at far FAR less price) than a 1080p projector has (which is what? 1920x1080? something like that? - I'm working on 2560x2048)

My question would be whether you can set these up as individual monitors just extending the desktop, or if you HAVE to use Eyefinity? I'd love to be able to do this instead of running dual cards, with the limitations on the motherboard that brings. . .

I agree that they should release a higher clocked (binned) version of the HD 5870, if only to steal NVIDIA's thunder. They wouldn't need mass availability. Even just a few hundred, or ideally 10,000+ units would be enough to dethrone NVIDIA from being able to claim "the fastest single-GPU card". And I think such claims form the bulk of what NVIDIA has to work with right now.

A 512-bit version would require a redesign of the chip, though, which would require a lot of manpower including design verification, etc. I don't think it would be worth it for ATI/AMD. Again, releasing a higher-clocked part - now that would be super easy and super effective.

A redesigned 512-bit memory interface card wouldn't come much earlier (if at all) than the next generation. Also, it would use a lot of design/test/silicon resources and time (financially and in manpower) for what would be a couple thousand cards sold (when AMD cannot produce enough graphics chips as it is). Keep up availability and low prices instead of chasing the absolute top. NVidia will be in the "enviable" position of having the top-performance card which nobody can find, and nothing else in the performance and mainstream segments.

OK, it might not be ideal for gaming right now, but I could see ATI selling heaps of these cards for commercial purposes. They should be good for security people, the finance sector, research and education, advertising displays, etc. - the list goes on.

OMG i just had a thought of connecting 6 HD TV's, you could make your own billboard lol.

I suppose it is not possible, but would it be possible to crossfire a normal 5870 and a 5870 E6?

If AMD can enable that, then I think they will sell quite a lot more: people who bought a single 5870 and use Eyefinity might want an even more immersive experience, and if they could add a 5870 E6 and crossfire it with their normal 5870, they might be a lot more tempted to buy one, even if they lose a frame or two (due to 1GB+2GB vs 2x2GB, the driver would probably need to treat both cards as 1GB models) compared with two 5870 E6s in crossfire (still a lot more performance than a single 5870 E6).

Just wondering if you guys have tried putting 3 projectors in portrait mode and seeing how that worked. Figured 3 1280x720 projectors would make a pretty sweet wall of gaming...then use the other 3 display ports for your actual desktop monitors. Anything in the works for that? Would be a fun little project to put in Anand's theater room. :)

1) Wall space. My theater has a 2.35:1 screen, I'd need something much wider (or end up with a really skinny display) for a 3x1 projector setup. I don't think I even have a room that has enough uninterrupted wall space for this to work well at a good size. Perhaps I'm thinking too big though. I could just stitch together three 80" screens or something like that.

2) Inputs. Most 16x9 projectors don't use DisplayPort, although a quick Google search reveals a few options.

How about 3 projectors, 3 screens stitched together, and just hang them from the ceiling so you can create a curved screen? That's the beauty of using 3 projectors anyway. Figure a 5970 could drive a 2160x1280 curved screen perfectly.

This is just craziness. Dunno how someone couldn't just be happy with a single big 1080p TV. OK, you can see the pixels, so what? You can also see the entire image. I'd like to see a video showing a nice (60"?) set right next to this E6 display showing the same game or video and do a poll: "Which would you choose?"

This isn't for video - it's for things like, say, playing a warplane simulator and seeing actual planes in the distance rather than a black dot, or for seeing decent-quality text from several large sources (like several of the very large Excel spreadsheets some financial people use). FPS gaming still has issues; I'd say using 3 old 1600x1200 displays in portrait mode would be best for FPS (a 2.25:1 aspect ratio). Even with 5 very wide monitors in portrait, you'd end up with an almost 3:1 view ratio (which might be good or bad).

Does the extra memory make a difference in crossfire benches? I am curious because each frame buffer has to keep track of what the other frame buffer is doing, so having a larger frame buffer would make sense. Is there any chance we can see these results?

Why doesn't AMD go talk to display manufacturers about thinning out or even totally forgoing bezels on Eyefinity-compatible displays? In dual/multi-screen situations other than Eyefinity it can still be desirable to have really thin or no bezels, so it wouldn't be that far out.

I expected they would cost a bit more. Though I don't have any figures on the premium, I guess it would be worth it for the gains to be had in these special situations, where you'd be spending a small fortune anyway.

The bezels are there for a purpose (strength, if nothing else). There are monitors with thin bezels - what we might need now are pre-built monitors in a 6x configuration, reducing the bezel size as much as possible (they could do it better in the factory). Maybe some boutique industry could spring from this? Something like the tuning shops in the auto industry.

I think that this tech would be wasted on a Media Server implementation. Unless you're talking about something different than what I'm thinking of. Streaming media to these devices would be essentially pointless, as few, if any, media is available at any resolution beyond 1080p.

Putting it on a Mac makes even less sense, given that what makes this unique is the ability to run solid 3D games titles. And last I checked, there were few, if any, 3D games available on the Mac Platform.

While Apple does offer multiple graphics cards in their MacPro systems, they're generally very low-end graphics products (currently NVidia GeForce GT 120 based), meant to drive CAD or other non-3D gaming applications. Those can easily handle any Media server load you could throw at it. I suppose you could make the argument that it could upscale the video to 2160p (doubling 1080p), but that seems to be pointless to me - just run a larger 1080p projector.

The media I use to drive multiple displays normally requires anything from 640x480 up to 4K-type resolutions. Although these high resolutions are not mainstream, YouTube for instance does allow you to upload videos in 4K resolution. The future of HD+ video is very, very near. And nowadays OS X ships with something called Quartz Composer. This is something you can compare with Processing: it's OpenGL-based image synthesis. Truly amazing stuff: 4K+ resolutions rendered at 60Hz. Easily.

Did you have a chance to try (or ask the AMD guys) about 12 screens? Crossfiring 2 E6 cards makes you wonder about that chance.

I once had a chance to put up a 4x3 screen, 2 years ago, with absolutely no bezel whatsoever, but that setup cost my company an insane amount of money. Each screen cost $7,000, for starters.

I see this E6 as an alternative to keep an eye on. I couldn't care less about the bezel problems, as my company usually sets up multiscreen displays either with projectors or with bezelless LCD's, but 6 displays might not be enough for our line of work.

So, can it be crossfired to a 12-screen 3D-accelerated output? (I'm not concerned about performance, as our apps usually don't stress GPUs much.)

It can be done. They had it working under Linux using X-Plane back at their September launch. However it's not even close to being in a shipping state, and I don't have the foggiest idea when it would be.

Can you tell me how that works, in practice. Now that Windows 7 (and Vista) ditched horizontal spanning, I can't just set my company's apps to 3072x768 (3 screens), because that resolution is not even made available by the driver anymore.

From my previous experience, I can say that as long as I have the taskbar spanned across 3 screens, it is safe to assume that I will be able to accelerate 3D apps at the same resolution as the desktop, at least.

Eyefinity must be somewhat different. If Catalyst 10.3 is still compliant with WDDM 1.1, then I'm guessing our 3D engine must be 'approved', or at least able to acknowledge the availability of Eyefinity.

Back in the XP days, spanning across 2 screens was transparent - the 3D app didn't even know it was outputting to 2 or more screens - but with Eyefinity, compatibility must be achieved at a much lower level. Is that right?

I'm sorry to bother you, but I don't have access to a hands-on approach.

- Does the desktop look like XP in span mode? (I think I see a taskbar stretched across 3 screens in Anand's video.)
- When you run a 3D app (one that is NOT officially compatible), will 'awkward' resolutions (say 5760x2160) be available as a choice?

If you answer YES to these 2, then I'm saved - and I'm in trouble as well, as my company only supports Nvidia (and changing this standard will cost a lot; testing in Quality Assurance will be havoc). Time for a change, I guess. :)

Background info: Eyefinity is the trade name for what AMD calls Single Large Surface technology. SLS operates pretty much as how the name implies: the drivers provide a very large resolution option for applications to work with, and then AMD's hardware takes care of chopping up the image for multiple monitors. Providing the OS/software with a very large resolution is the fundamental aspect behind Eyefinity.

So to answer your questions: yes, 3D games see the large resolution. As for the desktop, I've never tried it (and Anand currently has all of our Eyefinity gear). To applications/games at least, this is completely transparent. Eyefinity support basically amounts to being able to handle the oddball aspect ratios and the higher resolutions, for the case where resolutions are hard-coded in.
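The arithmetic behind a Single Large Surface mode is straightforward; here is a minimal sketch, where the 3x2 grid of 1920x1080 panels is an example matching the E6 setups discussed above:

```python
# A Single Large Surface mode is just the grid dimensions multiplied by
# the per-panel resolution; games render to it as one big screen and the
# hardware splits the image across the monitors.
cols, rows = 3, 2                  # example Eyefinity 6 grid
panel_w, panel_h = 1920, 1080      # example per-panel resolution

surface = (cols * panel_w, rows * panel_h)
print(surface)  # (5760, 2160)
```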

I agree with that completely which is why I said it wasn't an issue in a 3x1 setup. However in the case of a 3x2 configuration you always notice the bezels in the center of your screen because they often occlude important information (e.g. dialog boxes, crosshairs, etc...).

Why is it that people believe this setup is meant just for gaming? What about people who could have six LCD screens monitoring all sorts of devices in the corporate world? What about those who use Photoshop and other multi-monitor software?

Oh I agree, and that's why I mentioned dialog boxes as being a problem. Honestly I think the biggest application for a 6-display setup right now is for more than just gaming. It's just a shame that you have to buy a $479 gaming card to enable it :)

Perhaps because such solutions have been available for non-gaming uses for VERY many years. You could grab Matrox graphics cards (still hanging around in the business world), or use two GXM products to get six displays on a standard ATI/nVidia graphics card.

There are also other solutions from other companies, or just the possibility of sticking three dual-head graphics cards in a system, which can be pretty cheap if you don't need powerful 3D performance.

In short, for productivity, this is old news; Eyefinity is really only of note for gaming, and it doesn't look very useful or practical for that either. Triple-head gaming has merit since it avoids many of the problems, and that can be done fairly easily (some graphics cards support three outputs, or you can use a GXM product).

The larger frame buffer did help raise minimum frame rates, but not enough to positively impact the average frame rates in our tests.

I thought there had been review comments before about the 5870 being memory limited in some tests. Does this mean that the added 1GB doesn't solve any of that without further hardware/software changes from ATI?

The 2GB cards - standard, not Eyefinity - are starting to be listed on manufacturers' sites. I was going to wait for them, but maybe it's not worth it after all? I'm not interested in Eyefinity, just CF for gaming at 1920 and above.

From Review "It's worth mentioning that these power numbers were obtained in a benchmark that showed no real advantage to the extra 1GB of frame buffer. It is possible that under a more memory intensive workload (say for example, driving 6 displays) the 5870 E6 would draw much more power than a hypothetical 6-display 1GB 5870."

I think a tester should run FurMark in 3-way or 6-way Eyefinity. I have a sinking feeling this would break the card. The tests prove that you need the CrossFire grunt of two 5870s to do any gaming. So the extra cost of this card, trying to do it all in one, is a failure.

I have a 4870 - a great card, but it runs hot and sucks up a lot of wattage at idle. Both the 5850 and 5870 do much better in heat and wattage, but at load the 5870 uses more wattage than my 4870 does. The 5870 Eyefinity 6 Edition card is NICE. My question is: will there be a 5850 Eyefinity 6 Edition? I don't care about price, just performance and wattage.

I still don't understand the point of this card by itself, as a single card(no CF).

It's too expensive and too gaming-oriented to be used in the workplace where, as someone else already mentioned, there have been cheaper and more effective solutions for multi-display setups for years. It's too weak to drive the 6 displays it's designed for in gaming. Crysis (I know it's not a great example of an optimized engine, but give me a break here), which is a 3-year-old game, isn't playable at < 25fps, and I can't imagine the next generation of games around the corner being more forgiving.

My point is, why build a card to drive 6 displays when you could have 2 cards that can drive 3 displays each and be more effective for gaming. I know this isn't currently possible, but that's my point, it should be, it's the next logical step.

Instead of having 2 cards in crossfire, where only one card has display output and the other just tags along as extra horsepower, why not use the cards in parallel, split the scene in two and use two framebuffers(one card with upper 3 screens and the other card with the lower 3 screens) and practically make crossfire redundant(or just use it for synchronizing the rendering).

This should be more efficient on so many levels. First, the obvious: half the screens => half the area to render => better performance. Second, if the scene is split in two, each card could load different textures, so less memory should be wasted than in crossfire mode where all cards need to load the same textures. I'm probably not taking the synchronization issues that could appear between them seriously enough, but they should be less obvious when they fall between distinct rows of displays, especially ones with bezels.

Anyway this idea with 2 cards with 3 screens each would have been beneficial to both ATI(sales of more cards) and to the gamers: Buy a card and three screens now, and maybe later if you can afford it buy another card and another three screens. Not to mention the fact that ATI has several distinct models of cards that support 3 displays. So they could have made possible 6 display setups even for lower budgets.

To keep a long story short(er), I believe ATI should have worked to make this possible in their driver and just scrapped this niche 6-display card idea from the start.

I have an idea for the monitor manufacturers (Samsung). Just bolt a magnifying glass to the front of the monitor that is the same width and height (bezel included). I vaguely remember some products similar to this for the Nintendo Gameboy & DS.

A magnifying glass for such a large surface would be thick and heavy (and probably prone to cracking), and "thin" variations have image artefacts (I've seen a magnifying "glass" usable as a bookmark, and the image was good, but it definitely had issues).

There are a fair number of comments to the effect of "Why did ATI/AMD build the Six? They could have spent their money better elsewhere..." To those who made those posts, I respectfully suggest that your thoughts are too near-term, that you look a bit further into the future.

The answers are:

(1) To showcase the technology. We wanted to make the point that the world is changing. Three displays wasn't enough to make that point, four was obvious but still not enough. Six was non-obvious and definitely made the point that the world is changing.

(2) To stimulate thinking about the future of gaming, all applications, how interfaces *will* change, how operating systems *will* change, and how computing itself is about to change and change dramatically. Think Holodeck folks. Seriously.

(3) We wanted a learning vehicle for ourselves as well as everyone else.

(4) And probably the biggest reason of all: BECAUSE WE THOUGHT IT WOULD BE FUN. Not just for ourselves, but for those souls who want to play around and experiment at the edges of the possible. You never know what you don't know, and finding that out is a lot of fun.

Almost every day I tell myself and anyone who'll listen: If you didn't have fun at work today, maybe it is time to do something else. Go have some fun folks.

Nice review. The real shame is the bezel; I hope display vendors will start making some extremely thin-bezel models for this kind of use. As for Battlefield, I saw you used a chase bench and a waterfall bench... are these sequences done by you, or in-game benchmarks you just have to run?

One thing I've been thinking about since the bezel problem: why doesn't anyone make a setup of 3x2 22" monitors in a single frame? I've seen DIY people take the frames off monitors to embed them in walls, custom frames, or computer chassis. It should be doable to take out the panels and mount them in a new frame with tape or glue or something on the backside. I would easily consider buying such a setup. You would end up with a monitor roughly around 50" (maybe 55"?) with 5040x2100 or 5760x2160.

For a 3-panel setup, 3 22" screens in portrait mode in a single frame would also be nice. 3150x1680 or 3240x1920.
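For what it's worth, the combined resolution and approximate diagonal of such a bezel-less stitch can be estimated from the panel specs. The function and the 22" panel figures below are illustrative assumptions, not measurements of any real product:

```python
import math

def stitched(cols, rows, w, h, diag_in):
    """Combined resolution and diagonal of a bezel-less grid of panels."""
    # Recover per-panel physical size from the diagonal and aspect ratio
    aspect = w / h
    panel_h_in = diag_in / math.sqrt(aspect**2 + 1)
    panel_w_in = panel_h_in * aspect
    total_w, total_h = cols * panel_w_in, rows * panel_h_in
    diag = math.sqrt(total_w**2 + total_h**2)
    return cols * w, rows * h, round(diag, 1)

print(stitched(3, 2, 1680, 1050, 22))  # 3x2 of 22" 1680x1050 panels
print(stitched(3, 2, 1920, 1080, 22))  # 3x2 of 22" 1080p panels
```

This gives 5040x2100 and 5760x2160 respectively, matching the comment above, though the resulting diagonal comes out closer to 60" than 50-55".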

How can anybody who is serious about image quality fall for this obvious sham? How can the black bars that separate the monitors be anything less than unacceptable? You have to be crazy to waste your money on this tech. 3D is way more appealing than this pseudo-high-res garbage. If you want real high resolution, you simply get a WQXGA monitor like the HP LP3065 I'm using right now and call it a day. If you want something actually interesting, then you get anything that might be 3D capable. It seems to be the next cool gadget feature in video.

The cost of projectors, a screen, and necessary features like lens shift would be so damn expensive - not to mention the heat generated by 3 or 6 LCD projectors would be ridiculous - just to avoid the "black bar" effect. I really don't understand where AMD/ATI is going with this tech.

Hell, I can't even get multiple displays to work properly with some of my 4850 crossfire setups and they come up with the idea to make a video card capable of up to six displays. How about fixing the Gray Screen of Death with multiple displays on the 4800 series? Eyefinity, yeah whatever.........

I have a 4850x2 driving 4 22" screens in a 4200x1680 config (all 4 in portrait mode). Running my 4 (or even another 2) screens from 1 GPU is much more interesting now... Have you tried running the new card in Crossfire just to see what the AA performance in games is? And I mean Crossfire with 58xx cards and Crossfire with 48xx cards, just to see the support/scalability and so on. If you're showing the performance of the new 480 in SLI, why not show the 5870 with 6 outputs in Crossfire with 1 5970 or even 2 5970s? Some people actually have the money and interest for this - not to mention you can buy the cards in 6-9 months and get them at half the price compared to today.

And by the way, regarding the monitor stands, AMD looks to be choosing a "budget" alternative when showing them off. My Ergotron LX Dual Side-by-Side Arm stands got me up and running in about 30 minutes from opening their boxes and clearing my desk, and I got my screens 99.9% perfectly aligned.

For someone wanting simply to set up an extreme-resolution display, the ideal route (setting cost aside) is using 6 1080p projectors - they don't project a bezel. Otherwise, go buy a 55" LED LCD, or wait until they have double-res (denser-pixel) displays for larger-scale monitors. The 30" is a good balance of size, immersion, price, setup, and resolution at much higher than standard high-def. This is bleeding edge, which means many will bleed money to get it right for the rest of us. This is simply not something you will see often. Hope the rambling came together as a thought.