Once man can harness the capability of rearranging matter (elements, protons, neutrons, etc.), this graphical hologram will be a thing of the past.

All that we'll need to do is provide existing matter, and the technology will be able to break it down and reconfigure it into another molecular structure, turning trash into food.

In coordination with that, the real holodeck will do the same thing, but instead of making food, it'll make people and inanimate objects. However, a computer program will track what it's made so that it can't "hurt" real humans.

That is the wave of the future, which may very well be possible with metaphysics.

I'm surprised there's no mention of HDR. That's what's required for images to be indistinguishable from reality. Without HDR, all the new game engines, megapixels, and multi-monitor setups won't get you there.

I think instead of focusing on millions upon millions of pixels that the eye will not be able to perceive, why not focus on widescreen, high-resolution glasses?
To see a single image (it doesn't even need to be 3D) through glasses, where rotating your head rotates the display too (while the controller, mouse, or keyboard controls the direction the in-game character faces).
Why spend precious resources on:
- First of all, pixels we're not able to perceive with the eye
- Second of all, when we focus to the right with our eyes, why render high-quality images on the left monitors (when we're not able to see them anyway)?

It makes more sense to go for glasses, or some kind of sensor on the player's head that tells the PC where to focus its graphics.
Images shown in the corner of the eye don't need the highest quality, because we can't perceive images in our peripheral vision as well as where we focus our eyes.

Second, it would make more sense to spend research time on the question of which monitor is the ultimate gaming monitor.
The ultimate gaming monitor depends on how far away you are from the monitor.
For instance, a 1920x1080 screen might be perfect as a 22" monitor on a desk 2 feet away, while that same resolution might fit an 80" screen 10 feet away.

There is a need to research this optimal resolution/distance calculation, and then focus on those resolutions.
It makes no sense to put a 16000x9000 screen on a desk 3 feet away from you; it would waste plenty of unnecessary GPU calculations.
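That resolution/distance calculation can be sketched from the common rule of thumb that the eye resolves roughly 1 arcminute; the function below is an illustrative estimate of my own, not a formula from any display standard:

```python
import math

def min_viewing_distance(diagonal_in, h_pixels, v_pixels, acuity_arcmin=1.0):
    """Distance (in inches) beyond which individual pixels become
    indistinguishable, assuming ~1 arcminute of visual acuity."""
    aspect = h_pixels / v_pixels
    width_in = diagonal_in * aspect / math.hypot(aspect, 1.0)
    pixel_pitch = width_in / h_pixels              # inches per pixel
    acuity_rad = math.radians(acuity_arcmin / 60)  # 1 arcminute in radians
    # small-angle approximation: pitch / distance = acuity angle
    return pixel_pitch / acuity_rad

print(round(min_viewing_distance(22, 1920, 1080), 1))       # ~34 inches
print(round(min_viewing_distance(80, 1920, 1080) / 12, 1))  # ~10 feet
```

By this estimate a 22" 1080p panel stops showing individual pixels at roughly 3 feet, and the same resolution at 80" at roughly 10 feet, which lines up with the numbers above.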

In practice, the "Single Large Surface" approach will not be as good as the O/S being aware of multiple monitors.

Why? Because the monitors do not present a single contiguous display surface - there are gaps. So if the O/S isn't aware, in certain monitor setups it's going to plonk a dialog box across a boundary or more, making it hard to read. And when you maximize a window it gets ridiculous. I don't think it helps to have your taskbar stretched across 3 screens either.

Nvidia actually does let you treat two displays as one (span mode), but they also allow you to expose them to Windows as separate displays (dual view). I prefer dual view to span, and don't have many problems with it.

I can quickly move windows to different monitors with hotkeys (or just plain mouse dragging). The problem I have is moving "full screen" stuff to different monitors usually doesn't work.

"Single Large Surface"/span mode is actually going to be suboptimal for most users. It's just a kludge to support OSes that can't handle 6 displays in various configs.

"Buy 6 cheap LED projectors, 800x480 resolution each; you can easily make a surround setup with them too.

Best part is no bezels. That's the ticket."

What in the world is that about? I think everyone here is looking at a minimum resolution of 1920x1080 or 1920x1200 for a 24-inch panel. To get that in a projector would cost about $2400 per projector right now.

For me to do that with this step, I would need 6 of them.

Now if you are talking about 30 inch panels, then the resolution per panel is 2560x1600.

Try finding a projector that normal people can afford with that pixel count, and then multiply the cost by 6.

Once Dell comes out with a 30-inch OLED panel without a bezel, I'm in for 6 of them.

I don't really see the appeal of a wraparound display in the concave style. No matter what you do it isn't going to be 3D, and it's not going to fool anyone.

However, imagine if you took 2 monitors and angled them in the opposite manner, so that they were at a slight angle away from you - convex, in other words. Then you could generate a convincing 3D image. In theory, if you had a semi-spherical display, it could generate a very convincing 3D image. It would take some powerful software, but it would look cool as hell.

Don't know if it was mentioned above, but you can throw 3 (well, 6 now) LCD projectors onto angled screens and do away with the bezel lines altogether. And no, it does not work the same as projecting one big screen. The screens on the sides definitely add to the experience.

If you've ever used a multi-monitor setup, you know that the monitor edges don't mean jack squat. For a triple setup, you don't actually look at the outer monitors. They're for peripheral vision. A lot of people miss the point entirely on that. Now for a x6 setup, yes, this would be annoying, but for 3x or 5x setups, you don't even notice it. After racing on a x3 setup, racing on anything else is completely pointless. It's something that truly needs to be experienced to understand.

I agree, if for no other reason than that a 6x2 setup would have two of the six monitors' top/bottom edges close to the center of the large image. I'm quite sure it would be unworkable for me, in the long run.

That doesn't mean the vertical monitor edges in a 3-monitor rig are as intrusive, however, and I tend to believe they wouldn't be.

They won't be, if your brain can get past them, any more than the low-resolution/high-pixelation image of a large, close projection would be.

But the big question is "IF." Yes, I can look past the low res image out onto the imaginary road beyond...but others might find it more difficult.

And maybe you can easily ignore vertical edge on a flatscreen, while I find it more difficult to do so....

Projection solutions should make both of those issues easier to solve.

Agree too that the two peripheral monitors (or projections) in a three-monitor/projector setup should NOT require either high resolution or even focus - ideally they are seen out of the corners of the two eyes, and are important only for determining the overall sense of speed when driving - for viewing "objects passing by at high speeds."

The person who took the pictures for the article should have been standing further back. Displays this large actually don't look good if you're too close to them. Something like the 2 x 3 setup would require you to be a good 10-15 feet away from it to make the bezels less noticeable and really show off the massive resolution.

Home theater guys will tell you when you're buying a big-screen TV that you need XX feet from the TV, minimum, for it to look good. Same goes for this setup. The best picture was the last one on the 2nd page, of Dirt 2 I believe. A little distance back really shows off how crisp and huge this display setup is. Seeing several balding white guys sitting 2 feet from this massive display doesn't really give you a sense of how it will look in person.

I can appreciate the desire NOT to see individual pixels in any given image, and, yes, I certainly share that ultimate goal...

...but at the same time, I think there IS something to be said for seeing a very large image up close--at a certain point, the "immersion" factor kicks in, and it really can help offset the downside of seeing a large image under low-resolution.

I drive racing simulations using an XGA-resolution projector, and I have it set up to throw a LARGE image not far from my eyes - a 60" horizontal image (4:3) only 36" or so from my eyes.

It certainly IS pixelated, yes. The Windows hourglass is just silly.

But when used to experience a "3D space" I find myself looking through the screen...and therefore past the pixels...into that 3D world.

I will agree in the next breath that seeing 2D chat can be a problem, however. :D

Yeah, it looks better from a distance than from up close. But most people won't use Eyefinity; the really great part of the card is its power. I mean, being able to run Dirt 2 at a playable frame rate at such a high resolution is quite impressive.

Can't wait till some PC magazines get their hands on one of those and show us the real difference in power compared with current cards.

7680x3200 is nice, but the DPI is practically the same. And if you sit right in front of them like the guy in the WoW picture, it's even confusing. For it to look like much higher DPI, you would need to watch the display from a much greater distance (and then you couldn't read the chat). Of course, that doesn't apply to the panoramic view in simulators or racing games, where the 3-display setup has its uses (arcades, maybe), or in FPS games for extra space for, say, a map, or the view from other squad members like in the old Hired Guns on Amiga and PC.

The big attraction for me here is the possibility to run these outputs through projectors, rather than flatscreen monitors.

I DO spend most of my PC gaming time driving racing simulators (primarily "rFactor"), and do use a projector to throw up a LARGE, CLOSE image. Pixelation is an issue, but, IMO, the rest of the tradeoffs make it worthwhile to go this route.

What intrigues me about this new card/system are two things: (1) the possibility of running this card's output through two- or three-projector rigs, in which one or two "widescreen" projections (covering most or all of a full-surround 180-degree "dome/planetarium" space) are overlaid in the center with a smaller and more highly detailed/higher-resolution third projection. If such a rig could be melded with realtime headtracking/eyetracking inside a projection CAVE *or better yet, a dome*, it seems to me we might finally realize the holy grail: a full-surround simulation space, at fairly nominal cost.

(2) The possibility of enabling at least that smaller, central region for 3D (stereo) imaging. Obviously, since this is an AMD card, any stereo output would necessarily depend on alternative solutions to Nvidia 3D... but there is at least one of those solutions that might work: TI's "DLP Link," which apparently can be used to enable some new projectors (ViewSonic) with the new Crystal Eyes 5 shutter glasses to allow 3D output (all without using Nvidia's cards and 3D-specific drivers)....

Sigh. And here I thought Anandtech readers were a brighter group of people. A 6-monitor setup pumped out of one video card is incredible, no doubt about it. But to the average consumer it's not even close to practical. Everyone is talking about the six-display setup's capabilities, and issues with bezels and LEDs, as though they are considering taking advantage of this. Guys, read between the lines: the real story is a GPU that can play DX11 titles so well that even 6 monitors at 4 times the typical person's resolution aren't enough to bring it to its knees.

Crysis? Fuck that. Yes, it was a fun game for a few days, but basically that's just throwing shit at my PC so you FUCKS can justify selling your ridiculous hardware. That doesn't strike me as a good, intelligent, and honest effort. That's not efficient. That doesn't wow me. KZ2 does. GT5 does (for the record, I'm no PS3 fan). And those games are running on a super piece-of-shit notebook GPU from 2005!!

So enough of this bullshit. ENOUGH! YOU WANT ME TO BUY YOUR STUPID HARDWARE? WOW ME. USE WHAT I HAVE FOR A FUCKING CHANGE. PUT SOME FUCKING EFFORT INTO IT. HIGHER ANTI-ALIASING AND HIGHER RESOLUTION IS NOT GOING TO CUT IT ANYMORE. I'M NOT ASKING FOR MUCH. 720P AND 30FPS IS GOOD ENOUGH FOR ME.
JUST TAKE WHAT'S LEFT AND SQUEEZE REALLY HARD. YOU KNOW? LIKE YOU FUCKS DO WITH THE CONSOLES. UNTIL THEN, FUCK YOU.

Sometimes less is more. That's why I love my HDTV: less resolution (720p), more screen (42"). And that's why I hate desktop LCDs: too much fucking resolution + tiny screens. ATI, this does not appeal to me at all. Give me a GPU that renders at low res and then scales my games (not movies) to 1080p LCD resolution so I can play Crysis on a cheap desktop LCD. This Eyefinity gimmick? Stupid.

Something I cannot do today is to have two displays of a cloned desktop, one being a different resolution than the other.

Why would I want to do that? Sometimes I would like to display a game on the television. It accepts VGA input (yes, yes, it's old tech), but I have to change the monitor to the same resolution as the TV in order to do that. You would think it would be so simple to display the same desktop on two monitors, but you can't do it if the resolutions aren't the same.

Obviously this card (and a hundred others) has the power to do that simple setup. I wonder if it lets you.

I said this after the last launch of GPUs, but I think AMD/NVIDIA are on a very slippery slope right now. With the vast majority of people (gamers included) using 19-22" monitors, there are really no games that will make last gen's cards sweat at those resolutions. Most people will start to transition to 24" displays, but I do not see a significant number of people going to 30" or above in the next couple of years. This means that for the majority of people (let's face it, CAD/3D modeling/etc. is a minority) there is NO GOOD REASON to actually upgrade.

We're no longer "forced" to purchase the next great thing to play the newest game well. Think back to F.E.A.R., Oblivion, Crysis (crap coding, but still); none of those games, when they debuted, could be played even close to max settings on >19" monitors.

I haven't seen anything yet coming out this year that will tax my 4870 at my gaming resolution (currently a 19" LCD, looking forward to a 24" upgrade in the next year). That is 2 generations back(depending on what you consider the 4890) from the 4970, and the MAINSTREAM card at that.

We are definitely in the age where the GPU, while still the limiting factor for gaming and modeling, has surpassed what is required for the majority of people.

Don't get me wrong, I love new tech and this card looks potentially incredible, but other than new computer sales and the bleeding-edge crowd, who really needs these in the next 12-24 months?

Why is so much attention given to its support for 6 monitors? That's cool and all, but who on earth is gonna use that feature? Seriously, let's write stuff for your target middle-class audience: techies that generally don't have $1600, nor the space, to spend on 6 displays.

I could do this with two monitors long ago. This has more monitors and maybe fewer bugs ("just works"), but it's still a reimplementation of an old thing. A real multi-monitor setup with independent displays, where a maximized window does NOT span across all displays, is much more usable.

What old card gives you the option of running 6 screens from one card?

And that should be a consumer product, not a professional one. And if you actually read the article, you'll see that you CAN set up each monitor independently. Or in 5 groups. Or in 4. Or in 3. Or 2. Also as one big screen.

Watch this video closely. There are 24 1080p monitors being rendered by 5800-class Quadfire. Notice how the screens lag when the camera pans? Chances are that maybe you wouldn't notice when up close, but it certainly is distracting...

Well, OLED can do without the bezel, and I believe LED backlighting can be made slim as well.
The good thing about this is, if it really does take off, we can finally GET RID OF TN panels. Because having a poor vertical viewing angle would make the experience yuck.

I had to create an account to comment on this. I am running 2 ATI 4870's with 3 Dell 2408WFP's and a 42 inch Sony XBR on HDMI

6 Dell 3008WFP's would be sweet and at 80FPS.

My only question... WoW? An ATI 1x series card from 15 years ago can run WoW at 80FPS at full res...

Why not give us some info using a game that can take advantage of a card like that.

If you are going to pick WoW, at least look at Guild Wars, where the graphics can actually scale to the resolution, and test the card out... need I say... does it play Crysis at that res at 80FPS? lol

Agreed - if they wanted to demo an MMORPG they should've used EVE Online; at least it has a recent graphics engine that doesn't look like ass. WoW's minimum system requirement is hardware T&L, for crying out loud... that's the original GeForce!

Good to see the hardware manufacturers bastardizing Moore's Law: not only does the technology double in power every 18 months (or whatever), it also suspiciously doubles in price at the same time! Well, at least for the new tech, to take up the slack for the price halving of the old tech.

There is a sweet spot for pixel density (~100 dpi) and minimal viewing distance (~20 inches). More pixels and bigger screens just mean a bigger minimal distance from the screen.
Unless there are concerns about myopia, big but distant screens don't make much sense for reading or playing. (People beyond 35 tend to pull up and "hug" their monitors anyway.)

Bigger resolutions are interesting for advertisers who don't want their walls of flickering commercials appear blurry when approached.

Also, 10 or more megapixels would make it possible to show photographs at native resolution.

There is no point in rendering a plain on a surround monitor setup, but rendering a long horizon with 3 monitors could be quite immersive indeed.

A setup of six 16:10 monitors has practically the CinemaScope aspect ratio of 2.4:1. Six $200 monitors could create a brighter alternative to the projector home-cinema experience.

"A setup of six 16:10 monitors has practically the CinemaScope aspect ratio of 2.4:1. Six $200 monitors could create a brighter alternative to the projector home-cinema experience"

No thanks, I'd take a single 1080p projector over 6 monitors. What's the point of all that res if you have distracting black bars running through it? LCD manufacturers could make super-large, super-high-res displays right now... if there was demand.

There is one inherent problem that needs to be addressed with the curved wraparound: it has to have several FOVs, one for each monitor. We ran into this using the Evans & Sutherland laser projector (5Kx4K and 8Kx4K - yes, those were real resolutions, using 16 PCs). Generally a video game will use a single FOV, whereas you would need one for each monitor in reality. The other way is to get distortion correction working for the whole picture, but that involves rendering an area bigger than the displayed resolution, and also doing it on the fly, which will induce a frame of lag in rendering.
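The per-monitor-FOV idea can be sketched as follows. This is a toy example of my own (the arrangement and numbers are assumptions, not the E&S setup), giving each screen its own camera yaw and frustum instead of slicing one stretched wide FOV:

```python
import math

def per_monitor_views(n_monitors, width_in, distance_in):
    """For n flat monitors arranged tangent to a circle around the viewer,
    return a (yaw_degrees, horizontal_fov_degrees) pair per monitor.
    Each screen gets its own camera orientation and frustum, rather than
    one huge shared FOV (which distorts badly toward the edges)."""
    # horizontal FOV of one screen viewed head-on at the given distance
    fov = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    # rotate each camera by one screen's FOV relative to its neighbour
    return [((i - (n_monitors - 1) / 2) * fov, fov) for i in range(n_monitors)]

# Three screens ~19" wide, 24" from the eyes
for yaw, fov in per_monitor_views(3, 19.0, 24.0):
    print(f"yaw {yaw:+.1f} deg, fov {fov:.1f} deg")
```

With those illustrative numbers each screen covers about 43 degrees, and the outer cameras are yawed by the same amount to the left and right.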

I would just like to see an FPS that shows you what you would see with peripheral vision. Then I could see the point of having a three-monitor setup for games. In real life, I can tell when someone is right beside me, but in an FPS, I can't unless I turn that way. Just having a wider view of what is in front of me doesn't do anything for me.

I believe Matrox's Triple Head gaming was supposed to do this, and not just offer a wider resolution. I don't think it had much support, but I do think that Quake III was one title that did support it.

The setup reminds me of the Philips Videowall technology of the 1980s. It consisted of a set of rear-projection monitors that could be individually addressed or spanned as one, with a barely visible seam between them that quickly became unnoticeable if you sat and watched a show.

We set one up in a retail location, and since it coincided with the VHS boom at the time, we showcased new titles on it, which led to really high sales. Even Montserrat Caballé came to the shop to give a short recital "broadcast" on site. (And, yes, once a disgruntled employee managed to put a truly nasty sex tape on at peak shopping hours.) I tell you, we wowed the rubes. But it was mega-$$$ at the time.

The Samsung concept display shown in the article looks attractive, and a first step toward getting it right. One can see IMAX+3D home theaters in the offing in a few years.

Blurry textures, blocky models, leaves and vegetation made out of 2D sprites, and overly shiny, plastic, artificial-looking trees and rocks will still look as crappy at 50 megapixels as they do at 1.8 MP.

Well, not quite, but a lot more advanced than what ATI is talking about. It's called a CAVE. You sit in a large cube, every wall of which has 2 projectors back-projecting onto it. Put on the 3D glasses and it's like the holodeck, in that everything is in 3D and surrounds you.

It's used for things like styling cars - designers can make the inside of a new car in CAD, then sit inside it to see if it really works.

Costs a fortune, however, and wouldn't fit in the average house :)

It's cool and all, but it makes it like you're playing all your games through a football (gridiron) helmet due to the edges of the monitors intersecting the display.

Also, unless you use one of the far right-side monitors as your primary desktop, you'll have that annoying run-off problem with the mouse cursor where it slides over to the display to the right when you just want to click the [X] for the window in your current monitor, just due to our ingrained nature of over-sliding the mouse to the upper-right corner when we want to close a window. :)

Using a left-side monitor as my primary, I can solve this with UltraMon by making my second display not only to the right but LOWER than my primary display, so that the cursor still stops at the upper right edge of my primary display.

I wonder how this system will handle it. Perhaps it will be smart enough, or have a setting, to keep the cursor from sliding beyond the right edge of the current screen when a program is maximized on that screen, so that you can hit the [X] easily like you can on a single-display setup (or a right-side display in a multi-display setup).

As I understand it, you can't just maximize windows into screens. The applications and OS don't know your desktop is spread across several screens, so using maximize will maximize the window to cover all available screens.

Which kinda sucks if you want to have several different windows, each fullscreen on its own monitor.

Can you have Windows manage the monitors instead of going through this AMD software trickery? I would imagine you would want them to appear to Windows as 6 separate monitors, but then turn on the AMD single-surface-whizbang when you launched an OpenGL or DirectX app.

I see they're touting Linux support for this. I hope they start taking their Linux drivers more seriously.

This will be huge news for guys using DMX and Chromium (not the Google browser, the other Chromium) to do the giant wall-o-monitors displays.

It says that you can manage windows within the software.
In the article it mentions taking 6 monitors and dividing it so 4 are one "screen" and the other two form a "second" screen. I presume that means within each grouping applications would maximise as they would if you had 2 physical monitors and were letting Windows control them.
It's like a grid system (which already exists within at least the NV driver options) but spread across multiple monitors in groups, I would assume.
Windows will see whatever ATI's drivers show it, so if ATI lets you manipulate monitors into groupings, that's what Windows will see.

This sort of setup isn't ideal for all games, and I doubt anyone would argue it is, but it is great for some titles.
In RTS games the borders don't matter.
In racing games the borders don't really matter, the added visual field is very advantageous if you are using an in-car view. Going from a 20" flanked by two 17" monitors to a single 24", I notice the loss of peripheral vision, and the monitor breaks weren't far off the roof pillar positions anyway.
In flight sims, as has been said in the article, not a problem.
In FPS games maybe it will be more of a nuisance, but not every game is an FPS.

I would think it would be a problem in ANY game. It even looks like it from the screenshots in the article.

Look, the best way to implement a multi-monitor setup is what has been done in games already. Supreme Commander is a good example: you make a monitor setup with each monitor acting independently of the others.

Open map on one monitor, game on other, stats on one, etc.

Having a large screen with bezels like that is not impressive and doesn't work to the user's advantage in a game. Having a multi-monitor setup with the game outputting the scenes you want to each monitor would be far more impressive. There are so many more ways a game could take advantage of that.

The game that comes out with those features implemented well into the gameplay will drive sales of these setups in the gaming market. But till then it's never going to take off.

Define 'maxed out'. Was multisampling maxed and triple buffering turned on? I can't imagine a single card could drive that resolution at 80fps. If so, wow: NVIDIA is in serious trouble and AMD is going to be getting -a lot- of customers.

Not necessarily. Simply put, triple buffering still allows the GPU to push the maximum frames it's capable of, but it throws out frames not synched to the display frequency. So while the GPU may be rendering at 80fps internally, you only see 60fps (assuming a 60Hz display).
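That behaviour can be illustrated with a toy simulation (a simplified model of triple buffering with vsync, not how any particular driver implements it):

```python
def visible_frames(render_fps, refresh_hz, seconds=1.0):
    """Simulate triple buffering with vsync: the GPU renders flat out into
    spare back buffers; at each display refresh the newest completed frame
    is flipped to the screen, and older undisplayed frames are discarded."""
    done_times = [i / render_fps for i in range(int(render_fps * seconds))]
    shown = set()
    for tick in range(int(refresh_hz * seconds)):
        now = tick / refresh_hz
        ready = [i for i, t in enumerate(done_times) if t <= now]
        if ready:
            shown.add(ready[-1])  # newest completed frame wins
    return len(shown)

print(visible_frames(80, 60))  # GPU renders 80fps internally; you see 60
print(visible_frames(50, 60))  # below refresh, every rendered frame is shown
```

The GPU's internal rate stays uncapped, but the number of frames that ever reach your eyes is bounded by the refresh rate.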

Can someone do my sanity a favor and ban this idiot? Do you think he actually has a job? Hell, does anyone even think he has a frakking GED? I love how he thinks every review site is engaged in a mass conspiracy against AMD.

Yeah, that's why the Radeon HD 4870 review was titled "HD4870 - The Card To Get".
Almost every preview article (this is a preview article) doesn't have any kind of flashy subtitle/comment, just a quick summary of what it's about.
When the reviews of the ATI DX11 cards come out, I am sure they will have some sort of zany subtitle/comment about how amazing they are (since compared to the current gen they are sure to be amazing, and based on rumour I doubt that by then we will have anything from NVIDIA except pre-release benchmarks, if they feel desperate).

I didn't see this in the article. How does a graphics card with two outputs drive 6 displays? What hardware is needed to do this? Is there some kind of splitter or DP box they all plug into? I have two Dell 3007WFP-HCs already, and this is making me want to buy a 3rd, but I don't know if I need anything else to drive it.

I think they use the "Trillian" AIB, also possibly known as the "Six". It features six Mini DisplayPort outputs. Other articles on the net show ATI demonstrating a 24-monitor setup using 4(!!) Six cards.

I don't know which GPU the "Six" cards use; rumours mention both Cypress (aka R870) and Hemlock (aka R800). From the coolers shown, I think it uses "just" Cypress (a single chip, not a dual).

I also believe that the long so-called 5870 card shown in photos around the net is Hemlock (5870x2), not the 5870.

And for those of you concerned about your power bill, rumours state that the 5870 uses 28W(!!!!!!) at idle and in 2D.

This ATI generation rocks. I only hope NVIDIA will get their card out and survive. Any way you look at it, their future is bleak: their chipset business is coming to an end except for AMD CPUs, and they're late with the GT300.

from page 1:
"The OEMs asked AMD for six possible outputs for DisplayPort from their notebook GPUs: up to two internally for notebook panels, up to two externally for connectors on the side of the notebook and up to two for use via a docking station. In order to fulfill these needs AMD had to build in 6 lanes of DisplayPort outputs into its GPUs, driven by a single display engine. A single display engine could drive any two outputs, similar to how graphics cards work today."

I can't make much sense out of this, however. Current cards have two independent display controllers. These two display controllers afaik don't really share anything, so referring to them as one display engine doesn't really make a lot of sense. So any RV870 really has 6 display controllers (though probably not that many internal TMDS transmitters, so if you want to drive more than 2 DVI/HDMI displays I'd guess you're out of luck, or you'd need a card with external TMDS transmitters)?

Am I the only one here who thinks the real story is not the multi-monitor support, but rather the ridiculous GPU power driving them all? I realize they haven't fully disclosed the specs publicly, but 7000x3000 resolution at over 60fps? The article barely seems impressed by these numbers. Was this the performance expected from single-GPU setups for this generation? I didn't see this coming at all and I'm completely floored!

Also, I would just like to add that I have always preferred being able to segregate displays so that I can easily maximize multiple applications within their own screens. Having everything as "one giant display" has been possible for years and is less than desirable for everything BUT gaming... IMO

LCDs require control electronics around all 4 sides, making the bezel a necessity, though it could easily be 1/4 the width of current bezels. I messed around with stitching the images from 3 rear-mounted projectors together. The image was seamless, but the price would be astronomical. That, and you have to have a VERY good screen to project onto, or all your wonderful resolution gets muddied.

Or the NEC X461UN, which looks very similar (btw, you don't need the 460UTn; the 460UT would do, as there's no use for the built-in PC in this scenario).
Those are really expensive (>5000 USD), huge, and low-res (1366x768). That's really for big video walls, not suitable for some monster gaming setup. But really, it shouldn't be much of a problem to manufacture 24-inch or so TFTs with similar slim bezels. There just hasn't been a market for this up to now...

You make a good point, but the other side of the coin is also true - Intel processors are very strong, and AMD processors suck by comparison.

It's a pity ATI stopped making chipsets for Intel motherboards. They'd make money, Intel would still sell processors, and the only real loser would be NVIDIA. It's surprising how many chipsets NVIDIA sells. I don't know many people who would choose an NVIDIA chipset, but it seems they sell well through HP and Dell, where no one asks or knows the difference. ATI should really make chipsets for the Atom too. That would be a great combination.

I'm not sure why you think AMD's CPUs stink. All the benches I've seen show they can run any game out there and push more than 30fps in all tested games not limited by the video card, and even push the frames to where the video card ends up the bottleneck. No?

Even compared to the i5, the PII 9650 held its own quite well in a lot of areas.

For the past few years Intel has definitely had the trashiest iGPUs, and probably will for at least the foreseeable future. And I wouldn't count on Larrabee to change that all that much by the time it comes out. You can have the strongest CPU in the world, but if you have GPU trash like Intel's, you can't game at good resolutions and speeds anyway, so you can't make good use of the fastest CPU in the world.

I think people sometimes instinctively balance things out, and forget they are even doing it.

Keep in mind that Phenom II processors are the same size as Nehalems, and you're forced to compare the low end of a brain-damaged line with the highest-end AMD CPU, and AMD still generally comes out the loser. That's not a good CPU design, if you think about it.

I don't agree with your remarks about great processors not mattering if you don't have the GPU to go with them. Stuff like that just makes you sound irrational and too pro-AMD. Not everyone needs powerful 3D graphics, but more to the point, you can get an ATI card for your Intel motherboard. So, sorry, they don't need to make a good video card for you to benefit from it.

The Apple store on Michigan in Chicago is using something very similar to what's being shown here. If I recall correctly, they had 8 panels arranged 2 across by 4 down. They were running an animation showing all the iPhone apps that were available. I noticed the display from the other side of Michigan and was impressed enough to cross the street to see how they did it.

The display was probably purpose-built by the LCD manufacturer, as the seams were only about as wide as a single thin bezel on Samsung monitors. Reply

Hmm, running that is pretty nice, especially at the framerate given, but God save us from the electricity bills... screens of this size with good colour reproduction draw at least 80 W each, which puts nearly half a kilowatt into the screens alone.

Anyway, I am eager to get a detailed review soon... and hopefully the NVIDIA counterpart as well... $1000+ graphics cards are just nice to see in benchmarks, so fingers crossed for fast competition... Reply

I've yet to see someone use a monitor setup like that for gaming. It looks terrible with the lines between monitors. Video card manufacturers have been trying that for ages now; Matrox already tried and failed at it. Let it die, AMD.

Mission-control stuff, or a large desktop for apps? Sure, that'd be nice. But to game on? Yuck.

Case in point: look at the WoW screenshot... the screen is cut right in half where the character is located. lol

Now if some monitor company decides to make a monitor that lets you remove the bezels around the edges for a seamless setup, then we can talk. Reply

Apparently you never played on a 3-monitor setup. (Yeah, a 6-monitor setup would suck.)

I can tell you from personal experience that it's awesome.
I have three 22" widescreens in a Matrox TripleHead2Go setup, so I have no problem with bezels in the middle of my screen.

Yeah, of course the bezels bug me, but only as much as the roof pillars do when I'm driving my car.
And yeah, a 6-monitor setup just doesn't work in most games (except games like C&C); it would be the same as putting a band of duct tape at eye height around the car.

If you want to see for yourself, go over to Matrox's gaming site to see some screenshots from, for example, WoW.
Or see how it looks on YouTube.

NFS is also awesome on three monitors.

And with the low prices of monitors, anyone can now have a 3x 22" setup. Reply

What actually counts is that the graphics card is powerful enough to support that high a resolution! And this is just the first-generation DX11 card. I doubt we will see DX12 anytime soon, so I guess the third-generation DX11 cards will rock the earth and break the walls! Reply

Case in point: look at the WoW screenshot... the screen is cut right in half where the character is located. lol

I wonder why they didn't use pivot mode to circumvent that problem. With five displays in portrait in a single row instead of two rows, the center display would not cut the area with the character in half, while at the same time providing the desired height at the cost of some width. That would work for me... Reply
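The pivot idea can be checked with simple arithmetic. A minimal sketch, assuming 1920x1200 panels (the panel size is my assumption, not from the article): a 3x2 landscape grid puts its centre on a seam, while a 5x1 portrait row puts its centre in the middle of a panel.

```python
# Compare a 3x2 landscape grid against a 5x1 portrait (pivoted) row.
# Panel resolution 1920x1200 is an assumption for illustration.
PANEL_W, PANEL_H = 1920, 1200

# 3 columns x 2 rows, panels in landscape: the horizontal seam between
# the two rows runs straight through the middle of the picture.
landscape_3x2 = (3 * PANEL_W, 2 * PANEL_H)

# 5 panels rotated 90 degrees in one row: width is 5 panel *heights*,
# height is one panel *width*; the centre falls inside the middle panel.
portrait_5x1 = (5 * PANEL_H, PANEL_W)

print("3x2 landscape:", landscape_3x2)  # (5760, 2400)
print("5x1 portrait: ", portrait_5x1)   # (6000, 1920)
```

Similar total area, but the portrait row trades a little height for an unbroken centre, which is exactly what the crosshair/character complaint is about.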

Please read the entire article before posting. It's clearly stated that AMD already has one manufacturer lined up to make monitors with very thin bezels, and for FPSs, you would use an odd number of displays so your crosshairs are in the center of a monitor. It's all in the article. Reply

[quoting block]I've yet to see someone use a monitor setup like that for gaming. It looks terrible with the lines between monitors. Video card manufacturers have been trying that for ages now; Matrox already tried and failed at it. Let it die, AMD.[/quoting block]

I think what makes this a brilliant thing isn't that you _can_ hook up 6 monitors to a single graphics card, but that it works as if it's just one monitor. I think THAT is the innovation ATI has delivered - all the other implementations I've seen basically treat each monitor as a separate entity and have to handle them separately within the OS. This treats it all as a single 7500x3200 display. THAT is the cool part, I think.

I think one of the things Matrox did back in the day was use some clever tricks to get the OS to recognize the displays correctly. Looking back on it, there was this concept of a "primary display", "secondary display" and "tertiary display" - not "this is my one huge display".

Perhaps it's just a name swap, but I can see that this sort of abstraction is what you need for arbitrarily large displays (whether one giant single panel or multiple smaller ones). Reply
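The "one big surface" abstraction is easy to sketch. A minimal illustration, assuming a 3x2 grid of 2560x1600 panels (both figures are my assumptions for the example): the OS sees one resolution, and only the scan-out layer maps a pixel back to a physical panel.

```python
# Sketch: expose a grid of panels as one unified surface, the way a
# driver might. Panel size and grid shape are assumptions, not ATI's.
PANEL_W, PANEL_H = 2560, 1600
COLS, ROWS = 3, 2

def virtual_resolution():
    """The single resolution the OS and applications see."""
    return (COLS * PANEL_W, ROWS * PANEL_H)

def panel_for_pixel(x, y):
    """Map a pixel of the unified surface to (panel index, local x, local y)."""
    col, local_x = divmod(x, PANEL_W)
    row, local_y = divmod(y, PANEL_H)
    return (row * COLS + col, local_x, local_y)

print(virtual_resolution())         # (7680, 3200) - one logical display
print(panel_for_pixel(3000, 2000))  # (4, 440, 400) - bottom-middle panel
```

The point is that `panel_for_pixel` lives below the OS: Windows and the games never see six displays, only the single 7680x3200 surface.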

That is true, but you're not seeing the big picture - and yes, that pun was intentional.

Current monitors strapped together like in the demo are worthless for things like L4D or Counter-Strike. But what about future monitors? I can see vendors building a single giant monitor out of 3 or 6 or more panels, making it more or less seamless, and marketing it to the extremely hardcore gamer. Reply

OK, but then why not just have it be one monitor to begin with?
One whole panel will always be better than pieces strapped together, even if they are without bezels. One single video port and cable, even if it is "six-link displayport" or some other freaky name, will be better than multiple cables. Reply

For games, yes, it's useless, but have you ever seen the gigapixel projects (and the displays for viewing very high resolution images) that universities run? This will simplify them enormously: before, they had a PC for every screen or two and rendered in software. Now they can just connect the screens to a single desktop if 6 displays is all they need, or, when running with software rendering, cut down the number of PCs by a factor of 3. When you need high resolution you can't just use one big display (or projector), because its resolution would be too low for that work.

On the other hand, this demo is just showing off the multi-monitor capabilities, and people do use more than one monitor in the real world - maybe not as one giant screen (though they do that too, of course), but that's just one of the modes you can use. And for those who use that mode, it's now greatly simplified: before, it was hell getting acceleration and such working across both screens at the same time, or stretching some apps across two screens; now it's no problem, as it's just one single surface for Windows and the applications. But as I said, you simply can't build a single display with a resolution of, say, 11520 x 6480. Reply

[quote]One single video port and cable, even if it is "six-link displayport" or some other freaky name will be better than multiple cables. [/quote]
Is there a problem I don't see?
It's the same as with RGB component cables: you have 3 of them bundled together and only the ends fan out.
Not seeing the problem here. Reply

How many monitors have you seen that can do a 10240 x 3200 display? I think part of the point of this demo was the sheer throughput of the card. The technology just isn't there yet. This is where the flexible panels of the future will shine. Reply

Cost and manufacturing complexity. It is much, much easier and cheaper for them to build smaller displays, so it is much cheaper and easier to put 3 small displays in one package than to make one huge panel with the same resolution. Reply

And having 6 panels strapped together makes 6 points of failure. If any one of the panels has bad pixels or another bad component, you will have a mess. Replacing one will probably lead to colour-temperature differences, and right next to each other that kind of issue will show. I agree that it would need panels without bezels to work right for gaming. It would be annoying as a desktop too. Can you imagine trying to use IE or Firefox with lines in the middle of your web pages? Reply

And having one giant panel makes 1 point of failure. If you have a problem, you replace the whole thing, which is MUCH more costly than replacing one panel out of six - indeed, it's probably more costly than replacing all six panels together.

The technology for such a display has been around for at least a year, if not longer. I recall seeing a three-panel display at a tech conference (can't remember which), where a high-end computer was driving Crysis on all three displays. Reply

I see it as a good thing. Panels have the advantage of "if one goes bad, replace it", as opposed to a giant monitor that gets a bad pixel and is irreplaceable due to cost. $150 and I'm good as new. With a $12K monitor, I imagine I'd just keep using it with a few bad pixels. Reply

FullCon Solutions, LLC partnered with Duke University and Iowa State to promote the benefits of using 6-sided, fully immersive CAVEs in the AEC and marine industries. Full-scale analysis of 3D models is dramatically improving communication between all stakeholders. Ultimately, this technology allows our clients to design better, clarify expectations sooner, and profit from improved decision making.