EconolineCrush writes "While AMD's Eyefinity multi-display gaming tech is undeniably impressive at first glance, digging deeper reveals key limitations. Some games work well, others not at all, and many are simply better suited to specific screen configurations. A three-way setup looks to be ideal from a compatibility perspective, and given current LCD prices, it's really not all that expensive. But would you take that over a single high-resolution display or a giant HDTV?"

But would you take that over a single high-resolution display or a giant HDTV?"

If I'm sitting at my desk playing games, an HDTV at 1080p is going to look absolutely horrible. So will even a ridiculously expensive large-format display. Even three low-end 20 inch monitors will give a much higher resolution, and a much, much higher DPI, than I could get for the same amount of money spent on a single large display.

That is primarily what they are using to decide whether or not a game "supports" the ATI multi-monitor technology. Some games keep the field of vision the same and just increase the resolution (or worse, stretch the image if the monitors are set up in a non-standard aspect ratio), which makes the technology worthless for them. Of course, people with widescreen monitors have been dealing with this problem for a while with games like BioShock.

Personally, I think much of the dissing of this stuff is coming from people

Not a lot that I'm aware of (I know you could change the FOV in the Quake series, and I bet most flight sims will let you). I'm about to do a 3 monitor Eyefinity setup for iRacing [slashdot.org]; it lets you adjust the FOV and enter the bezel width for your monitors, so it can be nice: example [youtube.com]

Supreme Commander makes pretty decent use of two monitors, but it's the only game I've found where it works.

On the other hand, I've been using multiple monitors for music production and video editing for a long time. Having the ability to drive multiple accelerated displays is a good thing. When doing music, for example, I have the DAW "track" view on one screen, the virtual "mixer" on another screen (controlled by one of several MIDI devices) and on the third screen, w

That is good since it's just a splitter for the video signal; it takes very widescreen video from one output of whatever powerful card (or cards - SLI, et al.) you like (out of those which can output such a resolution; simply multiply the typical horizontal resolution by 3) and displays it on three monitors. This new AMD multi-display tech is essentially the same thing, built into the card (but the Matrox solution is vendor-agnostic and has been on the market for a few years already).

With annoying gaps between the screens. Just watch: you'll fail to notice something because it's straddling a gap.

I've been doing a multi-monitor setup for a while, and in practice this isn't a problem. Usually the different items you're working on sit on different screens. Now and then you'll stretch a window across multiple monitors, but most of the time I prefer 2-3 monitors over one huge one. Normally I have my email/IM/calendar on one monitor, my active work on a second monitor, and a browser or documentation on the third if I have it (usually I have 2). Works really well.

Also, when the time comes, I guess it's only contacts for you, not glasses (how do so many people put up with them?)... not that you could even wear glasses after the amputation of your nose that you already performed so it wouldn't irritatingly obstruct your field of vision.

Also, when the time comes, I guess it's only contacts for you, not glasses

With glasses, you can make the world move relative to the rim by moving your head, and your brain uses this to help filter out the rim. Do PC window managers have an analogous feature to nudge all windows?

And the same is true of the most common apps that span several monitors: games, where your head (camera) moves. Multi-monitor use with other apps usually works on the basis of "one window per monitor", so there's no issue as far as spanning and window managers are concerned.

With glasses, you can make the world move relative to the rim by moving your head, and your brain uses this to help filter out the rim. Do PC window managers have an analogous feature to nudge all windows?

More importantly, with glasses frames and car frames in your vision, you can make a minute movement with your head to see what is behind them, and your subconscious does that for you. With a computer screen this reflex does not work.

The same way you can drive your real car with blind spots. With triple head your resolution is not stretched, it's increased: you still see everything the same on the center screen, but now you get more on the side monitors. There are tricks to reduce the bezels, such as lining up the side monitors to hide their bezels. Also, once in the game the bezels tend to be ignored; I forget they are there.

I'm not sure I will have a place at my disposal with the amount of space required; not that I would want to fiddle with a proper screen for back projection... when there's really nothing wrong with front projection?

However fun this always looks, it's not that practical, I guess. There aren't many applications that would justify not only the trouble (and the additional expense and space consumed), but also actually standing inside. The uses that do seem nice should be, well, nice enough with 3 screens filling most of your FOV while you're sitting; basically a pimped-up three-monitor setup. It requires much less space, sets up quickly in a small room, and is easy to do with three cheap front projectors.

they don't seem to realize there are LCDs made specifically without borders for this purpose, and they're not all break-the-bank priced.

Then why don't I see borderless LCD monitors in Best Buy or Office Depot?

macs were not only the first PCs

I'm sorry, but the IBM Personal Computer 5150 was out in 1981, while the Macintosh didn't come out until 1984.

but are still to this day PCs.

Every country has an internal revenue service, but only one has "the IRS". Likewise, "PC" when abbreviated specifically connotes Lenovo-compatible personal computers. (IBM sold its PC business to Lenovo half a decade ago.) Thus a Mac is a personal computer, but it is not a "PC" until you install Boot Camp or a VM because it do

The things you say don't make sense. 3 20" monitors will not give you "much higher resolution". At best, they'll give equal resolution. Assuming you buy 3 1920x1080 20" monitors and not 3 1680x1050 20" monitors. Because a 1080p monitor's resolution is 1920x1080.

As for looking "absolutely horrible", I suggest you try it before you bash it. I've been using a 37" 1920x1080 LCD as my primary monitor for years and it's freakin' awesome. Hooks up via DVI (with HDCP support). I sit back about 3' from the display and I love it. In fact, I can lean waaaaaay back in my chair and still read the text without the slightest bit of eye strain. And games look great.

Psst. I'll let you in on a little secret. These TVs you speak of... They're LCD monitors! A 37" 1920x1080 "TV" works exactly the same as a 20" 1920x1080 "monitor". You can hook up as many "TVs" as you have display ports on your computer. Your video card(s) can't tell the difference. But don't tell anyone! It's [apparently] a secret.

It makes perfect sense. Three 1920x1080 monitors (which are rather cheap nowadays) give you 6 megapixels. You can find single screens with around that amount of pixels...but they will be significantly more expensive, also "per pixel", than three cheap ones.

Apparently you didn't understand his claim. His claim was that a 20" monitor would provide greater resolution than a 1080p TV. The highest resolution 20" panel you'll find at Best Buy or NewEgg has a resolution of 1920x1080. Exactly the same as a 1080p TV. There also seems to be some confusion about the use of multiple TVs as monitors. They're exactly the same. If you can connect 3 20" 1920x1080 monitors to your computer, you can connect 3 37" (or 52" or 60") 1920x1080 monitors to your computer. Goi

I tried a Philips 37'' 1920x1080 LCD last year. It was pretty awful. It was hard to keep focus and it looked much worse than my (several years old) monitors.

I'm back to my dual 22'' monitors. Not that I didn't plan to keep a secondary monitor anyway - there is no comparison on the total resolution as well as just putting windows on the secondary monitor. It is awesome for developing software.

I've also bought one of the new AMD cards to try 3 monitors for gaming eventually.

> And Here I was thinking 1080 lines of vertical resolution should be enough for anybody.

Everyone aired their grievances over the seemingly pervasive march toward fewer vertical pixels than even mid-priced LCD monitors had 3 years ago (ah, the happy days when you could actually buy a laptop with a 1920x1440 display) in a different Slashdot topic a few months ago. ;-)

Even three low end 20 inch monitors will give a much higher resolution

Low-end 20 inch monitors seem to be 1600x900, so with three of them you would get just over twice as many pixels as on a 1920x1080 (full HD) screen. It also seems I can get a 2048x1152 display for less than three low-end 20 inch monitors.
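The arithmetic here is easy to check; a quick sketch (the panel resolutions are the ones mentioned in the thread, not verified product specs):

```python
# Compare total pixel counts for the display options discussed above.
def pixels(w, h, count=1):
    """Total pixels for `count` identical w x h panels."""
    return w * h * count

three_low_end = pixels(1600, 900, count=3)  # three low-end 20" panels
full_hd = pixels(1920, 1080)                # a single 1080p screen

print(three_low_end)             # 4320000
print(full_hd)                   # 2073600
print(three_low_end / full_hd)   # ~2.08, "just over twice as many"
print(pixels(2048, 1152))        # 2359296, the single-screen alternative
```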

Obviously you've never owned a 30" LCD; 2560x1600 is a wonderful resolution. I've got one of those screens, and it blows away anything you can buy in multi-monitor.

A 3 monitor setup with 1920x1200 displays gives an effective resolution of 5760x1200. That's nearly 70% more pixels than your 30" 2560x1600 display. Nothing wrong with a huge monitor, but it's not better for every purpose. Personally I find a multiple monitor setup more useful for the way I work. YMMV.

Actually, one of these plus a 20" portrait (1200x1600) screen to its left and right is a fair bit more awesome than just the 30" display. Since, if you're doing that, you probably run two video cards anyway, throw in a 1080p projector, just in case you want to watch a movie in bed or something. Ten million pixels. Fun!

The problem with a big TV is that it's only 1920x1080 max, with pixels the size of my fingernail. (There's a recent "obligatory" xkcd about this; I'll let some karma whore who cares dig up the link...)

The problem with single high-resolution displays is that, while they keep the pixel density sane, they stop at only 30" (without going insanely expensive) and 2560x1600. Even these are way pricier than the same number of pixels in two or three smaller monitors.

Sadly this is the new norm: HDTV-sized displays are the new in thing, and that seems to be about where companies have stopped packing in pixels.

My 17" dell laptop is running at 1920x1200, and it's about perfect as far as DPI goes. The 21" monitor next to it only does 1680x1050. I've seen LCDs as large as 24" that only do 1920x1200. Come on, have all the manufacturers just given up at "1080P"?

What's worse, they take all those pixels and waste them on 20 pixel wide window borders and giant glossy buttons. I'm fine with accommodating the visually impaired, but I usually want more resolution so I can fit more *useful information* on a screen. I'm glad I switched to linux a long time ago, else I'd be a very sad person these days without at least some level of control on my GUI.

In multi-card/X Screen (:0.0) mode, you only have Xinerama - and minimal configurability.

In multi-card/multi-screen mode, you have RANDR per screen + Xinerama extensions for glass layout.

If the above sounds like some obscure language, it is. The X code only really understands 2 heads, with lots of X developers trying to kill multiple-GPU configurations (thanks, Intel). It's not pretty, but the skeleton is mostly there. Look for the comment below: http: [slashdot.org]

You obviously have not played a racing simulator in triple screen. There are tricks to reducing bezel size, and monitor vendors are shrinking bezels as they catch on to triple head. The bezels tend not to be noticeable when you are immersed in a game. Some guys who are hardcore just remove the monitor casing to get them even thinner.

What's not mentioned in the summary is that, if the game properly supports it, the screens on the right and left of your setup get tilted inwards a little and your field of view is increased by 3X (assuming a 3 display setup). This means that you get all the view you would normally get on the central screen plus a massive amount of the peripheral vision that we all enjoy in real life but never get in gaming. Is there a gap from the screen bezels? Sure, but you barely notice it because you don't focus on the left and right wings. You just focus on the central display and use the other two to detect motion you wouldn't have otherwise seen (such as an enemy approaching you from your left).
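The field-of-view gain is just trigonometry. A sketch, assuming each angled side monitor renders its own correctly rotated slice of the scene; the 47 cm panel width and 60 cm viewing distance are made-up example numbers:

```python
import math

def monitor_hfov(width, distance):
    """Horizontal field of view (degrees) subtended by one monitor
    of physical `width`, viewed head-on from `distance` (same units)."""
    return math.degrees(2 * math.atan((width / 2) / distance))

# Hypothetical setup: three ~22" panels, each 47 cm wide, at 60 cm.
single = monitor_hfov(47, 60)
# If the side screens are angled to continue the arc of view, the
# total field of view is simply three times that of one screen.
print(round(single))      # 43 degrees for one monitor
print(round(3 * single))  # 128 degrees across the triple setup
```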

That also means the periphery monitors don't really have to be of the same "quality" as the main one, lowering costs even more (which only strengthens the point that, in "price per pixel" terms, it's cheaper to buy three average screens than one with a massive resolution).

Also, people don't seem to mind "screen bezels" much when wearing glasses or driving a car; heck, not many cut off their nose so it won't be obstructing their field of view...

That's operating on the assumption of a first- or third-person POV (shooters, MMOs), where typically you re-orient your field of view so items of immediate interest are always front and center. This isn't necessarily true of an RTS; having to scroll the playfield around defeats a lot of the advantage of extra display real estate, compared to seamlessly viewing and mousing to more of the playfield for more clicks-per-second.

I imagine that in the case of an RTS, the extra screens could be better used for auxiliary views (such as construction and stats) and data (as some WoW add-ons and Supreme Commander already do).

I'm not sure if it was Supreme Commander, but I remember one RTS where you could point to a place and make a mini-screen out of it, so you could keep an extra screen with your base, the enemy's base, etc.

If the one big screen was wider vs taller that would work. But most of them are too tall to keep docs and apps at a comfortable level. I would rather swivel around a little than look up and down very much.

Two very small, fixed LCDs (OLEDs?) combined with eye tracking (a camera?) and a system of small, light mirrors on voice coils: when you move your focus, the mirrors move so that you're still looking at the display. You could combine one high-resolution display with multiple lower-resolution displays for your peripheral vision; the mirrors could probably be engineered to eliminate the gaps. Are there any ~1" 1080p displays?

Some time ago, there was a university experiment combining a high-resolution (centered) display with a large-surface but low-res projection for peripheral vision. Never heard of it again, so apparently it wasn't that successful. IIRC, the high-res part was fixed, though. Small 1080p displays should be available; there are plenty of LCD projectors. Possibly not in an inch, and probably not too cheap.

According to ATI, support for Eyefinity on Linux will be enabled by a 'future Catalyst release'. Three releases of the Catalyst driver have come and gone since I got my Radeon in February, and they still have zero support for Eyefinity on Linux. Which is irritating as hell, because the famed YouTube demo of Eyefinity running a flight sim on 24 screens was a Linux box.

Have their Linux drivers improved since their announcements from a few years ago? I remember they said their drivers would be improved. From what I read and heard, NVIDIA is still better. Also, what is the status of their open-sourced drivers?

It won't make you feel any better, but nVidia is just as lame. I have a GTS 240 and it was unsupported by several driver releases which came out after the card hit the streets. I had to use an old driver release under which it worked, but VDPAU didn't.

I was (and still am) excited about this for a digital signage setup, since driving 6 individual screens at native res from a PC source was a challenge without real expensive gear (like an NVIDIA QuadroPlex), so at $500 this would be a bargain for certain setups. But without DisplayPort the card can only drive 2 screens via DVI/HDMI; for anything else you need active DisplayPort-to-DVI adapters (not a passive dongle like the MBP's, since the card only has 2 DACs), which run $99 each and are in terribly short supply thanks to this card. So if you want to use 6 screens without DisplayPort, tack on another $400.

Actually, you only need to shell out for one of the $100 adapters if you're planning on using DVI to connect all of your monitors. You can connect 2x DVI and 1xVGA using a $25 passive adapter. VGA is sub-optimal, I know, but that's what I did. I can keep the remaining $75 and put it toward another 5850 card, which will solve the DVI problem and give me CrossFireX performance.

I recently bought a Radeon HD 5670 for $100. It has Eyefinity tech "with support for up to three displays". "Yeah, so what?" I was thinking... not exactly a new thing. Well, tricking the OS into thinking multiple displays are actually one? NEW! Glad for this /. article.

PS: the card was made by XFX and purchased through TigerDirect. Great so far, but it had a TERRIBLE driver issue with crashing; I had to roll back to an older driver.

Having 3 x 22" 1680x1050 Dell monitors side by side playing Hawx or WoW or any other game is absolutely stunning.
The Catalyst interface is a bit quirky (profiles do not remember relative screen positions, so you have to specify them each time you change profiles), but once you have it set up, get into a game, and choose your insane 5040x1050 resolution, you will be blown away.

Bezel gap is not as much of a problem as you might think. Your brain kinda just adj

1) Whatever amount you were willing to spend on a monitor, you must now spend 3 times that. It requires 3 monitors of the same resolution, and to look good they need to be the same model of monitor. That means your budget has to triple, so the argument "well, there are cheap monitors" doesn't hold weight. If you were happy with a cheap monitor, you probably didn't want to spend much anyhow, and now you need to spend as much as a more expensive monitor. If you like higher

No, not really. I have a 5870 and quite like it, but I'm not buying 3 monitors. Desk space is part of it, but money is most of it. I like good monitors, and I'll pay for one; $1000 was not too much to spend on a monitor, the same as I spent on my HDTV. I am not going to spend $3000 on monitors though; that is out of my budget. I am also not willing to step down to inferior monitors. Then of course there's the performance issue. I like my games to look good and run smooth. My 5870 can do that with my single display

I'd be happier with a high-resolution head-mounted display with head-tracking capability. With that I could look up or down and it'd be as if I had displays completely surrounding me. It'd also be a lot more immersive. So for anyone with the money to spare, once HMDs with their stereoscopic 3D capability get high-resolution, what's the point of this for gamers?

High-resolution head-mounted displays with a large field of view (for peripheral vision) will set you back at least $5k or more. As they have individual feeds to each eye, you get 3D functionality included at that price.
If you are interested go look up Sensics.

Most of the activities I perform work better with multiple screens simply because I can have applications maximized on separate screens. Whether it be surfing the web, working with spreadsheets, or debugging applications.

As for gaming, a single, large screen would be fun. Add in left & right screens and it's even better.

I never understood that comic. You couldn't get that high of resolution content outside of owning a 35mm print until we had high definition distribution mechanisms like blu-ray. Why shouldn't that be exciting?

Large displays used to be many times the overall size and cost, why shouldn't that be impressive?

I own a Dell U2711 but we still watch movies around the house on my roommate's Epson 8500UB. Resolution isn't the only factor in what makes a good watching experience.

An actual high res monitor would be better than any of these supposedly "HD" screens kludged together using expensive GPU's.

I do have a 22.2" 3840x2400 IPS display (ViewSonic VP2290b), it's from 2003. It's driven by two DVI ports of a regular GeForce 8800GT in my Mac Pro. Additionally, I have two low-res (1920x1200) 24" screens connected to another GPU for video and games.

IBM sold their monitor factory to Sony around the same time they sold their ThinkPad business to Lenovo in 2005.

Since then, the meaning of "HD" has been just 1920x1080: just 22.5% of the pixels these 3840x2400 displays have.

Unfortunately, finding one of these magnificent monitors is damn hard and they still command rather high prices (although nowhere near the original ~$7500 price tag).

Dear monitor manufacturers, I just want a 200+dpi monitor, is that really so hard to understand? 100dpi is stone age technology compared to the massive leaps forward every single other piece of hardware has experienced.

Even the lowly computer mouse has gone from low-res two-button models hindered by the low update speed of the serial port to mo

We aren't stuck at 100dpi. The monitor quoted by otuz is about 200 dpi. If you are asking why they aren't more common, and why they are so expensive, that is because they are:
- Difficult to manufacture
- Unsupported by most software
- Pointless for 99% of applications
- Require high-end hardware to even make use of

You can't compare the DPI of mice to DPI of screens. To increase the resolution of a mouse you don't have to increase the density of the sensors. Creating high resolution LCD screens is not tr

My point is that the average monitor is still stuck somewhere below 100dpi for no good reason.

Laptops are available with 145+dpi displays, some smartphones have displays in excess of 200dpi and yet the average desktop monitor has only moved from about ~75dpi to less than 100dpi in the last 20 years. Why can't I buy a desktop monitor with the same pixel density display as a 15.4" 1920x1200 Thinkpad?
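The pixel-density figures in this subthread are easy to verify from resolution and diagonal size; the panel dimensions below are the ones mentioned by the commenters:

```python
import math

def dpi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(dpi(1920, 1200, 15.4)))  # 147 -- the ThinkPad panel
print(round(dpi(1920, 1080, 24.0)))  # 92  -- a typical 24" desktop LCD
print(round(dpi(3840, 2400, 22.2)))  # 204 -- the 22.2" display mentioned earlier
```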

- Difficult to manufacture
- Unsupported by most software
- Pointless for 99% of applications
- Require high-end hardware to even make use of it

- Somehow the panel manufacturers make it work for laptops etc.
- It's just higher resolution, nothing fancy abo

My point is that the average monitor is still stuck somewhere below 100dpi for no good reason.

My point is that 1) they aren't stuck, and 2) there is good reason for them to be behind.

Laptops are available with 145+dpi displays...Why can't I buy a desktop monitor with the same pixel density display as a 15.4" 1920x1200 Thinkpad?

You can. And they are expensive. Just like laptop displays are expensive.

- Somehow the panel manufacturers make it work for laptops etc.

Yes, just like they do for desktops. It's exactly the same - laptop displays are really expensive. This is why.

- It's just higher resolution, nothing fancy about it until you reach the limits of DVI etc.

There's a lot that is fancy about it. Memory usage, bandwidth, cost. Monitors work over more than just DVI connections. Manufacturing yields decrease by the square of the pixel density. So the higher the resolution, the harder it i

Laptops are available with 145+dpi displays...Why can't I buy a desktop monitor with the same pixel density display as a 15.4" 1920x1200 Thinkpad?

You can. And they are expensive. Just like laptop displays are expensive.

Where are these mythical 150+dpi displays sold? I have yet to see any for sale outside of the bizarro-pricing world of medical displays etc., which goes far beyond what I'm guessing you meant by "expensive".

But it seems you'd rather insult me than offer any genuine insight. It's funny how peo

Unless you're watching video, the drawbacks you mentioned aren't particularly serious. For working with large amounts of text, pictures, graphs etc. or photo editing, you'd probably never even notice. Besides, the T22x monitors were first introduced in 2001; 9 years of semiconductor development should be able to get us a markedly better response time (the 41 Hz refresh rate is perfectly fine for anything but gaming or 60fps video).

The display response time isn't worse than other displays from the era. The refresh rate isn't actually too bad; a regular graphics card drives it at 33.8 Hz nowadays, and I'd guess the largest limiting factor at the time was the driving electronics (dual-link DVI hadn't been invented yet, etc.). Sony probably also profits from the technology they bought from IBM by making 200dpi cell phone displays better than larger displays. After all, most people aren't demanding anything better than "HDTV", because most don't kn

Huge screen with huge resolution is probably the way I'd prefer and the technology is there, but for some reason ever since HDTV the resolutions themselves have been going backwards. And yes, of course there is an xkcd on the subject: http://xkcd.com/732/ [xkcd.com]

Minor editing nitpick: Does it have problems AND potential, problems OR potential, problems WITH potential, or potential problems? I suspect it has problems BUT potential; this would be so much clearer and less lazy than a comma ;)

Oh dear I made a right mess of that last sentence without using Preview. Serves me right for acting like a grammar nazi that it should, as per usual, backfire!

So as not to stray off topic, to add to my previous points, there are some very impressive and incredibly immersive looking setups there. The problem I have with it is they put the "window" into Windows. All those frames look terribly distracting. What's to stop any of that being manufactured in a single screen without much more expense than buyin

One of my biggest bugbears with Eyefinity is the inability to switch easily between multiple desktops and the giant Eyefinity desktop...
I bought my 3x 2209WAs to use as a landscape multiple-desktop solution whilst studying for my MS certification (one display for 2003/2008 Server, one for XP/Vista/W7, one for e-book/web/OneNote), for honing my Java skills, and for having a whopping display area for playing HL2/Crysis/any other app I can get working
Unfortunately ATi's drivers aren't suitable for easy switching betwe

The biggest problem I see with these cards is that they do not support frame sync between displays. At work, we run two 4K projectors behind a 15' piece of glass. There are workstation variations on this card coming out that support frame sync, but they're not available yet. With the bezel on normal monitors it's not an issue but if you want to do anything exotic, you'll have to wait, unfortunately.

Here's the thing that all these multimonitor solutions (including Matrox's TripleHead2Go, etc.) run into: most games are written with one eyepoint. For dual head or triple head to work properly, you have to have multiple eyepoints. Each monitor is like a window into the virtual world. If you want a straight-ahead view and two side views, say at 45 degrees, you need 3 eyepoints: one looking straight ahead, one looking 45 degrees to the left, and one 45 degrees to the right.
Games don't do that. They think they'r
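What the parent describes (one correctly rotated camera per monitor) can be sketched with a little trig; the 45 degree side angle is just an illustrative number:

```python
import math

def view_direction(yaw_deg):
    """Unit forward vector in the horizontal plane for a camera
    yawed by yaw_deg (0 = straight ahead, positive = to the right)."""
    r = math.radians(yaw_deg)
    return (math.sin(r), math.cos(r))

# One eyepoint per monitor: left, center, right. A renderer that
# supports this draws the scene once per direction rather than
# stretching a single wide frustum across all three screens.
for yaw in (-45.0, 0.0, 45.0):
    dx, dz = view_direction(yaw)
    print(f"monitor at {yaw:+5.1f} deg -> forward = ({dx:+.3f}, {dz:+.3f})")
```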