153 Comments

Thank you for such great display reviews. I am an avid fan and have found no others as good.

I have 2 questions/requests:

1) Would you please do a review comparing good-quality LCD and plasma TVs to a "computer monitor" like the HP 3065?
I am a pro photographer and spend most of my days processing images. I find it easiest to see artifacts and other processing problems on displays with bigger pixel pitches (like my Dad's 1080p 58" Panasonic Plasma). I guess I'm saying that I want to see my images at their worst, but accurately.
I'm about to pull the trigger and move to a 35-45" LCD or Plasma TV as my prime editing monitor, but I'd like to see how they stack up in one of your top-notch reviews.

2) Where can I get the Monaco XR software? I have the DTP-94 colorimeter, but I need the software. I have used ColorEyes for many years, but have never been happy with it. And now I can't even get it installed on Windows 7.

Thanks for any help you can give me. And keep those great display reviews coming!

Looking for a big screen, how do the 30 inch monitors from Dell compare? I wish there was a nice big chart with several of the monitors in this range.
Would like to see how the U2410, 3007WFP, 3007WFP-HC, and 3008WFP compare with the reference HP 30" in processing lag.

The lag when turning on and switching resolutions is really annoying on the 3008WFP. I switch between my Macbook Pro and my desktop PC and the lag annoys me. Otherwise I haven't noticed any real input lag problems with the display.

I was hoping that Dell had fixed that problem for the newer models but apparently they just crammed the same software and hardware in with a smaller panel.

I find the turn-on lag to be the worst, personally; resolution-switching lag isn't nearly as bad compared to that! Other than that there is no real input lag, you are definitely right there. It behaves really well at native resolution over DVI-D.

And the main difference is mostly the software: the Pro version adds a few extra features, like being able to set gamma and cool/neutral/warm color temperatures through the tray tool.

The results I've gotten with my Dell 3008WFP and the Huey Pro are quite good. While it doesn't detect reds all that accurately, I don't mind the slight red tint; it makes it a lot easier to sit 30cm away from the monitor and have it not hurt my eyes.

Thanks man. I also read that AVIA Guide to Home Theater and Digital Video Essentials are very good DVDs to calibrate for video. I suppose the calibration is worse than with PANTONE products. I will test and see. Shame I really don't have the money right now.

Very nice article. I enjoyed reading it as I am in the market for something like this. One thing that caught my eye was your comment regarding better results using Monaco Optix over Coloreyes Display Pro and remembered seeing Integrated Color recommending using the Spyder 3 calibration puck over the otherwise recommended DTP-94 when calibrating wide gamut displays. I believe the DTP-94 is the same colorimeter as the one you use. Perhaps the older puck isn't sensitive enough to the new wide gamut. (see link). Thought you might be interested in this. Thanks.
http://www.integrated-color.com/mm5/merchant.mvc?S...

I've also tested with the i1 Display2 and achieved essentially the same results (within about 10% on Delta E and within about 2% on gamut). I'm not sure if the Spyder3 is better than both of those, but the i1D2 is pretty well regarded and I've read some stuff in the past where DTP-94 was regarded as one of the better colorimeters. I may play around with the latest version of ColorEyes again just to make sure nothing is wrong. I really don't want to have to invest in another colorimeter, though, since I already have three. :| It would be particularly odd for Spyder3 to be better considering Spyder2 wasn't regarded as being all that great and Spyder3 costs around half as much as the i1D2 or DTP94. But hey, price isn't necessarily a good indicator of quality. Heh.
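For readers unfamiliar with the Delta E figures being compared here: in its simplest (CIE76) form, Delta E is just the Euclidean distance between two colors in CIELAB space. A minimal sketch (the sample values are made up for illustration):

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 Delta E: Euclidean distance between two CIELAB colors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# A color shifted by 3 in a* and 4 in b* is 5 Delta E away.
print(delta_e76((50, 0, 0), (50, 3, 4)))  # -> 5.0
```

A Delta E below about 1 is generally considered imperceptible, and calibration reviews typically report the average over a set of test patches; later formulas (CIE94, CIEDE2000) weight the lightness and chroma components differently.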

Hi Jarred, can you please answer this question for me? I had a 22" 1680x1050 monitor and I was very annoyed by the "sieve effect": basically, I saw the blackness between the pixels. Mostly for that reason I upgraded to a 24" 1920x1200 TN panel LCD, and I can still easily spot the black between pixels, and it annoys me. Now you will ask how far I am sitting from the monitor. I am about two-thirds of a meter away, or 2 feet, and I really can't go any farther than that because the monitor becomes too small in my field of view.

I know that the lower pixel pitch will make the blackness between pixels less noticeable, but my question is: how much less? Also, is the proportional distance between pixels smaller compared to other models? I'm just making this up, but let's say that on a 24" 1920x1200 monitor 90% of the screen space is pixels and 10% is the black space between them. Does this Dell have proportionally less black space (say, 5%), or does it maintain the same ratio?

I ask because I know that the "sieve effect" will be less noticeable on this Dell because of the lower pixel pitch... but how much less compared to a 24" 1920x1200 TN panel? Thanks.

In case you think that the monitor I have has larger pixel spacing or something: I see this sieve effect just as much on my 1440x900 17" laptop (from a normal viewing distance), and on my friends' laptops as well, and it bothers me just as much. Another reason I ask is that I read somewhere that H-IPS (which I think this panel uses) has less proportional distance between pixels (more space is used up by the actual pixels and less by the blackness between them). I'm wondering if this is really true.

You have better eyes than me, certainly. I can say that on the U2711 (and on 30" LCDs), I really can't spot the red, green, and blue elements that make up white unless I get out a magnifying glass, and I certainly can't see the black between those elements. Well, I suppose I can sort of see the squares that make up the LCD, but that's about it. The U2711 certainly has smaller dots than other desktop LCDs, though.
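For anyone weighing the size question numerically: pixel pitch follows directly from the diagonal, resolution, and aspect ratio (assuming square pixels). A quick sketch using the two sizes discussed in this thread:

```python
import math

def pixel_pitch_mm(diagonal_in, h_px, v_px):
    """Horizontal pixel pitch in mm, assuming square pixels."""
    diag_px = math.hypot(h_px, v_px)          # diagonal length in pixels
    width_in = diagonal_in * h_px / diag_px   # physical width in inches
    return width_in * 25.4 / h_px             # inches -> mm per pixel

print(pixel_pitch_mm(27, 2560, 1440))  # U2711: ~0.233 mm
print(pixel_pitch_mm(24, 1920, 1200))  # typical 24" panel: ~0.269 mm
```

Note that the inter-pixel gap itself (the "sieve") is a property of the panel's aperture design and can't be computed from these numbers alone; what the smaller pitch guarantees is that the whole pixel cell, gap included, is roughly 13% smaller on the U2711 than on a 24" 1920x1200 panel.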

An interesting article, Jarred. Clearly from the depth of discussion, screens are very important to us all. Although I don't feel that old, and enjoy gaming to the max, I clearly remember the days when there was no such thing as a visual display unit (as we called them when they were first used).
Since those heady days in the late '60s we have sure come a long way, from those green screens to the monitor that is the subject of your article.
Surprisingly, having read all the post-article discussion, no mention has been made of the new 120Hz screens. You did mention 60Hz at one point in the discussion.
I just built a nice gaming PC for a mate, and bought a new 120Hz screen to go with it. All I can say is, it beats all the other screens in the house, from 30" to 22" to an old 21" CRT... mind you, when the CRT was in its heyday, maybe it was similar. It is very impressive, and was nowhere near the cost of the screen in your article.
I do hope that you, or maybe your (hopefully) new writer, can have a crack at the new 120Hz movement in screens. Oh!... the ones available round me (downunder) are 16:10, lol

The big complaint with 120Hz displays is that they're all TN panels (as far as I'm aware). But, TN is often good enough for a lot of people, and it's a lot cheaper than IPS and *VA. I'm hoping to at least get *some* 120Hz LCD for testing in the not-too-distant future.

I couldn't get any report of the panel manufacturer; I used to get decent results from Astra32, but now the manufacturers have gotten smart and Dell writes its own name into the firmware. So this panel tells me it's a "Dell U2711". LOL.

I don't think I called it an "S-IPS" anywhere; it's just *some* form of IPS, and whether that's H-IPS or S-IPS isn't hugely important. Given contrast is measured at 800:1 to 1000:1 (depending on brightness), H-IPS is entirely possible. But then, refinements to LG's S-IPS might accomplish the same thing.

"While we know some of you would like us to compare performance to a CRT, few users have CRTs these days and all we're really interested in measuring is the relative lag."

That is an incredibly weak argument for not getting proper numbers. I don't get it: why don't you care?
AnandTech would never show the FPS scores of a 5870 only as a percentage of some benchmark GPU, so why do it here?

1) I don't have a CRT.
2) I have limited space.
3) I don't want to have a CRT - I ditched all of the ones I had about four years ago.
4) I would need a CRT that can support resolutions up to 2560x1600 -- none I'm aware of handle more than 2048x1536.
5) CRTs are terrible at getting correct geometry. Pincushion and trapezoidal distortion are all too common, even after lots of time spent fiddling to try to get it "just right"... and if you change resolution or refresh rate, you have to do it all over.
6) If CRTs are faster, add 20ms or 40ms or whatever to my numbers.

Lag between user and display comes from mouse, GPU, CPU, and LCD, really - up to around 200ms in some tests. You'll never eliminate all of it, and even CRTs have some "lag".

Besides, no one is making new CRTs that are worth buying, so why should we continue to compare to them? FWIW, my five year old CRT was getting dim when I got rid of it, so I couldn't even use it for comparison anyway. I'd need a new, high-quality CRT to make the comparison even remotely meaningful.

It's not really an argument about the merits of owning a CRT. It's about having a proper zero to benchmark against. I don't really see what most of those arguments have to do with measuring a lag time, and "I don't have a CRT" is a very surprising reason to see on what I consider one of the net's premier hardware sites. AnandTech always seems to have all the esoteric equipment in the world to work with, both hardware AND testing equipment...

Here's an email I just sent to someone else on this subject (with a few edits):

As far as input lag and the HP LP3065: if you trust another source, you can find an LCD where we overlap, look at their result and at mine, and then add the difference. But then, I'm not sure what other source I would trust, because I have seen sites report LCDs that I measured as a "0" as anything from 0 to 20ms. None of them go into detail as to how they're testing (i.e. what stopwatch program they use, and how many pictures they take and average).

"I noticed no input lag at all with this thing. I took around 20 pics comparing input lag with a CRT monitor, and the worst lag I got was 24 ms, and the best was 0 ms."

So my choice of reference LCD has at most 24ms of lag compared to a CRT, and as little as 0ms, or perhaps an average of around ~12ms. But then, all CRTs aren't created equal, so what CRT do you use as a reference point?

Ultimately you're stuck trusting some source for your information about input lag. All I can tell you is that I've played a lot of games on the LP3065 and have never, ever noticed any form of lag. I've played games on a 2405FPW and barely noticed lag, and I've played games on the 2408WFP and definitely noticed lag. I could list a dozen more LCDs, and the fact is that not a single one has had less lag than the LP3065 in testing; they have only managed to tie it -- and many of the TN panels I've tested were recorded as "no lag versus CRT" at other sites.

If I ever find an LCD that has less lag than the HP LP3065, I'll make sure to mention it, but when I can confidently say that there's no current LCD that does better in that area, why beat a dead horse any more than we already have?

Every chart on the internet can be misleading if you don't know what you're looking at, which is why I explain in detail exactly what I'm doing. Plus, not all stopwatch programs are created equal; at best they are accurate to 17ms on an LCD, since the display only updates 60 times per second. I tried one program and had results that were off by as much as 100ms, where I showed a difference of only 10 to 30ms with 3DMark03 as the time source.

So why not accept the LP3065 as my zero point? Unless and until another LCD can beat it, there's not much point in worrying about it. TN panels with "0ms lag" tie the LP3065, so it must therefore also have 0ms lag. If you trust my testing procedures, of course.

(And will I need to go over this AGAIN when I do another LCD review? A reference point is just that, and I can't find anyone who can give me a clearly better reference point. 10ms at most doesn't count....)
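To illustrate why single stopwatch photos scatter and why averaging many shots matters: each display only shows a timer value quantized to its most recent refresh, so one photo can be off by up to a full 60 Hz frame (~16.7ms) in either direction. A toy model (hypothetical, not the testing procedure described above):

```python
import random

REFRESH_MS = 1000 / 60  # one 60 Hz frame is ~16.7 ms

def photographed_difference(true_lag_ms):
    """One photo of two displays showing the same on-screen timer.

    Each display shows a time quantized to its own last refresh, so a
    single photo scatters around the true lag by up to a frame either way.
    """
    phase_a = random.uniform(0, REFRESH_MS)  # reference display's refresh phase
    phase_b = random.uniform(0, REFRESH_MS)  # test display's refresh phase
    return true_lag_ms + phase_a - phase_b

def estimate_lag(true_lag_ms, photos=20):
    """Average many photos to beat down the per-frame quantization noise."""
    samples = [photographed_difference(true_lag_ms) for _ in range(photos)]
    return sum(samples) / len(samples)
```

With a true lag of 12ms, single "photos" in this model range anywhere from about -5ms to 29ms, which matches the 0-24ms spread quoted above; averaging 20 or more pulls the estimate back toward the true value.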

I don't even see how this can be listed as a complaint, no matter how minor. To me, this ranks right up there with the classic complaining about speakers on monitors. Should makers of ultra-premium displays cater to people with less-than-stellar eyesight? Isn't the point of it all to be able to resolve finer and finer detail? It just sounds funny: "Man! This monitor is too sharp!"

You are crazy; this is a valid complaint that should be noted in the review.
I'm glad that you have perfect vision, but many people, like me, don't. I would be angry spending my money on something only to find out later (since it wasn't noted) that I would have to change settings so it becomes actually usable!

Well, resolution/pixel pitch is something each consumer should educate themselves about first. As it is a major point of this monitor, it probably isn't so much a complaint ("this is bad") as a caution ("this is something you have to know how to deal with").

Yup, just "this is something you deal with by increasing the DPI setting" would be sufficient.
Amazing that you get posters on AnandTech not understanding this. What will they say next? "Don't get that 1080p screen; your 720p movies will look small and your 480p DVDs you can hardly see."

I disagree. Just because you have stellar eyesight doesn't mean the note isn't justified, because this IS interesting for a lot of people.

If text is so small that I have problems reading it, that IS a problem for me and should at least be mentioned in the review. If it's not a problem for you, that's fine; you don't have to agree with the review on every point.

It just sounds funny: "I don't have a problem with it, so it's perfect for anybody!" Talk about empathy..

A 27" LCD monitor review??? You guys should do your research and see what your users actually use. What portion of your demographic actually uses a 27" $1000+ monitor? Write articles that are practical for your users and that they would be interested in, not this "I have $1000 lying around, and on top of that I have $700 for an SLI or CrossFire setup to handle the insanely high resolution of this monitor."

As someone looking to replace a 30" LCD, I completely disagree. This article had perfect timing for me, as I was planning on purchasing the Dell 2709W next week. I found the article well written and informative.

I sort of agree. While it's great to see reviews of high-end monitors, there are also a lot of more interesting monitors to review, such as the new e-IPS displays. I bought the new NEC 23" e-IPS display, and I also own a Soyo 24" MVA display. These were both around $300 (and the Soyo is better, btw, but you can't buy it anymore). It would be nice to see a review of the NEC or ViewSonic 23" e-IPS displays.

Maybe this is idiotic, but what if you mounted a midrange computer to the back of the panel? It shouldn't be too difficult to modify a shim to go between the panel and the stand that can hold a smallish desktop. Maybe I've been struck with the Apple bug, but there's something to be said for minimizing wire clutter. If Apple weren't hell-bent on proprietary video inputs, I'd be looking at the 27" iMac more carefully. So instead I'm trying to figure out how you could do something similar in the PC world.

I wouldn't use Apple if you gave it to me for free, but they do have better video connections than PCs. PC makers have all these legacy connections, including VGA, while Apple was quick to move to DVI and then DisplayPort. They chose a strange DisplayPort plug, but that's all you can blame them for.
But various PC makers have all-in-ones. I think Sony makes at least one with a good screen.

Good information about a good product I'll definitely consider. But I was disappointed by novice errors about pixel pitch.

Claim: Small pixel pitch makes text hard to read. The Windows DPI setting is limited to 96 and 120.

This is a particularly bad error, because so many people fall for it. It's important for a tech site to hammer home the right answer here so that people set up their displays correctly. Smaller pixel pitch always improves text. It doesn't make text smaller, because you just adjust the DPI setting to whatever you want; smaller pixel pitch just gives better-resolved text. XP had problems, but since Vista this has worked well in Windows, so you shouldn't perpetuate this mistake.
(Now, with icons and web images, yes, you may get minor scaling artifacts, but this is becoming less and less of an issue, and the lower the pixel pitch, the less of an issue it is.)

The text isn't worse; it's simply smaller, and *that* is indeed harder to read for many people (especially those who are 40 and older).

Changing the DPI will help with the text problem, true, but there are all sorts of other artifacts. Take web sites that tend to use a set width (AnandTech and tons of others). You can get big text with a change in DPI, but the images stay the same. It's not an ideal way of working.

Also, if you're working at 120 DPI, documents and spreadsheets and such don't look the same on other PCs running the default 96 DPI. It's one of those "issues" I glossed over. Icons are another. When I switched, there were all sorts of oddities I wasn't anticipating; I've mostly come to grips with them, but it would have been nice if the 120 DPI setting had just scaled everything but images "perfectly".

I suppose that ideally I'd want my text DPI set at 192 for some things, but I'd prefer my icons and Windows UI elements to stay at 96, and other areas would be nice to have at 120, and.... You get the point. I think by using the magnification capabilities of Word, Excel, and most browsers I can get around changing the DPI altogether... almost.

Basically, higher DPI (finer dot pitch) is good for some people and in some usage scenarios, and potentially bad in others, depending almost entirely on user preference. In the case of the U2711, I'd almost rather run it at 1920x1080 with sharpness at 60-70% than deal with the DPI stuff... except I really like the 2560 width when working with large images.

You explain DPI completely wrong. DPI is dots per inch, or pixels per inch. It is set by the spacing of pixels on the screen and has basically no relationship to how graphics are realistically shown on the display. A dot pitch lower than 0.28 mm is better for computer monitors, while higher is OK for TVs.

The DPI setting for your OS is different from the DPI of your monitor. Each relates to something else, but the DPI you have to look out for is the value in the OS, not the display, since you are dealing with formatting issues. The calculation of font sizes includes the operating system DPI, so you should not change the DPI at all if you are sharing your work with others. It is best to stick to the standard everybody is using, which is 96 DPI for Windows, and change the operating system font size instead.

If your eyes are not what they used to be, it is best to use lower resolutions instead of changing the operating system DPI or going for a monitor with high DPI, which can look like you are viewing everything through a window screen. Of course, lower-resolution screens lose workspace, but you will not have to rely on high DPI; to gain the workspace back, use multiple displays. And of course, running a lower resolution on an LCD monitor will look blocky, but you will have to live with a poor man's technology until there is something better.

This is nonsense. Comparing non-native low resolutions to a change in DPI at the OS level: the sizes of objects are the same, but with the OS change most things are sharp, while with the non-native change everything is unsharp. Using non-native resolutions is just incorrect use of the monitor.

It is not about what is sharp and what is not sharp. LCDs are basically a poor man's technology, and you have to give up quality if you cannot see what is on the screen. LCDs were designed for notebooks for their portability, not for their quality. If we go back to a better technology such as CRT, everything just works the way it is. Using a non-native resolution is not an incorrect use or a waste of the monitor's capabilities if you cannot see anything you are doing. Sure, you can set up panning, which will make a virtual resolution be viewed at the native resolution, but with the sacrifice of panning.

I do not recommend changing the operating system DPI just to make sure you can read at ultra-high resolutions, because it will create problems in the future.

Text isn't smaller when the DPI setting is correct. Unless you want it to be smaller (you might now find smaller text more readable).

Some apps do override the DPI setting; in my experience only browsers (and image software, fair enough). There you have to set a zoom level, which will zoom images and text (the default in IE and Firefox). Well, I'm on AnandTech in Firefox now and it zooms correctly, as do all sites I have ever visited.

Now, yes, images that were mapped pixel-to-pixel before are not as sharp at a higher setting. That's mitigated by most popular software now storing icons at various DPI settings, but it is a problem for web images, and I usually return to 100% zoom for photographic images. However, you get the same problem if you use a non-native resolution, except that instead of just a few things being unsharp, everything is, including text. So that is the worst possible option you can take.

If you like a 96 DPI icon on a 96 DPI screen (say), then you should like a 120 DPI icon on a 120 DPI screen even better (assuming the icon has been rendered at 120 DPI, which most have). So the icons comment doesn't make any sense.

You seem to be thinking about the display of computer content at a very low level, each element separately. But everything has to have the right proportions, and the key measurement is physical distance. DPI settings allow you to have the right scale while preserving the right proportions.

When I changed the DPI on my system, all the icons look fine, but the spacing is all messed up. They look the same size as before, but there's now a bunch of empty space surrounding each icon (this is looking at my desktop). They look like they use 100x100 pixels instead of 75x75 (give or take).

Really, my experience is that just about everything tends to be designed assuming users are running 96 DPI... it's getting a bit better, but that's the short summary. There are a lot of screen elements that just use 1:1 mapping and ignore your DPI setting. Windows Vista did more than Windows XP to address this area, and Win7 may be even better than Vista, but no one has really nailed it IMO.
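To put numbers on the DPI discussion above: Windows sizes fonts as px = pt x DPI / 72, and the physical size on screen then depends on the panel's actual pixel density. A quick sketch (the 94 and 109 PPI figures are approximate densities for a 24" 1920x1200 panel and the U2711, respectively):

```python
def font_px(point_size, os_dpi):
    """Pixel height Windows uses for a font: px = pt * DPI / 72."""
    return point_size * os_dpi / 72

def physical_height_mm(point_size, os_dpi, panel_ppi):
    """Physical height of that text on a panel with the given density."""
    return font_px(point_size, os_dpi) / panel_ppi * 25.4

# 10pt text at the default 96 DPI setting:
print(physical_height_mm(10, 96, 94))    # 24" 1920x1200 panel: ~3.6 mm
print(physical_height_mm(10, 96, 109))   # U2711: ~3.1 mm (visibly smaller)
# Raising the OS setting to 120 DPI on the U2711 brings it back up:
print(physical_height_mm(10, 120, 109))  # ~3.9 mm
```

This is the whole disagreement in miniature: the OS DPI setting restores the physical text size, but only for elements that respect it; anything mapped 1:1 stays at the smaller physical size.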

I don't have great eyes, so I was a fan of larger-pixel monitors. That is, until I upgraded to Windows 7. One of the big problems with Windows 7 is that you can't turn off ClearType, or at least it's extremely difficult to get rid of it, so large-pixel monitors look like crap. As an example, I was using a HannsG 28" LCD at 1920x1200. This was fine with Windows XP, but absolutely horrible with Windows 7. I had to get rid of that LCD and get one with a smaller pitch just to comfortably use Windows 7. My eyes are bad enough; I don't need ClearType making things even fuzzier...

If you're gonna compare this monitor, it HAS to be compared against the Dell 2707. I see the 2707 in many of the tests, but not all. WHY? The best candidates to test against are the U2410, 2408, 2707, and even the 3008. Why would you be so inconsistent and use only some monitors in some tests and different batches in others? Sigh @ AnandTech inconsistency. I know you want to use old data and what's available, but when you make these reviews, please consider testing against things that matter... and KEEPING those test candidates in 100% of the time.

Probably depends on what tests they were doing when they had that monitor. If it was a test unit sent by Dell, they had to send it back, so unless someone bought one of those it is no longer available for all tests.

As far as the monitors you mentioned, the 2707WFP and 2408WFP are in all of the graphs, with the exception of power -- I didn't test that on the 2707WFP way back when. Same goes for input lag testing -- I started that long after my tests of the 2707WFP and 3007WFP. As for the U2410 and the 3008WFP (along with the 3007WFP-HC), I never have tested any of those and thus can't include data for them.

The HP LP3065 will perform very similarly to the 3007WFP-HC and 3008WFP (but the 3008 will have more processing lag). For the charts where I limited the results to a few monitors, I specifically chose the monitors in order to show good, high-end offerings that would compete with the U2711. Obviously, I'm limited to what I've tested, so I used the 2408WFP, 2707WFP, HP LP3065, Samsung 245T, and LaCie 324.

Hope that helps... I'd love to provide results for the 2707WFP, but I can't. I could run tests on my 3007WFP (non-HC), but it's failing on me because it's three years old. Not *failing* really, but I've got some definite image persistence problems now (e.g. if I have content open with a stark green/white vertical area and then move that image a bit after 10 minutes, I can see an afterimage for at least 30-45 seconds).

I'd be interested in hearing more about the deep color selling point of this display. Perhaps it'd be worth exploring in a future article: specifically, the complete chain of components needed to enable it, and whether there is any noticeable difference.

It sounds like all the parts are out now for the non-pro to use deep color: Windows 7, the GT240 (are there any better gaming cards with DC support?), and a couple of LCDs.

As a gamer, I can tell you that the 28" Hanns-G is terrible for gaming, at least if you've ever used something better. I had one for about two years, used mostly for work, but I did try to use it for gaming. It is horrible with dark scenes, the top of the screen is darker than the rest (a viewing angle limitation), and the contrast was poor. About the only acceptable scenario was bright outdoor games like Far Cry. After using my Soyo MVA panel, I could never go back to the Hanns-G for gaming. The Hanns-G was fine for work (I'm a programmer/engineer), but not games.

"The final potential drawback with the U2711 that we want to discuss is lag. There are actually two types of lag we noticed during testing, and neither one is likely to be a deal breaker if what you're after is high quality image. Processing lag (a.k.a. "input lag") is definitely present, and it appears to be due in part to the digital scaler."

Blah. IPS is all about NOT having the input lag that PVA panels have. Making an IPS panel with noticeable input lag means it no longer has one of its biggest benefits.

Ugh... this is a step backwards. 16:9 and an insane pixel pitch that is too small for most users.

I owned the 2707WFP and currently use the 2709W because of the 16:10 aspect ratio and high pixel pitch (1920x1200, 0.303 mm). Because my eyesight isn't perfect it is a great LCD monitor for me; plus, unlike the 28" TN-based displays, you could get a good-quality S-PVA panel with either one.

Now Dell has killed that option with this introduction. They took a step forward with the IPS panel, but switching to a low-pixel-pitch 16:9 display sucks. I've used the 3007WFP, and that pixel pitch would drive me nuts for normal use; most things are just too small despite the nice increase in real estate. Plus, doesn't this LCD strongly overlap with the existing 3008WFP?

Also, luckily for you, I don't think this is meant as a direct replacement for the 27" monitors you have. It's a professional monitor with fancy color capabilities and electronics that aren't needed for general home and office use.

Given the fact that it shares the model number scheme with the previous monitors it is hard to know right now if it is a replacement for the 2709W. I guess we will know when the 2709W is phased out. If it gets a similar replacement (1920x1200 16:10) my previous comment is null and void.

"Note also that the HDMI connection uses the 1.3 standard, so it won't support resolutions above 2048x1152 (a 16:9 resolution)."
That kinda sucks. Note that HDMI 1.3 definitely DOES support 2560x1600.
However, it seems to be a useless paper spec. Some devices implement other features of HDMI 1.3 (like 30-bit color), but as far as I can tell there are neither monitors nor graphics cards that can output higher resolutions over (single-link) HDMI using the higher possible link bandwidth. Even HD5xxx-series graphics cards seem to be limited to 1920x1200, and for monitors it's often impossible to even find out: they just list "HDMI 1.3" but don't tell you whether they actually support the higher-bandwidth modes... And that spec is getting old already...

It could be my laptop that didn't support the appropriate resolution over HDMI, then.... let me test with the M6500 and see if that will do more than 2048x1152 on HDMI. I know that at least one laptop wouldn't allow anything higher. BIAB....

Oops... the Dell M6500 doesn't have an HDMI port, just VGA and DisplayPort. If I could find my DVI-to-HDMI adapter I could try it on a different GPU; as it stands, all I know is that on the test laptop, HDMI limited the maximum resolution. (FWIW, I have a Dell Studio XPS where DisplayPort tried to output 2560x1440 but the GPU apparently wasn't designed to do that. That same Studio XPS didn't give an option to try 2560x1440 over HDMI.)

I own the U2711 and love it :-) I agree with mczak: the vendors are very bad at specifying what their HDMI does. Even if they say 1.3, I'm not 100% sure they support the high bandwidth... I'm very interested in real-life experience with 2560x1440 over HDMI. I'm not certain that the U2711 actually supports it, and I've NEVER seen any notebook manufacturer specify the maximum resolution over HDMI. But there's nothing in the specs preventing a well-designed HDMI 1.3 notebook and screen from running at 2560x1440! Actually, this /should/ be a certain thing if they both specify HDMI 1.3 (and at least Dell does!)... For now, however, the only safe bet seems to be DisplayPort, but that's mainly on brand-new notebooks, and I HATE the glossy screens they usually come with. The HP 8740W and Dell M6500 seem to be exceptions, but they are currently a bit outside the price range I was hoping for ;-)

Because it was offered as a review unit. LOL. I've been doing a lot of mobile reviews, and it's hard for one person to cover both displays and laptops. That's why we put out that call for writers; I'm hopeful that I can turn all display reviews over to someone else and focus on just one area (more or less).

Why did Dell make this thing a 16:9? On what planet in this or any parallel universe is a 2560x1440 display more desirable than a 2560x1600 one? I can almost understand why they make 16:9 24" screens (Look, it's 1080! Full HD! It must be the bestest resolution available!), but that doesn't apply here. Give us back the other 160 vertical pixels, please! Reply

Agreed. It's a matter of marketing bullshit. 16:10 is going the way of the dodo just because marketing monkeys like to yip-yap about "FULL HD" resolution despite its inferiority in every way - it's a sad, sad world.

But it's probably not Dell at fault there, but the panel maker(s) instead. A company like Dell can't sell what it can't buy. Reply

You need to explain in your reviews of high-gamut displays the implications of getting one of those displays. Especially to your gaming audience.

High-gamut displays are not always a good thing. They are great for photo and video professionals who use applications that have color management (i.e., that can convert to the high-gamut color space). But for every other application and game where the assumption is that the connected display is a normal-gamut one, the colors will be over-exaggerated or just wrong. The reds and greens will sear your eyes out. Now this may be OK given the other advantages of the monitor (small dot pitch, etc.), but people should be given the negatives as well as the positives.

Hopefully all apps will become color managed at some point and this will no longer be an issue, but until then high-gamut displays are bleeding edge. I expected better from AnandTech. Reply

Agree, it should be mentioned. Technically - OS and software problems aside - more is better. But unfortunately I don't know of any video players that are color managed, even on Windows 7. Same for games, I guess. This could be fixed at the OS level, but I don't think Windows 7 does that for all applications.
Presumably there is a monitor setting for sRGB, so if you are using non-color-managed software you can switch to it temporarily. Reply

A lot of wide-gamut displays have an sRGB emulation mode. But a lot of them are rubbish and don't really do a good emulation. It seems to depend on whether a hardware LUT has been built into the monitor or not. Maybe checking whether the sRGB emulation is any good should be part of the testing during the review. Since the display reviewed above is an expensive one, maybe the sRGB emulation is being done properly here.... Reply

Yup, I wish more sites would go into the implications of wide gamut, but sadly it seems most just buy into the MOAR=BETTER marketing hype. Given the depth of some articles at AnandTech it's kind of sad to see only the positives written, almost like marketing material, even if it is ostensibly in the context of professional use and applications. Reply

I have never noticed any problems whatsoever when viewing images on a high gamut display, but perhaps that's because I have calibration tools available. On the U2711, if you're running standard applications, just set the LCD to sRGB instead of Adobe RGB and your LCD will be running in the reduced gamut.

Again, I'm not sure how having a wider gamut is supposed to oversaturate colors. Just because a display has a potentially wider gamut doesn't mean you have to use it. Oversaturated reds and blues are a calibration problem, not something inherently wrong with having a higher gamut.

I've heard this complaint before, and I've just never experienced any problems with this. I've even searched for information on what you might be referring to, with no luck. If you use a standard sRGB color space on a wider gamut, you're going to only use 82% of the available gamut. There should be a mapping for reds to reds, blues to blues, etc. that occurs with the display regardless; red shouldn't suddenly map to bright red.

Anyway, if you have a link to somewhere that explains how you encounter this problem, post it and I'll go see if I can replicate this issue (with and without calibration). Reply

Sorry Jarred, I have to disagree with wide gamut not oversaturating colors in a practical sense, especially with a monitor that has multimedia aspirations like Dell's latest releases.

I also have calibration tools, and one cannot "calibrate out" wide gamut on my Dell 3008WFP and NEC LCD2690WUXi. Wide gamut works fine with color-managed applications like Photoshop, or color-managed OSes like Mac OS X, but even using the Color Management control panel and loading the ICM profile in Windows 7 or Vista (or XP) only fixes the oversaturation in Windows Media Player and Picture Viewer, which, funny enough, ARE color managed. Try using Media Player Classic and you'll see that it's different from what WMP 11 or 12 show.

This is because color-managed apps perform a color gamut transformation to correctly "map" sRGB colors to a wider-gamut coordinate, based on information in the ICM profile. Because a wide-gamut screen can effectively show a higher color intensity, a color that is R,G,B 255,0,0 will show a deeper red on wide gamut than on standard gamut: with eight bits per channel, you have the same number of bits defining a wider color space than sRGB. In easy terms, the color difference between red at 254 vs red at 255 is more obvious on wide-gamut screens.
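The mapping described above can be sketched numerically. This is our own toy code, not any commenter's: the matrices are the published D65 sRGB-to-XYZ and XYZ-to-Adobe RGB conversion matrices, and the helper names are ours. A color-managed app effectively performs this transform; an unmanaged app sends the raw 255,0,0 straight to the panel, driving its (deeper) red primary.

```python
# Sketch: where a color-managed app would place sRGB full red
# inside the Adobe RGB coordinate space.

def srgb_to_linear(c):
    c = c / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def adobe_encode(c):
    # Adobe RGB (1998) uses an approximately 2.2 gamma encoding
    return round(255 * max(c, 0.0) ** (1 / 2.2))

SRGB_TO_XYZ = [[0.4124, 0.3576, 0.1805],
               [0.2126, 0.7152, 0.0722],
               [0.0193, 0.1192, 0.9505]]
XYZ_TO_ADOBE = [[2.04159, -0.56501, -0.34473],
                [-0.96924, 1.87597, 0.04156],
                [0.01344, -0.11836, 1.01517]]

def matmul3(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def srgb_to_adobe(rgb):
    lin = [srgb_to_linear(c) for c in rgb]
    return [adobe_encode(c) for c in matmul3(XYZ_TO_ADOBE,
                                             matmul3(SRGB_TO_XYZ, lin))]

# sRGB (255, 0, 0) lands at roughly (219, 0, 0) in Adobe RGB coordinates.
print(srgb_to_adobe([255, 0, 0]))
```

The gap between the correct mapping (~219) and the raw value (255) is exactly the oversaturation an unmanaged app produces on a wide-gamut panel.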

Load a photo as your desktop background, then load it into Picture Viewer in Vista or Win7, and you will see the difference, especially with greens and reds. Even an ocean shot with lots of cyans and turquoises will show it quite obviously. I'm saying to do this because I previously made the same assertion that wide gamut is not that big a deal - I was fooled by the built-in color viewer of Vista showing me correct colors.

I DO agree, however that using the sRGB mode for PC users and multimedia (console, BluRay) is best. Because otherwise the wide gamut will show up with stronger, unrealistic colors, similar to an old style TV with a color knob.

I have read reviews from other sites that listed wide gamut as being a plus, but I would have to disagree for about 90% of the population out there using tagged and untagged sRGB images.
Reply

You have a perfect example imo. I use this quite often myself for testing.

"Load a photo as your desktop background and load it into Picture Viewer in Vista or Win7 and you will see the difference especially with greens and reds. Even an ocean shot with lots of cyans and turqouises will show up quite obviously. I'm saying to do this, because I have made the same assertion before that wide gamut is not that big a deal, because I was fooled by the built in color viewer of Vista showing me correct colors."

I'm not sure about video playback... I just tried to compare a movie in WMP11 and MPCHC on a regular gamut LCD, and there's clearly a difference between the two. MPCHC looks somewhat washed out, with lighter blacks and darker whites. I'm not sure if WMP11 is doing some extra post processing or what. Reply

Jarred, virtually all pictures and video were recorded in sRGB space. With 24 bit color, one pixel might be color 103x50x246. That maps to a different color than it should on a wide gamut monitor. Since wide gamut extends deeper into the red zone, the incorrect mapping will give more saturated reds. Reply

It only maps incorrectly if your monitor is running in a wide-gamut color space (both the monitor and OS) and it doesn't handle an sRGB image as being sRGB. If the application is color-space aware, you don't have a problem. Having a wide-gamut monitor doesn't inherently cause the problem; in fact, if you have a regular-gamut monitor you get the same effect. It's the images/videos and the color space they are created in that can create issues. Reply

Maybe my understanding is incorrect. The way I see it, regardless of what OS or application is being run, let's say you open a picture on a 72% gamut screen. The color is accurate. Now you crack open the monitor and replace the backlight with one that does 95% gamut. That same picture would appear different and more saturated, right? If so, that would be the case with all sRGB pictures out there, which is nearly every one.

Even if software could "up-convert" a low gamut sRGB picture to be high gamut (by lowering the integer color values), for viewing on a high gamut screen, the end result wouldn't be a perfect match since we're dealing with integers. Reply
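The integer concern in the comment above can be illustrated with toy numbers (our own sketch; real conversions are per-channel 3x3 matrix transforms, but the rounding issue is the same). Squeezing 256 levels into about 82% of a range - echoing the "82% of the available gamut" figure mentioned earlier in the thread - and expanding back cannot be lossless:

```python
# Toy model: compress 8-bit values into 82% of the range, expand back,
# and count how many original values cannot be recovered exactly.
scale = 0.82
compressed = {round(v * scale) for v in range(256)}
lost = sum(1 for v in range(256)
           if round(round(v * scale) / scale) != v)
print(f"{len(compressed)} distinct levels survive; {lost} values not recovered")
```

256 inputs collapse onto 210 output levels, so some round trips necessarily land on a neighboring value - small, but not a perfect match, just as the comment says.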

This is not correct. A high gamut screen has the potential to display a wider range of color, but it doesn't inherently do so. The backlight puts out white, and if it's a better white you can get a wider color range (more or less). But if you run in a limited color space, it doesn't make the colors map incorrectly within that color space. The problem only manifests when you view an image that maps to a different color space, and your image viewer doesn't adjust colors appropriately.

There's a link above to a site that shows the problem. I can view the "wrong color space" stuff on a low gamut display as well as a high gamut display. So if you want an image to be viewable by your average Joe, you should use sRGB color space. Which, incidentally, the U2711 has a built-in profile calibrated to less than 5.0 delta E. Reply

The color gamut issue stems from applications not using ICC profiles. It was a huge problem in Windows XP because so few applications cared to use color profiles. It's gotten a lot better with Windows Vista/7 (or with OS X), but it still pops up from time to time, specifically with all those images on the web missing color information.

Okay, I read through that, tried it on Firefox 3.5.7, IE8, and Safari 4. I get the problem now... it's not having a wide color gamut LCD, but rather viewing images that have an ICC profile in applications that don't respect those profiles, right?

In terms of internet publishing, I'd have to agree that we need to standardize, and since sRGB is already the standard there's no point in going elsewhere. For an imaging professional where the content isn't going on the web, though, wouldn't you want the better color space? I always "save for web" if I'm going to post an image online, and that strips out any ICC profile information (AFAIK).

I need to go play with this on the U2711 with and without a monitor profile (and using sRGB and Adobe RGB) to see what effect - if any - it has on the experience. Stay tuned.... Reply

Save for web should give an sRGB label, and does with Photoshop CS3. No point in stripping it out.
FF is color managed. As soon as IE becomes color-managed it will be fine to use any color spaces in web images. Microsoft has been fairly good about color management of late, so hopefully in IE9, fingers crossed. Reply

To me, this monitor is more interesting from a different viewpoint:
It's one of the rare few monitors that offer 2560 resolution at less than 30" size. And even this one only has 109 DPI resolution.
If this monitor was 22 - 24" in size, I'd buy it without second thought.
Of course you need to run Word at 150% magnification, but the fonts look just awesome with so many pixels available to render them. Not to mention games. A "fairly powerful" graphics card is needed though.
The only problem is that current display managers, be they Windows or Linux, are all still raster-based and can't really adapt well to increased-DPI displays.
Perhaps if more monitors like this would show up on the market, MS and Linux gurus would finally develop true vector user interfaces. Then we could have 150, 200 and even more DPI displays. I'm wondering how many more years before displays catch up to what printers had some 20 years ago already. Reply
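The "109 DPI" figure above is easy to check from the panel geometry (simple arithmetic, not taken from the review), and the same formula shows what the wished-for sizes would deliver:

```python
import math

def dpi(h_px, v_px, diagonal_in):
    """Pixels per inch: pixel diagonal divided by physical diagonal."""
    return math.hypot(h_px, v_px) / diagonal_in

print(f'27" 2560x1440: {dpi(2560, 1440, 27):.0f} dpi')   # the U2711
print(f'30" 2560x1600: {dpi(2560, 1600, 30):.0f} dpi')   # 30" panels
print(f'24" 2560x1440: {dpi(2560, 1440, 24):.0f} dpi')   # the wished-for size
```

A 24" panel at this resolution would sit around 122 dpi - noticeably tighter than the U2711's ~109, but still far from print.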

High DPI is very usable - even 120 dpi can be a good option at normal viewing distances. But yes, vector graphics would make the benefit of higher-DPI displays even greater.
There is vector support in WPF since Vista so Windows is getting there at least.
Plus modern applications (including OS) tend to have raster graphics at various dpis, I think often including 150, so that is improving even without vector graphics.

But the fact is most people don't need very high DPI, so we have to wait for the cost to come down. We also need DisplayPort to become common once DVI reaches its limit, and that will take a while. Reply

I have the HP and it's not bad as far as anti-glare coating although not the best I've seen - that would be a Samsung but sadly it was TN. It helps to sit a reasonable distance away, like a mere 2 feet or more. Every single Dell I've seen is far worse in terms of grainy anti-glare which is a shame because it means I can't consider Dell monitors at all, it just drives me too crazy to see whites all sandy-sparkly-grainy. It really makes me wonder how more Dell users don't even notice it at all.

I haven't noticed any negative effects from the anti-glare coating. I use a Dell 3007WFP normally and haven't ever noticed a problem there either. Mostly, I see the "grainy effect" on cheap TN panels, but maybe I'm just one of those people that isn't bothered by what you're referring to. Anyway, FWIW I feel the U2711 has the same appearance as the 3007WFP and I enjoy the picture. I don't notice "more graininess" or anything like that. Reply

There may be some confusion here over the word 'graininess.' It is not a screen-door effect, pixel-pitch by-product, or defect in the actual image. It is the screen coating itself - it's not 'grainy' per se in the way a film might be grainy; 'sparkly' is my best word for it. Another way to describe it: imagine taking some sand and lightly dusting the screen surface with it - the image itself looks fine, but there's always this layer above it that has its own effect on the image. It makes things 'sparkly' - tiny random dots of coloration across the spectrum. For some people it may blend together, I don't know, but I can't see how it's possible to miss entirely. Dell's anti-glare is known to be heavy and particularly prone to this effect, which is why it's brought up. Check out forum threads about Dell monitors and you'll find plenty of people noting it.

Now someone who uses a Dell monitor all the time may get used to it especially if they never have another monitor next to it. But Jarred you surely have seen a variety of screens. Have you ever set them up side-by-side with your Dell? What about a laptop screen if you have one? Or if you have access to a glossy screen at least to compare, they by nature will not have any anti-glare coating problem. Reply

Honestly, I'm sitting in front of my 3007WFP (not the HC version) and moving my head around and I just can't see anything. LOL.

I'm looking at a white background and I can't see anything bothersome. I just did the same thing with a couple glossy laptops as well as the U2711 and I'm still not seeing anything sparkly. In fact, I just wandered around my lab and looked at six different LCDs (laptop and desktop, glossy and matte) and I'm still at a loss.

I might actually sort of see what you're referring to, but even then it just doesn't register as something distracting to my brain/eyes. My best guess is that either my eyes just aren't good enough to see it well (entirely possible), or my brain has adapted to where this effect doesn't affect me.

This is something I try to get across in my display reviews: you know what bothers you and I know what bothers me, and sometimes something that I totally don't see will irritate the hell out of others. If you can see/test a display in person, that's your best bet obviously. I figure if you're spending $1000 on a display, you'll want to do everything you can to make sure it's the right panel for you in advance of laying out the cash.

Not really. Most of the issues people are referring to are what I like to call "subjective preferences." People who want things a certain way will pay more attention to the features, defects, etc. Most users aren't even aware of these features/defects in the first place, and thus never see them.

Selecting a TV is no different. However, once you do start to pay attention to what others are saying then you will begin to notice as it becomes more apparent.

If you can't see any anomalies on your 30", I wouldn't worry about it or bother to do research. Doing so might just ruin your enjoyment of it. For those that care way, way too much, they usually end up here...

You don't need to move your head around to see it. I can get the same sort of thing, only less severe, on my LP2475W, but only if I'm less than one foot away, which is not normal use. With Dells I see it very easily from a normal distance (2-3 feet).

If Dell still has a solid 100% money-back free-shipping return policy on monitors, there's little risk in trying them. All I know is that when I tried a 2407WFP a few years back, even without another LCD for reference, I was like 'wtf is this?' and then found comments about the same thing from others. But obviously there are plenty of people who like Dell monitors, so it's all personal preference. I was coming from a CRT and it was my first shot at an LCD, if that matters, but still it was immediately noticeable to me. Maybe lighting plays a role too *shrug*

Given their return policy (if it's still the same) I don't see any problem recommending people try Dell monitors. The strong anti-glare is just something that for people who know they don't like it to the extent it's a deal breaker want to know ahead of time so they just won't bother. Reply

I've got a refurb 3007WFP-HC and it's definitely got the sparkliness going on. I only really noticed it when I first got the display and was scouring for dead pixels (got a couple, but they're hard to see, and the rest is immaculate, so I kept it). I guess I've gotten used to the sparkling, which is a much better term than grainy. It's never bothered me. I strongly prefer it to seeing a reflection of myself like a gloss finish. Reply

Just to clarify: large white areas are shown as large, white areas, not as "large dusty/grained kind-of-white-but-not-really-white" areas? ;-)

Anyway, I'm really looking forward to the release of the device. I haven't found any 24-inch that has no crappy colours or that frickin' grain :(
The LP2475w has _incredible_ colours and a very deep black - but it's extremely grainy - even in comparison to a really, really old and really cheap 22-inch TN panel. Reply

If he says it looks the same as the 3007WFP then it likely has the typical Dell anti-glare coating. I just don't think he knows what it looks like or doesn't notice it any more. It just doesn't bother some people. Reply

The only reason I'd see a gamer wanting this is if a resolution that high is really worth sacrificing size since you could just get a much larger (42"?) quality 1080P HDTV for roughly the same price. Reply

If I'm doing work? Yes, I'd much, MUCH rather have more vertical pixels than a larger screen, no comparison at all. I'd pay significantly more for it too, than a "High Quality" TV with crummy resolution.

I love the 1920x1200 15" screen on my laptop, thank you very much. I love the fact that I can see 4 windows at the same time on it. Using the company provided 14" screen with an (IMO) appalling 1280x800 resolution makes me feel so ... cramped. Reply

Hehe, sorry, I didn't mean to sound trollish. A review of those monitors would be nice, sure, if just to see how good 'c'PVA is, but they aren't really competition for IPS in the professional market because, from what I recall, they still have the typical PVA horizontal angle-dependent contrast shift. Reply

Sorry, having owned and used all three I disagree about putting TN over *VA. And I definitely could see the TN vertical problems just sitting at a desk looking at a 17" 5:4 panel, just the few degree shift in angle from top to bottom caused issues. Reply

If you need anything bigger than a 24-inch for gaming, you're better off with a nice LED or LCD TV (40 inch or so is perfect): hook up your machine to it and you're in business. It seems rather pointless to spend that much money on a monitor unless you're using it for something other than games/movies. Reply

Besides, those TVs were meant and designed to be watched from 6-10 ft away, not for close up viewing. Even if you got a 1080P panel, it would still have half the resolution of a 30" desktop LCD (at 2560x1600, it has 4 million pixels compared to 2 million on the 1080P or 1 million on the 720P)

Now, I have seen HDTVs used as computer monitors for wall displays of information in companies, but again those are meant to be viewed 6-10 ft away, not to be put on your desktop. Reply
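The pixel counts quoted above are straightforward to verify (plain arithmetic, nothing from the review):

```python
# Total pixels for the panel classes discussed in the thread.
panels = {
    '30" 2560x1600': 2560 * 1600,
    '27" 2560x1440': 2560 * 1440,
    '1080p (1920x1080)': 1920 * 1080,
    '720p (1280x720)': 1280 * 720,
}
for name, px in panels.items():
    print(f"{name}: {px / 1e6:.1f} million pixels")
```

A 30" panel really does carry roughly twice the pixels of a 1080p set and four times those of a 720p set, which is the whole argument against a big TV for close-up work.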

I think that the GP was referring not to the actual size of the monitor, but the resolution of the monitor. The problem with the LCD or LED TVs is that they run at (at best) 1920x1080 resolution. This screen has substantially higher resolution. The dot pitch of those TVs is pretty terrible, in fact.

When will people understand the difference between dot pitch and physical screen size? Reply

Also, if you do any kind of image manipulation (Photoshop-style or CAD-style) you want a high-resolution screen more than a physically large screen. While larger screens often have higher resolution, that isn't always the case. In fact, the trend reverses once you start looking at TVs... Reply

I disagree; the 3008WFP is an older model with slower signal processing and worse black levels.

I bought a U2410, it is amazing, period. You put it in 'game' mode and the input lag drops to ~15 ms, which is one of the best LCDs on the market today. (Check out the review on tftreview.co.uk - no affiliation here except I read their review before I bought my U2410.) Reply

I had a 30" monitor (an LG with an IPS panel) and found it was excellent except for the huge lack of signal processing abilities (no scaler, one input). I also found that playing games on it was inconvenient because I'd actually have to turn my head to see the entire picture at my desk.

Anyway, the U2410 is $750 MSRP and the U2711 is $1000 MSRP. Compare apples to apples. The U2410 goes on sale for $500 off and on, and I expect the U2711 to go on sale in the same manner fairly soon. What I'm saying is the U2410 is excellent at certain things that the U2711 presumably is too, and the WFP3008/9/whatever isn't. It's not for everyone, but I'm offering an alternative. Reply

I frankly did not understand your comment about smaller dot pitch being bad. Wouldn't smaller dot pitch be good, since you get crisper text? Being able to see large PDF pages, or multiple ones, at full size would seem to be helped by smaller dot pitch, not hindered. I just don't understand your comment there. You can always set the software to magnify if text is too small - is that what you were complaining about?

The difference between .225 and .233 is minor, but that wasn't really my point.

The 30" panels are quite a bit larger than the 27" panel here, with a higher resolution, for the same money. The 30" Dell also uses an IPS panel, and while not quite as good as this new one, it is pretty close. Close enough for most people, anyway.

Actually, if you want to talk about most people, the 28" HannsG LCD is perhaps the current bargain; I picked one up for my parents before Christmas for $288 from Newegg. Not as good as the Dell panels, but plenty good for most people. Reply

Err, yes it would. In case you have not realised, most people do not have insanely perfect eyesight. The dot pitch on the Dell 3008 would still be kind of small for most users, let alone this display.

Having said that, I would like to see the day when LCD monitors have dot pitches 2 or 3 times finer than this Dell's, so display clarity is perfect and non-native resolutions also look perfect. You also would not need AA in games anymore because of how tiny each screen pixel would be.
But Windows should also automatically detect this and increase the DPI to compensate so text is just as readable. Reply

I've always thought of reviews that say, "oh no, high dpi means fonts will be tiny!" as "reviewer fail" -- high DPI only means fonts are tiny if you don't have your OS set to correct DPI. In fact, Windows 7 now sets a correct (or at least, rounded down to the next 25% scaling) on high-dpi displays.

I have a laptop with a 1920x1200, 15.4" LCD (147dpi), and it's awesomely wonderful for my eyes -- I can run HL2DM at native resolution, and not need antialiasing. When reading text, it's "halfway to paper" (printers give at least 300 dpi). The only downside to high-dpi is that some apps do break under DPI scaling.

Unfortunately, not even ONE single desktop LCD vendor has a display with equivalent DPI rating -- in fact, many are even lower than 96 dpi! At 19 inches, 1440x900 and 1280x1024 are both around 89 DPI -- on such displays, I can easily see the individual subpixels, and such displays can give me headaches.

If I wanted a second monitor, I'd have to pay tons of money for another laptop LCD and an LCD controller board!

Also, can anyone vouch for how it compares to that HP DreamColor LCD? Reply

Even Windows 7 doesn't really fix the DPI problem, simply because so many programs are poorly coded and don't understand anything other than standard Windows DPI settings...

I used to use a pair of Dell 27" displays simply because I wanted 1920x1200 at that panel size under Windows XP to allow me to see anything. Not all of us are 23 years old with perfect vision you know. :)

With Windows 7, I moved to a pair of the Dell 30" displays because they handle the resolution better, even if it isn't perfect. That, combined with 200% scaling in IE8 and I'm mostly happy.

The reason for these displays was for work purposes (work paid for them, thankfully), I can comfortably display 2 side by side pages in MS Word and they are almost exactly real size, as compared to a physical piece of paper.

The resolution isn't there, but it is good enough for editing. The benefits in gaming are just a side bonus. :)

The downside is that too many programs (Quickbooks is a good example) just were not designed for these displays and really don't take advantage of them, nor do they scale the interface up, so you're looking at tiny icons...

What do I really want?

How about a pair of 40" OLED panels running at 10240x6400. That is about 300dpi and is 4x the resolution of these 30" panels, with better display technology.

Now how much would that cost me today? ;)

BTW, that resolution is over 65 million pixels, compared to the 4 million pixels on current 30" panels (and 2 million in a 1080P display). What kind of video card would be needed to drive that?!? Reply
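The arithmetic behind that wish checks out, and the raw link bandwidth it implies dwarfs any interface of the era (our own rough numbers: uncompressed 24-bit color at 60 Hz, ignoring blanking; the per-channel dual-link DVI rate is the standard 165 MHz x 10-bit TMDS figure):

```python
# Back-of-the-envelope bandwidth for the hypothetical 10240x6400 panel.
h, v, bpp, hz = 10240, 6400, 24, 60
pixels = h * v
raw_gbps = pixels * bpp * hz / 1e9
dual_link_dvi_gbps = 2 * 3 * 1.65  # two links x 3 TMDS channels x 1.65 Gbit/s
print(f"{pixels / 1e6:.1f} million pixels, ~{raw_gbps:.0f} Gbit/s raw at {hz} Hz")
print(f"dual-link DVI carries ~{dual_link_dvi_gbps:.1f} Gbit/s")
```

At roughly ten times what a dual-link DVI connection can move, no single card or cable of the day comes close - the commenter's ";)" is well earned.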

It baffles me that Microsoft, even with Windows 7, has not yet properly addressed the display size issue. Their "font scaling" simply works very poorly or not at all on many apps. This is an OS issue. Very sad. Reply

Remember that the Apple version uses a completely different backlight, so even if the glass substrate is the same the two displays can't be directly compared. Yellow tinging on the bottom half is a backlight problem (and possibly a design issue). Reply

Hi,

Just got this monitor on order now, for use with my new computer that I am yet to buy (chose the monitor first). I have not seen any comments regarding video cards to drive this one. I have read about this monitor and seen recommendations to run this with 2 DVI-D cables to get full use of the resolution. I am considering the ATI 5770 card, which should handle this resolution according to the specs. I would be interested in hearing from anyone who has experience with this card/monitor combination, or any other input on a suitable video card to do justice to this monitor.

Thanks!
/Dan Reply

You need a dual-link DVI cable, not 2 DVI-D cables, to run this monitor; basically it is a single DVI cable carrying both links. A dual-link DVI cable is included with the monitor, but if you want more information I recommend the Wikipedia article on DVI...

However, at work we have ordered/installed about 20 of these so far, and I am finding that DisplayPort is a much more convenient connection than DVI, since a lot of the computers we encounter actually have only single-link DVI outputs, or other issues such as underpowered cards which max out at 1600x1200 over DVI. It does seem, though, that any card we find with DisplayPort can do 2560x1440, even though some of them are relatively underpowered compared to other cards that don't manage 2560x1440 over DVI. Reply
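The single-link ceiling mentioned above can be estimated with a quick pixel-clock calculation (our own back-of-the-envelope; the ~10% blanking overhead is an assumption approximating CVT reduced-blanking timings, and 165 MHz is the standard single-link DVI limit):

```python
SINGLE_LINK_MHZ = 165  # single-link DVI pixel-clock ceiling; dual-link doubles it
BLANKING = 1.10        # assumption: ~10% overhead with reduced blanking

def pixel_clock_mhz(h, v, hz=60):
    """Approximate pixel clock needed for h x v active pixels at a refresh rate."""
    return h * v * hz * BLANKING / 1e6

for h, v in [(1600, 1200), (1920, 1200), (2560, 1440), (2560, 1600)]:
    mhz = pixel_clock_mhz(h, v)
    verdict = "fits single-link" if mhz <= SINGLE_LINK_MHZ else "needs dual-link"
    print(f"{h}x{v}@60: ~{mhz:.0f} MHz, {verdict}")
```

This matches the experience in the comment: 1600x1200 and 1920x1200 squeak under the single-link limit, while 2560x1440 and 2560x1600 require a dual-link connection (or DisplayPort).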

I have a problem using the Dell U27 as a second monitor for an iMac 27. The Dell U27 doesn't let me choose its max resolution of 2560x1440. I am using a DVI display adapter from Apple. Thanks for any help. Reply

I have the Dell U2711 & have read a number of reviews where people keep mentioning the AdobeRGB & sRGB modes. Yet none of these articles state what monitor profile to set in the operating system when using these modes!

Is it so obvious that it doesn't need to be mentioned? What am I missing?

I can personally state that when I set the monitor to AdobeRGB mode, and then set the monitor profile in the OS to AdobeRGB (HERESY in the world of color management; that is, assigning a device-independent profile to a, um, device), I get the same colors as when I set the monitor to sRGB while setting the monitor profile in the OS to sRGB. This lends some credence to the belief that one should set the monitor profile to match the emulation mode being chosen. What I'm getting at is that maybe Dell created these modes not only to limit the gamut of the monitor, but also to create a mode where the monitor's color response can be reasonably described by the sRGB or AdobeRGB profiles... This would make it easy for people to have somewhat accurate colors without profiling their monitors (as would a good profile in 'standard' mode, or what have you).

I realize that the best option when using the emulation modes is still to profile the monitor & use that profile. But I'm wondering if one can get away with using the emulation mode as long as you select the proper corresponding profile in the OS.

Outcome: The Dell UltraSharp U2711 is a bit costly, but looking at the performance, quality, and ports, we definitely recommend this panel. Get it if you have enough money, and if you don't, start saving! It is worth every bit of your hard-earned money.

I'm not sure that I understand the color accuracy test results compared to the out-of-the-box result that Dell did at the factory. If the default factory calibration averages around a delta E of 2.2, and that result is better than your custom calibration results, why would one want to keep the custom calibration setup? Also, if one just wants to keep the default out-of-the-box calibration but lower the brightness to 90 cd/m², how would I go about doing that if I have the i1Display2 device?

I bought the Dell UltraSharp U2711 and I have a problem with the colors: they look cheap/unnatural, especially the red tones. I switched through the color presets like "Cold", "Warm", "Adobe RGB", etc., but this does not change my impression that the colors aren't homogeneous and natural. Is there a special calibration I have to do? Or is this just my perception of quality, as I'm coming from an NEC MultiSync 2080UXi (cost about 2,000 USD about 2.5 years ago)? Reply

"we test with ColorEyes Display Pro and Monaco Optix XR Pro and 24 test colors, but our Monaco results confirm their claim. We're not sure why, but we continue to get better results using Optix XR Pro than with ColorEyes Display Pro."

So what exact software/hardware was used? It is stated that Monaco Optix XR Pro gives better results than ColorEyes Display Pro, yet they're both bundled with the same DTP94 - what's the difference? I am confused. Any clarification would be appreciated. Thanks. Reply