
70 Comments

I can barely notice the difference between 720P and 1080I on my 32" LCD. Will people notice the difference between 1080P and 4K on a 61" screen?

It seems we have crossed the point where improvements in HD video playback on Sandy Bridge and post-Sandy Bridge machines are discernible to normal people with normal screens.

I spoke to a high-end audiophile/videophile dealer, and he tells me that the state of video technology (Blu-ray) is pretty stable. In fact, it is more stable than it has ever been in the past 40 years. I don't think "improvements" like 4K are going to be noticed by consumers outside the top 1%. This seems like a first-world problem to me - how to cope with the arrival of 4K?

... Anything being discussed on a Web site like Anandtech is going to be "a first-world problem"...

That being said, there's not much of a difference between 720 lines of non-interlaced picture and 1080 lines of interlaced picture... If anything a 720P picture tends to be a little better looking than 1080I.

The transition to 4K can't come soon enough. I'm less concerned with video playback and more concerned with desktop real estate - I'd love to have one monitor with more resolution than two 1080P monitors in tandem.

Why does an iOS device's Retina Display work in the minds of the consumers? What prevents one from wishing for a Retina Display in the TV or computer monitor? The latter is what will drive 4K adoption.

The reason 4K will definitely get a warmer welcome compared to 3D is the fact that there are no ill effects (eye strain / headaches) in 4K compared to 3D.

We can certainly hope, though with 1080p having been the de-facto high-end standard for desktops for almost a decade I'm not holding my breath.

Until there's an affordable alternative for improving vertical resolution on the desktop I'll stick to my two 1280*1024 displays.

Don't get me wrong, I'd love to see the improvements in resolution made in mobile displays spill over into the desktop, but I'd not be surprised if the most affordable way of getting a 2048*1536 display on the desktop ends up being a gutted Wi-Fi iPad blu-tacked to your current desktop display.

Higher latency and ghosting that maybe one in fifty thousand users will notice, if that. This issue has been blown out of all proportion by the measurable stats at all costs brigade - MY SCREEN HAS 2MS SO IT MUST BE BETTER. The average human eye cannot detect any kind of ghosting/input lag in anything under a 10-14ms refresh window. Only the most seasoned pro gamers would notice, and only if you sat the monitors side by side.

A slight loss in meaningless statistics is worth it if you get better, more vibrant looking pictures and something where you CAN actually see the difference.

If we were talking about PVA, I wouldn't be responding to an otherwise reasonable argument, but we're not. The latency between IPS and TN is virtually identical, especially to the human eye and mind. High-speed (1/1000 s) cameras are required to even measure the difference between IPS and TN.

Yes, TN is 'superior' with its 2ms latency, but IPS is superior with its <6ms latency, 97.4% Adobe RGB accuracy, 180 degree bi-plane viewing angles, and lower power consumption/heat output (either in LED or cold cathode configurations) due to less grid processing.

This argument is closed. Anybody who says they can tell a difference between 2ms and sub-6ms displays is being a whiny bitch.

Oh, I know I shouldn't--REALLY shouldn't--get involved in this. But you would have to be monochromatically colorblind in order to not see the difference between 65% and 95% color gamut.

I'm not saying that the 95% gamut is better for everyone; in fact, unless the 95% monitor has a decent sRGB setting, the 65% monitor is probably better for most people. But to suggest that you have to be a hyper-sensitive "whiny b---h" to tell the difference between the two is to take an indefensible position.

I agree completely that, as you say, "it's about the things you personally appreciate." If you have color settings you like that work on a TN monitor that you can stand to deal with for long periods of time without eye strain, I would never tell you that you should not use them because they don't conform to some arbitrary standard. Everybody's eyes and brain wiring are different, and there are plenty of reasons why people use computers that don't involve color accuracy.

But as it happens, you picked a poor counterexample, because I defy you to put a Dell U2412M (~68% of aRGB) next to a U2410 set to aRGB mode (somewhere close to 100% of aRGB) and tell me you can't see a difference.

For that matter, I challenge you to find me someone who literally can't see the difference between the two in terms of color reproduction. That person will have something seriously wrong with their color vision.

I'd like to see a "blind" test on this. Is there a perceived difference between 6 and 2ms? Blind as in the test subjects (nyahahaa) do not know which ms rating they are looking at.

Test with both a 60 Hz and a 120 Hz display. I would guess the moving object - an explorer window, for instance - would simply be easier to look at and look less blurred as it moves over the screen. People used to fast-paced gaming on CRT monitors or "3D ready" 120 Hz monitors would see more of a difference.
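The intuition about blur at different refresh rates is easy to quantify. A rough sketch (my own arithmetic, not a claim about perception thresholds - the speed and screen width are made-up example values):

```python
# Rough sketch: how far a moving window travels between consecutive
# refreshes at different refresh rates. Pure arithmetic; whether the
# resulting step is visible as blur/judder is a separate question.

def pixels_per_refresh(speed_px_per_s: float, refresh_hz: float) -> float:
    """Distance (in pixels) an object moves between two consecutive refreshes."""
    return speed_px_per_s / refresh_hz

# Example: a window dragged across a 1920-pixel-wide screen in one second.
speed = 1920.0
for hz in (60.0, 120.0):
    print(f"{hz:.0f} Hz: {pixels_per_refresh(speed, hz):.0f} px per refresh")
# 60 Hz: 32 px per refresh; 120 Hz: 16 px per refresh
```

Halving the per-refresh step is why fast motion tends to look smoother on a 120 Hz panel, independent of pixel response time.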

I really don't see any need for improvement in video resolution just yet. I myself have nearly perfect eyesight and can be extremely annoyed by artifacts, blocky compression, etc., but I find 720p to be detailed enough even for action movies which rely solely on special effects. In most movies 1080p appears too sharp to me; add to that the fact that most movies are already oversharpened and post-processed, plus the increased bitrate (and therefore file size) of 1080p, and I see more downside than upside to it. This all goes double for 4K video.

That being said, I do still want 4K badly for gaming, viewing pictures, reading text - there are tons of things it'll be useful for. But not for film, not for me.

Another advantage of a 4K screen (one that has at least 2160 vertical resolution) is that you could have alternating-line passive 3D at full 1080p resolution for each eye. I'm not an expert on how this all works, but it seems to me that the circular polarization layer is a sort of afterthought for the LCD manufacturing process, which is why vertical viewing angles are narrow (there's a gap between the pixels and the 3D polarizing layer).

In my opinion, it would be pretty awesome if that layer were integrated into the panel in such a way that vertical viewing angles weren't an issue, and so that any monitor is basically a 3D monitor (especially high-quality IPS displays). But I don't really know how practical that is.

4K is a very big deal for a couple reasons: pixel density and film transparency.

From the perspective of pixel density, I happily point to the ASUS Transformer 1080p, iPad 3, and any 2560-wide 27" or 30" monitor. Once you go dense, you never go... back... Anyway, as great as 1080p is, as great as Blu-ray is, it could be so much better! I project 1080p at about 120" in my dedicated home theater - it looks great - but I will upgrade to 4K without hesitation.

Which leads me to the concept of film transparency. While many modern movies are natively being shot in 4K using RED or similar digital cameras, the majority are still on good ol' 35mm film. 4K is considered by most professionals and enthusiasts to be the baseline for an excellent transfer of a 35mm source to the digital space - some argue 6K-8K is ideal. Factor in 65mm, 70mm, and IMAX and you want to scan your original negative in at least 8K to capture all the fine detail (as far as I know, no one is professionally scanning above 8K yet).

Of course, recording on a RED at 4K or scanning 35mm at 4K or 8K is a pointless venture if video filtering like noise reduction or edge enhancement is applied during the mastering or encoding process. Like smearing poop on a diamond.

You can't bring up "normal" people when discussing the bleeding edge. The argument is moot. Those folks don't jump on board for any new technology until it hits the Walmart Black Friday ad.

While I agree with almost everything, there is something I would like to nitpick: when making a digital copy of old film, in whatever format you use, more often than not a lot of touching up needs to be done. The Wizard of Oz and all the 007 films are examples. (I am ignoring the remastering of Star Wars and Lucas deciding to add in 'features' instead of giving us a cleaned-up remaster sans bonuses.) Still, when you're spending millions on a remaster, I expect them at least not to muddy the entire thing up.

However, I feel we need to bring in higher bitrates first. I will not apologize for this: yes, encoders are great, but a 4 Mbps 1080p stream still is not as nice as a 20-60 Mbps VBR Blu-ray film. I have a feeling that a craptastic bitrate at 4K or even 2K will ruin the experience for the uninformed. Also notice I am ignoring an entirely different debate: whether the current infrastructure can handle true HD streaming to every household, at least in the US.

And for the OP, 32", really? It's completely understandable you don't see the difference on a screen that size. Step up to a 60" screen and then go compare 720p to 1080p. (Who uses 1080i anymore? Oh, that's right, crappy 32" LCDs. Don't get me wrong, I own two, but they go in the bedroom and my office, not my family room.)

I think 60" +/- 5" is pretty much the norm nowadays for the average middle-class family's main movie-watching TV.

1080i at 60 fields per second, when deinterlaced, is the same as 1080p at 30 frames per second. The picture quality is almost entirely dependent upon your display's ability to deinterlace. However, cable TV is generally of a lower bit rate than OTA or satellite.
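The simplest case of this - two fields with no motion between them combining into one full frame - can be sketched as a toy "weave" deinterlacer (illustrative only; real deinterlacers also do motion-adaptive processing, which this ignores):

```python
# Toy "weave" deinterlacing sketch: interleave a top field and a bottom
# field (each 540 lines for 1080i) line-by-line into one full frame.
# Two fields in -> one frame out, which is how 60 fields/s becomes
# 30 frames/s. Shown here with tiny 4-line fields for readability.

def weave(top_field, bottom_field):
    """Interleave two fields line-by-line into a full progressive frame."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.append(t)  # even (top-field) line
        frame.append(b)  # odd (bottom-field) line
    return frame

top = ["T0", "T1", "T2", "T3"]
bot = ["B0", "B1", "B2", "B3"]
print(weave(top, bot))
# ['T0', 'B0', 'T1', 'B1', 'T2', 'B2', 'T3', 'B3']
```

When there is motion between the two fields, a plain weave like this produces combing artifacts, which is exactly why the display's deinterlacing quality matters so much.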

On a 32" you will certainly not see a difference between 720p and 1080p - it is barely visible on a 40". Once you go to 52"+ however the difference becomes visible.

On a 61" screen as you suggest the difference will be quite visible.
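A back-of-envelope acuity check supports the screen-size point. Assuming the commonly cited ~1 arcminute resolution limit for 20/20 vision (an assumption, and individual eyesight and viewing distances vary widely), one can estimate the distance beyond which individual pixels blend together:

```python
import math

# Back-of-envelope sketch, assuming ~1 arcminute acuity for 20/20 vision:
# beyond what viewing distance does a single pixel on a 16:9 panel subtend
# less than 1 arcminute (i.e. become indistinguishable)?

ARCMIN = math.radians(1 / 60)  # 1 arcminute in radians

def max_useful_distance_m(diagonal_in: float, vertical_px: int) -> float:
    """Distance (meters) beyond which one pixel subtends < 1 arcminute."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # 16:9 panel height
    pixel_in = height_in / vertical_px               # height of one pixel
    return pixel_in * 0.0254 / math.tan(ARCMIN)      # inches -> meters

for size in (32, 61):
    for lines in (720, 1080):
        d = max_useful_distance_m(size, lines)
        print(f'{size}" {lines}p: pixels blend beyond ~{d:.1f} m')
```

By this estimate, a 32" 720p panel already out-resolves the eye at couch distances, while a 61" screen leaves room for 1080p (and more) to matter at the same distance - consistent with the comments above.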

Having said that, I am still very happy with the quality of properly mastered DVDs, which are only 576p, on my 47" TV.

It's not that I can't tell the difference, it's just that it doesn't matter to me that much, which is why I also don't bother with madVR and all that, and just stick to Windows Media Center for my HTPC.

Have you ever seen a 4K display with an uncompressed signal? The clarity is just astounding.

I'm more concerned about the ability to deliver content with that kind of bandwidth requirement. We already get HDTV signals that are so compressed that they're barely better than a really, really good SDTV signal.

I've experimented with madVR a bit, but in the end the problems with playing back DVDs and Blu-rays with menus have so far stopped me from using it seriously. However, I've seen reports claiming that Ivy Bridge includes higher quality upscaling within Windows Media Player (as part of the EVR, I suppose). Any evidence of this?

You can take a look at the PowerDVD chroma upscaling screenshots linked in the text. I was really surprised at the quality (until I zoomed to 300%, I couldn't actually decipher the difference between PowerDVD and madVR!). Similar behavior with MPC-HC using MPCVideoDec.

Also, when we are complaining about 23.976Hz versus something like 23.972, how can you be sure that your measurement is accurate? I would think that for most HTPC users the important thing is that the video clock and audio clock are derived from a common clock. Is there some way you can check for this? I'm also interested to know if automatic lip-sync over HDMI is working properly - it doesn't seem to work on my AMD E-450.

Whether the clock is accurate or not, what matters is the number of frames dropped or repeated by the renderer because of this. madVR clearly indicates this in the Statistics.

Yes, you are right about the video and audio clocks being derived from a common clock, but I am not sure how to check for this.

Does lip sync not work for you on the E-450, but work on some other machine? I have played with the E-450 only briefly in the Zotac Zbox Nano XS, and I did watch one movie completely. I didn't have lip-sync issues that would warrant digging in further. I do agree my sample set is extremely small.

I agree that what matters is dropped frames. I'm not absolutely sure how madVR decides when to drop frames. As I see it, there are four options:

1) lock playback to the video clock and drop or repeat audio frames
2) lock playback to the audio clock and drop or repeat video frames
3) lock playback to the video clock and resample the audio
4) lock playback to some other clock (maybe the processor clock) and drop or repeat both video and audio frames

My guess is it's probably doing (2), which would make the reported dropped frames a good measurement. If it were doing (1) or (3), then it wouldn't drop frames. If it's doing (4), then I'd argue that it's a faulty renderer.
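The consequence of option (2) is easy to see with a toy simulation (this is my own illustration, not madVR's actual implementation): if presentation is slaved to one clock while the source runs at another, the mismatch surfaces as a steady drop/repeat count.

```python
# Toy simulation of option (2): playback is locked to the audio clock, and
# the renderer drops or repeats video frames to stay in sync. Illustrative
# only -- NOT madVR's actual code. It just shows why a small clock mismatch
# appears as a predictable drop/repeat count in the renderer's statistics.

def simulate(source_fps: float, display_hz: float, seconds: float) -> dict:
    """Count video frames dropped or repeated over a playback interval."""
    available = source_fps * seconds   # frames the source supplies
    presented = display_hz * seconds   # presentation slots, slaved to audio clock
    diff = presented - available
    if diff >= 0:
        return {"repeated": round(diff), "dropped": 0}
    return {"repeated": 0, "dropped": round(-diff)}

# 23.976 fps content presented at an effective 23.972 Hz:
print(simulate(23.976, 23.972, 3600))
# {'repeated': 0, 'dropped': 14}  -- roughly one dropped frame every ~4 minutes
```

Under options (1) or (3) the same mismatch would instead be absorbed by the audio path, which is why the video drop counter would stay at zero there.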

Regarding the lip sync, it's difficult to be very scientific about it because I don't have any suitable test material. My TV definitely introduces a significant delay, and for some reason I haven't had much luck correcting it with manual adjustment on my AV receiver. Maybe it varies with frame rate, or maybe the delay is outside the range I can set manually. When I enable automatic lip sync it does seem to correct things for the set-top box and standalone DVD player, but for my E-450 (an ASUS mini-ITX motherboard) it seems to be way off. It's quite possible it's a bug in PowerDVD, or that it depends on the format of the audio track, or I don't know what else.

I do have machines that I could try, but it would really help to have some test material in a range of frame rates and audio formats.

This article is great commentary on the video aspects of an Intel HTPC setup; however, neither the processor discussions nor the Z77 motherboard articles made any attempt to actually review the audio portion of HTPC setups, which is still a major part of any home theater.

IMO, if you want a complete, comprehensive look at the HTPC capabilities of any platform, addressing such things as audio decoding, audio passthrough over HDMI, and audio quality is a must. Until then, it is not a complete review.

Why are you testing with the HD4000? The HD4000 only comes in the higher-end and more costly chips; most lower/mid-range Ivy chips will use HD2500 video. The price difference is enough to buy a cheaper chip and get a separate video card that has its own memory, or wait for Trinity.

If the i7-3770T is ever actually available to buy, then from a power consumption point of view it would also be a good choice (with plenty of CPU headroom for the times when GPU decoding doesn't work). From a cost point of view it might be a bit on the high side, I suppose.

This review is really testing the HD4000 implementation. When the dual-cores are released with the HD4000, the GPU will be exactly the same, so almost everything will be directly applicable there too.

With that P8H77-M config, if you use a double-slot GPU in one PCIe x16 slot (and so lose one PCIe x1 slot) and use TV tuners in both of the remaining slots (PCIe x1 and PCIe x16), does using the second PCIe x16 slot result in the first one running at x8?

If the second PCIe x16 slot is occupied, it will cause the first x16 slot to run at x8. Both these slots share the same electrical lanes, so even if you need only one lane in the second slot, it takes eight away from the first.

I see you didn't test madVR in Full Screen Exclusive mode - can you elaborate on the reason for this, please? I read over at missingremote that FSE improved the situation significantly for madVR with the HD4000?

The FSE mode performed visibly worse for me compared to FSW in the few cases that I tried, with the rest of the settings identical to what Andrew @ MR used. I may try it again and see if it improves things. My aim was to get madVR to render without any dropped frames, and I was able to get that at DDR3-1600 (which is what Andrew used too) for almost all the clips I had (except 720p60, which I didn't try till yesterday).

Regarding the video decoding and rendering benchmarks: can you provide a guide on how you got those scores? It would be very helpful for some of us. I know about the HQV score, but this one is new to me - kindly help :) Where can I get these benchmarks if I want to compare my existing system against the IVB results?

In the article there is a promise of a BIOS update to fix the 23.97Hz issue. Wasn't something similar also promised for Sandy Bridge in the same article over a year ago? That never happened, did it? I want to build a HTPC already!

Well, something did happen with SNB.. they got it to 23.972 Hz :) If you think about it, video cards with AMD and NVIDIA GPUs also end up in the 23.974 - 23.978 range, and only very rarely do I actually see a GPU outputting exactly 23.976023976 Hz.

If Intel gets between 23.974 - 23.978 in a stable manner, I will consider the matter closed.
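To put that tolerance in perspective, the NTSC film rate is exactly 24000/1001 Hz, and the practical impact of a near-miss clock is how often it slips by a whole frame. A quick check (my own arithmetic, using the refresh values quoted in these comments):

```python
from fractions import Fraction

# The NTSC film rate is exactly 24000/1001 Hz (23.976023976... Hz).
# How long does a clock running at a nearby rate take to drift by one
# whole frame relative to the exact rate?

EXACT_HZ = float(Fraction(24000, 1001))

def seconds_per_frame_slip(actual_hz: float) -> float:
    """Seconds until a clock at actual_hz drifts one frame from 24000/1001."""
    return 1.0 / abs(actual_hz - EXACT_HZ)

for hz in (23.972, 23.974, 23.978):
    mins = seconds_per_frame_slip(hz) / 60
    print(f"{hz} Hz: one frame slip every ~{mins:.1f} minutes")
```

So SNB's 23.972 Hz slips roughly every four minutes, while anything inside the 23.974 - 23.978 window stretches that to eight minutes or more - one dropped or repeated frame per reel-length stretch of film, which is the practical sense in which the matter could be considered closed.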

Is there still the problem, as with SB, where the driver sets the color space to limited range when connecting to the TV over HDMI, and resets it with every refresh rate switch/reboot with the integrated graphics?

Nice article! As always. About the note: "The good news is that Intel is claiming that this issue is fully resolved in the latest production BIOS on their motherboard. This means that BIOS updates to the current boards from other manufacturers should also get the fix. Hopefully, we should be able to independently test and confirm this soon."

What does it mean exactly? Does it mean that this BIOS update should get refresh rate closer to the 23.976 than it was in your test? And "on their motherboard" - does it mean that this BIOS update is for Intel MB only?

True, on AMD and nVidia the out-of-the-box refresh rate for 23 is never precisely 23.976, but the custom timings on nVidia allow you to get closer to it. There are no custom timing settings on the HD4000, right?

I've been looking at an Ivy Bridge setup with the H77/Z77 chipset, but I can't find any information about the audio support. Can it bitstream TrueHD and DTS-HD tracks? The older chipsets do it, so I would find it strange if the new ones don't, but I don't see it mentioned on any of the new boards or in the Intel information.