It's nice to see 1080p becoming more prevalent on laptops of this size, but why can't we get some higher-res displays at 20"+? I had a 19" 1600x1200 CRT eight years ago, and resolution hasn't gone up since then; it even dropped from 1920x1200 to 1080p in recent times. Laptops these days have some high-DPI displays, and I'd love to see some on the desktop.

1920x1080 monitors are replacing 1680x1050 TN panels in the mid-range monitor segment, just as 1680x1050 replaced 1280x1024, with the added option of either 120Hz TN or IPS screens. 1920x1200 monitors still exist and are just as expensive as always, alongside 2560x1440 and 2560x1600 in the high-end and very-high-end segments.

I think the main bottleneck for the resolution picked for the HD standard was the capacity of DTV broadcasts and Blu-ray/HD DVD discs without any compression artifacts. Bumping the frame size up 77% would have required significantly heavier compression, and the videophiles who currently revile Netflix/Hulu/etc.'s streaming offerings for low quality would have slammed the new standards, potentially rendering them stillborn, and almost certainly slowing adoption down significantly.

The other hangup would be the size of the TV screen needed to get full use of the resolution in the living room. 1080p is generally not worthwhile on less than a 40" screen, because below that size even 720p's pixels are already too small to resolve individually at couch distance. The smaller pixels of a 1080p screen won't be resolvable as individual pixels until about 56". At the time the standards were being written, 56" was an enormously large TV; it's still larger than most TVs sold today.
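As a rough sanity check on those screen-size figures: using the common 1-arcminute visual acuity rule of thumb (my own back-of-envelope arithmetic, not from the comment, and the 16:9 geometry is assumed), you can estimate the distance beyond which a 1080p panel's pixels blur together:

```python
import math

def pixel_pitch_mm(diagonal_in, h_pixels=1920, v_pixels=1080):
    """Physical pixel pitch of a 16:9 1080p panel, in millimetres."""
    width_in = diagonal_in * h_pixels / math.hypot(h_pixels, v_pixels)
    return width_in / h_pixels * 25.4

def resolvable_distance_m(diagonal_in, acuity_arcmin=1.0):
    """Distance beyond which one pixel subtends less than `acuity_arcmin`
    (roughly 20/20 vision) and individual pixels can no longer be resolved."""
    pitch_m = pixel_pitch_mm(diagonal_in) / 1000
    return pitch_m / math.tan(math.radians(acuity_arcmin / 60))

for size in (40, 56):
    print(f'{size}" 1080p: pixels merge beyond ~{resolvable_distance_m(size):.1f} m')
```

For a 40" set that works out to roughly 1.6 m and for a 56" set roughly 2.2 m, which lines up with the claim that 1080p only pays off on large screens at typical couch distances.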

Until that changes (and Blu-rays, or the bandwidth needed to stream them at full quality, become commodity items) I don't expect anything to change in the consumer video market. When that happens, I expect the new standard will be one of the 4K resolutions; probably either 3996×2160 (1.85:1) or 4096×1714 (2.39:1). We'd also need a higher-bandwidth video cable standard. DP 1.2 can carry the 2D version of either signal, but would need to be doubled again to support 3D. Hopefully Light Peak will be mainstream by then and able to carry the data.
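A quick sketch of why DP 1.2 handles 2D 4K but not 3D, assuming a 60Hz refresh per eye and a flat ~20% blanking overhead (both are my assumptions; real CVT-R blanking varies per mode):

```python
def video_bandwidth_gbps(width, height, fps, bits_per_px=24, overhead=1.2):
    """Approximate uncompressed link bandwidth for a video mode, in Gbps,
    padding the active pixel rate by ~20% for blanking intervals."""
    return width * height * fps * bits_per_px * overhead / 1e9

# DisplayPort 1.2 HBR2, 4 lanes: 21.6 Gbps raw, 17.28 Gbps after 8b/10b coding
DP12_EFFECTIVE_GBPS = 17.28

for w, h, fps, label in [(3996, 2160, 60,  "4K 1.85:1, 2D @60Hz"),
                         (3996, 2160, 120, "4K 1.85:1, 3D @60Hz/eye")]:
    need = video_bandwidth_gbps(w, h, fps)
    verdict = "fits" if need <= DP12_EFFECTIVE_GBPS else "does NOT fit"
    print(f"{label}: ~{need:.1f} Gbps -> {verdict} in DP 1.2")
```

The 2D mode lands around 15 Gbps (inside DP 1.2's ~17.3 Gbps effective budget), while frame-sequential 3D doubles that to ~30 Gbps, which is why the link would need to double again.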

It's the extra-wide screen mode at theaters today. It would render all but the largest desktop computer displays too short to be useful for anything except consuming content. The video industry would see this as a feature.

I wouldn't hold my breath. The theaters originally went widescreen (1.85 in the US, 1.66 in the EU) to differentiate themselves from the 1.33 aspect ratio of TV and offer something more than a giant screen to compensate for the extremely expensive food and the obnoxious idiots you had to share the theater with.

1.85 isn't much more than 1.78, and with 3D poised to invade the living room as well, it won't serve well as a differentiator. Unless the studios decide to throw the theaters under the bus, I expect something wider to go mainstream, even if they stop short of 2.39.

The end has begun: Vizio just launched a pair of 21:9 (2.37:1) TVs at 2560x1080. I hope everyone is looking forward to their 2013 laptop running at 1400x600. It won't be deep enough to have a touchpad, so your lousy low-contrast ultra-superglare LCD will be covered in fingerprints from the touchscreen layer.

The 1080-line resolution was part of the HD standards back in the '80s and '90s, long before flat-panel, fixed-pixel displays were even being sold.

While you may argue that 1080p is a step backward in resolution from the 1600x1200 CRTs of yesteryear, not even my beloved (and perfectly calibrated) Sony FW900 24" CRT can hold a candle to the clarity of my 1080p LCDs. Not to mention the LCDs are thinner, lighter, and much cheaper. Plus, having true 1:1 pixel mapping for HD content is so much better. My wife is a professional video effects editor and can attest to the benefits of 1080p displays for her own reasons as well.

That's progress.

The only regress I can think of with modern displays is the loss of refresh rates over 60Hz. That's the only reason I keep the FW900 - for gaming w/VSYNC @85Hz and up. Analog FTW in that case. More and more 120Hz and 240Hz LCDs are coming out, but without proper mainstream connectivity, what's the point? Meh to that.

I agree that in some respects current displays are better than what we had ten years ago, but some things took a step back, and even if everything else was equal, it's not such significant progress. If I want a monitor that's better than 1920x1200 I need to pay a lot more than I did for the 1600x1200 19" monitor I bought 8 years ago, and it'd be a lot larger.

One would have thought that by now it'd be possible to display high quality text and images on a PC monitor, but somehow we've degenerated into believing video is the only application that matters.

I agree that for standard users, who just surf the web and consume content, current monitors are a step up from what they had in past years (1024x768, 1280x1024), but ten years ago anyone more demanding could get something that was a step up, took about the same space, and didn't cost five times more.

Yup, what DanNeely said is right. Even with Blu-ray, which represents the highest data rate currently available for consumer 1080p video (roughly twice what you get with terrestrial HD broadcasts, which in turn have higher data rates than cable, satellite, Hulu, and Netflix), the signal has to be compressed an amazing ~100:1 vs. a raw video feed! Only the cleverness of the compression algorithms, combined with the fact that large parts of a typical picture don't change much from frame to frame, allows this compression to still look good -- though it is still perceptually lossy on a high-end system (I understand Joe Kane did some studies to determine what data rate you would need to avoid all perceivable compression losses, but the results were for a private client and thus not published).

Plus, don't forget that the current bandwidth limitations force compromises not just in spatial resolution, but also in chromatic and temporal resolution. Blu-ray movies today have 8-bit color (allowing only 2^8 = 256 gradations per channel). The standard does allow for higher color depth (up to 16-bit), but that means more data, and with the current bandwidth limit that in turn would necessitate more compression. Likewise, at 60 fps we'd get more temporal resolution than we do at 24 or 30 fps, which would result in less blurring during fast action scenes. But if you go to 60 fps, you've got to give something else up.

I.e., with the current bandwidth limitations, we're at about the limit of how much spatial resolution the system can offer, unless we want to increase compression artifacts or give up further on the already-compromised chromatic or temporal resolution.
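The scale of that compression is easy to check with a back-of-envelope calculation. This is my own illustrative arithmetic, assuming 1080p30 with 24 bits per pixel (8-bit RGB / 4:4:4) as the "raw feed" and typical published bitrates for the delivery channels:

```python
def raw_rate_mbps(width=1920, height=1080, fps=30, bits_per_px=24):
    """Uncompressed video data rate in megabits per second."""
    return width * height * fps * bits_per_px / 1e6

raw = raw_rate_mbps()  # ~1493 Mbps for raw 1080p30 at 24bpp
for name, mbps in [("Blu-ray (peak video)",  40),
                   ("Blu-ray (typical avg)", 20),
                   ("HD broadcast (approx)", 15)]:
    print(f"{name}: ~{mbps} Mbps -> about {raw / mbps:.0f}:1 compression")
```

Raw 1080p30 works out to roughly 1.5 Gbps, so even Blu-ray's generous bitrates imply compression on the order of 40:1 to 75:1, and lower-bitrate channels push past 100:1. Note too that the raw rate scales linearly with bit depth and frame rate, which is exactly the tradeoff described above: 12-bit color or 60 fps multiplies the data that has to squeeze through the same pipe.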

Don't get me wrong -- I have a 100" screen (JVC RS1 projector), and would love to see a consumer 4K format. But I'd also like to see at least a 12-bit 4:4:4 color space, and fewer compression artifacts -- which is not going to happen until they can offer bandwidth about an order of magnitude higher than what Blu-ray currently offers.

And unfortunately, a lot of video seems to be moving in the same direction as music -- less resolution for more convenience. So I think it may be a while before we see market pressure for a higher-resolution video format.

We also appear to be reaching the limits of what compression can offer. Over the summer I read that the team working on the H.265 algorithm was concerned that they'd only be able to reduce bitrates to 70% of current levels while maintaining quality, versus the 50% target they'd set when beginning the design process.

I do wipe off fingerprints, but those glossy bezels pick up every little touch and the flash photography tends to bring them out more than usual. You're not seriously going to complain about one photo (out of a couple dozen) where a few fingerprints are somewhat visible, are you?

I dunno, I took time out of my busy day at work to read an article about a laptop I didn't know existed 10 minutes ago and probably will never buy anyway because the perfect laptop that I want doesn't exist/costs too much. It really bothers me that you didn't take more time to be professional and do it perfect. Now I'm going to be tormented for the rest of the day about that photo and my overall productivity is going to suffer. Thanks a lot. BTW, Merry Christmas and Happy New Year, jerks.

I disagree. Years of simply saying glossy sucks anywhere it will pick up fingerprints hasn't hammered the point home to the PHBs who write the laptop design specs. Perhaps if reviewers all start showing pictures of how disgusting it ends up looking after a week or two of use, the point will finally get through.

That's actually not a bad idea, but very ballsy/risky. I could see the manufacturers getting pissed at the 1st site that did that, stop sending them review units, and then no other site would do it out of fear of getting the cold shoulder. Then again, they don't seem to care about reviewers ranting about these issues in text, so maybe I'm worried over nothing. More likely, though, mfg's don't actually bother to read reviews of their own products...

People... full HD on a regular 22" works out to 100ppi, which is pretty comfortable, but on 15.6" it means 141ppi, and that's a lot of pixels per inch. Don't tell me about the font scaling in Win7, because FullHD@125% displays exactly like 1600x900@100%: if all the screen elements are bigger, I don't get any extra screen real estate. Plus, the scaling doesn't work with all apps; there are plenty that don't scale at all.

I'm very used to 1400x1050@15" (116ppi), but I couldn't stand 141ppi all day long. Am I having problems with my eyes, or is everybody else comfortable with FullHD on 15.6" (usage of 12h/day)?
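The ppi figures in these two comments all follow from diagonal and native resolution; here's the arithmetic as a small sketch (my own, just restating the commenter's numbers):

```python
import math

def ppi(diagonal_in, h=1920, v=1080):
    """Pixels per inch of a panel from its diagonal size and native resolution."""
    return math.hypot(h, v) / diagonal_in

for size in (22, 15.6):
    print(f'{size}" at 1920x1080: {ppi(size):.0f} ppi')
print(f'15" at 1400x1050: {ppi(15, 1400, 1050):.0f} ppi')
```

That reproduces the 100 ppi (22"), 141 ppi (15.6"), and ~116 ppi (15" SXGA+) figures quoted above.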

I'm not. 1600x900 seems to be a lot rarer on 14/15" laptops than 1680x1050 was a few years ago. For that matter, has anyone reviewed the current crop of 1600x900's to see if they're good panels like most of the 1920x1080's or garbage like the 1366x768s?

The two 1600x900 displays I've seen in the last year are both junk. I also think 1080p on 15.6" will be a stretch for the over-40 crowd, but I'm okay with it. Those who suggest we need 4K screens on laptops, though... I have problems with a 30" LCD at 2560x1600; what would it be like to have that resolution in 1/4 the area!?

Enough DPI that AA won't be needed much. GPUs capable of pushing that many pixels are some years down the pipeline, though. According to the Eyefinity lead at ATI, three 25-megapixel monitors placed to completely fill your field of view would have a high enough DPI that you'd be unable to resolve individual pixels with your eyes. At typical laptop distances, an 8MP screen would probably be approaching that level.
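That "can't resolve individual pixels" threshold can be sanity-checked with the 1-arcminute acuity rule of thumb. The 40 cm laptop viewing distance and 16:9 geometry below are my assumptions, not numbers from the comment:

```python
import math

def retina_ppi(distance_cm, acuity_arcmin=1.0):
    """PPI above which a pixel subtends less than `acuity_arcmin`
    at the given viewing distance (i.e. pixels become unresolvable)."""
    pitch_in = (distance_cm / 2.54) * math.tan(math.radians(acuity_arcmin / 60))
    return 1 / pitch_in

def megapixels_169(diagonal_in, density_ppi):
    """Total megapixels of a 16:9 panel at a given diagonal and pixel density."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    w_px = width_in * density_ppi
    return w_px * (w_px * 9 / 16) / 1e6

d = 40  # assumed laptop viewing distance in cm
p = retina_ppi(d)
print(f'~{p:.0f} ppi needed at {d} cm; a 15.6" panel at that density is '
      f'~{megapixels_169(15.6, p):.1f} MP')
```

At 40 cm this gives roughly 220 ppi and about 5 MP for a 15.6" panel; lean in closer to ~30 cm and the requirement climbs toward the 8MP ballpark mentioned above.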

I had problems with my eyes looking at a 14" XGA. I almost went blind before I got my 15.4" Inspiron 1920x1200 screen many years later. My eyes have had fewer problems after looking at that high resolution display for many years now.

Consider the difference between old dot matrix printers and laser printers. Is reading 1200dpi text uncomfortable? The real problem is Windows being optimized for low res screens. There are a few configuration changes that can help. I actually dual boot Windows XP and Debian, and prefer Debian for being better equipped to manage the high resolution display. I spent a little extra time fine tuning the X Window System to do exactly what I wanted, and I am very comfortable now. I am in no hurry to downgrade to a 1080p display until my old Pentium M gets really tired. The display is that much more important than anything else, in my opinion.

Thanks Jarred for recognizing the value of the premium displays. I would like an ebook reader with 1200dpi resolution to match my laser printer, and expect that would be very comfortable to use also.

Yeah, this is old news. A few anti-NVIDIA sites made a huge deal about the failures, but I never personally had any of those chips fail on me. Of course, I wasn't playing a lot of games on laptops, so maybe that's why. Anyway, anything in the post-8000M era should definitely be fine. Actually, I think it was mostly the old GeForce Go series that had problems.

Somewhat on & off-topic question: So in all honesty, with such a horrendous screen, where does that leave value-minded users that want a laptop with a nice 1080p screen and a GeForce video card? The application I'm thinking of is CUDA-accelerating H.264/AVC 1080p videos.

The XPS 15 isn't listing the B+RG LED as an option, as mentioned in the article. Has anyone else heard from Dell about reasons why/if it will come back? The Clevo seems like an OK option but... well.. it now seems like the only option.

Thanks for the review. A friend of mine recently priced out a Sony Vaio F series laptop: 16.4" 1080p screen, Blu-Ray R/W drive, NVIDIA GeForce GT 425M GPU, and an Intel quad-core i7-840QM processor (1.86GHz, turbo up to 3.20GHz) -- he said it was about $1300. Perhaps that is worth a review.

Correction: just checked it myself, and it's $1300 (on the Sony site) with a quad-core i7-740QM processor (1.73GHz with turbo up to 2.93GHz).

The EC series cgeorgescu mentioned might be an even better buy. With a 1080p 17.3" screen (a bit more suitable for 1080p than the F's 16.4"), Blu-Ray R/W, an ATI HD 5650 (don't know how that compares with the 425M on the F series), and a Core i5-580M processor (2.66GHz, with turbo to 3.33GHz; Core i7 not offered on the EC series), it prices out to $1200.

And, as with the F series, if you downgrade from a Blu-Ray RW to a CD/DVD RW, you can subtract $150.

Further, if we downgrade the EC series to make it comparable to the Asus reviewed here (Blu-Ray read only + CD/DVD RW, Core i5-460M, 1080p), the Sony site has it at $1020 -- nearly the same as the $1030 Asus, but with what I understand is a much better screen (plus the extra drive bay that cgeorgescu mentioned, and the free Adobe Acrobat/Photoshop bundle).

Don't forget that quad-core Clarksfield CPUs are horribly power inefficient, so you'd sacrifice quite a bit of battery life. Given that Sandy Bridge will address this, there's basically no point in looking at any more Core 2010 or Clarksfield laptops.

Understood, thanks for your reply. But that leaves unanswered the obvious follow-up question, which is that of why, given that these Vaios have been out for a while, and given that they may represent the best value available in ~$1K laptops (say, the dual-core EC series), you folks didn't include them among your recent looks at mid-range laptops (e.g., the Vaios weren't mentioned in your 11/15/10 "Holiday Buyer's Guide: Notebooks"). Did you consider them and discount them for some reason, or was it something else? Since choosing what to review from amongst a large universe of products is a significant part of what a tech journalist must do, I was just wondering what goes into these sorts of decisions.

The biggest issue is that Sony basically has no interest in seeding reviewers with hardware. While you could try to buy/review/eBay laptops, I don't have enough time/money to go that route, and we've been busy with other items. We did mention the VAIO Z in the guide, but most of the time I have difficulty justifying the Sony Tax. And not all Sony laptops have good displays either -- I've looked at more than a few at Best Buy, etc. Without hands-on time or input from someone I trust, I'm not willing to recommend a laptop as having a good LCD. :-\

I'll see if I can get Sony to be a little more forthcoming at CES, but I've gone down that road before to no avail.

Thanks for the explanation! Why there had been no review of this particular (and seemingly high-value) part of the Vaio line was something I'd been curious about for a while, so it's nice to understand the manufacturer's role in this (a factor I had not considered).

You guys and ur glossy bezel on the screen. Put ur thumb on the edge of the screen to open the laptop, there, problem solved. lol. wow.

Other than that nit-picky silliness, I was REALLY saddened to see those low scores on that Asus. I read it had the same display as the Dell used to, and got all excited, then those scores... I guess they had to save money somewhere to hit 1000 bucks AND have a Blu-ray drive. Honestly, I almost never use discs at all anymore and have never even touched a Blu-ray disc. Don't include any CD drive at all; put in a bigger battery, a better screen, and non-name-brand speakers that don't suck, and I'd be good. If the marketing guys insist on a CD drive, use the cheapest one you can find.

Look, it's not like we *try* to put fingerprints all over the laptops. Just regular use will put them there, even if you're careful (which I am). If I walked around with white gloves on all the time, it wouldn't be a problem, but I'm not going to do that. Saying "just use your thumb" doesn't entirely fix the problem either, because you WILL overlap into the glossy area every time. A better solution, amazingly enough, is to stop using stupid piano black glossy plastic on laptops. There, problem solved, and it wouldn't cost anything extra.

You can always turn it down if you need to, but if you're outside and can't read the display because it's not bright enough (I've had that happen with numerous laptops over the years), then brighter *is* better. Apple does this with the MacBook Pro, where they get up to 350 nits or something, but you can always set it to 50% or 100 nits or whatever if that's what you need/like.

I agree that displays have slipped lately. I am building a new desktop rig currently, and I hate the 16:9 displays enough that I am sticking with an old Dell 4:3 17-inch. Pretty sad that I would need a 24" or something just to match its height. My wife has a 17.3" HP dv7 laptop, and she downloaded the Amazon Kindle software. I double-clicked it to check it out and opened up a book... It looked hilarious to see the middle 1/4 of the screen being used and nothing on the sides. All of these apps and programs are going to have to start allowing for wrapping and double-width viewing if this stupid trend continues. I literally wanted to turn the screen sideways; it would have been much better.

I'll tell you another problem with the piano black glossy finishes. Our daughter is 7, and uses the laptop sometimes for schoolwork or to look at the Disney website, etc. A very heavy laptop with such a slick surface is a pain even for me to carry. It is utter stupidity!!! I wonder how many people have dropped their expensive laptops and ruined them because of this. I always make sure my hands are 100 percent dry before carrying the thing, but it's really tough for my daughter, which is why I have started to let her use my much lighter netbook more. Anyway, a rougher matte finish would provide tons more grip and look better on the fingerprint front as well. I can imagine what a pain in the tail it must be for people doing these reviews to try to get the thing fingerprint-free under camera flash.

If you google the drive name listed in the spec table at the beginning of the review ("Philips/Lite-On DS-4E1S") you can get the full tech specs. But from what I vaguely recall, this drive can burn CD and DVD, but it's read-only for Blu-Ray.

Got another question though: the spec table shows only 1 HDD. Does this laptop support dual HDD, or an SSD+HDD combo? IMO it'd be a shame for a 15-incher not to be capable of it. I have checked ASUS International and it does not seem to support it, but could you please confirm?

http://www.asus.com/product.aspx?P_ID=zzD4OFFWhspr...

There's no room for a second HDD. If you wanted to get creative, you could try removing the optical drive and installing a second drive there, but ASUS doesn't sell the necessary caddy so you're pretty much on your own. Actually, very few 15.6" or smaller laptops have room for two drives in my experience; that's usually a feature of 17" notebooks, or special laptops that skip out on other items in order to fit two 2.5" drives. Granted, there are exceptions, but I don't think we've reviewed any in the past year at least.

Aww... I see. I must have gotten a false first impression when I first looked at the ASUS G51 specifications; now that one feels really huge.

Still, when you mentioned in the review that this N53J is "heavier, wider, thicker, deeper" than that one, which in turn is slightly larger than yet another one, I had some hopes LOL. (Not blaming you for this)

I think the G51 is indeed heavier and larger than the N53; I was comparing the N53 to the Dell XPS 15. The G53JW in fact does support two hard drives, and it actually has some really interesting specs. When the Sandy Bridge refresh of that unit comes around, I'll be sure to hound ASUS about getting a review sample. We've looked at G73 twice, but no G53 yet.

Hi Jarred, I suppose you are busying yourself with the new top-notch toy named Sandy Bridge. However, I stumbled upon this yesterday:

http://forum.notebookreview.com/asus-reviews-owner...

Member "mzil" of the forum suspects the brightness level was not maxed during the test, for the reason he explained there. Perhaps you could do a short check to see if this is so (and thus the display might be better than recently reviewed)?

Let us know how things turn out, won't you? =) Thank you. (Gotta read the SNB review ASAP; thanks for this one as well)

The only B&O part is the amplifier for the speakers (B&O ICEpower technology). This means the laptop has an efficient and powerful digital amplifier, but it tells you nothing about the quality of the speakers themselves or the audio codec delivering the sound to the amplifier.