Because every conceivable medium has adopted 1080p as its native resolution, including the consoles that will be with us for the next decade, not the ones from last decade. More than 50% of American households have a 1080p display. While it's certainly overkill, even tablets and phones are headed in that direction.

720p will go away because the ~4.5'' at 1 ft criterion (or what Apple would have you believe is Retina acuity) does not scale up to bigger screens and longer viewing distances. 1080p, however, is close enough, and it's an adopted standard.
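For what it's worth, here's a rough back-of-envelope of why the phone-style "Retina" criterion doesn't carry over to big screens. The 1 arcminute-per-pixel rule of thumb and the 12 in / 8 ft viewing distances are my own illustrative assumptions, not anything from the post:

import math

def required_ppi(distance_in, arcmin_per_pixel=1.0):
    # Pixel density needed so one pixel spans `arcmin_per_pixel` arcminutes
    # at the given viewing distance (in inches). 1 arcmin/pixel is the usual
    # "Retina" rule of thumb; both figures are assumptions, not the post's.
    pixel_pitch = distance_in * math.tan(math.radians(arcmin_per_pixel / 60.0))
    return 1.0 / pixel_pitch

print(required_ppi(12))   # phone at ~1 ft: needs ~286 ppi
print(required_ppi(96))   # TV at ~8 ft: needs only ~36 ppi

A 50'' 1080p panel is roughly 44 ppi, which already clears the ~36 ppi needed at couch distance, so the phone-level density simply isn't required there.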

Just like the 16:9 aspect ratio itself, 1080p is a bastardized compromise that will be with us for a long time.


Well, if you want a 4K display you can get one for $8,000, can't you? Either IBM or HP sells those.

Some people are just not as keen when it comes to framerate; 30-40 fps can pass as acceptable to them. Others value accuracy in these kinds of games, especially when things go insane on screen, hence the demand for a constant 60+ fps.
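Put another way, the per-frame budget shrinks fast as the target framerate rises, so a single slow frame is much more noticeable at 60 fps than at 30. A quick sketch, using nothing more than the standard 1000/fps conversion:

def frame_budget_ms(fps):
    # Time available to render one frame, in milliseconds.
    return 1000.0 / fps

for fps in (30, 40, 60, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms: half the time to render each frame.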

I sometimes think there's more to it than just the drivers and the hardware. I got about 40 fps in BF3 as a low, but nothing noticeably below that. Likewise in GW2: no dips below 60-70 fps (and these were on a single card at 2560x1440).

I use a G19 with its display monitoring fps and temps, so I keep an eye on both.

The one game I had huge dips with was Skyrim. I'm sure it was worse on my GTX 580 (I had to alt-tab out of the game, or maybe restart it, to fix it). I think the coding of certain games is just dire, and performance is affected by more than just 'shoddy' drivers. Sometimes a poorly coded game can't be addressed 100% by driver fixes.