It should also be noted that graphics are one of the few things people can universally agree on/measure. How good a plot or gameplay system is depends on who you ask.

If a game has 512x512 textures, it has 512x512 textures. If a game has low FPS readouts, it has low FPS readouts. These are things you can measure and/or reach a consensus on fairly quickly and easily.

Does the plotline of Mass Effect 2 suck and is it not engaging for repeat gameplay? Depends on who you ask.

This is a good point, I agree with you here.

I just think graphics shouldn't really progress much beyond where they currently are, I guess; developers maybe need to focus on extending content instead. I remember getting over 200 hours out of Morrowind, but I expect maybe 30-40 hours out of Skyrim (although it will be amazing) - meaning the gameplay length may suffer to make room for high-res textures and recorded dialogue.

> It should also be noted that graphics are one of the few things people can universally agree on/measure. How good a plot or gameplay system is depends on who you ask.
>
> If a game has 512x512 textures, it has 512x512 textures. If a game has low FPS readouts, it has low FPS readouts. These are things you can measure and/or reach a consensus on fairly quickly and easily.
>
> Does the plotline of Mass Effect 2 suck and is it not engaging for repeat gameplay? Depends on who you ask.

This should be true, but it's not. My dumbass friend thinks that Black Ops looks great. The OP thinks that Oblivion looks great, FFS. Some people just don't have an eye for good graphics. There is an entire army of PC gamers who will tell you with a straight face that Crysis 2 looks as good as or BETTER than Crysis. <-- I couldn't make that up

512x512 vs. 1080p is one thing, but being able to discern the finer points of graphical fidelity is not a universal trait among gamers.

Because how else can you justify spending copious amounts of money on a brand new custom PC, if not to show off to your friends you can play all games on extra high super max enthusiast ultra settings?

1) Graphics are easy to see (obviously) and can be measured to a degree (check skripka's frequently quoted post).

2) Gamers buy new video cards / new consoles purely for the sake of more graphical horsepower. Games that do not make efficient and extensive use of said hardware give people a sense that they are wasting the power of their hardware, and their money.

3) Game studios have HUGE budgets and tons of staff members. We expect them to be able to produce something "up to standard".

4) Until we actually play the game, the only consistent impression we have of it is its graphics. A good first impression is important. (Yes, some games give out a lot of details during development, or it might be a game your friends always talk about, but this is the exception, not the norm, for most games. You can take a look at a screeny of ANY game.)

5) Games with great graphics AND gameplay AND storyline do exist. Gamers know this. Therefore, if a game has poor graphics, it is already one step behind something that already exists. We want things to keep getting better, not worse.

Minecraft is one of the most involving games I've ever played because of its gameplay. The visuals aren't amazingly detailed without a texture mod, but they're still immensely satisfying and minimalistic. I've installed the LB realism mod, though, and with it things look utterly stunning.

> This should be true, but it's not. My dumbass friend thinks that Black Ops looks great. The OP thinks that Oblivion looks great, FFS. Some people just don't have an eye for good graphics. There is an entire army of PC gamers who will tell you with a straight face that Crysis 2 looks as good as or BETTER than Crysis. <-- I couldn't make that up

> 512x512 vs. 1080p is one thing, but being able to discern the finer points of graphical fidelity is not a universal trait among gamers.

I was thinking more amongst reviewers since they are the ones who cook up FPS comparisons, and those end up being used to bench hardware.

It also depends on what eyes people are using.

Half-Life and HL2 looked great back in their day...as did TES. By post-Crysis 1 standards they look like crap...OTOH they had, or have, the gameplay and plot to make up for it. They were also fairly original back in their day. Heck, SW: Knights of the Old Republic 1 has awful graphics, and I still find it engaging enough to pull out and replay.

Nowadays the plot lines for most games have gotten terribly formulaic and predictable. Controls have gotten dumbed down and no longer use a 104-key keyboard like they used to. Publishers focus on making something consoles can run rather than something to show off PC hardware. Publishers prefer sequels to original content. Publishers also don't want to spend the money on a 30+ hour plotline in a AAA title and instead want to nickel-and-dime you for extra hats or weapon skins or armors...heck, they've gone to just reusing the same meshes over and over with a different texture file (I'm looking at YOU, BioWare).

I think there's a key difference here - realism vs. style. I still play Team Fortress 2 almost every day and still think the graphics are pretty good/fun, even though by most FPS standards nowadays they're not very good at all. Team Fortress 2 has style, and it's damn fun. Same thing with Minecraft: it has a very distinctive 8-bit feel and will take a very, very long time to 'feel' outdated graphics-wise. Contrast these games with any realism-focused shooter from even a couple of years ago: post-Crysis, it looks like crap.
I wish more games were released based on style not realism.