The current console cycle (regrettable dictator of modern game graphics – hey, can we rebel using Twitter?) kicked off more than five years ago, and, well, it's seen better days. Our PCs, meanwhile, are like hulking pro athletes running around in the little leagues, easily rendering everything at ultra high spec while yawning and disinterestedly flipping through an old magazine. There's simply no sport in it. Fortunately, id Software's resident mega-brain John Carmack has our backs. At this point, though, even he may have hit a wall.

“I have done a fair amount of research work this year to help clarify our next generation directions, but so far they have mostly been negative results – I know we won’t be rendering with a triangle intersection ray tracer on the next gen, for instance,” he said on the Bethesda Blog.

“I have a couple more research projects to undertake in the coming year, but the technical work I am most excited about doesn’t have anything to do with graphics, but instead with the data management and work flow through the development process.”

He did, however, add that today's games look “incredible.” Even so, we think we'll manage to contain ourselves over the data management and work flow solutions. It'll be tough, but we'll find a way.

All told, we're of two minds about this graphical standstill. On one hand, it's going to become much tougher for publishers to solely sell games based on “OMG skin textures so detailed that you can see the pores from space!” More creativity is never a bad thing. On the other hand, we like shiny things. It seems, then, that we have reached an impasse.

Comments

I find myself agreeing with Carmack - videogame graphics are approaching a plateau (but not quite yet, IMO). However, as others have said, there is a lot more to an immersive gaming experience than just graphics. I think it's a good thing that the industry may have to raise the bar on other aspects such as gameplay.

Hmm, at this rate hardware manufacturers are going to have to start making sure their components will last more than a few years! (I've not had a problem, but the truth is, these things aren't built to last.)

Games need more realism not in regards to visual beauty but to real-world physics. If I hide behind a sheet-metal barricade and it gets hit with a few rockets, my sheet-metal barricade should disappear. If I plow a vehicle into a tree, that tree should bend over, or break, or snap, etc., not stand there like a Mighty 3" Sapling That Stops Tanks.

That is what games need. More realism in the environment and how it reacts to me.

Will this cause havoc with level design? Yep. But that is the next level that will help force smarter AIs (@RUSENSITIVESWEETNESS), which will help make level design easier. When I can just blow a hole in the side of a house rather than run around the side to a door, and the bad guy (or good guy, hehe) has to adjust his tactics to stop me, then we'll be blown away by our immersion in the game.

I remember when I first played GTA4, and I was pretty damn impressed with how real the whole thing seemed. It wasn't the graphics so much as the effort that went into the game world. Still, it wasn't perfect. Sometimes the game was even hurt by its realism. In normal games I probably wouldn't even notice that all the NPCs have the same basic look, or that the textures on the garbage dumpsters were all the same. But in such an (otherwise) realistic world it was kind of jarring.

Yeah, it's true. Game design is lacking (Doom 3 comes to mind). You have to give the man credit for one thing, though: he builds rock-solid game engines. Some of the most popular, best-selling FPS games use engines built by John Carmack. Call of Duty, for example, uses a heavily modified id Tech 3 engine, otherwise known as the Quake III engine.

Personally, I'm interested to see how far they can push DX9 with "Rage".

I think he's trying to reinvent the wheel with a new special effect. Personally, I think he should just use the same great HDR that we're seeing in some games today but greatly increase the number of polygons in objects and the detail of textures. We have the GPU memory for larger textures, and DX11 already helps speed up polygon drawing. Plus, I would imagine that to scale the game back to consoles you'd just compress the textures more and use lower-polygon models.

@Marsel: I totally agree that this console cycle is holding developers back. Sure, good gameplay is the number one priority, but the market feels stagnant without graphical updates.

I never understood why console manufacturers don't make their units more like PCs; in a word -- upgradeable. Imagine an Xbox 360 that had a removable (and replaceable) video or CPU card. You wouldn't have this perpetual cycle of the console being outdated 12-18 months after it's been released yet 6-8 years away from the next console cycle.

As long as the swappable parts are fully compatible with each other (as most PC parts from the same manufacturer are), I don't see how it would be such a big deal. Games would scale their graphics based upon the available hardware (just like a PC). Parts would be standardized and proprietary to minimize QA. And you'd sell a lot more hardware. I really can't think of a single downside.

The thing about consoles and their lack of upgradeability is the fact the devs will know EXACTLY what their entire user-base is capable of at a bare minimum. Launch Day or 5 years later, there's no worry about, "How many users will be able to play our game?" or, "What should be set as the minimum system requirements?"

Console manufacturers already toyed with "upgrades" in the form of add-on hardware. Sega and the Genesis are an example: the core system, the 32X, and the Sega CD. All those did was fracture the user base and disappoint the early adopters. (I believe the 32X only had a handful of games released for it.)

There's also the fact that consoles tend to target the less tech-savvy user base. Keeping things as stupidly simple as possible is a major factor. Take the PS3: you can upgrade the HDD in it if you want, but I think many people don't bother (for various reasons).

Granted, these examples aren't "core" components like the CPU and GPU you mentioned... But that just fractures the user base if the upgrades aren't universal. (Sega did have the RAM cart for the Saturn. The games that REQUIRED it probably didn't sell as well as they could have otherwise. On the other hand, some may not have been technically possible without it.)

I'm a console and PC player, and both platforms have their merits. If I wanted to deal with upgrading a console, I'd just stick to PC gaming instead. There's a lot to be said for being able to just pick up a game, take it home, pop it in the console and play it instead of thinking, "Does my system meet the requirements, and how well will it play if it does?"

Oh, lastly: TCO (Total Cost of Ownership)

Part of the draw for consoles is the relatively cheap one-time cost for the hardware. Pay a few hundred bucks now and the system will be supported for 5-10 years, with the only real additional costs being the games you buy. PC gaming by comparison can be pretty expensive over the same period of time. (Especially for those in the high-end market or those who wish to always be able to play the latest games as soon as they are available.) Most consumers, I believe, just want to buy once and not have to think about it again, nor continually spend money to stay current. You'll always have the early adopters and such, but that's when market fragmentation starts to creep in, because they don't make up the entire market.

I understand what you're saying, but you missed my point entirely. The key would be to make upgrades that were completely compatible and would NOT fragment the player base. And developers would know exactly what the minimum specs would be, because it would be whatever the lowest combination of available components is.

No one would be forced to ever upgrade if they chose not to, because ideally all games would play no matter what. Game developers would then be able to add new visual and processing effects that would be enabled for those using better components, but which would be disabled for those using base parts. So base users still get an experience identical to other games they play, while those with better parts get a better experience.

Think of it like this: I can play WoW on a single-core P4 with a GeForce 6800 GT. I can also play WoW on a Core i7 with a GeForce GTX 580. The game code is exactly the same. The difference is that on the P4 you have to turn off all the fancy visuals. But it's still the same game.
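A rough sketch of that idea (the tier names and settings here are made up, just to illustrate the pattern): the game ships one code path, and only the effects toggled on depend on the detected hardware, with anything unknown falling back to the base profile so every unit can always run the game.

```python
# Hypothetical capability-based settings: same game code everywhere,
# visuals scaled by whatever hardware tier is detected.

HARDWARE_TIERS = {
    # Illustrative profiles, not real console specs.
    "base":         {"shadows": False, "hdr": False, "texture_detail": "low"},
    "upgraded_gpu": {"shadows": True,  "hdr": True,  "texture_detail": "high"},
}

def settings_for(detected_tier: str) -> dict:
    # Unknown or future hardware falls back to the base profile,
    # so no player is ever locked out of the game.
    return HARDWARE_TIERS.get(detected_tier, HARDWARE_TIERS["base"])

print(settings_for("base")["hdr"])               # False
print(settings_for("upgraded_gpu")["hdr"])       # True
print(settings_for("mystery_add_on") == HARDWARE_TIERS["base"])  # True
```

The point of the fallback is exactly the WoW example above: the P4 and the i7 run the same code, and only the toggles differ.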

One point you have missed: you would need different drivers for different upgrades within a single console unit. Do you have any idea how many problems this would cause the PS3 or Xbox, for example? Look at what it has done to the PC gaming market. Do a Google search on driver problems for the GTX 460 alone. LOL

Simply put, you cannot make a console upgradeable like a PC. The day you do that is the day consoles lose the only thing they have going for them--ease of use...

Nearly the entire PC platform is compatible as you describe, yet it's fragmented. Though granted, there isn't a single manufacturer of the "platform" to ensure baseline specs.

But what about future revisions, then? Current business models completely refresh the baseline en masse, and there's no guarantee older software will run on the newer consoles (though as of late, console manufacturers have been expanding on backward compatibility as well). You'll have all this extra hardware that most consumers don't want to contend with. (There's also the question of what to do with the old parts you replaced.) I think for a console company the logistics would be a nightmare, since they, as the console designer, would have to keep track of the market and the hardware distribution.

Also, what's to honestly stop a developer from basically crippling a game if you have low-level HW? Sure, you can say the console manufacturer would have control over what gets published on their platform so everything can run at the lowest specs... But then you'd have to ask whether it's worthwhile for the dev to spend the extra time coding the game to run optimally at different HW spec profiles.

BTW, for some games, turning down the visual quality can in fact shift your gameplay strategy, especially in MP situations where framerates and view distance matter quite a bit.

Still, it's mainly a cost thing in the long run, both for the console company (support) and the consumer (investment). Your upgradeable platform already exists: it's the PC.

I think I'd prefer better gameplay first over better graphics. I still prefer some older (90's) games simply because they play better. A crappy game is still a crappy game no matter how pretty it looks.

I don't think it is, although if a game has one and not the other, I don't think that automatically rules it out. I personally really like Minecraft BECAUSE the graphics capability required is low but the gameplay is fun. Why is this, you may ask? Well, when I don't have access to my desktop or just want to mess around for a little while, it's easy to pop on via my graphically irrelevant Dell Latitude (school provided). By the same token, I play Crysis occasionally on my desktop to show off the graphics, not for the gameplay. So, I guess ideally you would have both, but one or the other can be okay to me.