Posted
by Soulskill
on Sunday March 28, 2010 @10:25AM
from the pc-game-makers-are-holding-back-pc-gaming dept.

An anonymous reader writes "Despite all the excitement over Nvidia's upcoming Fermi GPU, there is still a distinct lack of DirectX 11 games on the market. This article points out that while the PC has returned to favor as a gaming platform, consoles are still the target for most developers, and still provide the major limitations on the technological sophistication of game graphics. Inside the Xbox 360 sits an ATI Xenos GPU, a DirectX 9c-based chip that bears similarity to the Radeon X1900 series of graphics cards (cards whose age means that they aren't even officially supported in Windows 7). Therein lies the rub. With the majority of PC games now starting life as console titles, games are still targeted at five-year-old DirectX 9 hardware."

All it would take to "revive PC gaming" is to have a few truly great games get released. Even the best of last year were little more than machinima.

Not only does console gaming make computer games worse, but as the horrible control mechanisms of Mass Effect 2 and others showed, using console gaming as a starting point can kill an entire franchise. I was a big fan of Mass Effect, but found ME2 such a mess that I won't spend any money if and when ME3 comes out.

PC gaming IS revived.... It's just not what traditional gamers want.... the big games are things like Plants vs. Zombies, Diner Dash, Farm Town.... things specifically excluded from console (and Apple) platforms. I'd venture the number of hours spent on Facebook or MySpace games dwarfs Xbox Live, and the demographic spread is much wider.

Yes, this is the issue. The lack of cross-platform PC gaming is holding things back a ton. However, it's almost the same problem as getting games released on Wii/PS3/Xbox 360 simultaneously.

The difference, and why PC gaming will win in the long run? It's easier to program a game in OpenGL that runs on all platforms than it is to program for Wii/PS3/Xbox 360, where you have three entirely separate sets of hardware and development requirements.

"The difference, and why the PC gaming will win in the long run? It's easier to just program a game in openGL that runs on all platforms than it is to program for wii/ps3/xbox360 where you have 3 entirely separate hardware and development requirements."

No it's not, because you have to cater to different levels of processing power, memory and disk storage, and you still have to optimise for the different iterations of graphics cards from each vendor.

No one has mentioned linux until you showed up. Why is it that the anti-linux crowd shows up the minute anyone mentions openGL or cross platform? I suggest you go back and re-read the GPs post which is directed at wii/ps3/xbox360.

Even so, your post makes no sense. You list everyone being on Windows and OSX yet fail to realise that OSX uses openGL. What on earth was your point with this off-topic rant?

I got some news for you, pal. The GPU in the 360 is 100% capable of OpenGL. Should you choose to write a game engine that doesn't use MS's crap, you're free to do so. Sure you'll probably have to jump through some hoops to get it to work but all the functions and calls are still in the hardware.

For example, Darkest of Days uses 8monkey Labs' own proprietary Marmoset engine, which in turn uses OpenGL, PhysX, SpeedTree and OpenAL.

I swear, just because it has Microsoft on the package doesn't mean you're stuck with DX.

Bullshit, there is no OpenGL API on the Xbox 360, and there's really no such thing as "OpenGL-capable hardware" - it's all about software (drivers and libraries), and on the 360 that can only be implemented by Microsoft - not you.

The closest you could get is to write an OpenGL wrapper over Direct3D, but it's much easier to simply have multiple rendering backends for your engine - which is most likely what Darkest of Days does.

Right, because PC game developers have to write a separate version of their engine for each combination of CPU and RAM quantity that they want to support.

Wow, that's an incredibly stupid comment. There's a big difference between differences in the amount of RAM and completely different architectures. The consoles have ridiculously little common ground amongst themselves or with the PC. Of course you rewrite big chunks of your engine for each of them. You can tell when someone hasn't rewritten their engine for the target platform because it invariably becomes yet another shitty port!

For example:

The PC has a bunch of homogeneous cores. You set up some threads. You allocate some memory. You run your algorithms, being mindful of synchronization and cache misses. The OS schedules things relatively sanely (you hope).

The XBOX 360 has six hardware threads split across three cores. Each thread has identical capabilities, including decent math performance. The paired threads can stomp each other's caches very easily if you don't have compatible tasks running on them. Cache misses are incredibly expensive. The pipeline also penalizes branches very heavily, so you'll need to do things that might be slower on PC to avoid if statements.

The PS3 has one core without much math horsepower. You also have seven coprocessors that run a specialized instruction set and can perform ridiculous number-crunching feats, but they can only work out of their own dedicated little bit of memory. The main core's job is to DMA math-heavy tasks into them, wait for them to finish, and DMA the results back while running high-level logic. Write your own synchronization code. Also, you get to restructure all of your physics (and maybe AI) data so that you can very efficiently batch it into little chunks of math-heavy work.

The Wii has ridiculously fast RAM. It's just silly fast. Cache misses are not a concern. Cramming everything into the limited amount of RAM you have, however, is. This affects the core structure of almost every compute-heavy subsystem.

Graphics: just as varied. IO: varied, again. Controls: also very different...

If OS X is a unix operating system, does that mean Mac games will work on Linux too?

No. There isn't generally binary compatibility among Unices. The Mac also only uses Unix underpinnings; it has plenty of other APIs that a game may be using - e.g. Cocoa and OpenCL. That said, porting to e.g. Linux after porting to Mac is a lot easier, as you'll already have had to port away from DirectX.

Except... Warcraft is 100% OpenGL. And a lot of game makers (as the article and others note) are not interested in DirectX anymore because the Mac is gaining in popularity, consoles are gaining in popularity, the iPod/iPhone/iPad are gaining in popularity, etc. They use something portable so they don't have to code twice and can save money on coding costs.

That's why they originally went with DirectX - there was no other platform. Now there are... and game makers want something to cut their costs and CODE ONCE, USE EVERYWHERE!

"WoW for example is nowhere near as fast on Windows with OpenGL as with DirectX."

What, are you still running a single-core processor? D3D is partly routed through the OS, so it's ALWAYS been slower if you've had a crappy CPU, regardless of how well your GPU ran, because OpenGL made direct calls to the hardware without the OS interfering, while D3D kept some work on the CPU.

It's been like that since the first Unreal, and hasn't changed in ANY benchmark where I can test OGL vs D3D.

Saying OpenGL allows direct development to multiple platforms including mobiles doesn't make much sense because in pretty much every case you would need to do the rendering engine again, and in most cases also change the gameplay completely. Mobile phones don't scale up to same performance as consoles or PC.

If PC gamers understand the technical difference, then they know DirectX is technically superior. But those who understand and care about the philosophical difference are probably about equal in number.

I'm a game developer. The only good point you make is that using OpenGL makes Mac ports a bit cheaper. The rest of your rant is bullshit, and if you're actually a gamedev (which I doubt) you should know better than to make such silly claims. There's a hell of a lot more to porting to a new platform than porting the graphics subsystem (and porting between DX and GL is trivial compared to what you have to do to squeeze stuff like physics onto console architectures).

We all want to use OpenGL because it's a nicer API than Direct3D

Hah! Bullshit. OpenGL might become a nicer API if Khronos ever gets their heads out of their asses and stops pandering to the CAD crowd. Until then it's an annoying mass of gotchas. Seriously, the backwards compatibility provisions in OpenGL make every Windows release look like a clean break from the prior version.

we can develop for it on our Macs, and our games will support just about every modern gaming platform imaginable (because we aren't tied to Microsoft's platforms).

Mac I'll grant you. What are these other modern gaming platforms? Seriously, what are they? Linux? Unless you mean all the mobile devices using OpenGL ES, but you need to rewrite significant portions of your engine and redo almost all of your art to get a reasonable experience on those, and a DX -> GL ES port is trivial when you're already doing all of that.

DirectX 11 doesn't support Macs, it doesn't support the PS2 or the PS3, it doesn't support the Wii, and it doesn't support most mobile devices.

Again, I'll grant you the Mac. What the fuck are you smoking as far as the rest goes? OpenGL doesn't magically give you free (or even meaningfully cheaper) ports to any of those platforms either:

PS2: No OpenGL here. Just a DMA controller and some hardware registers. The entire create/bind/release metaphor that both GL and DX are based around does not exist. The shading unit can't even express all of the common blend modes, and you have to do ridiculous gymnastics to fit textures into the tiny amount of video RAM you get. You should know this if you've ever worked with a PS2.

PS3: You're an idiot if you're using the GL library directly on the PS3. There's a reason Sony gives direct access to the hardware - if you care about performance you won't be using the wrapper libraries. But again, you'll be rewriting a bunch of your engine to get AI, physics, and other stuff running on the SPUs anyway and a graphics port from either DX or GL is fucking trivial next to that.

XBOX and XBOX 360: DirectX-ish API, so OpenGL gets you nothing here. Even if you start with a DX game you're still porting a bunch of code if you did anything worth mention since there are still fairly significant architectural differences between it and PC. About all you get out of the similarity is a good idea of what entry points will likely be named.

GameCube/Wii: Calling what those platforms expose "OpenGL" is just silly. The structural similarities between the libraries you get and OpenGL are trivial when compared with the mountains of restrictions, special cases, and other odd differences you'll be dealing with. And again, you're going to be rewriting a bunch of your engine to the execution environment so a 5% more direct graphics port saves you fuck all once you tack on the art changes and another QA cycle.

Mobile devices: we already covered the mobile devices. Have you actually worked on one? You should know better than to imply that you get magic free porting to them if you just use OpenGL. There's a hell of a lot more to a usable mobile port than flipping some defines and recompiling with GCC.

Seriously, the starting graphics API is fucking irrelevant to any serious porting effort. GL and DX have near identical capabilities, identical object lifetime management, trivially mappable entry points and trivially mappable state bits, and near identical performance and synchronization behaviors. Porting between the two is trivial compared all the other work a proper port requires.

DirectX 11 in this context does not mean 'version 11 of the Microsoft API for game programming' it means 'graphics cards with geometry and compute shaders and hardware tessellation support'. Whether these are programmed using DirectX 11 or OpenGL 4 (with OpenCL) is largely irrelevant. If you use these features effectively, you need quite a different design to the older model where you just had vertex and pixel shaders. If you target the older functionality, supported by consoles, then your game will work fine on newer hardware. If you target the newer hardware then a console port will involve a significant amount of rewriting.

The Voodoo card did texturing in hardware and the GeForce did transform and lighting as well, but these were just accelerating parts of the fixed-function pipeline. You used the same programming model with and without this acceleration, it just made your code run faster. With pixel and vertex shaders, you had two separate code paths, one for the fixed-function pipeline and one with shaders. This was a bit more effort, but you were mainly using the shaders to do the same thing as the fixed-function pipeline, just with a few more special effects. With geometry and compute shaders, you can generate a lot of your data on the GPU. Writing fall-back code basically means either writing the engine twice or not using the new hardware to anything close to its full potential.

Not quite. The big change was between DirectX 9 and 10. If you use OpenGL, you can access the new rendering model on Windows XP which, according to Steam, gives you 80% of the PC gamer market (or, at least, the subset of the market that is willing to put up with DRM'd crap). DirectX 11 is a relatively small change from 10. You can do some extra stuff with it, but it's much easier to write code that uses 11 and falls back to 10 than it is to write code that uses 11 and falls back to 9.

If "no one really wants to use it", why are AMD (ATI) and nVidia implementing it then in their hardware? In the past those companies have been succesful, because they have made products people want.Yes it's true, we have to wait for the games that can use it, but so what? It's the same as with DX9 and DX10 before.

Besides, you need a lot more than just OpenGL, because you need input and sound, right?

This used to be much more true when DirectX wasn't artificially limited to the OS version. You just downloaded your new DirectX version and went to town when you bought a new video card.

DirectX 10 became a "Vista exclusive", despite the fact that unofficial ports made it work on Windows XP without much muss or fuss. It was an artificial limitation. So, in order to upgrade from DirectX 9 to DirectX 10, you had to buy a new video card and a new OS. Some Microsoft games even artificially limited detail to make the game seem better on 10 than on 9. Of course, a few clever hackers exposed this as well. DirectX 11 is an update to DirectX 10 and similarly incompatible with Windows XP.

This bs has left a bad taste in a lot of people's mouths. Couple that with the absolutely absurd Digital Restrictions Management in some PC games and the taste is downright sour. (Related note: Honestly, if you knowingly buy an Ubisoft game at this point, you're an idiot... their games are basically useless because of DRM now.)

While you may say "please step away from your Slashdot reality distortion field" in relation to DRM, Ubisoft's piss-poor DRM implementation has made a lot of people swear off their games on PC. Assassin's Creed 2 much? All the major game sites covered when Ubisoft's DRM server went down and no one could play it [joystiq.com]. So that shiny Ubisoft game you bought for your PC will only work when your internet connection is up and Ubisoft's DRM servers are reachable... even though you're not playing the game online.

You'll have trouble measuring a real performance difference between OpenGL and Direct3D (which isn't surprising since both APIs are simply ways to queue up commands in buffers for the graphics card to execute)

Since Direct3D 9.0, both OpenGL and Direct3D are very equivalent in terms of features and ease of use. Neither is "more suited" to either games or serious use.

For long term projects OpenGL has been much more suited to "industrial" apps simply because it's a lot more stable. If you'd started a project ten years ago using Direct3D you'd have had to rewrite the graphics code three or four times by now. With OpenGL the ten-year-old code would still compile/run, no problem. This long-term stability has a downside in that OpenGL has a lot of accumulated cruft - functions which serve no real purpose these days or have better alternatives.

OpenGL ES is a cleaned-up, modern OpenGL which would be perfect for games but for some reason it's never really been pushed on desktop machines (which is a pity IMHO).

Direct3D is a teeny bit lower level when it comes to things like memory management (e.g. for fine control over where geometry/texture data goes) whereas OpenGL just says "leave it to the driver". This gives Direct3D a slight advantage for games.

The main reason Direct3D is used for games though is because Microsoft spends lots of money wining and dining the CEOs of games companies and making pretty presentations to the developers.

"Directx 8 - 11 have been very solid and offers features that OpenGL could not accomplish."

What are you talking about? The beauty of OpenGL is you don't HAVE to wait for new features, you simply program them in. You could do hardware tessellation RIGHT NOW WITHOUT DX11 by simply implementing an OpenGL call.

That's still where the majority of PC gamers can handle things well, too. (Their hardware may be newer than the consoles', but DX9 is still the majority support, and they have higher resolutions to cover.) The real question is whether the developer is even INTERESTED in targeting higher-performance hardware with unique features, or whether they mainly want to use it to be "slightly shinier" and hit better framerates.

Why are modern games being judged based on their technological prowess? How is this holding back PC games? Games produced for five-year-old tech still run on modern machines. So what if games are targeted towards years-old technology? Are they fun? Are people buying them? There's more to a game than shading effects and the hundreds of hours that dedicated teams put into making realistic water ripples.

Games are sold based upon gameplay and fun. In this current market, those are more easily found in the console market. I don't see that changing.//PC Gamer since 1986///Now happily a 100% console gamer////Though I love to play Cave Story

You're somewhat wrong, because games are also sold on their graphics and sound and such. You're probably thinking that great graphics and sound make a bad game, but you can have both. I enjoy some of the old games, but seriously, I'd rather play with awesome graphics and a good sound environment too.

Also, you are missing one important thing. If you free up resources from graphics rendering by using newer technology, you can spend more resources on AI and other gameplay elements.

"Graphics" != "latest hardware". Graphics are important, but to a limited extent. The graphics created on five-year old tech pleases the vast majority of the market. The common gamer does not see a need to move to DX11 when games produced on DX9 are "good enough". I never said that graphics were unimportant, just suggested that continually pushing the graphical envelope is a fruitless journey.

//PC Gamer since 1986///Now happily a 100% console gamer////Though I love to play Cave Story

Your example of Cave Story just illustrated another point: PCs tend to be better for games from smaller studios. Indie games on PCs are commonplace; indie games on Sony and Nintendo consoles need a jailbreak unless some major label notices the developer. See Bob's Game [wikipedia.org] for an example of what Nintendo can put developers through. And the modding tools for PC games tend to be far more complete than for console games. For example, the stage editor in Super Smash Bros. Brawl is limited to just a few predetermined pieces on a grid; there's no way to add custom pieces, custom characters, or a custom soundtrack.

Exactly. About a decade ago I gave up PC gaming because I was sick of buying games only to find out my $299 computer wouldn't play them. I can pop any brand new game in my console and it just works. I find it hard to believe I'm having less fun in MW2 because it's only DX9.

There is no shortage of MMOGs. The category is growing, even, at an insane rate, despite (or because of?) WoW's dominance. There are only 24 hours in a day, and peeps who play MMOGs can never "beat" their game -- they are continuously rewarded for playing, constantly and forever, and pay monthly for the privilege in many cases.

Many no longer have the time or inclination to start a new, one-off PC game. I recall an interview with the creators of supposed "Diablo-killer" Titan Quest, who attributed the poor sales of their well-reviewed game to the fact that their prospective player base could not break away from their MMOGs.

1. They bring in money from monthly fees if P2P, or microtransactions if F2P.
2. As you need to connect to a server with a valid account, they cannot be pirated.

Seeing the low quality of the latest games (they cannot possibly keep a large number of players after a couple of months), it seems to me that the second reason is the real one for the constant production of new MMORPGs (companies are more interested in the "fast money" from boxes sold).

...prefer game consoles. For starters, you're dealing with a uniform hardware platform. The core specs and capabilities don't change too often, only about once every 5 years or so. So if you are developing for the Xbox360, you only have to get it to work on one 360 and it should work on all. On a PC, you're encountering a vast array of hardware configurations. X CPU with Y Motherboard using Z GPU. So while you can optimize for a number of these, you can't do it for all and that leads to a certain percentage of your customer base complaining.

That and pirating console games is a bit tougher for the average consumer. Usually requires a hardware mod chip and most people don't feel they have the technical skill to install one. On the PC, piracy is pretty much fire up bittorrent, go to the piratebay, and download.

I think you're exaggerating the number of configurations developers deal with. Motherboard version is largely irrelevant. X CPU - well, if the software is designed to scale with the number of cores, then the only concern is whether the CPU is fast enough to execute what you need it to. The same goes for single-threaded games, but I would hazard a guess that most AAA games are capable of scaling. As for video cards, it's largely about what feature set a card has, and many of them share the same feature set.

Why should devs adopt DX11? Because the last iteration of DX lasted about a year and a half before being ditched and extended/redone? Because the majority of the market doesn't have DX11 cards? Because there's no clear advantage in developing to DX11 rather than DX9c?

Why should developers shift from something they know to something that they don't know as well unless there was significant profit motive to do so? There simply isn't in this case.

Also, Windows XP does not support DX10 or 11. Microsoft did this thinking it would cause gamers to upgrade when new games needed the new version (in the past, games used the newest DX version available). However, since DX10 was Vista-only, people did not upgrade, so game companies continued to make games that work on XP, since that gives them a larger customer base (a DX9 game runs on XP, Vista and 7) and intentionally restricting your customers is stupid.

Is the rush for performance and graphics killing the fun in video games?

I think so.

This argument has been made since 1992 and before. I should know; I actually made that claim in a USENET post in response to someone at Id software on the original comp.sys.ibm.pc.games board. Of course, that same company then came out with DOOM a few years later. I was wrong then, and you are wrong now. The rush for performance didn't kill the fun in video games.

What has happened, however, is diminishing returns. This used to be manifest in the size of the team and the expense of the game.

Does anyone really think this cycle is any different? We're pretty much at the mid-point of the console cycle: PCs are flexing their muscle (again) and developers are reluctant to design just for PCs. But, as always, more will jump back on the PC bandwagon as it becomes obvious that the PC is the place to be for graphic quality (and the market loves eye candy). Eventually the console makers will decide to release new hardware to try to coax them back, and we'll repeat this cycle again.

Eventually the console makers will decide to release new hardware to try to coax them back, and we'll repeat this cycle again.

Except it appears the next generation of actual console hardware is far off. The new gimmick won't be better graphics but instead "Mii-too" motion control. Sony has the PlayStation Eye and the new Move controller, and Microsoft has Natal. And among the big three, the only console maker that has taken any effort to coax the smallest developers away from PCs is Microsoft with its XNA Creators Club; the others require a dedicated office and prior commercial titles [warioworld.com].

The thing that has changed is that the PC-exclusive game has almost died out. There is of course still WoW and a handful of other titles, but most of the big titles these days are basically console games first, and the PC might get a port later on. This is even true for series that originated on the PC.

Another change is that the hardcore gaming PC game has died out. The last one that really pushed the envelope was Crysis, but that is already over two years old. All other titles take a much more moderate approach.

... instead of focusing all your energies on creating fancy graphics for your latest title, why don't you try something different like making the game actually compelling and fun to play?

I'm not a huge gamer, but my preference is to sit in front of my TV with my Xbox 360 or Wii when playing games. In truth I couldn't give a rat's derrière about the graphics of the games I play so long as I find them compelling and fun. Then again, when your business model is based solely on churning out the same game time after time and you only differentiate the games by their graphics, I suppose this argument becomes reasonable.

Hey game makers, here's a clue: In the last few weeks I have played video games quite a bit due to a knee injury that's meant I can't do much else. If I think seriously about the amount of time I've spent playing video games recently, the one game that really sticks in my mind and has me itching to play it more is Bit Trip Beat on the Wii. Realistically I probably could've run that game on my 25 year old Amiga if I still had it... but damn that game's fun!

I've always been a PC fan all the way back to the original SimCity on my 286. Throughout the years I've also owned Consoles (Nintendo, Gameboy, SNES, N64, Gamecube, GBA, XBox, XBox360, Wii, etc, etc, etc). I've probably owned/built just as many gaming rigs as well.

Obviously I take gaming a little more as a hobby than just a time waster.

The one thing I have loved all this time is multiplayer. It wasn't really possible back on the 286 unless you shared a keyboard, as gaming on PCs was in its infancy. At that point in time it was easier to play multiplayer on one console with a friend.

A few years passed and the internet became a big thing. Quake for example was one of my favorites! Especially CTF online with clans. I even ran my own unsuccessful one but even so, it was a blast! Consoles couldn't touch this kind of fun! 5 on 5, 10 on 10. Just awesome!

Consoles at this time, really couldn't do this at all. XBox + Live just wasn't around yet.

Later on when XBox arrived and I got into the Live! Beta I started to see what multiplayer on consoles is like. Pretty fun! Problem for me here was that FPS games just weren't fun with a controller. I really did (and still do to a certain extent) need a keyboard/mouse combo to be a threat.

So for quite a while, I still preferred to play FPSs on a PC. However, this has changed as of late. Games that I want to play are either coming out without server support and/or mod support (Modern Warfare 2) or are simply outpacing my hardware. Combine those two and frankly, I simply don't want to upgrade my graphics card every year just to play the latest and greatest games. Especially considering that Modern Warfare 2 works just fine on my 360 and I get to play nice multiplayer battles. When it came out, my hardware was just as good as everyone else's. Sure, I have to get used to a controller, but it seems a small price to pay versus making sure my rig can handle the game (plus I run Ubuntu now).

In the end, I'm realizing that gaming on a console is just a _ton_ easier than it is on a PC. They both have the same options and generally roughly the same graphics. The only difference is the controllers.

In my mind, consoles just have the upper hand. Less cost, less hassle (juggling OS's), and the same multiplayer options. It has just become a lot more convenient over the years to play on a console.

That would be more insightful if PC games didn't typically hold back to support the older cards out there. Depending on where you are in the cycle, PC games are more like 2 years, if that, ahead of a console.

I like a lot of the open source games on Ubuntu, but to be honest you could play most of those on a NetBook so a "gaming pc" just isn't required. But I have also enjoyed some of the indie games on the 360 marketplace.

I like a lot of the open source games on Ubuntu, but to be honest you could play most of those on a NetBook

True, open-source games tend to work better on last-gen hardware. But netbooks? A lot of the games in Ubuntu's repository require a screen at least 768px tall, which netbooks tend to lack. For example, the Dell Mini 10 with Ubuntu still has a 1024x600 pixel screen. At least console game developers can rely on standard resolutions like 640x480 (SDTV and EDTV) or 1280x720 (HDTV).

Right now, consoles are behind PC gaming and derided by some as antiquated and holding back progress.

And then, in a year or two, the next generation of consoles will slightly leapfrog the average gaming PC, the death of PC gaming will be predicted, and the new commoditized hardware will sell like crazy.

The sales surge will fund ATI and nVidia's development of the next generation of GPUs, PC gamers will provide an eager market to test the next generation hardware, and the cycle will repeat itself.

The simple answer is that 95% of the PC gaming market** can use DX9 while only 56% can use DX10.

* That 39% for DX9 includes the 22% of people with DX10 hardware running DX9 on Win XP.
** Assuming Steam account holders who allow the HW survey are indicative of the relevant PC gaming market. Personally I'm inclined to assume it's not far off, at least not so far that it matters.

If not being able to use the latest shiny things is holding things back, then I say good. Why should I have to spend 2 grand on the latest and greatest hardware every 6 months just to play the latest fad game, when the computer I bought 2 or 3 years ago still serves perfectly well for everything else? Computers are expensive, and last I checked most of the world is dragging its feet out of a financial crisis. Additionally, we reached the 'good enough' mark a long time ago. Pushing the technical envelope for the sake of pushing has been an exercise of diminishing returns for a while now.

The Nintendo Wii in particular has proven a very important point. Hardware spec wise, it's a pile of crap. Yet it's also a wildly popular platform. Why? Affordability is a significant factor. Also it's because instead of focusing on massive polygon counts and 1600x antialiasing and whatnot other geewhizbang features, they make games that are enjoyable to play.

The Nintendo Wii in particular has proven a very important point. Hardware spec wise, it's a pile of crap. Yet it's also a wildly popular platform. Why? Affordability is a significant factor. Also it's because instead of focusing on massive polygon counts and 1600x antialiasing and whatnot other geewhizbang features, they make games that are enjoyable to play.

Popularity has nothing to do with quality. The Wii is popular because it's a very social platform. People like to have friends over and play things together. They never really care about game quality; most Wii games are a pile of crap (just like the hardware), but they are fun when you're with friends.

Games as a business probably make more sense on the console. But just because most people are happy with crappy games and equally shitty hardware, is that a reason to stop pursuing new frontiers?

Most computers being sold today contain crappy integrated graphics (Intel GMA etc). Only the high end expensive machines tend to come with graphics good enough to play modern 3D games on.

If you want a machine with 3D graphics capabilities, you need to either build one yourself or buy a high-end expensive machine. If you just buy your typical "house brand" PC from stores like Best Buy, Staples, Office Depot etc, you will get crappy graphics.

Whereas, for the price of a typical "gaming" PC, you could likely buy an Xbox 360 or PS3 AND a half-dozen games (if you buy the cheaper titles instead of the latest and greatest, that is).

I have seen many el-cheapo machines from various places where adding a decent PCIe graphics card would be difficult (either due to lack of slots, lack of space, or lack of power/airflow). Dell, for example: I have seen a number of Dell models that would be unable to accept ANY PCIe graphics card at all. All of these models contained Intel integrated graphics.

No. Piracy is holding back PC gaming. PC sales are ridiculously low for most single-player, non-casual PC games. Game publishers are doing the natural thing: focusing on consoles, where the problem of piracy is much, much smaller.

IMHO the industry should be commended that it, unlike some other industries, fights piracy by changing its way of doing business instead of choosing the path of litigation and legislation.

There are much bigger issues than graphics in this "Console/PC" debate. The really big issues are things like user interface and game controls. Take Oblivion, for example: that game's interface was significantly altered to accommodate console play, which made it sub-optimal for the PC: an overly simplistic UI and relatively poor use of screen real estate.

PC gamers expect a lot more from their games: private servers, LAN play, mods, etc.; and as the Modern Warfare 2 debacle showed us, game companies are showing less and less love for the PC. There's tons more money (and less hassle) to be made on the consoles. That's a MUCH bigger hurdle than "console graphics are holding PCs back!"

What's really interesting to me is how MMOGs haven't really made it to the console. I think that's because of the console's revenue model, which really only supports "throwaway" games with a very short life span. You'd think a subscription-style game would have amazing appeal for console game-makers, but where are the games?

Yes, games are being held back by consoles. PC games used to push the edge of the envelope; now they simply follow the consoles. It's getting particularly bad, with many games designed for consoles and then poorly ported to PC. It wouldn't be so bad if the studios would at least make an effort to port them properly. I've come across all of these problems in many games over the past few years:

- Poorly designed menu systems that do not support mice (keyboard/gamepad only)
- Poorly designed keyboard maps that don't follow established PC standards, which leads to the next item
- Inability to remap or customize keyboard controls
- Games which do not support standard PC peripherals (e.g. some PC games only support console gamepads. I don't own an Xbox, so don't force me to buy a damn Xbox gamepad to play your game). Same for driving wheels/pedals.
- Games with severely limited graphics options. These are a must to tailor the game experience to the hardware and performance expectations.
- Games with crippled graphics effects (limited draw distances, low-res textures, artificially limited environments, etc.)
- Games with poor savegame support, or which only support checkpoints
- Games being launched on consoles, with PC ports following very late afterward (sometimes 6-12 months later, or never)
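The remapping complaint in particular is cheap to fix in code. A minimal sketch of what "remappable controls" means (the action names and default bindings here are invented for illustration, not taken from any real game):

```python
# Minimal input-remapping sketch: key presses are translated through a
# user-editable binding table instead of being hard-coded in the game loop.
# Action names and default keys are illustrative assumptions.
DEFAULT_BINDINGS = {
    "move_forward": "w",
    "move_back": "s",
    "use": "e",
    "quicksave": "f5",
}

class InputMap:
    def __init__(self, bindings=None):
        self.bindings = dict(bindings or DEFAULT_BINDINGS)

    def rebind(self, action, key):
        # Letting the player edit this table is all "remappable" means.
        if action not in self.bindings:
            raise KeyError(f"unknown action: {action}")
        self.bindings[action] = key

    def action_for(self, key):
        # Translate a raw key press into a game action, if any.
        for action, bound in self.bindings.items():
            if bound == key:
                return action
        return None

im = InputMap()
im.rebind("use", "f")  # e.g. a left-handed player moves 'use' to F
assert im.action_for("f") == "use"
assert im.action_for("e") is None
```

One extra level of indirection; the ports that skip it are skipping very little work.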

I could go on and on. Literally, there are very few games I've purchased in the last 5 years which do not have at least one or two of the above problems, with a few managing to tick nearly all of the above. I blame the cross-platform game development environments which basically force the game design onto consoles with PC's being treated as second class citizens. It's not likely to change either, as consoles are very popular and many game studios see them as a more profitable market.

I don't hate consoles; they are fine for what they do, and I happen to own 2 (Wii and PS2), but the games I play on consoles are vastly different than the games I play on PC. I want my PC games to push the envelope of technology; sadly this seems to be against the trend.

Desktops are obsolete and laptops aren't powerful enough to run the games. That keeps me from PC gaming. There's no way I can be bothered or justify the expense of setting up a desktop just for gaming, and I already have a laptop for everything else.

The only way it could work is if my HTPC became a gaming PC too. However that would interfere with its HTPC duties, it would require a more powerful box and hence no 10W idling (and possibly even be noisy, ouch!) and I'd be playing on the TV which negates most of the advantages of PC gaming in the first place.

It's not consoles that're holding games back. It's Windows 7. All the hard-core gamers I know are still running XP on their gaming rigs because of the hit they get to frame rates running Windows 7. These are the people who care about a 5fps difference even when they're getting over 60fps. The game companies know these people are their core audience, and if they put out a game these people can't run on their rigs that game won't sell well. Those rigs run XP, XP won't do higher than DirectX 9c, so the game companies target DX9c. It'll run on the hard-core gamers' rigs, it'll run on the average consumer's Windows 7 machine, so there's no sense in supporting DX10 or DX11. The only games I've seen that require DX10 or DX11 come from Microsoft itself.

That's strange. My comment was posted anonymously... must've misclicked.
Oh well. One more point is that you can see Microsoft and Sony's desire to extend the lifespan of their current consoles with their willingness to release game-changing hardware peripherals like the Move and Natal. Last generation you definitely wouldn't have seen them focus so much energy on new control methods in the middle of the console generation. They would have instead waited for the next line of consoles.

That all sounds completely backwards. Console game developers don't have ballooning budgets and team requirements because they're on a console. Those are attributed to the blockbuster games, on PC and console alike. Additionally, developers shouldn't be learning whole new systems on a continual basis. This is what makes bad games and delays advancement. Once a developer has the code for a system perfected, they can turn their attention to focusing on the gameplay itself. Console games allow developers to o

Console game developers don't have ballooning budgets and team requirements because they're on a console.

Sony and Nintendo appear to require a minimum business size for console game developers. A micro-ISV [wikipedia.org] that tries to meet these will in fact experience these "ballooning budgets and team requirements". Perhaps ironically, the company that Slashdot users associate with closed source is also the most open console maker, with the XNA Creators Club.

Market size: With a few exceptions (WoW etc) console gaming earns a lot more money. Not just because console games usually cost 50% more than a PC game.

For console multiplayer against visiting friends, you usually need one console, one large monitor and one copy of the game. But for PC multiplayer against visiting friends, you usually need a whole LAN of PCs because most major-label PC games don't have a mode for gamepads and split screen. So you have a $60 console game vs. two to four copies of a $40 PC game.
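The arithmetic behind that comparison, using only the prices quoted above:

```python
# Prices from the comment above: one $60 console copy serves all
# split-screen players; PC LAN play needs one $40 copy per machine.
def pc_lan_cost(players, price_per_copy=40):
    # Hardware costs are ignored on both sides of the comparison.
    return price_per_copy * players

console_copy = 60
assert pc_lan_cost(2) == 80   # already more than the single console copy
assert pc_lan_cost(4) == 160  # well over double for a four-player night
```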

Yes, but for a lot of games, split-screen sucks. Not only do you have only a portion of the screen, but your friends are probably cheating by looking at your screen to see where you are. There are games which are reasonable to play with local multiplayer, but for most, I'd just as soon not play at all... so I wouldn't really call that an "advantage" for consoles.

your friends are probably cheating by looking at your screen to see where you are.

It's not cheating to see where your teammates are in almost any game. (See Gauntlet or Secret of Mana.) Nor is it cheating to see where your opponents are if all players are in an arena. (See Bomberman, any WWE game, or Smash Bros.)

There are games which are reasonable to play with local multiplayer

So with the rise of TVs with PC-compatible VGA and HDMI inputs, why aren't these games ported to PC?

You only described games that don't have tactical elements where the opposing player shouldn't know everything: Civilization, Age of Empires, all those strategy games. Most shooters like Call of Duty are a lot better too when the enemy doesn't know where you are. Those kinds of games don't work well on split-screen, and at minimum lose all their tactical elements.

Simple solution to the screen watching cheat - just accept it. It adds an interesting new element to the game. I remember back in the Halo 1 days the advantage would go to the players that could instantly glance and interpret what they saw.

Sure, but you can't really compare those two. Both console gaming with friends and LAN parties are completely different. I don't know why people always have to compare the two - you can have both.

LAN parties also offer one more strategic element - other people don't see where you are, what you are doing, or what you are planning, and you have your whole screen to yourself. Our Call of Duty LAN parties would have been a lot less fun if you knew where everyone was. No hiding, no surprise attacks.

Both console gaming with friends and LAN parties are completely different.

I agree. So why don't PC games support the console-style experience for players who have a PC tucked under the HDTV?

I don't know why people always have to compare the two - you can have both.

But why can't I have both on one machine?

Our Call of Duty LAN parties would have been a lot less fun if you knew where everyone was.

If modes like Goldeneye 007 on N64 aren't acceptable, have you tried a team game? If you have two people on your split screen, can you do two on a team vs. two bots? Split-screen first-person shooters all the way back to FaceBall 2000 for Super NES have supported team matches.

So why don't PC games support the console-style experience for players who have a PC tucked under the HDTV?

It's been a while since I looked at the state of PC gaming, but most of the games that looked like they'd make sense as shared-screen games that I owned did support it. Some really old examples include things like Wacky Wheels, which supported both serial-line and split-screen modes. Future Cop LAPD is slightly more modern and it did too. Atomic Bomberman supported up to 8 players (I think), and you could have them on any mixture of computers. We played it at a LAN party where we ended up with more peop

I agree. So why don't PC games support the console-style experience for players who have a PC tucked under the HDTV?

People who have done that are such a minority that it probably doesn't make much sense to developers. Most people nowadays have a console for that purpose. PC games used to have split-screen multiplayer support quite often, and we used to play that way in the 90's (there were some fun games too, especially some freeware ones). But once the Internet got around and LAN parties started to become more common, there wasn't really a need for it anymore.

If modes like Goldeneye 007 on N64 aren't acceptable, have you tried a team game? If you have two people on your split screen, can you do two on a team vs. two bots? Split-screen first-person shooters all the way back to FaceBall 2000 for Super NES have supported team matches.

Co-op campaigns like Left4Dead, Borderlands and so on sure can work tha

But when Internet got around and LAN parties started to become more common, there wasn't really need for such anymore.

One of the use cases that I'm talking about is a fortnightly play date, a birthday party, or an annual family reunion. These cases tend to involve a lot of gamers in one place, and not all of them are willing to dismantle the home PC and bring it.

But since I like these strategy games and games where enemies not knowing where you are is important thing

Have three players and one machine? Play 3 on 3 against bots. Your teammates can and should know where you are.

But when we sit down for a beer or quick game, my consoles work just well for that.

Consoles tend to lack mods and indie games. If I want to develop games designed for people who "sit down for a beer or quick game", and I don't have the s

Sonic & Sega All-Stars Racing has local multiplayer on both consoles and PC, and it uses split-screen

Thank you; that's very much what I was looking for. I'm glad that at least Sega has recognized gaming HTPCs. Now why don't others? Where's a sequel to Atomic Bomberman after thirteen years? Where's a platform fighter like Power Stone or Smash Bros.?

>>>The Wii is a fisher price funbox designed for non-gamers and drunk idiots

Sure, if you pretend that Nintendo doesn't have a 30-year history of creating excellent games. I don't own a Wii, but the games I've played (Zelda: Twilight Princess, Metroid Prime 3) are just as good as the games I found on my Gamecube, N64, Super Nintendo, and NES. And just as good as on my Xbox, PS2, or PS1. I can't believe your comment was marked "insightful" since it's mostly just fanboyism.

>>>Most console gamers have short attention spans and prefer flashy lights and 5 mins of intense adrenaline to a game with a story.

How ironic that you post this on an article about how PC games are not shiny enough. If PC gamers care more about story than flashy lights, then why worry whether the graphics are "only DirectX 10 instead of 11"? Probably because you're wrong. I've met lots of PC gamers who refuse to play a classic like Wing Commander or Baldur's Gate 1 just because it's pixelated.

As for story, if console gamers don't like story, why are RPGs so popular on consoles? Once again I question why your fanboyish anti-console rant was labeled "insightful". Trollish is more like it.

Programming resources are finite and (since the gamer gets more bang-for-his-buck) consoles enjoy greater market penetration. If you were coding where would you aim your efforts?

Probably PCs, because Sony and Nintendo don't want to deal with micro-ISVs. I get more bang for my buck from actually developing the software than from trying to satisfy business overhead requirements such as "Home offices are not considered secure locations." [warioworld.com] And then I get further bang from my buck by porting to Mac OS X for two reasons: the game market there isn't as crowded, and more affluent Mac owners tend to buy more proprietary software.

Bang-for-buck only applies if you buy relatively few games over the life of a console. Console games are more expensive on release than their PC equivalents. Remember, the cost of the console hardware is subsidized by shifting some of the cost to the games. Result for consoles: more expensive games.
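That claim can be made concrete with a rough break-even calculation. The hardware and game prices below are illustrative assumptions, not figures from the thread; only the idea of a per-game console premium comes from the comment:

```python
# Rough break-even sketch. All prices are assumed for illustration:
# a subsidized console with pricier games vs. a pricier PC with cheaper games.
console_hw, console_game = 300, 60
pc_hw, pc_game = 800, 45

def total_cost(hw, per_game, n_games):
    return hw + per_game * n_games

# With few games the console wins; buy enough and the PC catches up.
assert total_cost(console_hw, console_game, 5) < total_cost(pc_hw, pc_game, 5)

# Break-even: 300 + 60n = 800 + 45n  =>  15n = 500  =>  n ~ 33 games
n = (pc_hw - console_hw) / (console_game - pc_game)
assert round(n) == 33
```

So whether the console really is the better deal depends entirely on how many games you buy, which is exactly the parent's point.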

You should do some research on OpenGL. If anything, it is more capable, advanced and supported on many more platforms when compared to DirectX. The problem started when MS decided they wanted to get rid of OpenGL as a game platform on the PC and replace it with DirectX. It gives them another lock-in for the Windows platform, which they have then used to sell DirectX to potential Xbox developers.

If you look at "professional" apps, there is hardly any DirectX; it's all OpenGL. And it's that way for a reason.

Why even mention the 360's use of DirectX 9 and ignore the other systems?

Because they are not talking about DirectX, they are using DirectX 9 as a placeholder to mean 'graphics hardware with a fixed function pipeline plus vertex and pixel shaders' and DirectX 11 to mean 'graphics hardware with a fully programmable pipeline supporting general-purpose computing'. Whether you use Direct3D 11, OpenGL 4 and OpenCL, or some proprietary API, to program them is completely irrelevant to the point at hand.

This discussion is nothing to do with Direct3D versus OpenGL, it's about the di
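Loosely, the distinction being drawn can be caricatured in plain Python (a toy illustration of the two programming models, not real shader code):

```python
# DX9-class model: the fixed pipeline calls your small per-pixel program;
# each invocation sees only its own inputs.
def pixel_shader(u, v):
    return (u + v) / 2  # a color derived from this pixel's coordinates alone

framebuffer = [[pixel_shader(x / 3, y / 3) for x in range(4)] for y in range(4)]
assert framebuffer[3][3] == 1.0

# DX11-class model: a compute kernel reads and writes arbitrary buffers,
# so the same hardware can run physics, sorting, AI - not just shading.
def compute_kernel(data, out):
    for i, v in enumerate(data):
        out[i] = v * v  # any general-purpose computation goes here

buf_in = [1, 2, 3, 4]
buf_out = [0] * len(buf_in)
compute_kernel(buf_in, buf_out)
assert buf_out == [1, 4, 9, 16]
```

The API used to express either model (Direct3D, OpenGL/OpenCL, or something proprietary) is, as the parent says, beside the point.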

The pad is a great control mechanism. I hate to remind you, but there are more games out there than just third/first-person shooters, MMOs, and RTS games. I wouldn't want to play Tiger Woods '11 with a mouse/keyboard. I definitely wouldn't want to play Super SF4 with a keyboard (or a pad either; arcade controller plz). I wouldn't want to play Katamari Damacy with a keyboard or a mouse. The list goes on and on.

In this sense though, PC gaming has been holding back console gaming. When are PC