
Lithander's "How Today's Videogames Miss Their Potential"

Originally Posted by lithander

If anyone wants to post this wall-of-text on the forum to get some discussion started I'd be eternally grateful! :)

In interviews on RPS Bethesda's Todd Howard says: "I think people discount graphics. [...] I do feel that graphics and your ability to present something that feels new, real, and believable puts people in that environment where they can really enjoy what they’re playing."

Obsidian's CEO Feargus Urquhart says: "A lot of the other systems in role-playing games, they all work awesome and people love them. They still need to evolve and move forward a little bit, but what should combat be in that next big role-playing game? That’s one of the things we’re trying to zero in on."

We just need better graphics and a little work on some of a genre's sub-systems, and, voila, here's the pinnacle of what video games can aspire to? Really?

In my eyes their stance lacks all ambition! I can only hope that what the games industry is "zeroing in" on is at best a local maximum.

I'm not a CEO of anything but here's what I have to say regarding the current state and future of video games.

I have wanted to create games for the better part of my life. I enjoyed playing video games, but what fascinated me was their as-yet-unfulfilled potential. Hardware was growing more powerful at a staggering pace, and games were the only kind of consumer-oriented software that would motivate people to upgrade their systems every two years. It was this synergy between hardware and video games that allowed both industries to prosper. We were witnessing the birth of a new medium and I wanted to have a part in shaping it.

Moore's law, the prediction made in 1965 that the complexity of integrated circuits would double every two years, remained surprisingly accurate for over half a century. The improved capabilities of electronic devices changed our lives profoundly. When surfing the net with my smartphone, reading a book on my Kindle, or depending on Amazon, Google, Wikipedia or the navigation assistant to solve most of my day-to-day problems, I realize how technology has advanced way beyond what I could have imagined 15 years ago.
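The doubling claim above compounds quickly. A quick back-of-the-envelope sketch (illustrative numbers only, not real transistor counts) shows the scale it implies:

```python
def moores_law(initial_complexity: float, years: float, doubling_period: float = 2.0) -> float:
    """Projected circuit complexity after `years`, doubling every `doubling_period` years."""
    return initial_complexity * 2 ** (years / doubling_period)

# From 1965 to 2013 is 48 years, i.e. 24 doublings:
growth_factor = moores_law(1, 48)
print(growth_factor)  # 16777216.0 -- a roughly 16-million-fold increase
```

Even a 15-year span (the gap the post mentions) works out to over a hundredfold growth, which is why expectations set in the 90s compounded so dramatically.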

Video games, on the other hand, turned into a disappointment. Not because my career plans failed - I studied multimedia engineering and make my living in the games industry. But the product is not what 15-year-old me would have expected video games to be like in 2013. Not remotely close. That's a thought-provoking revelation for someone who devotes most of his waking hours to games.

I'm not denying that video games have come a long way. But what improved the most are surface values. And along with the audio-visual quality, our expectations rose too. Just try to play a "timeless" classic again and you'll have a hard time immersing yourself like you used to. We got addicted to a fidelity level that comes with a high price tag attached. Losing the ability to enjoy our favorite games like we used to is the least of it.

Early video games were defined by their hardware platform. For lack of better options, the state of the art is always good enough. But it's easy to imagine what a game would be like with more colors, higher resolution, better sound samples, smaller polygons and better textures! Improvements like that are easy to imagine and easy to sell. And last but not least, they are optional, too. Only a few developers dare to alienate the majority of potential buyers by requiring a top-end rig to play their game. The sensible approach is to target some baseline spec and add optional eye-candy to keep the system busy. Hardware vendors adapt, designing gaming-oriented hardware aimed to maximize audio-visual payoff. Rendering capabilities become the benchmark for gaming hardware, and consequently their owners prefer to buy the games that make best use of them; both games and hardware evolve with a strong development and marketing focus on visual quality.

Iterative improvement of what's come before is the modus operandi of any industry. Hit close enough to what players know and like, just make yours a little better. Iterating on gameplay mechanics is risky, but who'd argue with higher fidelity? Unique selling points beyond presentation are entirely optional.

It's a perfectly reasonable approach for the developer to take. For the industry as a whole it leads into a dead end.

The addiction to high fidelity becomes a limiting factor of its own. All that high-quality content is very expensive to make. Development of a AAA title keeps large teams of specialists busy for years. Millions in investment are at stake in a hit-driven business. Smart money looks for franchise potential. An environment like that doesn't promote taking chances. Worse, for the sake of quality you rely on content that is very inflexible. Static level geometry, baked lighting, hours of canned animations, hand-animated or performed by human actors; thousands of lines of text have to be voice-acted, too. Don't forget the lip syncing! In fierce competition you can't afford to let the player miss out on millions worth of content just 'cause you want to provide some room for meaningful decisions. So the real challenge is to fool the player into believing he's in control, when in truth every turn of events has been carefully planned and scripted to maximise asset use. Interaction is predetermined or insignificant.

Of course, there's always a balance to maintain between player and authorial control. If movie-like aesthetics are your goal, the current approach makes sense, but let's not forget that photorealism isn't required to create immersion. Our mind is capable of forming a mental concept of things not actually present. Media engage the recipient on a creative and emotional level by guiding his imagination. On the surface, books offer only language encoded in little symbols, but the story that unfolds is not constrained by that. In comic books the action takes place between the frames. Our mind provides closure to missing elements. We build an internal model of the fictional world based on the input we receive, regardless of how abstract it is, as long as we can interpret and integrate it effortlessly. We make predictions, and the quality of our experience depends on the accuracy of these predictions as the story unfolds. All parts of a piece of fiction have to fall into place or immersion is broken. Of course there's room for surprises, but they have to make sense in hindsight. What happens has to accord with the laws of the imaginary, secondary world. The moment disbelief arises, the spell is broken, and you're back in the primary world.

In modern video games, the contrast between pseudo-realism and the emptiness behind its surface makes suspension of disbelief hard to maintain. Welcome to the uncanny valley!

At the time I made my career choice I expected games to evolve towards emergent gameplay. To allow that, games need to provide systems that are inherently interesting for players to interact with. They shouldn't have to rely on extrinsic reward mechanisms (like achievements or, yes, the storyline) to hook the player.

Given that games are unique in how they allow us to interact with them, instead of presenting their content in a fixed sequence we can only witness unfolding, the potential of video games still seems amazing. But what matters beyond surface values is that the world we're visiting forms a coherent, predictable system that always plays by the rules. Tetris meets those requirements, but humans like to relate to and interact with equals. I hoped that we would find a way for autonomous agents to be more than shooting targets. Given a simulation that can integrate our actions accordingly, there would be nothing to stop us from employing our own creativity to solve challenges or developing real feelings towards virtual characters. This is the key to genuine player agency - everything we deal with today is fake and make-believe, a waste of resources on facades.

Video games were already pretty fascinating and immersive in 1990. I just assumed that they would become increasingly more fascinating and immersive at the same rate as hardware grew more powerful. This is where I erred. It's the same fallacy that led AI researchers in the 60s to boldly claim that AI would surpass human intelligence within a generation: the thought that all the constraints are technology-based.

We still have no good understanding of what intelligence and consciousness are, let alone how to simulate them. But in 1997 the chess computer Deep Blue won a match against the world champion Garry Kasparov. Deep Blue wasn't intelligent. It won thanks to sheer number-crunching power, a huge database of opening moves, static analysis of thousands of recorded games and a lot of fine tuning. Eventually the hardware power at their disposal sufficed to compensate for the lack of a truly intelligent approach.
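The "sheer number crunching" described above is, at its core, exhaustive game-tree search. A toy minimax over a made-up two-ply tree illustrates the principle; Deep Blue's real system added alpha-beta pruning, custom hardware and a hand-tuned evaluation function, none of which is sketched here:

```python
def minimax(node, maximizing: bool) -> int:
    """Best achievable score assuming both sides play perfectly.

    A node is either an int (a leaf with a static evaluation score)
    or a list of child nodes (a choice point for the side to move).
    """
    if isinstance(node, int):
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Tiny invented tree: we pick a branch, then the opponent picks a leaf.
tree = [[3, 5], [2, 9], [0, 1]]
print(minimax(tree, True))  # 3: the opponent minimizes within each branch
```

No "understanding" is involved anywhere: the quality of play scales directly with how deep the machine can afford to search, which is exactly why raw hardware power was enough.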

Parallels with the game industry are glaringly obvious: there's "game theory", but it doesn't help us to understand the human heart and mind. It does explain a lot about rational decision making, so it has its merit if you want to understand the current state of global economics, which is in large part driven by algorithms and financial models. But humans - they are a lot less rational than economics used to believe. So, as far as video games are concerned, we rely on limited interactivity, canned content, storytelling tricks borrowed from other media, off-the-shelf engines and established gameplay mechanics to produce endless variations of the same proven themes and genres.

The big players in the industry leave it to the Indies to think out of the box. They've mostly given up on finding new solutions to unsolved challenges. But when the big budgets are spent on producing generic clones, can you expect the Indies with their limited resources to compensate for that? They prosper in niches but pushing the limits on topics like AI is well beyond their scope.

It seems like the game industry has lost its ambition and ability to evolve the medium.

NalanoH. Wildmoon
Director of the Friends of Nalano PAC
Attorney at Lawl
"His lack of education is more than compensated for by his keenly developed moral bankruptcy." - Woody Allen

It was totally flawed but it attempted something unique: an emergent world where you had to discover and solve things yourself. It was made years ago. It is weird that no one attempts anything even close to that sort of thing anymore. Maybe the players don't want it, or the developers can't deliver it?

We still have huge flaws in AI and physics, worlds that have no solidity to them. And it's not through lack of processing power or programming ability. You could easily do these things; it's just that the focus is on tessellation and pixel shaders (we still haven't got decent dynamic shadows). As if those are the things that make the world real. AI still doesn't interact with the world and is unable to navigate even at the toddler level.

Okay, so to distill all of that down to a few salient points, I presume that this is what you were getting at:

Originally Posted by Lithander

- Games have stagnated
- Significant technical innovation, which has now stagnated in some areas, like AI
- Makes old games hard to play because... old graphics suck?
- Fidelity++ leads to a dead end, leads to linearity because assets only fit one way (really?)
- Photo realism isn't required to create immersion
- Comics need imagination to read (not a good point, has no relevance to games)
- You wanted emergent gameplay and want this to become a "new thing", less manshoots
- The 90s were the Golden Age of PC gaming because... optimism?
- AI is still bad because they're not human
- Indies will save us with innovation... except they can't, because they're too small

Okay, so to respond to that:
I'd argue against the 90s really being the "golden age" that so many people like to make it out to be. Were there more "innovative" games back then? Yes, but gaming was new. Even then some of the industry heavyweights were Doom, Wolfenstein 3D, Quake, and Duke Nukem 3D, all of which are FPS games. Half Life is a tightly scripted FPS and is considered one of the greatest of the 90s. A lot of the other "classics" didn't sell very well or were largely under-appreciated at the time. People are still making excellent games today that aren't manshoots but they're overshadowed by manshoots because manshoots sell well, and always have. Yes, technical improvements have made old games hard to play in some cases, but that's not just because the graphics were primitive in terms of aesthetics, but in terms of gameplay too. XCOM's interface for example is a mess because it has such a low resolution. Real-time rendering is hard, and compromises were made that directly impacts on gameplay, like draw distance. That's different to something like an old movie where you can accept it as part of a limitation of the time and it can be effectively managed. But in games it's frustrating.

I don't agree on the fidelity arguments leading to linearity... or I've missed the point. Yes, there's an expectation of voice acting etc. which can lead to obscene amounts of content, encouraging devs to cut corners and reduce the scope of the project. But there's nothing wrong with linearity if the game is good. HL2 is entirely linear, it's one of the most scripted games around, but nobody seems to care. With procedural generation the burden might be taken off artists to keep churning out high-quality models for all characters, but we're still a way off that yet. Voice acting? There's no substitute for a good, human voice actor, and probably won't be for a very long time. Same with AI - it's hard to make something human-like, as I'm sure you're aware; the lack of AI development has nothing to do with a push for graphics over everything else.

Emergent gameplay seems to be a catch-cry around here but the problem is that it's hard to actually design for it. Some might argue that if it's really emergent you can't actually design for it. Games like that can come off as unfocused or directionless, and risk doing nothing well at all. We need sandbox games or games that allow freedom, but you seem to imply that they're inherently superior to linear games and we should drop the linear game designs to focus on "emergent" designs. I disagree, there's a place for both of them.

As for "indies will save us"... they won't, because apparently they like to copy each other, target the casual market, or just keep rehashing the 90s, usually in largely ineffectual ways. Remaking the 90s is not innovative, and in some cases it's not even iterative. The number of indies actually breaking new ground isn't particularly significant. Not that there's anything wrong with these approaches (most of the AAA devs don't innovate either; Valve's last innovation was the Portal gun, after all, everything else is iterative or nothing new), but the indie utopia of originality extends pretty much only to art. The gameplay? We've seen it all before. Taking a look at Kickstarter, even with access to crowdfunding indies still want to stick to safe mechanics and gameplay (because it works, so it'll probably sell).

There are emergent games available, and good ones. They are all rather limited in one sense or another. This is because emergent games are really flipping hard to make. New tech doesn't help as much as all that. Designers and programmers need bigger brains.


You are mistaken. They need someone to give them more money. Unless you want to go the DF route of ASCII graphics and 10-year one-man dev cycles, which is fine with me but most people have problems with it.

Money mostly only buys people. I don't think you can solve complexity issues by throwing more people at them. More people just gets you more assets, but your design team needs to be small enough to maintain a coherent vision and to keep all the systems and mechanics straight. Too big a design team and you'll have a diluted mess.

I think a lot of the reason DF works is because there is that one chap with a very strong vision, and those 10 years to iterate endlessly. Do you think it would be a better game with 5 people implementing things and trying to get them to behave together? 10 people? 20?

I agree with Todd Howard. A couple of days ago I spent a couple of minutes throwing various objects in Skyrim's water.
I get the impression that when a developer says "graphics" people only think about the look of a game, not the interactivity better graphics can lead to. Visual fidelity is not all graphics are about. I like games where the environment reacts realistically to my input. There are a million abstract games out there. I'm getting tired of these empty shells, and I'm also getting tired of the "imagination is enough" rhetoric.

Would you rather imagine a tank shell destroying a wall in Battlefield? Would you imagine water ripples and caustic lighting in Crysis 3, or would you rather take advantage of them to track cloaked enemies? You think FEAR would be so highly regarded if it had Quake 2 graphics instead of the utter chaos in each battle? Physics don't matter in Company of Heroes? Would Prototype be the same without its on-the-fly transformation, hundreds of NPCs and destructible objects and free-flow parkour? You can turn the new XCOM into a boardgame if you have enough time to waste, but you can't recreate Silent Storm's physics based combat. Look at the progress from Total Annihilation, a great concept limited by the technology of its time, and Supreme Commander. Hell, I think TA actually had a couple of scale upgrades over the years as PCs improved. Minecraft looks shitty but its success is a direct consequence of what its graphics engine allows players to do.

By the way, mentioning the uncanny valley instantly destroys any argument.

Going off on a slight tangent: when you compare video games to other mediums, they don't come off that bad at first. The biggest seller of books in recent years is a bondage fan fiction, most major movies are possibly even more stupid explosion fests, and outside HBO searching for good TV is a ballache. And just like video gaming, other mediums have out-of-the-box ideas outside of the mainstream.

On the other hand, is there a video gaming equivalent of "1984"? I don't just mean copying its themes, but a video game making the kind of impact on society that 1984 has had. Or even just having a video game with half-decent social commentary.

I think sandboxes are the future in a sense, as no medium other than video games can do what sandbox games do. But we also need writing in games that brings new ideas to the table, not just ideas borrowed from other mediums.

Money mostly only buys people. I don't think you can solve complexity issues by throwing more people at them. More people just gets you more assets, but your design team needs to be small enough to maintain a coherent vision and to keep all the systems and mechanics straight. Too big a design team and you'll have a diluted mess.

The "great man" theory was thoroughly debunked a long time ago.

The reason that development studios don't make deep games is because the return on investment is fantastically low for them. If 80% of players only see 20% of your content, then all the time and money you spent making the remaining 80% of that content was wasted.
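The 80/20 point can be made concrete with some back-of-the-envelope arithmetic (all figures invented for illustration): what matters to the publisher is the cost per hour of content the average player actually sees.

```python
def cost_per_seen_hour(total_cost: float, content_hours: float, seen_fraction: float) -> float:
    """Effective production cost per hour of content the average player actually experiences."""
    return total_cost / (content_hours * seen_fraction)

# Hypothetical $50M budget either way:
linear = cost_per_seen_hour(50e6, 10, 1.0)     # linear game: everyone sees all 10 hours
branching = cost_per_seen_hour(50e6, 40, 0.2)  # deep game: players see ~20% of 40 hours
print(linear, branching)  # 5000000.0 6250000.0
```

Under these made-up numbers the branching game spends 25% more per hour actually experienced, despite shipping four times the content, which is the ROI logic the post describes.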


You are mistaken. They need someone to give them more money. Unless you want to go the DF route of ASCII graphics and 10-year one-man dev cycles, which is fine with me but most people have problems with it. It's not the designers and programmers whose vision is limited.

Except apparently it is, because looking back over the Kickstarter games, most of them rely on "Remember this from the 90s? Well, here it is again!", as well as indies churning out platformers with their own artistic bent attached, because Braid did really well, and so did Limbo, so let's jump on the bandwagon!

Truth is, the kinds of projects being talked about have always been once-in-a-blue-moon things. That's partly because people like to stay with the safe, popular alternatives (with manshoots being the current flavour of the century), and also because quite a few of them failed miserably for a multitude of reasons. Money isn't the issue in the age of Kickstarter, where we've become accustomed to lo-fi indie games (which use it as a stylistic choice these days).

Dwarf Fortress is a terrible example, because the issues with DF are a complete lack of useful documentation and in-game help, combined with a complete unwillingness to improve the interface or implement proper tileset support. The community picked up the slack. DF's problems could be easily solved with more community involvement, and it'd be free.

Originally Posted by arathain

Money mostly only buys people. I don't think you can solve complexity issues by throwing more people at them.

As Nalano said the "great man" thing doesn't hold water but you're right in another sense. To use an analogy I'll take hospital funding. A fun catch phrase for politicians over here is "We're spending $x million to improve healthcare, enabling another 100 public health beds to cut waiting lists!" Which sounds great, but it doesn't fix anything unless you have the staff to man those extra 100 beds, as well as all the support staff, equipment, time, and so on, and at the end of the day you could get 100 long-stay acute patients and the waiting list barely budges.

The same thing applies here, as you sort of point out - you can throw money at a problem as much as you like, but all it does is offer the potential to do something. Unless you can actually make use of that money to achieve your goals, it's being pissed into the wind. Things like AI or emergent gameplay aren't in actuality held back by lack of money, because there's more of it around than there has been in the past. It's that they're hard to make or to get right. Case in point: Trespasser. Cost a fortune, had a lot of technical innovations (few of which ever actually worked right), ended up being a shit game. If they'd had more time and perhaps more money, would some of those issues have been ironed out? Maybe, but probably not - it was too far ahead of its time and the capacity to do what they wanted just didn't exist. Some issues in gaming can't be solved with a big wad of cash.

The Urquhart quote made me think about the fact that, compared to table-top RPGs, cRPGs are stuck in the 90s. In the 00s there was a big push towards narrative and theme-based systems, but cRPGs are mostly about simulating all aspects of a game world with a big set of stats. Which is fine for more open-world games, but theme-focused games can benefit from more specialized stats. Actually, we do have such stats, but they are mostly limited to morality and relationship (and survival) meters, with a few exceptions like Mask of the Betrayer's hunger mechanic and ME3's Total Military Strength (which doesn't work so well).
But what if, for example, Mass Effect had meters showing Shepard's stance on non-human races or Council loyalty? It would add more weight to the game's thematic conflict while avoiding the "I want everybody to love me" pitfall of relationship mechanics. Or a stat for the Normandy crew's resolve in the face of the Reapers' invasion, or the human crew members' xenophobia.
Or it could be a system based on a character's psychological state or beliefs, and so on. A lot of possibilities to explore.
Another TT RPG trend is giving players some narrative control beyond their characters, but at first glance I don't think that would work for cRPGs.
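For what it's worth, the thematic-meter idea floated above is cheap to prototype. Here's a minimal sketch with hypothetical meter names (nothing from any shipped game): dialogue choices carry themed side effects instead of feeding a single generic morality bar.

```python
from dataclasses import dataclass, field

@dataclass
class ThematicMeters:
    # Hypothetical examples of the suggested meters, e.g. "stance on
    # non-human races" or "Council loyalty" -- invented for illustration.
    meters: dict = field(default_factory=lambda: {"alien_trust": 0, "council_loyalty": 0})

    def apply_choice(self, effects: dict) -> None:
        """A dialogue choice shifts one or more themed meters, clamped to [-100, 100]."""
        for name, delta in effects.items():
            self.meters[name] = max(-100, min(100, self.meters[name] + delta))

state = ThematicMeters()
# One choice can pull two themes in opposite directions at once:
state.apply_choice({"alien_trust": +15, "council_loyalty": -10})
print(state.meters)  # {'alien_trust': 15, 'council_loyalty': -10}
```

The interesting design property is that a single decision can move several meters in conflicting directions, which sidesteps the "I want everybody to love me" optimization the post mentions.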

Although I am a huge proponent of the emergent gameplay and player agency ideal I find it very hard to agree with the summation that games (or the industry) are missing their potential. I'd actually argue that it's right where it needs to be - it all makes sense. This is an industry after all and people make what people buy.

As for the "missing the potential" bit, I don't really see where this is coming from. Apart from comfort-food games like Call of Duty, the biggest game in recent years has been Minecraft - emergent gameplay at its purest. Skyrim as well. And big publishers are pushing more games of this kind, with Deus Ex: HR, Dishonored, and Far Cry 3 most recently. In my mind these are only the first baby steps into a world of game design that has not been explored as much as linear games (a design philosophy that has IMHO peaked with Portal in 2007).

I don't want to single out a decade and call it the golden age. But Half-Life brought something new to FPS: tight scripting and a focus on narration. That didn't exist before. It also came with a flexible (moddable) engine and groundbreaking squad AI.

A lot of games we remember from the 90s are unmatched in their ambition.

The thing with fidelity is that it narrows down designers' choices. When every conversation in your game is expected to be voice acted, this goads you towards linearity. Mass Effect tried to offer choices, but they had to make sure that the branching would be as cheap as possible while maintaining AAA quality. It was better than no branching, but it showed. Multiple dialog options playing out exactly the same. Stuff like that.

I don't think that designers' and programmers' vision is necessarily limited, but that they can't really put all those ideas into a game that has to look and play like people have come to expect.

Games that use only text (Fallout, for example) can add a lot more variations and branches because text is much cheaper to produce. But you can't sell those anymore.

This is why I mentioned books and comics. They convey the experience while requiring far fewer resources. I argue that the quality of a game lies in the player experience, and that cheaper means don't necessarily diminish it. This is what some indie games prove nicely.

But to make real progress you need money. Pay a team of specialists to engage in R&D and see what they can come up with. Only the big publishers could afford to fund it. This isn't happening, though. They'd rather spend the bulk of the workforce on producing assets, because they know that's enough to sell their games.

The Urquhart quote made me think about the fact that, compared to table-top RPGs, cRPGs are stuck in the 90s. In the 00s there was a big push towards narrative and theme-based systems

Narrative and theme-based systems have been around since the seventies; they've hit the mainstream several times (Pendragon, Amber, White Wolf et al).
This is kinda the problem when talking about games fulfilling their potential. Like any medium, games are subject to fashion trends, and those trends tend to move in cycles. The FPS is popular now, as it was in 1995 before being overtaken by the RTS. Focusing on hardware is easy, because this is predictable - it's only going to get better. However when it comes to implementing that improved hardware, we're at the mercy of the market. You can advance AI in leaps and bounds, but if multiplayer is what's in at the moment few developers are ever going to use it, nor is anyone likely to pay attention to it if you did.

Originally Posted by Nalano

We just need better graphics and a little work on some of a genre's sub-systems, and, voila, here's the pinnacle of what video games can aspire too? Really?

Actually, yes. The problem is you're taking their quotes out of context. The Elder Scrolls as a series is pretty much focused on presenting a virtual world to the player. So yes, if you want to improve that game then increasing the fidelity of the world is the way to go. Similarly, the games Urquhart is looking to riff off were primarily about story, so improving the non-story aspects is again a good way of improving the game as a whole. Neither was talking about games as a whole, and attempting to treat games as a whole makes zero sense, unless you consider the goal of Space Invaders to be identical to the goal of The Walking Dead. Trying to assess whether something has improved or not is impossible without considering what it set out to do in the first place.

You also have to be careful regarding innovation or lack thereof. Like the seven basic plot theory of literature, if we leave the scope broad enough we will never see any innovation, because our definition of what a game is encompasses all games which exist or could ever exist. If we were to look for a better chess playing AI it's highly likely we can find one, if on the other hand we're looking for something other than a game where you play with or against another player then good luck with that ;)

It was this synergy between hardware and video games that allowed both industries to prosper.

There's also a problem with equating better games to better technology. Technology only plays a small part in what makes a game good; if you ask someone why they liked Baldur's Gate II, they're unlikely to put it down to the higher resolution, improved particle effects or similar. No amount of technological progress is going to improve the quality of the writing in the storyline, for example. If we accept that what made Baldur's Gate II good is actually the non-technological things like the story, writing and similar, then what Urquhart is saying makes sense - you can paraphrase it as "we have this style of game which was popular in 1995; which parts of it can we improve on with modern technology and which do we have to retain?"

I did a double-take for a moment before realizing those were Lithander's words you were quoting.

Originally Posted by soldant

As Nalano said the "great man" thing doesn't hold water but you're right in another sense. To use an analogy I'll take hospital funding. A fun catch phrase for politicians over here is "We're spending $x million to improve healthcare, enabling another 100 public health beds to cut waiting lists!" Which sounds great, but it doesn't fix anything unless you have the staff to man those extra 100 beds, as well as all the support staff, equipment, time, and so on, and at the end of the day you could get 100 long-stay acute patients and the waiting list barely budges.

The same thing applies here, as you sort of point out - you can throw money at a problem as much as you like, but all it does is offer the potential to do something. Unless you can actually make use of that money to achieve your goals, it's being pissed into the wind. Things like AI or emergent gameplay aren't in actuality held back by lack of money, because there's more of it around than there has been in the past. It's that they're hard to make or to get right. Case in point: Trespasser. Cost a fortune, had a lot of technical innovations (few of which ever actually worked right), ended up being a shit game. If they'd had more time and perhaps more money, would some of those issues have been ironed out? Maybe, but probably not - it was too far ahead of its time and the capacity to do what they wanted just didn't exist. Some issues in gaming can't be solved with a big wad of cash.

The educational system has always been criticized as having intractable problems where "throwing money at it" never solves them. Those arguments have almost always been offered by Republicans who had no intention of funding public schools in the first place.

In the sense that you can grossly mismanage a lot of money down the drain, they're absolutely correct. But that assumes that the money will always be mismanaged. The irony of this is, thanks to their insistence on pinching pennies everywhere, they get exactly the management they pay for, and as such the whole practice is a lesson in self-fulfilling prophecies.

The school system in particular, and the public sector in general, have been hamstrung by a brain drain ever since the Reagan administration started defunding public agencies. We got unfunded mandates, so agencies which were supposed to keep infrastructure maintained or to oversee a sector of the economy gradually became incompetent - primarily because they couldn't afford to keep educated, trained staff at market-rate wages, effectively turning a public-sector job into a monkish vow of poverty - and this in turn fed into the Republican narrative that these agencies can't do anything right and we'd best not throw good money after bad.

Teaching has become by far the lowest-paying career one can get with a Master's degree - to the point where getting a Master's in education has to be heavily subsidized or else the prospective teacher will be awash in debt for much of their professional life - leading any self-respecting post-graduate student to abhor teaching, and lo and behold, educational standards have been slipping since the 1980s.

So yeah, you can throw money at the problem. Money is, indeed, the defining issue of the problem. If you want the best and the brightest to come and solve these "intractable" problems, you pay them as if they were the best and the brightest, not the bottom of the bargain bin.

That said, both your national health service and my public school system are public services. Gaming is a private enterprise, and as such I believe the solutions are somewhat different: Specifically, I believe the problem is profit motive. Sure, you get eccentrics who live in exile in Montana for a decade and come up with a complex system after ten years, but they're clearly not doing it for money. Most everybody else doesn't put in effort because the motive is pre-ordained: Make a profit first, and make a good game second.

As I said earlier, making content for a game that most people will not ever even see is wasted time and money, and a cut in profits. Further, I'd say the extra three to five years or so that you'd have to tack on to development for your magnum opus RPG, so as to present meaningful choices for all types of characters in a world where you have relatively free agency (i.e., a sandbox with the depth of a narrative RPG), would eventually just hurt you anyway, even if you could afford it, as you lose the initiative to any company that's doing anything remotely similar on a shorter development schedule. Their game may not be as deep, but yours will be seen as old hat by the time it does come out, effectively punishing you for your efforts. It's economic suicide.

What does this mean for gaming? To me, if you really want to see games with the depth suggested, you petition the National Endowment for the Arts to do with game designers as they do with prospective writers and painters: Give 'em a grant to hole up in some liberal arts commune somewhere for five years and make something worthy of their talents. But to even suggest such is to argue that gaming is a national treasure and should be fostered. Do you think yourself capable of making that argument forcefully?

Last edited by Nalano; 17-02-2013 at 01:55 PM.

NalanoH. Wildmoon
Director of the Friends of Nalano PAC
Attorney at Lawl
"His lack of education is more than compensated for by his keenly developed moral bankruptcy." - Woody Allen

The Elder Scrolls as a series is pretty much focused on presenting a virtual world to the player. So yes, if you want to improve that game then increasing the fidelity of the world is the way to go.

The characters that inhabit the world still seem fake, and that is not going to be fixed by higher texture resolution. They are puppets playing out the bits and pieces of the overall narrative, but they don't feel any more alive than they did in Morrowind.