Oct 28, 2004

I Said Lunch, Not Launch!

(I will buy a cookie at State of Play II for the first person who can decode the obscure reference point of my title...)

Does the condition of a massively-multiplayer persistent-world game at launch make any difference to its long-term economic success? Completely leaving aside aesthetic considerations, what is the wisest business strategy:

1) to push MMOG development on a tight schedule, capturing the core customer base, and patching as you go, knowing that designers might never actually judge a game to be complete or ready if left to their own devices, and that a MMOG is never truly "finished";

2) to polish a MMOG until it is very stable and substantially feature-complete, regardless of how long it takes to do so?

Excluded middles are welcome to weigh in--but how do you know when you're at the right balance between the two?

This is obviously not just an academic or idle question, with the release of Everquest 2 and World of Warcraft imminent (the former within two weeks) and the Jump to Lightspeed expansion of Star Wars: Galaxies having come out yesterday.

Most gamers tend to strongly (stridently) argue for the latter option, that it is always better to wait and polish and perfect (even though they may also eagerly anticipate the release of a particular game). Aesthetically, I would agree, and I suspect so would most developers. But in business terms, it's not so clear. Essentially, it boils down to this: do you lose a substantial enough potential base of customers by launching prematurely to justify the enormous continuing expense of development costs before revenue comes in through box sales and continuing subscriptions?

If you were guaranteed 100k customers at launch regardless of the condition of the game, and would lose only another 25k "borderline" subscribers due to bad conditions at launch, is it worth another 2 months of development costs to rope in most or all of the borderline subscribers? Especially if there's a similar product coming out from a competitor that might steal those subscribers away from you in the interim? Especially if you're not certain that two more months--or two more years--can actually "fix" a MMOG sufficiently that it is reasonably bug-free, stable, and rich in feature sets and content?
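To make that tradeoff concrete, here is a back-of-the-envelope sketch using the hypothetical figures above. The burn rate, box price, subscription fee, and retention period are my own illustrative assumptions, not numbers from any actual game:

```python
# Back-of-the-envelope launch-timing tradeoff. The 100k/25k subscriber
# figures come from the hypothetical above; all dollar figures are
# assumed for illustration only.
MONTHLY_BURN = 1_000_000      # assumed dev cost per month of delay ($)
BOX_PRICE = 50                # assumed revenue per box sold ($)
MONTHLY_SUB = 13              # assumed subscription fee per month ($)
MONTHS_RETAINED = 12          # assumed average subscriber lifetime

def subscriber_value(n_subs):
    """Total lifetime revenue from n_subs subscribers."""
    return n_subs * (BOX_PRICE + MONTHLY_SUB * MONTHS_RETAINED)

# Launch now: 100k guaranteed, lose the 25k borderline subscribers.
launch_now = subscriber_value(100_000)

# Wait two months: keep the borderline 25k, but pay two months of burn.
launch_later = subscriber_value(125_000) - 2 * MONTHLY_BURN

print(f"launch now:   ${launch_now:,}")
print(f"launch later: ${launch_later:,}")
```

Under these particular assumptions waiting wins, because 25k subscribers over a year are worth more than two months of burn; shrink the retention period or raise the burn rate (or let a competitor siphon off the borderline players in the interim) and the answer flips.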

The problem for me in judging this is that most of the test cases you could use to establish a reasonable rule of thumb are profoundly debatable.

For example, would Star Wars: Galaxies have double or triple the number of subscribers it has today if it had waited another three months to launch? I once would have said yes, but I would now say no--because I now do not think the bugs, instability and design problems in SWG were dependent upon a rushed launch. I think instead they're predicated on an overly complex, fussy, baroque design and on live team management problems, including numbers of staff on the development team, problems that another two months or ten months would not have fixed. Plus it's clear that some of the people who tried SWG and didn't like it would never have liked it because of some fundamental design decisions it made about how to instantiate Star Wars in MMOG form.

Would Asheron's Call 2 have double the subscribers if it had another two or three months of development before launch? Again, it may be that the flaws of the game were far deeper than early bugs, and not correctable through extended development.

Is the success of City of Heroes due to a nearly flawless launch? Maybe, but you could just as easily argue that the game's success is due to genre, to satisfying a particular market niche (simplified combat-centric MMOG), or due to the luck of good timing (launching in an open window when there was little else available to MMOG players looking for the next new thing). Would CoH have failed or underperformed if it had been horribly buggy or unstable?

Judging from a wave of online discussions of Jump to Lightspeed, Everquest 2 and World of Warcraft, both Jump to Lightspeed and Everquest 2 are launching "prematurely", i.e., with a significant number of bugs and stability issues, not to mention unfinished feature sets, hasty design decisions, and relatively untested game mechanics. At this point, I think it would be fair to say that this appears to be a company policy for Sony Online Entertainment, for which it receives much abuse. But it may be a sound business approach if it turns out to have little impact on the bottom line, if customers will pay regardless of the condition of the game, or if waiting longer to launch does not bring in compensatory revenue that justifies the delay.

Is there any way to know counterfactually what difference a launch date really makes, to answer the question "what might have been"? If you just had to think about the corporate bottom line, would SOE's approach make good sense?

Comments

A while back Jack Emmert (in a seminar) attributed City of Heroes' success to the game design and not the successful launch. Sidekicking, ease of use and the shallow learning curve were what he claimed as the hooks for CoH.

That said, I can't believe the pristine launch didn't have something to do with it. The hard-core group was maxed out on the game within a month of launch and had already moved on, but the good word of mouth they and the other early adopters had about the game because of the successful launch drew in a crowd of second-hand gamers. I know several folks who are now devoted CoH players who have never played a MMOG before in their lives.

IIRC, the saying was from an old (1970s) Sunday morning show with humans in costume. More than that I don't want to remember.

The only way to guess whether more people would have joined would be to count those who initially joined and dropped the game within 3-4 months. However, this is also filled with errors, if only because people get homesick for their old MMORPG.

With SWG, those in beta who kept saying the game was a mess in design and implementation were constantly told they were just talking the game down in order to play for free. So it was not really affected by the bad reviews of players. A delay would not have helped, since the game is still in really bad shape, with the expansion pack pushing back much-needed combat and other fixes.

However, with CoH, not many people originally started to play it, but word of mouth got around and people started to pick it up. So a release where a lot of the game was in place did help.

For AC2, it would have really helped. They had that beta client included in a game magazine, which helped get it out to a lot of people, but once they saw it was incomplete, it was dropped. If it had been released in the shape it is in now, it would have done really well.

However, the granddaddy of them all would have to be AO. It had a rotten release; if it had been released in the shape it eventually reached, and at the same time, it would easily have beaten DAoC, simply because a fair number of people were looking for a new MMORPG right around that time.

It seems to me that somewhere in the discussion you associated option 1) with not only an 'unfinished' game (by definition, nigh every MMO) but a buggy, prematurely launched game, often with fundamental design flaws.

I present: a+++) To focus on the core gameplay experience and launch the game with the minimum set of features required to make the game very fun to play. Be sure to iterate and refine the design, feature-set and user interface during a long testing period, during which you can introduce new features if you like, but remain focussed on the core experience. Ship it, then expand in the directions that the players enjoy.

This approach has many advantages, primarily avoiding the numerous problems and costs associated with overly complex designs and bloated feature sets. It has some problems, too; notably that players have been conditioned to expect 'kitchen-sink' games with a free pony in the retail box. I think for the online games business to grow into a healthy industry they will be re-educated. Downloadable distribution and free trials are a related trend.

I see City of Heroes as an example of the above, btw. Puzzle Pirates also followed this pattern (though I think we added too much during testing) and I believe that it's the preferred modus operandi of some Korean developers such as NC Soft (original Lineage was a tiny, tiny game that was fun to play).

I don't LIKE it, but I don't see many games that open small and grow to be big. Either you open big, or you don't get big--that's where the market seems to be. (CoH opened big, btw--it was on the top ten seller list for weeks, starting right at launch). So while I agree with Daniel in principle, I just don't see the market operating that way.

Raph wrote:
I don't LIKE it, but I don't see many games that open small and grow to be big. Either you open big, or you don't get big--that's where the market seems to be. (CoH opened big, btw--it was on the top ten seller list for weeks, starting right at launch). So while I agree with Daniel in principle, I just don't see the market operating that way.

Runescape and Habbo Hotel come to mind.

--matt

Interestingly, the market for text MMOs seems to work almost opposite.

Whoops. Regarding the market for text MMOs: All the biggest ones are years old. The closest thing to a 'splash' release that anyone in text has done since probably 1996 was our release of Lusternia 3 weeks ago, which immediately became one of the larger text MMOs out there.

I suspect, though, that this isn't an inherent feature of the text market. It's more of a side effect resulting from the fact that few text MUD developers think about this kind of thing to begin with, being largely hobbyists.

I was the one who submitted that story to Slashdot. I actually submitted it to Terra Nova first and then Slashdot (which is why the same topic appeared on both websites I imagine). I have loved reading all the comments.

Timothy> But it may be a sound business approach if it turns out to have little impact on the bottom line, if customers will pay regardless of the condition of the game, or if waiting longer to launch does not bring in compensatory revenue that justifies the delay.<

Do you really think there is a good business case for selling a buggy product? Imagine a production manager at Sony Manufacturing proposing that they rip the QA system off the production line to save money. “After all, we’ve tested it in the lab. And if there are defects, customers can send a complaint postcard.” I don’t think they would keep their job long. Yet EQ, and probably SWG, had no discernable QA system on the production servers. Debugging complex code is hard. But most of the bugs that annoyed me in EQ were simple database entry errors that I believe a decent statistical system would pick up.
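A sketch of the kind of statistical check the commenter has in mind might look like the following. The item table and the specific test (a robust median-absolute-deviation outlier check) are my own illustrative choices, not anything from an actual game database:

```python
# Hypothetical sanity check for item-database entries: flag stats that
# are wild outliers relative to comparable items. Uses the robust
# median-absolute-deviation (MAD) test, which copes better with small
# samples than a plain z-score.
from statistics import median

# Invented sample data: (name, damage) for same-level swords. The last
# entry has a fat-fingered value of the kind the commenter describes.
items = [("Rusty Sword", 7), ("Short Sword", 8), ("Bronze Blade", 6),
         ("Iron Sword", 9), ("Cursed Shiv", 80)]  # 80 was meant to be 8

def flag_outliers(rows, threshold=3.5):
    values = [v for _, v in rows]
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:          # all values identical: nothing to flag
        return []
    # 0.6745 scales the MAD score to be comparable to a std deviation.
    return [name for name, v in rows
            if 0.6745 * abs(v - med) / mad > threshold]

print(flag_outliers(items))  # → ['Cursed Shiv']
```

Run nightly against the item tables, a check like this would surface exactly the "simple database entry errors" described above before players ever saw them.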

I think one aspect of that is game programmers usually come from a small shop background. They don’t want to be subject to industrial scale QA. (As a small shop programmer, I know I don’t). Yet the large MMOGs have reached industrial scale production, without taking on big time manufacturing organization. Maybe SOE could second some production engineers from one of Sony’s manufacturing arms? I have no problem with the “toss it over the wall” approach in ATITD, which is only two guys who are totally on top of their code. I’d look for a more structured approach in bigger teams.

That being said, I think Microsoft has trained users to accept buggy code. Most users see simple DB typos as “buggy code” and don’t get as frustrated by it as I do. So if you’re in the right spot to make a ton of money, the bit you lose to buggy code isn’t a big problem. As several people remarked, other factors offset the effect of bugs in the product, like good gameplay, attractive art, good timing, marketing channels etc. I do think that building a QA system into the world from the beginning would save money, and a lot of aggravation for the players. It would probably make working on the game less fun though, so I am not holding my breath.

If you look at the growth chart of most MMOs, the shape of the curve is determined very quickly by the launch. For example, both UO and EQ hit about the halfway mark of their eventual peak very, very quickly and grew slower from there. This alone should tell you the impact of launch.

One of the things that hasn't been mentioned here is that launch is your opportunity to get marketing and the press. Our entire PR portion of the industry is bent towards box products with one launch date. World of Warcraft and EverQuest 2 will never get as much word of mouth as they are getting right now. Their marketing plans are all geared towards right now. And your product's reputation is then frozen in time: Ultima Online still has a reputation for being a buggy ganker's game, even though the design team has gone to extraordinary lengths to combat these problems.

In principle, I agree with Daniel James re: ship with a small, tight, very fun core gameplay. That being said, it's worth noting that while both City of Heroes and Puzzle Pirates are two of my favorite MMO releases, I left both relatively quickly. Perhaps they each needed more depth than they shipped with.

Raph: FWIW, SOE has no such policy of intentionally releasing early. :P

Then how do you explain the complete character class and skill overhaul less than 2 weeks before release? Making such big changes strongly suggests that even the developers think that EQ2 isn't ready yet, but it is released anyway.

Robold>Then how do you explain the complete character class and skill overhaul less than 2 weeks before release?

Launching a large, commercial virtual world isn't a one-day thing. It takes time, like stopping a supertanker: you have to plan it some time in advance. Because of the long lead-in times for print magazines, marketers have to get the publicity lined up 3 months before launch day at the very latest. Likewise, making all those CDs and shipping them to Electronics Boutique isn't something you can just do overnight; it has a lead time of several weeks too.

So Raph can honestly say "SOE has no such policy of intentionally releasing early", because the launch process began some time ago. They're not intentionally launching early, because by this time their hands were tied.

"Intentionally" kind of implies that they decided from the outset to launch early. I wouldn't say that this applied even to SW:G, which launched 18 months before it really should have. I don't know the details, but I suspect that SOE's higher management decided some late in the process that the SW:G project needed its own cash flow to support the burgeoning costs of its development. If they could have held on until the game was fully completed to specifications, they would have; however, they felt that they couldn't. Thus, they may have knowingl launched early, but they didn't intentionally launch early.

I'm not suggesting that SOE says, "Hey, is one of our MMOGs unfinished? Good! Let's launch it!" Nobody would have that kind of "intentional" early launch.

I do think there's a pattern with SOE that says, "When a game is close enough to playability, launch it, even if there are bugs, stability issues, or major design changes yet to come, because there will always be bugs, stability issues, and major design changes yet to come". Moreover, my intuition is that this may actually make good business sense for exactly the reasons that Richard outlines, that the costs of holding off from launch until a game is fully "ready" may not be justified by a superior yield of subscribers.

From a pure profit-and-loss standpoint, it may make sense to go live the first moment you plausibly can *if* you think you have an inbuilt customer base, as SWG did and EQ2 may have. This is not the case for games with less of a core audience. As a couple of people have mentioned, Anarchy Online is probably the best example of that. They had an audience that was ready for a new game, but one that had no particularly fixed loyalty to the AO franchise. If they'd launched in good shape, they would have kept their audience. They didn't, so they lost much of it.

Though even AO shows how complicated judging this all is. To suggest that a premature launch is the source of AO's troubles (or of the troubles of any flawed MMOG) is to suggest that additional time could fix those troubles. AO's later development history seems to bear that out, given that it's now a pretty decent MMOG--but maybe that later development history could never have happened without the blast furnace learning experience of the disastrous launch. The Fortune article on City of Heroes that we discussed here recently has one little insight into CoH's development process that I think is really key, that well before launch, there was a major managerial intervention into development to simplify the design and rein in some people on the team who were running off in their own little directions and making undocumented changes to the code. If that intervention had never happened, then another six months of development time would make no difference.

There's something subtle here about the authoring processes involved in making code for a huge game or software project. It has a temporality very different from authorship in other cultural contexts. Imagine if as you wrote successive drafts of a book, at some point you became fundamentally unable to make small alterations to parts of the book without destabilizing the entire project, even when some of those smaller segments or parts of the text contradicted each other or were aesthetically incompatible. Code is cumulative; at some point, it becomes impossible to change certain features or decisions made early on even when their actual manifestations in gameplay are deemed undesirable. Features that are eliminated live on like zombies in the code, and the bigger and more ambitious the MMOG, the more haunted it is by flesh-eating code from beyond the grave.

This is something that delaying a launch can't fix or alter unless you're willing to tear it all up and start again. I think some MMOG designs end up in a position where their only hope is to build the baroque Gormenghast mansion of their code ever upward in spirals of hasty improvisations, to give the players the sensation of change and novelty so that they don't see the perpetually crumbling foundations. At some point, if you were the manager of such a MMOG, I think the only sensible business decision you could make is, "Get this puppy out the door NOW, because there is no cure for what ails it".

It may be that AO is actually the only MMOG ever where the launch date was the dependent variable separating relative success from relative failure.

"Then how do you explain the complete character class and skill overhaul less than 2 weeks before release?"

The goal of any company is to release the best possible product that they can. In the case of an MMO they are driven by marketing, development, and community feedback.

The purpose of a beta is not only to find bugs, but to examine how the game is played, how the players respond, and then to make changes that result in the best possible product at release.

Given this, it is hardly surprising that there may be changes in a product so close to release. However that is not to say that those changes have not, to some degree, also been tested in house before going to the larger pool of the beta test population. It is also not possible to say that these changes haven't been planned for quite some time.

In six months there will have been even more changes.

One thing that SOE has proven to my satisfaction is that they are responsive to the playerbase. If you read the historical postings on the beta boards you may come to the conclusion that the skill/class changes recently introduced were just that, a response to the playerbase. Or you may not.

Which would you rather have, a development team with a design document that is set in stone, or a team which responds to the needs of the game?

Timothy> I think some MMOG designs end up in a position where their only hope is to build the baroque Gormenghast mansion of their code ever upward in spirals of hasty improvisations, to give the players the sensation of change and novelty so that they don't see the perpetually crumbling foundations. <

Wonderful. LoL. I think that image is going to be permanently burned on my mind when I play MMOGs now.

FWIW, the US military's acquisition plan now favors "incremental" development and distribution, which allows for a useful, working, if not fully functional system to get in the field sooner, as well as allowing for a lot of feedback from the actual user. The downside, which isn't news to any MMO player, is that funding for "improvements", which actually means delivering the promised system, may disappear in favor of other, shinier projects.

I'm not sure if this means early releases are a good idea, or if devs should run far far away.

Actually, I think I'd enjoy trying out a MMOG based on the first two books in Mervyn Peake's wonderful _Gormenghast_ trilogy. The physical and social architecture would be fun to explore.

(Until someone creates such a game, the closest we may come is the cook named "Swelter" in the Templar castle from the original Deus Ex. Heh.)

Some random thoughts on previous comments:

As a long-time observer of SWG, I don't think I'm quite ready to abandon my impression of it as having been "not ready for prime time" when launched. I still suspect that this decision, and the player frustrations it engendered, probably caused a faster drop-off in resubscriptions (and thus less long-term revenue) than would otherwise have been the case. This is just my perception; I'm open to evidence to the contrary.

...

As Raph said, it does appear that the big players tend to start out that way, rather than starting small and growing. What's interesting about this finding is that it seems to run afoul of one of the "system design laws" in John Gall's wonderful _Systemantics_: "A complex system that works is almost always found to have evolved from a simple system that worked." This suggests that the happy medium is to launch with a well-tested set of core features. You'll get people who are grumpy that they can't do some thing in the game, but it puts the developer in a better situation: when he makes a change, he's adding something the player wants, rather than taking away something the player doesn't want (bugs). Isn't that the more desirable approach from a customer management perspective?

...

I come to this discussion from a professional background outside the game industry. Currently I manage a group of about 10 programmers who maintain and enhance a fairly complex production application used by around 100 people. Our development process includes developer/customer review of requirements, problem evaluations, solution designs, and individual code changes, and all code files and documents are under source control. I also developed a suite of applications that allow me to monitor and control the whole process. In this way everyone -- me, the developers, and our customers -- know what's expected and what's being delivered. The result is fewer "unhappy surprises" and a demonstrably low rate of errors that show up in Production. I'm not big on external standards boards and suchlike, but FWIW we were certified as CMMI SEI Level 3 several years ago... and our process has improved considerably since then. It's both streamlined (because I despise bureaucratic inefficiency) and thorough (because people checking each other's work reduces errors).

I mention all this stuff in order to ask: How many of the big players in the MMOG industry use this kind of approach? Bearing in mind the similarities (critical and demanding customers, high-visibility "has to be right" application) and the differences (in-house app vs. marketed app, monolithic app vs. client/server, 100s of concurrent users vs. tens of thousands) between my app and a large MMOG, would the very controlled approach to the development process that I described be effective for MMOGs? Or do the big guys feel like they're already doing these things?

Is this level of peer review and process control already standard for the major players in the MMOG industry? If so, then why do so many bugs wind up in production code?

If it's not standard, why isn't it? Do MMOGs have unique purposes/audiences/technologies that make a well-defined and -controlled development process (such as the one I described) inappropriate for MMOG developers?

In short, shouldn't a commercial product follow commercial development process standards? If there are business reasons why it's OK to trade bugs for time, what are those reasons?

Flatfingers, I come from a similar background, and in answer to your question:

"In short, shouldn't a commercial product follow commercial development process standards? If there are business reasons why it's OK to trade bugs for time, what are those reasons?"

There aren't any. :) There's a great book that was written some 20 years ago that should have been required reading for the dev and management teams of some of the latest MMOs.

"The Mythical Man Month" - Frederick Brooks

Go read it if you haven't. You'll smile (or maybe cry) when you read Brooks's description of the tar pit and realize how many projects you've worked on that fit the examples in his essay. Some things never change.

A classic... albeit a classic that more senior managers could stand to read.

It was a giggle back in the early '80s when I read a reference to IBM's "human wave" philosophy of speeding up late software projects. Then came the day (some years back) when a manager actually suggested something similar. "Just add more people!" he said with a straight face.

Sometimes it can help. Usually it doesn't. The person who can build and maintain a team at the balance point between too few and too many has a good chance of being successful (all other things being equal).

Side note: I actually once ran into the opposite of the situation Brooks describes. Instead of one manager trying to pile additional programmers onto a job, I was the one programmer (a contractor) onto whom was piled multiple managers, none of whom agreed with the others on what I should be doing. At one point it was me and seven managers stuffed into an office.

A week later that company went bankrupt. I'm not suggesting there was a connection, of course. ;-)

I wonder: is having a non-game industry background useful to game industry leads/managers? Or can they learn everything they need to know just growing up in the industry?

As it happens, the server lead on SWG came from National Instruments. Yes, people with that sort of background are all through the MMOG industry.

It's not uncommon for practices like pair programming, lead verification of all checkins (as in, a source control checkin is not integrated unless a lead has walked through the code first), and even some degree of automated regression testing to be used in MMOs, at least at SOE.

I often hear folks from outside say stuff like "there was no discernable QA process," and it's frankly unfair; a better question to ask is what in the products and typical development processes of games makes teams with clearly demonstrable expertise in software engineering still have the defect incidence they do? Because I don't think it's solely the process to blame; it has a lot to do with the nature of the software being built.

Thank you, Raph. I always wanted to believe that the major players had mature development processes in place -- I appreciate your verifying this.

But as you say, if that's the case it does raise the question I asked earlier: if a good process is in place, how is it that so many anomalies make it into Live code?

One possibility is that this is just a perception, that the incidence rate of real errors is actually low compared to the number and rapidity of changes made. If so, this is a tough battle to win as it's a fight against human nature, which tends to see only the problems and ignores the things that unobtrusively just work.

Another possibility is that game programmers are just incurably sloppy, something no process (no matter how good) can fix. I don't believe this is the case. A lot of game programmers are lousy spellers, but given the highly results-oriented nature of the game industry, I think it's safe to conclude that most of them are motivated and competent coders.

A more likely source of production errors recognizes the fact that games are among the most technically challenging applications where audio and video are concerned. It wouldn't be surprising if games had more than their share of sound/graphics bugs... and yet, in my experience most games actually get these things right, even when they're pushing the envelope. It's usually not a graphics glitch or audio stuttering or suchlike that players complain about, but rather unexpected and/or undesirable in-game object, action, and GUI behaviors.

But that brings us back to where we started: If there's a good process in place to catch problems in object/action/GUI behavior before they go to production servers (because the behaviors of in-game interfaces are defined in requirements, documented in implementation, and checked in peer reviews), but these problems are still showing up, then where's the real breakdown?

Is it really the case that modern complex MMOGs are just too complicated for any process to manage? If so, is there any hope for future MMOGs? Or are we doomed to ever-increasing error rates as games get even more complex?

1) The incidence isn't actually as high as it seems. JTL, for example, shipped with, I believe, less than 200 defects, all of which were priority 3 or lower on a system with (I think) 8 tiers. The vast majority of them were priority 8. I don't think that is actually a bad record considering the magnitude of the project. Yet defects are seized upon and magnified because bad news travels faster than good.

2) Many gameplay bugs are subjective rather than objective, and yet they drive a large part of the perception. A large number of the problems that, say, Tim Burke cites in Galaxies aren't actually defects in a typical QA sense. They are balance issues, and so on. These can be tracked in a QA database, but it is hard to assess what a proper fix is, and often it takes weeks of large-scale testing to determine whether or not the issue was resolved. This is also why it's easier to knock out presentation-related problems--they are easily quantified.

3) The nature of the applications relies on large-scale user-to-user interaction for many behaviors to manifest. If there's a defect in the valuation of a single sword somewhere, it isn't going to manifest until we see the large-scale usage of that sword reach unreasonable levels. If there's a subtle flaw in the economy, it's not going to show up until critical mass has been reached. If one class is slightly overpowered, it can take weeks for players to notice the 0.02% advantage they have, and all gravitate to it.

4) MMOs, like many games, invent a lot of stuff. This naturally leads to a high incidence of defects. Known technology solutions can help, but even "known gameplay solutions" aren't always directly applicable. Most of the MMOs today use roughly the same combat solution, and yet some are fun and some are not.

5) The rate of changes is high. An MMO serves a larger diversity of customers than a typical business app does. If you develop something like an inventory-management system, you can be pretty sure that 99% of your customers will be using it to track inventory, whether they come from textiles, finance, or automotive repair. An MMO, however, caters to a wider array of activities. This means that the rate of changes across each activity has to be higher, that more diverse bugs are reported, and so on. In Live service, you have to move extremely rapidly. A typical monthly publish for an MMO will include more defect fixes than a business app puts out in a year--and must, since the MMO customers are far more fickle.

6) A mismatch between QA and dev. It's not unusual for the bottleneck to be in testing, not in development. The magnitude of MMOs makes it very difficult to regression test them; in most cases, it's impossible to automate the regression. Many of the features involve a significant amount of time spent in gameplay; accelerating the gameplay invalidates the test in many ways, but if you don't do it, high-end content will never get tested. This doesn't mean your QA team is bad--on the contrary, they are extremely adaptable, dedicated, and capable. It's the nature of what they are testing that is the problem.
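The acceleration dilemma in point 6 can be sketched in a few lines. This is a hypothetical test fixture, not any studio's actual harness: an injectable scaled clock lets a test reach hundred-hour content quickly, but the comment notes exactly why the result no longer matches what players experience.

```python
import time

class GameClock:
    """Injectable clock so a test can fast-forward long gameplay spans.
    (Hypothetical fixture; names are illustrative only.)"""
    def __init__(self, scale=1.0):
        self.scale = scale
        self._start = time.monotonic()

    def now(self):
        """Game-time seconds elapsed, scaled from real time."""
        return (time.monotonic() - self._start) * self.scale

# A 100-hour levelling curve compressed into one real second:
clock = GameClock(scale=360_000.0)
# Caveat from the text: anything tuned to real-time pacing (regen
# ticks, spawn timers, player reaction windows) behaves differently
# under this scaling, so the accelerated run validates the code path
# but not the experience players will actually have.
```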

At GDC '03, after my Small Worlds talk, Mike Steele posed the question, "as these games get more complex from the network theory point of view, can any one designer actually keep them in their head, or are we doomed?" I think it's a valid question.

Each of your points makes sense, but there are a couple of items in particular that seem to be of the "getting ever more difficult" variety.

1. Tiny advantages can have large impacts when multiplied by thousands of users over hundreds of hours.

This is more of a game balance issue than a "bug" issue, but it's related to the overall point that increasing MMOG complexity is making it harder to manage ongoing development of a live game. If expanding the feature list means increasing your chances of creating these little advantages that get mutated into Them!-sized problems, is there a point at which a game becomes unmanageable? (That raises the question of whether a game should be "managed" at all, but I'm thinking about this from a commercial product POV right now.)

Or is constantly chasing after these game-unbalancing giant ants with balance tweaks an unavoidable cost of increasing complexity?

2. The more features you add, the more potentially surprising interactions you get between features. So past a break-even point (based on QA staffing and tools), QA will *never* be able to test all the side effects of even a single change.

Assuming constant dev team and QA staffing levels over the next few years, the bottom-line question would seem to be: is it possible to create tools to help testers stay ahead of the combinatorial explosion wavefront?
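The "combinatorial explosion wavefront" is easy to quantify, at least for the simplest case. A minimal sketch, assuming (generously) that only pairwise feature interactions matter:

```python
from math import comb

def pairwise_interactions(n_features):
    """Number of feature pairs QA would need to consider if every
    pair of features can potentially interact (ignoring 3-way and
    higher-order effects entirely)."""
    return comb(n_features, 2)

# 10 features -> 45 pairs; 200 features -> 19,900 pairs.
# Adding one feature to a 200-feature game adds 200 new pairs to
# test, while the QA staff stays the same size -- hence the wavefront.
counts = {n: pairwise_interactions(n) for n in (10, 50, 200)}
```

Real interactions are of course not limited to pairs, so this is a floor, not a ceiling, on the testing burden.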

Regression testing tools are helpful, but they seem limited. I'm wondering if the next step in managing development of complex games won't be a kind of combined attack between a new development language and a new programming style made possible by this language that allows each individual feature to "know" what it is and is not permitted to do. We're groping toward something like this context-sensitivity in our movement toward data encapsulation through object-oriented languages. But I wonder if there isn't another level of abstraction necessary to have a language whose very structure helps define how every feature should behave, both individually and in relation to all other connected features.
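No such language exists to point at, but the "feature that knows what it is permitted to do" idea can be crudely mimicked today. A speculative sketch in plain Python, with every name hypothetical: each feature declares up front which game systems it may touch, and a checker refuses anything undeclared.

```python
# Hypothetical declarations: which systems each feature may touch.
ALLOWED = {
    "fishing": {"inventory", "skill_xp"},
    "auction_house": {"inventory", "currency"},
}

class SystemAccessError(RuntimeError):
    """Raised when a feature reaches outside its declared boundary."""
    pass

def touch(feature, system):
    """Enforce a feature's declared permissions at runtime.
    In the imagined language this check would be structural, caught
    at compile time rather than in production."""
    if system not in ALLOWED.get(feature, set()):
        raise SystemAccessError(f"{feature} may not touch {system}")
    return True

touch("fishing", "inventory")    # declared, so allowed
# touch("fishing", "currency")   # undeclared: raises SystemAccessError
```

The payoff for QA would be that an undeclared interaction fails loudly and locally, instead of surfacing weeks later as an economy exploit.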

I don't know whether such a language exists now, or whether one is being bashed around in the universities, or if it hasn't been invented yet. (I do know that it's not Ada, and I'm pretty sure it isn't Java or C#.) Scheme, Smalltalk, and Actor are possibilities, but I don't see LISP-based languages as likely successors to C.

Maybe such a context-aware language isn't possible. Or maybe it could be devised, but will take so long to be adopted by enough developers that the industry will suffer from being unable to grow much beyond current complexity levels for years to come.

But if not a radical rethinking of how we develop complex code, what other hope do developers have of staying ahead of the squall line of combinatorial chaos? If the development process is mature enough not to be the problem, then what solution can there be other than building some capability for module self-knowledge into our development tools?

Raph> I often hear folks from outside say stuff like "there was no discernible QA process," and it's frankly unfair; a better question to ask is what in the products and typical development processes of games makes teams with clearly demonstrable expertise in software engineering still have the defect incidence they do? <

I should emphasize that my comments about MMORPG QA relate to production servers, not development. I am quite ready to believe coding teams are following good software engineering practices. As a coder myself, I am impressed by the complex client and server code they write and, to a great extent, get working. But I have gotten the impression that once the code gets tossed over the wall from development and testing to production, the industrial-scale MMOGs are not following modern industrial-scale production QA practices.

In particular, I've seen beta tests ask players things like "let us know if any of our quests can't be completed". To me that indicates inadequate monitoring of the production code. In a modern physical production environment you would be sampling product actually in production to see that it is working to specification. I don't see why this shouldn't apply to virtual production: that is, sampling production drop rates, combat outcomes, and quest completions to ensure they are meeting specification.
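The physical-production analogy maps directly onto a statistical process check. A minimal sketch (function name, numbers, and the 3-sigma threshold are all illustrative assumptions, not anyone's actual tooling) of sampling a live drop rate against its design spec:

```python
def sample_drop_rate(kills, spec_rate, observed_drops, tolerance=3.0):
    """Flag a loot table whose production sample strays from spec,
    using a rough z-score on the binomial proportion."""
    expected = kills * spec_rate
    stdev = (kills * spec_rate * (1 - spec_rate)) ** 0.5
    z = (observed_drops - expected) / stdev
    return abs(z) > tolerance  # True => investigate this loot table

# Spec says a 5% drop rate, but 10,000 kills produced only 380 drops:
flagged = sample_drop_rate(10_000, 0.05, 380)  # well outside 3 sigma
```

The same check works for quest-completion rates and combat outcomes; the hard part, as noted elsewhere in the thread, is deciding what to log, not running the arithmetic.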

My comments might be a bit out of date, though. In the most recent beta test I was in, WoW, much of this production sampling was being done--though by a player-built resource, www.thottbot.com. The main drawback of this method is that it relies on client data, which introduces both self-selection error and the possibility of deliberate spoofing into the data. Still, it does illustrate that production-server monitoring of drop rates, quest completions, etc. is possible. And that post-production check does give me more confidence in WoW’s quest completions and drop rates.

Raph makes a good case that SOE are doing a lot of QA before the code hits the production servers. But what is in place to ensure simple database typos don’t compromise the intended effects? In EQ at least, these produced some of the biggest player annoyances. Mistyped quest text sent you to the wrong NPC. Mistyped drop rates made certain parts of the game too hard or too easy. Mistyped spell attributes made a supposedly better weapon worse than the weapon it was replacing. Not big things in themselves, but I do think they have a larger impact on customer perception of QA. Complex coding and balance issues are one thing. “Jeez, these guys can’t even type straight” is another.

I’ve seen a lot of comments on boards that “EQ II will be a buggy mess”. Fairly or unfairly for SOE, people continue to project their dismal early record in EQ onto the current products. Some explicit recognition that the original production server monitoring was weak, and that the problem has been addressed, would help. I understand there is a cost issue with live server monitoring. But, with the dropping cost of hardware, is that still prohibitive?

We do production server monitoring, more and more as time goes on (e.g., EQ2 does it better than Galaxies did, and so on). The difficulty isn't so much storing the data as knowing what to log, how to log it, and how to interpret it. EQ2 has had great success with measurements of quest completion, for example, and SWG has had great success with measurements of client performance. But these just scratch the surface, and at some point, the amount of data you get back out can become unmanageably complex in itself.