The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

Making a video game can lead to a fair number of misconceptions and misunderstandings among those who aren’t working on the team, especially as an update’s launch approaches and it’s the Testing department’s time to shine. So I wanted to have a go at letting everybody know exactly what our department does and what we’re about!

First things first, let’s take a look at what the QA team is:

The QA team – Quality is none of our middle names, it would be a terrible middle name.

Comprised of around a dozen seasoned players who know their way around KSP like the back of their hand, the QA team is arguably the most passionate part of KSP’s community. Many of our team have worked professionally in software development, so they know the ins and outs of general development cycles, and those who haven’t have picked it up very quickly! The QA team was created more than a year ago, after 0.18, to accommodate the increased rate at which content was being developed and the scale of the game’s growth. Since then it has undergone minor personnel changes, but for the most part a core handful of members has remained.

Now, let’s go through what the QA team does:

QA – Assuring Quality through Quality Assurance

Those familiar with game development, or perhaps games in general, will know the essence of QA is bug hunting: finding bugs, documenting them and assisting Developers in fixing them. It’s the most fundamental part of the entire process and is performed by a handful of us QA testers pretty much constantly, thanks to the Branch System.

Given such a comparatively small, volunteer team, QA is very much about efficiency and focus, relying mostly on test cases rather than general playtesting. What this means is that instead of going through the motions of the game as a normal player would, we identify the areas of new content that are usually prone to issues and hunt for bugs there. This method cuts the time taken to find issues by a significant margin and means the content is tested more evenly – playtesting can sometimes skip completely past some aspects of a feature.

Furthermore, this method allows us to work closely with the developers and compare exactly what they intended to occur for specific cases, to what actually occurs – this is where QA becomes more about feedback.

When we talk about feedback, it’s actually quite a loose term. It can range from the kind of feedback players such as you, reading this now, provide us with – comments on the forums about KSP – to the developed suggestions we receive via e-mail or on the many other KSP communities out there. In QA, however, we use the term in a very narrow scope: it means providing specific recommendations and improvements for certain areas of a feature. Since we rarely gather general feedback in the form of a forum thread during QA, we use the Redmine issue tracking system to log all our feedback. There is little use in telling a feature’s developer it could be a bit more ‘flashy’ or ‘eye-catching’; what we both need is an established dynamic of improvement, such as recommending that a certain component be changed in size or color, and suggesting the size or color to change it to.

With this explained, one can see that QA is a lot more than just finding bugs. It’s about knowing the game (especially how it works under the hood), comprehending the ideas behind its features, and understanding what a Developer wants a feature to turn out like and how you can assist them in making it happen. Furthermore, it’s about condensing all of that into concise, objectively written issue reports.
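To make the contrast concrete, here is a hypothetical sketch of what a narrow, actionable feedback item might look like when logged to a tracker. The payload shape follows Redmine’s REST API (issues are created by POSTing JSON to /issues.json); the project and tracker IDs, field wording and the example content are all made up for illustration, not taken from Squad’s actual tracker.

```python
import json

def feedback_issue(subject, current, recommendation, project_id=1, tracker_id=2):
    """Build a Redmine-style issue payload for one concrete feedback item.

    The description pairs observed behaviour with a specific recommendation,
    rather than a vague request like 'make it more eye-catching'.
    """
    description = (
        f"Current behaviour: {current}\n"
        f"Recommendation: {recommendation}\n"
    )
    return {"issue": {
        "project_id": project_id,
        "tracker_id": tracker_id,
        "subject": subject,
        "description": description,
    }}

# Hypothetical example: a sizing recommendation with a concrete target value.
payload = feedback_issue(
    "Stage button too small on high-resolution screens",
    "Stage button renders at 24px at 4K",
    "Scale the button to 48px, or tie its size to the UI scale setting",
)
print(json.dumps(payload, indent=2))
```

The point of the structure is that the recommendation names the component, the change, and the target value, which gives the developer something they can act on directly.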

That doesn’t mean we don’t come across our fair share of game-breaking and eye-widening bugs, though. These buggers can be quite the kicker when you’re striding through an otherwise uneventful set of features. However, we’ve all encountered more than enough of them to be able to jump right into establishing the information needed for an ideal bug report.
Such a method involves doing many of the following:

Making any relevant notes before doing anything else. Noting down what exactly has occurred prior to the issue becoming present is invaluable; it’s the sort of detail that is easy to forget when it comes to later steps.

Quickly making a copy of the relevant logs (usually KSP.log and output_log/player.log).

If the issue relates to persistence, or the current situation would be rare or difficult to get into again, making a copy of the persistence file, quicksave file and/or the craft file.

Inspecting the relevant logs for any errors or warnings written to them.

Narrowing down a potential cause using the above information.

Determining reproduction steps.

Finally, distilling the mass of information collected into a neat and brief issue report.
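The first few steps above – note down what happened, then snapshot the logs and save files before they get overwritten – can be sketched as a small helper script. This is a hypothetical illustration, not Squad’s actual tooling; the file names come from the post (KSP.log, output_log/player.log, persistence and quicksave files), but the exact paths vary by platform and install, so the helper simply copies whichever candidates exist.

```python
import shutil
from datetime import datetime
from pathlib import Path

def snapshot_bug_evidence(game_dir, report_root, note=""):
    """Copy a bug's evidence into a timestamped folder for a later report.

    Hypothetical candidate paths; real installs differ by OS and KSP version.
    Returns the destination folder and the list of files actually copied.
    """
    game_dir, report_root = Path(game_dir), Path(report_root)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = report_root / f"bug-{stamp}"
    dest.mkdir(parents=True, exist_ok=True)

    # Step 1: record what happened before touching anything else –
    # this detail is the easiest thing to forget later.
    (dest / "notes.txt").write_text(note)

    # Step 2: copy whichever relevant logs and save files actually exist.
    candidates = [
        "KSP.log",
        "KSP_Data/output_log.txt",
        "Logs/player.log",
        "saves/default/persistent.sfs",
        "saves/default/quicksave.sfs",
    ]
    copied = []
    for rel in candidates:
        src = game_dir / rel
        if src.is_file():
            shutil.copy2(src, dest / src.name)
            copied.append(rel)
    return dest, copied
```

With the evidence frozen in one folder, the remaining steps – reading the logs, narrowing down a cause and finding reproduction steps – can happen at leisure without worrying that the next play session will overwrite the files.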

While this seems like a lot to do, it quickly becomes routine when encountering an issue and is very much second nature. Of course, not all of the above is carried out for every issue; it depends greatly on the issue’s severity and nature.
All of this keeps us working quickly through features, branches and issues, with builds flying out of the build server many times a day. And at the end of QA, we have a build that can be played without encountering major issues, is feature-complete and is ready to be given a thorough inspection and playtest by the Experimental Team.

So, all in all, that’s what the QA Team does during an average QA phase. Onward to the Experimental Team!

Experimental Team – More than just a Team that Experiments.

The Experimental Team is comprised of, on average, 50-60 active testers. All of these testers are volunteers who contribute their spare time to playtest KSP. Prior to version 0.15, the Experimental Team didn’t exist. However, with the increasing size of the KSP player base and the game itself, open testing was no longer a viable option for efficient and spot-on bug reporting and issue tracking. Thus the Experimental Team was created.

The Experimental Testers are normal KSP players, sourced from the various KSP communities via a simple application process. Understandably, they often don’t have as much spare time to devote to testing as the QA Testers, so we have significantly more Experimental Testers signed up than we need at any one time. This works in everyone’s favour: it keeps the activity level up throughout an Experimental phase and doesn’t put pressure on the testers as they also deal with their personal and professional lives.

What is the Experimental Team responsible for, then?

Experimental Testing – Are we Experimentally Testing or Testing Experimentals?

At its core, Experimental Testing is much like using a focus group to ‘test the water’ on an update. Of course, there is far more depth to Experimental Testing than just using the Team as a focus group, but we’ll touch upon that later. For now, it’s best to discuss how significant and useful the Experimental Team is in this part of the testing process.

After an update goes through QA, as detailed above, it is hopefully free from major issues and each feature has had any needed major improvements and refinements carried out; we’re at a feature-complete state. However, many components of a feature may still be unpolished, such as part balancing, or the performance of newer UI on different platforms (e.g. high-resolution screens). Here is where Experimental Testing comes in and assists us in cleaning up the remaining minor feedback issues. In game development, it’s a mammoth task to account for all edge cases and types of hardware, but a team as diverse as the Experimental Team makes it a whole lot easier to approach that task and dissect it into manageable parts.

Moving on, while the Experimental Team is incredibly useful for minor, focused feedback, it really shines when we arrive at general feedback for an update. Essentially, each Experimental Tester gives their opinion in a feedback thread, in as much detail as they feel necessary. Here the size of the Experimental Team becomes its strength: we can get a feel for how the Community may react to certain components of an update, and gauge what minor changes are needed to improve players’ perception of a feature. Furthermore, this feedback really illustrates whether difficulties in using certain features are more widespread than others, and shows us what misconceptions about gameplay mechanics exist.

As I mentioned earlier, feature-lock is pretty much in place for Experimental Testing, so this feedback works with that in mind: most of the changes that come out of it are minor ones, chosen for the largest possible effect.

An Experimental Testing phase typically lasts around a week, though it is highly dependent on the number of issues that arise and how much further development is required to reach a release state. At the end of the Experimental phase, a fair number of issues are still open on the tracker, but it’s important to note that these are always minor ones: issues outside the scope of the update, or ones that would simply take too much time and resources to resolve – the latter more often for feedback issues than for bugs. The overall aim of the Experimental Testing period is to produce an update that is balanced, playable, can stand the test of time and performs well on as wide a range of systems as possible.

That about concludes this blog, though I’ll be writing up more in the future covering the various other aspects and components of testing. Feel free to ask any questions you have in the comments; I’ll be sure to read and reply to them.
