The Use of Genetic AI in the Balancing of Competitive Multiplayer Games

In the increasingly profitable competitive multiplayer games industry, ensuring products are fun, challenging and fair can be key to a game's success. Traditionally, measuring and improving the balance of games has been costly in both time and money, requiring many hours of work from large teams. Automated approaches could improve game quality while offering ongoing time and money savings, a significant advantage to an ever-growing pool of game developers.

This project aims to show how gameplay balance in competitive multiplayer games can be improved by automatically highlighting potential balance issues. More specifically, it explores how imbalances can be surfaced using data gathered by iteratively playing Artificial Intelligence agents against one another.

Why Balance?

Balance is a key consideration in any asymmetric gameplay experience, particularly in competitive multiplayer games. Typically, the outcome of a match should be decided by player skill rather than the tools at each player's disposal. Imbalances lead to homogeneous playstyles and frustration with the experience, ultimately driving players away from a game.

Automated Assistance

The problem is gathering meaningful volumes of data without rolling a game out to millions of real users. Automating gameplay with Artificial Intelligence (AI) agents offers a way of gathering the necessary data. Given the high economic and time costs of traditional balancing, the potential advantages of automation are clear, and because automated processes scale, the final results may be significantly more balanced.

Automation in Action

To test the potential for automating elements of balance testing, a game typical of the genre was developed, along with an AI capable of playing it. Pools of AI agents were matched against one another, gathering data throughout, and this data was used to infer imbalances in the game.
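The data-gathering loop described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the card names, strength values and the noise model standing in for AI decision-making are all assumptions made for the example.

```python
import random
from collections import defaultdict

# Hypothetical card pool: name -> strength value (illustrative only).
CARDS = {"Fireball": 1.6, "Shield": 1.0, "Strike": 1.0, "Heal": 0.4}

def play_match(deck_a, deck_b, rng):
    """Simulate one match: the deck with the higher total strength,
    plus noise standing in for AI decision-making, wins."""
    score_a = sum(CARDS[c] for c in deck_a) + rng.gauss(0, 1)
    score_b = sum(CARDS[c] for c in deck_b) + rng.gauss(0, 1)
    return "A" if score_a >= score_b else "B"

def gather_data(num_matches=2000, seed=0):
    """Play AI agents against one another iteratively and record
    per-card win rates, the raw signal used to spot imbalances."""
    rng = random.Random(seed)
    wins, plays = defaultdict(int), defaultdict(int)
    names = list(CARDS)
    for _ in range(num_matches):
        deck_a = rng.sample(names, 2)
        deck_b = rng.sample(names, 2)
        winner = deck_a if play_match(deck_a, deck_b, rng) == "A" else deck_b
        for c in deck_a + deck_b:
            plays[c] += 1
        for c in winner:
            wins[c] += 1
    return {c: wins[c] / plays[c] for c in names if plays[c]}

win_rates = gather_data()
```

A card whose win rate sits well above or below that of its peers is a candidate imbalance; in this toy pool the overtuned "Fireball" surfaces quickly.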

For the purposes of this project, three consecutive balancing rounds were undertaken. Each AI player was equipped with the same deck of cards, and the decks changed between rounds as card classes were adjusted, with every change driven by the gathered data.
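The between-round adjustment step can be sketched as a simple rule that nudges each card toward an even win rate. This is an assumption-laden illustration: the `rebalance` function, its `target` and `step` parameters, and the idea of a single numeric strength per card are all hypothetical, standing in for the project's data-driven card class changes.

```python
def rebalance(cards, win_rates, target=0.5, step=0.2):
    """Nudge each card's strength toward the target win rate:
    cards that win too often are weakened, cards that win too
    rarely are buffed, in proportion to their deviation."""
    return {
        name: strength * (1 - step * (win_rates[name] - target) / target)
        for name, strength in cards.items()
    }

# Example: a card winning 70% of matches is weakened, one winning
# 30% is buffed, ready for the next round of automated testing.
adjusted = rebalance({"X": 1.0, "Y": 1.0}, {"X": 0.7, "Y": 0.3})
```

Iterating this gather-then-adjust cycle is what the three balancing rounds amount to: each round's data feeds the next round's deck.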

Results

Over three testing rounds, subtle imbalances in the card pool were highlighted. Examples of the adjustments made in response are shown below:

Conclusion

The success of the balancing process suggests that the proposed approach to game balancing is viable and merits further exploration for commercial applications. The clarity with which the test data highlighted issues, combined with the speed with which that data was gathered, makes the technique particularly effective.