DeepMind AI Moves on from Board Games to StarCraft II


There is a long and storied history of machines beating humans at games we once dominated. There was Deep Blue beating humans at chess, then Watson winning Jeopardy!, and most recently, Google's DeepMind AI branch created a computer that can win at the deviously complex game of Go. Now, DeepMind is moving on to its biggest challenge yet: StarCraft II. I wonder if it'll go for the Zerg rush.

DeepMind isn't going it alone here — Blizzard is on board with the project and is making thousands of StarCraft II matches available for training the learning machine. The result could be far different from the AI that already ships in games like StarCraft II. Those bots are predictable and trivially easy for experienced players to defeat because so much of their behavior is pre-programmed. DeepMind has the opportunity to build an artificial player that behaves more like a real person.

Blizzard has started DeepMind off with a cache of more than 65,000 games of StarCraft II. The amount of data fed into the bot will increase dramatically over time, too. Blizzard will make about 500,000 more games available each month.

DeepMind is interested in conquering StarCraft because it's a vastly more complex problem than any other game it has worked on. Go was a huge challenge because of the enormous number of possible moves and board configurations — a one followed by 170 zeros. DeepMind estimates that StarCraft's complexity would add at least 100 more zeros to that figure. That's because you have many more options at any given moment. You can build another unit, add more defensive structures, form a group, move your units to a certain place, attack the enemy, and so on.
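As a back-of-the-envelope illustration of that gap — using the article's round figures, not DeepMind's exact estimate — Python's arbitrary-precision integers can represent both numbers directly:

```python
# Rough scale comparison based on the figures cited in the article
# (these are illustrative round numbers, not DeepMind's precise estimates).
go_configurations = 10 ** 170        # a one followed by 170 zeros
starcraft_configurations = 10 ** 270  # at least 100 more zeros

# StarCraft II's space is at least 10^100 times larger than Go's.
ratio = starcraft_configurations // go_configurations
print(ratio == 10 ** 100)  # True
```

For perspective, that ratio alone dwarfs the estimated number of atoms in the observable universe (roughly 10^80).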

DeepMind's AlphaGo AI amazed the world by winning a high-profile Go matchup.

Another issue is that DeepMind has to play by the same rules as a human player. That means a large part of the map will be hidden by the fog of war, limiting the information available to the AI. So, the machine will need to anticipate the actions of human players in a way it doesn't when playing board games. It can't simply plan out the game in advance, because it can only see a small part of the map.

A StarCraft II bot that can beat the world's best players would be cool to see, but this is about more than pwning noobs. DeepMind believes an AI capable of mastering StarCraft II will be able to tackle big problems in the real world. But first, it must construct additional pylons.