A reflection on the AIIDE 2010 StarCraft AI Competition and a discussion of what future competitions could offer the game industry.
I have been involved in research on RTS games for a few years now. One of the things that really caught my attention was that most researchers from an AI background were unable to apply their work to the complexity of current-generation RTS games. Instead, they would use an abstract RTS game and evaluate their work within that restricted framework. But I had a different agenda. My goal was to learn from expert players and their replays, which required dealing with the complexity of a complete game.
To settle the score of which AI technique works best for RTS games, I proposed a competition in which bots compete head-to-head in a tournament. I wanted to simulate an environment that placed bots in a situation as close to professional gaming as possible. The result was the AIIDE 2010 StarCraft AI Competition.
AIIDE 2010 StarCraft AI Competition
This was an international event, involving a dozen countries. The task given to participants was to build the best-performing bot for AI vs AI matches in a double elimination tournament bracket. The competition included four different tournaments of varying complexity, but the most popular mode by far was the complete-gameplay tournament. This mode simulated a professional StarCraft tournament, such as BlizzCon, but for bots. In all, 28 participants submitted bots for the tournament.
The competition was open to everybody, but most of the participants were affiliated with a university. Non-affiliated participants included both hobbyist programmers and industry veterans. Among the university-affiliated participants, it was quite remarkable to see how many from non-games fields used the event to justify the study of game AI. In particular, the UC Berkeley participants were able to convince their advisor that StarCraft was an interesting domain even though their focus was NLP (Natural Language Processing).
UC Berkeley ended up winning the competition with an interesting strategy focused on air units. However, it is not a dominant strategy, and there is still a lot of potential for other AI techniques in this domain. Playtesting against a former pro gamer showed that the bot is not yet at expert level, even though it can defeat most casual players.
UC Berkeley Winners (Left) and Organizer (src: Vadim Bulitko)
The competition was unique in that it required AI systems to work at the same level as human players. That is, the AI has to operate with imperfect information and scout the opponent due to the fog of war. The first competition was a huge success, and players are looking forward to playing against more sophisticated AI systems.
The competition was made possible by the BWAPI project, which provides hooks into StarCraft (Blizzard provided a content-use license for the day of the competition). While the game is over 10 years old, it still has an active player base, and players are enjoying the new gameplay experience provided by AI systems.
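To give a feel for what "hooks into StarCraft" means in practice, here is a minimal sketch of a BWAPI bot module that sends idle workers to scout unexplored start locations, which is how bots deal with the fog of war mentioned above. This is an illustrative assumption on my part, not how any of the competition entries were written, and the names follow the current BWAPI AIModule interface, which differs in detail from the 2010-era version used at AIIDE.

```cpp
#include <BWAPI.h>

// Hypothetical minimal bot: illustrates the BWAPI callback style only.
class ScoutingExampleModule : public BWAPI::AIModule
{
public:
  void onStart() override
  {
    // The bot only sees what its own units see; the opponent stays hidden
    // under the fog of war until a unit is sent out to scout.
    BWAPI::Broodwar->sendText("Scouting example bot started");
  }

  void onFrame() override
  {
    // Called once per game frame; issue orders to our idle workers.
    for (BWAPI::Unit unit : BWAPI::Broodwar->self()->getUnits())
    {
      if (!unit->getType().isWorker() || !unit->isIdle())
        continue;

      // Send idle workers toward start locations we have not explored yet.
      bool sentToScout = false;
      for (BWAPI::TilePosition start : BWAPI::Broodwar->getStartLocations())
      {
        if (!BWAPI::Broodwar->isExplored(start))
        {
          unit->move(BWAPI::Position(start));
          sentToScout = true;
          break;
        }
      }

      // Once every start location is explored, fall back to mining.
      if (!sentToScout)
      {
        BWAPI::Unit minerals = unit->getClosestUnit(BWAPI::Filter::IsMineralField);
        if (minerals)
          unit->gather(minerals);
      }
    }
  }
};
```

Even a toy module like this has to reason about what it has and has not seen, which is exactly the property that made the competition a good test of game AI rather than of scripted behavior.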
Should more games expose their APIs for research? The StarCraft competition will continue in 2011, and gamers are looking forward to the challenges that research in this domain will provide.