
Devs without programming knowledge could soon create dynamic AI

The GDC X OLL podcast crew spoke with Mitu Khandaker-Kokoris about the work she's doing to make artificial intelligence more accessible to game creators without programming expertise.

Alissa McAloon, Publisher

March 7, 2017


During GDC last week, the crew behind the GDC X One Life Left podcast took the time to speak with a handful of people working on interesting projects and technology in the video game development sphere.

One such interviewee, Mitu Khandaker-Kokoris, spoke in detail about the work she’s doing with SpiritAI to make artificial intelligence more accessible to game creators without programming expertise.

The tool, an in-development character engine, would give developers the ability to create AI characters for a wide variety of games and situations.

Ideally, this character engine would enable narrative designers, writers, and other developers to create responsive AI characters without programming knowledge. Khandaker-Kokoris compared the tech to the fictional AI robots in HBO’s Westworld series, and that comparison carries through to how the technology functions behind the scenes as well. 

“There’s text or speech generated on what the NPC thinks is most appropriate. It basically improvises on a script that you’ve written and also on the things it knows in the world,” she explained. “So it's about coming up with new things that this character says based on what its own agenda is and what you’ve said to it.”
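
SpiritAI hasn’t published how the character engine works under the hood, so the snippet below is only a rough, hypothetical sketch of the “improvise on a script” idea Khandaker-Kokoris describes: an NPC that picks from authored lines based on what the player typed, what it knows about the world, and what it currently wants, and falls back on its own agenda when nothing in the script fits. Every class and field name is invented for illustration and doesn’t reflect SpiritAI’s actual tooling.

```python
# A minimal, hypothetical sketch of the "improvise on a script" idea described
# above -- not SpiritAI's actual tooling. Every name here is invented for
# illustration. The NPC holds an authored script, a set of world facts, and an
# agenda, and picks the authored line that best matches what the player typed.

from __future__ import annotations

import re
from dataclasses import dataclass, field

@dataclass
class ScriptedLine:
    text: str                         # authored dialogue, written by a narrative designer
    keywords: set[str]                # player words this line is a response to
    requires_fact: str | None = None  # only usable if the character knows this fact

@dataclass
class Character:
    agenda: str                                   # what the NPC currently wants
    facts: set[str] = field(default_factory=set)  # what it knows about the world
    script: list[ScriptedLine] = field(default_factory=list)

    def respond(self, player_input: str) -> str:
        words = set(re.findall(r"[a-z']+", player_input.lower()))
        # Keep authored lines whose keywords overlap the player's words and
        # whose required facts the character actually knows.
        candidates = [
            line for line in self.script
            if line.keywords & words
            and (line.requires_fact is None or line.requires_fact in self.facts)
        ]
        if candidates:
            # Deliver the authored line with the strongest keyword overlap.
            return max(candidates, key=lambda line: len(line.keywords & words)).text
        # Nothing scripted fits: "improvise" by steering back toward the agenda.
        # A real system would generate text here; this sketch just exposes the goal.
        return f"I'd rather not get into that. (agenda: {self.agenda})"

# Example: a murder suspect the player can interrogate by typing at them.
suspect = Character(
    agenda="deflect suspicion",
    facts={"was_at_the_docks"},
    script=[
        ScriptedLine("I was at the docks all night, ask anyone.",
                     {"where", "alibi", "night"}, requires_fact="was_at_the_docks"),
        ScriptedLine("Murder? I barely knew the man.",
                     {"murder", "victim", "kill", "killed"}),
    ],
)

print(suspect.respond("Where were you that night?"))   # scripted alibi line
print(suspect.respond("Did you kill him?"))            # scripted deflection
print(suspect.respond("Tell me about the weather."))   # improvised fallback
```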

SpiritAI's recently unveiled project was on the GDC show floor last week. In that demo, players could communicate via text with an in-game character accused of murder to explore its motivations, agenda, and knowledge.

The demo was similar to some of the potential applications Khandaker-Kokoris mentioned in the podcast recording, such as one that described how NPCs in massively multiplayer online games could share knowledge and respond differently to people based on a player’s actions or words.

“You can sort of write characters who are very responsive and dynamic and you can talk to them, either through natural language or [through] typing at them and they’ll know what you're saying,” explained Khandaker-Kokoris. The character engine has applications in virtual reality as well, and she says it can also enable AI characters to respond to where a character is physically located in an environment.

For Khandaker-Kokoris’ full explanation of the technology and more examples of its application, take a look at the final segment of the video above, right around the 1:12:30 mark. The full episode, featuring conversations with a total of 12 developers, is well worth a listen as well.  

And while you’re at it, be sure to subscribe to the Gamasutra Twitch Channel and the One Life Left podcast for more great game analysis to enrich your development life. 

Gamasutra and GDC are sibling organizations under parent UBM Americas


