
Video game music systems at GDC 2017: what are composers using?

In the GDC 2017 audio track, several divergent systems for interactive music implementation were presented. Video game composer Winifred Phillips explores some of these systems, while also sharing interactive music case studies from her own projects.

Winifred Phillips, Blogger

June 26, 2017


By video game music composer Winifred Phillips

The 2017 Game Developers Conference could be described as a densely-packed deep-dive exploration of the state-of-the-art tools and methodologies used in modern game development. This description held especially true for the game audio track, wherein top experts in the field offered a plethora of viewpoints and advice on the awesome technical and artistic challenges of creating great sound for games. I've given GDC talks for the past three years now, and every year I'm amazed at the breadth and diversity of the problem-solving approaches discussed by my fellow GDC presenters. Often I'll emerge from the conference with the impression that we game audio folks are all "doing it our own way," using widely divergent strategies and tools.

This year, I thought I'd write three articles to collect and explore the ideas that were discussed in five different GDC audio talks. During their presentations, these five speakers all shared their thoughts on best practices and methods for instilling interactivity in modern game music. By absorbing these ideas side-by-side, I thought we might gain a sense of the "bigger picture" when it comes to the current leading-edge thinking for music interactivity in games. In the first article, we'll look at the basic nature of these interactive systems. We'll devote the second article to the pros and cons of each system, and in the third article we'll look at tools and tips shared by these music interactivity experts. Along the way, I'll also be sharing my thoughts on the subject, and we'll take a look at musical examples from some of my own projects that demonstrate a few ideas explored in these GDC talks.

So, let's begin with the most obvious question. What kind of interactive music systems are game audio folks using lately?

The big three music systems

In recent years, the three most popular methods for instilling interactivity in game music have been the use of rendered music consisting of pre-recorded audio files (such as WAV or OGG), the triggering of pre-composed music data files (most commonly MIDI), and procedural music generation systems that construct unique musical content on the fly. I wrote extensively about all three of these systems in my book, A Composer's Guide to Game Music. At GDC 2017, the discussion in the field of interactive music continued to center around these three distinct approaches, with a number of new tweaks and adjustments designed to enhance the utility and effectiveness of the music design. In this first article, let's take a generalized look at the music systems that each of these speakers used in their projects. By looking at them side-by-side, we'll get a broader perspective on the current interactive music strategies being employed by game audio folks. In subsequent articles we'll discuss more details and challenges associated with these approaches to music interactivity, but first let's try to wrap our heads around the basic nature of these systems.


For sound designer Steve Green, the music system for the underwater exploration game ABZU was most effectively executed using rendered/pre-recorded audio with some straightforward interactivity designed to address a common implementation problem. "I know of a few games that basically have a music or a track that will go right to the next one," Green comments, pointing out a very old-school technique wherein disparate music tracks are butted up against each other.

While such a simple system may work, the effect will inevitably feel clunky and inelegant, so Green solved the problem in ABZU by focusing on musical transitions that overlap the preceding piece and thereby facilitate the switch-over to the next. Taking into account that each piece of music is triggered when the player enters a new in-game stage (i.e. location), the music system concentrates on overlapping transitions in order to present a more cohesive musical texture. This allows the musical score "to find this transitionary moment that gets you from stage one to stage two," Green explains.
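To make the mechanic concrete, here's a minimal sketch of an overlap-based transition scheduler, written in Python. Everything here is hypothetical (ABZU's actual engine and data aren't public); the idea is simply that each stage's track carries a short bridging cue that overlaps the tail of the outgoing track before the next track begins:

```python
# A minimal sketch of overlap-based stage transitions. All file names,
# timings, and structures are hypothetical illustrations, not ABZU's
# actual implementation.

class StageMusic:
    def __init__(self, track, bridge_cue, overlap_secs):
        self.track = track                 # pre-rendered track for this stage
        self.bridge_cue = bridge_cue       # short cue that bridges to the next stage
        self.overlap_secs = overlap_secs   # how long the cue overlaps the old track

def schedule_stage_change(now, current, upcoming):
    """Return (time, event) pairs: the bridge cue starts immediately,
    overlapping the outgoing track before the new track takes over."""
    return [
        (now, f"start cue '{current.bridge_cue}' over '{current.track}'"),
        (now + current.overlap_secs, f"stop '{current.track}'"),
        (now + current.overlap_secs, f"start '{upcoming.track}'"),
    ]

stage1 = StageMusic("stage1_theme.ogg", "stage1_to_2_bridge.ogg", 4.0)
stage2 = StageMusic("stage2_theme.ogg", "stage2_to_3_bridge.ogg", 4.0)

# The player crosses into stage two at t=0: the cue smooths the hand-off.
for t, event in schedule_stage_change(0.0, stage1, stage2):
    print(f"t={t:0.1f}s  {event}")
```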

Steve Green's description of interactive transitions is very interesting, and personally, I've found this to be a very useful interactive music mechanic. I worked with a similar music system for the dancing minigame sections of the Spore Hero video game, a part of the famous Spore franchise developed by Electronic Arts. I thought I'd share some details about this system in order to further broaden our perspective on using interactive transitions. In the Spore Hero dancing minigame, your character must imitate the movements of an opponent in what is essentially a dance-off. This dancing is accompanied by a bass line, plucked strings and some drums and rhythm instruments. If your character imitates those movements successfully enough, the system triggers a "special move" sequence that's more complicated. The special move music features lively flutes that join the mix for a cheerful phrase with an Irish flavor. In order for this musical transition to proceed as smoothly as possible, the interactive music system uses an overlapping transition (similar to the overlapping transitions described by Steve Green in his talk). The overlapping transition in the Spore Hero dance music is a vigorous percussive rhythm and roll performed by high-pitched pellet drums. Here's an audio example from the Spore Hero dancing gameplay. Notice the pellet drums at 0:13 and 0:25:


Let's now move on to another of the GDC presentations we'll be examining in these articles. Staying with rendered/pre-recorded music systems, let's take a look at how educator Leonard J. Paul's talk, "Different Approaches to Game Music," describes the process of implementing the interactive music of a scrolling platformer game called Vessel. While the music of ABZU featured a streamlined music design involving layered transitions between linear tracks, the music of Vessel went further with the idea of more intricate music layering. "We did custom adaptive music design with this one," Leonard J. Paul explains during his wide-ranging GDC talk on music interactivity. Describing the process of working with an electronica musician to requisition music compositions for the project, Paul dives into the details of the interactive implementation. "We got all of his stems, which was great," Paul says (stems, as we know, usually consist of the individual instruments in a music mix, recorded into discrete audio files so they can be delivered separately).

"So there was twenty to thirty tracks," Paul adds, referring to each of the separate instrument stems as a 'track' in this context. "I distilled those down to an ambient layer, a harmony layer, a bass layer and a drum layer," Paul says, referring to his process of grouping these stems into the four aforementioned subgroups. With this structure in place, Paul could then mix the music interactively, working triggers into the game that would alter the music mix as the game progressed.


Working with someone else's compositions in order to introduce musical interactivity can be even more complex when the interactive system is equipped with more diverse features. Sho Iwamoto, an audio programmer at Square Enix, describes the interactive music system he designed for the game Final Fantasy XV. "I developed the MAGI system; the interactive music system for an audio engine," Iwamoto says, "(MAGI) stands for Music API for Gaming Interaction."

Like the interactive music systems for Vessel and ABZU, the music system for Final Fantasy XV is also structured around rendered/pre-recorded audio files. However, Iwamoto added extra functionality to the more traditional interactive approaches. Describing the features of the MAGI system, Iwamoto says, "First, it enables vertical and horizontal transitions. Because, you know, that's basic. Second, it accepts any tempo or time signature," Iwamoto adds, "and also any changes in the music. I think that, despite the fact that not many interactive music systems support this, I believe this is really important." MAGI accomplishes this goal by virtue of many sync points (programmed moments in a music file wherein an interactive music system can instantly transition to another music file). The MAGI system recognizes and responds to these sync points in sophisticated ways (stay tuned for more on this in articles two and three of this series).
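Here's a minimal sketch of how a sync-point transition might work in principle. The timestamps and function names below are invented for illustration (MAGI itself is Square Enix's internal system); the key point is that because sync points are authored into each track, a jump scheduled this way lands musically regardless of tempo or meter changes along the way:

```python
import bisect

# A minimal sketch of sync-point transitions: each track carries authored
# timestamps where jumping to another track is musically safe. All data
# here is invented for illustration.

sync_points = [0.0, 7.5, 15.0, 22.5, 30.0]  # seconds, authored per track

def next_sync_point(playhead, points):
    """Return the first sync point at or after the current playhead."""
    i = bisect.bisect_left(points, playhead)
    return points[i] if i < len(points) else None

def schedule_transition(playhead, points, next_track):
    t = next_sync_point(playhead, points)
    if t is None:
        return f"no sync point left; fall back to a crossfade into '{next_track}'"
    return f"jump to '{next_track}' at t={t:.1f}s (playhead now at {playhead:.1f}s)"

# Combat begins while the playhead is at 9.2s: wait for the 15.0s sync point.
print(schedule_transition(9.2, sync_points, "battle_theme.ogg"))
```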


For the Plants vs. Zombies: Heroes mobile game, audio director Becky Allen adopted a completely different music system. "The score is a hybrid score," Allen says. "It’s a mix between MIDI and wav." To clarify, let's remember that MIDI (Musical Instrument Digital Interface) is a music data format that allows note events to be saved as data. The data then triggers a library of sounds to play according to the note events saved in the MIDI file.
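In other words, a MIDI score is essentially a timed list of note events that the playback engine resolves against a sound bank. Here's a minimal sketch of that idea; the events and bank below are invented for illustration:

```python
# A minimal sketch of MIDI-style playback: note events stored as data,
# resolved at runtime against a sound bank. All values are invented.

note_events = [
    # (time_in_beats, channel, note_number, velocity)
    (0.0, 0, 60, 100),  # middle C on channel 0
    (0.5, 0, 64, 90),
    (1.0, 1, 36, 110),  # a low drum hit on channel 1
]

sound_bank = {0: "pizzicato_strings", 1: "percussion_kit"}

for beat, channel, note, velocity in note_events:
    instrument = sound_bank[channel]
    print(f"beat {beat}: {instrument} plays note {note} at velocity {velocity}")
```

Because the notes live as data rather than rendered audio, the engine is free to re-voice, mute, or recombine them at runtime, which is exactly what makes MIDI attractive for interactive scores.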

Becky Allen's music system for Plants vs. Zombies: Heroes was a hybrid system that incorporated both MIDI data files and pre-recorded audio files, either of which could be triggered by the system. While some of the combat and cinematic music benefited from live recording sessions on an orchestral sound stage, much of the music relied on MIDI data and a custom sound bank. "I’d always thought it would be a hybrid score, and it did indeed develop into being a hybrid score," Allen observes, "and there are some challenges that come up along the way about that."


Becky Allen's thoughts on MIDI highlight the inherent usefulness of the system. MIDI has a long and storied history in the field of video game music, and most video game composers have engaged in at least one MIDI project. To expand on this, I thought I'd share some personal observations about my first MIDI project: the Shrek the Third video game for the Nintendo DS, published by Activision. Because a MIDI score exists as data files that are called up by the game's audio engine, it offers lots of options for audio teams looking to implement music interactively. For Shrek the Third, each musical composition included three alternative melody lines assigned to the three main characters in the game. The music system triggered the appropriate melody to correspond with whatever game character was currently being controlled by the player. Here's a video showing how that worked in the game. I've indicated on-screen when each new melody is triggered:

 
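Under the hood, per-character melody switching like this can be simple: each composition carries parallel melody tracks, and the engine unmutes only the line belonging to the character under the player's control. Here's a minimal sketch (the logic is illustrative; it isn't taken from the actual Shrek the Third codebase):

```python
# A minimal sketch of per-character melody switching: three parallel
# melody tracks, only one audible at a time. Illustrative only.

melody_tracks = {"shrek": False, "donkey": False, "puss": False}  # audible?

def set_active_character(character, tracks):
    """Unmute the melody for the controlled character; mute the rest."""
    for name in tracks:
        tracks[name] = (name == character)
    return tracks

# The player switches control to Donkey mid-level:
print(set_active_character("donkey", melody_tracks))
```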

Finally, we come to the most complex and difficult music system of all. "What is procedural audio?" asks audio director Paul Weir (rhetorically) during his GDC talk. "What does that mean?... No one seems clear – what the hell is it?" If anyone can answer that question, it would be Paul Weir himself, who devised and deployed a procedural system for the audio and music of No Man's Sky, a space-exploration game featuring over 18 quintillion planets for players to discover. During his presentation, Weir points out how exceedingly difficult procedural audio can be to explain, and how difficult it can be to fully comprehend. As we've discussed previously in these articles, procedural music generation (a.k.a. generative music) requires a composer to provide a set of general musical rules and a core batch of music content. The system then extrapolates new content from this source material by virtue of mathematical calculations based on algorithms.

In his talk, Weir defines procedural generation as "any kind of algorithmically created content. So, any computer generated content based on some logic system." Going further, Weir points out that procedural generation is useful in many disciplines apart from music. According to Weir, the technique applies very well to the visual fields, and it's "very common to use it for things like textures, for things like landscape generation." This made the system ideal for a game like No Man's Sky, with its 18 quintillion planets. Weir's procedural music system, which he dubbed Pulse, operates by pulling musical fragments from a large library of assorted snippets, assembling them to create appropriate atmospheres that correspond with what the player is doing and where the player is. Like Leonard Paul in his work on Vessel, Paul Weir also collaborated with electronic musicians, who created an album of music that he could then dissect and implement interactively within the procedural system. "We built the system after they did the album," Weir shares, "but we had in mind how we were going to work."
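At its simplest, fragment-based generative assembly might look something like the sketch below, where snippets tagged by mood are pulled from a library and sequenced to suit the player's situation. Pulse's real logic is of course far richer; every tag and file name here is invented:

```python
import random

# A minimal sketch of fragment-based generative assembly: tagged snippets
# are drawn from a library to build an atmosphere bar by bar. Illustrative
# only; this is not Pulse's actual design.

library = {
    "calm":    ["pad_a.ogg", "pad_b.ogg", "bell_loop.ogg"],
    "curious": ["arp_a.ogg", "pluck_b.ogg"],
    "danger":  ["drone_low.ogg", "perc_hits.ogg"],
}

def assemble(mood, bars, rng):
    """Pick one fragment per bar from the pool matching the current mood."""
    pool = library[mood]
    return [rng.choice(pool) for _ in range(bars)]

rng = random.Random(42)  # seeded so the sketch's output is reproducible
print(assemble("curious", 4, rng))  # e.g. four bars of exploration music
```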


So now we've looked at some interactive music systems that these five audio professionals used in their projects. In our next article we'll contemplate some simple but important questions: why were these systems used? What was attractive about each interactive music strategy, and what were the challenges inherent in using these systems? That's our topic of discussion for the next article! In the meantime, please feel free to leave any questions or comments below!


Winifred Phillips is an award-winning video game music composer whose most recent projects include the triple-A first person shooter Homefront: The Revolution and the virtual reality game Dragon Front for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER'S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games.

Follow her on Twitter @winphillips.
