The third of a four-part series. Video game composer Winifred Phillips shares ideas from her GDC 2018 talk, Music in Virtual Reality. Part 3: Diegetic versus Non-Diegetic, with a discussion of composition and recording methods to make music fit into a VR world.
By Winifred Phillips
So happy you've joined us! I'm video game composer Winifred Phillips. Welcome back to our four-part discussion of the role that music plays in Virtual Reality video games! These articles are based on the presentation I gave at this year's gathering of the famous Game Developers Conference in San Francisco. My talk was entitled Music in Virtual Reality (I've included the official description of my talk at the end of this article). If you haven't read the previous two articles, you'll find them here:
During my GDC presentation, I focused on three important questions for VR video game composers:
Do we compose our music in 3D or 2D?
Do we structure our music to be Diegetic or Non-Diegetic?
Do we focus our music on enhancing player Comfort or Performance?
While attempting to answer these questions during my GDC talk, I discussed my work on four of my own VR game projects – the Bebylon: Battle Royale arena combat game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike shooter/RPG from Labrodex Inc.
In these articles, I've been sharing the discussions and conclusions that formed the basis of my GDC talk, including numerous examples from these four VR game projects. So now let's look at the second of our three questions: should we structure our music to be diegetic or non-diegetic?
Before we launch into this discussion, let's revisit one of the examples from the previous article. You'll remember that we took a look at the Main Theme music I composed for the popular Dragon Front VR strategy game, in order to examine how music can best transition from a traditionally 2D stereo delivery to a 3D positional implementation. So in this case, the big victorious anthem that I composed for Dragon Front makes its first appearance as a bombastic stereo mix directly piped into the player's headphones, and then transitions smoothly to a spatially positioned environmental sound issuing from a small in-game radio. Just as a reminder, let's take another look at that:
In this example, we see how the Dragon Front theme music starts as traditional underscore (that is, a non-diegetic score), but then moves into the VR space and becomes a diegetic score – one that is understood to be present in the game world. And that brings us to the second of the three core debates at the heart of music in VR: should music in VR be diegetic or non-diegetic?
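Mechanically, a transition like this is a synchronized crossfade between two renders of the same cue: the head-locked stereo mix and a positional render of the same material. Dragon Front's actual implementation used the game's own audio tools, so what follows is only a minimal sketch of the idea in Python with NumPy; every function name here is hypothetical, and a real VR title would use HRTF-based binaural rendering where this sketch uses simple panning.

```python
import numpy as np

def spatialize(mono, azimuth, distance, min_dist=1.0):
    """Crude positional render: constant-power panning by azimuth
    (radians, 0 = straight ahead, +pi/2 = hard right) combined with
    inverse-distance attenuation. A stand-in for HRTF rendering."""
    gain = min_dist / max(distance, min_dist)
    pan = 0.5 * (1.0 + np.sin(azimuth))  # 0 = full left, 1 = full right
    left = mono * np.cos(pan * np.pi / 2.0) * gain
    right = mono * np.sin(pan * np.pi / 2.0) * gain
    return np.stack([left, right], axis=1)

def stereo_to_diegetic(stereo_bed, positional, fade_len):
    """Equal-power crossfade from the head-locked stereo bed (the
    non-diegetic mix) into a positional render of the same cue.
    Both inputs are float arrays of shape (num_samples, 2)."""
    t = np.linspace(0.0, 1.0, fade_len)[:, None]
    out = positional.copy()
    out[:fade_len] = (stereo_bed[:fade_len] * np.cos(t * np.pi / 2.0) +
                      positional[:fade_len] * np.sin(t * np.pi / 2.0))
    return out
```

Because both renders come from the same source material and stay sample-aligned, the listener hears one continuous piece of music that simply seems to move out of their head and into the world.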
It’s a thorny issue. As we know, musical underscore is absolutely vital in gaming – it creates momentum, motivates players and adds emotional texture to the story and the characters. However, in VR, the idea of presence becomes paramount. We want players to feel like they are inside the fiction of an awesome VR world. So, when the non-diegetic music starts playing, we worry that players might stop and wonder, ‘where’s this music coming from? Why am I hearing it?'
The obvious solution is to make all of the music in the game diegetic – somehow, in this VR world, all music comes from in-game sources that players can see in the environment around them. Here’s an example from one of my VR projects – Bebylon: Battle Royale, from developers Kite & Lightning.
Bebylon is a great example of a completely diegetic score in VR. The whole premise hinges on immortal babies battling it out in over-the-top arena fights in a futuristic setting. Music during gameplay is represented by a group of in-game baby musicians, so the music originates from that source, and we’re able to see this happening in the VR world. So, let's take a look at that:
Bebylon: Battle Royale proves that it's possible to get away with a completely diegetic score, but we'd need really specific circumstances to justify it. Most games won't be able to make this approach work. So, what then? I've found that there are three strategies to ease non-diegetic music into VR:
Keep it subtle and gradual,
Keep it dry and warm, and
Keep it both inside and outside the VR world.
So let's start with the first strategy – subtle and gradual.
We've already discussed this technique in the first article in this series, when we took a look at the ambient music for Scraper, a first-person VR shooter set inside colossal skyscrapers in a futuristic city. Exploring the massive buildings in the Scraper fictional universe requires a musical soundtrack to set the tone, but introducing it so that it feels natural in VR is a challenge.
In order to address this problem, I composed the ambient music in Scraper so that it would come and go in subtle, gradual ways. As a technique for music implementation in VR, this can be an effective approach. Let's take another look at what that was like in Scraper:
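The actual fades in Scraper were authored in the game's music system, but the "subtle and gradual" idea itself is easy to show. Here's a hypothetical sketch in the same Python/NumPy style as before, giving an ambient cue a long linear attack and release so it drifts in and out instead of starting and stopping abruptly:

```python
import numpy as np

def gradual_cue(music, sample_rate, attack_s=8.0, release_s=8.0):
    """Ease an ambient music cue in and out over several seconds so
    it slips into the VR scene rather than announcing itself.
    `music` is a float array, mono (n,) or stereo (n, 2)."""
    n = music.shape[0]
    attack = min(int(attack_s * sample_rate), n // 2)
    release = min(int(release_s * sample_rate), n // 2)
    envelope = np.ones(n)
    envelope[:attack] = np.linspace(0.0, 1.0, attack)
    envelope[n - release:] = np.linspace(1.0, 0.0, release)
    if music.ndim == 2:
        envelope = envelope[:, None]  # broadcast across both channels
    return music * envelope
```

The attack and release times here are placeholders; the point is simply that entrances and exits measured in seconds, not milliseconds, keep the music from calling attention to itself.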
While this technique worked well for the ambient music, it wasn't an option for combat. Battles in Scraper are pretty intense – the music begins with a bang and keeps on whaling away until the room is cleared of enemies. At the beginning of the project, we'd decided on a stereo music mix rather than spatialization – considering how important audio cues are to expert first-person-shooter players, we didn't want a spatialized score to introduce any confusion. My job at that point was to figure out a way to separate the stereo music mix from the VR world so that the player wouldn't wonder where the music was coming from.
From here, I started thinking about proximity effect – it’s a term relating to microphone recording. You’ll notice proximity effect when someone speaks into a mike while leaning very close to it. The voice starts sounding really bassy and warm in tone, and the mike picks up a lot of the dry source signal, with less of the room acoustics coming through. When you listen with headphones to a recording with lots of proximity effect, it tends to feel like it’s inside your head. I thought – great! If the music is in our heads, we’re not going to be looking around, wondering where it’s coming from.
I recorded the music for Scraper with fairly dry acoustics, and when I mixed the music, I focused on keeping the tone warm and bassy, with a solid low end and some rich mids in the EQ spectrum. Here’s an example of how that worked in combat sequences of the Scraper VR game:
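The actual session settings from Scraper aren't something I can reproduce here, but the "warm and bassy" half of this approach comes down to familiar EQ moves. As an illustration only, here's a standard low-shelf biquad filter (coefficients from the widely used Audio EQ Cookbook) that boosts the low end of a dry music stem; the corner frequency and gain are placeholder values, not the ones used in the game:

```python
import numpy as np
from scipy.signal import lfilter

def warm_low_shelf(x, sample_rate, freq=250.0, gain_db=4.0):
    """Low-shelf biquad (RBJ Audio EQ Cookbook, shelf slope S = 1):
    boosts content below `freq` by roughly `gain_db` dB, adding the
    bassy warmth associated with proximity effect."""
    A = 10.0 ** (gain_db / 40.0)
    w0 = 2.0 * np.pi * freq / sample_rate
    cw, sw = np.cos(w0), np.sin(w0)
    alpha = sw / 2.0 * np.sqrt(2.0)  # sqrt((A + 1/A)(1/S - 1) + 2) with S = 1
    sqA = np.sqrt(A)
    b = np.array([A * ((A + 1) - (A - 1) * cw + 2 * sqA * alpha),
                  2 * A * ((A - 1) - (A + 1) * cw),
                  A * ((A + 1) - (A - 1) * cw - 2 * sqA * alpha)])
    a = np.array([(A + 1) + (A - 1) * cw + 2 * sqA * alpha,
                  -2 * ((A - 1) + (A + 1) * cw),
                  (A + 1) + (A - 1) * cw - 2 * sqA * alpha])
    # Normalize by a0 and filter along the time axis (mono or stereo).
    return lfilter(b / a[0], a / a[0], x, axis=0)
```

Keeping the reverb send low (the "dry" half) does the rest: with little room sound in the signal, headphone listeners localize the music inside their heads rather than out in the environment.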
I also recorded the music of Fail Factory with dry acoustics and a warm, bassy mix – this effect is especially noticeable during the Fail Factory tutorial.
In the Fail Factory tutorial, the instructor zips around on a hovercraft while offering tips and guidelines. In those circumstances, having the music in a dry, warm mix allows it to feel closer to the player, and more separated from the instructor's spatialized voice. Let's check that out:
So now let’s look at another approach, which I’ve called ‘Inside and Outside.’ If music is 3D – if it’s spatialized – we’re more likely to think it actually exists inside the fictional world. If music is 2D – if it’s a direct stereo mix – we’ll be more likely to accept it as non-diegetic, as outside the experience.
Remember the example I showed earlier from Dragon Front – when the main theme music of the game transitioned into a spatialized music source coming from inside the VR space? This is an example of music making the jump from non-diegetic to diegetic, and that can help the player accept the presence of music as a part of the VR game. Watch how players can look around in the Dragon Front hub area, locate the source of the music, and actually turn it off if they want to:
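On the game side, a diegetic source like that radio is typically just an emitter whose gain and direction get recomputed from the listener's pose every frame, plus a switch the player can interact with. Dragon Front's real implementation is its own; the class below is a hypothetical Python sketch of the pattern, with all names invented for illustration:

```python
import numpy as np

class DiegeticMusicSource:
    """Hypothetical in-world music emitter in the spirit of the
    Dragon Front radio: gain falls off with distance, direction
    follows the listener, and the player can switch it off."""

    def __init__(self, position, max_dist=20.0):
        self.position = np.asarray(position, dtype=float)
        self.max_dist = max_dist
        self.enabled = True

    def toggle(self):
        """Player interaction: turn the radio on or off."""
        self.enabled = not self.enabled

    def params_for(self, listener_pos, listener_fwd):
        """Per-frame (gain, azimuth) values to hand to a spatializer."""
        if not self.enabled:
            return 0.0, 0.0
        offset = self.position - np.asarray(listener_pos, dtype=float)
        dist = float(np.linalg.norm(offset))
        gain = max(0.0, 1.0 - dist / self.max_dist)  # linear falloff
        fwd = np.asarray(listener_fwd, dtype=float)
        # Signed angle between the listener's forward vector and the
        # source in the horizontal (x, z) plane; the sign convention
        # depends on the engine's coordinate handedness.
        azimuth = np.arctan2(fwd[0] * offset[2] - fwd[2] * offset[0],
                             fwd[0] * offset[0] + fwd[2] * offset[2])
        return gain, azimuth
```

Feeding those per-frame values into a panner like the spatialize() sketch from earlier in this article reproduces the behavior in the clip: walk away and the theme fades with distance, flip the switch and it stops entirely.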
So we've now discussed the second of the three important questions for video game composers creating music for VR games:
Do we compose our music in 3D or 2D?
Do we structure our music to be Diegetic or Non-Diegetic?
Do we focus our music on enhancing player Comfort or Performance?
We've contemplated what role our music should play in the VR experience: whether it should be considered a part of the fictional world or an outside commentary that shapes the player's emotional experience. Both roles are valid, but the choice between them is especially meaningful within the context of VR. The next article will focus on the third of the three questions: whether music in VR should enhance player comfort or player performance. Thanks for reading, and please feel free to leave your comments in the space below!
Here's the official description of my GDC talk, Music in Virtual Reality:

This lecture presented ideas for creating a musical score that complements an immersive VR experience. Composer Winifred Phillips shared tips from several of her VR projects. Beginning with a historical overview of positional audio technologies, Phillips addressed several important problems facing composers in VR. Topics included 3D versus 2D music implementation, and the role of spatialized audio in a musical score for VR. The use of diegetic and non-diegetic music was explored, including methods that blur the distinction between the two categories. The discussion also included an examination of the VIMS phenomenon (Visually Induced Motion Sickness) and the role of music in alleviating its symptoms. Phillips' talk offered techniques for composers and audio directors looking to utilize music in the most advantageous way within a VR project.

Takeaway: Through examples from several VR games, Phillips provided an analysis of music composition strategies that help music integrate successfully in a VR environment. The talk included concrete examples and practical advice that audience members can apply to their own games.

Intended Audience: This session provided composers and audio directors with strategies for designing music for VR. It included an overview of the history of positional sound and the VIMS problem (useful knowledge for designers). The talk was intended to be approachable for all levels, though advanced composers may better appreciate the specific composition techniques discussed.
Winifred Phillips is an award-winning video game music composer whose most recent projects are the triple-A first-person shooter Homefront: The Revolution and the Dragon Front VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: Assassin's Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER'S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games.
Follow her on Twitter @winphillips.