Strategies for implementing music in VR projects were offered during the Virtual Reality Developers Conference this year. Video game composer Winifred Phillips discusses tools and tips from these talks, while also sharing insights from her own projects.
By video game composer Winifred Phillips
The Game Developers Conference is always an awesome opportunity for game audio experts to learn and share experiences. I've given presentations at GDC for a few years now, and I'm always excited to hear about what's new and notable in game audio. This year, the hot topic was virtual reality. In fact, the subject received its own dedicated sub-conference that took place concurrently with the main GDC show. The VRDC (Virtual Reality Developers Conference) didn't focus particularly on the audio and music side of VR, but there were a couple of notable talks on that subject. In this article, let's take a look at some of the more intriguing VR game music takeaways from those two talks. Along the way, I'll also share some of my related experience as the composer of the music of the Dragon Front VR game for the Oculus Rift.
The talks we'll be discussing in this article are entitled "Audio Adventures in VR Worlds" and "The Sound Design of Star Wars: Battlefront VR." Here's a common issue that popped up in both talks:
Where should video game music be in a VR game? Should it feel like it exists inside the VR world, weaving itself into the immersive 3D atmosphere surrounding the player? Or should it feel like it's somehow outside of the VR environment, coasting on top of the experience and conveyed directly to the player? The former approach suggests a spacious and expansive musical soundscape; the latter would feel much closer and more personal. Is one of these approaches more effective in VR than the other?
These two concepts share a lot in common with the traditional categories of diegetic and non-diegetic music in entertainment media. Diegetic music exists inside the fictional world, perceived by the characters within it, whereas non-diegetic music is inaudible to the characters and only exists for the benefit of the audience. VR presents an interesting twist to this usually straightforward dichotomy. When the entertainment experience is doing everything in its power to make us forget that we're an audience, to the point where we achieve a sense of complete presence within the fictional world... what role does non-diegetic music play then? If we can now consider ourselves as characters in the story, how do we hear music that story characters aren't supposed to hear?
"VR goes beyond picture sync. It’s about sync of the world," says music producer Joe Thwaites of Sony Interactive Entertainment Europe. In his talk about the music and sound of the game PlayStation VR Worlds, Thwaites explores the relationship between music and the VR environment. "The congruency between audio and visuals is key in maintaining that idea of believability," Thwaites asserts, "which in turn makes immersiveness, and in turn makes presence." In virtual reality development, the term 'presence' denotes the sensation of actually existing inside the virtual environment. According to Thwaites, a strong believable relationship between the aural and visual worlds can contribute to a more satisfying VR experience.
As an example, Thwaites describes an interactive music implementation that he integrated into the 'Ocean Descent' section of PlayStation VR Worlds. In this portion of the game, Thwaites pulled the otherwise non-diegetic musical score more fully into the immersive world by creating an illusion that the in-game objects were reacting to the musical notes. "There’s a part called The Jellyfish Cave, where you descend into this sea of jellyfish," Thwaites describes. "You get this 2D music," he adds, "which bypasses the 3D audio plugin, so it goes straight to your ears." In other words, the music is recorded in a traditional stereo mix and the output is fed directly to the player's headphones without bothering with any spatial positioning in the virtual world. "Then, as you look around, these jellyfish light up as you look directly at them," Thwaites goes on, "and they emit a tone in 3D in space so the music tone stays where it is in the world." So, these tones have been attached to specific jellyfish in the virtual world, spatially positioned to emanate from those locations, as if special portions of the non-diegetic score have suddenly leapt into the VR world and taken up residence there. "And that has this really nice effect of creating this really immersive and magical moment which is really unique to VR," Thwaites remarks.
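To make the jellyfish technique concrete, here's a minimal sketch of the gaze-trigger logic in Python. The original ran on Sony's own engine and 3D audio plugin, so everything here (the Jellyfish class, the AudioStub interface, the 10-degree gaze cone) is an illustrative assumption, not the shipped code: the stereo bed keeps playing straight to the headphones elsewhere, while a per-frame check spawns a world-anchored tone on any jellyfish the player looks at directly.

```python
import math
from dataclasses import dataclass

GAZE_CONE_DEGREES = 10.0  # how directly the player must look at a jellyfish

@dataclass
class Jellyfish:
    position: tuple           # world-space (x, y, z)
    tone_clip: str            # the musical tone pinned to this jellyfish
    triggered: bool = False   # each one lights up and sounds only once

def angle_between(v1, v2):
    """Angle in degrees between two 3D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    mags = math.sqrt(sum(a * a for a in v1)) * math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mags))))

class AudioStub:
    """Stand-in for an engine audio interface."""
    def play_3d_at(self, clip, position):
        print(f"3D tone '{clip}' anchored at {position}")

def update_gaze_tones(head_pos, gaze_dir, swarm, audio):
    """Per-frame check: gazed-at jellyfish each add one spatialized tone
    that stays fixed at their world position as the player looks around."""
    for jelly in swarm:
        if jelly.triggered:
            continue
        to_jelly = tuple(j - h for j, h in zip(jelly.position, head_pos))
        if angle_between(gaze_dir, to_jelly) < GAZE_CONE_DEGREES:
            audio.play_3d_at(jelly.tone_clip, jelly.position)
            jelly.triggered = True

# One toy frame: the player looks straight ahead at a jellyfish.
swarm = [Jellyfish(position=(0.0, 0.0, 5.0), tone_clip="tone_c5")]
update_gaze_tones((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), swarm, AudioStub())
```

The key design point is that the tones are positioned in the world, not in the mix: once triggered, each tone stays put as the player's head turns, which is what makes the score feel like it has leapt into the environment.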
So this method served to help non-diegetic music feel more natural within the VR environment. But what happens when pure non-diegetic music is an absolute necessity?
In the game Star Wars Battlefront Rogue One X-Wing VR Mission, the audio team at Criterion Games were tasked with creating an authentic audio experience in a virtual reality environment dedicated to the eternally famous and popular Star Wars franchise. In this case, according to audio lead Jay Steen, pure non-diegetic music was a must. "Non-diegetic means not from a source in the scene. This is how most movies and flatscreen games handle the music. So the music plays through the direct out straight to the player’s ears and we were worried from what we’d heard about non-diegetic music that it would distract from immersion," Steen confesses. "But we actually found the opposite. Maybe that’s because you can’t have a Star Wars story without the music. You don’t feel like you’re in Star Wars until the music kicks in." According to Steen, the non-diegetic music worked in this circumstance because the audio team was careful to avoid repetition in the musical score. "We didn’t reuse or loop cues that much, and due to the linear structure of the mission we could kind of get away with this," Steen points out. "We think that helps to not break immersion."
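Here's a hedged sketch of the play-once approach Steen describes: because the mission is linear, cues can simply be queued in story order and retired after a single playthrough, so nothing ever loops. The cue names and the audio interface below are illustrative assumptions, not taken from the Criterion project.

```python
from collections import deque

class LinearCueSequencer:
    """Plays each music cue exactly once, in mission order, never looping."""

    def __init__(self, cues):
        self.pending = deque(cues)  # cues in the order the mission unfolds

    def on_story_beat(self, audio):
        """Called at scripted mission beats; advances to the next unused cue."""
        if not self.pending:
            return  # out of fresh cues: go quiet rather than repeat one
        audio.play_2d(self.pending.popleft())  # non-diegetic, straight to the ears

class AudioStub:
    def play_2d(self, clip):
        print(f"non-diegetic cue: {clip}")

# Hypothetical beats for a linear mission, each paired with a fresh cue.
sequencer = LinearCueSequencer(["launch", "patrol", "ambush", "finale"])
audio = AudioStub()
for _ in range(5):  # the fifth beat finds no cue left, so it stays silent
    sequencer.on_story_beat(audio)
```

The trade-off Steen hints at is clear in the structure: this only "gets away with it" because a linear mission has a known number of beats; an open-ended game would exhaust its cues and need looping or generative variation.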
Sometimes non-diegetic music can be introduced into a VR game, and then quickly transformed into diegetic music within the immersive environment in order to enhance player presence. In my musical score for the Dragon Front game for Oculus Rift, I composed a dramatic choral track for the opening main theme of the game. During the game's initial logo sequence, the music is channeled directly to the player's ears without any spatial positioning. However, this changes as soon as the player fully enters the initial environment (wherein the player navigates menus and prepares to enter matches). Once the logo sequence has completed, the music makes a quick transition, from a full-bodied direct stereo mix to the player's headphones, to a spatially localized narrow mix located to the player's lower right. Upon turning, players see that the music is now coming from a battered radio, which the player is free to turn on and off. The music is now fully diegetic, existing inside the game's fictional world.
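A minimal sketch of that handoff, under stated assumptions (the fade time, mix names, and audio calls below are mine for illustration, not the actual Dragon Front implementation): the spatialized narrow mix starts silent on an emitter at the radio's position, then the two mixes crossfade so the theme appears to migrate into the world.

```python
import time

class AudioStub:
    """Stand-in for an engine audio interface."""
    def play_3d_at(self, clip, position, volume=1.0):
        print(f"start '{clip}' at {position}, volume {volume}")
    def set_volume(self, clip, volume):
        pass  # a real engine would adjust the playing voice here
    def stop(self, clip):
        print(f"stop '{clip}'")

def handoff_to_radio(audio, radio_pos, duration=2.0, dt=0.05):
    """Crossfade the theme from a direct stereo feed (non-diegetic)
    to a spatialized emitter parked at the in-world radio (diegetic)."""
    audio.play_3d_at("main_theme_narrow", radio_pos, volume=0.0)
    steps = int(duration / dt)
    for i in range(steps + 1):
        t = i / steps                                   # 0.0 -> 1.0
        audio.set_volume("main_theme_stereo", 1.0 - t)  # 2D bed fades out
        audio.set_volume("main_theme_narrow", t)        # 3D radio fades in
        time.sleep(dt)
    audio.stop("main_theme_stereo")

handoff_to_radio(AudioStub(), radio_pos=(0.8, -0.4, 0.3))  # player's lower right
```

For the swap to be inaudible, the stereo and narrow mixes need to be playing in sample-accurate sync before the fade begins; otherwise the player hears a smear rather than a migration.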
While non-diegetic music can be tricky in VR, sometimes it's an important part of the overall aesthetic. Plus, there can be ways to integrate non-diegetic music into the spatial environment. Joe Thwaites of Sony Europe describes an interesting combination of diegetic and non-diegetic music that was integrated into the 'VR Luge' section of the PlayStation VR Worlds game. In this gameplay sequence, players ride feet-first on a luge that's racing downhill amidst heavy vehicle traffic. The experience was designed to be a heart-stopping thrill ride. "So one of the experiments we did around the synchronization of the world was using a combination of diegetic and non-diegetic music to build tension as you zoomed down the hill," Thwaites describes. "We used 3D car radios to introduce elements of percussion into the 2D soundtrack that was playing." In the musical score for this sequence, the non-diegetic music presented a purely percussive rhythm, but as the player passed by other cars, the music would change. "So as you passed a car with a radio playing, an element of that 3D music would transition from the car into the 2D soundtrack." In this way, the in-game radio music would briefly become a part of the game's non-diegetic score, while still conveying spatial positioning inside the 3D world.
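One hedged way to sketch that radio handoff (the radius, stem names, and API below are illustrative assumptions, not the shipped VR Worlds code): the same percussion stem exists twice, once on a 3D emitter at the car and once muted in the 2D soundtrack bus, and the player's proximity to the car drives a crossfade between the two.

```python
import math

HANDOFF_RADIUS = 15.0  # metres: inside this, the stem migrates to the 2D mix

def stem_blend(player_pos, car_pos):
    """0.0 = stem fully on the car's 3D radio, 1.0 = fully in the 2D bus."""
    distance = math.dist(player_pos, car_pos)
    return max(0.0, min(1.0, 1.0 - distance / HANDOFF_RADIUS))

def update_car_radio(audio, stem, player_pos, car_pos):
    """Per-frame: crossfade one percussion stem between the spatialized
    car radio and the non-diegetic soundtrack as the luge passes by."""
    t = stem_blend(player_pos, car_pos)
    audio.set_volume(f"{stem}_3d_car", 1.0 - t)  # emitter at the car
    audio.set_volume(f"{stem}_2d_bus", t)        # same stem, 2D mix

class AudioStub:
    def set_volume(self, voice, volume):
        print(f"{voice}: {volume:.2f}")

# The luge closes from 12 m to 3 m on a car with its radio playing.
for x in (12.0, 7.0, 3.0):
    update_car_radio(AudioStub(), "percussion", (0.0, 0.0, 0.0), (x, 0.0, 0.0))
```

As with the radio handoff in Dragon Front, both copies of the stem have to run locked to the score's tempo, or the percussion would stutter as it crosses from the car into the soundtrack.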
So in these examples from PlayStation VR Worlds and Star Wars Battlefront Rogue One X-Wing VR Mission, we see that audio teams grapple constantly with the contrasting natures of diegetic and non-diegetic music. While it seems as though non-diegetic music has been relegated to a very traditional, non-spatially localized delivery, this may not always be the case. Jay Steen of Criterion Games spent some time considering the possibility of delivering the non-diegetic music of his Star Wars game with a more enveloping spatial texture. "We did do a quick experiment on it, and we found that it’s like having an orchestra sitting around you," Steen says. "We didn’t want to evoke you sitting in the middle of an orchestral recording. We just wanted it to sound like the movie." That being said, Steen doesn't rule out the possibility of a more spatially-interesting mix for music in the future, including the use of ambisonic recordings for non-diegetic musical scores. "Ambisonic recordings of orchestras for example," Steen speculates, "I think there’s something fun there. We haven’t experimented with it any more than that, but yeah, definitely, we’d want to try."
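As a taste of why ambisonics is attractive here: a first-order B-format music bed can be counter-rotated against the listener's head before binaural decoding, so the "orchestra" holds still in the virtual space rather than turning with the player. Below is a hedged sketch of the yaw-only case; channel ordering and sign conventions differ between ambisonic formats, so this assumes traditional B-format (W, X, Y, Z) with yaw about the vertical axis.

```python
import math

def counter_rotate_foa_yaw(w, x, y, z, head_yaw_rad):
    """Rotate one first-order ambisonic sample (traditional B-format)
    opposite to the listener's head yaw, so the music bed stays fixed
    in the world instead of turning with the head. W (omni) and Z
    (vertical) are unaffected by a pure yaw rotation."""
    c = math.cos(-head_yaw_rad)  # counter-rotation against the head
    s = math.sin(-head_yaw_rad)
    x_rot = c * x - s * y        # standard 2D rotation of the
    y_rot = s * x + c * y        # horizontal dipole components
    return w, x_rot, y_rot, z

# A source dead ahead (all energy in X); the player turns 90 degrees left,
# so the rotated field places the source off to the player's right.
print(counter_rotate_foa_yaw(0.7, 1.0, 0.0, 0.0, math.radians(90)))
```

In practice this would run per audio block (and usually vectorized) before the rotated bed is decoded binaurally to the headphones, which is what would let an ambisonic orchestral recording surround the player while still behaving like a stable, non-diegetic score.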
So this concludes our look at two presentations from GDC 2017 that focused on issues that complicate music creation and implementation in virtual reality. I hope you've found this interesting, and please feel free to leave a comment in the space below!
Winifred Phillips is an award-winning video game music composer whose most recent projects are the triple-A first person shooter Homefront: The Revolution and the Dragon Front VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER'S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games.
Follow her on Twitter @winphillips.