Video game composer Winifred Phillips presents extra research & scholarship gathered for her GDC 2018 talk: Music in Virtual Reality. This info (not included in the talk) is useful for composers and audio pros interested in music strategies for VR games.
By Winifred Phillips
Once again, the Game Developers Conference is almost upon us! GDC 2018 promises to be an awesome event, chock-full of great opportunities for us to learn and grow as video game music composers. I always look forward to the comprehensive sessions on offer in the popular GDC audio track, and for the past few years I've been honored to be selected as a GDC speaker. Last year I presented a talk that explored how I built suspense and tension through music I composed for such games as God of War and Homefront: The Revolution. This year, I'm tremendously excited that I'll be presenting the talk "Music in Virtual Reality." The subject matter is very close to my heart! Throughout 2016 and 2017 I composed music for many virtual reality projects, some of which have hit retail over the past year and some of which will be released very soon. I've learned a lot about the process of composing music for a VR experience, and I've given a lot of thought to what makes music for VR unique. During my GDC talk in March, I'll be taking my audience through my experiences composing music for four very different VR games: the Bebylon: Battle Royale arena combat game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike shooter/RPG from Labrodex Inc. I'll talk about some of the top problems that came up, the solutions that were tried, and the lessons that were learned. Virtual reality is a brave new world for game music composers, and there will be a lot of ground for me to cover in my presentation!
In preparing my talk for GDC, I kept my focus squarely on composition techniques for VR music creation, while making sure to supply an overview of the technologies that would help place these techniques in context. With these considerations in mind, I had to prioritize the information I intended to offer, and some interesting topics simply wouldn't fit within the time constraints of my GDC presentation. So I thought it would be worthwhile to share some of these extra materials in a couple of articles preceding my talk in March. In this article, I'll explore some theoretical ideas from experts in the field of VR, and I'll include some of my own musings about creative directions we might pursue with VR music composition. In the next article, I'll talk about some practical considerations relating to the technology of VR music.
So, let's get started!
No discussion of virtual reality is complete without some time spent on the perils of Visually Induced Motion Sickness (a.k.a. VIMS). My upcoming GDC talk will include research on this topic pointing to a specific music approach that can play an important role in alleviating VIMS symptoms.
However, there is more to consider about the general role that music plays in relation to the famous VIMS phenomenon, apart from the technique that I'll be describing in my GDC presentation. So let's take a look at the general relationship between music and VIMS, starting with the most basic question: do sound and music contribute to motion sickness in the first place?
Let's picture ourselves sitting in a movie theater. We're watching a silent film that shows a first-person perspective of a high-speed bicycle ride full of wild twists and turns. It looks stressful, but as we sit and watch the visuals, we're not really all that stressed. Okay, so now let's imagine that the film isn't silent anymore. We can hear the bumps and jogs in the road, the air whooshing by, the aurally chaotic soundscape. It's a bit more exciting to watch, but we're still feeling comfortable in our movie-theater seats. Now, let's imagine that we aren't looking at a flat 2D screen anymore. Now it's a 3D stereoscopic image of that wild bicycle ride. Oncoming traffic leaps off the screen at us. Obstacles seem to whip by our heads as the road before us corkscrews madly. Are we still comfortable? Or could all that dizzying 3D motion be finally getting to us?
In their study to better understand the causes of motion sickness, professors Behrang Keshavarz and Heiko Hecht gathered 69 experimental test subjects and exposed them to the visual presentation I described above. There were two variables: viewing mode (2D or 3D stereoscopic) and sound (on or off). The 2D film didn't cause a problem. Likewise, the presence (or absence) of sound wasn't an issue. But when 3D visuals were introduced, motion sickness became a big problem. The findings of the study support the conclusion that immersive visual stimuli have the potential to negatively impact our sense of balance and equilibrium. However, there's also a secondary conclusion that's equally interesting to us as game audio folks: sound doesn't seem to have anything to do with it. Yes, the 3D bicycle ride with sound was pretty nausea-inducing, but according to the study, a silent 3D bike ride has just as much potential to cause motion sickness.
So what does that mean for audio and music in virtual reality? Does it mean that the aural spectrum simply doesn't matter? Or does it present us with some interesting creative opportunities? Let's explore that idea a bit further.
First, let's dispense with the notion that music and sound design don't matter when it comes to VIMS. In fact, the presence of music has a powerful influence on the VIMS state, but that influence is therapeutic rather than harmful.
GDC 2018 Presentation Preview: In my upcoming GDC talk, I'll be exploring the specific type of music that exerts the most beneficial effects when it comes to Visually Induced Motion Sickness. Drawing on both my own experiences with multiple VR projects and the results of relevant scientific studies, I'll be showing how video game composers can best alleviate the effects of VIMS through their musical compositions, and under which circumstances those compositions should be deployed.
While there's a certain musical strategy that has the most beneficial effect (which I'll define in my GDC talk), the mere presence of music is a proven therapeutic agent that has been shown to diminish nausea symptoms. In a study conducted by the Arthur G. James Cancer Hospital and Research Institute at The Ohio State University, researchers found that the use of music during high-dose chemotherapy sessions led to a significant reduction in symptoms of nausea. Music acts both as a diversion and a targeted therapeutic agent, shifting the listener's attention away from physical discomfort while at the same time acting to reduce the symptoms.
In my talk I'll be exploring how we can best employ an effective music strategy within the constructs of virtual reality in order to cushion VR players and make them more comfortable in the immersive environment. There is, however, an additional dimension to the relationship between music, audio and VR exploration, which I didn't have time to include in my upcoming GDC talk. I'd like to share my thoughts on that here.
What makes virtual reality so real? It can't be just the encompassing imagery, because then we wouldn't need VR; we could just go to a 3D movie. No, in order for VR to engage us, it has to make us feel as though we are personally present in the virtual world. This phenomenon can alternatively be called telepresence or virtual presence, but the end result is the same. We feel as though we're physically occupying the same world as the imaginary visuals we're encountering. How does the game make us feel this sense of presence?
According to MIT Professor Thomas B. Sheridan, the sensation of presence depends on the operation of three important factors: a "sufficiently high-fidelity display, a mental attitude of willing acceptance, and a modicum of motor participation." In other words, we need to find the visuals to be sufficiently convincing, we have to be willing to be convinced that they're real, and we must be able to move about freely and interact with the environment. Unfortunately, it's that third factor that causes the VIMS problem. Moving around in VR opens us up to motion sickness. How is this problem typically addressed?
According to Steve Bowler, cofounder of the VR game company CloudGate Studio, the community of VR game developers has "zero tolerance for user motion sickness." In an interview with ScienceNews.org, Bowler describes the way in which developers typically solve the problem. Using an in-game navigation system that relies on a type of teleportation, developers allow us to wander their VR worlds. We point our controllers where we want to be, we hit the teleport button and zip! We're there in a flash. It's highly effective in avoiding the perils of VIMS. However, it also sharply curtails our sensation of being able to "move about freely and interact with the environment."
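To make that approach concrete, here's a minimal, engine-agnostic sketch of teleport locomotion in plain TypeScript. Everything in it (the Vec3 type, the PlayerRig class, the fade helper) is a hypothetical illustration rather than any particular engine's API; the underlying pattern is simply to raycast from the controller to a valid floor point, briefly fade the view, and then move the player instantaneously rather than sliding the camera through space:

```typescript
// Minimal teleport-locomotion sketch in plain TypeScript.
// All names here (Vec3, PlayerRig, etc.) are hypothetical illustrations,
// not any engine's real API. The anti-VIMS idea: never slide the camera
// through space; fade out, snap the rig to the target, fade back in.

interface Vec3 { x: number; y: number; z: number; }

// Intersect the aim ray with the floor plane (y = 0); null if aiming upward.
function raycastToFloor(origin: Vec3, dir: Vec3): Vec3 | null {
  if (dir.y >= 0) return null;                 // ray never reaches the floor
  const t = -origin.y / dir.y;
  return { x: origin.x + t * dir.x, y: 0, z: origin.z + t * dir.z };
}

class PlayerRig {
  position: Vec3 = { x: 0, y: 0, z: 0 };

  // Stand-in for a real screen fade; here it just waits out the duration.
  async fade(toOpacity: number, ms: number): Promise<void> {
    await new Promise((resolve) => setTimeout(resolve, ms));
  }
}

async function teleport(rig: PlayerRig, controllerOrigin: Vec3, aimDir: Vec3) {
  const target = raycastToFloor(controllerOrigin, aimDir);
  if (!target) return;                         // no valid destination

  await rig.fade(0, 100);                      // quick fade to black
  rig.position = target;                       // instantaneous move: no optic
                                               // flow, so no motion sickness
  await rig.fade(1, 100);                      // fade back in at the new spot
}

// Usage: aim at the floor from head height and "hit the teleport button."
const rig = new PlayerRig();
teleport(rig, { x: 0, y: 1.5, z: 0 }, { x: 0, y: -0.5, z: -1 })
  .then(() => console.log("Arrived at", rig.position));
```

The instantaneous snap is the whole point: because the view never moves continuously through space, the player's visual system receives none of the self-motion cues that trigger VIMS.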
So, if developers are forced to limit the personal agency of players in wandering around the environment, is it possible for game audio folks to compensate by making it seem as though the environment is wandering around us? This is a thought I've been considering lately, as I contemplate the movement limitations we experience in VR environments. After all, before we had visual virtual reality, we had a kind of audio VR in the form of audio-only games like Papa Sangre. In games like Papa Sangre, the environment presents a busy soundscape that invisibly drifts around us. If we close our eyes, we're suddenly fully enveloped in the world that the game developers have created. Merely turning around becomes a radically dramatic act of personal agency as the sonic universe reacts to our movement. I've included a non-interactive video clip below that demonstrates some gameplay from Papa Sangre. In this clip, you can watch a gamer interacting with the game's interface. Notice the somewhat exaggerated nature of the sound design as the player is instructed how to play the game:
The effect of this audio-only universe can be very immersive, and its power depends on the sensation of an active soundscape that surrounds, enfolds and interacts with the player in ways that exaggerate and heighten reality. Could these techniques make VR players feel more of a sense of presence in virtual reality, even if their physical mobility is limited? Should we be thinking about opportunities to present a soundscape with moving components and an exaggerated sonic palette?
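As a concrete illustration of that kind of reactive soundscape (a sketch of the general technique, not of how Papa Sangre itself was built), here's a small TypeScript example using the browser's standard Web Audio API. Sound sources stay anchored in world space while the listener's orientation updates as the player turns, so the mere act of turning sweeps the entire soundscape around the player's head; the asset path and values are my own placeholder assumptions:

```typescript
// Sketch: a world-anchored soundscape that reacts when the player turns.
// Uses the standard Web Audio API; "footsteps.ogg" is a placeholder asset.

const ctx = new AudioContext();

// Place one looping sound source at a fixed spot in world space.
async function addWorldSound(url: string, x: number, y: number, z: number) {
  const buffer = await fetch(url)
    .then((r) => r.arrayBuffer())
    .then((data) => ctx.decodeAudioData(data));

  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.loop = true;

  const panner = ctx.createPanner();
  panner.panningModel = "HRTF";       // binaural spatialization on headphones
  panner.positionX.value = x;
  panner.positionY.value = y;
  panner.positionZ.value = z;

  source.connect(panner).connect(ctx.destination);
  source.start();
}

// Call this every frame with the player's head yaw (radians). The sources
// stay fixed in the world; only the listener's orientation changes, so
// merely turning your head sweeps the whole soundscape around you.
function updateListenerYaw(yaw: number) {
  const l = ctx.listener;
  l.forwardX.value = Math.sin(yaw);
  l.forwardY.value = 0;
  l.forwardZ.value = -Math.cos(yaw);
  l.upX.value = 0;
  l.upY.value = 1;
  l.upZ.value = 0;
}

addWorldSound("footsteps.ogg", -2, 0, -3);  // 3 m ahead and to the left
updateListenerYaw(0);
```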
On the other hand, if we decide to exaggerate our audio world, would this disrupt the impression of realism that virtual reality attempts to convey? Here's where we come to another interesting concept that I wasn't able to include in my GDC talk, but that has some bearing on the train of thought we're currently pursuing.
Back in 2015, I wrote an article for Gamasutra about the "Uncanny Valley," a concept that has long been a fixture in the visual arts but has only recently been discussed in connection with audio. When applied to the visual world, the "Uncanny Valley" pertains to representations of living things (most often humans) that are impressively close to the real thing but that subtly miss the mark. This imperfection leads to a deeply unsettling impression of wrongness. In my Gamasutra article, I discussed how audio in the world of VR may be in danger of dipping into the "Uncanny Aural Valley," in which sound gets impressively close to perfect realism but misses the mark. Within the context of virtual reality, this subtle imperfection has the potential to impact players in a far more pronounced way than it would in a traditional video game. That barely-perceptible sonic wrongness can (theoretically) pull virtual reality players out of the immersive experience.
So, is VR audio in danger of dipping into the "Uncanny Aural Valley" anytime soon? Not according to Sean Earley, the Executive Editor of AR/VR Magazine. "The visual uncanny valley will still be around for a while," observes Earley. "Unless you are a super audiophile, however, digital audio has progressed to the point where a good engineer can make a recording that is very hard to distinguish from reality. Spatial audio, when mixed with simple VR can add a totally new level of realism to an experience."
That takes us back to our previous train of thought: if perfect sonic realism is achievable in VR, is it desirable? Or would we rather aim for a form of hyperrealism that emphasizes aural motion and more fully envelops the player? Do we want to dip into that Uncanny Aural Valley?
"Pure, super smooth and natural spatialized sound may be not immersive enough to get the sort of user experience/effect needed for VR," writes Gabor Szanto, the creator of the Superpowered audio software development kit for mobile. "You don’t want the most natural chirping bird sound, you actually want the cleanest and most 3D-like bird sound. You want to amaze the listener."
So, in a virtual reality environment in which we're forced to limit the physical mobility of players, could a hyperreal, supremely encompassing aural environment compensate for the loss of presence that players may feel when they can't move around exactly as they please? I think it's an interesting idea to ponder, and one to which we should give some consideration as VR audio moves forward and becomes more ambitious. Also, as video game composers, we might want to consider how our music mixes can more fully surround players. For more surreal, synthetic or ambient-driven musical scores, we might even introduce spatial motion into our musical mixes, letting sounds float around players to convey an even greater level of sonic immersion.
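For instance, a single ambient music stem could be set slowly orbiting the listener. Here's a minimal Web Audio sketch of that idea; the stem file name, orbit radius, and orbit speed are all illustrative assumptions, and in practice this treatment would suit a synthetic or ambient layer rather than a traditionally mixed score:

```typescript
// Sketch: let a musical stem float in a slow circle around the listener.
// Standard Web Audio API; "pad-stem.ogg", radius and speed are illustrative.

const audioCtx = new AudioContext();

async function orbitMusicStem(url: string, radius = 2, secondsPerOrbit = 20) {
  const buffer = await fetch(url)
    .then((r) => r.arrayBuffer())
    .then((data) => audioCtx.decodeAudioData(data));

  const stem = audioCtx.createBufferSource();
  stem.buffer = buffer;
  stem.loop = true;

  const panner = audioCtx.createPanner();
  panner.panningModel = "HRTF";           // binaural placement on headphones
  stem.connect(panner).connect(audioCtx.destination);
  stem.start();

  // Move the source in a circle on the horizontal plane around the head.
  function step() {
    const angle = (audioCtx.currentTime / secondsPerOrbit) * 2 * Math.PI;
    panner.positionX.value = radius * Math.cos(angle);
    panner.positionY.value = 0;
    panner.positionZ.value = radius * Math.sin(angle);
    requestAnimationFrame(step);
  }
  step();
}

orbitMusicStem("pad-stem.ogg");           // one of several mix layers that
                                          // could each drift independently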
In my next article, I'll be discussing some more down-to-earth technical issues that pertain to music and audio in VR. While my GDC presentation on Music in Virtual Reality will include several important technical issues and topics, there simply wasn't enough time to cover everything that might be of interest. With that in mind, I'll be happy to explore some of these technical concerns in my next article! I've included the official GDC description of my upcoming talk below. Please feel free to share your thoughts and insights in the comments section!
This lecture will present ideas for creating a musical score that complements an immersive VR experience. Composer Winifred Phillips will share tips from several of her VR projects. Beginning with a historical overview of positional audio technologies, Phillips will address several important problems facing composers in VR. Topics will include 3D versus 2D music implementation, and the role of spatialized audio in a musical score for VR. The use of diegetic and non-diegetic music will be explored, including methods that blur the distinction between the two categories. The discussion will also include an examination of the VIMS phenomenon (Visually Induced Motion Sickness), and the role of music in alleviating its symptoms. Phillips' talk will offer techniques for composers and audio directors looking to utilize music in the most advantageous way within a VR project.

Takeaway: Through examples from several VR games, Phillips will provide an analysis of music composition strategies that help music integrate successfully in a VR environment. The talk will include concrete examples and practical advice that audience members can apply to their own games.

Intended Audience: This session will provide composers and audio directors with strategies for designing music for VR. It will include an overview of the history of positional sound and the VIMS problem (useful knowledge for designers). The talk will be approachable for all levels (advanced composers may better appreciate the specific composition techniques discussed).
Winifred Phillips is an award-winning video game music composer whose recent projects include the triple-A first person shooter Homefront: The Revolution and the Dragon Front VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER'S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games.
Follow her on Twitter @winphillips.