
VR for the Game Music Composer – Artistry and Workflow

In this article we'll be looking at some resources that discuss issues relating to artistry and workflow in audio for VR, including an overview of 3DCeption, the Oculus Spatializer Plugin, and insights from a postmortem of the VR game Land's End.

Winifred Phillips, Blogger

January 7, 2016


Since the game audio community is abuzz with popular excitement about the impending arrival of virtual reality systems, I’ve been periodically writing blogs that gather together top news about developments in the field of audio and music for VR.  In this article we’ll be looking at some resources that discuss issues relating to artistry and workflow in audio for VR:

  • We’ll explore an interesting post-mortem article about music for the VR game Land’s End.  

  • We’ll be taking a closer look at the 3DCeption Spatial Workstation.

  • We’ll be checking out the Oculus Spatializer Plugin for DAWs.

Designing Sound for Virtual Reality

In these early days of VR, postmortem articles about the highs and lows of development on virtual reality projects are especially welcome.  Freelance audio producer and composer Todd Baker has written a particularly interesting article about the audio development for the Land’s End video game, designed for the Samsung Gear VR system (pictured to the right: I'm trying out the Samsung Gear VR on the show floor of the 2015 Audio Engineering Society convention).

Todd Baker is best known for his audio design work on the whimsical Tearaway games, and his work as a member of the music composition team for the awesome LittleBigPlanet series. His work on Land’s End for Ustwo Games affords him an insightful perspective on audio for virtual reality. “In VR, people are more attuned to what sounds and feels right in the environment, and therefore can be equally distracted by what doesn’t,” writes Baker.  In the effort to avoid distraction, Baker opted for subtlety in regards to the game’s musical score. Each cue began with a gentle fade-in, attracting little notice at first so as to blend with the game’s overall soundscape in a natural way.

Going a step further, Baker enhanced this music/sound design blend by actively blurring the distinction between the two aural elements.  Sound effects were designed with a sense of musicality inherent in them. The score for the entire game was constructed with heavy use of key signatures sharing lots of common tones.  This allowed the “musical” sound effects to blend with the atmospheric score in a pleasing way.  According to Baker, this approach “blurs the line between what the player would recognise as music or sound, and helps them to instead accept that this is how the world sounds.”

Baker’s entire score for Land’s End is available for free download on SoundCloud.  Here’s the trailer video for the Land’s End game:

The 3DCeption Spatial Workstation

3DCeption Spatial Workstation is a new kind of audio production tool designed for VR. The philosophy of this software is workflow simplification: eliminating the need to render audio and shuttle it back and forth from one application to another.  Using 3DCeption, the composer or audio designer can continue sound asset creation in existing DAWs such as Reaper, Nuendo, and the famous and ubiquitous Pro Tools.  The 3DCeption application provides a host of plugins that are compatible with the user’s current DAW.  Taking into account the geometry of the environment, the plugins are able to process the audio to include reflections, occlusions, and realistic spatial positioning that incorporates head tracking. To make head tracking work, the software package includes the Intelligent 360 Video Player, which integrates directly with the virtual reality headset and allows the user to preview the audio mix in the VR environment.

(Pictured above: how the 3DCeption Plugin appears in the Pro Tools DAW)

One of the most interesting and unique aspects of this software is its ability to accommodate the ambisonic method of recording three-dimensional sound. Initially created in the 1970s, the first-order ambisonic recording method uses four channels: one (designated W) captures the sound pressure, while the other three (X, Y and Z) capture the directional components of the sound along the three spatial axes.
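To make the four-channel idea concrete, here's a minimal sketch of the traditional (Furse-Malham-style) first-order encoding equations in Python. This is just an illustration of the math, not code from 3DCeption or any other tool discussed here:

```python
import math

def encode_bformat(sample, azimuth, elevation):
    """Encode one mono sample into first-order B-format (W, X, Y, Z).

    azimuth/elevation are in radians. W carries the omnidirectional
    pressure signal (scaled by 1/sqrt(2) in the traditional Furse-Malham
    convention); X, Y and Z carry the directional components of the
    sound along the three spatial axes.
    """
    w = sample * (1.0 / math.sqrt(2.0))
    x = sample * math.cos(azimuth) * math.cos(elevation)
    y = sample * math.sin(azimuth) * math.cos(elevation)
    z = sample * math.sin(elevation)
    return w, x, y, z

# A source directly ahead of the listener (azimuth 0, elevation 0)
# lands entirely on the W and X channels; Y and Z stay silent.
front = encode_bformat(1.0, 0.0, 0.0)
```

In practice this encoding runs per-sample over whole audio buffers, but the per-sample arithmetic is exactly this simple.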

In the picture to the right, we can see a visual representation of this audio encoding format, which is known as B-format. One of its biggest advantages is that it doesn’t dictate a specific speaker configuration in the way that multichannel surround sound does. Instead, the B-format encoding process allows the audio to be subsequently decoded by the end user’s speaker system. This allows the audio in B-format to accommodate many different speaker arrays and configurations. Even more interesting for its application in VR, the B-format can be reconfigured into almost any playback format.
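To illustrate why B-format is layout-agnostic, here's a rough sketch of a basic first-order decode: each speaker feed is simply a weighted sum of the same four channels, sampled in that speaker's direction. The scaling and function names are simplified for illustration and aren't taken from any particular decoder:

```python
import math

def decode_bformat(w, x, y, z, speaker_dirs):
    """Naively decode one B-format frame to an arbitrary speaker layout.

    speaker_dirs is a list of (azimuth, elevation) pairs in radians.
    Each speaker "samples" the soundfield in its own direction, which is
    why the same four channels can feed any number of speakers.
    """
    feeds = []
    for az, el in speaker_dirs:
        gain = (w * math.sqrt(0.5)
                + x * math.cos(az) * math.cos(el)
                + y * math.sin(az) * math.cos(el)
                + z * math.sin(el))
        feeds.append(0.5 * gain)
    return feeds

# The same soundfield decodes to a square quad layout -- or any other
# layout -- just by changing the list of speaker directions.
quad = [(math.radians(a), 0.0) for a in (45, 135, 225, 315)]
feeds = decode_bformat(math.sqrt(0.5), 1.0, 0.0, 0.0, quad)
```

A source encoded at the front naturally comes out louder in the front speakers than in the rear ones, with no layout-specific mixing decisions baked into the recording itself.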

For instance, composers may find this useful when trying to record an orchestra or other live ensemble for use in 3D sound. Using an ambisonic soundfield microphone (for example, the TetraMic from Core Sound), the composer can capture a three-dimensional recording that can then be reconfigured into many different playback formats (including the binaural/HRTF format currently favored by VR developers).  This music recording, captured with the ambisonic method, can then be dropped into 3DCeption using what they call an Ambi Array, which enables the music to function as a binaural soundfield that reacts faithfully to the orientation and head tracking of the player.  This has the potential to give the music a much more natural integration with the rest of the aural environment.
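The "reacts to head tracking" part comes down to counter-rotating the soundfield against the listener's head orientation before the binaural decode. A minimal yaw-only sketch of the idea, assuming first-order B-format (the function name and rotation convention here are illustrative, not 3DCeption's actual API):

```python
import math

def rotate_soundfield_yaw(w, x, y, z, head_yaw):
    """Counter-rotate one B-format frame against the listener's yaw.

    Under a rotation about the vertical axis only X and Y change;
    W (pressure) and Z (height) are unaffected. Applying the inverse
    of the head's yaw keeps sources anchored in the virtual world
    as the player turns their head.
    """
    theta = -head_yaw  # inverse rotation: the world stays put
    x_r = x * math.cos(theta) - y * math.sin(theta)
    y_r = x * math.sin(theta) + y * math.cos(theta)
    return w, x_r, y_r, z
```

If the player turns 90 degrees to the left, a source that was straight ahead ends up on their right: the X energy moves onto the (negated) Y channel, and the subsequent binaural decode places it accordingly.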

I haven’t yet seen any demonstrations of three-dimensional positioning for music using 3DCeption.  However, we can get a sense of the possibilities of the 3DCeption software in these two sound design demonstrations (use headphones to experience the three-dimensional audio):


The Oculus Spatializer Plugin for DAWs

As we know, creating audio for the three dimensional space of a VR experience can involve a clumsy workflow as composers and sound designers jump from one complex audio application to another. Fortunately, software developers have been busily addressing this problem (as we learned in our discussion of the 3DCeption Spatial Workstation). Now Oculus (gearing up to release the Rift in the first quarter of 2016) wants to make things simpler for audio creators.

With the Oculus Spatializer Plugin for DAWs, users of digital audio workstations (such as Pro Tools, Nuendo, Cubase, Reaper, etc.) can preview their music and sounds in the 3D spatialized environment of VR, using positioning automation that faithfully replicates the immersive aural environment of the Oculus Rift.  Pictured below is how the Oculus Spatializer Plugin window looks within a DAW.

Let’s take a look at a video that demonstrates the Oculus Spatializer Plugin from within the Ableton Live application:


Winifred Phillips is an award-winning game music composer whose most recent project is the triple-A first person shooter Homefront: The Revolution.  Her credits include five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the Massachusetts Institute of Technology Press.  As a VR game music expert, she writes frequently on the future of music in virtual reality video games.

Follow her on Twitter @winphillips.
