Gamers Chart The Path In VR Audio

October 3, 2016
Above: Visitors play the video game "Star Wars Battlefront," published by Electronic Arts, at Paris Games Week, a video game trade fair held last year in Paris, France.

"If the audio isn’t perfect, it takes you completely out of the virtual experience," one GameSound Con-goer tells Billboard.
 
As a medium that’s paved the way in spatialized sound, video games are a bellwether as the entertainment industry makes its big push into virtual reality -- and while the implications are huge for pro audio, they don’t stop there.
 
“In VR, there are no picture cuts,” Scott Gershin, creator of the Sound Lab at Technicolor, told attendees of GameSoundCon this week. “Sound, music and dialogue are what create the phrasing, the pacing,” and they will assume a frontline role in guiding end-users through virtual and augmented reality experiences -- topics that took center stage at the annual industry conference, held Sept. 27-28 at the Millennium Biltmore in downtown Los Angeles.
 
The timing syncs with HTC throwing open its Vive app store on Friday (Sept. 30) to support the VR hardware it began shipping in the spring, when Facebook’s Oculus Rift and Samsung’s Gear VR also came to market. October sees PlayStation VR hitting shelves, while Microsoft’s HoloLens AR technology is expected to debut in January. The hardware tsunami has triggered a content-quake, the reverberations of which are being felt throughout the sound sector.
 
“Linear storytelling is a lot different than living in a simulation,” said Gershin, a sound designer and editor who has worked on films including Star Trek, Gladiator and Braveheart. As mixes evolve to fill 360-degree space, “Everybody wants to twirl, but that’s not what it’s about. VR needs focal points, and audio is going to give you that. Is somebody fighting in the distance? Are you in a cave with drips? Music and sound will provide the equivalent of establishing shots.”
 
For the game set, full frontal mixes have already faded into the past. To what extent this will influence the wider world of recording is an interesting question. Certainly its influence will be felt, as algorithms and ambisonics allow engineers to spatialize sound with a high degree of specificity, whether the output is intended for Dolby Atmos speakers or Oculus Rift headphones. “With VR, the mix for the score can happen around you,” said Michael Bross, a composer with more than 20 years of game audio experience, including the soundtrack-driven “Oddworld” series, and more recently the “Edge of Nowhere” VR game for Rift. “I might have 60 tracks, and I can spatialize each track,” Bross explained.
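To make the mechanics concrete, here is a minimal sketch (Python with NumPy) of the kind of encoding such spatialization rests on: each mono stem is panned into a first-order ambisonic (B-format) mix by its direction relative to the listener, and the summed mix can later be decoded for speakers or binaural headphones. The stem data, directions and simple sin/cos encoding are illustrative assumptions, not any particular engine's pipeline.

```python
import numpy as np

def encode_foa(mono, azimuth_deg, elevation_deg):
    """Encode a mono stem into first-order ambisonics (B-format: W, X, Y, Z)."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = mono / np.sqrt(2.0)             # omnidirectional component
    x = mono * np.cos(az) * np.cos(el)  # front-back axis
    y = mono * np.sin(az) * np.cos(el)  # left-right axis
    z = mono * np.sin(el)               # up-down axis
    return np.stack([w, x, y, z])       # shape (4, num_samples)

# Illustration: 60 stems, each spatialized to its own direction, summed into
# one B-format bed that a decoder could render for Atmos speakers or headphones.
rng = np.random.default_rng(0)
stems = [0.01 * rng.standard_normal(48000) for _ in range(60)]
directions = [(i * 6.0, 0.0) for i in range(60)]  # azimuths spread around the listener
bformat_mix = sum(encode_foa(s, az, el) for s, (az, el) in zip(stems, directions))
print(bformat_mix.shape)  # (4, 48000)
```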
 
To experiment with this level of elasticity, sound designers and engineers ditch traditional DAWs in favor of a game engine, most often Unity, coupled with an integrated audio middleware solution -- Wwise and FMOD being the most popular.
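As a rough illustration of why that workflow differs from bouncing a fixed mix in a DAW, the toy model below shows the event-and-parameter pattern middleware such as Wwise and FMOD exposes to the engine. It is not the real Wwise or FMOD API; the event names, file names and the tension parameter are invented for the example.

```python
import random

class ToyMiddleware:
    """A toy stand-in for audio middleware (NOT the real Wwise/FMOD API):
    sound behavior is authored as events plus game-driven parameters,
    rather than as fixed audio tracks bounced out of a DAW."""

    def __init__(self):
        self.events = {}      # event name -> candidate sound files
        self.parameters = {}  # game-driven parameters (akin to Wwise RTPCs)

    def register_event(self, name, variations):
        self.events[name] = variations

    def set_parameter(self, name, value):
        self.parameters[name] = value

    def post_event(self, name):
        # Pick a random variation and scale its level by a game parameter,
        # so the same event can sound different on every trigger.
        clip = random.choice(self.events[name])
        tension = self.parameters.get("player_tension", 0.5)
        print(f"play {clip} at level {0.5 + 0.5 * tension:.2f}")

# Hypothetical usage from an engine's game loop:
audio = ToyMiddleware()
audio.register_event("footstep", ["step_01.wav", "step_02.wav", "step_03.wav"])
audio.set_parameter("player_tension", 0.8)
audio.post_event("footstep")
```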
 
Wwise is licensed on a per-project basis, and Simon Ashby, co-founder of Wwise parent company Audiokinetic, said projects ballooned from 50 in July to more than 100 as of September. “What we’re hearing from customers is if the audio isn’t perfect, it takes you completely out of the virtual experience,” Ashby said, recounting outreach “from producers who say, ‘We have this great project, but the audio just isn’t working. Please help us fix it.' We’ve had a few calls like that.”
 
The scheduled December opening in Culver City, Calif., of the Technicolor Experience Center, devoted to VR and AR, is one example of the industry’s commitment. Formosa Group, a post-production sound company in West Hollywood, is another major player that’s made a significant investment in VR sound.
 
“At Formosa Interactive we’ve been doing 3-D audio for so long, in the triple-A game space, that we are actually quite adept at spatializing sound, making sense of full 360-degree realism,” VP creative services Paul Lipson said, rattling off a list of credits that includes the Call of Duty franchise, Halo Wars 2, Star Wars: Battlefront and League of Legends. “It comes down to how do we optimize that experience? It’s all CPU and processing-dependent, so in addition to the goals of the story, or user experience, you have to understand the hardware limitations.”
 
Jaclyn Shumate, audio director for PopCap Games, was intrigued by the GameSoundCon presentation by Paul Hembree on virtual instrumentation and procedurally generated music. “It got me to thinking how, in VR, you could have the music react to where you move in the room, or emit different mood sounds triggered by where you go or what you do.”
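A back-of-the-envelope version of that idea might map the listener's position onto per-stem gains, so the score literally reacts to where you stand. The sketch below is a hypothetical illustration -- the stem names, room layout and linear falloff are assumptions, not Shumate's or Hembree's actual approach.

```python
import math

def position_reactive_levels(listener_xy, emitters):
    """Map the listener's position to per-stem gains (0..1) so the score
    reacts to where the player stands. Each emitter is (stem_name, (x, y),
    radius); gain falls off linearly with distance."""
    lx, ly = listener_xy
    levels = {}
    for name, (ex, ey), radius in emitters:
        dist = math.hypot(ex - lx, ey - ly)
        levels[name] = max(0.0, 1.0 - dist / radius)
    return levels

# Hypothetical room layout: three musical stems anchored to spots in a VR scene.
room = [
    ("calm_pad",    (0.0, 0.0), 6.0),
    ("tense_drone", (8.0, 2.0), 5.0),
    ("percussion",  (4.0, 7.0), 4.0),
]
print(position_reactive_levels((7.0, 2.5), room))
```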
 
Her colleague, PopCap audio director and composer Becky Allen, saw mobile potential in a similar demonstration by electronic music composer Isaac Schankler on “generative game music.” Schankler, who teaches at Cal Poly Pomona, described “a way to automatically trigger complex patterns of music from simple rules. So you can dynamically generate a lot of music, and it doesn’t need to loop or be stored in audio files.” For mobile games, that’s a big plus, because smaller files mean a more nimble app. Another useful aspect is that his automata respond to gameplay in an interactive, “in-the-moment way. You don’t need to queue up a new track. You just need to change a parameter in the automaton, and it’s going to respond to game events in minute and subtle ways and dramatic ways.”
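For a flavor of what "complex patterns from simple rules" can look like in practice, the sketch below drives note patterns from an elementary cellular automaton: changing the rule number in response to a game event reshapes the music immediately, with no stored loops. The rule, scale and note mapping are illustrative assumptions, not Schankler's actual system.

```python
def step(cells, rule):
    """Advance an elementary cellular automaton by one generation."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def generate_bars(seed, rule, bars):
    """Yield one bar of pitches per generation: live cells become notes."""
    scale = [60, 62, 65, 67, 70, 72, 74, 77]  # MIDI pitches of an arbitrary scale
    cells = seed
    for _ in range(bars):
        yield [scale[i % len(scale)] for i, c in enumerate(cells) if c]
        cells = step(cells, rule)

seed = [0] * 8 + [1] + [0] * 7  # a single live cell in a 16-cell row
for bar in generate_bars(seed, rule=110, bars=4):
    print(bar)
# Changing `rule` in response to a game event reshapes the pattern at once --
# no new audio track has to be cued up or streamed from disk.
```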
 
This year marked the first time VR had a dedicated track at GameSoundCon. “Last year, we had four rooms going. This year we had five,” said GameSoundCon producer Brian Schmidt, who noted the conference sold out, with more than 350 attendees -- a 20 percent increase over last year.
 
Rivaling VR in buzz was composer Gordy Haab, whose presentation on “The Music of Star Wars: Battlefront” was a standing-room-only event. Haab recounted in detail how when it came to the influence of legendary Star Wars composer John Williams, “I did everything I could to channel him without imitating the music.”
 
As with Williams’ theatrical scores for the Star Wars films, Haab’s DICE production for Electronic Arts was recorded at Abbey Road Studios with the London Symphony Orchestra, luxuries that made the audience of composers -- most of whom relied on MIDI “orchestras” and were lucky to have a budget that accommodated the hiring of a few soloists -- swoon.
 
That Haab got to write original themes for characters Chewbacca and Greedo awed the crowd, which thrilled to the fact that one of their own bested “other composers that were more established and further along in their careers.” Hans Zimmer, Mark Mothersbaugh and Harry Gregson-Williams have all composed for games, and it was presumably this caliber of talent over which Haab’s gamesmanship triumphed. A scrupulously crafted nine-minute spec demo, professionally mixed by longtime collaborator John Rodd, won the gig -- and the day. Well played.
