People often talk about the image as the critical component of both traditional filmmaking and VR. But sound is equally important. When telling a story in VR, traditional visual tools like picture editing, composition, and zooms don't really apply, because the whole point of VR is to be immersed in a 360-degree spherical world where we, the viewers, can look wherever we want. Sound is different. The exciting thing about virtual reality sound design is that all the traditional post-production sound editing and mixing concepts we grew up with still apply – and on top of them, VR opens up many exciting new sound-driven approaches to storytelling.
In real life, we use our binaural hearing to understand where we are in physical space – it's not just our eyes that tell us this, but our ears too. In a process called sound localization, our brain uses the sounds we hear, both direct sounds and their reflections off walls, glass windows, and other surfaces, to help us understand where we are in physical space – and, for that matter, where the things around us are in relation to us. It's a dynamic and largely unconscious process that happens continuously as we go through our lives.
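One of the strongest localization cues is the interaural time difference (ITD): a sound off to one side reaches the near ear slightly before the far ear. A minimal sketch of that arithmetic, using Woodworth's classic approximation – the head radius and speed-of-sound values are typical illustrative numbers, not measurements from any particular listener:

```python
import math

# Woodworth's approximation for interaural time difference (ITD):
# for a distant source at azimuth theta (radians, 0 = straight ahead),
# the extra path to the far ear gives ITD = (r / c) * (theta + sin(theta)).
HEAD_RADIUS_M = 0.0875      # illustrative: average adult head radius, ~8.75 cm
SPEED_OF_SOUND_MPS = 343.0  # speed of sound in air at roughly 20 C

def itd_seconds(azimuth_deg: float) -> float:
    """Approximate ITD for a distant source at the given azimuth."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND_MPS) * (theta + math.sin(theta))

# A source directly ahead produces no time difference; a source 90 degrees
# to one side arrives at the near ear roughly two-thirds of a millisecond
# earlier than at the far ear.
print(itd_seconds(0) * 1000)   # ~0 ms
print(itd_seconds(90) * 1000)  # ~0.66 ms
```

Sub-millisecond differences like these are all the brain needs – which is why spatial audio engines go to such lengths to reproduce them accurately per ear.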
For years, sound in cinema has tried to emulate this, first with stereophonic sound, and then with the various surround formats like the common 5.1 or Dolby Atmos. But the problem has always been that the cinema screen is a fixed, unmoving rectangular window through which we look out into the world. So the sound of the Star Wars TIE fighters flying from the back of the theatre to the front – and then suddenly appearing on screen – never works completely, because there is a disconnect between the fully 360-degree sound environment and the fixed, limited view of the movie screen.
How Virtual Reality Sound Design changes that.
But in VR, those TIE fighters actually do come from behind us. We can turn our heads and watch them fly toward us. Finally, the visual has caught up to the place where sound has been leading us. Thus, understanding and using the concept of sound localization to create the virtual reality sound design for our stories allows us to not only place the viewer in a believable virtual space, but provide story cues in a way never before possible.
In traditional cinema sound design, if there was a barking dog in the rear surrounds – no one in the theatre would turn their heads to look to see the dog. But in the VR world, if there is an incessantly barking dog “behind us”, we may well turn our heads to see what that dog is barking about.
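To make that barking dog stay "behind us" even as we turn to look, a VR audio engine must keep each source fixed in the world and re-pan it against the listener's head orientation on every frame. Here is a deliberately simplified sketch of that idea – the function names and the crude two-channel constant-power pan are illustrative assumptions, not any real engine's API (real engines use HRTF filtering, not simple stereo gains):

```python
import math

# Illustrative sketch: keep a world-fixed sound source (the barking dog)
# positioned correctly relative to a turning head. Azimuth and head yaw
# share one convention here: degrees clockwise from "front", so positive
# relative azimuth means the source is off to the listener's right.

def head_relative_azimuth(source_world_deg: float, head_yaw_deg: float) -> float:
    """Azimuth of the source relative to where the head is facing, in -180..180."""
    return (source_world_deg - head_yaw_deg + 180) % 360 - 180

def stereo_gains(relative_deg: float) -> tuple[float, float]:
    """Crude constant-power stereo pan (front/back ambiguity ignored)."""
    pan = math.sin(math.radians(relative_deg))  # -1 = hard left .. +1 = hard right
    angle = (pan + 1) * math.pi / 4             # 0 .. pi/2
    return math.cos(angle), math.sin(angle)     # (left gain, right gain)

# Dog barks directly behind the listener (world azimuth 180). As the head
# turns toward it, the relative azimuth sweeps from behind to straight
# ahead and the pan recenters.
for yaw in (0, 90, 180):
    rel = head_relative_azimuth(180, yaw)
    left, right = stereo_gains(rel)
    print(yaw, rel, round(left, 2), round(right, 2))
```

The design point is that the source position never changes – only the listener's orientation does – which is exactly the inversion that makes "turning to look at the dog" possible in VR and meaningless in a traditional theatre.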
In this way, directors of VR stories can use well-designed sound and sound editing to help shape and direct the viewer's experience – as indicators of where viewers might place their primary focus, in a world where they can be looking anywhere. Because, remember, the audience is listening.