I promised additional discussion of my experience at the AES AVAR (Audio for Virtual and Augmented Reality) conference a couple of weeks ago. I’m sorry for the delay in getting back to the topic; I was in Colorado and distracted with other things. But the world of VR is a booming marketplace, and I wanted to make sure to get back to it before moving on to other things.
Virtual Reality has been around for many years, but the next three to five years should see the market explode; the estimates I’ve read put VR at a $120 billion market within that window. More than 4,000 production companies are working on VR productions, and technology companies are scrambling to develop the tools and procedures needed to create compelling content. The early experiments that I’ve seen and heard have been pretty limited. Would I put my iPhone in a Google Cardboard VR viewer or strap a video screen to my head on a regular basis? No. My interest is whether the development of audio/music for VR or Augmented Reality has merit. I want to know how to capture or create audio for VR, how to post it, and how to distribute it. And I want to see whether the music industry will have a piece of this market or whether it will be limited to games and other non-music experiences.
The AVAR conference was full of technologists and academics showing off their wares. They discussed the challenges of convincing people that a virtual sound experience can be as real as real life. That means personalizing the delivery of the audio channels using HRTFs (head-related transfer functions) measured for my own ears, and it means having the ability to track the movement and location of my head. These things are nontrivial and require much more than powerful processors and clever algorithms. How many people have their own HRTF stored on their computer or smartphone? How would you even go about getting your head and ears measured?
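To give a feel for what a binaural renderer is doing under the hood, here is a minimal sketch in Python. It does not use measured HRTFs at all; instead it fakes the two dominant localization cues (interaural time and level differences) with a toy spherical-head approximation. The function name, the head radius, and the attenuation figure are all my own illustrative assumptions, not anything presented at AVAR; a real system would convolve the source with per-listener HRIRs and update them continuously from head-tracking data.

```python
import numpy as np

def render_binaural(mono, azimuth_deg, fs=48000):
    """Toy binaural sketch: approximate ITD/ILD for a source at
    azimuth_deg (0 = front, +90 = hard right). A real renderer would
    convolve with measured, per-listener HRIRs instead of this model."""
    az = np.deg2rad(azimuth_deg)
    head_radius = 0.0875  # meters, rough average head (assumption)
    c = 343.0             # speed of sound in air, m/s
    # Interaural time difference, Woodworth-style approximation
    itd = head_radius / c * (abs(az) + abs(np.sin(az)))
    delay = int(round(itd * fs))  # samples of lag at the far ear
    # Interaural level difference: attenuate the far ear (toy figure)
    near = 1.0
    far = 10 ** (-6 * abs(np.sin(az)) / 20)
    delayed = np.concatenate([np.zeros(delay), mono])  # far ear, late
    direct = np.concatenate([mono, np.zeros(delay)])   # near ear
    if azimuth_deg >= 0:   # source on the right: left ear is far/late
        left, right = far * delayed, near * direct
    else:                  # source on the left: right ear is far/late
        left, right = near * direct, far * delayed
    return np.stack([left, right])

# Head tracking simply rotates the source relative to the head:
# if the listener turns +30 degrees, render the source 30 degrees
# further to the other side.
tone = np.sin(2 * np.pi * 440 * np.arange(4800) / 48000)
stereo = render_binaural(tone, azimuth_deg=60 - 30)
```

Even this crude model hints at why the real problem is hard: the cues have to be recomputed every time the head moves, and the generic spherical head here is exactly what personalized HRTF measurement is meant to replace.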
What about the aesthetics of music production via VR? Does it make sense to place a soundfield microphone or quad binaural microphone (yes, they exist) in the midst of a performing ensemble and then deliver it via headphones with motion tracking to VR listeners? The people at Sony Music showed off a VR music video at a recent event and limited the audio to the standard released stereo master. They disconnected the visual experience from the soundtrack, which diminishes the whole thing. My Blu-ray productions don’t try to be VR experiences, but they do have compelling video (some in 3D) and 5.1 surround sound. The most common complaint or comment I get from customers is the disconnect between the video and the audio. They see a drum kit on the left-hand side of the stage and hear the drums on the right side of the 5.1 surround mix. They ask whether I can fix it or make the music mixes track the visuals. No, I can’t. In my productions, the music comes first. The video is a bonus. If you feel the video and audio are at odds with each other, switch the video off!
The presenters at the AES AVAR event didn’t address the problems of getting VR video to “sync” with music. In fact, one presenter actually panned the lead singer of a live music performance around the 3D space to “lock” the vocal track to the video. Never mind that the whole mix was destroyed. We are in the early days of VR and music. It reminds me of the dawn of 5.1 surround mixing all those years ago. There are no templates to follow.
My fear is that the excitement over the new technology will relegate the power of a well-recorded music track to a supporting role behind a 360 video. Music should maintain its place as a premiere experience. The VR guys have to figure this out, and from what I saw and heard at AVAR, they are a long way from doing so.