Our first case study begins with avid Works user Petr Soupa, sound designer at Soundsquare and C-flat. He has worked in post-production for more than 20 years and sees VR as an exciting opportunity that opens many possibilities for sound.
G’s: How did the Hans Krása Quartet (HKQ) project begin?
Petr: When VR first broke out, I was very excited to test the possibilities of the new technology. My first VR job was on a project called “The Committee,” a short, humorous drama available on Kaleidoscope VR. During this project, I learned the concept of spatial audio. Soon after, I discovered Works, the plugin I used to create HKQ. HKQ began this past June, once I had become comfortable working in 360 video. As a music enthusiast and audio engineering professional, I knew I wanted to create VR content that highlighted music.
Please note: The audio quality and localization accuracy in this video are reduced by the FOA format and the YouTube renderer.
I specifically chose classical music because I was very curious to hear it through VR. As a music producer and composer myself, I have a lot of respect for classical music. I knew if I could portray the feeling of someone sitting in the middle of a quartet playing harmonious music, it would be a very powerful and unique experience.
G’s: Did you find it difficult to create HKQ?
Petr: The whole process was actually not that difficult. We recorded each of the two compositions only twice, as we worked with a talented, professional group of musicians. For me, HKQ was even easier than the 360-degree drama I had previously worked on. In the drama, we had to figure out multiple components, including hiding microphones from the shots and handling characters moving about the scenes; in the music piece, microphones are a natural part of the picture and the players were seated throughout the performance. In terms of post-production, very little automation was needed in HKQ. It also helped that I could leverage the experience from my first 360 video, since I knew more about the workflow and post-production processes. Everything felt more controlled in HKQ, which just goes to show how much you can learn by doing.
One might think spatialization is an extremely difficult job that takes copious amounts of time, but it’s actually not that time-consuming. Spatialization, or creating automation, is just one additional step in which you place the content in 360 degrees. Compared to a regular project, what is harder in 360 is managing the many elements involved. Aside from the automation, these include recording clear dialogue without showing microphones, as well as keeping out of frame the crew and equipment that would usually stay “behind the scenes” in traditional video.
G’s: What were you aiming to accomplish with this project?
Petr: What I really aimed to achieve was bringing the music close to listeners so they could hear detailed sound clearly. My interest in the video stemmed from wondering whether this would even be possible. I wanted listeners to feel like they were actually sitting amongst these amazing musicians, which is why we recorded in a space with great acoustics while still aiming to capture dry audio signals. To deliver detailed sound, I used a combination of an Ambisonic microphone and spot microphones. The Ambisonic microphone is useful, but it cannot be relied on alone to capture individual, detailed sounds. Just as ADR is used in traditional media to record clear, isolated dialogue, the Ambisonic signal by itself is not enough to deliver detailed sound.
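Petr’s hybrid approach, an Ambisonic “bed” plus close spot microphones, can be sketched as encoding each mono spot signal into first-order Ambisonics and summing it with the bed. The sketch below is illustrative only: it assumes the common AmbiX convention (ACN channel order W, Y, Z, X with SN3D normalization), and all signal names and positions are hypothetical, not taken from Petr’s actual session.

```python
import numpy as np

def foa_encode(mono, azimuth_deg, elevation_deg):
    """Encode a mono signal into first-order Ambisonics
    (AmbiX: channel order W, Y, Z, X; SN3D normalization)."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = mono * 1.0                      # omnidirectional component
    y = mono * np.sin(az) * np.cos(el)  # left-right figure-8
    z = mono * np.sin(el)               # up-down figure-8
    x = mono * np.cos(az) * np.cos(el)  # front-back figure-8
    return np.stack([w, y, z, x])       # shape: (4, n_samples)

# Hypothetical mix: a 4-channel bed (standing in for the SoundField
# recording) plus one spot mic, e.g. a violin 30 degrees to the left.
n = 48000
bed = np.zeros((4, n))                  # placeholder for the recorded bed
violin_spot = np.random.randn(n) * 0.1  # placeholder for a KM185 signal
mix = bed + foa_encode(violin_spot, azimuth_deg=30, elevation_deg=0)
```

Once every spot signal has been encoded toward its on-screen position, the summed FOA mix rotates coherently with the listener’s head, which is what lets the close-miked detail stay locked to each musician.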
G’s: Can you elaborate more about the setup?
Petr: I strategically chose the Bohuslav Martinů Hall, part of the Academy of Music and Dance in Prague (HAMU), based on several factors. The architecture and design of the hall reflect the sentiments of classical music. We chose to record early in the morning, when sunlight cascades through the windows, further adding to the ambiance we wanted to depict. Lastly, it offered superior acoustics, as well as a recording studio with an SSL console.
On recording day, we had one SoundField Ambisonic microphone under the Nokia OZO camera, and each instrument had its own Neumann KM185 microphone. The most challenging part of the setup was finding the proper position for the camera. Since each musician had sheet music on a stand, we had to place each stand strategically to avoid obstructing the camera’s view. At the end of the day, I was very pleased with the quality of the captured audio signals.
G’s: Why are the players placed at nearly 180 degrees, but not 360 degrees?
Petr: I thought about that too, but then I remembered my primary goal: to bring the best possible music to the audience. I wanted to transport listeners directly into the scene, where they could clearly hear the music and better understand how it is played in a quartet. I believe the most comfortable way to listen is not a full 360 degrees, because you would be constantly turning around. Think of color TV: you don’t see every possible color on screen all the time. The same goes for color grading in modern film. It’s the same in VR. I don’t want to use VR as an element of surprise for audiences; my job is to make sure their listening experience is enhanced by VR.
In my next project, I’m even thinking about narrowing the musicians’ placement to approximately 150 degrees. The cameras will be slightly farther from the musicians, so that you can see more with less head movement. Like I said, I don’t want to shock people with the 360-degree format; I only want to bring proper music for audiences to enjoy.
G’s: What does the HKQ project mean to you?
Petr: It was a test for me to answer a few simple questions: Does VR help the music experience or not? How does it impact the audience? This experience gave me solid answers. It definitely does help, and in many ways.
I was unsure before the first shoot what the outcome would be. You really have to concentrate to hear the music. In 360, or when interactivity is required, you can become distracted, and I wasn’t sure whether VR would help or perhaps hinder the music. But I discovered that it really enhances the entire experience and places the listener in the middle of the musicians. You’re able to understand the communication between the performers: who’s playing what, and why. Even the musicians were amused by the outcome and by people’s reactions.
I have to commend G’Audio’s solutions for helping me achieve this. On top of the many advantages of using Works, such as its easy-to-use features, the most important aspect is the resulting sound quality. The GA5 format’s quality is so great that it brings a new level of experience to VR; sound professionals can hear the difference between GA5 and FOA. That’s why I’m really looking forward to seeing the Sol renderer become more accessible to the public [the Sol renderer can play GA5]. I’ve heard many people ask how to deliver quality sound in VR, and I’m sure that once Sol is widely available, it will answer this question.
G’s: What else do you want to try in VR?
Petr: I’m not a producer, and the traditional film language is lost in VR. Until producers figure out how to cultivate a new kind of storytelling, my next project will be music again, but with a different purpose. HKQ was a confirmation that the approach works; with the new project, I want to test whether distribution will work at the consumer level. One of my ultimate goals is to shoot from a conductor’s point of view with a large orchestra. It’s a bit far off at the moment, but it would be a dream come true one day.
This is the first in a series of G’Audio community user stories. Please submit your stories at firstname.lastname@example.org.