WRITTEN BY Isabel Roney

Studio apprentice, BOOM BOX POST

Starting your first 360 audio project can be a bit overwhelming, so I decided to focus my Lunch and Learn on some essential concepts to help you get started. This post is specifically about designing for 360 video or VR experiences that are intended to be heard through headphones.

Jeff wrote a great blog post about a 360 video project Boom Box did for The Loud House a few years ago, so make sure to check that one out, too!

What does a 360 plugin actually do? 

A 360 plugin attenuates sound that falls outside of the viewer’s field of view. Our field of view is generally considered to be about 180 degrees, but our “focus area” is only about half of that range, roughly 45 degrees to either side of where we’re looking. Any sound object placed outside of that focus area, or placed far away, gets EQ, reverb, and volume automation to simulate its location.
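
As a rough mental model only (not any specific plugin’s actual algorithm), here’s a little Python sketch of how a spatializer might reduce a source’s level based on how far it sits outside the focus area and how far away it is. The focus angle and falloff curves here are made-up illustration numbers.

```python
import math

def spatial_gain(azimuth_deg, distance_m, focus_half_angle=45.0):
    """Toy gain model: full level inside the focus cone, tapering off
    outside it, plus simple inverse-distance attenuation.
    These curves are illustrative only -- real 360 plugins also apply
    EQ, reverb, and HRTF processing, not just level changes."""
    # Angular attenuation: 1.0 inside the focus cone, fading toward 0.25 at 180 degrees.
    off_axis = max(0.0, abs(azimuth_deg) - focus_half_angle)
    angular = 1.0 - 0.75 * (off_axis / (180.0 - focus_half_angle))

    # Distance attenuation: inverse-distance beyond 1 meter.
    dist = 1.0 / max(1.0, distance_m)

    return angular * dist

# A source 120 degrees behind the listener, 3 meters away:
print(round(spatial_gain(120.0, 3.0), 3))  # noticeably quieter than an on-axis source
```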

Every 360 plugin is set up a bit differently, but almost all of them include three essential parts: the control, the spatializer, and the loudness meter.

The control routes audio to binaural playback so that you can monitor your work. It also manages any global parameters in the session.

The spatializer is the interactive panning component. Each sound source has its own point on a spatializer graph that you can use to position it, just like in the Pro Tools 5.1 panner. Anything routed through the spatializer is affected by the viewer’s head movement.

The loudness meter looks like the meters you would encounter in a stereo or 5.1 mix, but it’s important to note that you must use a loudness meter specifically meant for reading ambisonic mixes. These meters are designed to measure the peak loudness of your mix as if the viewer is facing the loudest point of sound at any given moment.


What are Ambisonics?

Since we want to hear sound coming from specific locations depending on where the viewer is looking, we can’t print our completed timeline down to a stereo, 5.1, or 7.1 mix. We need an audio format that allows for head-tracking, which means we need ambisonics. Ambisonic recordings take in information from all directions. They have several channels, with higher “orders” of ambisonics having more channels: first order has four channels, second order has nine, third order has 16, and so on. For most projects, like VR and 360 videos, you won’t really need anything higher than second order.
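
If you’re curious about the pattern behind those channel counts, an order-N ambisonic mix carries (N + 1)² channels. A quick sketch:

```python
def ambisonic_channels(order: int) -> int:
    """Number of channels in a full-sphere ambisonic mix of the given order: (N + 1)^2."""
    return (order + 1) ** 2

for order in (1, 2, 3):
    print(f"Order {order}: {ambisonic_channels(order)} channels")
# Order 1: 4 channels
# Order 2: 9 channels
# Order 3: 16 channels
```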


Setting Up and Understanding Your Session

The video format: Although the final video is delivered in 360, it needs to be flattened out while editing. This flattened version is called an equirectangular video, which is made using a monoscopic camera, and it allows us to see every inch of the frame at once.
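
If it helps to picture how that flattened frame relates to positions around the viewer, here’s a small sketch that maps a direction onto pixel coordinates. It assumes a standard equirectangular layout (yaw across the horizontal axis, pitch down the vertical axis), and the frame size is just an example.

```python
def direction_to_pixel(yaw_deg, pitch_deg, width=3840, height=1920):
    """Map a direction (yaw: -180..180, pitch: -90..90, in degrees)
    onto an equirectangular frame. Yaw 0 / pitch 0 lands at the center
    of the frame, which is 'straight ahead' in most 360 players."""
    x = (yaw_deg + 180.0) / 360.0 * width
    y = (90.0 - pitch_deg) / 180.0 * height
    return int(x), int(y)

print(direction_to_pixel(0, 0))    # (1920, 960) -- center of the frame
print(direction_to_pixel(90, 30))  # off to the right and above the horizon
```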

Binaural Playback: Most 3D audio plug-ins don’t use head-tracking during editing. Instead, the 360 plugin converts your work to binaural playback, heard through headphones, which still incorporates attenuation based on each sound’s location.

Headlocked audio: If you want something to exist “outside” of the world of the video, such as non-diegetic score or voiceover, you need to create headlocked audio. “Headlocked” means that the sound does not change no matter the viewer’s orientation. Incorporating headlocked audio differs from plug-in to plug-in, but a lot of the time you can print a master stereo stem along with your ambisonic stem and then combine them in a 360 video encoder. You can also create the illusion of headlocked audio by placing stereo files very close to and on all sides of the viewer in the spatializer. In Audio Ease’s 360pan suite, you can create headlocked audio by turning the position blur dial up to 100%, which changes that sound source from occupying a specific point in space to surrounding the viewer.
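
To make the “no change no matter the orientation” idea concrete, here’s a sketch contrasting a head-locked stereo stem with a first-order ambisonic bed: when the viewer turns their head (a yaw rotation), the ambisonic X and Y channels change, while the head-locked stem never passes through that rotation at all. Channel order and sign conventions vary between formats (this assumes ambiX-style W, Y, Z, X), so treat it as an illustration rather than a drop-in decoder.

```python
import math

def yaw_rotate_foa(w, y, z, x, yaw_deg):
    """Rotate a first-order ambisonic (ambiX: W, Y, Z, X) sample set around
    the vertical axis -- roughly what happens to the sound field when the
    viewer turns their head. Sign conventions vary by implementation."""
    a = math.radians(yaw_deg)
    x_rot = x * math.cos(a) - y * math.sin(a)
    y_rot = x * math.sin(a) + y * math.cos(a)
    return w, y_rot, z, x_rot  # W and Z are unaffected by yaw

# A source straight ahead encodes with energy in W and X only:
w, y, z, x = 1.0, 0.0, 0.0, 1.0
print(yaw_rotate_foa(w, y, z, x, 90))  # energy moves from X into Y

# A head-locked stereo stem is kept outside the rotation entirely:
headlocked_left, headlocked_right = 0.5, 0.5  # printed separately, combined at the encoder
```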

Session Organization

I learned a different way to organize the session than the one explained in Jeff’s post, so I’ll offer it as a possible alternative. At the end of the day, whatever method makes the most sense to you is the way to go!

Instead of splitting the video into quadrants, I like to break my session down object by object. Each object has its own track (or group of tracks) and a spatializer. The elements I design for that object stay on those tracks throughout the entire timeline, regardless of where the object moves. This helps keep your track count low. It also makes it easy to address universal changes for a design once you’ve gotten notes from clients.

Object-based organization makes for a speedy panning process. Once you’re happy with your designs, you can start writing your spatializer automation. Since your audio tracks are specific to one object, you can write all of the panning for that object in one go. If an object only appears for a short duration of the video, I suggest recycling those same tracks for other objects that will appear later on.

Let’s start designing!

Here are some tips on how to approach the creative aspects of spatial audio design.

  1. Sound design first, pan later. Don’t stress about panning until you’re happy with what your work sounds like.

  2. Be selective. Sound directs focus. When working in 360, the sound editor is responsible not only for guiding the viewer’s eyes, but also for guiding the viewer’s entire head. This can be difficult when you’re first starting out in 360. An overabundance of sounds, especially if it’s not clear where they’re coming from, can result in the viewer losing the narrative or leaving the video entirely. As a rule of thumb, try to limit the number of distinct sound sources to 4 or 5 at any given time.

  3. Consider the length of time a sound source is on screen. Short sounds can be difficult to locate. By the time the viewer registers a short sound, the visual component could have disappeared. You can’t predict where the viewer will be looking, so sounds that last for several seconds or that are repetitive are much more comprehensible. Of course, this depends on the situation.

  4. Use the full 360. A viewer goes into a 360 experience expecting to move their head a lot. This is a great opportunity to get creative and think outside of the box.

  5. Always test your panning. Adjusting to the equirectangular video takes time. Test your panning after you’ve done an initial pass. Low frequency sounds and changes in height are difficult to perceive in 360 videos.

  6. Keep rewatching in mind. Many 360 videos are marketed as novel experiences and don’t have long run times. Your 360 video clients want the video to be watched again and again. Every watch will be unique!

  7. Find your balance of realism. 360 videos immerse the viewer in the story. Some audio editing choices can bump the viewer out of that experience, such as: headlocked sounds that should only come from one direction, sounds at volumes and frequency ranges that don’t match their apparent distance, and “empty” backgrounds and ambiances. That being said, never be afraid to push the boundaries!

  8. Freeze tracks you’re not working on. The 360 panner takes up a LOT of CPU power, so I recommend writing automation one object at a time and freezing your tracks when you’re ready to move on.

I hope this overview of 360 audio design gives you some clarity on what to expect when starting your first project. 360 audio is an exciting way to challenge both your technical and creative skills. Plus, it can be a lot of fun!


If you enjoyed this blog, check out these:

LUNCH AND LEARN: USING A SAMPLER TO CREATE MOVING STEADIES

LUNCH AND LEARN: ADVANCED PRO TOOLS SHORTCUTS, TIPS, AND TRICKS

A BEGINNER'S GUIDE TO STARTING YOUR SOUND EFFECTS LIBRARY




Have you ever worked in 360 audio before? What was your experience like and what challenges did you face?
