Sound Designing and Mixing Immersive Audio for MEGAN

Published in Exclusive Interviews

By Peter Bawiec, M.P.S.E.

As recently as a couple of years ago, it was generally thought that immersive formats such as Dolby Atmos could only be used on big studio productions. That is no longer necessarily true. Here’s the story of how a short passion project grew into a full-blown Hollywood production, and how Avid Pro Tools’ native support for Dolby Atmos allowed us to push the boundaries on a proof of concept.

We’ve entered the age of immersive sound becoming a standard deliverable. Dolby Atmos is supported by the majority of consumer devices: home theater systems, televisions, sound bars, and even tablets and smartphones. Incorporating immersive formats into post sound workflows early on ensures that we future-proof our films and embrace what’s ahead of us on the technology front.

This summer, MEGAN, a proof of concept and an homage to the Cloverfield universe, was released on YouTube and went viral within two days, attracting significant press coverage worldwide. Directed by the VFX artist Greg Strasz (Independence Day: Resurgence, 2012, It Follows), the film tapped into the Cloververse fan base, electrifying it with speculation and theories.

At the time of this publication, the 7-minute film has gained over 1 million views on YouTube.

The film was made with the support of industry-leading companies such as Red Digital Cinema and Dolby, which provided the Dolby Vision color finish and its Atmos mix stage. As a sound designer and re-recording mixer, I know sound plays an important role in any film, and even more so in an action movie. For me, Pro Tools and Atmos would become not only a mix tool but ultimately a sound design tool.

From the start, the entire editorial session ran as the standard set of 7.1.2 DX, FX, and MX beds plus an array of objects for the FX, DZN, and MX elements. I also allowed myself an additional food group of just chopper/helicopter elements to keep things organized. The beauty of the integration is that whether you have a regular 5.1 edit room or a 7.1.4 sound design suite/mix room, you can still do proper editorial and prep everything correctly for the mix: at the end of your signal path sits the Dolby Atmos Renderer, which collapses your sounds into the outputs you’re actually monitoring on.
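
To make the bed/object idea and that fold-down concrete, here is a minimal, hypothetical Python sketch of how an object-based renderer can collapse a positional audio object into whatever speaker layout a room happens to have. It is not Dolby’s algorithm, and the speaker positions, layout names, and simple inverse-distance panning law are illustrative assumptions only.

```python
"""Illustrative sketch only: a toy object renderer that folds a positional
audio object down to whatever monitor layout is available. This is NOT the
Dolby Atmos Renderer's algorithm; the layouts and the inverse-distance
panning law are simplifying assumptions (LFE is omitted entirely)."""

import math

# Hypothetical speaker positions in a normalized room:
# x = left..right, y = back..front, z = floor..ceiling, each in 0..1.
LAYOUT_5_1 = {
    "L":  (0.0, 1.0, 0.0), "C": (0.5, 1.0, 0.0), "R":  (1.0, 1.0, 0.0),
    "Ls": (0.0, 0.0, 0.0), "Rs": (1.0, 0.0, 0.0),
}
LAYOUT_7_1_4 = {
    **LAYOUT_5_1,
    "Lss": (0.0, 0.5, 0.0), "Rss": (1.0, 0.5, 0.0),      # side surrounds
    "Ltf": (0.25, 0.75, 1.0), "Rtf": (0.75, 0.75, 1.0),  # top front
    "Ltr": (0.25, 0.25, 1.0), "Rtr": (0.75, 0.25, 1.0),  # top rear
}

def object_gains(position, layout):
    """Return per-speaker gains for one object position, using a simple
    inverse-distance weighting normalized to constant power."""
    weights = {}
    for name, speaker in layout.items():
        distance = math.dist(position, speaker)
        weights[name] = 1.0 / (distance + 1e-3)  # avoid division by zero
    norm = math.sqrt(sum(w * w for w in weights.values()))
    return {name: w / norm for name, w in weights.items()}

if __name__ == "__main__":
    # A helicopter object passing overhead, slightly right of center.
    heli_position = (0.6, 0.5, 0.9)
    for label, layout in (("5.1", LAYOUT_5_1), ("7.1.4", LAYOUT_7_1_4)):
        gains = object_gains(heli_position, layout)
        loudest = sorted(gains, key=gains.get, reverse=True)[:3]
        print(f"{label}: strongest speaker feeds -> {loudest}")
```

In a 5.1 room the overhead object simply folds into the nearest horizontal speakers, while in a 7.1.4 room it lands mostly in the height channels, which is why the same session translates correctly from an edit room to a mix stage.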

Over the course of several months, as the VFX were being created, I worked with the director on the sound design, creating the sound story of MEGAN. During this time we worked in several different rooms, each with a different speaker complement, from 5.1 and 7.1 to 7.1.4, and ultimately the Umlang Theatre at Dolby Burbank for our final mix, which was facilitated by Tom Graham and the entire team at Dolby Laboratories.

“I would walk into Peter’s studio and together we experimented with different ideas. We made the helicopters travel through the audience, we threw alien sounds on the walls, in the room, on the ceiling, experimented with placements of reverbs, etc.,” says director Greg Strasz. “The fact that Peter had complete control of Atmos just within his computer, a couple of faders, and Pro Tools, without having to go to a big facility or run massive computer rigs, allowed us to be creative. The integrated Dolby Atmos workflow within Pro Tools helped us experience the immersive sound design, making it a storytelling tool.”

Watch the movie side by side with the Atmos Monitor.

Even though this was a fairly complex session, ultimately it was the simplicity of the Dolby Atmos integration within Avid Pro Tools that allowed me to just plug in my hard drive and get to work. The real power is the flexibility to move between multiple locations with a single session. Once the technology becomes a controlled variable, the creative aspects of our craft can come to the forefront; that is when you can take complete control of Pro Tools and begin experimenting with telling the narrative through sound. It was the experiments in the editorial stage, within Pro Tools in Atmos, that led us to ideas we simply wouldn’t have come up with otherwise, and doing that work on the mix stage would not have been possible given the ever-present time constraints.

With Avid Pro Tools you have direct control over everything you need to design and mix in Atmos natively, without constantly reaching for additional plug-ins or software that would slow down the creative process. Pro Tools simplifies this by keeping everything under the hood and communicating in the background with the Dolby Atmos Renderer.
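
For a rough mental model of what travels between a DAW and an object-based renderer, here is a purely illustrative Python sketch: wide bed channels plus per-object positional metadata. The class names, fields, and values are assumptions for illustration, not the actual Pro Tools or Dolby Atmos interchange format.

```python
"""Illustrative sketch only: the kind of information a DAW hands to an
object-based renderer -- beds in a fixed channel format plus objects
carrying positional metadata. Names and fields are assumptions, not the
actual Pro Tools / Dolby Atmos interchange format."""

from dataclasses import dataclass, field

@dataclass
class Bed:
    name: str            # e.g. "DX", "FX", "MX"
    channel_format: str  # e.g. "7.1.2"

@dataclass
class AudioObject:
    name: str
    # Normalized position metadata: x (left-right), y (back-front), z (height).
    position: tuple = (0.5, 0.5, 0.0)

@dataclass
class RendererInput:
    beds: list = field(default_factory=list)
    objects: list = field(default_factory=list)

# A session shaped roughly like the one described above: dialogue, effects,
# and music beds plus dedicated helicopter and design objects.
session = RendererInput(
    beds=[Bed("DX", "7.1.2"), Bed("FX", "7.1.2"), Bed("MX", "7.1.2")],
    objects=[
        AudioObject("Chopper_pass", (0.8, 0.3, 0.9)),
        AudioObject("Alien_wall_hit", (0.1, 0.6, 0.5)),
    ],
)
print(f"{len(session.beds)} beds, {len(session.objects)} objects")
```

The point of the split is that the beds stay in a fixed channel format while the objects keep their positions as metadata, so the renderer, not the session, decides how they reach the speakers in any given room.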

Traditionally, there was a clear separation between sound editors and re-recording mixers, but as the industry transitioned to in-the-box mixing, the lines between the two professions blurred more and more. Bringing total control of the immersive platform to the sound editor fuses those professions even further and, more importantly, streamlines and simplifies the process. What we have now is a very organic workflow that anyone can start using immediately, with a minimal learning curve.

At its core, immersive sound is all about bringing the audience deeper into the story. Just like the moves from mono to stereo and later to 5.1, it is a leap forward compared to the previous chapters. Today we’re finally at a point where immersive sound is within reach of any sound editor and mixer. We mixed MEGAN in Atmos to make a point and prove the feasibility of immersive sound on a project of any level, even a short proof of concept made on an extremely limited budget.

About the Author

Peter Bawiec
Peter Bawiec, M.P.S.E., is a sound designer and re-recording mixer working in film and TV. He has been nominated for an MPSE Golden Reel Award for his work on Score: A Film Music Documentary.
