Whose Job is It? When Plug-in Effects are Sound Design vs. Mix Choices.

We’ve reached out to our blog readership several times to ask for blog post suggestions.  And surprisingly, this blog suggestion has come up every single time. It seems that there’s a lot of confusion about who should be processing what.  So, I’m going to attempt to break it down for you.  Keep in mind that these are my thoughts on the subject as someone with 12 years of experience as a sound effects editor and supervising sound editor.  In writing this, I’m hoping to clarify the general thought process behind making the distinction between who should process what.  However, if you ever have a specific question on this topic, I would highly encourage you to reach out to your mixer.

Before we get into the specifics of who should process what, I think the first step to understanding this issue is understanding the role of mixer versus sound designer.

UNDERSTANDING THE ROLES

THE MIXER

If we oversimplify the role of the re-recording mixer, I would say that they have three main objectives when it comes to mixing sound effects.  First, they must balance all of the elements together so that everything is clear and the narrative is dynamic.  Second, they must place everything into the stereo or surround space by panning the elements appropriately.  Third, they must place everything into the acoustic space shown on screen by adding reverb, delay, and EQ.

Obviously, there are many other things accomplished in a mix, but these are the absolute bullet points and the most important for you to understand in this particular scenario.

THE SOUND DESIGNER

The sound designer’s job is to create, edit, and sync sound effects to the picture.


BREAKING IT DOWN

EQ

It is the mixer’s job to EQ effects if they are coming from behind a door, are on a television screen, etc.  Basically, anything where all elements should be futzed for any reason.  If this is the case, do your mixer a favor and ask ahead of time if he/she would like you to split those FX out onto “Futz FX” tracks. You’ll totally win brownie points just for asking.  It is important not to do the actual processing in the SFX editorial, as the mixer may want to alter the amount of “futz” that is applied to achieve maximum clarity, depending on what is happening in the rest of the mix.

It is the sound designer’s job to EQ SFX if any particular elements have too much/too little of any frequency to be appropriate for what’s happening on screen.  Do not ever assume that your mixer is going to listen to every single element you cut in a build, and then individually EQ them to make them sound better.  That’s your job!  Or, better yet, don’t choose crappy SFX in the first place!

REVERB/DELAY

It is the mixer’s job to add reverb or delay to all sound effects when appropriate in order to help them to sit within the physical space shown on screen.  For example, he or she may add a bit of reverb to all sound effects which occur while the characters on screen are walking through an underground cave.  Or, he or she may add a bit of reverb and delay to all sound effects when we’re in a narrow but tall canyon.  The mixer would probably choose not to add reverb or delay to any sound effects that occur while a scene plays out in a small closet.

As a sound designer, you should be extremely wary of adding reverb to almost any sound effect.  If you are doing so to help sell that it is occurring in the physical space, check with your mixer first.  Chances are, he or she would rather have full control by adding the reverb themselves.

Sound designers should also use delay fairly sparingly.  This is only a good choice if it is truly a design choice, not a spatial one.  For example, if you are designing a futuristic laser gun blast, you may want to add a very short delay to the sound you’re designing purely for design purposes.

When deciding whether or not to add reverb or delay, always ask yourself whether it is a design choice or a spatial choice.  As long as the reverb/delay has absolutely nothing to do with where the sound effect is occurring, you’re probably in the clear.  But, you may still want to supply a muted version without the effect in the track below, just in case your mixer finds that the affected one does not play well in the mix.

COMPRESSORS/LIMITERS

Adding compressors or limiters should be the mixer’s job 99% of the time.

The only instance in which I have ever used dynamics processing in my editorial was when a client asked to trigger a pulsing sound effect whenever a particular character spoke (there was a visual pulsing to match).  I used a side chain and gate to do this, but first I had an extensive conversation with my mixer about whether he would rather I did this and gave him the tracks, or whether he would prefer to set it up himself.  If you are gating any sound effects purely to clean them up, then my recommendation would be to just find a better sound.

PITCH SHIFTING

A mixer does not often pitch shift sound effects unless a client specifically asks that he or she do so.

Thus, pitch shifting almost always falls on the shoulders of the sound designer.  This is because when it comes to sound effects, changing the pitch is almost always a design choice rather than a balance/spatial choice.

MODULATION

A mixer will sometimes use modulation effects when processing dialogue, but it is very uncommon for them to dig into sound effects to use this type of processing.

Most often this type of processing is done purely for design purposes, and thus lands in the wheelhouse of the sound designer.  You should never design something with unprocessed elements, assuming that your mixer will go in and process everything so that it sounds cooler.  It’s the designer’s job to make all of the elements as appropriate as possible to what is on the screen.  So, go ahead and modulate away!

NOISE REDUCTION

Mixers will often employ noise reduction plugins to clean up noisy sounds.  But, this should never be the case with sound effects, since you should be cutting pristine SFX in the first place.

In short, neither of you should be using noise reduction plugins.  If you find yourself reaching for RX while editing sound effects, you should instead reach for a better sound! If you’re dead set on using something that, say, you recorded yourself and is just too perfect to pass up but incredibly noisy, then by all means process it with noise reduction software.  Never assume that your mixer will do this for you.  There’s a much better chance that the offending sound effect will simply be muted in the mix.


ADDITIONAL NOTES

INSERTS VS AUDIOSUITE

I have one final note about inserts versus AudioSuite plug-in use.  Summed up, it’s this: don’t use inserts as an FX editor/sound designer.  Always assume that your mixer is going to grab all of the regions from your tracks and drag them into his or her own tracks within the mix template.  There’s a great chance that your mixer will never even notice that you added an insert.  If you want an effect to play in the mix, then make sure that it’s been printed to your sound files.

AUTOMATION AS EFFECTS

In the same vein, it’s a risky business to create audio effects with automation, such as zany panning or square-wave volume automation.  These may sound really cool, but always give your mixer a heads up ahead of time if you plan to do something like this.  Some mixers automatically delete all of your automation so that they can start fresh.  If there’s any automation that you believe is crucial to the design of a sound, then make sure to mention it before your work gets dragged into the mix template.

What you’ve always wanted to know about being a Theatre Sound Designer

 … but didn’t know one to ask.

Recently, I was invited to teach a seminar on sound design to the Stage Management and Technical Theatre students at the Academy of Live and Recorded Arts (ALRA) in London. During the Q&A, I realised the questions they were asking are ones I’ve been answering for most of my career, and not only from students. Directors, producers, other designers, and colleagues in other sound disciplines all have one question in common: what does a Theatre Sound Designer do?

Video Game Sound Designer

Sound Designers make everything you hear in a game except for dialog and music.  They will use a DAW to create the sound effects.  They can pull sounds from sound effects libraries or make the sounds from scratch.  They will record sounds in the field or in a foley studio.  Then they can take those raw sounds and mix them together and use effects to make the game world come alive.  Designers will be asked to make all kinds of sounds depending on the game.  They can make anything from huge dragons to car engines, machine guns, and tanks.  Part of the job is also making every single sound in the game.  This includes every footstep and cloth rustle you hear as the game character moves around the world.

Technical Sound Designers are responsible for integrating all the sounds in the game.  They work with the game engine and audio middleware such as Wwise or FMOD to put all the sounds in.  This is a very important position, since they control where sounds are placed, what volume the sounds are played at, and how far away you can hear the sound in the game world.  This is where the game’s mix takes place.

Audio Directors are responsible for making sure all of the audio in a game fits together.  They will interact with all of the other teams making the game to figure out what sounds are needed.  They will review all of the sound designers’ work and oversee the sound implementation.  They lead the audio team.

Composers write all the music you hear in the game.  This is different from other composing work because a lot of game audio is interactive.  This means the music is created so that it can switch from loud, thunderous battle music to quiet exploration music in the same song at any time.  Game scores can be all live musicians or all samplers and synths.  It depends on how big the game is and what feel the game needs.

Dialog Producers direct the talent in the voice over sessions.  They will help format the scripts so they are easy to read in sessions.

There are different ways this can all be set up.  All of these jobs could be done by one person on a smaller team.  Sometimes there are multiple sound designers and technical sound designers if it’s a bigger title.  Sometimes you can have one person who works at a game company who hires outside contractors to do all of this work.  Outside contractors can either work for themselves, or they are part of a bigger company that does sounds for a bunch of different games.

How to get started
Try to intern at a game company or work on an indie project.  This will give you experience so you can get your foot in the door at a game company.  Learning the tools will help you as well.  FMOD, Wwise, Unreal, and Unity all have free downloads of their software so you can learn how everything works together.  Play a lot of games and listen to how everything works together.  A lot of games will let you adjust the volume on music and sound effects.  Turn each one off and listen to how just the music works, then listen to how the sound effects all work.


Courtesy of: Tom Smurdon

Copyright SoundGirls.Org 2015
