I was taught in school to set my “input levels” as high as possible (without clipping, of course). However, I learned later on that when it comes to recording levels: digital recording isn’t the same as analog recording.
Contrary to popular belief, setting your levels as high as possible will NOT give you the best results. Back in the days of analog recording, sound engineers did so as a tactic to combat the “noise floor”. The signal-to-noise-ratio was much more of an issue than it is with today’s digital hardware. As you read through this article, you’ll realize that there are NO advantages to setting your levels as high as possible. I will be sharing what I believe to be the optimal recording level based on some extensive research. There were actually TWO reasons that convinced me, so I hope they’ll justify what I have to say. If you’ve ever been debating the issue yourself, this text will definitely provide some relief. Let’s get started!
- Setting recording levels to optimize the signal-to-noise ratio
- dBVU meters vs dBFS meters
- Optimizing your signal using VU meters
- Setting recording levels to -18 dBFS: 2 reasons
- Setting recording levels for digital recording
Setting recording levels to optimize the signal-to-noise ratio
Inherent to electronic circuitry is the presence of what we refer to as the noise floor. By measuring our signal’s amplitude against the noise floor, we obtain what’s known as the signal-to-noise ratio.
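As a quick numeric sketch, the signal-to-noise ratio for amplitude measurements is just 20·log₁₀ of the ratio between the two levels:

```python
import math

def snr_db(signal_rms: float, noise_rms: float) -> float:
    """Signal-to-noise ratio in dB, from two amplitude (RMS) measurements."""
    return 20 * math.log10(signal_rms / noise_rms)

# A signal 1000x louder than the noise floor:
print(round(snr_db(1000, 1)))  # → 60 (dB of separation)
```

The higher this number, the further your signal sits above the hiss.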
Back in the days of analog recording, achieving the best signal-to-noise ratio was challenging.
However, the noise floor has become virtually inaudible since the advent of digital recording. In other words, we no longer need to set our levels “as high as possible” to compensate, BUT…
Proper levels still matter if you’re using analog amplification.
You’d need to adjust the gain at each point of amplification to minimize noise and distortion (clipping). This is what we refer to as gain staging OR gain structuring.
However, unless you’re using an analog mixer and/or a tape machine, you won’t need to be as tedious with this process.
In the digital world, the only thing we need to protect from clipping is our analog-to-digital converter (ADC).
However, once your signal has reached your digital audio workstation (DAW), it may still have accumulated noise from…
- Instrument pickups
- Unbalanced cable runs
- Effects processors
As you can see, each ingredient we add to the mix has the potential to introduce noise. This is because everything that happens BEFORE your audio interface is technically analog.
For now though, we’re only focusing on setting our interface’s levels appropriately.
dBVU meters vs dBFS meters
Analog equipment can actually benefit from setting levels as high as possible. If you’ve ever used valve/tube amplifiers, you know that pushing the gain can distort the signal in a pleasant way.
In other words, clipping ever so slightly in the analog world produces what we refer to as saturation.
However, clipping in the digital world will produce anything but favourable results. That’s why setting your levels as high as possible has NO advantage.
If anything, you’ll actually encounter TWO major problems by doing so (more on this later).
The first thing we need to understand is that dBFS metering wasn’t always the standard for measuring amplitude. Back in the day, sound engineers were measuring amplitude using dBVU meters.
You know… the one with the bouncing needle.
Here’s the shocking truth though… 0 dBVU ≠ 0 dBFS
Actually, most analog equipment was calibrated so that 0 dBVU corresponds to an average level of -18 dBFS.
What does this mean?
Well, it appears that sound engineers had been recording at much lower levels than we expected. Does this mean that we should be setting our input levels to average -18 dB?
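Under the calibration mentioned above — where 0 dBVU lines up with -18 dBFS (the exact reference varies by manufacturer and standard) — converting between the two scales is just a fixed offset. A minimal sketch:

```python
# Assumed calibration: 0 dBVU = -18 dBFS (this reference varies by manufacturer).
CALIBRATION_OFFSET = -18.0

def vu_to_dbfs(dbvu: float) -> float:
    """Convert a VU reading to its digital full-scale equivalent."""
    return dbvu + CALIBRATION_OFFSET

def dbfs_to_vu(dbfs: float) -> float:
    """Convert a dBFS level back to what the VU needle would show."""
    return dbfs - CALIBRATION_OFFSET

print(vu_to_dbfs(0.0))    # → -18.0 (0 dBVU sits at -18 dBFS)
print(dbfs_to_vu(-12.0))  # → 6.0 (a -12 dBFS average would pin the needle)
```

In other words, the two meters describe the same signal from different reference points.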
Optimizing your signal using VU meters
To answer the previous question… YES, we should be setting our input levels to average -18 dBFS. However, this does not mean that your track cannot peak any higher than -18 dBFS.
Keep in mind that dBVU meters respond to the AVERAGE amplitude (essentially an RMS, or root mean square, measurement) rather than the peaks.
In other words, you can have your tracks peaking at -6 dBFS, but the average amplitude will remain -18 dBFS. So, how can you start metering your tracks using this method?
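The peak-versus-average distinction is easy to verify numerically. Here’s a sketch measuring both for a simple sine wave (whose average sits about 3 dB below its peak; real music typically has a much larger gap, which is how a track can peak at -6 dBFS while averaging -18 dBFS):

```python
import math

def peak_dbfs(samples):
    """Highest instantaneous level, relative to full scale (1.0)."""
    return 20 * math.log10(max(abs(s) for s in samples))

def rms_dbfs(samples):
    """Average (RMS) level, relative to full scale (1.0)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

# One cycle of a full-scale sine wave:
sine = [math.sin(2 * math.pi * i / 100) for i in range(100)]
print(round(peak_dbfs(sine), 1))  # → 0.0 (peaks at full scale)
print(round(rms_dbfs(sine), 1))   # → -3.0 (the average is lower)
```

The VU needle reacts to the second number, while your DAW’s peak meter shows the first.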
Although there are many dB VU meter plug-ins out there, the mvMeter2 from TBProAudio is my favourite. Did I mention that it’s FREE?
Have you got your dBVU meter plug-in yet? Great, let’s get started!
Step 1 | Create an audio track
Step 2 | Set your track’s fader to unity gain (0 dB)
Step 3 | Insert a dBVU meter in effects slot 1
Step 4 | Calibrate your plug-in to -18 dBFS (this should be done by default)
Step 5 | Set your preamp’s levels to average 0 dBVU (it’s okay if it goes over a little)
It’s as simple as that! Repeat this process for ALL your tracks… ALWAYS.
But what if I’m working with a track that has been recorded above this level? If you’re mixing other people’s work, this will most likely be the case (unless they’re professionals).
You’ll instinctively want to lower your mixer’s faders, but this will only affect the output.
There’s an easy fix to this common problem…
Step 1 | Set your track’s fader to unity gain (0 dB)
Step 2 | Insert a dBVU meter in effects slot 1
Step 3 | Calibrate your plug-in to -18 dBFS (this should be done by default)
Step 4 | Set your track’s clip gain to average 0 dBVU (it’s okay if it goes over a little)
You can also use a trim plugin or a dBVU meter that includes this feature. This will be affecting your signal’s input, which is what we need.
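The clip-gain/trim fix above boils down to measuring the track’s average level and offsetting it to the target. A sketch of that math, assuming the -18 dBFS target used throughout this article:

```python
import math

TARGET_DBFS = -18.0  # 0 dBVU under the calibration assumed in this article

def rms_dbfs(samples):
    """Average (RMS) level relative to full scale (1.0)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

def trim_to_target(samples, target_dbfs=TARGET_DBFS):
    """Scale a too-hot recording so its average lands on the target."""
    gain_db = target_dbfs - rms_dbfs(samples)
    factor = 10 ** (gain_db / 20)  # convert dB to a linear amplitude factor
    return [s * factor for s in samples]

# A take that averages around -9 dBFS — too hot for our target:
hot = [0.5 * math.sin(2 * math.pi * i / 100) for i in range(100)]
trimmed = trim_to_target(hot)
print(round(rms_dbfs(trimmed), 1))  # → -18.0
```

A trim plug-in or clip gain does exactly this scaling before any other processing, which is why it fixes the input rather than the output.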
But now we need to answer the BIG question… Why is all of this so important?!
Setting your recording levels to -18 dBFS: 2 reasons
Now, I’m sure you’re begging to know why -18 dB is so important. I mean, why can’t we record at -6 dBFS like we’ve been taught in school? You can let me know in the comments if you agree or disagree!
Plug-ins that emulate analog equipment have an optimal “nominal operating level”. In English, this simply means that if you exceed a certain threshold, they won’t sound as good.
If you don’t believe me, read page 17 of the instruction manual for Slate Digital’s Virtual Tape Machines.
Remember, analog equipment references 0 dBVU, which usually corresponds to an average of -18 dBFS. In other words, tracks that exceed this level will overdrive (clip) your analog-modelled plug-ins.
You may not notice a difference if you’re using standard digital plugins, but there’s another problem…
By setting your levels as high as you can, you’ll be clipping your master bus once you’ve got multiple tracks summing together. As I mentioned before, you’ll instinctively want to bring down your mixer’s faders, BUT…
The further away you move from “unity gain”, the less resolution you’ll get from your fader.
You may have noticed that your fader’s scale isn’t linear; it’s logarithmic. Near the top it moves in increments of 3 dB, then 5, and finally 10 toward the bottom.
This’ll become a problem during the automation stage… Instead of working in increments of, say, 1 dB, you’ll be working in increments of 10 dB. In other words, you won’t have as much accuracy as you would by staying in the top portion of your fader.
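To picture the resolution problem, here’s an illustrative (entirely made-up) fader scale with the marking spacing described above. Notice how the same physical travel covers far more dB near the bottom of the throw:

```python
# Hypothetical fader markings: (position in mm from the top, level in dB).
# The spacing mimics the typical 3 -> 5 -> 10 dB scale described above.
markings = [(0, 0), (10, -3), (20, -6), (30, -10), (40, -15),
            (50, -20), (60, -30), (70, -40), (80, -60)]

for (p1, d1), (p2, d2) in zip(markings, markings[1:]):
    db_per_mm = abs(d2 - d1) / (p2 - p1)
    print(f"{d1:>4} to {d2:>4} dB: {db_per_mm:.1f} dB per mm of travel")
# Near unity: 0.3 dB per mm. Near the bottom: 2.0 dB per mm — far coarser.
```

So the further a fader sits below unity, the blunter every automation move becomes.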
Are you convinced yet?
Setting recording levels for digital recording
I remember hearing about this shocking truth a couple of years ago. I wanted to believe it, but I needed to have some concrete evidence.
These were the TWO reasons that convinced me to make the transition to -18 dBFS.
However, you could technically record at any amplitude as long as you don’t clip your converters. You could always reduce the volume in your recording software, right?
I used to work this way, but I can’t imagine myself ever looking back. I mean, we now know that there are NO advantages to recording “hotter” in the digital world.
So, hopefully this has given you a new perspective on sound recording! What do you think about recording at -18 dB FS, does it resonate with you? Let us know in the comments and feel free to share your personal approach to setting levels.
So I’m confused — if my interface has +24 dBu of headroom, wouldn’t +4 dB unity gain equal -20 dBVU? And wouldn’t your interface’s headroom determine the average level going into the ADC and out of the DAC? Thanks for sharing.
In your case, your audio interface clips at +24 dBu, which corresponds to 0 dBFS in the digital domain.
Now, I’m not really sure what you’re asking… If you mean +4 dB above unity gain, you’d be at approximately +28 dBu (you’d be clipping).
If you mean +4 dBu, you’d be at about -20 dBFS.
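The arithmetic in this reply is just an offset between scales (assuming this particular interface’s +24 dBu clip point — it varies between converters):

```python
# Assumed: this interface's converters clip at +24 dBu, i.e. +24 dBu = 0 dBFS.
CLIP_POINT_DBU = 24.0

def dbu_to_dbfs(dbu: float) -> float:
    """Map an analog level (dBu) to the digital scale for this interface."""
    return dbu - CLIP_POINT_DBU

print(dbu_to_dbfs(4.0))   # → -20.0 (the +4 dBu case above)
print(dbu_to_dbfs(24.0))  # → 0.0 (right at clipping)
```

An interface with a different clip point (say, +18 dBu) would map the same +4 dBu signal to a different dBFS value, which is why the headroom spec matters.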
You can check THIS article and THIS ONE for reference.
Thanks, I hope that answers your question!
Then there’s a third reason (and possibly most important):
If your preamp is pushing a lot more than -18 dB into your interface’s input (say, -12 to -6 dB), then you’re pushing your preamp a lot harder than it was designed for. Studios that tracked to tape around 0 dBVU had their preamp gain controls far below what a lot of novice studios push to get -6 or -12 dB. This means clipping at the preamp level is happening in order to get these high levels into your DAW. Recording at -18 or so ensures you’re not pushing your preamp/compressor’s input/output levels too hard.
You’re absolutely right! However, that’s not always the case nowadays with digital equipment.
For example, some audio interfaces now feature “32-bit float” recording, which essentially makes it impossible to clip. The available dynamic range becomes so enormous that it’s effectively unlimited in practice.
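A quick way to see why float recording is so forgiving: values beyond full scale (±1.0) are stored intact and can simply be pulled back down later, whereas a clipped fixed-point signal is damaged for good. A toy sketch (illustrative only, not any specific interface’s behaviour):

```python
# Samples from an imaginary too-hot take; 1.8 and -2.4 exceed full scale.
hot = [0.5, 1.8, -2.4, 0.9]

# Fixed-point behaviour: anything beyond +/-1.0 is truncated (clipped).
clipped = [max(-1.0, min(1.0, s)) for s in hot]

# Float behaviour: the overs are stored intact, so a simple trim recovers them.
trim = 0.25  # scale to one quarter, roughly -12 dB
recovered = [s * trim for s in hot]      # the waveform's shape is preserved
damaged = [s * trim for s in clipped]    # the flattened peaks stay flattened

print(recovered)
print(damaged)
```

Turning the level down after the fact rescues a float recording but can’t restore peaks that a fixed-point path has already squared off.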
It’s only important to remember that not every plugin/device in your signal-chain can achieve this.
It’s true that most vintage preamps/amplifiers were calibrated to -18 dBFS (which is what 0 dBVU represents on the meter).
Either way, if we’re going to pick a standard, why not use the one that’s always been in place? -18dB/0dBVU.
Thanks for stopping by, it was great to hear from you!
I think you might be missing my point. I’m talking about pushing a dedicated mic pre too hard, not the interface. Example: If I dial in the proper levels for tape on my external mic pre (Neve, A-Designs, Avalon, etc.), it should be about the same as dialing in my mic pre to give me about -18 dB in my DAW. That means the input signal on my mic pre is just about right. But when I crank the input up on a mic pre and get it around, say, -6 or -12 dB in my DAW (assuming the digital interface is at unity gain on the inputs), that means I’m running my external mic pre really hot. With mics like a U-87 that are inherently loud and headroom-challenged, that extra 6 to 12 dB of signal is too much gain at the preamp level, and usually means distortion or “FET fizz” between the mic and the preamp before it even hits the DAW. Therefore, by “dialing down” my mic preamps so that my DAW is reading about -18 dB, I’m also preventing overload at the preamp. To put it simply, recording DAW levels at -12 to -3 often means running an external preamp too hot to get to those levels, which is driving the external mic pre too hard. Thus, analog distortion is occurring before the signal even hits the interface. That’s the greatest difference in signal integrity I noticed when I learned to run my levels around -18 dB by turning down my mic pres.
I thought you were talking about the preamps on the audio interface so yes, I think I misunderstood you.
You’re right, the external microphone preamps you mentioned are designed to operate at -18dB. Even analog modelled plugins (based on these preamps) are designed to operate at the same -18dB, but most people don’t realize that it makes a difference (even inside the DAW).
I didn’t talk much about the hardware in this article because I figured that engineers working with analog were already familiar with the -18dB/0dBVU target.
Technically speaking, even the analog preamp of an audio interface should have an optimal nominal operating range (although this might not necessarily be -18dB).
We’re definitely on the same page though.
Moral of the story… Set your levels at -18dB for recording and you can’t go wrong.
Thanks for your input, I’m always happy to discuss!
Wow…thank you. You just solved my problem with persistent distortion of vocals and bass on my Facebook live streams even though the faders stayed out of the red zone and the final signal was very low. I forgot how conditioned I had become with analogue tape recording that I didn’t connect the dots when it came to digital live production.
That’s interesting, so your audio was clipping on Facebook even though it was below 0 dBFS?
Maybe the platform itself has some sort of “threshold” in regards to amplitude. I’m glad the article helped you find a solution to a problem I wasn’t even aware of!
In general, setting your levels lower will always be beneficial because you can always increase them later. However, it’s impossible to undo clipping.
Thanks for stopping by, let me know if you need anything else. Take care!
“Set your track’s fader to unity gain (aka 0 dBFS)”
This is not correct terminology. Unity gain is “0 dB”, not “0 dBFS”.
dB is a relative change in level.
dBFS is an absolute measurement of level.
If your signal is at -18 dBFS, and you put it through +6 dB gain, it will be at -12 dBFS. “0 dBFS level” and “0 dB gain” are both correct terminology, but there is no such thing as “0 dBFS gain”.
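The distinction this comment draws reads naturally in code — dBFS values are absolute levels, while dB values are offsets applied to them:

```python
level_dbfs = -18.0  # absolute level (dBFS): a position relative to full scale
gain_db = 6.0       # relative change (dB): an offset, not a position

# Applying a gain (dB) to a level (dBFS) yields a new level (dBFS):
new_level_dbfs = level_dbfs + gain_db
print(new_level_dbfs)  # → -12.0
```

The units don’t mix: you can add a dB offset to a dBFS level, but “0 dBFS gain” has no meaning.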
Hello Guest (that’s an interesting name),
I sincerely appreciate the correction/lesson. I haven’t updated this post in a while, so I’ll keep this comment as a reminder!
Admittedly, I focus much more on the music making rather than the terminology. It’s nice of you to go through my entire post just to find a typo, but we’re much more interested in discussing the actual topic of the article here. I’m curious, what did you think about this approach to setting levels?
I’ll most likely be updating this post in the coming weeks, so I always appreciate the constructive criticism.
Don’t be shy next time though, you’re more than just a “guest” at Decibel Peak. Thanks for stopping by!