[Transcription] Connecting EQ to Musical Sound | iZotope Pro Audio Essentials

A musician usually thinks in terms of the musical scale. When we map the frequencies of musical notes, we notice something interesting: as we go further and further up the scale, the distance between each note is greater in terms of frequency. If we compare the distance between octaves, we have a very simple relationship: the frequency doubles with each ascending octave. We mentioned that the note A, commonly used as a tuning reference by classical musicians, vibrates at 440 Hz. The next octave higher is 880 Hz. The next is 1,760 Hz, or about 1.8 kHz. So you see, the distance in terms of frequency increases as the notes get higher. Between 20 Hz and 20,000 Hz, we get about 10 octaves. You see this range in modern EQs, because it relates to the range of human hearing.
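The doubling relationship described above can be checked with a few lines of arithmetic. This is a minimal sketch using the numbers from the transcript (A = 440 Hz, the 20 Hz to 20 kHz hearing range); it is not part of the original lesson:

```python
import math

# Each ascending octave doubles the frequency.
a4 = 440.0  # tuning reference A, in Hz
octaves_up = [a4 * 2**n for n in range(3)]
print(octaves_up)  # [440.0, 880.0, 1760.0]

# Number of octaves between 20 Hz and 20,000 Hz (the range of human hearing):
span = math.log2(20000 / 20)
print(round(span, 2))  # 9.97, i.e. about 10 octaves
```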

Fundamental and Harmonic Frequencies (1:02)

The note, or pitch, being played by an instrument is determined by the fundamental frequency.

Along with a fundamental frequency, we get a series of related sounds called harmonics. When you play A at 440 Hz on a piano, you also have a little bit of energy at 880 Hz, a little bit less at 1320 Hz, and also at 1760 Hz, 2200 Hz, and other frequencies. This combination of harmonics, along with the dynamics of a sound, is what helps us distinguish one instrument from another.
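For an ideal vibrating string, those harmonics fall at integer multiples of the fundamental, which reproduces the frequencies listed above. A minimal sketch (the idealized integer-multiple model is an assumption; a real piano string is slightly inharmonic):

```python
# Harmonics of an idealized string: integer multiples of the fundamental.
fundamental = 440.0  # A4, in Hz
harmonics = [fundamental * n for n in range(1, 6)]
print(harmonics)  # [440.0, 880.0, 1320.0, 1760.0, 2200.0]
```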

As audio engineers, it's important for us to be aware of harmonics and their relationship to frequency.

This chart is called the Carnegie chart. It's a map which shows the relationships between instruments' frequency ranges and the characteristics of their sound.

We provided one here for you to download, and it can be really helpful.

For any given instrument, you'll notice that there is a range of frequencies that can be connected with familiar aspects of the sound. An instrument has complex characteristics, even within a defined range, so adjusting EQ can bring out different qualities.

Take the example of an acoustic guitar. We see in the Carnegie chart that it's capable of a fundamental frequency from about 90 Hz all the way up to 1.3 kHz, with harmonics occurring above that until about 3.5 kHz. Adjusting different frequency points will bring out very different qualities of the instrument. We boost some frequencies and cut others to achieve the effect that we want. This combination of boosts and cuts achieves an overall sound. After EQing the guitar, I might end up with something that looks like this.
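The boost-and-cut gesture described above can be sketched numerically. Everything here is invented for illustration: the bell-shaped response is a conceptual stand-in for a real peaking filter, and the band centers and gains are hypothetical, not a recommended guitar treatment:

```python
import math

def bell_gain_db(freq_hz, center_hz, gain_db, width_octaves=1.0):
    """Illustrative bell (peaking) response: full gain_db at the center,
    falling off as a Gaussian over distance measured in octaves.
    A conceptual sketch, not a real filter design."""
    distance = math.log2(freq_hz / center_hz)
    return gain_db * math.exp(-(distance / width_octaves) ** 2)

# Hypothetical acoustic-guitar curve: cut some frequencies, boost others.
bands = [(120, -3.0), (250, +2.0), (3000, +4.0)]  # (center Hz, gain dB)

def total_gain_db(freq_hz):
    return sum(bell_gain_db(freq_hz, c, g) for c, g in bands)

for f in (90, 250, 1300, 3500):
    print(f"{f} Hz: {total_gain_db(f):+.1f} dB")
```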

Connecting EQ with Visual Terms (2:53)

Some people use visual terms to describe these kinds of gestures, because they're analogies that can communicate an idea, without getting too technical.

So a musician might ask to make the guitar sound brighter, warmer, thinner or darker. As a producer or engineer, you have to figure out how to use the tools available, to help the musician get the results that they want.

For example, if they want their guitar to sound warmer, you might try cutting frequencies above 5,000 Hz and then turning the guitar up. Or adding a gentle boost around 200 Hz, and then turning the guitar down to compensate for the boost in level. Or maybe the warmth they're looking for doesn't call for equalization at all; you might actually need a little bit of distortion or compression to help them get the sound they have in their head.

I can apply the same principle to EQ a recording which contains multiple instruments during the mastering process. By adjusting certain EQ points, I can bring forward or push back different qualities in the sound of a whole recording. Some of these adjustments can subtly raise or push back certain instrument groups, so the process of EQ during mastering shouldn't be confused with EQ during mixing.

When I'm mixing, I can actually EQ individual instruments. During mastering, I'm affecting the overall track, maybe instrument groups, but I'm not actually adjusting individual instruments.

Take the free Pro Audio Essentials challenge!
http://pae.izotope.com
