Phase is a big topic, so I couldn't cover it all in one article. But try an experiment...
Load a mono music file into your DAW. Copy it onto another track so that the two tracks are identical. Pan both to the center. Using either the mute or solo buttons, compare the sound of one track playing by itself with the two playing together. You will notice that when both tracks are playing, the sound is louder, by 6 decibels in fact. Otherwise the sound is the same.
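If you'd rather see the arithmetic than trust your ears, here is a minimal sketch in Python with NumPy. Generated noise stands in for your mono music file; any mono audio array would behave the same way.

```python
import numpy as np

sr = 48000                                        # sample rate in Hz (assumed)
track = np.random.default_rng(0).normal(size=sr)  # 1 second of 'music'

duplicate = track.copy()     # the copied track, panned to the same place
mix = track + duplicate      # both tracks playing together

# Doubling the amplitude raises the level by 20*log10(2), about 6 dB.
gain_db = 20 * np.log10(np.abs(mix).max() / np.abs(track).max())
print(round(gain_db, 2))     # prints 6.02
```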
Now use your DAW's 'invert' function to flip the waveform of one track upside down. If you zoom in close to inspect the waveform, you will find that the high points on one track line up with the low points of the other. Now play...
Can't hear anything? That is correct. The two signals have completely canceled each other out.
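The same total cancellation can be shown in a couple of lines, again with generated noise standing in for the music:

```python
import numpy as np

sr = 48000
track = np.random.default_rng(0).normal(size=sr)  # stand-in for the mono file

inverted = -track          # the DAW's 'invert': flip the waveform upside down
mix = track + inverted     # both tracks playing together

print(np.abs(mix).max())   # prints 0.0 - complete cancellation
```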
Now undo the invert so that both tracks are the same again. This time zoom right in so you can make precise adjustments. Slide the audio on one track so that it is delayed half a millisecond with respect to the other. Now play...
It doesn't sound quite right, does it? Either track by itself sounds OK, but when played together the combination sounds odd.
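Here is a sketch of what the delay does, assuming a 48 kHz sample rate so that half a millisecond is a whole number of samples. A circular shift keeps the maths exact; in your DAW you simply slide the region.

```python
import numpy as np

sr = 48000
delay = int(sr * 0.0005)      # half a millisecond = 24 samples at 48 kHz
track = np.random.default_rng(0).normal(size=sr)  # stand-in for the music

mix = track + np.roll(track, delay)   # original plus its delayed copy

# With 1 second of audio, each FFT bin is 1 Hz wide, so bin 1000 is 1 kHz.
spectrum = np.abs(np.fft.rfft(mix))
print(spectrum[1000] / spectrum.max())  # practically zero: deep notch at 1 kHz
```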
OK, mute both tracks and create two more empty tracks. On each of these new tracks, use your signal generator plug-in to create around 30 seconds of 1 kHz tone. Don't have a signal generator plug-in? Don't worry - you can still follow the explanation. As before, shift the audio on one track by half a millisecond. And play...
You can't hear anything, can you?
Zoom right in and you will see that the half millisecond delay has caused the audio on one track to appear to be inverted with respect to the other. Or 'out of phase', as we call it.
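You can confirm the arithmetic in Python: half a millisecond is exactly half the 1 ms period of a 1 kHz tone, so the delayed copy is the original turned upside down (the sketch assumes a 48 kHz sample rate):

```python
import numpy as np

sr = 48000
t = np.arange(sr) / sr                 # one second of sample times
tone = np.sin(2 * np.pi * 1000 * t)    # 1 kHz tone: period = 1 ms

# Delay by 0.5 ms - exactly half the tone's period.
delayed = np.sin(2 * np.pi * 1000 * (t - 0.0005))

mix = tone + delayed
print(np.abs(mix).max())   # effectively zero: the delayed tone is inverted
```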
Going back to the music tracks, what you are hearing when the two tracks are mixed together are frequencies around 1 kHz cancelled or reduced in level due to what we call 'phase cancellation'. This occurs at odd multiples of 1 kHz too - 3 kHz, 5 kHz and so on - producing an effect that we know as 'comb filtering'. This occurs whenever a signal mixes with a delayed version of itself. Imagine you are recording an instrument or vocal close to a hard reflecting surface. The signal traveling directly into the microphone will mix with the delayed reflected signal, which has further to travel.
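The pattern of cancellations is easy to compute. Mixing a signal with a copy of itself delayed by tau seconds multiplies each frequency f by |1 + e^(-j2*pi*f*tau)|, which is zero wherever the delay equals half a period. A quick sketch for our 0.5 ms delay:

```python
import numpy as np

tau = 0.0005   # 0.5 ms delay between the two copies

def comb_gain(f):
    """Gain applied to frequency f when a signal is mixed with
    a copy of itself delayed by tau seconds."""
    return np.abs(1 + np.exp(-2j * np.pi * f * tau))

for f in (1000, 2000, 3000, 4000, 5000):
    print(f, round(comb_gain(f), 3))
# 1 kHz, 3 kHz and 5 kHz are cancelled completely (gain 0);
# 2 kHz and 4 kHz are doubled, i.e. boosted by 6 dB - the 'teeth' of the comb.
```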
As you will appreciate from your experiment, this will not sound good. So the moral is always to be on the lookout for a signal getting mixed with a delayed version of itself. This can happen acoustically, electronically or digitally. If you remain vigilant for this, you should never suffer from phase problems, at least not from this cause.
By the way, the picture shows an example of comb filtering. This is how it was made...