Read our post on linear phase filters and frequency response with the FabFilter Pro-Q 2
Filters: minimum phase or linear phase? One of these can seriously mess with your frequency response. Today you're going to learn which, and why.

In the analogue world, all we have, and have ever had, are minimum phase filters. As the level changes with frequency, so does the phase. That's the way things are due to the eternal laws of the universe, which are very difficult to argue with.
To create a linear phase filter in the analogue domain, you'd have to go back in time. In the immortal words of Dick Dastardly, "Drat and double-drat".
But in digital audio, we can in effect go back in time. Simply delay the signal and use that delay to look both ways in time while processing. Everything comes out a little later than it should, but latency compensation (also called delay compensation) in the digital audio workstation can fix that.
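As a sketch of what that buys us, here's a linear-phase FIR low-pass built with SciPy. The 48 kHz sample rate, 400 Hz cutoff and 101 taps are purely illustrative numbers: the point is that the coefficients are symmetric in time, and the only side effect is a constant delay.

```python
import numpy as np
from scipy import signal

# Illustrative values: 48 kHz sample rate, 400 Hz cutoff, 101 taps.
taps = signal.firwin(numtaps=101, cutoff=400, fs=48000)

# The impulse response is symmetric in time -- this is the
# "look backwards in time" trick, and it guarantees linear phase.
assert np.allclose(taps, taps[::-1])

# The price: a constant delay of (numtaps - 1) / 2 = 50 samples,
# which the DAW's latency compensation removes.
w, gd = signal.group_delay((taps, [1.0]), fs=48000)
print(round(float(np.median(gd))))  # 50
```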
With a linear phase filter, the level changes according to frequency, but the phase stays exactly as it was.
This seems better, and it can be. There's a price to pay in terms of pre-ringing, but that's another topic for another video.
So what's the problem with minimum phase?
Usually nothing of any consequence, as the history of the last 70 years of analogue audio has shown us.
But there are situations where there can be an issue. Typically this is where you split a signal in two, process the two parts differently, then mix them back together again.
An example would be multi band compression. You could for instance filter out the low frequencies, compress them, then mix them back in again. That's something you might do in mastering.
The potential issue is that the filters might cause frequency response problems when you mix the signals back together.
Let's move on to a demonstration.
What I have here is a frequency sweep from 100 Hz to 1600 Hz, and in the following demonstrations I'm going to use a centre frequency of 400 Hz.
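If you want to recreate the test signal, a sweep like this can be generated with SciPy. The four-second duration and 48 kHz sample rate are assumptions, not the exact settings used here:

```python
import numpy as np
from scipy.signal import chirp

fs = 48000                      # assumed sample rate
t = np.arange(0, 4.0, 1 / fs)   # assumed four-second duration

# 100 Hz to 1600 Hz is four octaves; 400 Hz is the geometric
# centre, since sqrt(100 * 1600) = 400.
sweep = chirp(t, f0=100, t1=4.0, f1=1600, method='logarithmic')
```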
Here's the sweep with no filtering...
I'm going to make two copies of the sweep and apply a low-pass filter to one, and a high-pass filter to the other.
I'm going to set the cutoff frequencies so that the minus 6 dB point of both is 400 Hz. That way, when I mix them back together, the two correlated signals should add up to the original level.
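Here's a minimal sketch of that alignment, assuming linear-phase FIR filters rather than the plugin used in the video: a windowed-sinc low-pass sits at roughly minus 6 dB at its cutoff, and a complementary high-pass (a pure delay minus the low-pass) shares its phase exactly, so the two branches sum back to a flat response.

```python
import numpy as np
from scipy import signal

fs, n = 48000, 501  # assumed sample rate and filter length

# Linear-phase windowed-sinc low-pass: its gain at the cutoff
# frequency is close to 0.5, i.e. about -6 dB.
lp = signal.firwin(n, 400, fs=fs)
_, h400 = signal.freqz(lp, worN=[400], fs=fs)
print(20 * np.log10(abs(h400[0])))  # close to -6 dB

# Complementary high-pass: a pure delay minus the low-pass.
# It has exactly the same (linear) phase as the low-pass.
hp = -lp
hp[n // 2] += 1.0

# Mixing the two branches reconstructs a plain delay, so the
# summed frequency response is flat at the original level.
_, h_sum = signal.freqz(lp + hp, fs=fs)
print(np.max(np.abs(np.abs(h_sum) - 1.0)))  # ~0: flat
```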
Here's the low-pass filtered version...
And here's the high-pass version...
So when I mix the two filtered versions together, we would hope that the result sounds the same as the original, with no frequency response problems.
So let's try this with linear phase. I'm using the FabFilter Pro-Q 2 set to linear phase.
You can hear that the level hardly changes at all, and you can see that on the meter. Let's play it again...
That's all well and good. But is it so much better than minimum phase?
To find out, I'm going to set the filters to minimum phase. FabFilter calls it 'natural phase'.
Before I play the sweep, take a moment to imagine what's going to happen. Is it going to be pretty much the same? Or will you notice a difference?
Here we go...
Well, I bet that was more of a difference than you thought. It was the effect I expected, but much bigger than I expected. Let's play it again...
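The dip can be reproduced with a couple of ordinary minimum phase IIR filters. This is only an illustration, not the plugin's actual filters: a hypothetical second-order Butterworth low-pass/high-pass pair at 400 Hz ends up 180 degrees apart at the crossover, so the mixed signal cancels completely there (the slopes and alignment in the video differ, so the dip there is less extreme).

```python
import numpy as np
from scipy import signal

fs, fc = 48000, 400  # assumed sample rate and crossover frequency

# Hypothetical second-order (12 dB/oct) Butterworth crossover,
# minimum phase.
b_lp, a_lp = signal.butter(2, fc, btype='low', fs=fs)
b_hp, a_hp = signal.butter(2, fc, btype='high', fs=fs)

_, h_lp = signal.freqz(b_lp, a_lp, worN=[fc], fs=fs)
_, h_hp = signal.freqz(b_hp, a_hp, worN=[fc], fs=fs)

# At the crossover the low-pass lags by 90 degrees and the
# high-pass leads by 90 degrees: 180 degrees apart in total.
diff = np.degrees(np.angle(h_lp[0]) - np.angle(h_hp[0]))
print(round(abs(diff)))        # 180

# So the mixed signal cancels completely at the crossover.
print(abs(h_lp[0] + h_hp[0]))  # effectively zero
```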
So going back to my mastering example, if you split the low frequencies from a signal to compress them using minimum phase filters, then mix them back in again, you are going to have a frequency response issue.
It isn't going to be a disaster, but you should consider correcting the problem with EQ. Just make it so it sounds good.
So this is what happens with linear phase and minimum phase filters.
The further question is why does it happen?
The clue is there already. Phase.
I'm going to demonstrate this with a vectorscope. I'll pan the low-pass filter left and the high-pass filter right.
In the vectorscope, a signal on the left channel tilts the trace to the left, and a signal on the right channel tilts it to the right. A mono signal makes the trace go vertically up and down.
If the two channels are in phase, the trace will be a straight line, whatever its angle.
Let's play the linear phase sweep...
That is exactly as we should expect. The signal pans from left to right, and the trace is always a straight line, showing that the low-pass and high-pass filters are exactly in phase all the way through.
I'm going to play the minimum phase version now. Like before, take a moment to imagine what you're going to see. It is going to be different, but in what way?
Here we go...
That was an experience. What it shows is that there is a huge variation in phase as the frequency rises, and a huge difference between the low-pass and high-pass filters. When identical signals that are exactly in phase are mixed together, they add up to a level 6 decibels higher. If they are not in phase, the level they add up to will be lower.
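The arithmetic behind that is worth a quick check. Mixing two unit-level signals of the same frequency, the summed level depends only on their phase difference:

```python
import numpy as np

# Two unit-level, equal-frequency signals mixed together:
# the summed level depends only on their phase difference.
for deg in (0, 60, 90, 120):
    h = 1 + np.exp(1j * np.radians(deg))
    print(deg, round(20 * np.log10(abs(h)), 1))
# 0 -> 6.0 dB, 60 -> 4.8 dB, 90 -> 3.0 dB, 120 -> 0.0 dB;
# at 180 degrees the two signals cancel completely.
```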
So this explains the frequency response issue I demonstrated earlier.
In summary, analogue audio has been minimum phase for decades, and it works. But there are occasions where linear phase is better, and you should probably use it.
Note on the Bessel filter: This is a filter with a highly linear phase response that can be constructed in the analogue domain, and it is commonly used in loudspeaker crossovers. Its frequency response, however, is less useful for other purposes, and filter designs such as the Butterworth are more often chosen despite their comparatively poor phase response.
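For the curious, the Bessel's defining property is easy to check in SciPy; the order, cutoff and sample rate below are arbitrary choices for the sketch. Its group delay (the derivative of phase, which a truly linear-phase filter holds perfectly constant) stays almost constant across the passband.

```python
import numpy as np
from scipy import signal

fs = 48000  # arbitrary sample rate for this sketch

# 4th-order Bessel low-pass, -3 dB at 1 kHz (also arbitrary).
b, a = signal.bessel(4, 1000, fs=fs, norm='mag')

# Group delay in samples: for a linear-phase filter it would be
# perfectly constant, and the Bessel comes close in its passband.
w, gd = signal.group_delay((b, a), w=2048, fs=fs)
passband = gd[w < 1000]
print(passband.max() - passband.min())  # small spread, in samples
```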