Originally Posted by soundroid
Can you explain phase distortion a little further, please? This anomaly has always intrigued and confused me somewhat. Try to use small words, as I'm a simple man.
First of all, realize that this is not an anomaly. It's simply the normal way nature filters the frequency response of any source, whether that's reflections from a wall or a blanket covering a loudspeaker.
Mix two sine waves of the same frequency, both peaking at -6 dB. If possible, recreate this in your favourite audio editor.
In the normal case, both sines stack up perfectly to 0 dB.
You're probably aware that inverting the phase of either sine cancels both signals out completely: silence.
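If you don't have an editor handy, the same experiment can be run numerically. A minimal Python sketch (the sample rate and the 100 Hz test tone are arbitrary choices of mine):

```python
import math

fs = 48000                     # sample rate, arbitrary for this sketch
f = 100.0                      # test frequency in Hz

# Two identical sines, each peaking at -6 dB (amplitude 0.5).
a = [0.5 * math.sin(2 * math.pi * f * n / fs) for n in range(fs)]
b = list(a)

stacked = [p + q for p, q in zip(a, b)]      # in phase: stacks to 0 dB
cancelled = [p - q for p, q in zip(a, b)]    # inverted copy: total cancellation

print(round(max(abs(v) for v in stacked), 3))    # 1.0 -> peaks at 0 dBFS
print(round(max(abs(v) for v in cancelled), 3))  # 0.0 -> silence
```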
Now, observe what happens as soon as you shift one of the two sines in time: depending on the delay, the original sine is recreated at an arbitrary level anywhere from -infinity dB up to +6 dB. In other words, the simple time shift directly controls the output level of the original sine, much like an amplifier would.
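That level-versus-delay relationship can be checked with a couple of lines of phasor arithmetic. A sketch, again with 100 Hz as an arbitrary test frequency:

```python
import cmath
import math

def summed_level_db(delay_s, freq_hz):
    """Level of (sine + delayed copy), both at -6 dB (amplitude 0.5),
    expressed relative to one of the original -6 dB sines."""
    phi = 2 * math.pi * freq_hz * delay_s          # phase offset of the copy
    amp = abs(0.5 + 0.5 * cmath.exp(-1j * phi))    # amplitude of the mix
    return 20 * math.log10(amp / 0.5)

f = 100.0                                    # period: 10 ms
print(round(summed_level_db(0.0, f), 2))     # no delay: +6.02 dB
print(round(summed_level_db(0.0025, f), 2))  # quarter period: +3.01 dB
print(round(summed_level_db(0.0049, f), 2))  # near half a period: well below -20 dB
```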
Do you follow me? A delayed copy of any single frequency (a sine wave) gives you direct access to the amplitude of that specific frequency.
This is essentially how typical audio filters work. They delay the signal and mix the delayed copy back into the original; because a fixed delay amounts to a different fraction of a cycle at each frequency, each frequency gets boosted or cut by a different amount. Usually there's feedback involved as well.
Say a filter has to reduce 10 Hz by 6 dB. The period of 10 Hz is 100 ms. To achieve -6 dB at that frequency, the filter has to "copy" the 10 Hz sine and delay that copy by a third of the period, about 33 ms: the copy then arrives 120 degrees out of phase, and mixing original and copy at equal gains halves the amplitude (see the experiment above). A 25 ms quarter-period delay would only get you to about -3 dB.
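You can verify that number by running the delay-and-mix on actual samples. A sketch; the sample rate and the equal half/half mix are my assumptions, matching the two -6 dB sines in the experiment above:

```python
import math

def delay_and_mix(x, delay_samples):
    # y[n] = 0.5 * x[n] + 0.5 * x[n - delay]  (zeros before the signal starts)
    return [0.5 * s + 0.5 * (x[n - delay_samples] if n >= delay_samples else 0.0)
            for n, s in enumerate(x)]

fs = 3000                          # sample rate, chosen so 1/3 period is exact
x = [math.sin(2 * math.pi * 10 * n / fs) for n in range(2 * fs)]  # 2 s of 10 Hz
y = delay_and_mix(x, fs // 30)     # 100 ms / 3 = 100 samples at this rate

peak = max(abs(v) for v in y[fs:])       # steady-state peak, skipping the start
print(round(20 * math.log10(peak), 1))   # -6.0: the mix sits 6 dB below the input
```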