FabienTDR, 30th March 2019
Originally Posted by elegentdrum:
This is true for analog, not for software. In software a measurement has to be taken: oh, its volume is 5, so X=5. Then that goes into the software, gets your math done to it, then that reports back to the signal and adjusts the signal's level. It takes at least one sample, and very often many samples, for the signal to have its level changed. The first sample of "too loud" does not get compressed unless some "look ahead" is built into the software. This is typically the case. That introduces lag.

So sure, it can work that way. But in the digital world, at the cost of lag. In the analog world, no lag required. Even though in practice, the way compressors are designed, this lag is commonly what's desirable about how they operate.
No! It's nonsense. There is no lag, ever.

Take a clipper and look at its results. Explaining why the first sample does not overshoot will likely be the only lag in this operation.
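To make the clipper point concrete, here is a minimal sketch (illustrative only, not any particular product's code): a hard clipper computed sample by sample. Output sample n depends only on input sample n, so the very first over-range sample is already limited, with zero lookahead and zero latency.

```python
# Hard clipper, sample by sample. Each output sample depends only on the
# current input sample: no history, no lookahead, no lag.
def hard_clip(x, threshold=1.0):
    return [max(-threshold, min(threshold, s)) for s in x]

samples = [0.2, 1.5, -2.0, 0.9]
print(hard_clip(samples))  # [0.2, 1.0, -1.0, 0.9]
```

The first sample of "too loud" (here 1.5) is clipped immediately, which is exactly the point: instantaneous gain computation needs no delay, analog or digital.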

A dynamics processor lags only when it decides to trade distortion against time, i.e. when the operator increases the attack/release times. That lag is the logical result of the smoothing ("history dependency": the detector needs some history to fulfil its function), the logical result of any lowpass. But it's a completely misleading digital vs. analogue argument!
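A sketch of where that smoothing lives, under stated assumptions: a hypothetical feedforward peak compressor where attack/release are a one-pole lowpass on the gain signal (all names and the topology are mine, for illustration). With attack and release set to zero, the gain reacts on the very same sample, just like the clipper; any lag comes solely from the chosen time constants, not from the processing being digital.

```python
import math

# Illustrative feedforward compressor sketch (hypothetical, not from the post).
# The only "lag" is the one-pole smoothing of the gain; with zero attack and
# release times the coefficient is 0 and the gain is instantaneous.
def compress(x, threshold=0.5, ratio=4.0, attack_ms=10.0, release_ms=100.0,
             sample_rate=48000.0):
    def coeff(t_ms):
        # One-pole smoothing coefficient; 0.0 means "no smoothing, no lag".
        return 0.0 if t_ms <= 0 else math.exp(-1.0 / (t_ms * 1e-3 * sample_rate))
    a_att, a_rel = coeff(attack_ms), coeff(release_ms)
    gain, out = 1.0, []
    for s in x:
        level = abs(s)
        # Static curve: gain that brings the excess over threshold down by the ratio.
        if level > threshold:
            target = (threshold + (level - threshold) / ratio) / level
        else:
            target = 1.0
        # The history dependency lives here and only here (the lowpass):
        a = a_att if target < gain else a_rel
        gain = a * gain + (1.0 - a) * target
        out.append(s * gain)
    return out
```

With `attack_ms=0` and `release_ms=0`, a full-scale input against a 0.5 threshold and 4:1 ratio is attenuated on its first sample (to 0.625 here), so the "first sample doesn't get compressed" claim only holds once you deliberately dial in smoothing.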

Last edited by FabienTDR; 30th March 2019 at 09:10 PM..