Line-level reference explanation
I am a musician and have been playing for 20 years now. Even though I have always been into home recording, I have only just started to get into it seriously.
I always thought I had the basics clear, but I just found out there is one thing that wasn't as clear to me as I believed...
My question is about signal reference levels. By definition, a decibel value expresses the ratio between a measured signal level and a reference level:
dB = 20 × log10(measured level / reference level)
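To make sure I'm applying it right, here is the formula as I understand it, written as a quick Python sketch (the function name is just my own):

import math

def db_ratio(measured_volts, reference_volts):
    # Voltage form of the dB formula: 20 * log10(measured / reference)
    return 20 * math.log10(measured_volts / reference_volts)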
So far, my take on this was that when my console's VU meter reads 0 VU, the output voltage is 1.23 V or 0.316 V, depending on whether +4 dBu or -10 dBV is selected:
Reference level +4 dBu: 1.23 V
20 × log10(1.23 / 1.23) = 0 dB
Reference level -10 dBV: 0.316 V
20 × log10(0.316 / 0.316) = 0 dB
However, when I gave it more thought, I realised this cannot be right: by using the formula the way I did above, I would actually be changing the dB scales. The results would no longer be in dBu or dBV, because those two scales use 0.775 V and 1 V respectively as their reference values:
dBu reference level: 0.775 V
dBV reference level: 1 V
This makes me doubt myself, because it would then mean that a 0 VU reading on my console indicates 0.775 V or 1 V at the output instead...
Would anybody please shed some light on this? What role do those reference values play in the whole thing?
Thanks a lot in advance for your help!