However, when I thought about it, I realised this cannot be right: if I use the above formulas the way I did, I'd actually be changing the dB scales. They would no longer be dBu or dBV, since those two scales use 0.775 volts and 1 volt as their reference values, respectively:
Reference level, dBu: 0.775 V
Reference level, dBV: 1 V
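To make the role of those reference values concrete, here is a small Python sketch of the standard voltage-ratio dB conversions. The function names are my own; the reference constants are the 0.775 V (dBu) and 1 V (dBV) values listed above:

```python
import math

# Reference voltages for the two scales (from the list above)
V_REF_DBU = 0.775  # volts, dBu reference
V_REF_DBV = 1.0    # volts, dBV reference

def volts_to_db(volts, v_ref):
    """Convert an RMS voltage to dB relative to a reference voltage."""
    return 20.0 * math.log10(volts / v_ref)

def db_to_volts(db, v_ref):
    """Convert a dB value back to an RMS voltage."""
    return v_ref * 10.0 ** (db / 20.0)

# The reference voltage itself always reads 0 dB on its own scale:
print(volts_to_db(0.775, V_REF_DBU))  # 0.0 dBu
print(volts_to_db(1.0, V_REF_DBV))    # 0.0 dBV
```

So the reference value only fixes where the 0 dB point sits on each scale; the same voltage simply gets a different number in dBu than in dBV.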
This makes me doubt myself, because then a 0 VU reading on my console would indicate an output level of 0.775 V or 1 V instead.
Could anybody please shed some light on this? What role do those reference values play in the whole thing?
You can measure exactly what your console is putting out with an AC voltmeter and a sine-wave tone, measured across the balanced + and - pins.
Professional VU meters are normally calibrated so that 0 VU = +4 dBu = 1.228 V, but there's no reason to expect your system to be calibrated that way. Especially with consumer analog gear, which doesn't have a high maximum level, it might be calibrated to 0 dBu instead to give the system more headroom.
It also obviously depends on where the -10 dBV selector sits in the signal path. It might not be before the VU meter but right at the output, in which case you still need to hit +4 dBu internally to read 0 VU, yet that's not what you measure coming out of the desk.
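As a quick sanity check on those numbers, here is the arithmetic behind the common level conventions, assuming the standard references (0.775 V for dBu, 1 V for dBV):

```python
# +4 dBu, 0 dBu, and -10 dBV expressed as RMS voltages.
V_REF_DBU = 0.775  # volts, dBu reference
V_REF_DBV = 1.0    # volts, dBV reference

def db_to_volts(db, v_ref):
    """Voltage corresponding to a dB level on the given reference scale."""
    return v_ref * 10.0 ** (db / 20.0)

print(round(db_to_volts(4.0, V_REF_DBU), 3))    # +4 dBu  -> 1.228 V
print(round(db_to_volts(0.0, V_REF_DBU), 3))    #  0 dBu  -> 0.775 V
print(round(db_to_volts(-10.0, V_REF_DBV), 3))  # -10 dBV -> 0.316 V
```

This is why a desk calibrated to 0 VU = +4 dBu should show about 1.23 V on the meter with tone at 0 VU, while a -10 dBV consumer output sits around 0.316 V.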