I think I may have used RMS erroneously. What I mean to say is: how LOUD are you making your mixes (in terms of perceived loudness), and what are you using to measure this / as a reference point?

In my research I found Howard Massey saying this about LUFS levels:

"Worse yet, pretty much every delivery service normalizes the audio files they stream – a process that can easily degrade sonic quality. What’s more, they all normalize to different levels. Spotify, Tidal, and YouTube, for example, set a ceiling of -14 LUFS. Apple’s Soundcheck, an option in iTunes that goes through your library of music and analyzes the average volume of all the songs, can actually tell the player to turn down by as much as -16 LUFS. And most commercial television broadcasts drop them down all the way to a whopping -23 or -24 LUFS, depending upon the country.

“Mastering to one of these target levels is up to you, and arguably not necessary, but bear in mind that the music you’re working with will be adjusted to one of these levels (or others) at some point after you release it, whether you like it or not. A good compromise – and pretty much a consensus these days – is to use -16 LUFS (or -6 dBTP [True Peak] if your loudness meter offers that unit of measurement, as the WLM Plus meter does) as the target for integrated loudness."
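For anyone who wants to check where their own mixes actually land against those targets, here's a minimal sketch using the pyloudnorm Python library (my choice of tool, not something Massey mentions; "mix.wav" and the -16 LUFS target pulled from the quote above are just placeholders for illustration):

    # Measure integrated loudness (ITU-R BS.1770) and apply a static
    # gain to hit a -16 LUFS target. File name and target are example
    # assumptions, not a recommendation.
    import soundfile as sf
    import pyloudnorm as pyln

    data, rate = sf.read("mix.wav")          # audio as float array + sample rate
    meter = pyln.Meter(rate)                 # BS.1770 K-weighted meter
    loudness = meter.integrated_loudness(data)
    print(f"Integrated loudness: {loudness:.1f} LUFS")

    # Static gain change only -- no limiting, which is essentially what
    # the streaming services do at playback.
    normalized = pyln.normalize.loudness(data, loudness, -16.0)
    sf.write("mix_norm.wav", normalized, rate)

Note that this only applies flat gain; it won't catch true-peak overs, so you'd still want to check your true peaks separately, which is where the dBTP ceiling Massey mentions comes in.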

So, are you guys who are 'self mastering' doing it with the streaming services' normalisation algorithms in mind?