Old 17th June 2006
Thanks Nathan!

After that explanation, I finally understand.

The only catch is that if I want the Lavry clock to improve the sound of my converters (especially my 888-24's D/A), then the A/D of the Lavry on the front end of the Masterlink will have to run at a 44.1 kHz sample rate.

That means I would not have the option of making my final 2-track digital mixes at 96 kHz, and that trade-off might be more significant than the improved D/A resolution of the other units feeding the console.

Ahh, decisions, decisions!

Considering that it's still Lavry A/D conversion at 44.1 kHz, would it really be all that much better at 96 kHz? Coming off the Trident console, would the lower sample rate make that big a difference in the final product? Any seasoned pros out there with an opinion on this?

I hear that some top producers still use 44.1 kHz as the sample rate for final mixes anyway, because they feel it's still the biggest consumer format and they want the mix to sound its best at that rate. The lower resolution actually makes them work the analog console harder to "get that sound," which they feel is a better approach than relying on the mastering engineer to do a sample rate conversion in post. Some (and I'm talking about TOP-level people) feel that ANY additional computer-based sample rate conversion degrades the audio quality, and that you should get it right at 44.1 kHz with great converters that sound great even at that resolution.
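Just to make that sample rate conversion step concrete: here is a rough, purely illustrative sketch (in Python with SciPy, a polyphase resampler; nothing here implies any particular mastering tool works this way) of what a 96 kHz to 44.1 kHz conversion involves. The 1 kHz test tone and all variable names are hypothetical stand-ins for a finished 2-track mix.

```python
import numpy as np
from math import gcd
from scipy.signal import resample_poly

fs_in, fs_out = 96000, 44100

# One second of a 1 kHz test tone at 96 kHz (hypothetical signal,
# standing in for a finished 2-track mix).
t = np.arange(fs_in) / fs_in
mix = np.sin(2 * np.pi * 1000 * t)

# Reduce the ratio 44100/96000 to lowest terms for the polyphase filter.
g = gcd(fs_out, fs_in)               # 300
up, down = fs_out // g, fs_in // g   # 147, 320

# Polyphase resampling: upsample by 147, anti-alias low-pass filter,
# then decimate by 320. The filter stage is where any audible
# "degradation" from SRC would come from.
downsampled = resample_poly(mix, up, down)

print(len(downsampled))  # 44100 samples: exactly one second at 44.1 kHz
```

The point of the sketch is just that SRC is a filtering operation, so its quality depends entirely on the filter design, which is why opinions on it vary so much.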

I know this is almost a topic for a new thread, but some feedback on this issue would be appreciated. I may end up getting both the Lavry and the Big Ben, if I decide that I shouldn't lose the ability to make final mixes at a 96 kHz sample rate. After all, the final 2-track mixes are BY FAR the most important phase of any CD-making effort: they're what the public gets to hear and decides whether to buy!