21st December 2004

Guest

**This is why 96 is better than 192**

From the mouth of Dan Lavry himself. Before I read this, I always thought that 192 was overkill.

"In my paper I mentioned 3 arguments why 192 kHz is worse:

1. The file size increases, so the storage requirement is doubled compared to, say, 96 kHz, and data transfer is slower by a factor of 2.

2. The computational requirement grows, often by more than a factor of 2. That is why people who bought into 192 kHz often ended up buying very expensive accelerator cards, and still came up short.

3. That is the big one: there is a tradeoff between speed and accuracy. Clearly, the accuracy of a 10 Hz system is great, but it is too slow for audio. The accuracy of a 1 GHz system is much poorer, and it is too fast for audio. The question is: what is the optimum rate?
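The arithmetic behind point 1 is easy to check. A minimal sketch, using hypothetical session parameters (stereo, 24-bit, one minute of audio) purely for illustration:

```python
# Sanity check on the storage/transfer claim: raw PCM size scales
# linearly with sample rate, so 192 kHz is exactly 2x the data of 96 kHz.
# The stereo/24-bit/one-minute figures are hypothetical illustration values.

def pcm_data_rate(sample_rate_hz, bit_depth, channels):
    """Raw (uncompressed) PCM data rate in bytes per second."""
    return sample_rate_hz * (bit_depth // 8) * channels

for rate in (96_000, 192_000):
    mb_per_min = pcm_data_rate(rate, 24, 2) * 60 / 1e6
    print(f"{rate} Hz: {mb_per_min:.1f} MB per minute")
    # about 34.6 MB/min at 96 kHz vs 69.1 MB/min at 192 kHz
```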

It is not true that faster is better. It is not true that more is always better. A 6-foot person weighing 100 lb is too thin, but the same person weighing 500 lb is too heavy. There is such a thing as an OPTIMAL RATE. In the case of audio, it is all about what people can hear. That is what dictates why most mics and speakers are optimized for about 20 Hz-20 kHz, not 20 Hz-96 kHz. The same factors should apply to converters.

The speed-accuracy tradeoff is one of the general engineering concepts, and it manifests itself in many ways. Most of them are practical, such as "you can charge the cap more accurately if you have more time", or "the amplifier will settle to a more accurate value if you give it more time". But with modern converters, mostly based on sigma-delta, the tradeoff starts on paper, before we get to "real world" circuits. The basic set of design parameters for a sigma-delta converter is: (1) oversampling ratio, (2) filter order, (3) number of quantizer bits.

Say you have a given set of parameters.

You can design for the best precision over a 0-24 kHz audio bandwidth.

You can have less precision but more bandwidth, 0-48 kHz.

You can have even less precision but more bandwidth, 0-96 kHz.
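That precision-for-bandwidth trade can be made roughly quantitative with the standard textbook SQNR estimate for an ideal L-th order sigma-delta modulator. The sketch below assumes a hypothetical fixed modulator clock and a 3rd-order, 1-bit design; it is an idealized model, not any particular converter:

```python
# Ideal peak signal-to-quantization-noise ratio of an L-th order
# sigma-delta modulator (textbook formula). At a fixed modulator clock,
# doubling the audio bandwidth halves the oversampling ratio (OSR) and
# costs roughly (2L + 1) * 3 dB of resolution.
import math

def sqnr_db(osr, order, quant_bits):
    """Ideal peak SQNR in dB for the given OSR, loop order, and quantizer bits."""
    return (6.02 * quant_bits + 1.76
            - 10 * math.log10(math.pi ** (2 * order) / (2 * order + 1))
            + (2 * order + 1) * 10 * math.log10(osr))

clock = 6_144_000  # hypothetical modulator clock in Hz, for illustration only
for bw in (24_000, 48_000, 96_000):
    osr = clock / (2 * bw)  # Nyquist rate for the band is 2 * bandwidth
    print(f"0-{bw // 1000} kHz: OSR {osr:.0f}, ideal SQNR {sqnr_db(osr, 3, 1):.0f} dB")
```

Each doubling of bandwidth in the printout trades away about 21 dB (roughly 3.5 bits) of ideal resolution, which is the on-paper version of the list above.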

This was regarding the paper design stage of sigma-delta. Then you get into the real-world circuitry and face the same tradeoffs again...

There is no escape from speed vs. accuracy tradeoffs, sigma-delta or not...

Regards

Dan Lavry"

"I know that some of the concepts regarding sampling are NOT intuitive. It is difficult to explain that more samples are not better in a world where more pixels are better, but the fact remains: samples are not pixels, and there are issues that are not easy to convey to people who did not choose an EE or math career. I wrote my paper to try to simplify things, but I guess it is still too difficult for many to follow.

So let’s just say that Nyquist was right, and we have 100 years of hands-on experience, including test equipment, the communications industry, digital video, digital audio and much more.

And even without that experience, it is solidly proven mathematically that more samples than needed (as indicated by Nyquist) add ZERO content, and are totally redundant.
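The redundancy claim can be sketched numerically: a band-limited tone sampled above its Nyquist rate already determines the waveform between the samples, so denser sampling adds nothing. A minimal demonstration using truncated Whittaker-Shannon (sinc) interpolation, with an arbitrary 1 kHz tone at 48 kHz chosen purely for illustration:

```python
# Reconstruct a between-sample instant of a 1 kHz sine from its 48 kHz
# samples via truncated sinc interpolation. The reconstruction error is
# tiny: there is no "missing detail" for extra samples to supply.
import math

f_sig, fs = 1_000.0, 48_000.0  # tone frequency and sample rate (illustrative)
samples = [math.sin(2 * math.pi * f_sig * n / fs) for n in range(2048)]

def sinc_reconstruct(t_sec, samples, fs):
    """Whittaker-Shannon interpolation, truncated to +/-200 taps."""
    n0 = int(t_sec * fs)
    total = 0.0
    for n in range(max(0, n0 - 200), min(len(samples), n0 + 200)):
        x = t_sec * fs - n
        total += samples[n] * (1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x))
    return total

t = 0.0103 + 0.5 / fs  # an instant exactly between two samples
exact = math.sin(2 * math.pi * f_sig * t)
print(abs(sinc_reconstruct(t, samples, fs) - exact))  # small truncation error only
```

The residual error here comes only from truncating the interpolation window, not from any information missing between the samples.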

Regarding that speed-accuracy tradeoff, that is easier to understand. Analogies can be misleading, but say you take on a task to color a picture with crayons and “stay within the lines”. The picture is intricate. I bet doing it in 10 seconds will be a lot less accurate than if you took 10 minutes. The same statement applies to so many things. Devices and circuits also have speed limitations (and speed is in fact bandwidth). A capacitor of a given size takes time to charge, a logic gate takes time to change states, and so on. Doing things fast goes against doing things accurately. Devices and circuits can be optimized for maximum speed, power, accuracy and more. They are most often optimized to provide a combination of acceptable tradeoffs. When you relax one requirement, you end up with more “breathing room” for the others.
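The capacitor example can be made concrete: an RC stage settling toward a step input has relative error e^(-t/RC), so accuracy in bits grows linearly with the settling time you allow. The component values below are hypothetical:

```python
# Settling accuracy of an RC stage vs. allowed settling time.
# Relative error after t seconds is exp(-t / RC): each additional
# time constant buys about 1.44 more bits of accuracy.
import math

def settling_error(t_sec, r_ohm, c_farad):
    """Relative error of an RC step response after t seconds."""
    return math.exp(-t_sec / (r_ohm * c_farad))

r, c = 1e3, 1e-9  # hypothetical 1 kOhm and 1 nF -> tau = 1 microsecond
for n_tau in (6, 12):
    err = settling_error(n_tau * r * c, r, c)
    print(f"{n_tau} tau: error {err:.1e} (~{-math.log2(err):.0f} bits)")
```

Halving the time budget squares the relative error: roughly 17-bit settling collapses to roughly 9-bit settling, which is the speed-accuracy tradeoff in its simplest circuit form.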

Regarding the sigma-delta design, yes, in theory you give up accuracy for speed. The noise-shaping concept is about moving noise out of the frequency range you wish to use for the signal, to other frequencies. Think of it as digging a hole. You can either dig a deep hole of small diameter, or a very shallow hole of large diameter. It is the same amount of dirt, but a different result. The depth of the hole is analogous to the accuracy; the diameter represents the bandwidth. Do you want great 20 kHz or not-so-great 100 kHz?
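The hole-digging picture can be sketched with a toy first-order sigma-delta modulator: a 1-bit output stream whose average, taken over a narrow band (a long decimation window), recovers the input accurately, while keeping a wider band (a short window) leaves more in-band noise. The DC input value and window lengths are arbitrary:

```python
# Toy first-order sigma-delta modulator with a 1-bit quantizer, driven
# by a DC input. Averaging the +/-1 output stream over a window acts as
# a crude low-pass: a longer window (narrower bandwidth, the "deep
# narrow hole") recovers the input more accurately.

def sigma_delta_1st(x, n):
    """Return n output bits (+/-1.0) of a first-order modulator for DC input x."""
    integ, y, out = 0.0, 0.0, []
    for _ in range(n):
        integ += x - y                      # integrate error vs. last output
        y = 1.0 if integ >= 0 else -1.0     # 1-bit quantizer
        out.append(y)
    return out

x = 0.37                        # arbitrary DC input in (-1, 1)
bits = sigma_delta_1st(x, 1 << 16)
for window in (16, 256, 4096):  # longer window ~ narrower kept bandwidth
    avg = sum(bits[:window]) / window
    print(f"window {window}: error {abs(avg - x):.5f}")
```

For this ideal first-order loop the recovered error is bounded by the (bounded) integrator state divided by the window length, so each trade of bandwidth for averaging time buys accuracy, just as the hole analogy says.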

That answers your question about the paper design. But I am an engineer and therefore equally interested in the real parts and circuits. Speed vs. accuracy is a solid concept. Speed vs. power is another, and there are others. Those concepts are no different from the first law of thermodynamics – never proven, but no one so far has come up with a single example to contradict it.

Regards

Dan Lavry"
