The No.1 Website for Pro Audio
Do you use different AD Converters tracking same song?
Old 6th January 2013
  #61
Gear Nut
 
FestivalStudios's Avatar
 

OK, let's do the math -- correct me if I'm wrong.
4 samples at 48k is 0.0000833 secs, or 0.083 ms, or 83.3 microseconds.
The period of a 1k waveform is 1 ms, and the period of a 10k waveform is 0.1 ms, so 4 samples would put a 1k waveform about 1/12 of a cycle out of phase, or about 30 degrees -- and at 10k almost 300 degrees. Definitely a problem for stereo pairs. But also consider that 1 ms is about the distance sound travels in 1 foot, so 83.3 microseconds is about 1 inch -- the difference of moving a mic by 1 inch. So the answer is: it's not going to matter for different instruments, provided the converters are all low, like maybe below 10 samples. I know RME is, and also Burl, but I can't find much info about others. A lot, like Lynx, claim 0 latency, which we know is not true, but that probably means 5 samples or so -- which is like RME, and considered zero latency. It would be nice if we could put together a list of AD converter latencies at their respective digital outputs (AES/optical/SPDIF) -- AD latency only, not system latency.
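To put that arithmetic in one place, here's a quick Python sketch (the function names are mine and purely illustrative; the numbers just restate the math above):

```python
# Delay of N samples at a given sample rate, expressed as time
# and as the phase shift it causes at a test frequency.

def delay_seconds(n_samples, sample_rate_hz):
    """Time offset produced by a delay of n_samples."""
    return n_samples / sample_rate_hz

def phase_shift_degrees(n_samples, sample_rate_hz, freq_hz):
    """Phase offset that the delay causes at freq_hz, in degrees."""
    return 360.0 * freq_hz * delay_seconds(n_samples, sample_rate_hz)

t = delay_seconds(4, 48_000)  # 4 samples at 48 kHz
print(f"{t * 1e6:.1f} us")                                            # ~83.3
print(f"{phase_shift_degrees(4, 48_000, 1_000):.0f} deg at 1 kHz")    # 30
print(f"{phase_shift_degrees(4, 48_000, 10_000):.0f} deg at 10 kHz")  # 300
```

Same conclusion as the post: 4 samples is roughly an inch of mic placement, a real issue for a stereo pair but not between different instruments.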
Old 6th January 2013
  #62
Gear Addict
 

Quote:
Originally Posted by psycho_monkey View Post
Not limited experience - just a firm belief of something you've never tested.



1 sample CAN cause phase issues in parallel processing. 4 samples definitely can. That's easy to check with timeadjuster.
1 sample at 48k would be less than 0.05 ms. You would have to zoom a DAW all the way in to even see that; you would not be able to hear it. Would you? I couldn't.
Old 6th January 2013
  #63
Gear Addict
 

Quote:
Originally Posted by FestivalStudios View Post
OK, let's do the math -- correct me if I'm wrong.
4 samples at 48k is 0.0000833 secs, or 0.083 ms, or 83.3 microseconds.
The period of a 1k waveform is 1 ms, and the period of a 10k waveform is 0.1 ms, so 4 samples would put a 1k waveform about 1/12 of a cycle out of phase, or about 30 degrees -- and at 10k almost 300 degrees. Definitely a problem for stereo pairs. But also consider that 1 ms is about the distance sound travels in 1 foot, so 83.3 microseconds is about 1 inch -- the difference of moving a mic by 1 inch. So the answer is: it's not going to matter for different instruments, provided the converters are all low, like maybe below 10 samples. I know RME is, and also Burl, but I can't find much info about others. A lot, like Lynx, claim 0 latency, which we know is not true, but that probably means 5 samples or so -- which is like RME, and considered zero latency. It would be nice if we could put together a list of AD converter latencies at their respective digital outputs (AES/optical/SPDIF) -- AD latency only, not system latency.
You are right, it is not negligible.

With modern ADCs, the latency is caused by what is called "zero-cycle" latency, which is essentially the delay between the moment the analog signal hits the ADC input and the moment it is converted into binary output. This interval is still dictated partially by the length of the clock cycle, but at GSPS conversion rates it amounts to fractions of a microsecond. It's so fast with modern semiconductors that it's hard to believe any modern converter would be problematic at even 1 ms, let alone 5 ms. 5 ms seems unacceptable. Maybe it has something to do with noise and interference? A noisy power supply in a cheap converter could cause poor delay performance, possibly due to error checking. I have no idea, to be honest. But in a Lavry Gold? 5 ms is surprising, if that's the case.
Old 6th January 2013
  #64
Gear Addict
 

Quote:
Originally Posted by Rick Sutton View Post
I agree with your post but I'm referring strictly to the latency/clock relationship topic. I've sent an email to Burl and asked them to join the discussion. Fred Forssell would also be an excellent contributor. Why don't you drop him an email? I realize these guys are busy but maybe we can get someone involved that could explain some of this from a design perspective.
Yeah, these guys are the authority on super-modern audio converters.

It would be cool to have them tell us the truth. However, I doubt they could be bothered -- they are too busy making the next toys I'm looking forward to buying.
Old 6th January 2013
  #65
Quote:
Originally Posted by emitsweet View Post
1 sample at 48k would be less than 0.05 ms. You would have to zoom a DAW all the way in to even see that; you would not be able to hear it. Would you? I couldn't.
In parallel with an undelayed source? 1 sample...maybe not. 10 samples, or even 4? yep - and it's easy to check.

Quote:
Originally Posted by emitsweet View Post
you are right it is not negligible.

With modern ADCs the latency is caused by what is called "0 cycle" latency. Which is essentially the delay between the moment the analog signal hits the ADC input and is converted into the binary digit output. This interval however is still dictated partially by the length of the clock cycle. But still it is measured in fractions of GSPS. It's so fast with modern semiconductors it's hard to believe any modern converter would be problematic even 1ms, let alone 5ms. 5ms seems unacceptable. Maybe it has something to do with noise and interference? A noisy power supply in a cheap converter can cause poor performance with respect to delay due to error checking possibly. I have no idea to be honest. But in a Lavry gold? 5ms is surprising if that's the case
As I said - it's due to the accuracy of the Lavry sampling - more processing cycles = more delay.
Old 6th January 2013
  #66
Gear Addict
 
Mo Facta's Avatar
If you want to scrutinize the minutiae, splitting multi-mic recordings across different converters probably isn't a good idea from a phase point of view. However, isn't that why we have a master clock, and why we sync all digital devices to it? I would be interested to find out if there is indeed sample latency when all [differing] converters are synced to a master clock, and if there is, how severe it can be between devices.

In other words, does the master clock sync not imply sample accuracy between converters?

Cheers
Old 6th January 2013
  #67
Gear Maniac
 

I don't speak for Dan Lavry, but let me pull a quote of his from the announcement thread for his LK-1 "Latency Killer" (Original post):

Quote:
Originally Posted by Dan Lavry View Post
I have been making converters for dozens of years, and I can “talk for a month” about the reasons why reducing the time delay compromises performance. But in order to save time, and not lose people on technical details, let me just point out that the makers of high end DA integrated circuits (look at flag ship DA’s from AKM, Cirrus and more) have a built in switchable feature – low latency at reduced performance or higher latency for better audio performance. If they could get the same quality at low latency, they would not have such a feature.

I refused to reduce audio quality of converters for the sake of latency. My best AD (Lavry gold AD122 MKIII) has a lot of latency. Mastering guys love it. Recording guys love it. But it is not used for overdubs. I don’t want to reduce audio quality.
This would seem to signify that yes, converter latency exists, as well as giving a reason for the high latency of the Gold A/D.
Old 6th January 2013
  #68
Quote:
Originally Posted by Mo Facta View Post
If you want to scrutinize the minutia, splitting multi-mic recordings across different converters probably isn't a good idea, from a phase point of view. However, isn't that why we have a master clock and why we sync all digital devices to it? I would be interested to find out if there is indeed sample latency if all [differing] converters are synced to a master clock and if there is, how severe it can be between devices.

In other words, does the master clock sync not imply sample accuracy between converters?

Cheers
It makes converters sample accurate yes. It means the samples are being taken at the same time. But the throughput can take a different amount of time. The time from the sample being taken to it arriving at the output of the converter might well vary.

Quote:
Originally Posted by cakeshoppe View Post
I don't speak for Dan Lavry, but let me pull a quote of his from the announcement thread for his LK-1 "Latency Killer" (Original post):

This would seem to signify that yes, converter latency exists, as well as giving a reason for the high latency of the Gold A/D.
It does more than signify it - it proves it!
Old 6th January 2013
  #69
Lives for gear
 
scottwilson's Avatar
This thread is almost enough to make me cry.

Can you hear one sample of delay? Two samples? On its own no, sure, but try an experiment: take a 2-bus mix, run it through a bus with an intentional delay of 1 or 2 samples (Pro Tools has a plugin for this), mix it with the original signal and see how it sounds. Go from 1 to 2 to 3 or 4 samples and see how that sounds. This mirrors what happens if you use parallel processing on an analog insert on your 2-bus and your latency isn't correctly compensated. One sample is significant in this case.
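You can also sketch that experiment on paper. Summing a signal with a copy delayed by N samples gives a comb filter; here's a small Python illustration (my own sketch, not any plugin's behavior) that prints the notch frequencies for a given delay:

```python
# Comb filtering from summing a signal with an N-sample delayed copy:
# magnitude of (1 + z^-N) on the unit circle, plus the full-cancellation
# (null) frequencies below Nyquist.

import math

def comb_gain_db(freq_hz, n_samples, sample_rate_hz):
    """Gain in dB when a unity signal is summed with an N-sample delayed copy."""
    w = 2 * math.pi * freq_hz / sample_rate_hz
    mag = abs(1 + complex(math.cos(w * n_samples), -math.sin(w * n_samples)))
    return 20 * math.log10(mag) if mag > 0 else float("-inf")

def null_freqs(n_samples, sample_rate_hz):
    """Frequencies (up to Nyquist) where the summed signal cancels completely."""
    nyquist = sample_rate_hz / 2
    freqs, k = [], 0
    while (2 * k + 1) * sample_rate_hz / (2 * n_samples) <= nyquist:
        freqs.append((2 * k + 1) * sample_rate_hz / (2 * n_samples))
        k += 1
    return freqs

print(null_freqs(1, 48_000))  # [24000.0] -> 1 sample: null right at Nyquist
print(null_freqs(4, 48_000))  # [6000.0, 18000.0] -> well inside the audio band
```

Which matches the point above: 1 sample only notches at the very top of the band, while 4 samples puts a deep notch at 6 kHz.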

I'm pretty sure that clocking devices together doesn't ensure the samples are taken at the exact same instant, but that they are taken 1:1 - for every sample device 1 takes, device 2 will take as well. They should be relatively close, less than 1 sample discrepancy, but they will not be at the exact instant. This isn't jitter... jitter is frequency modulation of the clock signal. This is just a consistent delay between samples - again, due to design differences in PLL circuits, oversampling rates in one device versus another, and so forth.

Unless I missed it, I haven't seen an explanation of WHY different converters have different latencies. You can have almost zero latency if you're sampling at the base Fs and you have no filters... so you're just sending down the pipe a value of the sampled signal at that moment. That will probably sound like ass.

Most converters sample at a higher multiple of the session sample rate (oversampling) and decimate to the desired sample frequency. People like Apogee and Digidesign who typically are building low-latency converters must make compromises between 'quality' of the filter and latency. Those building mastering converters don't have to worry about latency, so design their filters with better phase response at the cost of higher delay.

Apologies for teaching your grandmother to suck eggs, but the blind assertions and absolute determination of some things said in this thread would require a bit of magic that just doesn't exist.

s
Old 7th January 2013
  #70
Quote:
Originally Posted by scottwilson View Post
This thread is almost enough to make me cry.

Can you hear one sample of delay? Two samples? On it's own no, sure, but try an experiment where you take 2 bus mix, run it through a bus with an intentional delay of 1 or 2 samples (Pro Tools has a plugin for this) - mix it with the original signal and see how it sounds. Go from 1 to 2 to 3 or 4 samples and see how that sounds. This would mirror what happens if you are using parallel processing on an analog insert on your two bus and you don't have your latency correctly compensated. One sample is significant in this case.

Exactly my point. It's something to be aware of, even if it doesn't affect you in day to day operation.

Quote:
Originally Posted by scottwilson View Post
I'm pretty sure that clocking devices together doesn't ensure the samples are taken at the exact same instant, but that they are taken 1:1 - for every sample device 1 takes, device 2 will take as well. They should be relatively close, less than 1 sample discrepancy, but they will not be at the exact instant. This isn't jitter... jitter is frequency modulation of the clock signal. This is just a consistent delay between samples - again, due to design differences in PLL circuits, oversampling rates in one device versus another, and so forth.
my understanding is that the samples are taken together, and they exit the converter together in sync - they HAVE to leave the converters in sync, otherwise you get clocking errors - that's the point of having things clocked together of course. The thing is, different converters take different amounts of time to convert the sample - so whilst converter A might sample and spit out digital audio 4 samples later, converter B might take 8 or 9 samples to do the same thing. So you're still in sync, but you've not got the same throughput.
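A toy numeric sketch of that idea (the pipeline lengths here are invented for illustration, not taken from any real converter):

```python
# Two converters sharing one clock emit samples on the same clock edges
# (sample-accurate sync), but each takes a different number of sample
# periods to push audio through to its digital output.

SAMPLE_RATE = 48_000

def output_times_us(pipeline_samples, n=4):
    """Times (microseconds) at which the first n samples appear at the
    converter's digital output, given its internal pipeline delay."""
    period_us = 1e6 / SAMPLE_RATE
    return [round((i + pipeline_samples) * period_us, 1) for i in range(n)]

conv_a = output_times_us(pipeline_samples=4)  # hypothetical low-latency design
conv_b = output_times_us(pipeline_samples=9)  # hypothetical longer pipeline

print(conv_a)  # [83.3, 104.2, 125.0, 145.8]
print(conv_b)  # [187.5, 208.3, 229.2, 250.0]
# Both land exactly on clock edges, but B's audio sits a constant
# 5 samples (~104 us) behind A's -- in sync, different throughput.
```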

Rereading what you're saying, I don't think it matters if either of us is right on the initial sampling thing, providing they're a) relatively close, and b) the output is as described - which of course it is.

Quote:
Originally Posted by scottwilson View Post
Unless I missed it, I haven't seen an explanation of WHY different converters have different latencies. You can have almost zero latency if you're sampling at the base Fs and you have no filters... so you're just sending down the pipe a value of the sampled signal at that moment. That will probably sound like ass.

Most converters sample at a higher multiple of the session sample rate (oversampling) and decimate to the desired sample frequency. People like Apogee and Digidesign who typically are building low-latency converters must make compromises between 'quality' of the filter and latency. Those building mastering converters don't have to worry about latency, so design their filters with better phase response at the cost of higher delay.
I think the Dan Lavry quotes say as much, but it doesn't hurt to requote it for those convinced there's no latency in digital conversion! Christ, you just have to run a signal through a so called "zero latency" monitoring system like PT HD or RME totalmix and lay it back against the analogue input to see that.

Quote:
Originally Posted by scottwilson View Post
Apologies for teaching your grandmother to suck eggs, but the blind assertions and absolute determination of some things said in this thread would require a bit of magic that just doesn't exist.

s
Amen.
Old 7th January 2013
  #71
Gear Addict
 

Quote:
Originally Posted by Mo Facta View Post

In other words, does the master clock sync not imply sample accuracy between converters?

Cheers
Until I read some replies in this thread, I always believed part of a master clock's functionality was to ensure sample accuracy between multiple digital converters/devices, where the slave surrenders control to the master.

Despite a few differing opinions on whether latency and clocking are related, they are. But there could be other functional attributes that contribute to the latencies in an A/D converter, and yet more related to the audio conversion process. Every digital device on earth comprised of sequential circuitry requires a clock. This clock tells finite-state-machine circuitry when to switch states.

With any analog-to-digital conversion, when the analog voltage is translated into a digit there is a latency associated with what is called the zero cycle. This is the void between the analog input and the digital output of the ADC semiconductor. This void in time is at the mercy of a clock and can be stretched, causing additional latency. Though it is so minute, I don't see how it can impact the phase relationship of two devices in tandem. Regardless, the fact of the matter is that latency and clocking are very much related.
Old 7th January 2013
  #72
Lives for gear
 
scottwilson's Avatar
Quote:
Originally Posted by psycho_monkey View Post
Rereading what you're saying, I don't think it matters if either of us is right on the initial sampling thing, providing they're a) relatively close, and b) the output is as described - which of course it is.
Agreed. I'm being pedantic. Under standard information theory, for linear systems and signals below 1/2 Fs, it doesn't matter where that initial sample is taken + or - 1/2 the sample period. (edit: as long as that difference is consistent, of course)
Old 7th January 2013
  #73
Gear Addict
 

Quote:
Originally Posted by psycho_monkey View Post
Exactly my point. It's something to be aware of, even if it doesn't affect you in day to day operation.



my understanding is that the samples are taken together, and they exit the converter together in sync - they HAVE to leave the converters in sync, otherwise you get clocking errors - that's the point of having things clocked together of course. The thing is, different converters take different amounts of time to convert the sample - so whilst converter A might sample and spit out digital audio 4 samples later, converter B might take 8 or 9 samples to do the same thing. So you're still in sync, but you've not got the same throughput.

Rereading what you're saying, I don't think it matters if either of us is right on the initial sampling thing, providing they're a) relatively close, and b) the output is as described - which of course it is.



I think the Dan Lavry quotes say as much, but it doesn't hurt to requote it for those convinced there's no latency in digital conversion! Christ, you just have to run a signal through a so called "zero latency" monitoring system like PT HD or RME totalmix and lay it back against the analogue input to see that.



Amen.
Everything he stated was contrary to your prior assertions.
Old 7th January 2013
  #74
Lives for gear
 
scottwilson's Avatar
Quote:
Originally Posted by emitsweet View Post
everything he stated was contrary to you prior assertions
Actually, I was agreeing with him, and trying to explain WHY different ADC converters will have different latencies... because the latencies are built not into the physical connections, but into the software filters which require a trade-off of phase response near the Nyquist frequency and delay of the signal you're filtering.

I don't think this has clicked yet, so I'll just leave it at that and see how many others jump in here that you want to argue with.

-s
Old 7th January 2013
  #75
Gear Addict
 

Quote:
Originally Posted by scottwilson View Post
jitter is frequency modulation of the clock signal.
Jitter is the variation in delay with respect to the data.
Old 7th January 2013
  #76
Gear Addict
 

Quote:
Originally Posted by scottwilson View Post
Actually, I was agreeing with him, and trying to explain WHY different ADC converters will have different latencies... because the latencies are built not into the physical connections, but into the software filters which require a trade-off of phase response near the Nyquist frequency and delay of the signal you're filtering.
ooops
but he stated that latency and clocking are not at all related, which is incorrect

Quote:
Originally Posted by scottwilson View Post
I don't think this has clicked yet, so I'll just leave it at that and see how many others jump in here that you want to argue with.
another condescending chap on GS, it must go with the territory? does it not?

Quote:
Originally Posted by scottwilson View Post
ADC converters
What exactly is an Analog-to-Digital Converter chip converters?
Old 7th January 2013
  #77
Lives for gear
 
scottwilson's Avatar
Quote:
Originally Posted by emitsweet View Post
ooops
but he stated that latency and clocking are not at all related, which is incorrect
They are unrelated.

Quote:
Originally Posted by emitsweet View Post
What exactly is an Analog-to-Digital Converter chip converters?
You got me there.
Old 7th January 2013
  #78
Gear Addict
 

Quote:
Originally Posted by scottwilson View Post
They are unrelated.
So you are saying the delays associated with zero-cycle latency within ADCs are not a side effect directly related to the clock pulse?

In addition, are you claiming the master clock of a converter is entirely independent of the edge triggering of internal state machines within the respective device? In other words, that a converter's master clock is not also controlling the low-level ADC pulse and the related sequential circuitry? Then what you are claiming is that a digital device such as an audio A/D converter has multiple controlling pulses, independent of the ADCs and the master clock functionality? Is that what you are stating? Or are you stating that even if there is latency associated with ADCs, it will not affect the digital output, and that delay is not in any way caused by pulse interval accumulation?

Second to last: are you stating that zero-cycle ADC semiconductor latency and outgoing digital audio sample latency are not related, not one and the same, and not related to the clock pulse in any way? And/or that the latter is not in any way related to the former? And, in conclusion, that latency as it relates to this thread has nothing to do with the path from analog input to the convergence of the ADC output digit(s)? That it is only a result of filtering? And that filtering, though still clock dependent, has resulting latencies that are not due in any way to the clock pulse?

Lastly, are you claiming latency is the collective delay inherent to both semiconductor ADCs and digital filtering, without regard to or influence of clocking? If so, what exactly is the delay culprit in such a scenario? So with or without the delays inherent to the clock pulse, this would still be problematic, even disregarding the question of negligibility with respect to phase accuracy?


Quote:
Originally Posted by scottwilson View Post
People like Apogee and Digidesign who typically are building low-latency converters must make compromises between 'quality' of the filter and latency.
so if the op uses an Apogee and Digidesign unit clocked together and not "mastering" converters, latency will not be an issue? but audio quality may suffer?
Old 7th January 2013
  #79
Gear Addict
 
Avast!'s Avatar
Meanwhile, in the real world...

From the analog input to the recording onscreen, whether a person is using a 'converter' or an 'interface', there is a time lapse between the input and the time it is written into the session.

Not taking into account that some recording software compensates for this, automatically or otherwise; and definitely taking into account that the AD chip is -implemented- with many design choices beyond the literal (and time-consuming) conversion...

Each brand and model of converter has a different 'propagation time'.

Emit, you're on the wrong side of this debate. Clock -rate- affects how long the propagation takes. -Clocking- has nothing to do with the fact that this variation in input-to-output latency exists. It exists because the device design takes a finite number of clock cycles to crank the snapshot into a digital word.

Old 7th January 2013
  #80
Gear Addict
 

Quote:
Originally Posted by Avast! View Post
From the analog input to the recording onscreen, whether a person is using a 'converter' or an 'interface', there is a time lapse between the input and the time it is written into the session.
I'm under the impression we are only discussing standalone converters slaved, and not interfaced with a computer? yes no?
Old 7th January 2013
  #81
Gear Addict
 
Avast!'s Avatar
Actually, no, and no, and you're incorrect either way

The OP is asking about varying input devices into a recording session, presumably using just one daw.

Regardless, the time from the input to the output of any real analog to digital audio device used in a recording studio is finite* and varies from model to model**.



*and defined and measurable.

**see lavry et al links above and published and hidden specs of all manufacturers.
Old 7th January 2013
  #82
Gear Addict
 

Quote:
Originally Posted by Avast! View Post
The OP is asking about varying input devices into a recording session, presumably using just one daw.
he's asking Do you use different AD Converters tracking? nothing about DAW

Quote:
Originally Posted by Avast! View Post
varies from model to model**.
well yeah several people have confirmed this, but is it negligible?

It seems Lavry converters, due to their superior quality, have worse latency than Apogees, according to this thread. If a Lavry Gold converter really has a 5 ms delay, it is a completely useless converter for tracking -- unless of course you only track 2 channels at a time.

Does anyone use a Lavry Gold while tracking a full band? If they do, they are presumably slaving it, or using it as master alongside something else with multiple channels. It seems Psycho Monkey and scottwilson have confirmed that the Lavry Gold and its 5 ms latency are problematic when recording more than 2 tracks.

I'm glad I know this now. I will not buy a Gold unit, since I am not a mastering engineer and rarely if ever record only 2 tracks at once. I do record vocals or acoustic guitars 1 or 2 tracks at a time, but a 5 ms delay is too much IMO.

Thanks for the info guys, I stand corrected.
Old 7th January 2013
  #83
Lives for gear
 
scottwilson's Avatar
Quote:
Originally Posted by emitsweet View Post
So you are saying the delays associated with zero-cycle latency within ADCs are not a side effect directly related to the clock pulse?
I'm sorry, I don't understand the term "zero-cycle latency" -- I looked for some sort of textbook definition but can't find any, so forgive me if I'm speaking around it.

The reason I got into this thread is that you seemed to be saying that if two converters were clocked together, they shouldn't have any latency differences, and that if they did, they were broken. That and a few other misconceptions led me to believe it was simply a misunderstanding of ADC technology that was causing the confusion.

So let me take a step back and discuss for a second where I'm coming from.

First of all, there are a few components of any ADC.

The first is the actual sampler. This is the device that periodically takes a measurement of an analog signal and converts it into a discrete signal of some sort. There is a clock that drives this process. The sample frequency could be Fs or some multiple of it (or, in some cases, it is not slaved at all). I won't pretend to understand all the differences and pros/cons of the types, but this is generally done at the silicon level, so that your Crystal or Analog Devices or Burr-Brown chip takes an input signal and a clock and spits out a discrete signal at a data rate higher than the target sample rate.

Second, is the decimating filter. This converts the higher frequency sample rate from the first step into a 24 bit PCM signal at the sample rate we want it to be. This device is generally clocked to the input clock - or at least generates a signal that matches the input clock.

Third is the driver which converts the raw signal into either AES, Symphony, Firewire, USB, or TDM, or whatever.

So there are latencies at each step. The first and the third latencies are trivial at the sample rates we deal with. Microseconds?

The second device, however, is where the long latencies are generated: the 1-3 milliseconds of HD I/O latency, or the 5-10 milliseconds of Lavry or Manley SLAM latency. The reason is that the filters in this unit operate in real time and need time to do their work. This is where you can read in the SLAM manual or the Lavry manual, or even some Burr-Brown docs, about why they design the filters the way they do.

This is why I'm saying that "clocking" has nothing to do with latency. The latency of different devices is PRIMARILY (by multiple orders of magnitude) driven by the design constraints of the decimating filter.
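A back-of-envelope way to see why the decimation filter dominates (my own illustrative numbers, not any vendor's spec): a symmetric, linear-phase FIR with n_taps coefficients delays the signal by (n_taps - 1) / 2 taps at the oversampled rate it runs at.

```python
# Group delay of a symmetric (linear-phase) FIR decimation filter,
# expressed in milliseconds at the base sample rate.

def fir_latency_ms(n_taps, oversample_ratio, base_rate_hz=48_000):
    """(n_taps - 1) / 2 taps of delay, measured at the internal
    oversampled rate of base_rate_hz * oversample_ratio."""
    internal_rate = base_rate_hz * oversample_ratio
    return (n_taps - 1) / (2 * internal_rate) * 1e3

# A short, relaxed filter: tiny latency, gentler response near Nyquist.
print(round(fir_latency_ms(n_taps=64, oversample_ratio=64), 4))     # ~0.0103 ms
# A long, steep, phase-optimized filter: the delay climbs to milliseconds.
print(round(fir_latency_ms(n_taps=12_288, oversample_ratio=64), 2))  # ~2.0 ms
```

So two converters on the same clock, same sample rate, can differ by milliseconds purely from how many taps their designers spent on the filter.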

Quote:
Originally Posted by emitsweet View Post
In addition, are you claiming the master clock of a converter is entirely independent of the edge triggering of internal state machines within the respective device? In other words, that a converter's master clock is not also controlling the low-level ADC pulse and the related sequential circuitry? Then what you are claiming is that a digital device such as an audio A/D converter has multiple controlling pulses, independent of the ADCs and the master clock functionality? Is that what you are stating? Or are you stating that even if there is latency associated with ADCs, it will not affect the digital output, and that delay is not in any way caused by pulse interval accumulation?
Hm. Well, I can say with 100% certainty that I own a converter whose sample frequency is completely independent of any external clock that may be driving the AES digital signal. This may be difficult to understand, and it took me a while too, but if you read about asynchronous sample rate conversion you'll see that this was done to sidestep the trouble with PLLs, which typically yield poorer jitter specs than internal clocking.

As to the second point, I'm saying that latency associated with ADCs is not significantly related to the pulse intervals, as I hope I explained above.

Quote:
Originally Posted by emitsweet View Post
Second to last: are you stating that zero-cycle ADC semiconductor latency and outgoing digital audio sample latency are not related, not one and the same, and not related to the clock pulse in any way? And/or that the latter is not in any way related to the former? And, in conclusion, that latency as it relates to this thread has nothing to do with the path from analog input to the convergence of the ADC output digit(s)? That it is only a result of filtering? And that filtering, though still clock dependent, has resulting latencies that are not due in any way to the clock pulse?
I think earlier edits of this post were a little easier for me to understand. Latency is clearly the difference in timing from input signal to output signal... and that the data rate of the output signal has no significant bearing (to within the sample) on latency - from manufacturer to manufacturer.

Quote:
Originally Posted by emitsweet View Post
Lastly, are you claiming latency is the collective delay inherent to both semiconductor ADCs and digital filtering, without regard to or influence of clocking? If so, what exactly is the delay culprit in such a scenario? So with or without the delays inherent to the clock pulse, this would still be problematic, even disregarding the question of negligibility with respect to phase accuracy?
Yes... latency is the collective delay due to all factors from devices 1, 2 and 3, but 1 and 3 are easily going to be within a sample, and the significant delays are due to device 2 (the decimating filter).


Quote:
Originally Posted by emitsweet View Post
so if the op uses an Apogee and Digidesign unit clocked together and not "mastering" converters, latency will not be an issue? but audio quality may suffer?
Suffer? no. There will be differences in the "quality" of the signal. Is quality a loaded word? I'm not assigning a value judgement there. All engineering is about tradeoffs. If you design a filter meant for lower latency, you're going to have different characteristics than if latency is not one of your driving factors. This results primarily in filters with different phase responses near the cutoff frequency. Clearly people using Avid and Apogee ADC and DACs are not suffering, but there are people who prefer one over another and there are people who can't hear one over another.

s
Old 7th January 2013
  #84
Quote:
Originally Posted by emitsweet View Post
everything he stated was contrary to you prior assertions
No it wasn't.

Quote:
Originally Posted by scottwilson View Post
Actually, I was agreeing with him, and trying to explain WHY different ADC converters will have different latencies... because the latencies are built not into the physical connections, but into the software filters which require a trade-off of phase response near the Nyquist frequency and delay of the signal you're filtering.

I don't think this has clicked yet, so I'll just leave it at that and see how many others jump in here that you want to argue with.

-s
See?

Quote:
Originally Posted by emitsweet View Post
ooops
but he stated that latency and clocking are not at all related, which is incorrect
Clocking is irrelevant to the delay inherent in conversion. Changing the clock won't change this latency - it might make under a sample difference, but it won't make a significant difference. If you want to claim that's "changing the latency" I'll agree with you, but it won't give 2 converters with different latencies the same throughput - it'll just make them spit out samples at the same time.

Quote:
Originally Posted by emitsweet View Post
another condescending chap on GS, it must go with the territory? does it not?
I think you told me I should go back to the basics of learning about conversion, and that you'd designed converters...now who's the pot and who's the kettle?

Quote:
Originally Posted by emitsweet View Post
I'm under the impression we are only discussing standalone converters slaved, and not interfaced with a computer? yes no?
ANY converters have built in latency. Whether they're standalone, interface, part of an HD rig, whatever. It's all the same thing after all - and the same converter is often available standalone and as part of an interface.

Quote:
Originally Posted by emitsweet View Post
he's asking Do you use different AD Converters tracking? nothing about DAW
You could use an HD rig with 3 different converters, all clocked together but with different latencies. How does this add up? Standalone, or interface? A bit of both I'd say.

Quote:
Originally Posted by emitsweet View Post
well yeah several people have confirmed this, but is it negligible?
Yes if you're just tracking. No if you're doing hardware inserts in parallel - you need to be aware of it.

For example, an HD rig with Avid interfaces SHOULD work fine - the system knows the converter latency and accounts for it on hardware inserts. A Lynx converter should also be fine - Lynx claims the same converter latency as the Avid 192s, and it appears to the system as the same. An Apogee? Possibly not - you might get phasing.
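As a sketch of why that phasing matters, here's the comb filtering you get when a dry path is summed with a parallel path carrying a hypothetical uncompensated 4-sample mismatch (the numbers are illustrative, not from any specific converter):

```python
import math

def summed_gain(freq_hz, delay_samples, rate_hz):
    """Peak gain when a sine is summed with a delayed copy of itself:
    |1 + e^(-j*2*pi*f*d/sr)| = 2*|cos(pi*f*d/sr)|."""
    return 2.0 * abs(math.cos(math.pi * freq_hz * delay_samples / rate_hz))

RATE, DELAY = 48000, 4  # 4 samples of unaccounted-for converter latency

print(summed_gain(1000, DELAY, RATE))   # ~1.93: mild loss at 1 kHz
print(summed_gain(6000, DELAY, RATE))   # ~0.0: full cancellation at 6 kHz
print(summed_gain(6000, 0, RATE))       # 2.0: delay-compensated, no comb
```

The first null lands at rate/(2*delay) = 48000/8 = 6 kHz - classic comb filtering, which is exactly what delay compensation on hardware inserts exists to avoid.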

Quote:
Originally Posted by emitsweet View Post
It seems Lavry converters due to their superior quality have worse latency than apogees, according to this thread. If a Lavry Gold converter really has a 5ms delay it is then a completely useless converter for tracking. Unless of course you only track 2 channels at a time.
Or you monitor analogue - not an unheard of proposition for someone with the money to buy a gold.
Old 7th January 2013
  #85
Gear Addict
 

Quote:
Originally Posted by scottwilson View Post
I'm sorry I don't understand the term 'zero cycle latency' -
it's the delay that occurs during what you try to explain in the next paragraph

Quote:
Originally Posted by scottwilson View Post
The first is the actual sampler. This is the device that periodically takes a measurement of an analog signal and somehow converts this to a discrete signal of some sort
it's not "a discrete signal of some sort". At that point it is data, digits, digital information.

I was asking you specifically if the external clocking available in pro converters is the same pulse also controlling the heart of the ADCs or what you call the "Sampler"?



It seems there is a disconnect among multiple posters here as to what latency is, and which specific latency in the audio chain is in question in this thread. Where does this delay accumulate, and what impact does it really have on the phase relationships between two or more synchronized A/D converters? Since several stages of an audio chain can cause delays, which one are we specifically discussing? I don't think any two people here are on the same page. It seems like everyone is on the bash-me page, though ;-)

I can't answer these questions, but it seems you are not sure either, and psycho_monkey is sort of just making a blind claim that "clocking and latency are not related". They are related. Are they related in the context of two converters being synced via a master clock? I'm not sure.

Maybe a converter designer can clear it up. From the perspective of propagation delays, I don't see how there could be no correlation between latency and clocking, since everything in a synchronous digital circuit is controlled by a pulse. I know CPUs have multiple clocks, but does an audio A/D converter have multiple internal clocks? I don't think so, and zero-cycle latency is one place delay occurs; it is partially based on clock-edge trigger delays. Does it accumulate into significant latency? I don't think so. I have used ADCs in the design of proximity and infrared digital sensing applications and there is no build-up, though I realize audio is a different animal.

I have never seen an ADC that uses multiple clock sources. Therefore, if one clock is controlling all aspects of a converter, I don't see how synchronizing two units can introduce that much latency once one unit surrenders its internal clock and depends 100% on an external master. Modern ADCs are so fast I cannot imagine a Lavry having 5 ms of delay. If I'm wrong, I'm wrong - but if I am, I'm not going to be buying a mastering converter any time soon, certainly not a Lavry.

So to summarize: if I clocked a Lavry to an Apogee, the Lavry would send audio to the computer with greater latency, even if the Apogee was clocked to the Lavry? If so, that would mean the audio is not in fact synchronized at the sample level, based on your and others' claims here? Lastly, is any latency caused during the A/D process only related to filtering?
Old 7th January 2013
  #86
Registered User
 
Rick Sutton's Avatar
 

Quote:
Originally Posted by emitsweet View Post
Modern ADCs are so fast I cannot imagine a Lavry having 5 ms of delay. If I'm wrong, I'm wrong - but if I am, I'm not going to be buying a mastering converter any time soon, certainly not a Lavry.
Emit,
I do a lot of mastering and 5ms of latency is not an issue in this application. Really, a 50 ms delay wouldn't be an issue if a higher quality signal was the result.
Old 7th January 2013
  #87
Quote:
Originally Posted by Rick Sutton View Post
Emit,
I do a lot of mastering and 5ms of latency is not an issue in this application. Really, a 50 ms delay wouldn't be an issue if a higher quality signal was the result.
I think his point was he's not going to buy a mastering converter for tracking, not that 5ms was a problem for mastering (or indeed printing mixes).

Which is self evident I suppose...but there you go.
Old 7th January 2013
  #88
Quote:
Originally Posted by emitsweet View Post
It seems there is a disconnect among multiple posters here as to what latency is, and which specific latency in the audio chain is in question in this thread. Where does this delay accumulate, and what impact does it really have on the phase relationships between two or more synchronized A/D converters? Since several stages of an audio chain can cause delays, which one are we specifically discussing? I don't think any two people here are on the same page. It seems like everyone is on the bash-me page, though ;-)
That's kind of what happens when you shout out a falsehood and refuse to concede the point when everyone else knows you're wrong

Quote:
Originally Posted by emitsweet View Post
I can't answer these questions but it seem you are not sure and pych monkey is sort of just making a blind claim that "clocking and latency are not related". They are related. Are they related in the context of 2 converters being synced via master clock? I'm not sure.
Ok - YOU asserted that a correctly clocked converter would have NO latency. This is clearly false - as you've acknowledged.

I'm sure clocking MIGHT affect a converter by +/- 1 sample. But it wouldn't change the fact that there's an inherent conversion delay. It's not going to change the delay from 1 sample to 16 samples or vice versa, and it's not going to make a loopback or hardware insert any more accurate.

Quote:
Originally Posted by emitsweet View Post
Maybe a converter designer can clear it up.
Erm...is it my imagination, or did you earlier in the thread claim to have designed "several" converters? Please clarify. You're rapidly losing credibility in my opinion...maybe you've overstated your credentials?

Quote:
Originally Posted by emitsweet View Post
So to summarize: if I clocked a Lavry to an Apogee, the Lavry would send audio to the computer with greater latency, even if the Apogee was clocked to the Lavry? If so, that would mean the audio is not in fact synchronized at the sample level, based on your and others' claims here? Lastly, is any latency caused during the A/D process only related to filtering?

As I understand it, if both converters were sampling a sine wave, they'd spit out synchronised samples, and hopefully the results would be broadly similar, but the Lavry (Gold, for the purposes of this description) would delay the waveform by a number of samples. This ties in with what I've measured with other converters.
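For anyone who wants to check their own boxes, that offset is easy to measure: record the same click through both units and cross-correlate the captures. A minimal sketch (assumes you've already got the two recordings as sample lists; the 5-sample offset below is simulated, not a real measurement):

```python
def measure_lag(ref, test, max_lag=64):
    """Return the shift (in samples) that best aligns `test` with `ref`,
    via brute-force cross-correlation. Positive = `test` arrives late."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(r * test[i + lag]
                    for i, r in enumerate(ref)
                    if 0 <= i + lag < len(test))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Simulated: converter B outputs the same click 5 samples later than A.
a = [0.0] * 32
a[10] = 1.0
b = [0.0] * 32
b[15] = 1.0
print(measure_lag(a, b))  # 5
```

With real captures you'd use a proper test signal and something like scipy's correlate, but the principle is the same.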
Old 7th January 2013
  #89
Lives for gear
 
jrhager84's Avatar
 

It seems drama follows emit around... Lol

Sent from my SAMSUNG-SGH-I777
Old 7th January 2013
  #90
Lives for gear
 
scottwilson's Avatar
Quote:
Originally Posted by emitsweet View Post
it's not "a discrete signal of some sort". At that point it is data, digits, digital information.
Please see: Discrete signal - Wikipedia, the free encyclopedia

Quote:
Originally Posted by emitsweet View Post
I was asking you specifically if the external clocking available in pro converters is the same pulse also controlling the heart of the ADCs or what you call the "Sampler"?
I hope I cleared up that the only thing the (let's call it the word clock, because that's what it is) word clock directly controls is the output data rate. Some converters will use a PLL to convert that clock to a higher-frequency signal for the actual sampling, and some will not use it at all, instead using asynchronous sample rate conversion. But there are no converters that I know of in existence today that use the word clock directly to drive the sampling of the analog signal.

Quote:
Originally Posted by emitsweet View Post
I have never seen an ADCs that uses multiple clock sources, Therefore If one clock is controlling all aspects of a converter I don't see how synchronizing two units there can be that much latency when one unit surrenders its internal clock and is now dependent 100% on an external master. Modern ADCs are so fast I cannot imagine a Lavry being 5ms of delay. If I'm wrong I'm wrong, but if I'm wrong I'm not going to be buying a mastering converter any time soon, certainly not a lavry
I think you need to get used to the latter case - Though dismissing a mastering converter outright will only be your own loss.

It doesn't matter how "fast" a converter is - digital filtering takes time, because the filter's output depends on events that haven't happened yet... this is how digital filters can be 'better' than analog ones. You can design them to depend on the value of the signal at a future point in time; if that point is 1 second down the line, then you're going to have to design 1 second of latency into your device.

Here's from another wikipedia article (http://en.wikipedia.org/wiki/Digital_filter)

"Since digital filters use a sampling process and discrete-time processing, they experience latency (the difference in time between the input and the response), which is almost irrelevant in analog filters."

"However, digital filters do introduce a higher fundamental latency to the system. In an analog filter, latency is often negligible; strictly speaking it is the time for an electrical signal to propagate through the filter circuit. In digital filters, latency is a function of the number of delay elements in the system."
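The "delay elements" point is easy to demonstrate: any symmetric (linear-phase) FIR kernel of length N delays its input by (N-1)/2 samples, no matter how fast the hardware computes it. A toy sketch with a made-up 9-tap kernel:

```python
def convolve(signal, kernel):
    """Direct-form FIR filtering (plain convolution)."""
    out = [0.0] * (len(signal) + len(kernel) - 1)
    for i, s in enumerate(signal):
        for j, k in enumerate(kernel):
            out[i + j] += s * k
    return out

# A symmetric 9-tap triangular kernel: linear phase by construction.
taps = [1, 2, 3, 4, 5, 4, 3, 2, 1]
norm = [t / 25.0 for t in taps]  # normalize to unity DC gain

impulse = [0.0] * 32
impulse[0] = 1.0
response = convolve(impulse, norm)

# The filter's peak output lands (N-1)/2 = 4 samples after the input:
print(response.index(max(response)))  # 4
```

A real anti-alias filter has far more taps (hence tens of samples of group delay), and steeper stopbands cost more taps - which is the latency-vs-quality tradeoff discussed above.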

Quote:
Originally Posted by emitsweet View Post
So to summarize: if I clocked a Lavry to an Apogee, the Lavry would send audio to the computer with greater latency, even if the Apogee was clocked to the Lavry? If so, that would mean the audio is not in fact synchronized at the sample level, based on your and others' claims here? Lastly, is any latency caused during the A/D process only related to filtering?
Clocking does not sync the audio. It simply syncs the words coming out of the digital side of the converter.

Now I really am done. Either I'm not explaining it as well as I think I am, or you're just holding on to something in your head that isn't allowing you to move past this.

s