T/F: A good external clock can improve a standalone converter's sound
Old 21st June 2006
Gear Nut

Any chance you could make us some sounds for comparison?
Old 21st June 2006
Originally Posted by jonnyclueless
During a mix the producer asked me if I thought he should get a Big Ben to improve the sound of his home studio. I told him about my experiences and about the topic of this thread, etc. I told him the only way to make such a decision was to try it. So I clocked his system to my Big Ben to see if he preferred it or even heard a difference. He was so stunned by the improvement that he bought one that day. So maybe we're all imagining things, but as long as it's a drastic difference in our minds, we're all happy.

Well, I think the point people like Dan Lavry are making is not that anyone is imagining anything, or that people don't prefer one sound over another (which, as he is careful to point out, is completely subjective).

He's saying that the difference people are apparently hearing comes from more jitter, not less, because an external clock will almost always produce more jitter than an internal clock in a modern pro device. (Again, this is according to Lavry, who is not only a well-respected, university-trained digital design EE, but makes some of the most desirable, and expensive, converter interfaces around.) The reasons are neither mysterious nor controversial, but rather basic digital audio reality: the converter has to recover the external clock through a PLL, and the cable run plus the recovery circuit add timing noise that a short, on-board crystal path avoids.
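To put rough numbers on why jitter matters at all, here's a toy Python sketch of the basic mechanism: a jittered sample clock turns timing error into amplitude error. The jitter figure, tone frequency, and everything else here are illustrative assumptions, not measurements of any real converter or clock.

```python
# Toy model: clock jitter converts timing error into amplitude error.
# JITTER_RMS is an assumed illustrative figure, not a measurement of
# any real converter or external clock.
import math
import random

FS = 48_000          # sample rate, Hz
FREQ = 1_000.0       # test tone frequency, Hz
N = 4_096            # number of samples to simulate
JITTER_RMS = 1e-9    # assumed 1 ns RMS sample-clock jitter

random.seed(0)

def tone(t: float) -> float:
    """Full-scale sine test tone."""
    return math.sin(2.0 * math.pi * FREQ * t)

# Compare sampling at the ideal instants vs. at jittered instants.
err_sq = 0.0
for n in range(N):
    t_ideal = n / FS
    t_jittered = t_ideal + random.gauss(0.0, JITTER_RMS)
    err_sq += (tone(t_jittered) - tone(t_ideal)) ** 2

rms_error = math.sqrt(err_sq / N)
signal_rms = 1.0 / math.sqrt(2.0)   # RMS of a full-scale sine
snr_db = 20.0 * math.log10(signal_rms / rms_error)

print(f"RMS error from jitter: {rms_error:.3e}")
print(f"Jitter-limited SNR:    {snr_db:.1f} dB")
```

The simulated error tracks the textbook approximation (error RMS of about 2*pi*FREQ*JITTER_RMS/sqrt(2)), and it scales with signal frequency, which is one reason jitter audibility can be so material-dependent.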

That might sound odd, but look around. Everywhere we turn in the recording arena, we're making intentional choices that take us farther away from literal accuracy.

We don't use flat, neutral mics all that often, except in calibration work. We prize heavily colored, "characterful" preamps over technically more accurate ones. Many of us even run our signals onto magnetic tape at some point in an otherwise digital recording chain, and that's not to "recapture the purity" of analog tape, particularly given that in many cases much of the rest of the chain is digital.

Lavry's bête noire, if you will, is not so much that people prefer the sound of external clocks even where they're not necessary (as on a single device), but that people keep insisting the external clock produces less jitter when, he says, it's basic digital science that it will tend to produce more.

PS... I thought jonnyclueless's anecdote was going to turn out to be one of those "dumb producer" stories... heh. We haven't had a "dumb producer" thread around here in a few weeks, seems like... I think it's due.
Old 21st June 2006
Lives for gear

No, so far the two people I have done the test with have sold millions and millions of albums. And both are extremely detail-oriented, to the extent that I often sit with them moving regions by single samples (I don't hear a difference at that level, but they do), making constant changes by tenths of a dB, or making pitch changes by hundredths of a cent.

If more jitter is what we hear, then I want a system with more jitter. Because the end result has been that everything sounds tighter, more focused, and has more depth. I tried the test on a recent dance mix and could barely tell a difference, one of those "am I just imagining the difference?" situations. But on the other material, which was more organic, the differences were night and day.

So maybe the difference is material-dependent, or dependent on the recording quality (though the dance stuff was probably the worst quality ever).
Old 22nd June 2006

[In case anyone is wondering -- the asshole immediately above took it upon himself to completely modify the quote he attributed to me. Feel free to judge me on what I DID write. But don't be confused by this lying wank. Cheers.]
Old 22nd June 2006
Lives for gear

Originally Posted by theblue1

What he is saying is that it is almost certainly less accurate than if the device was using its own internal clock.

It's certainly possible that people will like that more... but as Lavry points out, that difference is mostly attributable to increased jitter, not reduced jitter.

Here's my issue with that.

An A/D converter is not the only component in a digital recording chain. To laud the virtues of standalone, internal clocking is to have a very "bedroom studio" view of the professional recording world. Necessarily, you also require some sort of capture methodology. How will you clock the two together? Use the A/D converter as the master clock? OK, does that mean that the recording interface itself is now jitter compromised, since it is not self clocked?

What if you have more than one A/D converter, as is common in many scenarios? Standalone clocking is out the window at that point, even if you were OK with an A/D as the master clock in concept. Is Dan saying that every A/D scenario employing more than one A/D converter unit is technically compromised?

What I'm saying is that the view presented is from the perspective of a converter manufacturer. As such, it is interesting (and probably accurate) from a clinical point of view, i.e. evaluating a single piece of equipment in isolation. However, count up the number of clock-carrying cables in your setup: unless yours is the simplest possible arrangement, you are feeding clock down a cable to more than one device.

My point is that, even assuming you can obtain lower jitter from a single A/D via internal clocking, you are not necessarily evaluating anyone's actual, real-world system. And there are many situations, especially in larger systems, where a star-topology clocking scheme with equal-length, identical cables could potentially outperform, sonically, using the A/D as master clock and then passing the clock signal around via a less symmetrical interconnect. And even that presupposes only one A/D converter is required.
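The intuition behind the star-versus-chain argument can be sketched with a toy model. The key assumption, which I'll flag clearly, is that each cable hop adds independent Gaussian timing noise; real gear re-times incoming clock with PLLs, so this is only a back-of-envelope picture, and the per-hop figure is made up for illustration.

```python
# Toy model: jitter accumulation in two clock-distribution topologies.
# SIGMA_HOP and the independent-per-hop-noise assumption are
# illustrative only; real devices clean up clock with PLLs.
import math
import random

random.seed(1)

SIGMA_HOP = 0.5e-9   # assumed per-hop timing noise, 0.5 ns RMS
HOPS = 8             # devices daisy-chained off the master clock
TRIALS = 20_000

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

# Daisy chain: the last device is HOPS hops from the master, so
# independent per-hop noise accumulates (RMS grows like sqrt(HOPS)).
daisy = [sum(random.gauss(0.0, SIGMA_HOP) for _ in range(HOPS))
         for _ in range(TRIALS)]

# Star: every device is exactly one hop from the master clock.
star = [random.gauss(0.0, SIGMA_HOP) for _ in range(TRIALS)]

daisy_rms = rms(daisy)
star_rms = rms(star)
print(f"daisy-chain RMS jitter at last device: {daisy_rms:.3e}")
print(f"star RMS jitter at any device:         {star_rms:.3e}")
print(f"ratio: {daisy_rms / star_rms:.2f}  (sqrt({HOPS}) = {math.sqrt(HOPS):.2f})")
```

Under those assumptions the worst device in an 8-deep chain sees roughly sqrt(8), about 2.8x, the timing noise of a star-fed device, which is the sort of asymmetry a symmetrical star layout is meant to avoid.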

Dan's a smart guy, and I'm sure his statement is theoretically correct, but it has limited real-world impact, IMHO.

FWIW, the facility I administer has twelve 24-channel A/D converters, fourteen 24-channel D/A converters, 14 RME MADI cards, and a multitude of other interconnected digital devices requiring clocking. I have thought about this before, in the real world heh
Old 22nd June 2006

I think the upshot of what Lavry is saying is that, yeah, once you route clock from one device to another, there is a greater likelihood of jitter induced by transmitting the clock signal.

But, obviously, as you suggest, in all but relatively simple systems you are indeed faced with the necessity of routing clock from device to device. (My very first DAW rig, back in '96, faced precisely that issue, comprised as it was of two ADATs and a BRC [remember those?] hooked into the computer via an original Frontier Designs ADAT I/O card.) And, while I'm not really conversant in the issues, I could certainly find it easy to believe that a star topology clustered around a solid master clock source would pose fewer problems.

I think Lavry's basic point is that using an external clock source for a single interface will not (under normal circumstances with modern pro gear) produce less jitter than using the device's internal clock.

Again, I'm certainly no expert (and I think one can see the evolution of my thinking in my posts in this thread -- the real ones, not the fake/modified quote from "JuanPabloCuervo" above)... but I got interested in the specific question of whether an external clock could actually improve the accuracy of a single audio interface mainly because so many people were going on about it, and it flew in the face of everything I had THOUGHT I knew about what word clock is all about. So I started this thread to try to get to the bottom of it, once and for all. (And, of course, I've also done some outside reading.)

Now, I think it's very important to stipulate that the effects of external clocking on a single device may indeed sound better to some folks. If they say so, how can you argue with that? It is, after all, entirely subjective. As jonnyclueless said above, "If more jitter is what we hear, then I want a system with more jitter."

As I tried to suggest above, we make all kinds of choices in recording that take us farther away from absolute accuracy -- it's part of the art of recording.

If someone really likes the sound he's getting from doing something that "breaks the rules" -- who am I to say he shouldn't?

As someone said a long time ago (Stevie Winwood? Ginger Baker?)... Do what you feel.

Old 22nd June 2006
Lives for gear

I would like to reiterate that while on some material the external clock makes a night-and-day difference, on other material I can hear no difference whatsoever.
Old 22nd June 2006
Lives for gear
Originally Posted by jonnyclueless
I recently did a test with a client. We used a 192 and switched between its internal clock and a Big Ben. The difference was unbelievable. It was so drastic I bought one the next day. I never imagined a clock would make such a difference. It sounded really blurry without the Big Ben. I never would have noticed until the comparison.
My experience with the 16X clock on Digi 192s was very positive also. The 16X itself was closer to the original material, IMO, while the 192 wasn't quite as good but was greatly improved -- very acceptable with the external clock.