(I'll talk about the cat in another post and stick to the topic with this one.)
Having just (2 days ago counts) completed an unscientific word clock experiment with the AD16x as master and slave, I'd have to say that my expectations were largely met.
The test was unscientific in that there were no blind A/B comparisons or control groups, just a quick and dirty impression gathered by comparing what I'm used to hearing with what I just heard. If, for you, that invalidates or diminishes what follows, then don't bother reading the rest. I'll type about what I think of the AD16x first, followed by why I think there are any differences between the clocks of various converters and dedicated clocks.
My voice can be added to those of the happy campers who hear an improvement when using the AD16x's clock as the master to the 192. The difference to me was not unbelievably obvious, but a smallish yet significant enough improvement that if my setup required a master clock, I would buy and use the BB, or perhaps the Antelope Isochrone. I bought the AD16x to supply 16 channels of AD that I expected would be superior to those in the 192 (of which I have 8 channels), and to my ears, the AD16x wins.
Farming out the unoriginal adjectives to describe the difference:
The AD16x is silky in the top, clearer and more focused in the midrange (this area is the most obviously different to me), and tighter and not as sloppy as the 192 in the low end, and "more analog" (to quote my guitar player); the 192, meanwhile, is brittle "up there", sterile all around and papery. Still, the differences are not as huge as I expected or had read that they could be.
Oh yeah, I forgot to mention the source material, which is something I'm very accustomed to working with. My method of songwriting is improvisation (jamming) with a musical cohort who plays guitar or bass while I try to coerce the truth out of the drums. We've done this for 4 years now, and the best parts of almost every session have been mixed by me and burned to CD. I'm so used to that sound that any minor difference between sessions is clearly audible to me.
I never test unknown equipment on clients and I think it's risky practice for those who do.
Strangely, in another test where I didn't expect to hear any difference, the AD16x's clock also seemed to change the sound of well-mixed popular commercial material. The music of Beck (Sea Change), Ry Cooder (Chavez Ravine), Prince (Musicology), Massive Attack (Protection) and a few others were made to participate while the AD16x kept a watchful eye over the 192. Again, the silky top, tighter bottom and clear mids came to the fore. I expected no difference because I thought that jitter performance could not change when playing back digital source material - that only the AD stage is affected. Perhaps I have misunderstood what has been written on this area of the subject?
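One way to square that (and the reason I may have had it backwards) is that any conversion stage running off a clock turns timing error into amplitude error - the D/A during playback just as much as the A/D during tracking. Here's a minimal numerical sketch of that idea in Python; the sample rate, test tone and 1 ns RMS jitter figure are all made up purely for illustration, not measurements of any of the boxes above:

```python
import numpy as np

# All of these figures are made up for illustration only.
fs = 96_000          # sample rate, Hz
tone = 10_000        # test tone frequency, Hz
jitter_rms = 1e-9    # assumed 1 ns RMS clock jitter (not a measured spec)

rng = np.random.default_rng(0)
n = np.arange(fs)                                      # one second of samples
t_ideal = n / fs                                       # where the samples *should* land
t_real = t_ideal + rng.normal(0, jitter_rms, n.size)   # where a jittery clock puts them

# A converter clocked with jitter effectively samples (A/D) or reconstructs
# (D/A) the waveform at slightly wrong instants, so the error is roughly the
# signal's slew rate times the timing error.
x_ideal = np.sin(2 * np.pi * tone * t_ideal)
x_real = np.sin(2 * np.pi * tone * t_real)

err = x_real - x_ideal
snr_db = 10 * np.log10(np.mean(x_ideal**2) / np.mean(err**2))
print(f"Jitter-limited SNR for a {tone} Hz tone: {snr_db:.1f} dB")
```

With those made-up numbers it prints an SNR around 84 dB for the 10 kHz tone; scale the jitter or the frequency and the number moves with it, which is the point: whichever clock the D/A follows during playback is part of the picture, not just the A/D.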
So... given that the laws of physics are not defied by any external word clock and that, theoretically, there should be no measurable improvement in jitter performance for a slaved unit - yet there are myriad posts hailing the BB clock as the best thing for digital audio since blackface ADATs were finally admitted into the Museum of Suckage - what is the middle ground of understanding?
I don't have the answer, but I have a theory: many (if not all) digital audio manufacturers are not 100% sure of what they're doing when they build a device with its own internal clock. They only build something that
audibly tests better than that of their closest competitor, and market it as aggressively as they can behind the favorable words of the most popular audio engineer they can find. This is a very common model for the way scads of popular products come to flourish in most markets outside of the audio world. Why do we think it's any different for us??
Chew it.