Evaluating AD/DA loops by means of Audio Diffmaker
Old 18th February 2021
  #2221
Lives for gear
 
didier.brest's Avatar
 
🎧 10 years
I cannot access your file.
Could you put your file somewhere with free access, mediafire for instance?
Old 18th February 2021 | Show parent
  #2222
Gear Maniac
 
jrasia's Avatar
 
1 Review written
🎧 15 years
My apologies. The link should work now. I had the folder restricted, but it is now publicly available. If there are any further issues, I will try mediafire.
Old 18th February 2021 | Show parent
  #2223
Lives for gear
 
didier.brest's Avatar
 
🎧 10 years
Quote:
Originally Posted by jrasia ➡️
UAD Apollo x16 (...)
One file is the regular DAC to ADC
8.793 µs, 0.3500 dB (L), 0.3626 dB (R), -42.4005 dBFS (L), -43.5589 dBFS (R)

Quote:
Originally Posted by jrasia ➡️
Other is the x16 Master Output DAC to ADC
-83.049 µs, 0.8320 dB (L), 0.8723 dB (R), -42.3787 dBFS (L), -43.5374 dBFS (R)


Quote:
Originally Posted by jrasia ➡️
RME ADI-2 Pro FS with SD Sharp filter
-26.518 µs, -0.0094 dB (L), 0.0054 dB (R), -44.4039 dBFS (L), -45.4661 dBFS (R)

To be added to the next issue of the list of the results.

Loopback tests requested by forum members: Apogee Symphony Desktop, Eventide H9000R (for confirming the one at the top of the list of the results), Pacific Microsonics Model One and Model Two, Slate Digital VRS-8, SSL 2 and 2+. Any other test welcome too!

Last edited by didier.brest; 18th February 2021 at 10:46 PM..
Old 21st February 2021 | Show parent
  #2224
Lives for gear
 
🎧 10 years
Quote:
Originally Posted by didier.brest ➡️
No I can't because I do not know how the correlated null depth from ADM is computed. The ranking is based on the RMS level of the difference file.

When this ranking is not consistent with the ranking according to the correlated null depth, which occurs only for a small part of the tests, I refine the gain and time delay corrections from ADM in Matlab and generate a new difference file in Matlab.

Matlab refining lowers the difference level generally by a small margin, although it is quite significant for both your examples: the result of this extra computation for the MOTU 16A has been given here. For the Symphony, the parameters delivered by ADM are -1 s, 0.042 dB (L), 0.011 dB (R), corr. depth 43.4 dB (L), 45.6 dB (R); the levels of the ADM difference wav file measured in Wavelab are -45.9 dBFS (L), -45.5 dBFS (R); the parameters refined in Matlab are -1.0000128 s, 0.0422 dB (L), 0.009 dB (R); difference: -51.2 dBFS (L), -50.1 dBFS (R).
I realize this is a wildly old post, but isn't this problematic?

Just doing my every-couple-years check in.

But this is like .. "we expected the outcome to be A, but it was 12 degrees colder" so.. "we tossed it on a burner and turned it up to high" ...

You are looking at a ranking for transparency.

In a small set of test results you can't explain why there is a better correlation depth to difference, so you are changing the test material.

If you don't know why, or the significance of the results that led you to want to change the test material, or a practical, qualitative difference between refined and unrefined results, then shouldn't this be the most important question?

Like -- the valuable significance of this whole a** thread literally lives in the reason behind strange outcomes like this and examining why they've happened.

Have there been any recent why's?

Regards
Old 21st February 2021 | Show parent
  #2225
Lives for gear
 
🎧 10 years
Quote:
Originally Posted by Analogue Mastering ➡️
At 44.1 kHz or 96 kHz AD conversion, it can never align exactly again at the same moment in time in the same frame; the correction will always be a subpicosecond compromise, which translates into a roughly -60 dB Matlab accuracy ceiling.
If this is the case, would it make sense that the ceiling, as it correlates to the depth, would be arithmetic and not geometric? Or vice versa?

Example --

Ceiling is -60
Converter (A) is -45
Converter (B) is -30

New Ceiling is -120

would we expect--
Converter (A) is -105
Converter (B) is -95

or--
Converter (A) is -90
Converter (B) is -60

.... if the former, and it could be proven, I would imagine this test could be deemed useful (although at a certain level many would argue that the end result might be .. "these are all acceptable".. lol. A pill that will perhaps never be swallowed to be sure)
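One way to frame the question: if the method's own floor and the converter's residual are uncorrelated, they add in power, not arithmetically in dB. A small sketch of that model (pure arithmetic; the only assumption is that the two residuals are uncorrelated):

```python
import math

def combine_db(*levels_db):
    """Power-sum of uncorrelated residuals given in dB.
    Uncorrelated error sources add in power, not arithmetically in dB."""
    return 10 * math.log10(sum(10 ** (lv / 10) for lv in levels_db))

# Converter residual vs. a -60 dB method ceiling:
print(combine_db(-60, -45))   # close to -45: the converter dominates
print(combine_db(-60, -58))   # pulled up toward -56: the ceiling dominates
```

On this model, lowering the ceiling from -60 to -120 dB leaves a genuine -45 dB converter reading -45 dB either way; it mainly changes the readings of devices already sitting near the old ceiling, so neither the purely arithmetic nor the purely geometric shift in the example above would be expected.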

Last edited by nosleepPDX; 21st February 2021 at 01:02 AM..
Old 21st February 2021 | Show parent
  #2226
Lives for gear
 
🎧 10 years
Quote:
Originally Posted by Analogue Mastering ➡️
Yes they are. Do you really think people press start and stop buttons at exactly the same time? And that system latency is the same for every hardware piece? That's the whole point. You're predominantly demonstrating phase, not depth accuracy.
I've had my road to Damascus moment here.

It's the f*cking phase

Not even saying AM is necessarily correct here .. *I* certainly wouldn't know either way --

But this is the first thing I've read that has made sense as to a _just possible_ variation in the results here
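For what it's worth, the size of this effect is easy to simulate: a residual timing error of even a small fraction of a sample caps the achievable null all by itself. A minimal numpy sketch (the white-noise test signal and the FFT phase-ramp delay are my assumptions, not anything from ADM):

```python
import numpy as np

def fractional_delay(x, frac):
    """Delay a signal by a fractional number of samples via an FFT
    phase ramp (treats the signal as periodic)."""
    n = len(x)
    f = np.fft.rfftfreq(n)              # frequency in cycles/sample
    return np.fft.irfft(np.fft.rfft(x) * np.exp(-2j * np.pi * f * frac), n)

rng = np.random.default_rng(0)
x = rng.standard_normal(1 << 14)        # full-band white-noise "program"
for frac in (0.1, 0.01, 0.001):
    residual = x - fractional_delay(x, frac)
    depth = 20 * np.log10(np.std(residual) / np.std(x))
    print(f"timing error {frac} samples -> null about {depth:.1f} dB")
```

Each factor of ten in timing accuracy buys roughly 20 dB of null on a full-band signal, which is why sub-sample alignment matters so much more here than start/stop button timing.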
Old 21st February 2021 | Show parent
  #2227
Lives for gear
 
🎧 10 years
Quote:
Originally Posted by didier.brest ➡️
If some result in the list of the results could be improved by more accurate time shift and gains values, please tell me which one and with which values.



The official metric for this purpose is the loudness. It is measured according to recommendation ITU-R BS.1770-4, entitled Algorithms to measure audio programme loudness and true-peak audio level, issued by the International Telecommunication Union in 2015. The loudness is expressed in LUFS instead of the dBFS of the RMS level used in this thread, launched in 2011. The loudness of an audio sample is its RMS level at the output of a two-stage filter. The frequency responses of the two stages are given by the attached graphs. There is no way that such a frequency weighting (minimum weight in the audio band is about -13 dB at 20 Hz) would make that



I don't expect that the ranking according to the loudness would be much different from the ranking according to the RMS level of the difference file.
Anyway I am not keen on issuing a new list based on loudness instead of RMS level.
No one would expect you to be keen, but I would expect some interest in the new metric. Having just gone to look, it does seem interesting, and I think it could add some illumination here, especially considering that a lot of what they're getting corresponds to what you're getting.
Old 21st February 2021 | Show parent
  #2228
Lives for gear
 
didier.brest's Avatar
 
🎧 10 years
Quote:
Originally Posted by nosleepPDX ➡️
In a small set of test results you can't explain why there is a better correlation depth to difference, so you are changing the test material.
I did not change the test material; I took a more accurate measurement tool whose workings I know perfectly, unlike ADM, which is a black box. One of its output parameters, the correlated null depth, initially reported in the test results but not used for ranking (except during the first 15 days of this thread), seems to be universally unknown, even to Google.
You should not care about the measurement method. Only the results matter: they can be checked and possibly dismissed without any information about how they were produced. I encourage everybody who can do it (basic skill in digital signal processing, access to some computation tool including FFT, spare time) to do so, because in so long a thread there may be a few errors, for instance from transcribing from Matlab to this thread.
My measurement method delivers RMS level values for the difference between the original and the loopback copy, after aligning the copy with the original in time and in left and right gains. These values are slightly lower (on the order of 1 dB) than the ones measured on the difference wav file produced by ADM, because of more accurate alignment parameters. Somebody with a still more accurate method for measuring these same RMS levels should post an example here, including the link to the loopback file, the precise values of the three alignment parameters and the two RMS level values. Anybody with an alternative method for measuring an alternative performance parameter should create his/her own thread.
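For anyone wanting to check a result, the align-then-subtract step described above can be sketched in a few lines. This is my own minimal reconstruction (integer-sample delay via cross-correlation and a least-squares gain), not his Matlab code, which additionally refines the delay to sub-sample accuracy:

```python
import numpy as np

def rms_dbfs(x):
    """RMS level of a signal in dBFS (full scale = 1.0)."""
    return 20 * np.log10(np.sqrt(np.mean(x ** 2)))

def null_depth_dbfs(original, loopback):
    """Align the loopback copy to the original in time (integer samples
    only) and gain, then return the RMS level of the difference."""
    corr = np.correlate(loopback, original, mode="full")
    lag = int(np.argmax(np.abs(corr))) - (len(original) - 1)
    aligned = np.roll(loopback, -lag)
    # Least-squares gain matching the copy to the original.
    g = np.dot(aligned, original) / np.dot(aligned, aligned)
    return rms_dbfs(original - g * aligned)
```

Run per channel, this yields the time shift, the two gains and the two RMS levels of the kind reported in this thread; the sub-sample delay refinement that separates the Matlab figures from the ADM ones is the part deliberately left out here.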

Last edited by didier.brest; 21st February 2021 at 05:51 PM.. Reason: Completing
Old 21st February 2021 | Show parent
  #2229
Lives for gear
 
didier.brest's Avatar
 
🎧 10 years
Quote:
Originally Posted by nosleepPDX ➡️
I would expect some interest in the new metric,
The new metric you are referring to is the one introduced in this thread by this post. It is motivated by the frequency dependence of human hearing. I already said in a previous post that there is no need for it, because there is already an international standard for replacing the RMS level expressed in dBFS with the loudness expressed in LUFS. The loudness of a digital audio signal is the RMS level of the modified signal obtained by applying a frequency weighting, similarly to what is done to get A-weighted levels. But loudness frequency weighting is very different from A-weighting: instead of a bell curve peaking at 1 dB @ 3 kHz, -50 dB @ 20 Hz, -9 dB @ 20 kHz, the loudness frequency weighting curve increases monotonically from -13 dB @ 20 Hz to +4 dB @ 20 kHz. Hence the value of the loudness expressed in LUFS must lie somewhere between the RMS level expressed in dBFS minus 13 dB and plus 4 dB.
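For concreteness, the two-stage K-weighting behind this loudness measure can be sketched with the 48 kHz biquad coefficients tabulated in BS.1770 (scipy assumed; this computes the ungated loudness only, without the recommendation's 400 ms gating blocks):

```python
import numpy as np
from scipy.signal import lfilter

# BS.1770 K-weighting at 48 kHz: a high-frequency shelf ("pre-filter")
# followed by an RLB high-pass, coefficients as tabulated in the standard.
SHELF_B = [1.53512485958697, -2.69169618940638, 1.19839281085285]
SHELF_A = [1.0, -1.69065929318241, 0.73248077421585]
RLB_B = [1.0, -2.0, 1.0]
RLB_A = [1.0, -1.99004745483398, 0.99007225036621]

def k_weight(x):
    """Apply both K-weighting stages to one channel."""
    return lfilter(RLB_B, RLB_A, lfilter(SHELF_B, SHELF_A, x))

def loudness_lkfs(channels):
    """Ungated BS.1770 loudness of 48 kHz channel arrays:
    -0.691 + 10*log10 of the summed mean square of K-weighted channels."""
    ms = sum(np.mean(k_weight(ch) ** 2) for ch in channels)
    return -0.691 + 10 * np.log10(ms)
```

As a sanity check, a 997 Hz stereo sine at -23 dBFS should read close to -23 LUFS, the EBU R128 reference condition.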
Quote:
Originally Posted by esldude ➡️
the RME converter which rates -46 db nulls in the list here on the new metric scores better than -80 db.
Which means that this new metric is very different from the loudness, although it is partly based on loudness frequency weighting according to the explanation of its inventor, who named it with the initials of his own name. I will certainly take an interest in this new metric when it is awarded international recognition.


Loopback tests requested by forum members: Apogee Symphony Desktop, Eventide H9000R (for confirming the one at the top of the list of the results), Pacific Microsonics Model One and Model Two, Slate Digital VRS-8, SSL 2 and 2+. Any other test welcome too!

Last edited by didier.brest; 28th February 2021 at 05:59 PM..
Old 21st February 2021
  #2230
Lives for gear
 
🎧 10 years
Quote:
Originally Posted by didier.brest ➡️
Quote:
Originally Posted by nosleepPDX ➡️
I would expect some interest in the new metric,
The new metric you are referring to is the one introduced in this thread by this post. It is motivated by the frequency dependency of the human audition. I already said in a previous post that there is no need for it because there is already an international norm for replacing the RMS level expressed in dBFS by the loudness expressed in LUFS. The loudness of a digital audio signal is the RMS level of the modified signal got by applying a frequency weighting similarly to what is done for getting A-weighted levels. But loudness frequency weighting is much different from A-weighting: instead of a bell curve peaking at 1 dB @ 3 kHz, -50 dB @ 20 Hz, -9 dB @ 20 kHz, loudness frequency weighting curve is monotonically increasing from -13 dB @20 Hz to + 4 dB @ 20 kHz. Hence the value of the loudness expressed in LUFS shall be somewhere between the values of the RMS level expressed in dBFS minus 13 dB and plus 4 dB.
Quote:
Originally Posted by esldude ➡️
the RME converter which rates -46 db nulls in the list here on the new metric scores better than -80 db.
Which means that this new metric is very different from the loudness although partly based on loudness frequency weighting in the explanation of its inventor who named it with the initials of his own name. I will undoubtedly give some interest to this new metric when it will be awarded international recognition.


Loopback tests requested by forum members: Apogee Symphony Desktop, Eventide H9000R (for confirming the one at the top of the list of the results), Pacific Microsonics Model One and Model Two, Slate Digital VRS-8, SSL 2 and 2+. Any other test welcome too!
He’s using ISO 226:2003 equal-loudness curves and a Fourier transform. Sure, he named it after himself, but it's based on international standards. The 400 ms window seems to be the standard. Smoothing both the original and the resulting file makes sense.

Not saying it’s gospel here, but it’s interesting: not only does the weighting still seem to line up with many results you’ve got here, it gives you a representation of the time domain and the differences at any given point. Subsample time-domain differences are accounted for (edit: because frequency is considered in the time domain)

A-weighting may be unnecessarily expansive.

It appears that it could be a powerful tool of illumination in _addition_ to what you have here; it seems wildly pertinent, and the weighting is not a reason for dismissal
Attached Thumbnails
Evaluating AD/DA loops by means of Audio Diffmaker-173fcaf0-c45a-4270-b144-7e4a11eebf4d.jpg   Evaluating AD/DA loops by means of Audio Diffmaker-26b69541-c3a9-42bd-a1ba-66dfa3ea5791.jpg  
Old 22nd February 2021 | Show parent
  #2231
Lives for gear
 
esldude's Avatar
 
🎧 5 years
Quote:
Originally Posted by nosleepPDX ➡️
He’s using ISO 226:2003 equal loudness curves, and a Fourier transform. Sure he named it but based on some international standards.. The 400 ms window seems to be the standard. Smoothing both the original and resulting file makes sense ..

Not saying it’s gospel here but that it’s interesting, and not only does the weighting still seem to stack with many results you’ve got here, it gives you a representation of the time domain and differences at any given point. Subsample time domain differences are accounted for (edit: because frequency is considered in the time domain)

A-weighting may be unnecessarily expansive.

It appears that it could be a powerful tool of illumination in _addition_ to what you have here; it seems wildly pertinent, and weighting— not a reason for dismissal
I have found you can get mediocre null results of say -55 dB, and sometimes you'll hear quite a lot of the music in the residual and sometimes you don't. It depends upon what causes the residual. I think the PK metric is a step in the right direction. I find a PK metric of -55 dB has music in the residual, and one of say -80 dB or lower has little at all. So the PK metric moves in the right direction of matching comparable results to how we would, or would not, hear those results.

I too think it would be pertinent enough to be useful, having tried it on a few dozen nulls now. If someone has ideas on how to improve it, Paul, who writes the software, is quite amenable to listening and adding features.

Also, if you think PK A-wtd is overly generous, Paul would likely give us a check box or something so we could have it with and without the weighting.
Old 22nd February 2021 | Show parent
  #2232
Lives for gear
 
didier.brest's Avatar
 
🎧 10 years
Quote:
Originally Posted by nosleepPDX ➡️
He’s using ISO 226:2003 equal loudness curves
These curves, AKA Fletcher-Munson curves, are for pure tones only and have long been known to be inapplicable to measuring audio loudness. See for instance the attached document provided with the Orban loudness meter:

Quote:
After surveying existing equal-loudness contour curves (like the famous Fletcher-Munson set) and finding them inapplicable to measuring the loudness of broadcasts...
The current international standard for audio loudness measurement, ITU‑R BS.1770, is based on K-weighting, which is very different from anything that can be derived from the ISO 226:2003 equal-loudness curves. But even this audio loudness measurement standard is questionable for the purpose of this DA-AD loopback test, where we deal not with the loudness of the audio itself but with the perceptibility, in critical listening, of a distortion much lower than the audio it corrupts. This is why I want to keep the metric as simple as possible. The results are ranked FWIW. Your ears remain the best testing tool. The download links of the loopback files are provided for you to listen to them.


Loopback tests requested by forum members: Apogee Symphony Desktop, Eventide H9000R (for confirming the one at the top of the list of the results), Pacific Microsonics Model One and Model Two, Slate Digital VRS-8, SSL 2 and 2+. Any other test welcome too!
Attached Thumbnails
Evaluating AD/DA loops by means of Audio Diffmaker-loudness-08-wd0wumwwlegn1bdxlf0upfemipmf0cl..jpg  
Attached Files

Last edited by didier.brest; 23rd February 2021 at 12:15 AM..
Old 24th February 2021 | Show parent
  #2233
xav
Gear Head
 
1 Review written
🎧 5 years
Can anyone hear differences between those audio files?
Old 24th February 2021 | Show parent
  #2234
Lives for gear
 
3 Reviews written
🎧 15 years
Quote:
Originally Posted by xav ➡️
Can anyone hear differences between those audio files?

You mean the whole list or any in particular? I've listened to some of them and yes, there are differences in how they handle and present the audio back. Not massive, but they can still be important, especially across lots of tracks or if you send tracks out for analog processing, i.e. looping the DA-AD.
Old 24th February 2021 | Show parent
  #2235
xav
Gear Head
 
1 Review written
🎧 5 years
Quote:
Originally Posted by blayz2002 ➡️
You mean the whole list or any in particular? I've listened to some of them and yes, there are differences in how they handle and present the audio back. Not massive, but they can still be important, especially across lots of tracks or if you send tracks out for analog processing, i.e. looping the DA-AD.
Any difference. Here is a shoot out with 2 converters picked in this test :

Evaluating AD/DA loops shoot out
Old 25th February 2021 | Show parent
  #2236
Lives for gear
 
guigui's Avatar
 
Quote:
Originally Posted by xav ➡️
Any difference. Here is a shoot out with 2 converters picked in this test :

Evaluating AD/DA loops shoot out
Which ones are those?
Old 25th February 2021 | Show parent
  #2237
Lives for gear
 
3 Reviews written
🎧 15 years
Quote:
Originally Posted by xav ➡️
Any difference. Here is a shoot out with 2 converters picked in this test :

Evaluating AD/DA loops shoot out
I think I understand what you're trying to prove or disprove here, but I don't have the time or inclination to do this test, as it would only measure my hearing and my listening environment, and so would prove nothing about the converters.

I guess you are trying to say that even if there are differences in the conversion for the interfaces you selected, those differences are not easy to hear and certainly make no difference ultimately to an untrained consumer listening on crappy playback systems.

There's no point going down that rabbit hole, as I'm sure your position wouldn't change unless you experienced something to change it.

Some people will always want to buy expensive AD/DA because they believe it produces a better outcome for the music that they make. Some will always want to buy the cheapest they can get away with, that provides the functionality that they need.

The results of music made with either of these choices will not be solely defined by the "quality" of the AD/DA, but it can play a role in some genres and in how the listener experiences the music. Take a person who listens to classical music, for example: it is highly likely they have a high-end playback system that could reveal the differences, and they would enjoy the presentation that a high-end converter can provide.
Old 26th February 2021 | Show parent
  #2238
Gear Maniac
 
🎧 5 years
Quote:
Originally Posted by xav ➡️
Any difference. Here is a shoot out with 2 converters picked in this test :

Evaluating AD/DA loops shoot out
I agree with blayz2002.
My opinion on blind tests in general:

There are quite a few people who don't align the tracks in their DAW and switch between them instantly, which is crucial for hearing subtle differences, and who then say "there is no discernible difference at all, doesn't matter".

Then it happens that the files are not level matched properly; a 0.1 dB difference might be enough to flaw the test, though this can be checked beforehand. My converter, for instance, drops the level by 0.2 dB when doing a loopback test.

And lastly, you need to know what to listen for. This takes some time to learn. If you don't know that, all the tracks might seem to sound the same to you, even if they are not.
And imho this also depends on the genre to some extent.
Someone who rarely listens to classical music (like me, for instance) has a harder time hearing subtle differences on those tracks than someone working with such music every day. On the other hand, give me a rock track and I will tell you if there is something going on with that vocal/guitar/kick/snare etc. It is even more evident if it's a song you recorded, mixed or mastered.

When talking about AD/DA, especially in mastering and when working on your mixbus, you want to be sure the converter is helping your sound, be it neutral or "colored".
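The 0.1 dB figure above can be made concrete: with a pure level mismatch and everything else perfect, the deepest null you can reach is fixed by the gain error alone. A small sketch of that bound (my own arithmetic, not taken from any of the thread's tools):

```python
import math

def null_floor_db(gain_error_db):
    """Deepest possible null, in dB below the signal, when the only
    mismatch between two otherwise identical tracks is a level offset."""
    g = 10 ** (-abs(gain_error_db) / 20)
    return 20 * math.log10(1 - g)

# A 0.1 dB mismatch alone caps the null near -39 dB; 0.2 dB near -33 dB.
print(null_floor_db(0.1))
print(null_floor_db(0.2))
```

So an unmatched 0.2 dB loopback level drop would by itself mask anything a converter does below roughly -33 dB, which is why the tests in this thread apply a gain correction before subtracting.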
Old 1st March 2021
  #2239
Lives for gear
 
guigui's Avatar
 
I'd really like to see test results for RME M-32 AD / M-32 DA Pro and/or M-1610 Pro too.

And what Symphony module has been used here?
Quote:
Apogee Symphony I/O Mk II (Diegel)
-0.4 dB (L), -0.4 dB (R), -56.8 dBFS (L), -58.2 dBFS (R)
Old 1st March 2021 | Show parent
  #2240
Lives for gear
 
didier.brest's Avatar
 
🎧 10 years
Quote:
Originally Posted by guigui ➡️
And what Symphony module has been used here?
The only information available from here is Symphony I/O Mk II 2×6 SE.


Loopback tests requested by forum members: Apogee Symphony Desktop, Eventide H9000R (for confirming the one at the top of the list of the results), Pacific Microsonics Model One and Model Two, Slate Digital VRS-8, SSL 2 and 2+, RME M-32 AD Pro / M-32 DA Pro, RME M-1610 Pro.

Last edited by didier.brest; 2nd March 2021 at 12:24 PM..
Old 2nd March 2021 | Show parent
  #2241
Lives for gear
 
guigui's Avatar
 
Quote:
Originally Posted by didier.brest ➡️
The only information available from here is Symphony I/O Mk II 2×6 SE.
Where did you see that?
Old 2nd March 2021 | Show parent
  #2242
Lives for gear
 
didier.brest's Avatar
 
🎧 10 years
Quote:
Originally Posted by guigui ➡️
Where did you see that?
I confused the test of Diegel with this test:

Quote:
Originally Posted by didier.brest ➡️
Apogee Symphony I/O Mk II 2×6 SE card (ziegenh5)
-0.1 dB (L), -0.0 dB (R), -56.6 dBFS (L), -57.9 dBFS (R)



So I can't answer your question.


I edited the list of the results yesterday to add the mention 2×6 SE card to the test of Diegel. I cannot do it again today because the post has become too old to be edited.

I shall do it in the next issue of the list of the results.
Loopback tests requested by forum members: Apogee Symphony Desktop, Eventide H9000R (for confirming the one at the top of the list of the results), Pacific Microsonics Model One and Model Two, Slate Digital VRS-8, SSL 2 and 2+, RME M-32 AD Pro / M-32 DA Pro, RME M-1610 Pro.
Old 2nd March 2021 | Show parent
  #2243
Lives for gear
 
guigui's Avatar
 
Quote:
Originally Posted by didier.brest ➡️
I confused the test of Diegel with this test:





So I can't answer your question.


I edited the list of the results yesterday to add the mention 2×6 SE card to the test of Diegel. I cannot do it again today because the post has become too old to be edited.

I shall do it in the next issue of the list of the results.
Loopback tests requested by forum members: Apogee Symphony Desktop, Eventide H9000R (for confirming the one at the top of the list of the results), Pacific Microsonics Model One and Model Two, Slate Digital VRS-8, SSL 2 and 2+, RME M-32 AD Pro / M-32 DA Pro, RME M-1610 Pro.
No problem, man. Thanks!
Old 2nd March 2021
  #2244
Here for the gear
 
Hi Didier,

Here are my files using an RME ADI-2 Pro FS R, both as a DAC and ADC:

https://drive.google.com/file/d/18gu...ew?usp=sharing
https://drive.google.com/file/d/1UzQ...ew?usp=sharing
https://drive.google.com/file/d/1R6Z...ew?usp=sharing

I'm getting a correlated null depth of around 57dB with DeltaWave (no clock drift correction).

Cheers, Mani.
Old 2nd March 2021 | Show parent
  #2245
Lives for gear
 
didier.brest's Avatar
 
🎧 10 years
Quote:
Originally Posted by manisandher ➡️
Here are my files using an RME ADI-2 Pro FS R, both as a DAC and ADC
Why three tests of the same DA-AD converter? Different settings? Which ones?

1. RME ADI-2 Pro FS - Gearslutz DA_AD test.wav
12.270887 ms, 0.2191 dB (L), 0.2194 dB (R), -53.6186 dBFS (L), -54.8152 dBFS (R)

2. RME ADI-2 Pro FS - Gearslutz DA_AD test.wav
47.010116 ms, 0.2191 dB (L), 0.2194 dB (R), -53.6177 dBFS (L), -54.8143 dBFS (R)

3. RME ADI-2 Pro FS - Gearslutz DA_AD test.wav
3.631431 ms, 0.2191 dB (L), 0.2194 dB (R), -53.6188 dBFS (L), -54.8154 dBFS (R)
Old 3rd March 2021
  #2246
Lives for gear
 
guigui's Avatar
 
Here's my attempt at a contribution. I hope I haven't done anything wrong.

I did two runs, with the 4K analogue colour enhancement off and on, respectively. Both with input gain at around 4 (out of 10) and Monitor Level at around 7 (out of 11). I used two 10ft Mogami Gold TRS cables.

SSL 2

SSL 2 4K

If there's anything wrong, please let me know.

Cheers!
Old 3rd March 2021 | Show parent
  #2247
Lives for gear
 
didier.brest's Avatar
 
🎧 10 years
Quote:
Originally Posted by guigui ➡️
Here's my (attempt of) contribution. I hope I haven't done anything wrong.
It's OK. Could you confirm that it is an SSL 2? I did not find any SSL SS2.

SSL SS2
-2.091130 ms, 8.5408 dB (L), 8.3678 dB (R), -46.2245 dBFS (L), -47.5912 dBFS (R)

SSL_SS2 4K
-2.085787 ms, 6.6597 dB (L), 6.4446 dB (R), -40.4355 dBFS (L), -40.2084 dBFS (R)

To be added to the next issue of the list of the results.

Loopback tests requested by forum members: Apogee Symphony Desktop, Eventide H9000R (for confirming the one at the top of the list of the results), Pacific Microsonics Model One and Model Two, Slate Digital VRS-8, SSL 2 and 2+, RME M-32 AD Pro / M-32 DA Pro, RME M-1610 Pro.

Last edited by didier.brest; 5th March 2021 at 11:02 PM.. Reason: correcting typo
Old 3rd March 2021 | Show parent
  #2248
Here for the gear
 
Quote:
Originally Posted by didier.brest ➡️
Why three tests of the same DA-AD converter?
To have more confidence in the results. DeltaWave was giving me correlated null depths between 57dB and 62dB, without changing any settings.

Quote:
Originally Posted by didier.brest ➡️
Different settings ? Which ones ?
Same settings - Sharp filter used for all three.

Mani.
Old 3rd March 2021 | Show parent
  #2249
Lives for gear
 
guigui's Avatar
 
Quote:
Originally Posted by didier.brest ➡️
It's OK. Could you confirm that it is an SSL 2? I did not find any SSL SS2.

SSL SS2
-2.091130 ms, 8.5408 dB (L), 8.3678 dB (R), -46.2245 dBFS (L), -47.5912 dBFS (R)

SSL_SS2 4K
-2.085787 ms, 6.6597 dB (L), 6.4446 dB (R), -40.4355 dBFS (L), -40.2084 dBFS (R)

To be added to the next issue of the list of the results.

Loopback tests requested by forum members: Apogee Symphony Desktop, Eventide H9000R (for confirming the one at the top of the list of the results), Pacific Microsonics Model One and Model Two, Slate Digital VRS-8, SSL 2 and 2+, RME M-32 AD Pro / M-32 DA Pro, RME M-1610 Pro.
I have no idea why I thought it was SS2.

I've already corrected it.
Old 4th March 2021
  #2250
Lives for gear
 
didier.brest's Avatar
 
🎧 10 years
Genex GXD8 ---> Metric Halo LIO-8 (Antoine Fabi)
-727.44 µs, 0.4477 dB (L), 0.3871 dB (R), -55.7013 dBFS (L), -56.2370 dBFS (R)

To be added to the next issue of the list of the results.

Loopback tests requested by forum members: Apogee Symphony Desktop, Eventide H9000R (for confirming the one at the top of the list of the results), Pacific Microsonics Model One and Model Two, Slate Digital VRS-8, SSL 2+, RME M-32 AD Pro / M-32 DA Pro, RME M-1610 Pro.

Last edited by didier.brest; 4th March 2021 at 09:06 PM..