Topic: Error levels in ADCS used in comparisons

Error levels in ADCS used in comparisons

Hello Chaps,
        I have been interested in the issue of jitter audibility (or not) for some time. In another place that rhymes with dead-fi there is a discussion which turned to the effect of using reclocking devices to replace a USB----DAC chain with a USB----Reclocker----Toslink-------DAC chain. The proponents of the reclocker maintain that the USB mode of digital transfer is so bad jitter-wise that taking the alternative route will leave the signal with so much less jitter that the sound will be vastly improved, assuming a DAC with normal jitter rejection.

My argument was that if lowering jitter in the link to a DAC makes so much difference, it would be easily measurable in the analog output of the DAC, which one could route to an ADC, bung through a spectral analysis and do the maths on. This is trivial. I have done analyses of samples with differing levels of jitter, and the level of jitter does alter the spectra, albeit to a small degree.

The reclocker camp came back and said that a cheapo USB ADC used to capture such differences cannot do so. To which my response is that any competent ADC has a known level of quantization error, which is typically about 1/2 an LSB. So for a difference in samples to be swamped by the quantization error, the difference must be less than 1/2 an LSB or thereabouts, i.e. any difference of less than about 0.015 mV when using a 16-bit system is lost.
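For what it's worth, here is the back-of-the-envelope arithmetic behind that 0.015 mV figure (the 2 V peak-to-peak full-scale range is just my assumption for a typical consumer ADC input, not a value from any particular device):

Code:
full_scale_vpp = 2.0          # assumed ADC full-scale input range, volts peak-to-peak
bits = 16

lsb_volts = full_scale_vpp / 2**bits      # size of one quantization step
half_lsb_mv = 1000 * lsb_volts / 2        # half a step, in millivolts

print("1 LSB   = %.4f mV" % (1000 * lsb_volts))   # about 0.031 mV
print("1/2 LSB = %.4f mV" % half_lsb_mv)          # about 0.015 mV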

My question is twofold: am I way off beam (I don't mind being called ignorant), and if not, is a difference of less than 0.015 mV even likely to be audible?
This is not a signature !

Error levels in ADCS used in comparisons

Reply #1
In the USB->DAC case jitter is determined solely by the DAC. USB plays no part in the clocking other than supplying the data early enough for the DAC to clock it.

Error levels in ADCS used in comparisons

Reply #2
In the USB->DAC case jitter is determined solely by the DAC. USB plays no part in the clocking other than supplying the data early enough for the DAC to clock it.


Thanks, so jitter in USB is basically nonsense?
This is not a signature !

Error levels in ADCS used in comparisons

Reply #3
In the USB->DAC case jitter is determined solely by the DAC. USB plays no part in the clocking other than supplying the data early enough for the DAC to clock it.


Thanks, so jitter in USB is basically nonsense?


If you look at the USB DAC measurements published in Stereophile, you can see that USB DACs vary enormously in their ability to clock the D/A processor chip with high accuracy. This is because while the long-term average transmission rate of the incoming data will be the sample rate specified for the audio, the instantaneous rate actually changes all the time. Typically, the USB receiver chip in the most common "adaptive" mode polls the incoming data every millisecond and adjusts the data clocking accordingly. The problem is that the lower-frequency timing variations in the incoming data tend to affect the timing of the DAC clock. Different designers use different strategies to try to eliminate this bleedthrough, some more successfully than others.
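As a rough illustration of that bleedthrough (a toy model of my own, not the algorithm of any actual receiver chip), imagine the receiver nudging its output clock towards the measured incoming rate once per millisecond with a simple first-order tracking loop: fast packet-to-packet noise is smoothed out, but slow drift in the source rate passes almost straight through to the DAC clock.

Code:
import numpy as np

rng = np.random.default_rng(0)

fs_nominal = 44100.0      # nominal sample rate, Hz
dt = 1e-3                 # receiver re-estimates the incoming rate every 1 ms
n = 20000                 # 20 seconds of 1 ms updates
alpha = 0.02              # loop gain of the assumed first-order tracking filter

def recovered_clock_deviation(incoming_rate):
    # DAC clock is repeatedly nudged towards the measured incoming data rate.
    clk = np.empty(n)
    clk[0] = fs_nominal
    for i in range(1, n):
        clk[i] = clk[i - 1] + alpha * (incoming_rate[i] - clk[i - 1])
    return clk - fs_nominal

t = np.arange(n) * dt
slow = 5.0 * np.sin(2 * np.pi * 0.5 * t)   # slow 0.5 Hz wander in the source rate, +/-5 Hz
fast = 5.0 * rng.standard_normal(n)        # fast packet-to-packet noise, std 5 Hz

print("slow wander: +/-5.0 Hz in, +/-%.1f Hz out" % np.abs(recovered_clock_deviation(fs_nominal + slow)).max())
print("fast noise : std 5.0 Hz in, std %.2f Hz out" % recovered_clock_deviation(fs_nominal + fast).std())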

For a very good example of adaptive USB DAC design, see figs.12 and 13 at http://www.stereophile.com/content/centran...er-measurements .

For a bad example, see fig.8 at http://www.stereophile.com/content/hrt-mus...-measurements-0 .

John Atkinson
Editor, Stereophile

Error levels in ADCS used in comparisons

Reply #4
In the USB->DAC case jitter is determined solely by the DAC. USB plays no part in the clocking other than supplying the data early enough for the DAC to clock it.


Thanks, so jitter in USB is basically nonsense?


If you look at the USB DAC measurements published in Stereophile, you can see that USB DACs vary enormously in their ability to clock the D/A processor chip with high accuracy. This is because while the long-term average transmission rate of the incoming data will be the sample rate specified for the audio, the instantaneous rate actually changes all the time. Typically, the USB receiver chip in the most common "adaptive" mode polls the incoming data every millisecond and adjusts the data clocking accordingly. The problem is that the lower-frequency timing variations in the incoming data tend to affect the timing of the DAC clock. Different designers use different strategies to try to eliminate this bleedthrough, some more successfully than others.

For a very good example of adaptive USB DAC design, see figs.12 and 13 at http://www.stereophile.com/content/centran...er-measurements .

For a bad example, see fig.8 at http://www.stereophile.com/content/hrt-mus...-measurements-0 .

John Atkinson
Editor, Stereophile


Thanks for the links, interesting. The HRT (an unfortunate TLA) looks pretty bad, but its jitter seems to be the least of its problems; its IMD and THD, and especially its linearity, are pretty gruesome. Seems to me that these would almost certainly swamp any audible effect of jitter. Since USB is easily fast enough to supply the data, why not just take it from the receiver, bung it into a separate buffer and then clock it out from there? A bit of handshaking to establish the sample rate and off you go.
This is not a signature !

Error levels in ADCS used in comparisons

Reply #5
You might have a look at asynchronous USB.

Now the DAC times the audio out of the PC, so it doesn't have to adjust its speed to the send rate as is the case in adaptive mode.


http://thewelltemperedcomputer.com/KB/USB.html
TheWellTemperedComputer.com

Error levels in ADCS used in comparisons

Reply #6
You might have a look at asynchronous USB.

Now the DAC times the audio out of the PC, so it doesn't have to adjust its speed to the send rate as is the case in adaptive mode.


This is the best way to do it, but I didn't mention it in my earlier response to keep things simple. You can see a jitter measurement of a USB DAC operating in asynchronous mode in fig.14 at http://www.stereophile.com/content/ayre-ac...ac-measurements .

John Atkinson
Editor, Stereophile

Error levels in ADCS used in comparisons

Reply #7
In the USB->DAC case jitter is determined solely by the DAC. USB plays no part in the clocking other than supplying the data early enough for the DAC to clock it.


Thanks, so jitter in USB is basically nonsense?


If you look at the USB DAC measurements published in Stereophile, you can see that USB DACs vary enormously in their ability to clock the D/A processor chip with high accuracy. This is because while the long-term average transmission rate of the incoming data will be the sample rate specified for the audio, the instantaneous rate actually changes all the time. Typically, the USB receiver chip in the most common "adaptive" mode polls the incoming data every millisecond and adjusts the data clocking accordingly. The problem is that the lower-frequency timing variations in the incoming data tend to affect the timing of the DAC clock. Different designers use different strategies to try to eliminate this bleedthrough, some more successfully than others.

For a very good example of adaptive USB DAC design, see figs.12 and 13 at http://www.stereophile.com/content/centran...er-measurements .

For a bad example, see fig.8 at http://www.stereophile.com/content/hrt-mus...-measurements-0 .


Thanks for the links, interesting. The HRT (an unfortunate TLA) looks pretty bad, but its jitter seems to be the least of its problems; its IMD and THD, and especially its linearity, are pretty gruesome. Seems to me that these would almost certainly swamp any audible effect of jitter. Since USB is easily fast enough to supply the data, why not just take it from the receiver, bung it into a separate buffer and then clock it out from there? A bit of handshaking to establish the sample rate and off you go.


That is the basic block diagram for the better adaptive-mode devices, but the problem is that the need to "handshake," so that the buffer doesn't under- or overflow, introduces a path whereby the receiver clock uncertainty "leaks" into the DAC clock. It's pernicious. As another poster just noted, the true solution for USB DACs is to use the USB receiver in asynchronous mode, whereby the DAC clock controls the data flow from the PC and you can use a fixed-frequency, high-precision crystal oscillator. However, even though the parts cost is not much higher - you need to use something like a TAS102B rather than a BB270?-series USB chip - there are currently only a handful of USB DAC products available that operate in asynchronous rather than adaptive mode.
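To make the asynchronous case concrete, here is a deliberately simplified toy model (my own sketch with made-up numbers, not the actual USB audio feedback protocol): the DAC consumes samples at whatever rate its local crystal dictates and simply tells the host, once per feedback interval, how many samples to send next, so the host's timing never touches the conversion clock and the buffer level stays put.

Code:
dac_rate = 44099.7        # samples/s actually consumed, set only by the DAC's local crystal
feedback_period = 1e-3    # DAC reports its data demand to the host once per millisecond
n_intervals = 5000

fifo = 100                # samples pre-buffered in the DAC before playback starts
carry = 0.0               # fractional samples carried over between intervals
request = int(dac_rate * feedback_period)
fill_history = []

for _ in range(n_intervals):
    fifo += request                       # host sends exactly what the DAC asked for

    carry += dac_rate * feedback_period   # DAC consumes at its own fixed crystal rate
    consumed = int(carry)
    carry -= consumed
    fifo -= consumed

    fill_history.append(fifo)
    request = consumed                    # feedback: ask only to replace what was consumed

print("buffer fill stayed between", min(fill_history), "and", max(fill_history), "samples")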

John Atkinson
Editor, Stereophile

Error levels in ADCS used in comparisons

Reply #8
My argument was that if lowering jitter in the link to a DAC makes so much difference, it would be easily measurable in the analog output of the DAC, which one could route to an ADC, bung through a spectral analysis and do the maths on. This is trivial. I have done analyses of samples with differing levels of jitter, and the level of jitter does alter the spectra, albeit to a small degree.

The reclocker camp came back and said that a cheapo USB ADC used to capture such differences cannot do so. To which my response is that any competent ADC has a known level of quantization error, which is typically about 1/2 an LSB. So for a difference in samples to be swamped by the quantization error, the difference must be less than 1/2 an LSB or thereabouts, i.e. any difference of less than about 0.015 mV when using a 16-bit system is lost.

1/2 LSB would be the theoretical minimum quantization error. To my knowledge, no 24-bit ADC or DAC achieves this. Most audio ADCs and DACs are 24-bit these days. Actual performance depends on the manufacturer's specifications and how well the design was executed - PCB layout, power-supply filtering and, yes, jitter. The best way to measure performance is to use calibrated test equipment.

Error levels in ADCS used in comparisons

Reply #9
I have been interested in the issue of jitter audibility (or not) for some time. In another place that rhymes with dead-fi there is a discussion which turned to the effect of using reclocking devices to replace a USB----DAC chain with a USB----Reclocker----Toslink-------DAC chain. The proponents of the reclocker maintain that the USB mode of digital transfer is so bad jitter-wise that taking the alternative route will leave the signal with so much less jitter that the sound will be vastly improved, assuming a DAC with normal jitter rejection.


Unfortunately, you seem to be misrepresenting the issue that you are raising. Discussions of signal flow are generally irrelevant to the actual audibility of jitter. If you want to study the audibility of jitter, then you need to do some listening tests, no?

The usual way to study the audibility of jitter is to obtain some very jitter-free music and play it back via a low-jitter system. Then introduce variable amounts of jitter into the music and determine at what point you actually start hearing audible degradation due to the added jitter.
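For anyone who wants to try that at home, here is a minimal sketch of one way to synthesise a test signal with a known, adjustable amount of jitter (my own illustration of the general idea - the published listening tests used purpose-built hardware, and the 10 ns figure here is arbitrary): generate the tone at sample instants displaced from the ideal grid by the jitter waveform, and the familiar jitter sidebands appear around the tone.

Code:
import numpy as np

fs = 44100                     # sample rate, Hz
dur = 2.0                      # seconds
f_tone = 11025.0               # test tone frequency, Hz
jitter_amp = 10e-9             # peak jitter, 10 ns (arbitrary amount to audition)
f_jitter = 1000.0              # jitter modulation frequency, Hz

n = np.arange(int(fs * dur))
ideal_t = n / fs

# Each sample is taken slightly early or late according to a sinusoidal jitter waveform.
jitter = jitter_amp * np.sin(2 * np.pi * f_jitter * ideal_t)
jittered_t = ideal_t + jitter

clean    = np.sin(2 * np.pi * f_tone * ideal_t)
jittered = np.sin(2 * np.pi * f_tone * jittered_t)

# Sinusoidal jitter produces sidebands at f_tone +/- f_jitter; their level relative to
# the tone is roughly 20*log10(pi * f_tone * jitter_amp), about -69 dB for these values.
spectrum = np.abs(np.fft.rfft(jittered * np.hanning(len(jittered))))
spectrum /= spectrum.max()
freqs = np.fft.rfftfreq(len(jittered), 1 / fs)
side = np.argmin(np.abs(freqs - (f_tone + f_jitter)))
side_level_db = 20 * np.log10(spectrum[side - 2:side + 3].max())
print("jitter sidebands sit about %.1f dB below the tone" % -side_level_db)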

One of the classic studies of the audibility of jitter can be found in the well-known Benjamin and Gannon JAES paper. Other relevant papers have been cited here many times in the past.

What most people who have seriously studied jitter (as opposed to high-end wannabes) have found is that jitter in well-made modern digital audio gear is basically a red herring. If jitter were such a problem, people would have been running from their listening rooms screaming, with blood gushing from their ears, during the age of analog, because the LP format and analog tape are loaded with jitter. If Stereophile magazine published the truth about jitter in analog gear as compared to the trivial amounts of jitter that they incessantly belabor in digital gear, many of their more influential analog-centric columnists might have some serious explaining to do.

If you want to talk about a digital signal chain that has incredible potential for jitter, contemplate HDTV, particularly of the OTA flavor. So why doesn't Katie Couric sound like she's gargling while she's giving the evening news on CBS? The answer is that despite the manifold opportunities for an HDTV signal to be jittered out of its mind, some carefully designed circuitry in even $125 HDTVs uses the usual digital-domain tricks for removing it. A similar argument applies to many other familiar situations whose details are hidden from you. The digital data stream coming off of an optical disc is rife with jitter. Consider the normal start-stop inherent in reading digital audio files off of a hard drive, particularly in a multitasking environment.