Tell him to Google 'CD player output voltage' and get back to you after he's done reading.
"All digital source components are level matched, using a 1kHz sine wave, to 1V RMS analogue. The reference WAV test tone is created to be at a level of -1dB from absolute, sampled at 96kHz with 24 bit depth. Not trying to be funny but I've been doing this for a while".
Looks fine to me. If there's clipping somewhere in the path, it's probably clearly audible and should be avoided by lowering the analog gain after the DAC.
People generally claim that DACs sound different, and I'm sure in some cases they can and do, especially if their levels are not matched or if the frequency response of the two is grotesquely different, but I assume that many measure close to flat. My question is: how does one go about level matching DACs? Is there a right and wrong way, or just a standard method?
As to whether or not one should level-match outputs: one should at least verify they are matched, rather than take an EE's word for it. FWIW, this is coming from an EE.
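For what it's worth, verifying is straightforward if you can capture each DAC playing the same 1 kHz tone through a sound-card input. A rough sketch follows; the file names and the 0.1 dB tolerance are my own choices, not a standard anyone here quoted:

```python
import numpy as np
import soundfile as sf

TOLERANCE_DB = 0.1  # common rule of thumb for "level matched" (assumption)

def rms_dbfs(path: str) -> float:
    """RMS level of a captured tone, in dB relative to full scale."""
    samples, _ = sf.read(path)          # float samples in [-1, 1]
    if samples.ndim > 1:
        samples = samples.mean(axis=1)  # fold stereo to mono for a single figure
    rms = np.sqrt(np.mean(samples ** 2))
    return 20 * np.log10(rms)

level_a = rms_dbfs("dac_a_1khz_capture.wav")  # hypothetical capture files
level_b = rms_dbfs("dac_b_1khz_capture.wav")
diff = level_a - level_b
print(f"DAC A: {level_a:.2f} dBFS, DAC B: {level_b:.2f} dBFS, delta: {diff:+.2f} dB")
print("Matched" if abs(diff) <= TOLERANCE_DB else "Adjust gain before comparing")
```

A true-RMS voltmeter on the analogue outputs does the same job without the computer, of course; the point is simply to measure rather than trust.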
I, for one, had assumed the removal of expectation bias had already been taken into account. It also seemed clear to me from the original post that we were discussing level-matching rather than double-blind methodology. Anyway...
Arny, please correct me if I'm wrong here. Level matching is certainly important for comparisons, but since the test was sighted, wouldn't that invalidate the test process regardless of whether the equipment was level matched or not?
So while level matching removes one difference between components, it isn't enough on its own.
Yes, an evaluation that lacks bias controls and carefully synchronized source material is invalid, level-matched or not.