Unlike encoding, I believe all decoders are supposed to produce the same output.
My question is: in a comparison of lossy-format decoders between, let's say, iTunes, Foobar, and Winamp, will the output vary between decoding implementations?
In AAC there can be Perceptual Noise Substitution, so I'm not sure I could expect identical output even from the same decoder.
…in which the author claims that differences amounting at most to fluctuations in the least significant of 24 bits are the answer to the question “Why does Apollo sound so good?”. Should we be directing traffic towards such nonsense?
RMS level [of the then-latest version of MAD] is over a hundred times the one produced by Apollo 37zm and the maximum difference is four times the one by Apollo (interestingly the results for MAD 0.11.4b seem to be somewhat better than for the latest one but they are still worse than Apollo's). Actually, the maximum difference of Apollo's output is the smallest possible deviation in 24-bit data, the only smaller possible value would be zero.
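For anyone curious how figures like those are obtained, here is a minimal sketch of the measurement in Python (a hypothetical helper, not MAD's or Apollo's actual test harness): decode both streams to integer PCM, subtract sample-by-sample from the reference, and report the RMS and peak of the difference.

```python
import math

def compare_pcm(decoded, reference):
    """Return (RMS difference, max absolute difference) in LSB units,
    given two equal-length sequences of integer PCM samples."""
    assert len(decoded) == len(reference)
    diffs = [d - r for d, r in zip(decoded, reference)]
    rms = math.sqrt(sum(x * x for x in diffs) / len(diffs))
    peak = max(abs(x) for x in diffs)
    return rms, peak

# Toy data, not real decoder output. A peak of 1 is the smallest
# nonzero deviation possible in integer PCM -- the "one LSB"
# figure quoted for Apollo above.
rms, peak = compare_pcm([100, 205, -50], [100, 204, -49])
print(rms, peak)
```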
I apologise for talking nonsense. :/
Why does Apollo sound so good?
The sound quality has always been an important factor when making decisions in the decoder code. The recent addition of 32-bit and 24-bit output has resulted in further improvements in quality. To prove this, I measured the difference of Apollo 37zm's 24-bit output against the MPEG-1 Audio Layer 3 compliance test reference signal […] Of course, this only goes for the provided test signal, but it should give some picture of the sound quality.
Even so, bit-perfect output can be tough to achieve: different hardware implements arithmetic with subtle differences, and at the LSB level even insignificant variations in things like order of operations can lead to tiny discrepancies.
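A quick illustration of the order-of-operations point, using ordinary IEEE 754 doubles in Python (nothing decoder-specific): floating-point addition is not associative, so merely regrouping the same operands changes the last bits of the result.

```python
# The same three operands, summed in two groupings.
x = (0.1 + 0.2) + 0.3
y = 0.1 + (0.2 + 0.3)

print(x == y)      # False
print(abs(x - y))  # on the order of one ulp
```

A decoder doing its filterbank math in floating point can hit the same effect whenever a compiler or DSP reorders operations, which is why two builds of the same decoder can disagree in the last bit.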