At what bitrate is AAC *universally* transparent?
Reply #13 – 2004-11-18 14:43:52
Theoretically, you can't say any lossy format at any bitrate provides "universal" transparency. As long as a codec is lossy, there is always some chance that someone will hear an artifact. You can prove that a listener hears an artifact which adds sound energy that wasn't there before, and you can prove that nobody hears an artifact which removes sound energy that was inaudible anyway. It has been proven time and again that humans may "hear" artifacts that aren't there at all, and it has been proven just as often that humans can hear additive artifacts that really are there. (A quick sketch of how an ABX result becomes such a proof is below.)

Why post this? Not to argue with your point; that's the state of affairs today. But it is conceivable that some day there will be solid proof of a lossy codec that provides universal transparency. For that you'd need solid proof that its artifacts are physiologically inaudible.

The real reason I chimed in, though, is that your post got me thinking about the difference between artifacts that ADD to the signal and artifacts that REMOVE from the signal. There is an important difference between these. Human hearing evolved for survival, and for survival, detecting the faintest sounds was important. Brain processing and the nonlinearity of hearing are so sharp it borders on the ridiculous: we can hear things that aren't there. On the other hand, sharp detection of missing sounds wasn't important to survival, and that ability is undeveloped. So, quite often, when a lossy codec's artifact removes signal, not only do we have a hard time physiologically detecting that anything is missing, but our brain fills in the gap with what we expect to hear. That is why lossy codecs have a much better chance of reaching universal transparency than, say, analog chains that ADD distortion products. (The second sketch below shows why an equal-energy residual can still be far easier to hear when it's additive.)
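To make the "prove hearing" part concrete: around here that proof usually means an ABX test, where audibility is established when a listener's score is too good to be chance. A minimal Python sketch of the usual one-sided binomial calculation; the 14-out-of-16 score is just a made-up example, not a real test result:

[code]
import math

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided binomial p-value: the probability of scoring at least
    `correct` out of `trials` by pure guessing (chance level p = 0.5)."""
    return sum(math.comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Hypothetical ABX session: 14 correct answers out of 16 trials.
print(abx_p_value(14, 16))  # ~0.0021 -- very unlikely to be guessing,
                            # so the artifact is provably audible
[/code]

Note the asymmetry: a low p-value proves the artifact is audible, but no ABX score, however bad, proves inaudibility for everyone; for that you'd need the physiological argument above.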
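And to illustrate the ADD vs. REMOVE asymmetry: the two kinds of artifact can leave a residual (coded minus original) of exactly the same energy, yet one injects a brand-new component for the ear to lock onto while the other only takes away something that was near the masking threshold to begin with. A toy sketch with made-up signal levels, assuming NumPy:

[code]
import numpy as np

fs = 44100
t = np.arange(fs) / fs                         # one second of audio
loud  = np.sin(2 * np.pi * 440 * t)            # dominant tone
quiet = 0.001 * np.sin(2 * np.pi * 5000 * t)   # faint partial, plausibly masked
original = loud + quiet

added   = original + 0.001 * np.sin(2 * np.pi * 3000 * t)  # artifact ADDS a new tone
removed = loud                                             # artifact REMOVES the faint partial

for name, coded in (("additive", added), ("subtractive", removed)):
    residual = coded - original
    db = 10 * np.log10(np.mean(residual ** 2) / np.mean(original ** 2))
    print(f"{name}: residual at {db:.1f} dB")  # both come out around -60 dB
[/code]

Both residuals measure about -60 dB, i.e. identical by any simple energy metric; the argument above is that only the additive one stands a real chance of being heard.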