Quote from: gaekwad2 on 05 February, 2010, 01:52:28 PM
"Because, I assume, they use ReplayGain which, unlike QuickTime's limiter, will prevent clipping without affecting dynamics (though whether it makes an audible difference...)."

ReplayGain cannot prevent clipping in this case. It can only work with what it's fed. If that is clipped, the data is gone.
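To make that point concrete, here is a minimal numpy sketch (not from the thread, values are illustrative): once samples have been hard-clipped, a later attenuation such as ReplayGain playback gain only yields a quieter clipped waveform; the lost peaks do not come back.

[code]
import numpy as np

t = np.linspace(0.0, 1.0, 1000)
loud = 1.5 * np.sin(2 * np.pi * 5 * t)       # peaks at 1.5x full scale
clipped = np.clip(loud, -1.0, 1.0)           # hard-clipped to full scale
rescued = clipped * 0.5                      # attenuation applied afterwards

print(np.max(np.abs(loud - clipped)))        # ~0.5: information destroyed
print(np.max(np.abs(loud * 0.5 - rescued)))  # ~0.25: still wrong after gain
[/code]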
Or is AAC unclippable if you output to float?
... Besides, it's not always strong enough to prevent clipping anyway...
It is possible to multiply the input by 0.95 to prevent the QT limiter from altering the signal.
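A minimal sketch of that pre-scaling step, assuming the pysoundfile package and placeholder filenames; 0.95 is roughly -0.45 dB of headroom:

[code]
import soundfile as sf

# Read the source as floats in [-1.0, 1.0], scale by 0.95, write it back
# out, then hand the pre-scaled file to the QuickTime encoder so its
# limiter never has anything to act on.
data, rate = sf.read("input.wav")
sf.write("prescaled.wav", data * 0.95, rate)
[/code]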
Does it also scale down if you convert the input to float before feeding it to QuickTime?
I also doubt that, even if there were as much as 1 dB of attenuation involved, any of us could ABX it after ReplayGain had been applied.
Could you please upload 30 seconds of a file that gets attenuated to the upload section? I'm unable to reproduce it.
Nao, could you also give us instructions for a CLI dBpoweramp version of this?
Quote from: Larson on 07 February, 2010, 05:00:06 AM
"Nao, could you also give us instructions for a CLI dBpoweramp version of this?"

Are there any difficulties? For example:
1) Converting mono tracks: is it better to use the "channels" drop-down -> 1 or the channel-count DSP? Or something in the command line itself? Or does it not matter?
BTW, using either the drop-down or the DSP, foobar's properties still show "channels - 2" for mono files, whereas mono Nero files show "channels - 1". I don't know whether this is caused by qtaacenc or by the QuickTime encoder itself. [dBpoweramp's properties info is worse: it shows "channels - 2" for both types of mono AAC files!]
2) For changing sample rates, is it better to use the drop-down in the dBpoweramp CLI encoder or "--samplerate xxx" in the command line itself? Or doesn't it matter?
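For reference, a hedged sketch of driving qtaacenc directly from Python, using only switches that appear in this thread (--samplerate, --highest) plus a TVBR quality setting matching the Q127 mentioned below; treat the exact switch names and filenames as assumptions to check against qtaacenc's own usage text:

[code]
import subprocess

# Assumed invocation: TVBR quality 127, highest-quality mode, explicit
# resample to 44100 Hz; "input.wav"/"output.m4a" are placeholders.
subprocess.run(
    ["qtaacenc", "--tvbr", "127", "--highest",
     "--samplerate", "44100",
     "input.wav", "output.m4a"],
    check=True,
)
[/code]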
Thank you, nao! It works amazingly! Should "highest quality decoded source" be checked? I have tested a bit anyway, and I noticed that some songs converted with XLD at Q127 MAX come out 1 kbps lower than with Q127 --highest (which is Max). Isn't it weird?
Should "highest quality decoded source" be checked?
Quote: "Isn't it weird?"
Maybe it's because qtaacenc uses QuickTime 7.6.5 on Windows, whereas XLD uses QuickTime 7.6.3 (QuickTime X) on Mac OS X Snow Leopard and QuickTime 7.6.4 on Mac OS X Leopard.
I don't think the difference in QT version (7.6.3 vs .4 vs .5) matters. I think the difference mainly comes from environment-dependent factors (compiler, the underlying floating-point arithmetic routines, etc.). You know, different LAME binaries produce subtly different results even when built from the same source.