How can a change in signal volume influence the bitrate of lossy codecs?
2017-01-11 08:57:22

Hi all!

For the USB drive I use to play music in my car, I am encoding my FLAC collection to LAME MP3 -V3. Normally, I have about 500 MB of free space left on the drive after copying the entire MP3 collection to it.

Recently, I calculated the ReplayGain values for my FLACs and stored them in the tags. Since my car radio does not support applying ReplayGain from the tags, I decided to let foobar2000 apply the album gain before encoding the MP3 files. After copying the entire collection (same as above), with RG applied, to my USB drive, I noticed that I now had about 800 MB left on the drive.

Because this confused me, I picked a song from my FLAC collection and encoded it twice: first with no change to the audio data, and second with the RG album gain applied to the audio data, which amounted to about -7 dB of gain reduction. The first file, with no RG applied, achieved a higher average bitrate (178 kbit/s) than the one with RG applied (164 kbit/s). (A sketch of how this comparison can be reproduced follows at the end of the post.)

I guess this is because the volume of the file with RG applied is lower, so the resulting average bitrate of the MP3 file is lower. But what I do not understand is: why does a reduction of the signal volume lead to a lower bitrate of the lossy encoded file? The encoder still needs to deal with the same frequencies, and from that point of view the signal is not changed; only the amplitude was reduced a bit. I can't imagine how this can result in a lower overall bitrate.
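For anyone who wants to reproduce the test, here is a minimal sketch of the comparison. I actually used foobar2000 for the conversion; this sketch assumes ffmpeg and lame are on the PATH instead, and the file name track.flac and the -7 dB figure are just placeholders matching my example:

import os
import subprocess

SRC = "track.flac"  # placeholder for one song from the FLAC collection

def encode(mp3_path: str, gain_db: float) -> None:
    """Decode the FLAC to WAV, optionally applying a gain, then encode with LAME -V3."""
    wav = "temp.wav"
    # ffmpeg's volume filter scales the samples by the given amount in dB
    subprocess.run(
        ["ffmpeg", "-y", "-i", SRC, "-af", f"volume={gain_db}dB", wav],
        check=True,
    )
    subprocess.run(["lame", "-V3", wav, mp3_path], check=True)
    os.remove(wav)

encode("no_rg.mp3", 0.0)        # unchanged audio data
encode("rg_applied.mp3", -7.0)  # album gain of about -7 dB, as in my test

# Compare the resulting file sizes; since both files have the same
# duration, size is proportional to average bitrate.
for f in ("no_rg.mp3", "rg_applied.mp3"):
    print(f, os.path.getsize(f) // 1024, "KiB")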