
Topic: How can a change in signal volume influence the bitrate of lossy codecs?

How can a change in signal volume influence the bitrate of lossy codecs?

Hi all!

For the USB drive I use to play music in my car, I encode my FLAC collection to LAME MP3 -V3. Normally I have about 500 MB of free space left on the drive after copying the entire MP3 collection to it.

Recently, I calculated the ReplayGain values for my FLACs and stored them in the tags. Since my car radio does not support applying ReplayGain from the tags, I decided to let foobar2000 apply the album gain before encoding the MP3 files.

After copying the entire collection (same as above, but with RG applied) to my USB drive, I noticed that I now had about 800 MB left on the drive.

Because this confused me, I picked a song from my FLAC collection and encoded it twice: first with no change to the audio data, and second with the RG album gain applied to the audio data, which was a gain reduction of about -7 dB.

The first file with no RG applied achieved a higher bitrate (178 kBit/s) than the one with RG applied (164 kBit/s).

I guess this is because the volume of the file with RG applied is lower, so the resulting average bitrate of the MP3 file is lower.

But what I do not understand: why does a reduction of the signal volume lead to a lower bitrate of the lossy encoded file? The encoder still needs to deal with the same frequencies, and from that point of view the signal is not changed. It's just the amplitude that was reduced a bit. I can't imagine how this can result in a lower overall bitrate.
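
For anyone who wants to try this themselves, here is roughly how the comparison could be reproduced outside of foobar2000 (a sketch using ffmpeg; the file name and the -7 dB value are just placeholders for my test track):

Code:
# Rough sketch: encode the same FLAC twice with LAME via ffmpeg, once
# untouched and once with a -7 dB gain applied first, then compare sizes.
# "track.flac" is a placeholder for the test file.
import subprocess
from pathlib import Path

SRC = Path("track.flac")  # placeholder input file

def encode(dst, gain_db=None):
    cmd = ["ffmpeg", "-y", "-i", str(SRC)]
    if gain_db is not None:
        cmd += ["-af", f"volume={gain_db}dB"]            # apply gain before encoding
    cmd += ["-codec:a", "libmp3lame", "-q:a", "3", dst]  # roughly LAME -V3
    subprocess.run(cmd, check=True)
    return Path(dst)

plain = encode("plain.mp3")
gained = encode("gained.mp3", gain_db=-7)
# same duration, so comparing file sizes compares average bitrates
print("no gain :", plain.stat().st_size, "bytes")
print("-7 dB   :", gained.stat().st_size, "bytes")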


Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #1
As far as I know:

At a lower volume, the psychoacoustic model determines that some parts are below the hearing threshold. At lower volumes, more of the information becomes "unhearable" and is filtered out. I think that is what happens when you apply a -7 dB gain to the file prior to encoding. The dynamic range will also be somewhat lower.
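
A toy way to picture it (not a real psychoacoustic model, just a fixed, made-up "threshold in quiet" applied to some invented spectral magnitudes):

Code:
# Toy sketch: a made-up spectrum, a fixed absolute threshold, and what
# happens when the whole signal is attenuated by 7 dB. Numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
spectrum = rng.uniform(0.0, 1.0, 1000)   # fake spectral magnitudes
threshold = 0.05                         # fake absolute hearing threshold

def kept_bins(spec):
    # anything below the threshold is treated as inaudible and dropped
    return int(np.count_nonzero(spec >= threshold))

attenuated = spectrum * 10 ** (-7 / 20)  # the -7 dB gain from the first post

print("bins kept, original:", kept_bins(spectrum))
print("bins kept, -7 dB   :", kept_bins(attenuated))
# fewer surviving bins means fewer values to quantize and code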

 

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #2
That sounds logical... I had also thought that some parts of the signal might be quantized to zero, but I was not sure whether that was possible.

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #3
As far as I know:

At a lower volume, the psychoacoustic model determines that some parts are below the hearing threshold. At lower volumes, more of the information becomes "unhearable" and is filtered out. I think that is what happens when you apply a -7 dB gain to the file prior to encoding. The dynamic range will also be somewhat lower.

Shouldn't it be doing this based on relative signal levels?

If the whole song is recorded at a low level, the listener will increase the volume to their normal level, or use replaygain to make up the difference. Thus the parts that were previously below the hearing threshold would become audible again.

In other words, the encoder does not know the final playback volume.

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #4
I did some more testing on that. I tried it with another codec (AAC with the Apple encoder) and checked what happens for different samples with negative and positive gain values. Here are the results:

AAC (Apple) VBR Q 100, gain value -9.20 dB

  • Original encoded: bitrate 252 kbps
  • Applied gain and encoded: bitrate 248 kbps

AAC (Apple) VBR Q 100, gain value +1.57 dB

  • Original encoded: bitrate 124 kbps
  • Applied gain and encoded: bitrate 125 kbps


LAME MP3 -V1, same sample as above, gain value -9.20 dB

  • Original encoded: bitrate 236 kbps
  • Applied gain and encoded: bitrate 230 kbps

LAME MP3 -V1, same sample as above, gain value +1.57 dB

  • Original encoded: bitrate 146 kbps
  • Applied gain and encoded: bitrate 149 kbps



So, looking at these results: the bitrate differences for MP3 seem a bit larger than those for AAC, but they exist for both codecs, and the relationship between gain and bitrate seems to go in both directions. Louder sources seem to produce higher bitrates, while quieter sources produce lower bitrates.

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #5
One more thing: the bitrate also drops when encoding to a lossless format. When applying RG (-9.20 dB) to a song that originally had an average bitrate of 1077 kbps with FLAC -8, the same song with RG applied has an average bitrate of only 943 kbps.

So, after testing and thinking about it, could the following be true?

When reducing the gain before encoding, we lose some dynamic range, as Fairy already stated. Assume we have a 44.1 kHz, 16 bit source where the full dynamic range of the 16 bits could be used. After reducing the gain, the full 16 bit range is no longer used: the same signal has to be quantized onto a smaller set of available values, and there will be rounding errors. If two samples originally differed by only a very tiny delta, their values might be identical after the gain reduction.
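
Here is a little sketch of what I mean, purely to illustrate the idea (random 16 bit samples and the -9.2 dB gain from above, not real music data):

Code:
# Sketch: scale 16-bit samples down by 9.2 dB, round back to integers,
# and see how the number of distinct values and a rough entropy estimate
# shrink. The input is random noise, only for illustration.
import numpy as np

rng = np.random.default_rng(1)
samples = rng.integers(-32768, 32768, size=100_000, dtype=np.int32)

gain = 10 ** (-9.2 / 20)                  # about 0.347
scaled = np.round(samples * gain).astype(np.int32)

def entropy_bits(x):
    # first-order entropy in bits per sample
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

print("distinct values before:", np.unique(samples).size)
print("distinct values after :", np.unique(scaled).size)
print("entropy before:", round(entropy_bits(samples), 2), "bits/sample")
print("entropy after :", round(entropy_bits(scaled), 2), "bits/sample")
# after scaling, different original values collapse onto the same integer,
# so a lossless coder has less information per sample to represent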

So, in my thoughts, this might be the reason why both lossy and lossless codecs can encode to lower bitrates when the gain is reduced. Could this be true?

And, if my reasoning makes sense, what do you think: could these losses be audible under good listening conditions?

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #6
As far as I know:

At a lower volume, the psychoacoustic model determines that some parts are below the hearing threshold. At lower volumes, more of the information becomes "unhearable" and is filtered out. I think that is what happens when you apply a -7 dB gain to the file prior to encoding. The dynamic range will also be somewhat lower.

Shouldn't it be doing this based on relative signal levels?

If the whole song is recorded at a low level, the listener will increase the volume to their normal level, or use replaygain to make up the difference. Thus the parts that were previously below the hearing threshold would become audible again.

In other words, the encoder does not know the final playback volume.
The encoder is only seeing part of the file at a time. It doesn't know if that part is especially loud or especially soft relative to the rest of the file, so it assumes that the entire file will be played at a normal level and will not be turned up or down an unusual amount.

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #7
The encoder is only seeing part of the file at a time. It doesn't know if that part is especially loud or especially soft relative to the rest of the file, so it assumes that the entire file will be played at a normal level and will not be turned up or down an unusual amount.

I don't know much about the internals of encoders, how they work, or how exactly the audio data is stored in the different codecs. But shouldn't it be possible for an encoder to store some kind of "amplitude factor" for a frame, so that the frame and its frequencies can be encoded with the best accuracy, and the amplitude of the quantized frequencies is applied while decoding that frame?
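
Just to illustrate what I am imagining, a toy sketch (my own invention, not how MP3 or any real codec actually stores its data):

Code:
# Toy sketch of the "amplitude factor per frame" idea: store one float
# scale per frame, quantize the normalized values, rescale on decode.
import numpy as np

def encode_frame(frame, bits=8):
    scale = float(np.max(np.abs(frame))) or 1.0    # the per-frame "amplitude factor"
    q = np.round(frame / scale * (2 ** (bits - 1) - 1)).astype(np.int32)
    return scale, q                                # factor stored next to the quantized data

def decode_frame(scale, q, bits=8):
    return q / (2 ** (bits - 1) - 1) * scale       # re-apply the factor on decode

rng = np.random.default_rng(2)
quiet_frame = rng.normal(0.0, 0.01, 576)           # a very quiet frame
scale, q = encode_frame(quiet_frame)
restored = decode_frame(scale, q)
print("max error:", float(np.max(np.abs(restored - quiet_frame))))
# the quantization accuracy no longer depends on the absolute level,
# because the level travels separately in `scale`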

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #8
The encoder is only seeing part of the file at a time. It doesn't know if that part is especially loud or especially soft relative to the rest of the file, so it assumes that the entire file will be played at a normal level and will not be turned up or down an unusual amount.

Exactly, it's looking at the relative loudness, not the absolute loudness. That is exactly why it won't throw away more information just because the signal is quiet compared to 0 dBFS: the quiet parts still have the same relative difference to the louder parts that may mask them.

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #9
If you reduce the volume and losslessly encode a file the bitrate will also be lower.
[edit] Someone mentioned it already.

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #10
Someone mentioned it already.
That was me, the thread-opener :D

I wrote above how I imagine this can happen - at least for lossless compression. Can somebody clarify if this theory is true or false? If true, is this also the reason for the bitrate loss in lossy compression?


Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #12
Okay... Thanks for the replies. Although the bitrate loss should not be audible, the whole story bothers me a bit, and I think I'll feel better having my music encoded in the best way possible that still fits on my drive. That is LAME -V3 with no RG applied.

But since it would be nice to have approximately the same volume across all albums on my USB drive, I will use mp3gain to losslessly apply the RG values to the audio material.

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #13
For FLAC, of course, a lower level means fewer levels to encode. That translates immediately into fewer bits.

For MP3 or AAC the effect will be less dominant, but it will still happen: more masking thresholds will be held at the assumed threshold-of-hearing level, and so you start out with a lower bitrate demand.
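
A quick back-of-the-envelope check against the FLAC numbers posted above (a rough sketch assuming 44.1 kHz stereo and about 6.02 dB per bit):

Code:
# Back-of-the-envelope check: a 9.2 dB reduction corresponds to roughly
# 9.2 / 6.02 bits per sample, and at 44.1 kHz stereo that is close to the
# bitrate drop reported for FLAC earlier in the thread.
import math

gain_db = 9.2
bits_per_sample = gain_db / (20 * math.log10(2))  # about 1.53 bits per sample
kbps_saved = bits_per_sample * 44100 * 2 / 1000   # about 135 kbps
print(round(bits_per_sample, 2), "bits/sample ->", round(kbps_saved), "kbps")
# the FLAC figures above were 1077 kbps vs. 943 kbps, a drop of about 134 kbps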
-----
J. D. (jj) Johnston

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #14
One more short question that came up while playing around with all these gain adjustments:

If I use MP3Gain (or foobar2000, which does basically the same thing for this use case) to adjust the gain of the MP3 files, it will, as I have read, change the global gain value of each frame in the MP3 file, where decrementing the value by one means a gain reduction of 1.5 dB.

Now let's say I reduce the gain of an MP3 file by 9 dB, which means I decrement the global gain value of each frame by 6. What would happen if the MP3 file had some very quiet parts where the global gain value of the unaltered file is 6 or lower? That would end up with a gain multiplier of 0, and if I get this right, multiplying each frequency subband gain by zero would produce a frame that is completely silent.

Isn't that a point where the way MP3Gain and foobar2000 reduce the gain of MP3 files is not completely lossless? There would be frames that are completely silent after conversion, and the process would not be fully reversible for such frames, because we would not know which value between 0 and 6 the original gain multiplier had.
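
To make the scenario concrete, here is a toy sketch of the arithmetic I am worried about (illustrative only, nothing like MP3Gain's real code):

Code:
# Toy sketch of the concern: gain changes are applied in 1.5 dB steps to
# each frame's global gain value, which cannot go below 0.
STEP_DB = 1.5  # one global gain step, as described above

def apply_gain(global_gains, change_db):
    steps = round(change_db / STEP_DB)             # -9 dB -> -6 steps
    # the field is an 8 bit integer, so clamp to 0..255
    return [min(255, max(0, g + steps)) for g in global_gains]

frames = [150, 148, 5, 3, 151]                     # invented values; two very quiet frames
print(apply_gain(frames, -9.0))                    # [144, 142, 0, 0, 145]
# the frames that started at 5 and 3 both end up at 0, so applying +9 dB
# afterwards cannot bring back their original values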

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #15
The global gain value is an 8 bit integer, so you would have to reduce the gain by something like 120 steps, or 180 dB, before you run out of range.

Re: How can a change in signal volume influence the bitrate of lossy codecs?

Reply #16
Yeah, I know that it's an 8 bit int. But my question was aiming more in this direction:

What will LAME do when encoding frames that are very quiet (not silent, but close to it)? Will the global gain value be close to zero, or will the quantized values of the frequency bands be close to zero while the global gain stays at some higher value?

Because, say we have a really quiet frame with global gain = 4; that is still not completely silent. If we then reduce all the global gain values by, say, 7, such frames end up with a global gain of 0, which means we have lost them.

Can I assume that the global gain is always high enough that this kind of information loss would never happen in practice?