I am encoding a mono file to MP3 with LAME at a bitrate of 80 kbps, and a stereo file at 160 kbps.
Since the bitrate per channel is the same in both cases, I'm curious why the lowpass is different.
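(That works out to 80 kbps per channel either way: 80 kbps / 1 channel and 160 kbps / 2 channels.)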
Here are the command lines I use and the output I get:
lame --nohist --noreplaygain --preset cbr 80 mono.wav mono.mp3
LAME 3.98.4 64bits (http://www.mp3dev.org/)
Using polyphase lowpass filter, transition band: 20094 Hz - 20627 Hz
Encoding mono.wav to mono.mp3
Encoding as 44.1 kHz single-ch MPEG-1 Layer III (8.8x) 80 kbps qval=3
Frame | CPU time/estim | REAL time/estim | play/CPU | ETA
2298/2298 (100%)| 0:00/ 0:00| 0:00/ 0:00| 88.279x| 0:00
Writing LAME Tag...done
lame --nohist --noreplaygain --preset cbr 160 stereo.wav stereo.mp3
LAME 3.98.4 64bits (http://www.mp3dev.org/)
Using polyphase lowpass filter, transition band: 17249 Hz - 17782 Hz
Encoding stereo.wav to stereo.mp3
Encoding as 44.1 kHz j-stereo MPEG-1 Layer III (8.8x) 160 kbps qval=3
Frame | CPU time/estim | REAL time/estim | play/CPU | ETA
2298/2298 (100%)| 0:01/ 0:01| 0:01/ 0:01| 35.732x| 0:00
Writing LAME Tag...done
I get the same lowpass difference with other settings too (forcing dual mono mode, using CBR without the preset switch, etc.).
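If it matters, I assume I could simply force a matching cutoff with the --lowpass switch (value in kHz, here chosen to match the 20094 Hz mono cutoff), along the lines of:
lame --nohist --noreplaygain --preset cbr 160 --lowpass 20.1 stereo.wav stereo.mp3
but I'd rather understand why the defaults differ in the first place.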
Thanks