192 kbit/s listening test at SoundExpert
Reply #40 – 2006-07-05 07:17:44
1. mp3: Lame 3.97b2 [--preset standard] 177.1 kbit/s FBR
2. aac: Nero Free AAC Encoder 1.0.0.2 [-q0.626] 192.2 kbit/s FBR
3. aac: iTunes [192 kbit/s, VBR] 197.8 kbit/s FBR
4. wma 9.1 std: WMPlayer10 [CBR, 192 kbit/s] 198.0 kbit/s FBR
5. ogg: aoTuVb4.51 [-q6.37] 192.0 kbit/s FBR
6. mpc: v1.15v [--quality 5.49] 191.0 kbit/s FBR
7. he-aac: Winamp High Bitrate Encoder [192 kbit/s] 195.4 kbit/s FBR

min = 177.1 kbps [LAME] & max = 198 kbps [WMA, with iTunes close behind]. The difference is higher than 10% (and 20 kbps). Moreover, it seems that you chose very precise quality levels (0.626, 6.37 & 5.49), probably to match a precise bitrate - but a different criterion (popularity) was used for LAME. This second criterion considerably lowers the bitrate and therefore handicaps the format. It's likely that for most people there wouldn't be any perceptual difference between V1 and V2 in normal listening conditions, but isn't your methodology supposed to amplify the distortions?

I'd use the same criterion for all encoders and try either to follow "popular" settings or to match the same approximate bitrate for all. You said that:

"1. mp3: Lame 3.97b2 [-V1 --vbr-new --noreplaygain] 200.5 kbit/s FBR"

Then use V1. The deviation would be much lower (min = 191 kbps and max = 200 kbps => ~4...5%) and could be reduced further (by increasing the MPC bitrate).
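The bitrate-spread argument above can be sketched in a few lines (bitrates copied from the post; the dictionary labels are mine, and the spread is computed relative to the minimum):

```python
# Average bitrates (kbps) reported in the test, keyed by my own short labels.
bitrates = {
    "mp3 (LAME --preset standard)": 177.1,
    "aac (Nero)": 192.2,
    "aac (iTunes)": 197.8,
    "wma 9.1 std": 198.0,
    "ogg (aoTuV)": 192.0,
    "mpc": 191.0,
    "he-aac (Winamp)": 195.4,
}

def spread(rates):
    """Return absolute (kbps) and relative (%) spread, relative to the minimum."""
    lo, hi = min(rates), max(rates)
    return hi - lo, (hi - lo) / lo * 100.0

diff, pct = spread(bitrates.values())
print(f"as tested:  {diff:.1f} kbps, {pct:.1f}%")   # 20.9 kbps, 11.8%

# Swapping LAME -V2 (~177 kbps) for -V1 (200.5 kbps, as quoted in the post):
bitrates["mp3 (LAME -V1)"] = 200.5
del bitrates["mp3 (LAME --preset standard)"]
diff, pct = spread(bitrates.values())
print(f"with -V1:   {diff:.1f} kbps, {pct:.1f}%")   # 9.5 kbps, 5.0%
```

This reproduces the two figures in the post: over 10% (and over 20 kbps) deviation as tested, versus roughly 5% if LAME were run at -V1.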
Wavpack Hybrid: one encoder, one encoding for all scenarios

WavPack -c4.5hx6 (44100Hz & 48000Hz) ≈ 390 kbps + correction file
WavPack -c4hx6 (96000Hz) ≈ 768 kbps + correction file
WavPack -h (SACD & DSD) ≈ 2400 kbps at 2.8224 MHz
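For readers unfamiliar with the shorthand: assuming "-c4.5hx6" expands to the standard separate wavpack options (hybrid bits-per-sample via -b, correction file via -c, high mode via -h, extra processing via -x), the first scenario would look roughly like this. Filenames are placeholders.

```shell
# Hybrid mode at ~4.5 bits/sample: at 44.1 kHz stereo that is
# 4.5 x 44100 x 2 ≈ 397 kbps, matching the ~390 kbps figure above.
# -c writes a correction file (input.wvc) alongside the lossy input.wv.
wavpack -b4.5 -c -h -x6 input.wav -o input.wv

# input.wv alone plays back lossy; with input.wvc in the same directory,
# wvunpack restores (or here, just verifies) the original losslessly.
wvunpack -v input.wv
```

The appeal of the hybrid scheme, as the post notes, is that one encoding covers both use cases: keep only the .wv for a lossy library, or keep the .wvc as well for a lossless archive.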