I have just verified singaiya's report. QuickTime, left at its default settings for TVBR and CVBR, automatically downsamples to 32 kHz below Q59. Apple's bare-bones afconvert front end produces the following output:
rpp3po:Desktop rpp3po$ afconvert test.wav -f m4af -s 3 -d aac -u vbrq 58 -o test.m4a -v
Input file: test.wav, 11393676 frames
strategy = 3
user property 'qrbv' = 58
Formats:
Input file 2 ch, 44100 Hz, 'lpcm' (0x0000000C) 16-bit little-endian signed integer
Output file 2 ch, 0 Hz, 'aac ' (0x00000000) 0 bits/channel, 0 bytes/packet, 0 frames/packet, 0 bytes/frame
Output client 2 ch, 44100 Hz, 'lpcm' (0x0000000C) 16-bit little-endian signed integer
AudioConverter 0x595004 [0x10012d490]:
CodecConverter 0x0x10013e1e0
Input: 2 ch, 44100 Hz, 'lpcm' (0x0000000C) 16-bit little-endian signed integer
Output: 2 ch, 32000 Hz, 'aac ' (0x00000000) 0 bits/channel, 0 bytes/packet, 1024 frames/packet, 0 bytes/frame
codec: 'aenc'/'aac '/'appl'
Input layout tag: 0x650002
Output layout tag: 0x650002
Optimizing test.m4a... done
Output file: test.m4a, 8267520 frames
rpp3po:Desktop rpp3po$ afconvert test.wav -f m4af -s 3 -d aac -u vbrq 59 -o test.m4a -v
Input file: test.wav, 11393676 frames
strategy = 3
user property 'qrbv' = 59
Formats:
Input file 2 ch, 44100 Hz, 'lpcm' (0x0000000C) 16-bit little-endian signed integer
Output file 2 ch, 0 Hz, 'aac ' (0x00000000) 0 bits/channel, 0 bytes/packet, 0 frames/packet, 0 bytes/frame
Output client 2 ch, 44100 Hz, 'lpcm' (0x0000000C) 16-bit little-endian signed integer
AudioConverter 0x59a004 [0x10012cc00]:
CodecConverter 0x0x10013d910
Input: 2 ch, 44100 Hz, 'lpcm' (0x0000000C) 16-bit little-endian signed integer
Output: 2 ch, 44100 Hz, 'aac ' (0x00000000) 0 bits/channel, 0 bytes/packet, 1024 frames/packet, 0 bytes/frame
codec: 'aenc'/'aac '/'appl'
Input layout tag: 0x650002
Output layout tag: 0x650002
Optimizing test.m4a... done
Output file: test.m4a, 11393676 frames
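For anyone who wants to reproduce this without eyeballing the verbose listings above, here is a small shell sketch (my own addition, not part of the original test). It sweeps TVBR quality values around the threshold and pulls the sample rate the codec converter actually settles on out of afconvert's -v output. The temporary output file names and the grep/sed parsing are assumptions based on the listings shown above, so treat it as a rough reproduction aid rather than a canonical test:

#!/bin/sh
# Sweep TVBR quality values around the reported Q58/Q59 threshold and
# report the output sample rate afconvert's codec converter chooses.
# Assumes test.wav is a 44.1 kHz stereo source in the current directory.
for q in 56 57 58 59 60; do
    rate=$(afconvert test.wav -f m4af -s 3 -d aac -u vbrq "$q" -o "test_q$q.m4a" -v 2>&1 \
        | grep "Output:.*'aac '" \
        | sed -E 's/.* ([0-9]+) Hz.*/\1/')
    echo "vbrq $q -> output sample rate: $rate Hz"
done

On the source used above, vbrq 58 reports 32000 Hz and vbrq 59 reports 44100 Hz, matching the two listings.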
.alexander., rpp3po,
Sorry, I don't understand what you're talking about. Even at 96 kbps VBR, iTunes gives me 44.1 kHz MP4 files. Using a 32 kHz sampling rate at 128 kbps or more is a bad idea anyway; pre-echo-sensitive people like /mnt will tell you why.
I don't understand why you addressed both posts together; my point wasn't related to that.
If a codec developer decides (after hundreds of hours of testing) to use a certain default sampling rate for a given bitrate, why should we disallow that?
Does that mean that you would prefer to use Apple's default or not?
Could you please point me to a reference explaining why having only 16 kHz of bandwidth makes it harder to avoid pre-echo?