Understanding measured and visible changes after resampling
I'm using Adobe Audition 3 to analyze a single channel of audio (e.g., just the Center channel of a 5.1 track). Audition offers an 'Amplitude Statistics' function, as well as a 'Frequency Analysis' FFT view.
When I resample a 96 kHz/24-bit track to 48 kHz (using the foobar2k SoX plugin, though others give the same result), the min/max sample values increase, and the minimum RMS power (in dB) decreases. Peak amplitude rises by a few hundredths of a dB (not consistently; it depends on the track, and sometimes I see no change), while the maximum, average, and total RMS power (in dB) stay the same.
Center from multichannel 96 kHz/32-bit:
Mono
Min Sample Value: -18289.15
Max Sample Value: 27042.06
Peak Amplitude: -1.67 dB
Possibly Clipped: 0
DC Offset: 0
Minimum RMS Power: -132.63 dB
Maximum RMS Power: -15.05 dB
Average RMS Power: -31.55 dB
Total RMS Power: -28.07 dB
Actual Bit Depth: 24 Bits
Using RMS Window of 50 ms
Center from SoX-resampled multichannel 48 kHz/32-bit:
Mono
Min Sample Value: -17149.08
Max Sample Value: 27218.93
Peak Amplitude: -1.61 dB
Possibly Clipped: 0
DC Offset: 0
Minimum RMS Power: -inf dB
Maximum RMS Power: -15.05 dB
Average RMS Power: -31.55 dB
Total RMS Power: -28.07 dB
Actual Bit Depth: 24 Bits
Using RMS Window of 50 ms
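
For reference, here is roughly how the same statistics could be computed outside Audition (a Python sketch, assuming numpy/soundfile and a mono WAV export of each channel; the 16-bit scaling of the sample values, the RMS averaging details, and the filenames center_96k.wav / center_48k.wav are assumptions/placeholders, not necessarily exactly what Audition does):

import numpy as np
import soundfile as sf

def amplitude_stats(path, rms_window_ms=50.0):
    x, fs = sf.read(path)                           # mono float samples in [-1.0, 1.0]
    win = int(fs * rms_window_ms / 1000.0)          # 50 ms RMS window, as in Audition's setting
    n_windows = len(x) // win
    frames = x[: n_windows * win].reshape(n_windows, win)
    with np.errstate(divide="ignore"):              # an all-zero window gives -inf dB
        rms_db = 20 * np.log10(np.sqrt((frames ** 2).mean(axis=1)))
        total_rms_db = 20 * np.log10(np.sqrt((x ** 2).mean()))
        peak_db = 20 * np.log10(np.max(np.abs(x)))
    return {
        "min_sample": x.min() * 32768.0,            # scaled to the 16-bit range (my assumption
        "max_sample": x.max() * 32768.0,            # about how Audition displays sample values)
        "peak_amplitude_db": peak_db,
        "dc_offset": x.mean(),
        "min_rms_db": rms_db.min(),
        "max_rms_db": rms_db.max(),
        "avg_rms_db": rms_db[np.isfinite(rms_db)].mean(),  # silent windows excluded; Audition's
                                                            # exact averaging is unclear to me
        "total_rms_db": total_rms_db,
    }

print(amplitude_stats("center_96k.wav"))            # placeholder filenames
print(amplitude_stats("center_48k.wav"))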
But in a Frequency Analysis of the two (FFT size 65536, Blackman-Harris window), the 48 kHz curve sits 3 dB lower than the 96 kHz curve; there are some differences in curve shape at either end of the spectrum, but otherwise the two curves are parallel (see attached pic).
Can someone explain why I'm seeing this 3 dB difference in the visual comparison when the amplitude statistics don't indicate an overall 3 dB change? Is it a function of how the FFT analysis works?
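
In case it's relevant, this is roughly how the Frequency Analysis comparison could be reproduced outside Audition (again a sketch, assuming numpy/scipy/soundfile, placeholder filenames, and files longer than one FFT frame; Audition's exact frame averaging and dB reference may differ):

import numpy as np
import soundfile as sf
from scipy.signal.windows import blackmanharris

def averaged_spectrum_db(path, n_fft=65536):
    x, fs = sf.read(path)                           # mono float samples
    w = blackmanharris(n_fft)
    n_frames = len(x) // n_fft
    acc = np.zeros(n_fft // 2 + 1)
    for i in range(n_frames):                       # average magnitude across the whole file
        frame = x[i * n_fft : (i + 1) * n_fft] * w
        acc += np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(n_fft, d=1.0 / fs)      # bin width = fs / n_fft, so it differs
                                                    # between the 96 kHz and 48 kHz files
    return freqs, 20 * np.log10(acc / n_frames + np.finfo(float).tiny)

f96, db96 = averaged_spectrum_db("center_96k.wav")  # placeholder filenames
f48, db48 = averaged_spectrum_db("center_48k.wav")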