understanding measured and viewable changes after resampling
2019-04-09 19:47:03

I'm using Adobe Audition 3 to analyze a single channel of audio (e.g., just the Center channel of a 5.1 track). Audition offers an 'Amplitude Statistics' function, as well as a 'Frequency Analysis' FFT view.

When I resample a 96kHz/24-bit track to 48kHz (using the foobar2k SoX plugin, though others give the same result), the min/max sample values increase, and the minimum RMS power (in dB) decreases. Peak amplitude rises a few hundredths of a dB (not consistently; it depends on what track I measure, and sometimes I see no change), while max, average, and total RMS power (in dB) stay the same.

Code:

Center from multi 96kHz/32bit: Mono
Min Sample Value:    -18289.15
Max Sample Value:    27042.06
Peak Amplitude:      -1.67 dB
Possibly Clipped:    0
DC Offset:           0
Minimum RMS Power:   -132.63 dB
Maximum RMS Power:   -15.05 dB
Average RMS Power:   -31.55 dB
Total RMS Power:     -28.07 dB
Actual Bit Depth:    24 Bits
Using RMS Window of 50 ms

Center from SoX-resampled multi 48kHz/32bit: Mono
Min Sample Value:    -17149.08
Max Sample Value:    27218.93
Peak Amplitude:      -1.61 dB
Possibly Clipped:    0
DC Offset:           0
Minimum RMS Power:   -inf dB
Maximum RMS Power:   -15.05 dB
Average RMS Power:   -31.55 dB
Total RMS Power:     -28.07 dB
Actual Bit Depth:    24 Bits
Using RMS Window of 50 ms

But in a Frequency Analysis of the two (FFT size 65536, Blackman-Harris window), the 48kHz curve is 3dB lower than the 96kHz curve; there are some differences in curve shape at either end of the spectrum, but otherwise the two curves are parallel. (attached pic)

Can someone explain why I'm seeing this 3dB difference in the visual comparison when the amplitude statistics do not indicate an overall 3dB change? Is it a function of the FFT analysis?
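For what it's worth, the same comparison can be reproduced with synthetic noise. The sketch below is my own (not from Audition): it uses numpy/scipy, band-limited noise standing in for the audio, `resample_poly` standing in for the SoX resampler, and Welch's `scaling='spectrum'` as an approximation of a per-bin FFT display. RMS is unchanged by the 96k-to-48k conversion, but with the FFT size held fixed the per-bin level drops by about 3dB at the lower rate, because each bin is half as wide (fs/N) and so collects half the noise power.

```python
# Sketch: RMS vs. per-bin FFT level across a 2:1 resample (assumed setup,
# not the original files). Requires numpy and scipy.
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs_hi, fs_lo, nfft = 96000, 48000, 65536

# Band-limited noise at 96 kHz: content kept below ~20 kHz, like typical
# audio, so downsampling to 48 kHz removes essentially no signal power.
white = rng.standard_normal(8 * fs_hi)
sos = signal.butter(8, 20000, fs=fs_hi, output="sos")
x_hi = signal.sosfilt(sos, white)

# Anti-aliased 2:1 downsample, a stand-in for a SoX resample to 48 kHz.
x_lo = signal.resample_poly(x_hi, 1, 2)

def rms_db(x):
    return 20 * np.log10(np.sqrt(np.mean(x ** 2)))

def band_level_db(x, fs):
    """Mean per-bin level (dB) over 1-10 kHz, same FFT size at either rate."""
    f, p = signal.welch(x, fs=fs, window="blackmanharris",
                        nperseg=nfft, scaling="spectrum")
    band = (f > 1000) & (f < 10000)
    return 10 * np.log10(np.mean(p[band]))

rms_hi, rms_lo = rms_db(x_hi), rms_db(x_lo)
lvl_hi, lvl_lo = band_level_db(x_hi, fs_hi), band_level_db(x_lo, fs_lo)

print(f"RMS: {rms_hi:.2f} dB vs {rms_lo:.2f} dB")       # essentially equal
print(f"Per-bin level difference: {lvl_hi - lvl_lo:.2f} dB")  # ~3 dB
```

With `scaling='density'` instead, the two curves would coincide, since a power spectral density normalizes out the bin width.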