You must not have used MP3Gain. If, for example, you use 89dB as the reference and it returns a -5dB gain, then you switch to 92dB, it'll return -2dB!
The real problem is that scanners (and users) are not supposed to change the reference level when scanning, because it is fixed in the standard.
So, since users have the ability (and reasons) to change the reference level, why don't we just remove it from the standard and store the SPL instead?
Actually, that's the problem! If you mix your library with mine, which uses 89dB, they'll play at different loudnesses, and there's no way to tell which files were scanned with which reference. If, however, RG stored the LEVEL instead, you could play at your 92dB "Preferred Level" and I could play at my 89dB. No matter what anyone chooses, every file in the world scanned with RG would play at the same loudness. That's what I call a standard. Doesn't that make sense?
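The arithmetic behind this can be sketched in a few lines of Python. This is not actual scanner code; the 94dB figure is just the measured loudness implied by the -5dB/-2dB example earlier in the thread:

```python
# A file whose measured loudness is 94 dB SPL (hypothetical example value).
measured_loudness = 94.0

# Storing GAIN bakes the scanner's reference level into the tag:
gain_at_89 = 89.0 - measured_loudness  # -5.0 dB: valid only for an 89dB reference
gain_at_92 = 92.0 - measured_loudness  # -2.0 dB: a different tag for the same file

# Storing LEVEL keeps the tag reference-free; each player derives the
# gain from its own preferred playback level at play time:
def playback_gain(stored_level, preferred_level):
    return preferred_level - stored_level

print(playback_gain(measured_loudness, 89.0))  # -5.0
print(playback_gain(measured_loudness, 92.0))  # -2.0
```

With level-based tags, two libraries scanned on machines with different preferences still come out consistent, because the preference is applied by the player rather than frozen into the file.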
Now one more question. Anyone know why the reference level was changed from 83dB to 89dB? Is there any possibility it will be changed again?
Because that's what people actually used (see here). As I recall it, 89 dB was used because 83 dB was seen as a bit low, especially for portable players. Considering it has stayed the same for over 5 years, I wouldn't expect it to be changed again. As for using level rather than gain, it has been discussed before (both in the thread linked above, as well as here), and was considered too late to change in 2003...
When it stored the absolute loudness of the file (e.g. 92dB), it was called Replay Level. When it changed to storing the relative gain (e.g. -3dB), it was renamed Replay Gain: relative gains, referenced to making the loudness 83dB through a calibrated system, plus 6dB. ... Conceptually, saying the file sounds x dB loud at the reference playback level doesn't actually tell you how much to change the gain if the non-parallel equal-loudness curves are ever taken into account, whereas saying "shift it x dB to make it the reference playback loudness" can include that factor in the calculation.