I normally scan my albums with true peak enabled, set to auto 8x oversampling. Usually the difference between true peak on and off (I've tested a few times) is in the second or third decimal place or lower.
However, I have a few albums which were digitally released as 24-bit/44.1 kHz. With these the difference is significant. For example:
album peak (true peak 8x): 1.145963
album peak (true peak off): 0.959510
What's going on, and which value should I trust? I haven't changed the default setting to downsample high-definition content, but I don't think that would apply here anyway.
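For reference, my rough mental model of what the 8x true-peak scan is doing: the samples can straddle the actual crest of the waveform, so the sample peak under-reads, and oversampling reconstructs the inter-sample peaks. A synthetic sketch of that idea (a contrived fs/4 test tone, not my actual files, using FFT-based resampling as a stand-in for whatever filter the scanner actually uses):

```python
import numpy as np
from scipy.signal import resample

fs = 44100
n = np.arange(4096)

# Tone at fs/4 with a 45-degree phase offset: every sample lands
# exactly halfway between zero crossing and crest, so no sample
# ever touches the true amplitude of 1.0.
x = np.sin(2 * np.pi * (fs / 4) * n / fs + np.pi / 4)

sample_peak = np.max(np.abs(x))        # ~0.7071 (sin(pi/4))

# 8x oversample; the reconstructed waveform recovers the crest.
x8 = resample(x, len(x) * 8)
true_peak = np.max(np.abs(x8))         # ~1.0

print(f"sample peak: {sample_peak:.6f}")
print(f"true peak:   {true_peak:.6f}")
```

In this contrived worst case the sample peak reads about 0.71 while the true peak is essentially 1.0, so a large gap between the two measurements on real material is at least plausible in principle.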