I have a couple of questions concerning the average level of treble in recordings, because I'm noticing differences between, say, an original UK LP from 1991 and the CD version from the same era. Some vinyl rips sound a bit brighter and clearer, while the CD's treble seems to have a limit.
a) Do CDs have a treble limit when mastered, and do they follow a "right" reference? If so: (1) What treble level is expected on a CD? (2) How can this be measured in Audacity, for both vinyl and CD? What dB levels should I expect in the treble range for a CD versus vinyl, and how can I tell if someone's master is a little off? (I'm not talking about the pre-emphasis techniques on 1980s CDs here; those are super-bright.)
b) Are these vinyl rips impossible to judge because of all the analogue-to-digital conversion factors involved, such as the equipment, cartridge, and stylus used? Are some styli brighter? Or does the post-processing software apply EQ and boost the treble? Or are these vinyl cuts really gems of their time?
I will use an example here:
Kraftwerk - The Mix - original UK CD vs. original UK LP. The LP seems better. Or is it just brighter than it should be, so it sounds good to my ears, while the CD is the right reference?
Certain pressings, such as Tears For Fears - Songs From The Big Chair, are another hellish case. I came across an almost perfect vinyl rip of the UK original pressing, and boy does it sound good; the mastering is definitely different. But the Mercury UK CD sounds a little "thinner", while the LP has a bit more bass.
Last but not least:
Can I downsample a 32/192 vinyl rip to 16/44.1 with no major problems using SoX (no dithering), or would it be wiser (though pointless in practice) to downsample such a high-resolution file to 24/96, 24/48, or 16/48 because the sample rates divide evenly (less complex math)? I'm not asking about audible differences here, but about which choice is theoretically "better".
Thanks for dropping a comment.
Vinyl is capable of frequencies up to about 25 kHz. By design, a CD is limited to half the sample rate (44.1/2 = 22.05 kHz), but in practice a bit lower, since the anti-aliasing filter needs some room: it starts to roll off sharply around 21 kHz.
But both values are above the upper threshold of our hearing.
The dynamic range of a CD is 16 bits × ~6 dB per bit ≈ 96 dB; vinyl can reach a dynamic range of about 70 dB, but 50 dB is a common value in practice.
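Both figures follow directly from the format, and can be sanity-checked in a few lines (the ~6 dB-per-bit rule is just 20·log10(2) per bit):

```python
import math

def nyquist_hz(sample_rate):
    """Highest representable frequency: half the sample rate."""
    return sample_rate / 2

def dynamic_range_db(bits):
    """Quantization dynamic range: 20*log10(2**bits), about 6.02 dB per bit."""
    return 20 * math.log10(2 ** bits)

print(nyquist_hz(44100))               # 22050.0 Hz
print(round(dynamic_range_db(16), 1))  # 96.3 dB (CD)
print(round(dynamic_range_db(24), 1))  # 144.5 dB (24-bit)
```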
The big question is, are you listening to the same masters?
This is not easy to establish.
I do think CDs are often a slightly remastered version of the original tape.
At the very least, a CD has no RIAA pre-emphasis applied, which is a must when cutting vinyl.
The vinyl playback chain will have an influence as well.
As you mentioned, the cartridge will have its own coloration; likewise, the RIAA correction in the phono stage might color the sound.
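For reference, the RIAA playback curve is defined by three standard time constants (3180 µs, 318 µs, 75 µs), and a phono stage that deviates from it will tilt the tonal balance, which is one way a rip ends up bright. A sketch of the ideal playback gain, normalized to 0 dB at 1 kHz:

```python
import math

# Standard RIAA time constants, in seconds
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_playback_db(f):
    """Ideal RIAA playback (de-emphasis) gain in dB at frequency f.
    Magnitude of H(s) = (1 + s*T2) / ((1 + s*T1) * (1 + s*T3))."""
    w = 2 * math.pi * f
    num = math.sqrt(1 + (w * T2) ** 2)
    den = math.sqrt(1 + (w * T1) ** 2) * math.sqrt(1 + (w * T3) ** 2)
    return 20 * math.log10(num / den)

ref = riaa_playback_db(1000)  # normalize to 0 dB at 1 kHz
for f in (20, 1000, 20000):
    print(f, round(riaa_playback_db(f) - ref, 1))
# 20 Hz: +19.3 dB, 1 kHz: 0.0 dB, 20 kHz: -19.6 dB
```

The roughly ±20 dB swing across the audio band shows why even a small error in the phono stage's correction is clearly audible as a tonal shift.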
If you truncate from 24 to 16 bits, dither is recommended; it turns the quantization error into benign, signal-independent noise.
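A toy illustration of why (a synthetic tone, not a real rip): a signal quieter than half of one 16-bit step simply rounds to silence, while TPDF dither lets it survive inside the noise:

```python
import math, random

random.seed(0)
FS = 48000
LSB = 1 / 32768   # one 16-bit step, for signals scaled to [-1, 1)
N = 4096

# A 1 kHz tone quieter than half an LSB
x = [0.4 * LSB * math.sin(2 * math.pi * 1000 * n / FS) for n in range(N)]

def to_16bit(v):
    """Round to the nearest 16-bit step."""
    return round(v / LSB) * LSB

plain = [to_16bit(v) for v in x]
# TPDF dither: difference of two uniform randoms, spanning +/- 1 LSB
dithered = [to_16bit(v + (random.random() - random.random()) * LSB) for v in x]

print(all(v == 0 for v in plain))                   # True: the quiet tone truncates to pure silence
print(sum(a * b for a, b in zip(dithered, x)) > 0)  # True: the dithered output still correlates with the tone
```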
In the past it was wise to stick to integer ratios, e.g. 192 > 96 > 48 and 176.4 > 88.2 > 44.1, because of badly programmed resamplers.
I don’t think SoX has this problem.
You can load the tracks in Audacity to compare them.
A spectrum analysis is also useful.
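If you'd rather script the comparison, here is a rough sketch of the idea: compute the fraction of spectral energy above a treble cutoff for each rip. Synthetic tones stand in for the two rips below; a real script would read the WAV files and use numpy.fft instead of this naive DFT:

```python
import cmath, math

FS, N = 44100, 512

def dft_mag(x):
    """Naive DFT magnitudes for the positive-frequency bins (use numpy.fft in practice)."""
    return [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N)))
            for k in range(N // 2)]

def treble_fraction(x, cutoff_hz=10000):
    """Fraction of spectral energy above cutoff_hz."""
    mags = dft_mag(x)
    k_cut = round(cutoff_hz * N / FS)
    total = sum(m * m for m in mags)
    return sum(m * m for m in mags[k_cut:]) / total

# Synthetic stand-ins: bin-aligned tones near 1 kHz and 12 kHz
f_low, f_high = 12 * FS / N, 140 * FS / N
dull   = [math.sin(2*math.pi*f_low*n/FS) + 0.1*math.sin(2*math.pi*f_high*n/FS) for n in range(N)]
bright = [math.sin(2*math.pi*f_low*n/FS) + 0.5*math.sin(2*math.pi*f_high*n/FS) for n in range(N)]

print(treble_fraction(dull) < treble_fraction(bright))  # True: the brighter signal has more energy up top
```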
CDs are "ruler flat" across the audio range, and it's cheap and easy to make a flat player. There is a lot more variation in LPs (production and playback), and of course cartridges vary. With CDs you are hearing what the producer intended (within the variations of your speakers and room).
You may have a particularly bright cartridge, or there could be something wrong with your RIAA equalization. Depending on your cartridge, the record, and other variables, you can also get mistracking that distorts "S" sounds. If most records sound bright compared to the CD, I'd guess it's your system.
Back in the analog-vinyl days, many/most popular/rock records were dull sounding (rolled-off highs). Classical and jazz were reputed to be better but that wasn't what I was listening to. I don't have any experience with "modern" records.
"Can I downsample a 32/192 vinyl rip to 16/44.1 with no major problems using SoX (no dithering), or would it be wiser (though useless) to downsample such a high resolution to 24/96, 24/48 or 16/48 because of the even numbers of the samplerate (less complex math)?"
No. "Theoretically" or "mathematically", any sample-rate conversion is imperfect and irreversible. And if you were to downsample by throwing away every other sample, you'd (potentially) get aliasing, so it's not done that way: you have to filter, and filtering is complex and imperfect. So there's no advantage to even-numbered ratios.
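A quick demonstration of that aliasing (a synthetic 30 kHz tone at 96 kHz, naively decimated to 48 kHz, folds down to 48 − 30 = 18 kHz):

```python
import math

FS_HI, FS_LO = 96000, 48000
f = 30000  # above the new Nyquist limit (24 kHz) after decimation

# Naive downsampling: keep every other sample, no anti-alias filter
hi = [math.sin(2 * math.pi * f * n / FS_HI) for n in range(2048)]
decimated = hi[::2]

# The 30 kHz tone now appears at 18 kHz (phase-inverted), sample for sample
alias = [-math.sin(2 * math.pi * 18000 * n / FS_LO) for n in range(1024)]

print(max(abs(a - b) for a, b in zip(decimated, alias)) < 1e-9)  # True
```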
If you just change the bit depth, that's simpler. If you upsample from 16 to 24 bits, it's perfectly reversible back to 16 bits.
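That reversibility is easy to show: padding 16-bit samples with 8 zero bits and later shifting them back out loses nothing:

```python
import random

random.seed(1)
samples16 = [random.randint(-32768, 32767) for _ in range(1000)]

# "Upsample" to 24-bit by shifting in 8 zero bits
samples24 = [s << 8 for s in samples16]

# Back to 16-bit: shift the 8 bits out again; nothing is lost
roundtrip = [s >> 8 for s in samples24]

print(roundtrip == samples16)  # True
```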
Of course, practically or scientifically it makes no difference. 16/44.1 is far, far better than vinyl, and the vinyl noise dominates everything. Mathematically, noise is randomness, and that randomness is worse than any conversion/rounding errors.
Dither or not, it doesn't matter here. At 16 bits or better you can't hear dither (or its effects). If you downsample to 8 bits, dithering will probably help. Dither is (low-level) noise, and since vinyl noise sits well above the 16-bit noise floor, vinyl rips are already self-dithered.
BTW, if you simply digitize the same record twice you won't get the same "mathematical" results (you'll get different sample values). If you understand how sampling works, you should understand why. (It will sound identical.)
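That last point can be sketched too: a synthetic 1 kHz sine stands in for the record, and a tiny clock offset stands in for the second capture. The sample values differ, but the signal's energy (and sound) is the same:

```python
import math

FS, f, N = 44100, 1000, 44100  # one second of a 1 kHz tone

def digitize(offset_s):
    """Sample the same 'analog' sine, starting at a slightly different instant."""
    return [math.sin(2 * math.pi * f * (n / FS + offset_s)) for n in range(N)]

take1 = digitize(0.0)
take2 = digitize(1.7e-5)  # clock started a fraction of a sample period later

rms = lambda x: math.sqrt(sum(v * v for v in x) / len(x))

print(take1[:3] == take2[:3])               # False: different sample values
print(abs(rms(take1) - rms(take2)) < 1e-3)  # True: same signal energy
```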