Originally posted by plazz Is any kind of compression at all used in audio CD's? I know that there is on DVD's, just at a huge bitrate.
Originally posted by Speek Yes, there are only 44,100 samples per second (per channel) on a CD, whereas truly uncompressed music would have an unlimited number of samples. But this is a somewhat unusual view.
Originally posted by bryant No wonder CDs sound so bad! :mad:
Originally posted by rjamorim WHA? Are you saying my beloved Floyd CDs sound bad? Be prepared, I'll start saying bad things about your cat.
Originally posted by The Belgain Can anyone tell the difference between 16-bit 44.1 kHz music and material sampled at a higher frequency or bit depth (such as DVD-Audio, SACD, etc.)? Was 44.1 kHz chosen arbitrarily as a result of the amount of data that could be put on a CD and the play length they were looking for, or was it chosen to bear some relation to the upper frequency limit of human hearing?
Originally posted by bryant Very true, Speek! And the data is also truncated to 16 bits, which is yet another form of compression. No wonder CDs sound so bad! :mad: But, by convention, linear PCM at any bit depth or sample rate is considered "uncompressed".
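For anyone curious what "uncompressed" works out to in practice, here is a minimal sketch (the figures follow directly from the CD parameters mentioned in the thread: 44,100 samples/s, 16 bits, 2 channels):

```python
# Data rate of uncompressed CD audio (linear PCM, Red Book parameters).
sample_rate = 44100   # samples per second, per channel
bit_depth = 16        # bits per sample
channels = 2          # stereo

bitrate = sample_rate * bit_depth * channels   # bits per second
bytes_per_minute = (bitrate // 8) * 60         # raw audio bytes per minute

print(bitrate)           # 1411200 bits/s (about 1.4 Mbit/s)
print(bytes_per_minute)  # 10584000 bytes, roughly 10 MB per minute
```

That roughly 1.4 Mbit/s figure is why a 74-minute disc needs on the order of 740 MB of audio data, and why lossy codecs at 128-192 kbit/s represent a better-than-10:1 reduction.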
Originally posted by rjamorim I would believe that, when the Sony and Philips engineers developed the CD standard, they looked for settings that would sound good to almost everyone: 44,100 Hz, 16-bit, stereo.
Originally posted by plazz Sony and Philips developed the CD, but the "blueprint" for encoding audio digitally at 44.1 kHz was developed in the 1920's or 30's, by a guy who died a few months ago. Can anyone remember his name? I remember the newspapers crediting him as one of the founders of the "digital revolution".
Originally posted by Frank Klemm Refutable. The figure 44.1 kHz cannot have been discussed in the 1920's or 30's; that would violate the principle of causality. (44.1 kHz is based on TV norms, which were introduced in the 40's and 50's.) Claiming things without any proof seems to be standard these days.
Originally posted by [JAZ] Man! At that time things were analog, not digital (even the "wannabe computers"). The reason for 44,100 Hz, as I've read somewhere, is due to the VHS. They chose a rate that could contain the "usual" range of human hearing (20 Hz-20 kHz) and that was a multiple of... something about the VHS... I can't remember, but it is related to that. There was already the intention of leaving some extra margin, to avoid the Nyquist frequency problem, and that is why CDs are said to have a lowpass filter at 20 kHz (or that they should).
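The commonly cited derivation (early digital masters were stored through PCM adaptors on video recorders, so the sample rate had to pack a whole number of samples into each video line of both TV standards) can be sketched as arithmetic. The line and field counts below are the figures usually quoted for those adaptors; treat them as the textbook explanation rather than anything confirmed in this thread:

```python
# 44.1 kHz falls out of fitting 3 samples per usable video line
# on both major TV standards (figures commonly cited for the
# video-based PCM adaptors used to master early digital audio).
ntsc_rate = 245 * 60 * 3   # 245 usable lines/field * 60 fields/s * 3 samples
pal_rate = 294 * 50 * 3    # 294 usable lines/field * 50 fields/s * 3 samples

print(ntsc_rate, pal_rate)         # both come out to 44100

# Nyquist: a 44.1 kHz rate can represent frequencies up to half that,
# leaving some margin above the nominal 20 kHz limit of human hearing.
nyquist = ntsc_rate // 2
print(nyquist)                     # 22050 Hz
```

The margin between 20,000 Hz and the 22,050 Hz Nyquist limit is what gives the reconstruction lowpass filter room to roll off, which matches [JAZ]'s point about the 20 kHz filter.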
Originally posted by gdougherty Contrary to your statement, there isn't any inherent bit truncation with 16-bit files. Music properly recorded at 16 bits and never digitally amplified past 16-bit full scale would never be truncated, as I understand things. And 24-bit material converted down to 16-bit with a very good algorithm doesn't necessarily truncate the signal and throw away the dynamic range of the 24-bit original. Anything in the dynamics that does get discarded is likely inaudible unless you have the volume up so loud that the louder sections would make your ears bleed (i.e., it drops the bottom of the signal, not the top).
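The difference between naive truncation and a "very good algorithm" mostly comes down to dithering. A minimal sketch of both, on integer samples (the helper names and the simple TPDF dither here are illustrative, not any particular mastering tool's method):

```python
import random

def to16_truncate(s24):
    """Naive 24->16 bit conversion: drop the 8 least-significant bits.
    Python's >> is an arithmetic shift, so this works for negative samples too."""
    return s24 >> 8

def to16_dithered(s24, rng=random):
    """Add TPDF dither (sum of two uniform randoms spanning about one
    16-bit LSB, i.e. 256 counts at 24-bit scale) before truncating.
    This trades low-level distortion for a small amount of noise."""
    dither = rng.randint(-128, 127) + rng.randint(-128, 127)
    s = s24 + dither
    # clamp to the signed 24-bit range before dropping bits
    s = max(-(1 << 23), min((1 << 23) - 1, s))
    return s >> 8

quiet = 100  # a 24-bit sample quieter than one 16-bit step (256)
print(to16_truncate(quiet))  # 0: the quiet detail vanishes entirely
```

With plain truncation, any signal below one 16-bit step is silenced, which is the "drops the bottom" effect described above; the dithered version instead lets such signals survive statistically, spread across many noisy samples.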
Originally posted by godzilla525 I have a portable CD player that has anti-skip memory in it (Panasonic SL-291C). Just for fun I played a 20-20kHz sine sweep through it. With anti-skip turned off, the sweep played fine. When I turned anti-skip on, the disc spindle speed went up to 2x, and pretty much anything above 10kHz in the sweep was trashed--a lot of strange beeping, an increase in background hiss in some places...I'd say just to leave anti-skip off unless you really need it.