When I click on information in iTunes to see the bit rate of a song, I see different rates. One song has, say, 850 kbps, another 76-, and so on.
Why does this make sense?
Bit rates differ because songs differ. A lossless codec compresses each song as much as that song's content allows: simple, repetitive audio compresses well and ends up with a low bit rate, while dense, complex audio compresses poorly and ends up with a high one. You should be concerned if it weren't this way.
Since lossless is lossless, the decoded audio is bit-identical to the original either way, so the varying bit rate really shouldn't matter.
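To see the same effect outside iTunes, here is a minimal sketch using Python's zlib as a stand-in lossless compressor (iTunes actually uses Apple Lossless/ALAC, not zlib): two inputs of identical uncompressed size compress to very different sizes, yet both decode back exactly.

```python
import random
import zlib

# Two "songs" of equal uncompressed size but different complexity.
simple = bytes(range(256)) * 256                      # highly regular data
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(256 * 256))  # noise-like data

for name, data in [("simple", simple), ("noisy", noisy)]:
    packed = zlib.compress(data)
    # Lossless: decompressing restores every byte exactly.
    assert zlib.decompress(packed) == data
    print(f"{name}: {len(data)} bytes -> {len(packed)} bytes compressed")
```

The regular data shrinks far more than the noise-like data, which is exactly why two equally long songs can show different bit rates in a lossless format.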
http://en.wikipedia.org/wiki/Data_compression#Lossless_audio_compression
What makes sense is doing even the most basic research on a concept when it has a feature you don't understand. Closing thread.