
Topic: differ bitrate in itunes/foobar200 ? (Read 5865 times) previous topic - next topic


Hi,
I used both iTunes and foobar2000 to rip the same CD to ALAC, then added the rips to my iTunes library.
The file ripped with foobar2000 shows 1411 kbps in iTunes.
The file ripped with iTunes shows 890 kbps in iTunes.

Why do they show different bitrates in iTunes?

Thanks.


Reply #1
In this specific case, I would guess that iTunes messed something up: 1411 kbps is the bitrate of uncompressed Red Book CD audio, so either iTunes isn't reading the file correctly, or your encoder options in foobar2000 aren't actually compressing/encoding.

Regardless, iTunes uses a slightly different rounding method than foobar2000 when calculating bitrates for VBR files.
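For reference, the 1411 kbps figure falls straight out of the Red Book CD audio parameters; a quick back-of-the-envelope check:

```python
# Red Book CD audio: 44.1 kHz sample rate, 16 bits per sample, 2 channels
sample_rate = 44100
bits_per_sample = 16
channels = 2

bitrate_bps = sample_rate * bits_per_sample * channels
print(bitrate_bps)           # 1411200
print(bitrate_bps // 1000)   # 1411, the kbps figure iTunes shows
```

So any lossless rip that reports 1411 kbps is, in effect, reporting the uncompressed rate rather than the encoded one.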


Reply #2
Indeed, 1411 kbps is the bitrate of uncompressed CD audio, and 890 kbps sounds like a plausible losslessly compressed figure.

So first I would recheck that foobar2000 is actually producing an ALAC file and not an uncompressed one, and if it really is compressed, try to find what differs between the two files. It may simply be that when importing a file, iTunes records the uncompressed bitrate, as opposed to when it rips the CD itself.


Reply #3
Do all the songs have the same bitrate?
If they have different bitrates, I would say you accidentally used Apple Lossless to rip your files (which is a perfectly fine lossless format).
If they all have the same bitrate, though, I have no idea.


Reply #4
They didn't accidentally rip their CDs to ALAC; that is their primary goal.  I believe the 1411 kbps bitrate displayed in iTunes is for ffmpeg ALAC files (which, to my knowledge, is the only way to encode ALAC with foobar2000).  I know there was an issue with that not too long ago, where the ffmpeg ALAC files were actually encoded at 800-900 kbps but iTunes displayed a 1411 kbps bitrate.  I suggest looking at the actual sizes of these "1411 kbps" files, and also making sure that foobar2000 reports the correct bitrate for the ALAC files it encodes.


Reply #5
YES, I am using ffmpeg to rip the CD. Does the ALAC file's metadata contain bitrate data?
The bitrate is around 890 kbps (it is not a fixed bitrate).
Both rips, from iTunes and from foobar2000, show ~890 kbps in foobar2000, and the file sizes are almost the same.
I also used bit-compare on the two tracks, and there is no difference.



Reply #6
I am not sure where the bitrate data is located, but if the file sizes are roughly the same and the files are bit-for-bit identical, I wouldn't really worry about what iTunes is displaying.  Apple has changed what iTunes displays for bitrates over the years: at one time iTunes displayed AAC files as having a constant bitrate; then it showed the actual overall average bitrate; then it went back to a constant bitrate (with "(VBR)" appended if the files were iTunes VBR AAC); then back to the average bitrate; and now it is again showing a constant bitrate with "(VBR)" after VBR iTunes AAC files.

I kicked around in Google a little to see if there was a solution, or at least where the problem lies (i.e., whether it is an iTunes issue or an ffmpeg one).  I didn't find anything, but I only searched for a few minutes.  You may want to head over to the ffmpeg project to see what they say.