Topic: A large quality loss?

A large quality loss?

Reply #25
Quote
Lame and iTunes were technically tied.


Haven't there been two or three revisions to the iTunes encoder since then? Anyway, the only thing Apple has explicitly said about AAC decoding is that it is more efficient than MP3 (on its AAC page).

A large quality loss?

Reply #26
Quote
I believe they are saying that, for a given amount of "transparency", you can use a lower bitrate if you use AAC (and that lower bitrate will translate to less CPU power).

Sorry, but that is totally wrong. A lower bitrate doesn't mean less CPU: AAC needs the same CPU power no matter what the bitrate (and the same applies to MP3). Use the search function; there are many more threads like this one, and I think pretty much everything has already been discussed.
--alt-presets are there for a reason! These other switches DO NOT work better than it, trust me on this.
LAME + Joint Stereo doesn't destroy 'Stereo'

A large quality loss?

Reply #27
A lower bitrate will decrease overall power consumption, all other things being equal, due to reduced disk access; the CPU is not the only power consumer.

192 kbps MP3 may draw more power than 128 kbps AAC because it drains the player's memory buffer faster (and then has to hit the disk for more data).
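
A rough back-of-the-envelope sketch of that point, in Python; the buffer size and bitrates are made-up illustrative values, not figures from any particular player:

[code]
# How long one buffer fill lasts at a given bitrate. Higher bitrate ->
# the buffer drains sooner -> the disk has to spin up more often.
BUFFER_MB = 32  # hypothetical player RAM buffer

def refill_interval_seconds(bitrate_kbps: int, buffer_mb: int = BUFFER_MB) -> float:
    """Seconds of audio one buffer fill holds at the given bitrate."""
    buffer_bits = buffer_mb * 8 * 1024 * 1024
    return buffer_bits / (bitrate_kbps * 1000)

for kbps in (128, 192):
    print(f"{kbps} kbps: disk access needed roughly every {refill_interval_seconds(kbps):.0f} s")
[/code]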

A large quality loss?

Reply #28
I didn't have time to read all the stuff at the top, so don't shame me if I look dumb.

Going from 192 kbps MP3 to 128 kbps AAC is not a good idea because AAC and MP3 encode differently: MP3 throws some audio information away to get the file size down, and then AAC throws away a further, partly different set of information on top of that, so the losses stack and the quality ends up degrading a lot.
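
For what it's worth, here is a minimal sketch of the transcode chain being discussed, assuming ffmpeg is installed and on your PATH; the filenames are placeholders. Each lossy step throws away information the next step cannot recover:

[code]
import subprocess

# Step 1: decode the 192 kbps MP3 back to PCM. The loss from the original
# MP3 encode is already baked into this WAV.
subprocess.run(["ffmpeg", "-y", "-i", "input_192.mp3", "decoded.wav"], check=True)

# Step 2: encode that PCM to 128 kbps AAC, which discards a further,
# partly different set of information on top of the MP3 loss.
subprocess.run(
    ["ffmpeg", "-y", "-i", "decoded.wav", "-c:a", "aac", "-b:a", "128k", "output_128.m4a"],
    check=True,
)
[/code]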

A large quality loss?

Reply #29
Quote
Sorry, but that is totally wrong. A lower bitrate doesn't mean less CPU: AAC needs the same CPU power no matter what the bitrate (and the same applies to MP3). Use the search function; there are many more threads like this one, and I think pretty much everything has already been discussed.
You're right... I got my thoughts confused when thinking about power consumption and bitrate, and incorrectly translated that to CPU power. Give me a break, I wrote it at 2 am!

A large quality loss?

Reply #30
Quote
Quote
Lame and iTunes were technically tied.


Haven't there been two or three revisions to the iTunes encoder since then? Anyway, the only thing Apple has explicitly said about AAC decoding is that it is more efficient than MP3 (on its AAC page).

You're correct there

Someone did some tests comparing the new and old iTunes versions and the Nero encoders.

I believe that if you compared the latest Nero VBR AAC encoder against LAME, the gap between the two formats at ~128 kbps would be much larger.

A large quality loss?

Reply #31
Quote
Sorry, but that is totally wrong. A lower bitrate doesn't mean less CPU: AAC needs the same CPU power no matter what the bitrate (and the same applies to MP3). Use the search function; there are many more threads like this one, and I think pretty much everything has already been discussed.


This is both technically and demonstrably wrong.

A higher bitrate means that more bit reading and decoding needs to be done in the same amount of time. All other things being equal, this always leads to more CPU demand.

Bit reading/decoding may be only a small part of the overall decoding work, so the difference is very small, but it will certainly be there.
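
If anyone wants to check this for themselves, here is a rough sketch (assuming ffmpeg is installed; the filenames stand in for your own encodes of the same track) that times a full decode at different bitrates:

[code]
import subprocess
import time

def decode_time(path: str) -> float:
    """Decode the file as fast as possible, discard the output, return seconds taken."""
    start = time.perf_counter()
    subprocess.run(["ffmpeg", "-v", "quiet", "-i", path, "-f", "null", "-"], check=True)
    return time.perf_counter() - start

for f in ("track_128.mp3", "track_192.mp3", "track_320.mp3"):
    print(f, f"{decode_time(f):.2f} s")

# Expect only a small difference, since bitstream reading is a minor share
# of the total decoding work.
[/code]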

A large quality loss?

Reply #32
Quote
Quote
Quote
Lame and iTunes were technically tied.


Haven't there been two or three revisions to the iTunes encoder since then? Anyway, the only thing Apple has explicitly said about AAC decoding is that it is more efficient than MP3 (on its AAC page).

You're correct there

Someone did some tests comparing the new and old iTunes versions and the Nero encoders.

I believe that if you compared the latest Nero VBR AAC encoder against LAME, the gap between the two formats at ~128 kbps would be much larger².

To add a reply to a message that is more than a year old: you can see that this assumption was also wrong.
[url=http://www.rjamorim.com/test/multiformat128/results.html]2004[/url] = iTunes CBR (4.26) & LAME VBR (4.18) => diff: 0.08 points
2005 = iTunes VBR (4.74) & LAME VBR (4.60) => diff: 0.14 points

Conclusion: the gap has increased, but only by a small margin (0.06 points), and LAME and iTunes are still tied (i.e. neither can be said to be better from a statistical point of view, at least for the group of listeners and the samples used in the whole exercise).

___
² emphasis is mine
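
As a footnote on what "tied" means here, a small sketch of the kind of paired comparison a listening test relies on. The per-listener scores below are made up purely for illustration (the real data is in the linked test results); the point is only that the verdict depends on whether the confidence interval of the mean difference excludes zero:

[code]
import statistics

# Hypothetical per-listener ratings for the same samples (1.0-5.0 scale).
itunes = [4.8, 4.6, 4.9, 4.4, 4.7, 4.9, 4.5, 4.8]
lame   = [4.7, 4.8, 4.6, 4.6, 4.5, 4.8, 4.7, 4.6]

diffs = [a - b for a, b in zip(itunes, lame)]
n = len(diffs)
mean = statistics.mean(diffs)
sem = statistics.stdev(diffs) / n ** 0.5
t_crit = 2.365  # two-sided 95% t critical value for n - 1 = 7 degrees of freedom

low, high = mean - t_crit * sem, mean + t_crit * sem
print(f"mean difference {mean:+.2f}, 95% CI [{low:+.2f}, {high:+.2f}]")
# If the interval contains zero, the two encoders are statistically tied for
# these listeners and samples, even if one mean score is nominally higher.
[/code]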