
Topic: Should i start using AAC? Can you notice a better quality?

  • shadowking
  • [*][*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #25
The modern lossy bitrate is far too bloated. Comparisons of MP3 vs. AAC are difficult now that the average bitrate is 250-300 kbps. Several years back, the motivation of AAC and others was to provide near-transparency at 128 kbps and better handling of problem samples where MP3 at 192 kbps wasn't cutting it. Other encoders used VBR to implement these improvements; an early example was the MPC encoder, which gave great quality at 160-200 kbps.

Things were really interesting back then on the lossy side of things, around 2002-2006. The other thing is that MP3 was always competitive at 192 kbps if you don't count rare problem samples. Even then it could probably sound very acceptable and satisfy 90-something percent of people. This is probably true even for the old CBR 192 encodings.

Where it's at today, you could have just stuck with 256 kbps CBR MP3 ten years ago, ignored every lossy audio development since, and still be competitive.
wavpack -b4x4s1c

  • HTS
  • [*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #26
The modern lossy bitrate is far too bloated. Comparisons of MP3 vs. AAC are difficult now that the average bitrate is 250-300 kbps.

But I think video games and other entertainment media are still using sub-standard 128-160kbps mp3 or oggs.


  • shadowking
  • [*][*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #27
Yes. What I'm trying to say is that at 250-300 kbps you can use a LAME encoder from the dark ages and still be fine.
  • Last Edit: 29 January, 2012, 12:13:15 AM by shadowking
wavpack -b4x4s1c

  • IgorC
  • [*][*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #28
Quick question.

Wikipedia says:

Quote
The MPEG-2 audio tests showed that AAC meets the requirements referred to as "transparent" for the ITU at 128 kbit/s for stereo, and 320 kbit/s for 5.1 audio.


Who conducted these tests, and when? How did MP3 score at 128 kbps?


http://mpeg.chiariglione.org/quality_tests.php
http://mpeg.chiariglione.org/working_docum...AAC_results.zip
http://mpeg.chiariglione.org/working_docum...ts_overview.zip

In my opinion it's true. A high-quality AAC encoder provides high quality at 128 kbps for the statistically average listener. Some people won't hear the difference even at a lower bitrate than that; others will need to raise the bitrate.

  • zima
  • [*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #29
I read everywhere that "AAC is better than MP3", so that is why I ask this. I know this applies at low bitrates, but does it at high bitrates too?

I'm sorry, but I can't help it, a car analogy popped into my mind  (and such, as everybody knows, are the benchmark in ~computing...):

I read everywhere that "AAC is better than MP3" seems a bit like "higher-octane fuel is better"... yeah, sure, it kind of is, in that it technically allows for "more on the edge" things, potentially higher-performing engine designs.
But virtually any engine is tuned for, and works with, a particular, perfectly sufficient octane level - so pouring a "better" (than specified for the engine) octane fuel into the tank won't really bring any relevant improvement in its operation (at least in a non-malfunctioning engine), despite what many people think and say.
(I guess the hearing system is a rough equivalent of the engine here... hm, maybe this analogy didn't turn out entirely horrible after all - and now that I think about it, there's plenty of audiophile-like snake-oil salesmen in the automotive field, too.)


Now, on to make my post less silly...

I did a quick test with foobar and I'm surprised I cannot even hear any difference between a q .35 AAC and a FLAC file! And .35 means about 100 kbps; with MP3 I'm sure I would have heard a difference... I guess.
[...]
I will test more when my new Sennheiser headphones arrive; I read somewhere that with headphones you hear more "sound bugs".

Such surprises tend to happen quite often when people do a proper ABX test... and I think you really should also do an ABX of LAME encodes at similarly low bitrates (~2× lower than your usual encodes); the results might very well surprise you similarly.
(BTW, training yourself to notice particular kinds of "sound bugs" also helps in hearing them - but IMHO one should really reconsider the utility of essentially trying to notice them, if the idea is supposedly listening to music.)
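To make the "proper ABX test" point concrete, here is a minimal sketch (mine, not from this thread; plain Python) of the usual significance check: the probability of scoring at least k correct out of n ABX trials by pure guessing, i.e. a one-sided binomial test at p = 0.5.

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """Chance of getting at least `correct` of `trials` ABX trials
    right by guessing alone (one-sided binomial test, p = 0.5)."""
    return sum(comb(trials, i) for i in range(correct, trials + 1)) / 2 ** trials

# 12 of 16 right leaves only ~3.8% probability of pure guessing
print(round(abx_p_value(12, 16), 3))
```

A result like 12/16 is what listening tests typically treat as a positive identification; 9/16 or so is indistinguishable from coin flipping.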


[...] do ABX tests, see what is the lowest bitrate where you can't tell the difference between the original CD audio and MP3 (or AAC) compressed audio. For me it's rather low, between 128 and 160 kbit - so I'm using around 192 kbit encoding, to be on the safe side.

Is that really "rather low"? Per the MPEG tests linked nearby (for just one example), it would seem close to typical, at worst.


the iTunes store no longer sells any 128 kbps AAC files; everything has been 256 kbps for a couple of years

I don't think that's quite so universal? A quick search for confirmation brings up http://support.apple.com/kb/ht1711 "Why is iTunes Plus format available for certain music, but not all?" (plus I believe it's a bit country-specific, also in how some areas still get DRM'd music).


I have heard people speculate that Apple chose CVBR for iTunes because they were worried users would be confused by songs resulting in different bitrates even though the quality settings were the same, but I don't know how accurate that is. iTunes (on OS X at least) also used to have issues calculating bitrates on true VBR AAC files, but this may have been fixed since.

Edit: I just did a quick test with the latest iTunes using 256 kbps VBR; the resulting file reads as 256 kbps in iTunes and 269 kbps in the OS X Finder. XLD using constrained VBR produced a file that reads as 269 kbps in both Finder and iTunes. I suspect iTunes deliberately displays the bitrate as 256 kbps when the files are produced through iTunes conversion.

It would certainly seem more elegant if iTunes and iPods had simply displayed, from the start, some "quality preset expressed as nominal bitrate" tag, while the actual bitrate (essentially hidden from the user) were allowed to float in true VBR fashion.
But maybe they weren't so confident in the fully VBR mode at the beginning? (I suppose CBR, and maybe ABR, is somewhat more straightforward to implement properly at first - would any codec dev here like to shed light on it?)
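On the displayed-bitrate discrepancy above: the Finder's number is presumably just stream size over duration, while iTunes echoes the encoder's nominal setting. A hedged one-liner sketch (names mine, not any Apple API):

```python
def average_bitrate_kbps(audio_bytes: int, duration_seconds: float) -> float:
    """True average bitrate: encoded payload size over play time.
    A player that instead echoes the encoder's nominal setting (e.g. "256")
    can disagree with this by 10+ kbps on a VBR file."""
    return audio_bytes * 8 / duration_seconds / 1000

# e.g. a 4:00 track whose audio stream is ~8.07 MB averages ~269 kbps
print(round(average_bitrate_kbps(8_070_000, 240)))
```

So both displayed values can be "right": one is the preset, the other the measured average.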


The modern lossy bitrate is far too bloated.
[...]
The other thing is that MP3 was always competitive at 192 kbps if you don't count rare problem samples. Even then it could probably sound very acceptable and satisfy 90-something percent of people. This is probably true even for the old CBR 192 encodings.

Where it's at today, you could have just stuck with 256 kbps CBR MP3 ten years ago, ignored every lossy audio development since, and still be competitive.

I'd go further: for many listeners, in many parts of the world, p2p (certainly of quite "random" quality - heck, how many files are transcodes from other lossy files?), YouTube (etc., similarly random) videos, or low-bitrate streams are a major medium of music...

And people seem to be generally happy with those. Likewise with FM radio, or non-plus DAB in areas using "too low" MP2 (two!) bitrates (or at least, research suggests they're fine - in fact, the complaints seem to be at least partly 'audiophile' in nature).


Overall, IMHO, the main utility of AAC (particularly as HE-AAC v2; generally, of any of the "more advanced" codecs) lies in being able to cram more music, at perfectly acceptable quality, into portable players and mobile phones (or the just-mentioned radio streams - but then, many stations don't seem to bother...) - after all, as HA tests show, we're getting to where 96 kbps is already quite decent, and it (and lower) will only improve.
OTOH, it looks like space constraints might become moot, even on the cheapest of devices, in the relatively near future... (well, maybe even then it will still be "which format and bitrate gives the best battery life on this particular player?" - energy storage doesn't improve nearly so fast; likewise bandwidth, for a large part of the human population).
  • Last Edit: 29 January, 2012, 07:03:56 AM by zima

Should i start using AAC? Can you notice a better quality?
Reply #30
The iTunes Store is 256 kbps AAC with no DRM everywhere now; no more 128 kbps with DRM.

  • HTS
  • [*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #31
Does anyone know what Apple means by the 0-127 numbers for their TVBR, other than just "the higher the better"? What do those numbers literally correlate to in the technical sense?

  • lvqcl
  • [*][*][*][*][*]
  • Developer
Should i start using AAC? Can you notice a better quality?
Reply #32
Why do you think that it has some hidden meaning?

  • hlloyge
  • [*][*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #33
[...] do ABX tests, see what is the lowest bitrate where you can't tell the difference between the original CD audio and MP3 (or AAC) compressed audio. For me it's rather low, between 128 and 160 kbit - so I'm using around 192 kbit encoding, to be on the safe side.

Is that really "rather low"? Per the MPEG tests linked nearby (for just one example), it would seem close to typical, at worst.


For me it is - considering that most people choose considerably higher bitrates for encoding their music.

  • adlai
  • [*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #34
The answer is yes.

I moved years ago and I haven't regretted it.

Also, the AAC codec is under active development by Apple, while LAME is pretty much glacial in its development.

  • saratoga
  • [*][*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #35
Also, the AAC codec is under active development by Apple, while LAME is pretty much glacial in its development.


Latest LAME release: v3.99 (October 2011)

Not sure why you think that. LAME is quite active:

http://lame.cvs.sourceforge.net/viewvc/lam...ml/history.html
  • Last Edit: 29 January, 2012, 05:21:25 PM by saratoga

  • adlai
  • [*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #36
Most of it is for stupid stuff like tagging files or minor bug fixes. Not many quality improvements.
  • Last Edit: 29 January, 2012, 05:44:33 PM by db1989

  • saratoga
  • [*][*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #37
So if what you really meant is that Lame is active, but that the Apple encoder is more active, I'm curious how you know this.  Got a link to the Apple encoder changelog?
  • Last Edit: 29 January, 2012, 05:44:43 PM by db1989

  • IgorC
  • [*][*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #38
3.99 fills the bitrate gap between V0 and CBR 320 kbps, and that's useful.
But the last time, halb27 and I had a discussion about whether there has been any real quality improvement at the same bitrate since LAME 3.97: http://www.hydrogenaudio.org/forums/index....mp;#entry779727

In short: no improvement since 3.97.
  • Last Edit: 29 January, 2012, 06:08:04 PM by IgorC

  • slks
  • [*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #39
Anyway, if one uses XLD or any other QuickTime front-end to produce AAC from one's own lossless material, it's better to target quality (range 0-127) instead of bitrate. A target quality of 110 roughly corresponds to 255 kbps VBR, but the real average bitrate of a track, which is the value both Finder and iTunes show, actually depends on source complexity.


While I agree that, theoretically, a properly designed "true VBR" mode is better than ABR - the recent AAC listening tests here showed that for Apple's encoder, the "Constrained VBR" mode (which I suppose is like ABR) actually scored better.

I'm no expert at statistics or at reading these listening-test graphs, but it seemed that the Constrained VBR setting did score marginally better (and both modes scored better than Nero AAC). My guess is that Apple has not tuned their true VBR mode as much as they have the constrained VBR.

  • C.R.Helmrich
  • [*][*][*][*][*]
  • Developer
Should i start using AAC? Can you notice a better quality?
Reply #40
the recent AAC listening tests here showed that for Apple's encoder, the "Constrained VBR" mode (which I suppose is like ABR) actually scored better.

Sorry, that is not what the test showed. The average scores alone don't tell the whole story. Please read the text above (and below) the result plots on the test page.

Quote
I'm no expert at statistics or reading these listening test graphs, but it seemed that the Constrained VBR setting did score marginally better. (And both modes scored better than Nero AAC). My guess is that Apple has not tuned their true VBR mode as much as they have the constrained VBR.

Again, I disagree. True VBR averaged 6-7 kbps lower in bitrate than constrained VBR while not performing significantly worse in terms of quality, so it seems very well tuned. (That said, on the items tested, TVBR is tied with the Fraunhofer encoder, whereas CVBR is significantly better than it.)

Chris
If I don't reply to your reply, it means I agree with you.

  • polemon
  • [*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #41
I found the most revealing instrument to be harpsichord


That appears to be a rather common observation.


This is due to the fact that the harpsichord produces tones with very strong high harmonics, which are difficult to encode properly by means of lossy compression. Codecs encode wave harmonics. A sine wave has very few of them (just the fundamental), while square waves have the most. Actually, since you get ringing at the vertical edges of a square wave, which is a type of singularity, the number of harmonics in a square wave is infinite (and you can only get so close to it).

The harpsichord produces waveforms that are mostly triangle-like, which are one order of complexity above sine waves.
Theoretically, square-edged digital signals are the most difficult to encode, along with random signals like white noise with no cut-off limitation.
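The harmonic-content claim above can be checked numerically. A sketch of mine (NumPy; one second of signal with an integer number of cycles, so there is no spectral leakage): an ideal square wave's odd harmonics fall off as 1/n, a triangle wave's as 1/n².

```python
import numpy as np

# 1 s at 48 kHz, 440 Hz fundamental -> exactly 440 cycles, leakage-free FFT
fs, f0, n = 48_000, 440.0, 48_000
t = np.arange(n) / fs
square = np.sign(np.sin(2 * np.pi * f0 * t))
triangle = 2 / np.pi * np.arcsin(np.sin(2 * np.pi * f0 * t))

def harmonic_amps(x, k=5):
    """Magnitudes of the first k odd harmonics, normalised to the fundamental."""
    spec = np.abs(np.fft.rfft(x))
    bins = [round(f0 * h * n / fs) for h in range(1, 2 * k, 2)]
    amps = spec[bins]
    return amps / amps[0]

print(np.round(harmonic_amps(square), 3))    # ~ [1, 1/3, 1/5, 1/7, 1/9]
print(np.round(harmonic_amps(triangle), 3))  # ~ [1, 1/9, 1/25, 1/49, 1/81]
```

The triangle's harmonics decay much faster, which is the sense in which it sits between a sine and a square wave in complexity.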
-EOF-

  • Defsac
  • [*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #42
most of it is for stupid stuff like tagging files or minor bug fixes. Not many quality improvements.
QuickTime 7.7 was security fixes only. QuickTime 7.6.6 has been around since March 2010, and that release was to address H.264 and iMovie issues. The last version that affected AAC encoding was 7.6.0, I believe (January 2009).

  • nu774
  • [*][*][*][*][*]
  • Developer
Should i start using AAC? Can you notice a better quality?
Reply #43
most of it is for stupid stuff like tagging files or minor bug fixes. Not many quality improvements.
Quicktime 7.7 was security fixes only. Quicktime 7.6.6 has been around since March 2010, and that release was to address H264 and iMovie issues. The last version that affected AAC encoding was 7.6.0 I believe (Jan 2009)

Actually, they often update the AAC encoder without any announcement.
As far as I know, the most recent update was in QuickTime 7.7.1, CoreAudioToolbox 7.9.7.8.

  • Defsac
  • [*][*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #44
most of it is for stupid stuff like tagging files or minor bug fixes. Not many quality improvements.
Quicktime 7.7 was security fixes only. Quicktime 7.6.6 has been around since March 2010, and that release was to address H264 and iMovie issues. The last version that affected AAC encoding was 7.6.0 I believe (Jan 2009)

Actually, they often update AAC encoder without any announce.
As far as I know, the most recent update was on QuickTime 7.7.1, CoreAudioToolbox 7.9.7.8.

I very much doubt 7.7.1 affected the AAC encoder, given that it wasn't even released on OS X (correct me if I'm wrong; it wasn't released for 10.6 at least) and was released on Windows to address a specific buffer-overflow exploit that was fixed with a system update on OS X. The latest version for OS X 10.6 is 7.6.6.

They do occasionally make undocumented changes to QuickTime, but given that they don't even bother to mention them in the version announcement, I think these are usually very minor. I've been using QT AAC for years, and the last major change I noticed was from 7.5.5 to 7.6.0 (significantly increased bitrates at -q 127), which was documented in the 7.6.0 release notes as 'AAC encoder improvements' or something like that. But the QT release notes are very vague at the best of times.

In any case, I don't think there's any reason to suggest that QT is any more actively developed than LAME (and the LAME changelog is much more explicit).
  • Last Edit: 27 February, 2012, 06:38:44 AM by Defsac

  • nu774
  • [*][*][*][*][*]
  • Developer
Should i start using AAC? Can you notice a better quality?
Reply #45
I very much doubt 7.7.1 affected the AAC encoder given it wasn't even released on OS X (correct me if I'm wrong, it wasn't released for 10.6 at least) and was released on Windows to address a specific buffer overflow exploit that was fixed with a system update on OS X.

Look at
http://www.hydrogenaudio.org/forums/index....showtopic=91484
and
http://www.hydrogenaudio.org/forums/index....showtopic=90678

They have been improving the AAC encoder.
Even if you didn't know that, you can simply compare the resulting AAC bitstreams, and you will find that the QT 7.7.1 result is not bit-identical to the older ones.

  • TechVsLife
  • [*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #46
As a practical matter, if you notice artifacts that you think are correctable (a bug or poor encoding), report them to Apple tech support (google "apple bug report"). However, it is a minor hassle setting up an account to do so (BTW, with Microsoft, most teams use connect.microsoft.com).

The guys who work on QuickTime were pretty good at replying to bug reports, and they're at least "aware" of this site. In April 2011, they told me that an AAC quality bug I reported had been fixed in a subsequent release. ("We believe this issue has been addressed in Mac OS X Lion Developer Preview Build 11A419.")
  • Last Edit: 24 March, 2012, 02:15:56 PM by TechVsLife

  • simonh
  • [*][*][*]
Should i start using AAC? Can you notice a better quality?
Reply #47
Personally, I'd say stick with what you know and trust. I used to be occasionally tempted to re-encode to Vorbis or, only possibly, AAC. What stopped me is... why bother? MP3 is, as Wikipedia states, "the de facto standard" for digital music. Thanks to the LAME devs, MP3 continues to be the most popular format after all these years. Another thing to consider is that in a few years, all the patents on MP3 will have expired. There's every chance that at that point, even your toaster will be "MP3 compatible"!

Should i start using AAC? Can you notice a better quality?
Reply #48
The only reason I would use AAC is if you live in the Apple ecosystem. LAME MP3s still have issues with gapless playback.

  • greynol
  • [*][*][*][*][*]
  • Global Moderator
Should i start using AAC? Can you notice a better quality?
Reply #49
I presume you mean the problem when streaming from iTunes. How is it that LAME is at fault?
13 February 2016: The world was blessed with the passing of a truly vile and wretched person.

Your eyes cannot hear.