Topic: Nine different codecs 100-pass recompression test (Read 21699 times)

Nine different codecs 100-pass recompression test
Reply #25
Interesting.
Though I am surprised that Vorbis did so badly.

Have you tried with aoTuVb6.03?
It should be more resilient than libvorbis.


That AAC did so well is no surprise to me, as JJ said that this sort of thing was one of their design goals. For all I know, AAC may have code that recognizes files that have been processed by it.

  • Mach-X
Nine different codecs 100-pass recompression test
Reply #26
Interesting.
Though I am surprised that Vorbis did so badly.

Have you tried with aoTuVb6.03?
It should be more resilient than libvorbis.


Shouldn't make a difference. aoTuV's betas do not stray from the libvorbis spec; they are only more efficient, i.e. the quality at a given encode setting is identical and only the file size differs. Unless I stand to be corrected?

  • eahm
Nine different codecs 100-pass recompression test
Reply #27
Shouldn't make a difference. aoTuV's betas do not stray from the libvorbis spec; they are only more efficient, i.e. the quality at a given encode setting is identical and only the file size differs. Unless I stand to be corrected?

I would love to know this as well, from a Vorbis developer.

Is Ogg Vorbis improving/being developed anymore? Is all your attention on Opus now? Thanks.
  • Last Edit: 26 March, 2013, 12:19:18 PM by eahm

  • lvqcl
  • Developer
Nine different codecs 100-pass recompression test
Reply #28
i.e. the quality at a given encode setting is identical

No, it's not.

  • 2Bdecided
  • Developer
Nine different codecs 100-pass recompression test
Reply #29
Anyone else think there's an error in here?

e.g. compare...
RESULTS BY CODEC (100 PASSES, FROM BEST TO WORST): VBR, HIGH QUALITY (~256 KBPS) 1 (tie) MP3 (LAME)
...with...
DETAILED RESULTS: lame, vbr high quality

The latter sounds worse at 10 passes than the former does at 100. In the latter section at 100 passes, it sounds far worse than in the first set of samples (also supposedly after 100 passes).

Apologies if I've misunderstood or missed something.

Cheers,
David.

  • 2Bdecided
  • Developer
Nine different codecs 100-pass recompression test
Reply #30
Sound quality of lossy codecs is determined through DBT, full stop.
Agree 100%.

This test is an interesting insight into what codecs do when pushed beyond their limits, and shows you what a specific unlikely transcoding scenario will produce. However, the codec that performs best over 100 iterations is not necessarily the one that's best in a single iteration. e.g. one might do all the damage in the first iteration, and then make no change in the other 99.

It is interesting and worthwhile, but it's not the last word (and maybe not even the first word) in choosing a codec for a given application.
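The "all the damage in the first iteration" scenario is easy to model with a toy example (an illustration only, not any real codec): a pure quantizer is lossy on the first pass but idempotent afterwards, so generation 100 is bit-identical to generation 1.

```python
# Toy illustration (not a real codec): model a "lossy" step as
# quantization. All the damage happens in pass 1; passes 2..100 are
# no-ops, because already-quantized values re-quantize to themselves.

def lossy_pass(samples, step=8):
    """Quantize each sample to the nearest multiple of `step`."""
    return [step * round(s / step) for s in samples]

original = [3, 10, 17, 22, 100, -5]
gen1 = lossy_pass(original)       # the damage happens here
gen100 = gen1
for _ in range(99):
    gen100 = lossy_pass(gen100)   # idempotent: nothing changes

assert gen1 != original           # generation 1 is not lossless...
assert gen1 == gen100             # ...but generations 1..100 are identical
```

A real codec is far messier than this (decode/encode round trips, framing offsets, float noise), which is exactly why some codecs keep degrading across generations while others stabilize.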

Cheers,
David.

P.S. reminds me of this...
http://www.youtube.com/watch?v=mES3CHEnVyI
  • Last Edit: 26 March, 2013, 01:34:58 PM by 2Bdecided

  • Porcus
Nine different codecs 100-pass recompression test
Reply #31
e.g. one might do all the damage in the first iteration, and then make no change in the other 99


And that is more than just theory: lossyWAV.

  • Mach-X
Nine different codecs 100-pass recompression test
Reply #32
i.e. the quality at a given encode setting is identical

No, it's not.

Care to explain? My understanding of the betas is that quality level 2 is quality level 2 regardless, and that the tunings only reduce file size, not change sound quality.

  • hankwang
Nine different codecs 100-pass recompression test
Reply #33
About the noise in the Vorbis sample: I experienced that "Vorbis exhibits an analog noise-like failure mode" (phrasing from Wikipedia). I wonder whether this noise is really an artifact of the quantization in the codec, or is deliberately added by the decoder using a pseudorandom generator in order to mask other encoding artifacts. In a normal low-bitrate Vorbis sample with noise-like artifacts, I find that less disturbing than the warbling sounds in MP3. It would make sense to mask artifacts with noise and it would explain the huge noise after 100 re-encodes.

Anyone who knows the internals of Vorbis who could chime in?

  • Primius
Nine different codecs 100-pass recompression test
Reply #34
If codec A was better than codec B after 100 iterations, wouldn't it also be better on the first iteration?
Is lossyWAV a "realistic" counterexample? How would lossyWAV perform if a random time shift were introduced between the compression iterations?
(To be fair, this would also be applied to the other codecs in the test.)

Would optimizing an existing encoder to perform well in this test inevitably result in regressions in the first encode iteration?

Could the reason why Opus ranked low be that "it has no psychoacoustic model"?

Could the high-frequency noise caused by 100 iterations of Vorbis be the same underlying problem that caused the "HF noise boost" complaints in the past, which I read about in the wiki?

  • greynol
  • Global Moderator
Nine different codecs 100-pass recompression test
Reply #35
If codec A was better than codec B after 100 iterations, wouldn't it also be better on the first iteration?

Not necessarily.

At the end of the day you have to rely on DBT for any particular codec/setting/sample/iteration/etc. so I don't see the point in such a lazy end-around.
13 February 2016: The world was blessed with the passing of a truly vile and wretched person.

Your eyes cannot hear.

  • db1989
  • Global Moderator
Nine different codecs 100-pass recompression test
Reply #36
If codec A was better than codec B after 100 iterations, wouldn't it also be better on the first iteration?
Maybe you missed the discussion about the potential for codecs to recognise that the input signal had previously been processed by that format and act accordingly. It’s not been verified AFAIK, but it’s a very real possibility, so you can’t just generalise like this. There are plenty of reasons why such simple rules may not hold, and they are generally a bad idea.

Anyway, in case it hasn’t already been said enough, DBT of properly encoded first-generation files is the only way to judge a codec’s performance in the normal use cases for which it is designed. Any extrapolation from 100 passes is pointless at best, dangerously misleading at worst.

  • Mach-X
Nine different codecs 100-pass recompression test
Reply #37
db1989 and greynol: agreed 100%. And greynol, I hadn't meant to imply that you intended to bin the discussion. I was simply suggesting to all mods that, while the test isn't particularly useful on a practical level, and no conclusions about ANY codec should be drawn from the results (and all such claims SHOULD be binned), I find the tests and results interesting on a casual academic level.

Indeed, on a casual listen of the samples, I am a bit embarrassed to say I might not be able to ABX the 100-pass AAC against the original. Along the lines of what Arnie was saying: is it possible the AAC encoder can detect what has already been processed? After one pass, does it simply spit the same file out 99 times? Can we use file size or some other measurement to find out?
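The file-size/hash check suggested above could be sketched like this. This is a hypothetical harness, not any real AAC tool: `encode` is a stand-in for a full encode/decode round trip, and here it is replaced by a toy bit-masking function just so the loop is runnable.

```python
# Hash each generation and report when (if ever) the bitstream stops
# changing. If the output stabilizes early, later passes are no-ops.
import hashlib

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def find_fixed_point(data: bytes, encode, max_passes=100):
    """Return the generation number at which the output stopped changing,
    or None if it never stabilized within max_passes."""
    prev = fingerprint(data)
    for n in range(1, max_passes + 1):
        data = encode(data)
        cur = fingerprint(data)
        if cur == prev:
            return n - 1   # generation n-1 was already the fixed point
        prev = cur
    return None

# Toy stand-in "encoder": zero the low 3 bits of every byte (idempotent).
toy_encode = lambda b: bytes(x & ~0x07 for x in b)
print(find_fixed_point(b"some pcm-ish bytes", toy_encode))  # prints 1
```

With a real codec, `encode` would shell out to the encoder and decoder; identical hashes across generations would confirm the "spits the same file out 99 times" hypothesis, while merely similar file sizes would not.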

  • lvqcl
  • Developer
Nine different codecs 100-pass recompression test
Reply #38
Care to explain? My understanding of the betas is that quality level 2 is quality level 2 regardless, and that the tunings only reduce filesize, not change in sound quality.

http://en.wikipedia.org/wiki/Vorbis#Tuned_versions
Quote
Various tuned versions of the encoder (Garf, aoTuV or MegaMix) attempt to provide better sound at a specified quality setting, usually by dealing with certain problematic waveforms by temporarily increasing the bitrate.

  • Mach-X
Nine different codecs 100-pass recompression test
Reply #39
I see the word "attempt" in there, but no evidence that anything audible was actually accomplished or tested. In fact, since I can't ABX libvorbis at -q2 or higher, it stands to reason that those tunings offer no improvements at settings higher than that, including those used in this experiment.

  • Nick.C
  • Developer
Nine different codecs 100-pass recompression test
Reply #40
In fact, since I can't ABX libvorbis at -q2 or higher, it stands to reason that those tunings offer no improvements at settings higher than that, including those used in this experiment.
[my emphasis]
So, on the basis of one failed ABX result, you contend that no improvements can be made? Which material did you use? Were any of the samples known problem samples for Vorbis?

On the topic of recursive lossyWAV processing - at the same quality settings, lossyWAV stops changing the audio at about the fourth iteration.
lossyWAV -q X -a 4 -s h -A --feedback 2 --limit 15848 | FLAC -5 -e -p -b 512 -P=4096 -S-

  • saratoga
Nine different codecs 100-pass recompression test
Reply #41
In fact, since I can't ABX libvorbis at -q2 or higher, it stands to reason that those tunings offer no improvements at settings higher than that, including those used in this experiment.


Tuning in this context usually means improving transparency on rare problem files. It's no surprise you don't notice a difference; at those bitrates most codecs are generally transparent except for the sorts of problem files that tuning is meant to help with.

  • Mach-X
Nine different codecs 100-pass recompression test
Reply #42
Precisely the point I was getting at. At the bitrates *used* in this experiment, on the sample *used*, there is no evidence to suggest that using a tuned fork of Vorbis would produce results any different from those already presented. *I* didn't put forth a claim; somebody else did. Still waiting on the ABX test of the 100-pass libvorbis vs the 100-pass tuned fork.

  • Spikey
Nine different codecs 100-pass recompression test
Reply #43
Quote
Anyway, in case it hasn’t already been said enough, DBT of properly encoded first-generation files is the only way to judge a codecs’ performances in the normal use-cases for which they’re designed. Any extrapolation from 100 passes is pointless at best, dangerously misleading at worst.

I think, in addition to this, it misses an obvious point: after, say, 3 re-encodes instead of 100, is the 'loser' from the 100-pass experiment ABX-able from the 'winner'? Or any codec versus any other, for that matter. So while after 100 passes things might be really obvious (or really confusing), after just 1-3 re-encodes all of them may be non-ABX-able from one another (although, of course, the test still needs to be done!).

Interesting thread, although I think it's confusing/oversimplifying a good topic rather than clarifying it. (Scary to see some old-timers relying on a wave graph with obvious limitations rather than their own ears/logic!)