Topic: 128kbps Extension Test - OPEN (Read 56551 times)

128kbps Extension Test - OPEN

Reply #150
this discussion will never end 

once again: the codec settings used (including the VBR settings) will average around 128kbps most of the time when you encode full music clips...
I know, that I know nothing (Socrates)


Reply #151
Quote
Suppose we have a piece with ONLY castanets? And MPC at -q4 (say) would create a 700kbps average file. Would you consider it fair to compare that file to (say) Vorbis that has encoded the same castanets file to 140kbps? Youpi! MPC file sounds better!! 

Yes. In my opinion, it's fair.

The only exception is for people listening to castanets -and castanets only- the entire day.
I listen to a lot of harpsichord. This implies that I know that:
- many encoders are not able to encode it properly at low-mid and even high bitrates
- many VBR encoders will statistically increase the measured bitrate for this instrument.

I'm perfectly aware that mpc --radio can't match the 128 kbps target. Even --thumb, in many cases. Vorbis is in the same situation. Then, it would be fair to balance their results too.
On the other hand, on many of my discs, solo harpsichord can be heard for only a few seconds; the average bitrate of these discs is close to 130-140 kbps, even with occasional 200 kbps parts. Quality is great for the entire disc. But if small parts are totally destroyed by the encoder (mp3, for example), I can't enjoy the whole encoding.


Roberto's test is a general one. We can't cater to particular tastes or listening behaviours, and therefore there's no need to play with some formula. For the great majority of discs, mpc --radio and vorbis -b 4,25 are close to 128 kbps. And all these common discs include complex parts like bachpsichord, castanets, Atrain... Lowering the measured performance of VBR codecs wouldn't be fair at all.


Reply #152
No, it isn't fair; that's my point.

Look, consider this: my hypothetical castanets file will be (say) 700kbps in mpc at the settings Roberto chose. Anyone who is -surprised- that castanets.mpc sounds better than castanets.ogg (140kbps) should stop taking drugs.

Now, this says one of two things:

1/ The test for 'castanets' is useless. We know the result before listening.

2/ The results should be skewed to make the result fair: yes, castanets.mpc sounds better, but at 5x the bitrate, so it needs to be corrected for this. The real result needs to be extrapolated from the choices the codec makes.

Now, the castanets or harpsichord examples are -extremes-, but still... consider a hypothetical evil AAC codec maker who sends Roberto an aacenc.exe program that adds +2 to the quality setting Roberto told it to use. Roberto wants Quality 4? Well, I'll make a Quality 6 file. Then the AAC files would of course always be more like 192kbit, but hey, that's fair, because VBR (which is, in the end, a mixed bag of codec choices made by humans!) allocates as many bits as it needs to.

See where I'm going with this? Evil-AAC wins, by your rationale.

And it shouldn't win (because of this) - the results simply have to be skewed, and a size penalty must be applied to correct for larger file sizes! I don't want to know which codec starts using an insane amount of bits. I want to know which codec makes the most efficient use of the bits it gets at 128kbit. THAT is an adequate test result.
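The "size penalty" idea above could be sketched like this. Everything here is a toy illustration of the argument, not part of any real test methodology: the function name, the linear penalty, and all scores and bitrates are invented.

```python
# Hypothetical bitrate-penalized score: scale a listening score by how
# far the codec's actual bitrate strayed from the 128 kbps target.
# The linear penalty and all numbers below are invented for illustration.

def efficiency_score(subjective_score, actual_kbps, target_kbps=128):
    """Scale a score down when the codec spent more bits than the
    target (and up when it spent fewer)."""
    return subjective_score * (target_kbps / actual_kbps)

# The castanets example: MPC at ~700 kbps vs Vorbis at ~140 kbps,
# with made-up raw scores on a 1-5 scale.
mpc = efficiency_score(4.8, 700)  # high raw score, huge bitrate
ogg = efficiency_score(4.2, 140)  # lower raw score, near-target bitrate
print(round(mpc, 2), round(ogg, 2))  # 0.88 3.84 - the penalty reverses the ranking
```

Whether a linear penalty is even the right shape is exactly what the rest of this thread argues about.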

Now, if you guys say that in the end the 'ogg package' of files is exactly the same size as the 'aac' and 'mpc' packages after encoding, then this already means that 'globally' the test might be fair; the overall quality of a codec would then be measured.

But I submit that the packages probably aren't the same size, and even if they are, comparing them on a file-by-file basis is more informative. In that case we can clearly see which codec YOU should use. If you like harpsichord pieces, you will want to know which codec is best at harpsichord pieces! You don't want to know that codec X is best overall, only to find out that it is best at everything except harpsichord.

Hu..hu.. Done with my rant

Anyway, no one is stopping ME from making a corrected results table. But honestly, I really think my points could be valid here.


Reply #153
I'm KO'd. Not by your logic, but by my poor English. I can't express what I have in mind.

Anyway, balanced results or adjusted settings, performed in order to make things 'more fair', are to me exactly the same thing. Pardon me, but I'm a bit bored of seeing, on each page, the same criticism and some variation on this theme.


Reply #154
Quote
I'm KO'd. Not by your logic, but by my poor English. I can't express what I have in mind.

Anyway, balanced results or adjusted settings, performed in order to make things 'more fair', are to me exactly the same thing. Pardon me, but I'm a bit bored of seeing, on each page, the same criticism and some variation on this theme.

No problem - thank you for trying; maybe I'm just missing the obvious point and need some sleep.

If people don't agree with me, for the next test I will write the PuntCodec™ Lossy Codec (it only loses the .wav tags). It will accept the same quality settings as ogg/aac/mpc and it will *cough* aim *cough* at around 128kbit for -q4, 192kbit for -q6. Only, no matter what setting you give it, it will copy the exact wav data 1-to-1 from input to output, giving you a VVBR (VeryVariableBitRate) file of, oh, ah, 1400kbit/sec.

And when decoding: NO ARTIFACTS! Perfect codec. PuntCodec™ wins!
(maybe for v2 I will steal the FLAC source code! At least 60% compression! Still we win!)


Reply #155
Sorry, but there was testing to find the appropriate quality settings beforehand.
Your codec wouldn't pass it.

/EDIT\ You really need to get some sleep.
I've written about it at least a few times in this thread.
But this time the text is at least amusing. Well done! \EDIT/
ruxvilti'a


Reply #156
Quote
Sorry, but there was testing to find the appropriate quality settings beforehand.
Your codec wouldn't pass it.

Ratz. Well, in that case I would send them an ogg vorbis codec to -test-, and send them the latest optimized alpha *evil laughter* when the test is about to start.

For the record, of course I'm not accusing any codec maker of cheating like me, but I AM saying that, for example, I have noticed (I haven't experimented much, I admit) MPC creating larger files than ogg at any quality setting. THIS fact does -not- make MPC a better codec. It makes MPC a codec that creates larger files.

And as I said before: file size doesn't concern me; the quality/bitrate quotient concerns me. PuntCodec@1400kbit winning everything is not interesting. AAC@100kbit creating a slightly better-sounding sample than ogg@105kbit would be, and mpc@110kbit sounding better than ogg@100kbit is inconclusive. That is my whole point.


Reply #157
Quote
Ratz. Well, in that case I would send them an ogg vorbis codec to -test-, and send them the latest optimized alpha *evil laughter* when the test is about to start.

I don't use alphas in my tests. That's why MPC is version 1.14.


Reply #158
Anyway, the tester shouldn't accept that alpha without testing average bitrates at the given quality levels.


Reply #159
Quote
but I AM saying that, for example, I have noticed (I haven't experimented much, I admit) MPC creating larger files than ogg at any quality setting. THIS fact does -not- make MPC a better codec. It makes MPC a codec that creates larger files.

Goldberg Variations, played by Glenn Gould (55 minutes, stereo digital recording):

ogg -b 4 : 114 kbps
mpc -q4 : 99 kbps

I can find you more than 100 CDs in my library where mpc --radio is more economical than vorbis -b 4(,25)...

Others :
http://membres.lycos.fr/guruboolez/AUDIO/a...assical_VBR.txt
http://membres.lycos.fr/guruboolez/AUDIO/a...etailed_VBR.txt


Reply #160
Quote
Quote
Ratz. Well, in that case I would send them an ogg vorbis codec to -test-, and send them the latest optimized alpha *evil laughter* when the test is about to start.

I don't use alphas in my tests. That's why MPC is version 1.14.

Fine, fine, I will test my codec once, and if it doesn't create a Blue Screen of Death I will call it stable, OK? *Dr. Evil laugh*

Seriously though, rjamorim (look, I spelled it right!) - don't you agree with my point:

Quote
File size doesn't concern me; the quality/bitrate quotient, what I call codec efficiency, concerns me. PuntCodec@1400kbit winning everything is not interesting. AAC@100kbit creating a slightly better-sounding sample than ogg@105kbit would be, and mpc@110kbit sounding better than ogg@100kbit is inconclusive. That is my whole point.


Or am I being silly? Guruboolez tried to argue with me, but English isn't his language; it isn't mine either (Dutch, anyone?)... I just don't see a reason not to balance quality settings on a per-file (audio-type) basis, instead of basing them on what 'some guy at some codec coding factory thought would be a good set of parameters'. Who knows, maybe MPC's algorithms are inferior for castanet sounds and it needs 700kbit to compensate for this inferiority, while all other codecs can do the same with 300kbit? You won't know without weighted results.


Reply #161
If I'm summing up the problems properly, they're these:

Some codecs are VBR. This means that they allocate bits based on the difficulty of the sample. In other words, quality remains constant, and file size varies.

Some codecs are ABR. This means that they make the best of a specified file size. In other words, file size remains constant, and quality varies.

Roberto's test was designed so that in standard, global usage file size should be approximately equal between the VBR and ABR codecs. However, for this test there are samples that are difficult to psychoacoustically encode. The ABR codecs will thus be hampered by their limited file size, whilst the VBR codecs will excel due to the relaxation of this constraint.

Conversely, there may be samples (I haven't downloaded and examined all the packages) where the ABR samples will be of higher quality than the VBR ones, because the sample is less difficult to encode.

The fact that two different encoding methodologies are being combined into one test will inevitably obfuscate proper analysis, as there is an uncontrolled variable.

It would seem to me that either all codecs should be VBR, or all should be ABR, to ensure uniformity throughout the samples.

I suppose we'll see how combining the two modes affects the results, however.
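The VBR/ABR distinction summarized in the post above can be illustrated with a toy bit-allocation model. The sample names and difficulty numbers below are invented, and real encoders are far more subtle; the sketch only shows how a varying per-sample spend can still average back to the same 128 kbps as a constant spend.

```python
# Toy VBR vs CBR bit allocation. All difficulty numbers are invented.

difficulty = {"easy_pop": 0.6, "piano": 0.9, "castanets": 2.3, "speech": 0.2}

def vbr_bitrates(samples, target=128):
    """Spend bits in proportion to difficulty, normalized so the whole
    collection still averages the target bitrate."""
    mean_difficulty = sum(samples.values()) / len(samples)
    return {name: target * d / mean_difficulty for name, d in samples.items()}

vbr = vbr_bitrates(difficulty)
cbr = {name: 128 for name in difficulty}  # constant spend everywhere

# Both collections use the same total bits, distributed differently:
print(sum(vbr.values()) / len(vbr))  # same 128 kbps average as CBR
print(vbr["castanets"])              # far above 128 on the hard sample
```

This is exactly the situation the thread is arguing over: on a test set made of hard samples, the VBR column spends well above 128 while the average over ordinary material stays on target.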


Reply #162
MPC is good if it can compensate for its inferior algorithm when it needs to (even switching to lossless), while everything else is worse, because the difference is very audible.

/EDIT\ The only CBR codec is QuickTime, which is still very good, comparable to mpc
(for me at least - I tried ff123's analyzer).
LAME is using ABR because its low-quality VBR performs poorly compared to it. \EDIT/


Reply #163
Quote
Quote
but I AM saying that, for example, I have noticed (I haven't experimented much, I admit) MPC creating larger files than ogg at any quality setting. THIS fact does -not- make MPC a better codec. It makes MPC a codec that creates larger files.

Goldberg Variations, played by Glenn Gould (55 minutes, stereo digital recording):

ogg -b 4 : 114 kbps
mpc -q4 : 99 kbps

I can find you more than 100 CDs in my library where mpc --radio is more economical than vorbis -b 4(,25)...

Others :
http://membres.lycos.fr/guruboolez/AUDIO/a...assical_VBR.txt
http://membres.lycos.fr/guruboolez/AUDIO/a...etailed_VBR.txt

OK, I stand corrected then. From those tables you showed me, the reverse is true. (It doesn't matter to me; I have no preference for ogg or mpc.)

So, for -your- kind of music, it seems ogg consistently uses more bits, with few exceptions. (I only compared q=3.5.) Now suppose that the ogg and mpc output all sounded EXACTLY the same (not perfect or transparent, just -the same quality-) to you. Wouldn't you then pick mpc as the 'best codec' instead of saying ogg is equal to mpc?

Quote
MPC is good if it can compensate for its inferior algorithm when it needs to (even switching to lossless), while everything else is worse, because the difference is very audible.

/EDIT\ The only CBR codec is QuickTime, which is still very good, comparable to mpc
(for me at least - I tried ff123's analyzer).
LAME is using ABR because its low-quality VBR performs poorly compared to it. \EDIT/


PuntCodec™ switches to lossless ALL THE TIME. It needs to, because its only lossy algorithm is 'delete the wav header'. It still sounds the same! And it gets 1400kbps. Does this make PuntCodec better than mpc? No. Q.E.D.


Reply #164
Quote
Seriously though, rjamorim (look, I spelled it right!) - don't you agree with my point:

Congratulations.

And no, I won't agree with anything at this moment. I want to see how this discussion develops first.


Reply #165
Quote
So, for -your- kind of music, it seems ogg consistently uses more bits, with few exceptions. (I only compared q=3.5.)


Not exactly. Classical music, with vorbis, is generally easy to encode. The quality scale isn't tuned for this musical genre, and the -b 4 setting comes out close to 115-120 kbps (near the -b 3 average bitrate). On the contrary, mpc needs more bitrate for tonal instruments: violin, flute, cello... But at the same setting, mpc is more sympathetic to piano, mono recordings, and some others... Mpc's bitrate has more amplitude than Vorbis's (Wma9pro VBR is the champion here).

Quote
Now suppose that the ogg and the mpc output all sounded EXACTLY the same (not perfect or transparent, just -the same quality-) to you. Wouldn't you then pick mpc as the 'best codec' instead of saying ogg is equal to mpc?


The sound can't be the same. Sorry for not supposing what you asked me to suppose, but Vorbis, AAC, WMA, MPC and MP3 simply can't sound the same. For one single sample, the artifacts are really different: not equally distributed, with different amplitudes, etc...
But if you really want me to suppose this, then yes, I would say mpc is the best (as far as I understand what you exactly mean).


Reply #166
Quote
Quote
So, for -your- kind of music, it seems ogg consistently uses more bits, with few exceptions. (I only compared q=3.5.)


Not exactly. Classical music, with vorbis, is generally easy to encode. The quality scale isn't tuned for this musical genre, and the -b 4 setting comes out close to 115 kbps (near the -b 3 average bitrate). On the contrary, mpc needs more bitrate for tonal instruments: violin, flute, cello... But at the same setting, mpc is more sympathetic to piano, mono recordings, and some others... Mpc's bitrate has more amplitude than Vorbis's (Wma9pro VBR is the champion here).

OK, agreed, of course. Each codec has its own strengths and weaknesses, and each codec has different methods of selecting how many bits it would 'like' to use on a specific type of sample: piano, rock, silence... etc.
Quote
Quote
Now suppose that the ogg and the mpc output all sounded EXACTLY the same (not perfect or transparent, just -the same quality-) to you. Wouldn't you then pick mpc as the 'best codec' instead of saying ogg is equal to mpc?


The sound can't be the same. Sorry for not supposing what you asked me to suppose, but Vorbis, AAC, WMA, MPC and MP3 simply can't sound the same. For one single sample, the artifacts are really different: not equally distributed, with different amplitudes, etc...


I agree. For example, I understand that mp3 is such an inferior codec that it CAN'T encode certain types of sounds properly, no matter how many bits you throw at it.

But you forget: we are talking statistics here. We are doing a test with (say) 100 people who all pull sliders saying how good or bad a track sounds. Now it is possible that when all the slider positions are added up and divided by 100, ogg vorbis and mpc get exactly the same score for 'harpsichord.wav'...

Quote
But if you really want me to suppose this, yes, i would say mpc is the best (as far as I understand what you exactly mean).


... and as you say. If, for example, ogg and mpc get -exactly- the same statistical average score from the 100 testers for that track, and the .ogg file is (say) 10% smaller, then ogg vorbis is the best codec for this small track, for this type of music, according to human taste in music... And again: yes, I'm sure the ogg will have sounded different from the mpc because of different coding methods, but if the panel of 100 people say they liked the tracks equally well, then that means something. It is, after all, psychoacoustic encoding; the human factor is important.

And it might be that ogg -overall- scores worse than mpc, sure, but it is useful information to know that for this specific type of track, even though statistically they sounded identical, the ogg vorbis codec would be the best, because with that information you can perhaps find out which codec is most suitable for which types of music.
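The "equal panel score, smaller file wins" reasoning above can be made concrete with a small sketch. The panel scores and file sizes below are invented placeholders, not data from this test.

```python
# Sketch of the tie-break logic: average the panel scores per codec,
# then among equal scorers prefer the smaller file. All numbers invented.

from statistics import mean

panel = {
    "ogg": [4.1, 3.9, 4.0, 4.0],   # listener scores for one sample
    "mpc": [4.0, 4.0, 4.1, 3.9],
}
size_kb = {"ogg": 900, "mpc": 1000}  # ogg ~10% smaller, as in the example

scores = {codec: round(mean(s), 2) for codec, s in panel.items()}
top = max(scores.values())
tied = [codec for codec, sc in scores.items() if sc == top]
best = min(tied, key=size_kb.get)  # among equal scorers, fewest bits wins

print(scores)  # both codecs average 4.0
print(best)    # 'ogg'
```

The rounding step matters in practice: raw float means are almost never exactly equal, so "same score" has to mean "same within some tolerance".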


Reply #167
Quote
... and as you say. If, for example, ogg and mpc get -exactly- the same statistical average score from the 100 testers for that track, and the .ogg file is (say) 10% smaller, then ogg vorbis is the best codec for this small track, for this type of music, according to human taste in music...
(...)
And it might be that ogg -overall- scores worse than mpc, sure, but it is useful information to know that for this specific type of track, even though statistically they sounded identical, the ogg vorbis codec would be the best, because with that information you can perhaps find out which codec is most suitable for which types of music.


One sample isn't necessarily representative of the whole genre, especially when you take bitrate into consideration. There are a lot of signal characteristics to take into account: noise, stereo separation, volume... With MPC, for example, single harpsichord notes will produce a very high bitrate; a higher density of attacks will reduce the bitrate. Other encoders can react the opposite way.
Therefore, you can't extrapolate bachpsichord.wav's future result to all harpsichord music. It's a good basis, but not an accurate one. I could send another bachpsichord sample, part of the same disc, and the bitrate difference would be totally different, and maybe the quality too...


Reply #168
Interesting point..

But in that case there can be a few conclusions:

1/ bachpsichord should be discarded completely, since it is a 'freak occurrence' that only shows an abnormal bitrate for one (or a few) codecs and will therefore color the test.
2/ bachpsichord is just a good example where a particular codec (or a few codecs) has trouble.

In case 1, well, discard the sample because it corrupts the 'overall image'?
In case 2, well, it is interesting to see what the same codec would do with those other samples, compared to other codecs. In principle, more samples would be needed for more accurate results. Also, it wouldn't be fair to include only one 'freak' sample that confuses one codec... at least one confusing sample per codec should be in there.

But I think in the end we are bound by some 'luck' in choosing the samples. It might be that Roberto has chosen totally horrible samples from mpc's point of view and totally great ones for ogg... there's no way to prevent such faults, really (unless you start listening to complete music pieces). The best test would be to have one sample of every song ever made, but that's kind of hard to do.

If you assume that Roberto has chosen fairly 'average, representative' samples that represent a lot of the different possible challenges lossy codecs face, then a codec that has more heavily weighted problems should receive negative points; the 'representative' choice of -all- samples should cancel out the freak occurrences along with the exception samples.


Reply #169
puntloos - were you here for the pre-test discussion?  The encoders were tested (the same versions as are being used in the test) on a large number of albums.  The settings chosen were the ones that came closest to 128kbps on average over all those albums.  Your hypothetical encoder would never have been able to achieve that because the same version used in the preliminary run would have been the one used in the actual testing.

Quote
I want to know which codec Makes the most efficient use of bits it gets at bitrate 128kbit. THAT is an adequate test result.

That's EXACTLY what's being done.  Settings were chosen that result in an average of 128kbps over the long run.  VBR codecs try to make the most efficient use of those bits by spending more (sometimes lots more) on hard parts and fewer on easy parts.  ABR and CBR codecs use those bits less efficiently by spending the same amount everywhere (very roughly speaking, of course).  In the end, both produce music collections of the same size (give or take), with the bits distributed differently.
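The pre-test calibration described here can be sketched as follows. The per-setting average bitrates below are invented placeholders; the real pre-test measured actual encoders over a large number of albums.

```python
# Rough sketch of the calibration: measure each quality setting's
# average bitrate over a corpus of full albums, then keep the setting
# that lands nearest the 128 kbps target. Numbers below are made up.

measured_avg_kbps = {3: 112.0, 4: 126.5, 5: 147.0}  # per quality setting

def closest_setting(measured, target=128.0):
    """Quality setting whose corpus-average bitrate is nearest the target."""
    return min(measured, key=lambda q: abs(measured[q] - target))

print(closest_setting(measured_avg_kbps))  # 4
```

A cheating encoder of the PuntCodec kind would fail this step: whatever setting it is given, its corpus average would sit far from 128 kbps and a different (or no) setting would be selected.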

If you forced VBR codecs to behave like ABR or CBR codecs, what would that prove?  That they do poorly when they're not used as they were intended?  Why penalize them?  Should we encode longer files at lower bitrates so they end up the same size as shorter ones?  I don't think too many people are going to try out half a dozen quality settings on each file until they achieve an exact bitrate.  They're going to choose a setting that gives them a desired quality level or a desired average bitrate over the whole of their collection.

In the case of the samples for this test, my understanding is that some were chosen BECAUSE they have been shown to be difficult to encode.  Naturally, VBR codecs will spend more bits on some of them - that's the goal of a VBR codec.  If easy samples were chosen, they would spend less.  Of course, it would be harder to hear artifacts in easy samples, so they probably wouldn't give very useful results.

This reminds me of a benchmark I recently heard about comparing a Pentium 4 to a G5.  They disabled hyperthreading on the P4 to make the test "fair".  That's like saying that if one CPU had an extra-fast cache, it should be disabled to make the test "fair".  Nonsense, of course.

Now, if we were trying to determine which would sound best for STREAMING at 128kbps, I could understand trying to manage the bitrates on a per-file basis, but that's clearly not what's being tested here.  If it were, I'm sure oggenc's managed bitrate mode (and other encoders' equivalents) would be chosen.



Reply #171
"PuntCodec™ switches to lossless ALL THE TIME. It needs to, because its only lossy algorithm is 'delete the wav header'. It still sounds the same! And it gets 1400kbps. Does this make PuntCodec better than mpc? No. Q.E.D."


I'm really having a problem understanding any flaw in this test.

Each codec averages about 128kbps.

Therefore each codec is given THE SAME AMOUNT OF BITS TO USE.

Its up to the codec on how to use them.

To test which codec does this best, we zoom in on areas which are 'very' hard to encode.  If a codec is transparent on the difficult areas, it will be transparent on the easy areas (okay, a slight assumption, but a very good one to make).

We zoom in on the hard parts and test how well each codec does.

MPC can average 1411kbps on a given sample if it wants and OGG can use 12kbps; it doesn't matter, they are the SAME SIZE FILE IN THE END... 128kbps.  MPC isn't using more bits; it's 128, just like the others.  If MPC is 'smart' enough to use bits where they are needed, well, then I think that's a good thing.

MPC IS NOT GIVEN MORE BITS THAN ANY OTHER CODEC. It's 128kbps average, just like the rest.  Prove that wrong and you might have a case.

I think the key thing to remember is this test is not about these particular samples, but music in general.

(sorry for adding this; too many people have said it already, but I dunno)


Reply #172
Quote
1/ bachpsicord should be discarded completely since it is a 'freak occurrence' that only shows an abnormal bitrate for one (or a few) codecs and will therefore color the test.

Harpsichord is known to be one of the most difficult instruments for many lossy encoders, sensitive to pre-echo (sharp attacks) and subject to heavy distortions during the development of the note. I see two advantages to it:
- easy to ABX for many people (at least, easier than piano or violin)
- interesting to see how a psymodel will react, and the bitrate consequences (for example, mpc and vorbis will detect the difficulties and increase the bitrate; a VBR setting for lame, such as --preset-standard, underrates the difficulties and will produce an abnormally distorted sound)
And harpsichord isn't a 'freak occurrence'. That's just like saying that electric guitar is something rare and should be removed: pure nonsense. The harpsichord was the most used instrument for at least two centuries, across the whole of Europe. According to my listening tastes, harpsichord is more common than any kind of cymbals. A listening test has to include a good variety of musical genres and various samples in order to be 'fair' and representative.

Quote
2/ bachpsichord is just a good example where a particular codec (or a few codecs) have trouble.

Yes, exactly. But not for a single or even a few codecs - for most lossy & perceptual formats. I know one exception: wma and wma9pro. I know wma is bad on most other samples - critical or not. I don't know wma9pro's general performance.
Therefore, it's interesting to test various formats on a difficult instrument; mixing VBR and CBR will make things more interesting (and certainly not unfair), because I know that one pure CBR encoder, at 128 kbps, may sound as good if not better than others at 160-200 kbps.

Quote
But I think in the end we are bound by some 'luck' in choosing the samples. It might be that Roberto has chosen totally horrible samples from mpc's point of view and totally great ones for ogg... there's no way to prevent such faults, really (unless you start listening to complete music pieces). The best test would be to have one sample of every song ever made, but that's kind of hard to do.

If you assume that Roberto has chosen fairly 'average, representative' samples that represent a lot of the different possible challenges lossy codecs face, then a codec that has more heavily weighted problems should receive negative points; the 'representative' choice of -all- samples should cancel out the freak occurrences along with the exception samples.

Have you ever run a listening test? Have you ever taken on the responsibility of choosing adequate and representative samples, easy enough to be distinguished from the original? What makes a sample 'exceptional', and which of the twelve is really a 'freak'? Did you participate in this test?

 


Reply #173
A question which hasn't been answered yet: what about WMAPro 2-pass? By using 2-pass you are forcing a vbr codec to a certain bitrate, telling it 'to make the most out of 128k', which is unfair because you are artificially limiting the codec.

I'm not criticizing the test as a whole, but interpreting the results will be very difficult. Maybe the only useful conclusion we can pull out of this is that vbr is better than managed bitrates, which in turn is better than pure cbr.

Take this car analogy: you have two types of cars with similar motors, one with many gears (vbr) and another with just a single gear (cbr). If you put them on an empty road (easy sample) and adjust the gear of the vbr car so its motor is running at approximately the same speed as the cbr car's, you won't notice much of a difference. If you put the two in city traffic (difficult sample), the cbr car will have a hard time because of all the varying speeds necessary that its motor just can't deliver. On average the vbr car's motor is running at the same speed as the cbr car's, but depending on the situation, other gears are used to compensate for weak motor output.

So what does this tell you about the quality of the motor? Maybe the vbr car has a worse motor than the cbr car, but uses it more efficiently. You'll never know, because with this test you don't gain access to this type of info. It will most likely only tell you that having gears in your car is a sensible thing. The analogy has its flaws, but I hope you understand my point.

If you choose samples which are known to be difficult to encode, then you are treating cbr codecs unfairly. The samples are chosen because they cause trouble, and vbr codecs have a way of reacting to troublemakers which the cbr codecs lack.

Heh, I'm creative tonight, so here comes another analogy. You are testing two boxers and start hitting them at various strengths. The cbr boxer always uses the same amount of protection, while the skilled vbr boxer adapts his protection to the force of the blow. Under normal conditions, both handle the hits similarly well. On stronger hits, the cbr boxer is sometimes a little shaky. Since this isn't really telling you much, you decide to hit them both much harder. The vbr boxer reacts and pulls out a heavy shield that can deal with even the most forceful blows, but the cbr boxer simply collapses. Again, what does this tell you?

128kbps Extension Test - OPEN

Reply #174
OK, I think I'll add my grain of salt on why this isn't unfair (or is fairer than the opposite).

First, a bigger bitrate means better quality, if we stay with the same codec version (except if there's a bug). But this is not a linear progression, so high_bitrate_quality / factor_of_high_bitrate_and_low_bitrate does not equal low_bitrate_quality.
(addendum to this: if a codec has been rated transparent, how do you know that it isn't still transparent at a lower bitrate?)

Second, as has been said many times: we are trying to identify the quality that some codecs have around 128kbps.  Quality depends on the content being encoded and the bitrate being used.  ABR/CBR codecs are constrained to a bitrate, and thus the quality is not constant.
VBR codecs try to maintain the same quality, and thus decide when to use more bits or not.

What can we extract from here?
The CBR/ABR codecs will have different quality for a given bitrate depending on the sample. This is why we use many samples, not just one, to get a global idea.
The VBR codecs, in theory, should maintain the same level of quality, with different bitrate demands, so we shouldn't expect big variations in their scores (big variations would indicate that they have failed in their purpose).
Thus, with VBR codecs we try to see how good they are at maintaining the desired quality. There's no interest at all in the bitrate.

Going back to this test: VBR codecs just can't participate in it directly, because we are already telling them a quality. Why are they included? Because we've checked the average bitrate they use over several files, and if they get the same quality across those files, we can indicate which quality setting averages that bitrate.

Since it is fair to test several samples in CBR/ABR to find the average quality that a bitrate produces, it is just as fair to test several samples in VBR to find the quality that averages that bitrate. (Addendum: you can't judge the bitrate of a VBR codec from a single file, just as you can't judge the quality of an ABR/CBR codec from a single file.)

I think this last sentence should be strong enough to defuse any more discussion about it.