
Topic: 128kbps Extension Test - OPEN (Read 43482 times)

  • upNorth
128kbps Extension Test - OPEN
Reply #125
Quote
I haven't read in detail how this test is performed. But in general isn't it a better idea for testing different codecs to fix a bitrate and adjust the quality level (for mpc and vorbis) so that the output is going to be exactly equal to that bitrate? This way every codec will be given the same amount of space to demonstrate their skills. Let's say 128kbps lame, q4.3 MPC, q4.5 vorbis for a specific sample, but 128kbps lame, q5 mpc, q5.5 vorbis for another sample...

What would this prove?
The beauty of a quality setting is that it's the bitrate that changes and not the quality. I still think that what matters is what kind of bitrate a certain codec at a certain setting averages to in the long run.

At least when I encode music I settle on a specific quality setting (currently MPC -q5) and not a bitrate. Do you really test every track to find the setting that is closest to your desired bitrate, and thereby end up with an album where "all" the tracks are encoded with different settings, just so that all of them have about the same average bitrate? It's fine by me if this gives others the warm fuzzy feeling everyone here talks about, but I don't get it.

As I see it, a test like this has to settle on some interesting samples to put to the test, because you have to limit the amount. I would expect that the places in a track where problems are easiest to spot might also be the places where a good VBR codec truly shines. If you force it not to be smart and not to use bits where it thinks they should be used, your test IMHO won't be worth much.

I'm not saying this is easy at all, but at least when it's done this way it applies to real-life usage. I don't really care what these Slashdot people say; it's only fun to read...

  • verloren
128kbps Extension Test - OPEN
Reply #126
I have almost the opposite question to most people.  I understand and totally agree with the reasoning that roberto has used for setting the various quality levels.  But from the accounts listed here it seems like there are many samples that are way above the 128kbps nominal value, but few if any that are significantly below (the examples I've seen have been around 122kbps for example).

So I wonder if it would be useful to give some really easy to encode samples, to make sure that when the encoder decides it only needs say 50kbps it is making as good a decision as when it picks 190kbps for a hard passage.

And no, I haven't downloaded the samples as I lack the facilities, so perhaps this is already in there!  If so I claim "official mirror's" right to ask one stupid question

Cheers, Paul

  • ff123
  • Developer (Donating)
128kbps Extension Test - OPEN
Reply #127
Quote
I have almost the opposite question to most people.  I understand and totally agree with the reasoning that roberto has used for setting the various quality levels.  But from the accounts listed here it seems like there are many samples that are way above the 128kbps nominal value, but few if any that are significantly below (the examples I've seen have been around 122kbps for example).

So I wonder if it would be useful to give some really easy to encode samples, to make sure that when the encoder decides it only needs say 50kbps it is making as good a decision as when it picks 190kbps for a hard passage.

And no, I haven't downloaded the samples as I lack the facilities, so perhaps this is already in there!  If so I claim "official mirror's" right to ask one stupid question

Cheers, Paul

I would guess that the high and low bitrates are not distributed the same way.  For example, if a codec spends 90 percent of its time at 124 kbit/s, then 10 percent of the time it could grow to 165 kbit/s while still averaging 128 kbit/s overall.  It could be that the VBR codecs never let the bitrates dip down to the extent that they're allowed to increase.
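
ff123's averaging argument is easy to sanity-check numerically. A quick sketch (the 90/10 split and the kbit/s figures are the hypothetical numbers from the paragraph above, not measurements):

```python
# Check that a codec spending 90% of its time at 124 kbit/s and 10% of
# its time at 165 kbit/s still averages out to roughly the 128 kbit/s
# nominal target. These fractions and bitrates are ff123's hypotheticals.
def weighted_average_bitrate(segments):
    """segments: list of (fraction_of_time, kbps) pairs summing to 1.0."""
    return sum(frac * kbps for frac, kbps in segments)

avg = weighted_average_bitrate([(0.9, 124), (0.1, 165)])
print(round(avg, 1))  # 128.1
```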

If that's the case (probably a reasonable assumption), and a test suite were composed entirely of random samples of music (not chosen at all for degree of difficulty), then most of the time the encodes might be completely transparent to the listeners, and basically useless for discriminating between codecs, because of the large number of samples that would be required to simulate a real-world music collection.

One type of music which seems to produce lower bitrates is solo piano. Roberto mentioned in his first test that he removed it from the test suite because it didn't discriminate well in the 64 kbit/s test. This implies that the codecs would indeed be transparent at lower bitrates. Still, one or two "very easy" samples that produce lower VBR bitrates might be a good thing to include in a future test, just to make sure none of the VBR codecs is failing badly at those bitrates.

ff123

  • askoff
128kbps Extension Test - OPEN
Reply #128
Quote
How do you know that it's 50% extra over what the original bitrate was targeted to be? You are testing just short, hard-to-encode clips. The quality setting being tested produces a 128kbps average, and we are testing the quality of a specific quality setting of a VBR codec which gives this average bitrate. We are not testing the quality of 12 different quality settings of one VBR codec in one test.

This is quite odd. Even the name of this topic clearly says "128kbps Extension Test" and nothing about "128kbps average album quality extension test clips". Well, what the heck am I whining about this subject for anymore. I guess nothing will be changed in this test at this point, so I'll just have to do it the way it has been started and try to test later in my own way by myself. And why not set up my own public test? After all, this is only Roberto's test, and it's as official as anyone else's public test. Not the only official test.

  • rjamorim
128kbps Extension Test - OPEN
Reply #129
If anyone is willing to create his own listening test, I would be very happy to help him set it up. I'm pretty sure ff123 would also be happy.


And no, this test won't be changed. It works the way it is, and lots of people have already taken it; I won't ask them to retake it (especially since there's no reason to, really).
Get up-to-date binaries of Lame, AAC, Vorbis and much more at RareWares:
http://www.rarewares.org

  • rpop
  • Global Moderator
128kbps Extension Test - OPEN
Reply #130
To all the people whining: where were you during the pre-test discussion? Don't you think these comments would've been more appropriate and helpful then?
Happiness - The agreeable sensation of contemplating the misery of others.

  • verloren
128kbps Extension Test - OPEN
Reply #131
Quote
I would guess that the high and low bitrates are not distributed the same way.  For example, if a codec spends 90 percent of its time at 124 kbit/s, then 10 percent of the time it could grow to 165 kbit/s while still averaging 128 kbit/s overall.  It could be that the VBR codecs never let the bitrates dip down to the extent that they're allowed to increase.

Thanks for the response ff123, that sounds very plausible.

Cheers, Paul

  • westgroveg
128kbps Extension Test - OPEN
Reply #132
Quote
Still have my privilege

Quote
Now, tell me what can you say overall about the average 128kbps quality (certain quality setting) based on results from those? Or actually any useful result.. I'd like to know...


The reason VBR exists is that a codec can be advanced enough to lower its bitrate as well as raise it. But for this to be a fair test - especially with different sample types - for all we know, on the harpsichord Ogg will go to an average of 200kbps whilst WMA goes to 128kbps. Now you could say, tough luck WMA for not matching Ogg when it goes to 200kbps, but you could also say that WMA is better programmed because it stays within its quality range and does not vary wildly. *** these codecs and numbers are totally made up ***

I am thinking, then, that if a codec has ABR available it should be used in preference to VBR in this type of test.

The way I see it, WMA would have failed to adapt and keep the selected QUALITY level.

  • loophole
128kbps Extension Test - OPEN
Reply #133
Quote
Quote
Also, Compressor which comes with Final Cut Pro 4 seems to have VBR options for AAC, not just ABR.

(for those who think QuickTime Pro is CBR, it isn't - it's ABR)


The purpose of the last test was to find the AAC codec to be used for this test.  Final Cut Pro 4 wasn't tested.

Final Cut Pro (by Apple) actually leverages QuickTime (also by Apple); it just displays a different interface, which seems to allow VBR modes.

  • ezra2323
128kbps Extension Test - OPEN
Reply #134
Roberto - just curious, why was WMA 9 not included, only the PRO version? Very few people here use WMA to begin with, and those that do are most likely using it because it has excellent hardware support. WMAPRO does not. I have tried to load these files onto my WMA-compliant portables and they are not recognized.

However, I would be very interested to see how 2-pass VBR 128 WMA (not the professional version) stacks up against the competition, since this is a very popular format with the new legitimate music sites popping up. Yes, I know they are likely using WMA CBR 128, but they could probably be convinced to switch to 2-pass 128 VBR if the quality gain was sufficient.

It would be interesting to see how the WMA offerings stack up against Apple's AAC offering.

  • ezra2323
128kbps Extension Test - OPEN
Reply #135
BTW - not questioning the test, I think it's great! Just requesting that WMA (regular, not pro) be added.

  • guruboolez
  • Members (Donating)
128kbps Extension Test - OPEN
Reply #136
A public blind test can't include too many encodings. It would make the rating (and ranking) harder...
I did a few tests comparing WMA std and WMA pro (one, on the 12 samples, is available in the OTHER AUDIO FORMATS section). In my mind, the gap between the STD and PRO encoders is substantial. Too substantial, maybe, to keep any hope for the standard codec's quality against stronger formats. On the other side, WMApro is more mysterious. No tests are available. No mention of its quality on Hydrogenaudio. How good is it? Can this new format, created and supported by a giant, compete in quality terms with Goliath MP4 or David Vorbis? We had to give this format a chance, and to test it against the best challengers of the moment.

This test includes the best formats available, and for each of them, the best codec at the best setting. The only exception (easy to understand): MP3. It would be interesting to include the WMA standard format, but then why not ATRAC3? Fraunhofer FastEnc VBR ~128? VQF 2.0... As I said before, a public test can't include too many challengers. Some choices were made, with dialogue. IMO, Roberto made the right ones. Other people will be disappointed. That's life...

Nevertheless, if you're interested in WMA standard performance, you can easily include some of your own encodings in each downloaded package.


About hardware support: I give WMApro a better chance than Vorbis of being widely supported on DVD/portable players in the next two years. I hope to be wrong...

  • phong
128kbps Extension Test - OPEN
Reply #137
Quote
Added your comments (actually, did some copy-pasting, hope you don't mind).

Honored.  :)

Quick question... On a couple of the samples, I'm not having too much trouble ABXing most or all of the codecs, but others are much harder for me (no surprise, obviously). If time is a limiting factor (I may only be able to set aside a small amount of time this week), which of the following would you prefer people to do:
a) A very careful analysis of a few of the samples, making every reasonable effort to distinguish as many from the originals as is possible with their equipment and ears.
B) Try to do all 12 samples, at the expense of a few of the best encodings getting rated as "perfect" where more careful analysis would reveal some minor audible defects in some of them.
c) If you can't do your most careful analysis of all 12 samples, don't bother submitting results at all.
d) Whatever floats one's boat.  Have fun and don't stress too much.

From my interpretation of the readme, I doubt c) is the case.  :)  If a) is preferred, do you have a preferred method of choosing the samples? Go down the list in order? Pick randomly? Do the easiest ones, thereby providing the most possible discriminating data?
  • Last Edit: 27 July, 2003, 04:49:30 PM by phong
I am *expanding!*  It is so much *squishy* to *smell* you!  *Campers* are the best!  I have *anticipation* and then what?  Better parties in *the middle* for sure.
http://www.phong.org/

  • rjamorim
128kbps Extension Test - OPEN
Reply #138
Quote
From my interpretation of the readme, I doubt c) is the case.  If a) is preferred, do you have a preferred method of choosing the samples? Go down the list in order? Pick randomly? Do the easiest ones, thereby providing the most possible discriminating data?

Yes, c) definitely isn't the case.

It's really a matter of whatever floats your boat. Both a) and B) suit me well. And if you decide to go with a), I suggest picking files randomly. If people just go through them in order, I'll have too many 41_30sec samples and maybe too few of the others, as happened in the AAC test.

Thanks for participating.

Regards;

Roberto.

  • rjamorim
128kbps Extension Test - OPEN
Reply #139
Damned be that stupid B) smilie.

Couldn't some admin please replace it with :cool: or something?

  • rjamorim
128kbps Extension Test - OPEN
Reply #140
Hello.

Tonight I'll have to pull the files from Paul's mirror because it's reaching its bandwidth consumption limit of 5GB.

So, I would like to ask if someone with a reasonably fast server could spare me some 20MB of space and a few GB of bandwidth so that I can keep the files there until Sunday. The test consumed 5GB in its first 5 days, so I expect it to consume as much again by the end.

If you can, please PM or mail me. You don't even need to give me login/password, just upload the packages to your server and send me the addresses.

Thank-you very much;

Roberto.

  • ff123
  • Developer (Donating)
128kbps Extension Test - OPEN
Reply #141
I'm uploading the samples now to

http://ff123.net/128exten/Samplexx.zip

I have about 12 GB bandwidth to spare.

I have to leave for work, but they should be uploaded within the next half hour or so.  I'll verify that sample12.zip uploaded properly from work.

ff123

Edit:  changed the path
  • Last Edit: 28 July, 2003, 12:53:43 PM by ff123

  • verloren
128kbps Extension Test - OPEN
Reply #142
I've also made some more space available to Roberto - who knew it would be quite this popular!  I'm sure he'll let you know the details if he decides to use the space.

Cheers, Paul

  • rjamorim
128kbps Extension Test - OPEN
Reply #143
Wow. Thanks a lot, both of you

  • puntloos
128kbps Extension Test - OPEN
Reply #144
I'm happy to set up some space on my own server as a backup mirror. I have, well, virtually unlimited bandwidth to spare on my gigabit uplink

If needed just contact me.

Also, one question: I've read most of the comments here but didn't see this addressed (I think).

Why not penalize 'naughty' codecs (say, Ogg using -q4 and generating a 190kbit file) by multiplying their end score for a particular file by 128/190? Sounds reasonable to me, but maybe I'm overlooking something big here. In the end I 'respect' that a codec rightly thinks it needs 190kbit for a particular piece to keep the 'quality level 4' there, but comparing it to a codec that is perhaps just as advanced but less frivolous with bitrate allocation isn't fair.
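
Puntloos's proposed correction factor could be sketched as follows. This is purely an illustration of his suggestion, not part of the actual test methodology; the function name, the 128 kbit/s target, and the choice to leave under-target encodes unpenalized are my own assumptions:

```python
# Hypothetical sketch of the proposed bitrate penalty: scale a sample's
# listening-test score by target/actual when the VBR encode overshoots
# the nominal bitrate. Leaving under-target encodes untouched is an
# assumption; the post doesn't say what to do with them.
def penalized_score(score, actual_kbps, target_kbps=128):
    if actual_kbps <= target_kbps:
        return score
    return score * target_kbps / actual_kbps

# The Ogg example from the post: a score of 4.5 at 190 kbit/s
print(round(penalized_score(4.5, 190), 2))  # 3.03
```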

"Yay, Vorbis wins all tests by allocating a continuous 256kbit for all samples!"

p.s. just using vorbis as an example, feel free to replace that word with <your hated codec name here>

(Uh, for the record, since people are using the term 'bandwidth' kinda loosely here: I can easily deal with sending out 100GB of data over a week. I cannot deal with sending out 1Gb/second.)
  • Last Edit: 28 July, 2003, 12:50:38 PM by puntloos

128kbps Extension Test - OPEN
Reply #145
Sorry, but the test is frozen. JohnV will eat you alive! (either he or his tiger)
  • Last Edit: 28 July, 2003, 01:35:16 PM by AstralStorm
ruxvilti'a

  • puntloos
128kbps Extension Test - OPEN
Reply #146
Quote
Sorry, but the test is frozen. JohnV will eat you alive! (either he or his tiger)

Ah, but I'm not asking to change the test.

I am just suggesting that perhaps it would be interesting to see if the outcome of the test would differ in any meaningful way using the 'puntloos audio correction factors'.

  • guruboolez
  • Members (Donating)
128kbps Extension Test - OPEN
Reply #147
Quote
Why not penalize 'naughty' codecs (say, Ogg using -q4 and generating a 190kbit file) by multiplying their end score for a particular file by 128/190? Sounds reasonable to me, but maybe I'm overlooking something big here.

Why "penalize"? A VBR codec has to put more bits into complex signals. The 12 samples are complex and difficult: it's nonsense to expect an average bitrate close to 128kbps, and completely wrong to punish a VBR codec for doing its job correctly.

I'm happy to see (or hear) MPC putting ~700kbps frames into complex signals like castanets. I was glad to use --preset standard, which introduces a lot of 320kbps frames when needed and compensates on quiet/easy parts. Everyone here is using a great format or setting. Each is VBR: that means substantial variations, but in the end, an average bitrate - the same for most albums. Is there any reason to applaud their quality for listening purposes, but blame or punish them for testing?

  • bond
128kbps Extension Test - OPEN
Reply #148
Am I right that the test still hasn't been announced on Slashdot?
I know, that I know nothing (Socrates)

  • puntloos
128kbps Extension Test - OPEN
Reply #149
Quote
Quote
Why not penalize 'naughty' codecs (say, Ogg using -q4 and generating a 190kbit file) by multiplying their end score for a particular file by 128/190? Sounds reasonable to me, but maybe I'm overlooking something big here.

Why "penalize"? A VBR codec has to put more bits into complex signals. The 12 samples are complex and difficult: it's nonsense to expect an average bitrate close to 128kbps, and completely wrong to punish a VBR codec for doing its job correctly.

I'm happy to see (or hear) MPC putting ~700kbps frames into complex signals like castanets. I was glad to use --preset standard, which introduces a lot of 320kbps frames when needed and compensates on quiet/easy parts. Everyone here is using a great format or setting. Each is VBR: that means substantial variations, but in the end, an average bitrate - the same for most albums. Is there any reason to applaud their quality for listening purposes, but blame or punish them for testing?

Oh, but as I said: I agree with a good VBR codec allocating HIGH amounts of bits to complex pieces; no problem there.

My dad taught me to always think in extremes when it comes to physics, so:

Suppose we have a piece with ONLY castanets, and MPC at -q4 (say) creates a 700kbps average file. Would you consider it fair to compare that file to (say) a Vorbis encode of the same castanets file at 140kbps? Youpi! The MPC file sounds better!

My point therefore is that even though VBR is a very valid way to encode music, and I have no problem at all if some codec I use goes above the 'indicated bitrate' if it feels it needs to. But when comparing these results I think a -certain- penalty must be given to the 700kbps output file. Im sure you agree that comparing the QUALITY of a file that averages at 700kbps with a file averaging at 140kbps isnt fair and will give skewed results when you try to determine the 'best codec'. I'm not a mathematician so Im not sure if my way of unskewing these results (multiply the 'score' of the 700kbps file with 128/700) is completely fair, but at least it will give results that closer matches 'fairness' in my mind.