
Multiformat@128 public listening test - CANCELLED

Reply #25
Quote
I agree, I shouldn't.

But I will for sure be influenced by this click, or even by the length of the tracks 

By the way even with the buffer @ 2000 ms, the noise is still here 

How can you tell that the tracks are different length?  Do they have a slightly different starting positions or are the ending points different?

ff123

Multiformat@128 public listening test - CANCELLED

Reply #26
Quote
Also sometimes just moving the starting position of the playback range using the selection bar down at the bottom will eliminate the clicking.

Yes, you can easily set the playback range to avoid the click. I've just gone through the kraftwork samples, concentrating on a middle part, and it works fine.

However, for 4 out of 6 samples, if you listen from the start then it's immediately obvious which one is the coded version. That's hardly "hidden reference"!



Quote
The java library lowers the volume, for some unknown reason. Hopefully this is not too serious of a problem, since this occurs for all samples equally.


Well, it's a good job we're not testing dithering or resampling!

It's not serious, but it's not right. At best it means turning up your volume control; at worst it may drop artefacts below the noise floor, or add extra ones.

I would have thought bit-perfect playback (where possible) was a pre-requisite of such a tool?

I agree it shouldn't hurt the current test. The click issue does though - I would think it completely biases the test towards the two coded versions which maintain the click, and so can't be immediately identified.

Cheers,
David.

Multiformat@128 public listening test - CANCELLED

Reply #27
Quote
Quote
Quote
The version of Vorbis used by the test was old. The ogg file was not created by aoTuV beta2 (it's aoTuV experiment [20040402]). If you download "aoTuV beta2 Win32 OggEnc - locale fix" (file name oggenc_aotuv_b2m.zip) from the aoTuV test page and encode a suitable file, it will output a clearly different file.

Oh no!

This is bad... 

Oh my god

I encoded hongroise.wav with both the beta 2 and 040402 versions of aotuv and compared the decoded wavs with the decompressed ogg file supplied in the download.  The spectral view is clearly consistent with beta 2 and not with 040402.  In the beta 2 version, the lowpass is varying above 17 kHz, and in 040402 the lowpass is a constant 17 kHz.

However, if I do a binary compare of the wavs, I find all three are different from each other.
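For anyone wanting to reproduce that kind of check, here is a minimal sketch (file names hypothetical, not part of the test package) that compares only the PCM frames of two WAV files, so that differing header metadata alone doesn't make the files look different:

```python
import wave

def pcm_frames(path):
    """Read the raw PCM frames from a WAV file, ignoring header metadata."""
    with wave.open(path, "rb") as w:
        return w.readframes(w.getnframes())

def identical_audio(path_a, path_b):
    """True if two WAV files carry byte-identical sample data."""
    return pcm_frames(path_a) == pcm_frames(path_b)
```

Note that a whole-file binary compare can report a difference even when the audio is identical, which is why comparing only the frame data is the safer test.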

ff123

Multiformat@128 public listening test - CANCELLED

Reply #28
Quote
Quote
I agree, I shouldn't.

But I will for sure be influenced by this click, or even by the length of the tracks 

By the way even with the buffer @ 2000 ms, the noise is still here 

How can you tell that the tracks are different length?  Do they have a slightly different starting positions or are the ending points different?

ff123

Well, it is written: for example, the supposed length for "Hongroise" is 30.00,
and when I play a file, it stops at 29.99... Then I'm sure that it's the encoded one 
It doesn't happen with all tracks, but it does happen with some of them
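A length giveaway like this is easy to verify outside the test tool; the sketch below (an illustration only, not part of the ABC/HR software) reports a WAV file's duration from its header, so a 30.00 s reference and a 29.99 s encode can be told apart by the numbers alone:

```python
import wave

def wav_duration(path):
    """Duration in seconds: frame count divided by sample rate."""
    with wave.open(path, "rb") as w:
        return w.getnframes() / w.getframerate()
```

This is exactly why decoded samples are usually trimmed to identical frame counts before a blind test is assembled.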

Multiformat@128 public listening test - CANCELLED

Reply #29
Quote
Quote
Quote
I agree, I shouldn't.

But I will for sure be influenced by this click, or even by the length of the tracks 

By the way even with the buffer @ 2000 ms, the noise is still here 

How can you tell that the tracks are different length?  Do they have a slightly different starting positions or are the ending points different?

ff123

Well, it is written: for example, the supposed length for "Hongroise" is 30.00,
and when I play a file, it stops at 29.99... Then I'm sure that it's the encoded one 
It doesn't happen with all tracks, but it does happen with some of them

Ah, visual evidence.

Well, I'll notify schnofler, so he can fix this for future versions.

ff123

Multiformat@128 public listening test - CANCELLED

Reply #30
Quote
(I think I would still prefere everything to be in one torrent in the future but it is a minor point.)

The problem with this approach is that some people don't want the entire samples package, especially considering it's 165 MB in size. Some might want to download-as-they-test, others only have the time to test one or two samples and that's it...

If only the sample packages were smaller, but at 165kbps, it really wouldn't help much.

Multiformat@128 public listening test - CANCELLED

Reply #31
Quote
I found the problem.
The version of Vorbis used by the test was old. The ogg file was not created by aoTuV beta2 (it's aoTuV experiment [20040402]). If you download "aoTuV beta2 Win32 OggEnc - locale fix" (file name oggenc_aotuv_b2m.zip) from the aoTuV test page and encode a suitable file, it will output a clearly different file.

The vendor names are:
aoTuV experiment [20040402] = AO; aoTuV b2 test [20040402] (based on Xiph.Org's 1.0.1)
aoTuV beta2 [20040420] = AO; aoTuV b2 [20040420] (based on Xiph.Org's 1.0.1)
...
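The vendor string being compared here lives in the Vorbis comment header: a packet starting with type byte 3 and the literal "vorbis", followed by a 32-bit little-endian length and the string itself. A rough sketch for pulling it out of raw file bytes (it assumes the comment header isn't split across Ogg pages, which holds for typical encoder output):

```python
import struct

def vorbis_vendor(data):
    """Extract the vendor string from raw Ogg Vorbis bytes, or None.

    Scans for the comment header marker (packet type 3 + 'vorbis'),
    then reads the 32-bit little-endian vendor length and the string.
    Simplification: assumes the header is not split across Ogg pages.
    """
    marker = b"\x03vorbis"
    pos = data.find(marker)
    if pos < 0:
        return None
    pos += len(marker)
    (length,) = struct.unpack_from("<I", data, pos)
    return data[pos + 4 : pos + 4 + length].decode("utf-8")
```

Checking this string against the list above is enough to tell the two encoder builds apart.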

[EDIT]
Correction of an expression

What?

Multiformat@128 public listening test - CANCELLED

Reply #32
Quote
Just for the next test:

Would be nice to include a shell script/Makefile for decoding on the non-windows platforms. Wouldn't be too much work and sure is pretty easy.

Doing the decoding by hand is error-prone too.

The problem I have is that it is impossible to provide precompiled binaries for all the "important" platforms - Linux (all the different and incompatible distros), MacOS X, BSD and Solaris. Also, I don't know about platform specifics. To decode Vorbis on Linux, do I use oggdec or ogg123?
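A decode script along those lines could be little more than a table mapping file extensions to decoder commands. The mapping below is purely illustrative - which decoder a given platform actually ships (oggdec vs. ogg123, for instance) and its exact flags should be checked against each tool's man page:

```python
import subprocess
from pathlib import Path

# Hypothetical mapping; verify tool names and flags on your platform.
DECODERS = {
    ".ogg": ["oggdec"],
    ".mp3": ["lame", "--decode"],
    ".aac": ["faad"],
}

def decode_command(sample):
    """Build the decode command for one sample, based on its extension."""
    path = Path(sample)
    try:
        tool = DECODERS[path.suffix.lower()]
    except KeyError:
        raise ValueError(f"no decoder configured for {path.suffix}")
    return tool + [str(path)]

def decode_all(directory):
    """Decode every recognized sample in a directory, skipping the rest."""
    for path in sorted(Path(directory).iterdir()):
        if path.suffix.lower() in DECODERS:
            subprocess.run(decode_command(path), check=True)
```

Something this small would at least remove the error-prone by-hand step, while leaving platform-specific tool choices to the tester.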

My experience with Linux is very limited (my Linux partition is now absolutely fubar after a failed attempt at getting my gigabit NIC recognized, which led to installing kernel 2.6, which broke everything completely), and I have zero experience with OS X, Solaris and BSD.

I can add ErikS' script to the package, but I won't be able to provide support about it.

Regards;

Roberto.

Multiformat@128 public listening test - CANCELLED

Reply #33
Quote
Quote
I found the problem.
The version of Vorbis used by the test was old. The ogg file was not created by aoTuV beta2 (it's aoTuV experiment [20040402]). If you download "aoTuV beta2 Win32 OggEnc - locale fix" (file name oggenc_aotuv_b2m.zip) from the aoTuV test page and encode a suitable file, it will output a clearly different file.

The vendor names are:
aoTuV experiment [20040402] = AO; aoTuV b2 test [20040402] (based on Xiph.Org's 1.0.1)
aoTuV beta2 [20040420] = AO; aoTuV b2 [20040420] (based on Xiph.Org's 1.0.1)
...

[EDIT]
Correction of an expression

What?

He pointed out that the Vorbis samples were created by the wrong encoder. He noticed that the samples were not created by the "aoTuV b2" encoder (released on April 20) but by the "aoTuV b2 test" encoder (released April 4), which was a pre-beta/test version leading up to aoTuV b2.

I also checked the vendor string of one sample and confirmed, as he said, that it was "AO; aoTuV b2 test [20040402] (based on Xiph.Org's 1.0.1)", which is to say the file was created by that very test version (aoTuV b2 test). Sad to say, you used "aoTuV experiment [20040402] Win32 OggEnc - locale fix" instead of "aoTuV beta2 Win32 OggEnc - locale fix" from his web site.

Multiformat@128 public listening test - CANCELLED

Reply #34
Congratulations to harashin, who is the first participant to submit all 18 results.

(@harashin: did you sleep at all last night? :B )

Multiformat@128 public listening test - CANCELLED

Reply #35
Quote
He pointed out that the Vorbis samples were created by the wrong encoder. He noticed that the samples were not created by the "aoTuV b2" encoder (released on April 20) but by the "aoTuV b2 test" encoder (released April 4), which was a pre-beta/test version leading up to aoTuV b2.

I also checked the vendor string of one sample and confirmed, as he said, that it was "AO; aoTuV b2 test [20040402] (based on Xiph.Org's 1.0.1)", which is to say the file was created by that very test version (aoTuV b2 test). Sad to say, you used "aoTuV experiment [20040402] Win32 OggEnc - locale fix" instead of "aoTuV beta2 Win32 OggEnc - locale fix" from his web site.

Oh, joy. Guess I downloaded the wrong locale fix.

What do we do now?

Multiformat@128 public listening test - CANCELLED

Reply #36
Quote
Quote
He pointed out that the Vorbis samples were created by the wrong encoder. He noticed that the samples were not created by the "aoTuV b2" encoder (released on April 20) but by the "aoTuV b2 test" encoder (released April 4), which was a pre-beta/test version leading up to aoTuV b2.

I also checked the vendor string of one sample and confirmed, as he said, that it was "AO; aoTuV b2 test [20040402] (based on Xiph.Org's 1.0.1)", which is to say the file was created by that very test version (aoTuV b2 test). Sad to say, you used "aoTuV experiment [20040402] Win32 OggEnc - locale fix" instead of "aoTuV beta2 Win32 OggEnc - locale fix" from his web site.

Oh, joy. Guess I downloaded the wrong locale fix.

What do we do now?

Encode again and restart the test.
Someone could say that it might have been easier just to use official Vorbis instead of the third-party tweaks, but I'm not saying that. 
It's weird that you weren't better informed about the version. I would have thought that the Vorbis enthusiasts would have made sure you got the right version.
Juha Laaksonheimo

Multiformat@128 public listening test - CANCELLED

Reply #37
Quote
Encode again and restart the test

Yep, I don't think that many people have already started...
Basically, we have had this problem once before; there should still be enough time to restart.
I know, that I know nothing (Socrates)

Multiformat@128 public listening test - CANCELLED

Reply #38
Quote
Quote
Quote
I found the problem.
The version of Vorbis used by the test was old. The ogg file was not created by aoTuV beta2 (it's aoTuV experiment [20040402]). If you download "aoTuV beta2 Win32 OggEnc - locale fix" (file name oggenc_aotuv_b2m.zip) from the aoTuV test page and encode a suitable file, it will output a clearly different file.

The vendor names are:
aoTuV experiment [20040402] = AO; aoTuV b2 test [20040402] (based on Xiph.Org's 1.0.1)
aoTuV beta2 [20040420] = AO; aoTuV b2 [20040420] (based on Xiph.Org's 1.0.1)
...

[EDIT]
Correction of an expression

What?

He pointed out that the Vorbis samples were created by the wrong encoder. He noticed that the samples were not created by the "aoTuV b2" encoder (released on April 20) but by the "aoTuV b2 test" encoder (released April 4), which was a pre-beta/test version leading up to aoTuV b2.

I also checked the vendor string of one sample and confirmed, as he said, that it was "AO; aoTuV b2 test [20040402] (based on Xiph.Org's 1.0.1)", which is to say the file was created by that very test version (aoTuV b2 test). Sad to say, you used "aoTuV experiment [20040402] Win32 OggEnc - locale fix" instead of "aoTuV beta2 Win32 OggEnc - locale fix" from his web site.

There must be more to this explanation, though, because, as I wrote in my previous post, I did a spectral-view comparison of the two encodes of Hongroise vs. what's in the download package, and it is clearly consistent with the beta2 version (using oggenc_aotuv_b2m.zip), not with the 040402 version.

ff123

Multiformat@128 public listening test - CANCELLED

Reply #39
Quote
The spectral view is clearly consistent with beta 2 and not with 040402.  In the beta 2 version, the lowpass is varying above 17 kHz, and in 040402 the lowpass is a constant 17 kHz.

However, if I do a binary compare of the wavs, I find all three are different from each other.

all three are different? 
strange
I know, that I know nothing (Socrates)

Multiformat@128 public listening test - CANCELLED

Reply #40
Quote
Quote
I have the same problem as Zurman - this time with the Kraftwork sample

The .wavs are fine, but when played in ABCHR, there's a click at the start of the original, but not in all the coded versions (2 out of 6 have clicks).

Maybe ABCHR is reading some data in the wav file as audio?

This is a very serious problem.

I'm using WinXP, audiophile 2496, just installed Java.


Also, are the files being ReplayGained in ABCHR? If so, why?! If not, why is playback so quiet?

The java library lowers the volume, for some unknown reason.  Hopefully this is not too serious of a problem, since this occurs for all samples equally.

That's correct, but there's another reason. One of the encoded samples is considerably quieter than all the others (about 3dB), so all the samples are brought down to this lower level, because of gain correction.

This volume issue in the Java Sound implementation is a terrible blunder, indeed, but Sun don't care much about sound playback, so chances are it's not going to be corrected anytime soon.
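The gain correction schnofler describes - bringing every sample down to the level of the quietest one - can be sketched as follows. This is only an illustration of the idea (samples assumed to be floats in [-1, 1]), not the actual Java code used by the tool:

```python
import math

def peak_db(samples):
    """Peak level of a block of samples (floats in [-1, 1]) in dBFS."""
    peak = max(abs(s) for s in samples)
    return 20 * math.log10(peak) if peak > 0 else float("-inf")

def matching_gains(tracks):
    """Linear gain per track so every track's peak matches the quietest."""
    peaks = [max(abs(s) for s in t) for t in tracks]
    target = min(peaks)
    return [target / p for p in peaks]
```

A roughly 3 dB difference corresponds to a peak ratio of about 0.7, which is why every other sample ends up noticeably attenuated once they are all matched downward.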

About the clicking noises and the session saving bug: I'm really sorry about this, and I will try looking into it, but unfortunately I'm also insanely busy at the moment, so I can't promise anything.

Multiformat@128 public listening test - CANCELLED

Reply #41
First of all, I want to thank rjamorim for arranging this test. It's always interesting to see the results.

This time I decided to have a go myself, but there are a few things that put me off (some have been mentioned already):
1) Of the samples I have tried so far, I can positively ABX them as many times as I want to, even with the TV on, 1 meter behind my head. The volume difference is just way too obvious.
2) The click I hear when I change between samples distracts me quite a bit. Playing around with the buffer setting didn't seem to make a difference.
3) When looping a selected range and changing between samples, it doesn't always start over when it has reached the end. When this occurs, it plays further by an amount equal to the selected range. This seems to happen if you change sample right at the end of the selected range.

I hope some of these are related to the fact that this is my first attempt to participate. If so, please enlighten me.

Two features I would like to see:
A) When looping a small part, it would be nice if there was an option to start from the beginning every time you switch samples, rather than continuing on as it does now.
B) Highlighting of the play button of the playing sample, even when using keyboard shortcuts.

I guess that's enough whining for now 

Edit: Spelling

Multiformat@128 public listening test - CANCELLED

Reply #42
OK, this test seems to be a big nasty failure, first because I fucked up and downloaded the wrong version, and second because Atrac is being gay (anyone amazed?) and bringing the volume down.

People agree I should call it off?

Multiformat@128 public listening test - CANCELLED

Reply #43
Quote
People agree I should call it off?

Yeah, and a big thanks from me too, for spending the time you do setting up these tests.
daefeatures.co.uk

Multiformat@128 public listening test - CANCELLED

Reply #44
Quote
I think the answer to this would have to be no.
Discussion would bias the results.

ff123


Can you explain why? Biased compared to what? A naive population?

My main question would be:

What is the intention of the author of this test? What is his research question?

1) Find out non-random semi-naive population's ability to distinguish and rate differences between codec encodings?

2) Find the differences of the encodings (not the population) as well as practically possible with the help of HA forum members, in order to rank the encoded results under scrutiny as accurately as possible?

If it's 1) then we should not discuss (unless they are obvious click/length issues as discussed).

However, if it is 2), then by all means we should discuss it, and in the future we should have even more accurate results as the tester population slowly advances from naive to expert.

If I have misunderstood, please feel free to correct me. I mean no harm, I'm only trying to help.

regards,
  halcyon

Multiformat@128 public listening test - CANCELLED

Reply #45
Well, why not just WaveGain the ATRAC sample... I doubt it will introduce any audible artifacts. Hell, why not just WaveGain them all - then at least you'll be playing on even ground, and won't have to worry about varying volume levels.
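WaveGain itself analyzes ReplayGain loudness, but the core operation it performs - scaling PCM samples by a fixed gain and rewriting the file - looks like the sketch below for 16-bit WAVs (an illustration of the idea, not WaveGain's actual code):

```python
import struct
import wave

def apply_gain(in_path, out_path, gain):
    """Apply a linear gain to a 16-bit PCM WAV file, clipping at full scale."""
    with wave.open(in_path, "rb") as w:
        params = w.getparams()
        assert params.sampwidth == 2, "sketch handles 16-bit PCM only"
        raw = w.readframes(params.nframes)
    samples = struct.unpack(f"<{len(raw) // 2}h", raw)
    # Scale each sample, rounding and clipping to the 16-bit range.
    scaled = [max(-32768, min(32767, round(s * gain))) for s in samples]
    with wave.open(out_path, "wb") as w:
        w.setparams(params)
        w.writeframes(struct.pack(f"<{len(scaled)}h", *scaled))
```

The clipping clamp is the catch: boosting a quiet sample upward can clip peaks, which is one argument for matching everything downward instead.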

In response to the above, discussing test results causes listener bias in people who have yet to actually do the listening, since they will expect certain results (much like someone listening to DVD-A expects it to sound better than CD, so they think it does).

Either discussion of results should be saved until after the test is finished, or persons who have not yet completed the tests should be barred from reading the comments.

Multiformat@128 public listening test - CANCELLED

Reply #46
Quote
OK, this test seems to be a big nasty failure, first because I fucked up and downloaded the wrong version, and second because Atrac is being gay (anyone amazed?) and bringing the volume down.

People agree I should call it off?

Maybe that's best. But you should do it quickly, before some nasty guy announces it on /.

The problem with the java abchr should be investigated too.

edit: It's better to cancel the test than to end up with senseless results.

Multiformat@128 public listening test - CANCELLED

Reply #47
Quote
People agree I should call it off?

I agree. Reencode ogg, possibly increase volume on the atrac sample, and start again.

(Sorry Harashin and everyone else who got started quickly)

Multiformat@128 public listening test - CANCELLED

Reply #48
Quote
People agree I should call it off?

OMG. This is bad. Really bad. But IMHO yes you should.

OR another possibility would be to change the packages now: exclude ATRAC and include the correct Vorbis. Also provide the correct Vorbis encodes separately and make new setup files. Maybe that's a lot of work though, and people would have to cooperate too. 

Too bad some had already done some serious amount of listening.

regards
echo

Multiformat@128 public listening test - CANCELLED

Reply #49
Darn.  I even installed Java last night just so I could try ABXing for the first time.  But the others are right.  It appears the test went off the rails from the very beginning, with little to no room for recovery.  Some things just aren't meant to be.

Regardless, thanks Roberto for running these tests and helping out the community.

 