Topic: History and Accreditation of ABX Testing/Invention? (Read 20233 times)


Reply #50
Perhaps the lack of killer samples for jitter reflects the lack of HDMI jitter being audible in practice for human listeners.  (I seem to recall Ethan Winer suggesting something to the effect that jitter should not normally be noticeable with modern equipment.)


AFAIK pretty much any artifact can be made pathological enough to be audible.

But again, would that reflect real-world situations, and would those be common cases or corner cases?

Audiophiles of the Stereophile/TAS variety, and their industry shills, are poised to treat such differences as if they 'matter' intensely, yet, curiously, things that make quite a large difference to people in normal  music listening --  the production and mastering quality of the deliverables, and the effect of the playback environment (loudspeakers and rooms)  -- tend not to be their focus.


Audiophiles and their industry shills would rather natter on about the dangers of  HDMI jitter, apparently.


Reply #51
If there is no source control for segment selection and looping, I don't think adding user control gets us to where we want to be.


It is a matter of the right tool for the job at hand, and the job at hand is not always the same.

If the listeners need to be controlled, then it makes sense to take control away from them.

If the listeners are themselves investigators, then giving more control to them makes sense.

Quote
Yes, it is better than timed switching.  But one can't compare the past to the future as was done with the ABX comparator itself.


Amir, it was you yourself who tried to conflate the 1950 ABX with the 1982 ABX until I documented your trickery. Trying to make all such comparisons seem invalid is just more trickery.

Quote
Quote
Much of an ABX test, however, comes in the material selection, listener training, listener comfort, and presence of listener feedback.

A failing, unfortunately, of just about every test cited on audio forums as proof of this and that.  The outcomes may still be right, but the protocol certainly is not.


All empty rhetoric coming from someone who sells audio gear based on sighted evaluations with no controls at all.  For examples of such behavior, see the links in the original post.


Reply #52
The name ABX was among other things a play on my initials. "ABK's ABX test". ;-)


Sort of like THX has been said by some to stand for Tomlinson Holman's eXperiment, no?  [Some sources say the X is for "crossover", not eXperiment, and there seems to be some debate whether the original movie title THX 1138 was just Lucas's telephone number while in college or was selected for aesthetic qualities.]


Reply #53
I have never heard an audio file that incorporates an intentionally large amount of jitter, introduced for demonstration purposes, to hear what really excessive jitter actually sounds like.  (Perhaps someone here at HA might be kind enough to provide a link to such a file.)

I now see that ABK in June this year furnished files for this forum demonstrating a range of levels of jitter severity: Jitter Listening test files.


Reply #54
I have never heard an audio file that incorporates an intentionally large amount of jitter, introduced for demonstration purposes, to hear what really excessive jitter actually sounds like.  (Perhaps someone here at HA might be kind enough to provide a link to such a file.)

I now see that ABK in June this year furnished files for this forum demonstrating a range of levels of jitter severity: Jitter Listening test files.

Arny posted those on AVS Forum a few months ago.  Here you can see me passing them: http://www.avsforum.com/forum/91-audio-the...ml#post25081490

[quote author=Amir on AVS link=msg=0 date=]foo_abx 1.3.4 report
foobar2000 v1.3.2
2014/06/18 16:39:16

File A: C:\Users\Amir\Music\Arny's 30 Hz Jitter File\Arny's new files\no jitter.wav
File B: C:\Users\Amir\Music\Arny's 30 Hz Jitter File\Arny's new files\30 Hz Severe Jitter 0.05.flac

16:39:16 : Test started.
16:39:52 : 01/01 50.0%
16:40:26 : 02/02 25.0%
16:40:38 : 03/03 12.5%
16:40:48 : 04/04 6.3%
16:40:58 : 05/05 3.1%
16:41:07 : 06/06 1.6%
16:41:24 : 07/07 0.8%
16:41:32 : 08/08 0.4%
16:41:51 : 09/09 0.2%
16:42:04 : 10/10 0.1%
16:42:12 : 11/11 0.0%
16:42:21 : 12/12 0.0%
16:42:43 : Test finished.

----------
Total: 12/12 (0.0%)

=============
foo_abx 1.3.4 report
foobar2000 v1.3.2
2014/06/18 16:33:19

File A: C:\Users\Amir\Music\Arny's 30 Hz Jitter File\Arny's new files\30 Hz max jitter 0.1.flac
File B: C:\Users\Amir\Music\Arny's 30 Hz Jitter File\Arny's new files\no jitter.wav

16:33:19 : Test started.
16:34:25 : 01/01 50.0%
16:34:38 : 02/02 25.0%
16:34:58 : 03/03 12.5%
16:35:16 : 04/04 6.3%
16:35:26 : 05/05 3.1%
16:35:41 : 06/06 1.6%
16:35:54 : 07/07 0.8%
16:36:34 : 08/08 0.4%
16:36:44 : 09/09 0.2%
16:36:54 : 10/10 0.1%
16:37:03 : 11/11 0.0%
16:37:13 : 12/12 0.0%
16:38:05 : Test finished.

----------
Total: 12/12 (0.0%)

============

foo_abx 1.3.4 report
foobar2000 v1.3.2
2014/06/18 16:44:39

File A: C:\Users\Amir\Music\Arny's 30 Hz Jitter File\Arny's new files\no jitter.wav
File B: C:\Users\Amir\Music\Arny's 30 Hz Jitter File\Arny's new files\30 Hz jitter strong level .025.flac

16:44:39 : Test started.
16:45:05 : 01/01 50.0%
16:45:15 : 02/02 25.0%
16:45:28 : 03/03 12.5%
16:45:36 : 04/04 6.3%
16:45:54 : 05/05 3.1%
16:46:17 : 06/06 1.6%
16:46:29 : 07/07 0.8%
16:46:45 : 08/08 0.4%
16:46:55 : 09/09 0.2%
16:47:05 : 10/10 0.1%
16:47:19 : 11/11 0.0%
16:47:33 : 12/12 0.0%
16:47:36 : Test finished.

----------
Total: 12/12 (0.0%)

=====

foo_abx 1.3.4 report
foobar2000 v1.3.2
2014/06/18 19:04:40

File A: C:\Users\Amir\Music\Arny's 30 Hz Jitter File\Arny's new files\no jitter.wav
File B: C:\Users\Amir\Music\Arny's 30 Hz Jitter File\Arny's new files\30 Hz noticable jitter 0.0125.flac

19:04:40 : Test started.
19:05:27 : 01/01 50.0%
19:05:54 : 02/02 25.0%
19:06:19 : 03/03 12.5%
19:06:35 : 04/04 6.3%
19:06:57 : 05/05 3.1%
19:07:16 : 06/06 1.6%
19:07:43 : 07/07 0.8%
19:08:15 : 08/08 0.4%
19:08:37 : 09/09 0.2%
19:09:05 : 10/10 0.1%
19:09:30 : 11/11 0.0%
19:10:05 : 12/12 0.0%
19:10:09 : Test finished.

----------
Total: 12/12 (0.0%)
[/quote]
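For context, the trailing percentage on each line of these foo_abx logs is a one-sided binomial p-value: the chance of getting at least that many trials right by coin-flip guessing. A minimal sketch of the arithmetic (my own illustration, not foo_abx's actual code):

```python
from math import comb

def abx_p_value(correct: int, total: int) -> float:
    """Chance of scoring at least `correct` out of `total` ABX trials
    by guessing alone (one-sided binomial test, p = 0.5 per trial)."""
    return sum(comb(total, k) for k in range(correct, total + 1)) / 2 ** total

# 12/12 as in the logs above: 1 in 4096, which foo_abx rounds to 0.0%
print(f"{abx_p_value(12, 12):.4%}")
```

An 08/08 line showing 0.4%, for instance, corresponds to 1/256, i.e. about 0.39%.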

Only one other person attempted to run them, and he only passed the easiest one, although my memory is not very clear.  I could not get Arny to run and report his results, or for that matter anyone else.

That point aside, this is not a useful test of "jitter."  Arny has picked a very low frequency of 30 Hz for the jitter, which means masking is quite strong.  Sources of jitter can and do have wide-ranging spectra not represented by this one example.

It would be good to see if anyone here will attempt to run Arny's test and report on it.  I noticed no results reported in the thread he posted here either.

Anyone thinking there is no such thing as critical listeners should give the above a listen and report back.  I think they will find that they are mistaken in that regard.
Amir
Retired Technology Insider
Founder, AudioScienceReview.com


Reply #55
That point aside, this is not a useful test of "jitter."  Arny has picked a very low frequency of 30 Hz for the jitter, which means masking is quite strong.


I didn't pick 30 Hz out of the air; it was a jitter frequency that frequently showed up in tests of contemporary HDMI-based A/V gear. Its origin is that HDMI audio packets are interleaved with HDMI video, and the interleaving technique is based on the video frame rate. Video frame rates in the 24-30 Hz range are very common, so jitter in this frequency range is seen quite frequently in real-world A/V gear.

Making an issue out of masking shows a lack of understanding of the audibility of FM distortion. Masking is not the only phenomenon that affects audibility.  It is a complex area, but Zwicker and Fastl's iconic book Psychoacoustics: Facts and Models has quite a bit to say about this, which I have tried to follow.
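For anyone who wants to hear the effect being argued about: sinusoidal sampling jitter is equivalent to reading the signal at perturbed instants t + A*sin(2*pi*fj*t), i.e. phase/frequency modulation of the program material. A rough sketch of how such a demo file could be synthesized (illustrative parameters and plain linear interpolation; this is not how Arny's files were generated, and a serious generator would use band-limited resampling):

```python
import math

def apply_sinusoidal_jitter(samples, fs, jitter_freq=30.0, jitter_amp_s=1e-6):
    """Resample `samples` at instants t + A*sin(2*pi*fj*t) to mimic
    sinusoidal clock jitter. A peak timing error of 1 microsecond is
    grossly exaggerated next to real gear, which is the point of a demo."""
    n = len(samples)
    out = []
    for i in range(n):
        t = i / fs
        pos = (t + jitter_amp_s * math.sin(2 * math.pi * jitter_freq * t)) * fs
        k = min(max(int(pos), 0), n - 2)   # sample index just before pos
        frac = pos - k
        out.append(samples[k] + frac * (samples[k + 1] - samples[k]))  # lerp
    return out

fs = 44100
tone = [math.sin(2 * math.pi * 3000 * i / fs) for i in range(fs)]  # 3 kHz, 1 s
jittered = apply_sinusoidal_jitter(tone, fs)
```

On a 3 kHz tone, 30 Hz jitter puts FM sidebands only 30 Hz to either side of a strong carrier, which is exactly where the masking question enters.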

Quote
Sources of jitter can and do have wide ranging spectrum not represented by this one example.


I tried to negotiate another test with Amir based on other jitter frequencies, but he demanded that I precisely duplicate some very complicated jitter spectra, which IME would take a lot of work and prove nothing new.

I agree that the 30 Hz tests are not definitive, and I would like to build some tests involving much higher but still real-world-relevant frequencies.

Quote
Would be good to see if anyone here will attempt to run Arny's test and report on it.  I noticed no results reported in the thread he post here either.

Anyone thinking there is no such thing as critical listeners, should give the above a listen and report back.  I think they will find that they are mistaken in that regard .


Straw man argument. Of course there are people who, for whatever reason, including hearing defects and listener training, have thresholds of audibility that differ from others'. It may not be well known that certain hearing defects may make some listeners more sensitive to certain artifacts than others with more normal hearing.

The tests I provided, like many others that I've provided from time to time, include extensive files for incremental listener training. Positive results will be obtained, but the thresholds that emerge may not please everybody who thinks that jitter is a serious problem with modern gear.


Reply #56
It would be good to see if anyone here will attempt to run Arny's test and report on it.  I noticed no results reported in the thread he posted here either.

I have added two ABX reports, and some comments, at post #7 of Arny's jitter test files thread.


Reply #57
I have never heard an audio file that incorporates an intentionally large amount of jitter, introduced for demonstration purposes, to hear what really excessive jitter actually sounds like.  (Perhaps someone here at HA might be kind enough to provide a link to such a file.)

I now see that ABK in June this year furnished files for this forum demonstrating a range of levels of jitter severity: Jitter Listening test files.

Arny posted those on AVS Forum a few months ago.  Here you can see me passing them: http://www.avsforum.com/forum/91-audio-the...ml#post25081490

[quote author=Amir on AVS link=msg=0 date=][The four foo_abx 1.3.4 (foobar2000 v1.3.2) logs quoted in full in Reply #54 above: no jitter.wav against the 30 Hz jitter FLACs at levels 0.05, 0.1, .025, and 0.0125, each passed 12/12 (0.0%).]
Only one other person attempted to run them, and he only passed the easiest one, although my memory is not very clear.  I could not get Arny to run and report his results, or for that matter anyone else.

That point aside, this is not a useful test of "jitter."  Arny has picked a very low frequency of 30 Hz for the jitter, which means masking is quite strong.  Sources of jitter can and do have wide-ranging spectra not represented by this one example.

It would be good to see if anyone here will attempt to run Arny's test and report on it.  I noticed no results reported in the thread he posted here either.

Anyone thinking there is no such thing as critical listeners should give the above a listen and report back.  I think they will find that they are mistaken in that regard.
[/quote]
Why is the no jitter file in your tests a .wav while the others are .flac? In Arny's download everything is FLAC. Maybe no jitter.wav was by accident a file with silence? Can you repeat this with the new ABX plugin?
Is troll-adiposity coming from feederism?
With 24bit music you can listen to silence much louder!


Reply #58
I find foobar ABX plug-in woefully inadequate for finding small differences.

Here you can see me passing them.
[quote author=Amir on AVS link=msg=0 date=]foo_abx 1.3.4 report
foobar2000 v1.3.2
[/quote]
Well, which is it?
It seems ABX works just fine for showing audible differences, when on the agenda.
Cognitive load doesn't seem too high with you passing all these completely unsupervised ABX online tests, Amir.

Or did you use alternate methods of finding these Windows files' differences?


Let's make up our minds whether ABX is a valid tool for discerning audible differences, not just when you have something to pitch, like Hi-Re$, DACs, etc.

cheers,

AJ
Loudspeaker manufacturer


Reply #59
Why is the no jitter file in your tests a .wav while the others are .flac? In Arny's download everything is FLAC.

I decoded the files to instrument them, so when playing I just grabbed them by name, not extension.

Quote
Maybe no jitter.wav was by accident a file with silence?

Ha?  You mean that when I heard music while playing A and B, I was imagining it in one of the cases?  And you say that based on my using .wav vs. .flac?

Quote
Can you repeat this with the new abx plugin?

Won't do.  But you can try and see if you can pass them, as MLXXX just did.
Amir
Retired Technology Insider
Founder, AudioScienceReview.com


Reply #60
I find foobar ABX plug-in woefully inadequate for finding small differences.

Here you can see me passing them . 
[quote author=Amir on AVS link=msg=0 date=]foo_abx 1.3.4 report
foobar2000 v1.3.2

Well, which is it?
It seems ABX works just fine for showing audible differences, when on the agenda. [/quote]
The discussion was not around "ABX" but the specific instantiation of it in the foobar plug-in, as you quoted in my post.

Read my more detailed explanation: http://www.avsforum.com/forum/91-audio-the...ml#post25131921
[quote author=Amir on AVS link=msg=0 date=][quote author=Arny on AVS link=msg=0 date=]
FOOBAR2000 has its Set Start and Set End feature to allow people to exploit this sort of thing. That's the standard implementation of this kind of feature and is IME about as good as it gets.[/quote]
As good as it gets? I think it is one of the worst implementations. So that everyone can follow us, here is the sample picture of the UI:

First problem: when you move the slider, it does not update the time code. You have to hit the Start or End buttons to see it. This means a bunch of hunting with the slider to try to get back to the same segment.

The slider cannot be stretched, which means it is very difficult to get it down to a tenth of a second, which is what you need when you want to isolate, say, a single guitar pick.

When you hit the "set start" button, nothing happens. It actually remembers that point, but it shows no visual notification. It is only when you hit the end button that it shows where the start was. I want to see that marker so that, for example, I can give myself one second after it. Without seeing where the start position is, I wind up with painful trial and error.

Then there is the lack of looping. It plays from start to end of the segment but won't repeat. You want it to repeat so that you can switch inputs and hear their effects.

The other problem is switching tracks. There is no soft dissolve, so you get a click as you switch between the input tracks.

Compounding all of this, every time you start the tool it forgets its settings, so all the pain of adjusting the slider repeats.

And then there is the undiscoverable way of saving the results. You have to hit Exit, and only then does it give you a chance to save them. The file name, however, is blank; it should instead start with the file names. The ABX box often hides the tracks, and when you are in the save dialog box, you cannot move it to see your track names.

All of this makes it a lot harder to set up the right test. I gave up and just used the first segment I found. It was just too much work to try to locate others.

So if one wants less revealing results, you can say this is a good tool. Otherwise, it leaves a lot to be desired versus tools written by people who do this for a living.
[/quote]
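The "soft dissolve" complaint above refers to a short equal-power crossfade between the two time-aligned tracks at the switch point, which removes the click a hard cut produces. A hypothetical sketch (the function name and the 10 ms fade length are my own choices, not taken from any tool discussed here):

```python
import math

def crossfade_switch(track_a, track_b, switch_idx, fade_len=441):
    """Splice from track_a to track_b at switch_idx using an equal-power
    crossfade (441 samples = 10 ms at 44.1 kHz) instead of a hard cut.
    Both tracks must be sample-aligned and long enough for the fade."""
    out = list(track_a[:switch_idx])
    for i in range(fade_len):
        theta = (i / fade_len) * (math.pi / 2)
        # cos/sin gains keep total power roughly constant through the fade
        out.append(track_a[switch_idx + i] * math.cos(theta)
                   + track_b[switch_idx + i] * math.sin(theta))
    out.extend(track_b[switch_idx + fade_len:])
    return out

# A hard cut from +1.0 to -1.0 would click; the crossfade ramps smoothly.
a, b = [1.0] * 2000, [-1.0] * 2000
spliced = crossfade_switch(a, b, switch_idx=1000, fade_len=100)
```

A comparator would apply the same ramp on every A/B switch and at loop boundaries.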

An example of a better-done program: http://www.avsforum.com/forum/91-audio-the...ml#post25144066

[quote author=Amir on AVS link=msg=0 date=][quote author=Arny on AVS link=msg=0 date=]If FOOBAR2000 is the worst ABX Comparator, then you should be able to quite easily provide us with a link to a similarly-priced but far better product.[/quote]
Good morning Arny. I see you were up late posting these messages. Hopefully you didn't do it on my account .

Anyway, back to your post, you said and I quote: "FOOBAR2000 has its Set Start and Set End feature to allow people to exploit this sort of thing. That's the standard implementation of this kind of feature and is IME about as good as it gets. "

I responded by explaining how those features are poorly implemented and gave examples of why. My reference is proper tools designed in-house for this kind of work, plus years spent using them and seeing the benefits of the features I talked about. And it was not just our tools at Microsoft that were this way; the same was true of the tools at other companies, like Dolby, that I had used in independent shootouts. That is my reference, not some other free tool you are asking me to produce.

All the failings of Foobar ABX tilt the game in favor of finding no difference. So I get why we defend it and call it "as good as it gets." Someone who has not used other tools with superior abilities would accept that and then wonder why they are not hearing the differences that are really there, but are hard to isolate with Foobar ABX.

Now that you know the right feature set, you can do your own search and find the right tool. Look at their UIs and see if they have the features I was talking about. Here is a good example of a free tool that covers a lot of the bases that Foobar ABX does not: http://lacinato.com/cm/software/othersoft/abx.



Right away you see the Loop button, as highlighted in the above sample screenshot. If you use the tool, you will see that the sliders work without having to play anything. Better yet, there are two of them, so you can position them close to each other. In Foobar, the slider handle is thick and there is only one of it. This means that if you place the start and stop, say, one second apart, you can't even tell that you have done so: the handle covers that short start-stop blue line.

Notice the ability to set gains so as to equalize volumes if necessary.

The "offset" comes in handy for your track specifically. You have that deafening tone at the beginning, which you need to skip over, and this lets you do that.

Is it perfect? No. It uses the Java runtime, and while it says it doesn't need a separate download, it asked me to do exactly that. The link at Oracle downloads version 1.7 and the tool wants 1.8. Good luck finding that version on the Oracle site. But you can find it, and once you do, the program runs without any installation. Simply double-click the .exe and it runs.

As much as I despise Java, in this case it does enable cross-platform ability, from Linux to Mac.

Another miss is not being able to enter the timecode directly instead of using the sliders.

And the best part is that I don't have to look at the silly mask of an alien in my task bar from Foobar. 

Anyway, go ahead and use it, and then let us know if Foobar ABX is still as good as it gets, Arny.[/quote]

That aside, a craftsman can build a nice cabinet using a dull chisel.  It just annoys him more and takes longer.  But a layperson may give up altogether when given the same tool.  That is the problem here.  To the extent that we recruit laypeople from every forum to run these tests and generate data that we rely on, we must do everything in our power to make their job easier and remove the frustration factor, especially at the start.  Most people give up running the test at all after the first comparison.  A poor user interface heavily encourages that outcome, showing once again that we have loaded dice in the implementation of ABX as utilized on forums.

I am more persistent and work harder at getting around tool limitations.  So don't use my results to judge the correctness of the tool.
Amir
Retired Technology Insider
Founder, AudioScienceReview.com


Reply #61

Well, which is it?
It seems ABX works just fine for showing audible differences, when on the agenda.

The discussion was not around "ABX" but the specific instantiation of it in foobar plug-in as you quoted in my post. 


Your usual crap/obfuscation. The thread title is History and Accreditation of ABX Testing/Invention?, and your original post makes no mention of Foobar. The discussion is the history and accreditation of ABX testing/invention, not your smokescreens.

I find foobar ABX plug-in woefully inadequate for finding small differences.

Here you can see me passing them . 
[quote author=Amir on AVS link=msg=0 date=]foo_abx 1.3.4 report
foobar2000 v1.3.2
[/quote]
Again, is ABX a valid method for ascertaining audible differences, or are your ABX logs worthless due to "cognitive load" (of BS), alternate methods used, etc.?

cheers,

AJ
Loudspeaker manufacturer


Reply #62
There are a few tools to choose from. As an experienced user, I find all of them more than adequate, except that in the past some of the Java-based tools glitched horribly on my system. They're OK now.

I know that I can't even imagine what a naive user would make of these tools, so I'm not even going to try.


I don't buy this "it's so difficult - how can anyone be expected to manage this" argument. I assume the context is that we're taking people who think sighted testing is OK and trying to get them to blind test? So they're coming from sighted tests where there's no easy way to listen to the same segment twice, no easy way to synchronise sources, sometimes no easy way to switch quickly at all, etc. Then you give them a double-blind test, solve all these problems, and suddenly it's too hard to hear a difference? Come on. We both know why it's too hard to hear the difference: no one is telling them what to expect any more. Boo hoo.

Cheers,
David.


Reply #63
I don't buy this "it's so difficult - how can anyone be expected to manage this" argument. I assume the context is that we're taking people who think sighted testing is OK and trying to get them to blind test? So they're coming from sighted tests where there's no easy way to listen to the same segment twice, no easy way to synchronise sources, sometimes no easy way to switch quickly at all, etc. Then you give them a double-blind test, solve all these problems, and suddenly it's too hard to hear a difference? Come on. We both know why it's too hard to hear the difference: no one is telling them what to expect any more. Boo hoo.

Well, you need to "buy it," because I have concrete data to back what I said, contrary to your statement.

When we had this very discussion and these tests on AVS Forum, Mark Henninger, who is an AVS writer and influential member, constantly ridiculed the idea of anyone being able to pass these tests.  He proceeded to post his results repeatedly, showing random outcomes:

[quote author=imagic on AVS link=msg=0 date=]In these tests, I tried to increase the number of iterations in order to decrease the margin of error.

PC -> Optical S/PDIF -> Pioneer Elite SC-55 -> Sony MDR-1R

foo_abx 1.3.4 report
foobar2000 v1.3.2
2014/07/15 08:40:55

File A: E:\AVS\Foobar ABX\Jangling Keys\keys jangling band resolution limited 4416 2496.wav
File B: E:\AVS\Foobar ABX\Jangling Keys\keys jangling full band 2496.wav

08:40:55 : Test started.
08:41:50 : 01/01 50.0%
08:42:08 : 01/02 75.0%
[...]
08:47:00 : 13/29 77.1%
08:47:07 : 13/30 81.9%
08:47:13 : Test finished.

----------
Total: 13/30 (81.9%)


and...

foo_abx 1.3.4 report
foobar2000 v1.3.2
2014/07/15 09:14:40

File A: E:\AVS\Foobar ABX\Jangling Keys\keys jangling band resolution limited 3216 2496.wav
File B: E:\AVS\Foobar ABX\Jangling Keys\keys jangling full band 2496.wav

09:14:40 : Test started.
09:15:33 : 00/01 100.0%
09:15:45 : 00/02 100.0%
09:16:20 : 01/03 87.5%
[...]
09:22:30 : 11/29 93.2%
09:22:45 : 11/30 95.1%
09:22:48 : Test finished.

----------
Total: 11/30 (95.1%)

Mark Henninger[/quote]
After a bunch of back and forth, and after seeing my results and technique for passing such tests, he posted this remarkable outcome: http://www.avsforum.com/forum/91-audio-the...ml#post25871786

[quote author=imagic on AVS link=msg=0 date=]Laptop? Practice?
Well, I decided to give my laptop a try since Amir did so well using his. Lo and behold, I had little difficulty with the 16/32 key jangling test. Not quite perfect, but I suspect a bit more practice would get me up to perfect.

My laptop is a Sony Vaio PCG-41412L with the HD upgraded to a SSD. All audio enhancements are off. I used a pair of Sony MDR-1R headphones.

The results speak for themselves; I found a critical segment that revealed an audible difference. I've had some practice, which helped—just as Amir suggested. Now, I can pass an ABX test I previously failed. I'll tackle the 16/44 test next. Oh, and it was a piece of cake to pick out the differences in the 16/16 and 22/16 tests.

foo_abx 1.3.4 report
foobar2000 v1.3.3
2014/07/19 11:26:49

File A: C:\Users\mark_000\Downloads\keys jangling band resolution limited 3216 2496.wav
File B: C:\Users\mark_000\Downloads\keys jangling full band 2496.wav

11:26:49 : Test started.
11:27:29 : 00/01 100.0%
11:28:58 : 00/02 100.0%
11:29:46 : 00/03 100.0%
11:29:59 : 01/04 93.8%
11:30:06 : 01/05 96.9%
11:30:16 : 02/06 89.1%
11:30:26 : 03/07 77.3%
11:30:34 : 04/08 63.7%
11:30:45 : 05/09 50.0%
11:31:00 : 06/10 37.7%
11:31:10 : 07/11 27.4%
11:31:29 : 08/12 19.4%
11:31:41 : 09/13 13.3%
11:32:05 : 10/14 9.0%
11:32:20 : 10/15 15.1%
11:32:30 : 11/16 10.5%
11:32:41 : 12/17 7.2%
11:32:52 : 13/18 4.8%
11:33:07 : 13/19 8.4%
11:33:16 : 14/20 5.8%
11:33:28 : 15/21 3.9%
11:33:40 : 16/22 2.6%
11:33:58 : 17/23 1.7%
11:34:12 : 18/24 1.1%
11:34:25 : Test finished.

----------
Total: 18/24 (1.1%)

Mark Henninger
[/quote]

He went from being a casual listener with extreme bias as to the outcome of the test to a more critical listener and neutral experimenter.  That was all it took for the outcome to reverse.

So no, we are not discussing the people who do sighted tests.  They couldn't care less about double blind tests of any kind.  We are discussing people who shout from the mountaintop about double blind tests yet:

1. Participate in hardly any of them.

2. Defend poor implementations of tools.

3. Lack the professional experience to know and understand how users behave in such tests.

4. Lack the professional experience to know how to create tests that show small differences.

5. Routinely violate best practices in the industry, as JJ mentioned and as I have said until I am blue in the face.

The problem is "us."  We say we believe in double blind testing yet demonstrate little interest in properly conducting such tests.  Our excuse?  Oh, the other guy uses sighted tests.  What does that have to do with what we do?  Let's set the best example here.  Let's make the tools and experiments as good as we can make them.  Let's all participate and run these tests instead of accusing each other of not knowing how to run a test like this by comparing a track to silence.

If we are not willing to reform ourselves, then let's not remotely go to the place where we say we want to reform others.
Amir
Retired Technology Insider
Founder, AudioScienceReview.com


Reply #64
You're better qualified to code the ultimate blind testing tool than I am.

It sounds like you're more motivated too.

Cheers,
David.



Reply #66
What does any of this have to do with the topic at hand?

Why should any of you who continually engage in off-topic ankle biting and parallel posting, despite continued requests to remain respectful and on-topic, be granted the privilege of posting on this forum?
Is 24-bit/192kHz good enough for your lo-fi vinyl, or do you need 32/384?


Reply #67
Do you know what the group effort was composed of?

Not exactly, that’s why I asked. But uh, see below.

It really hasn't been discussed, but for example it included, but was far from being limited to, writing Dave Clark's AES convention paper and the sequel JAES article. Where did I take credit for doing any of that?

In the author list, of course! Are you one of the authors? No? Why not, if you contributed so much?

I know that you are overwhelmed by envy, have never done anything of consequence in your profession, and have already tried and convicted me in your mind of being a thief of other people's intellectual property.

Your wild shot in the dark is too far off the mark to even comment on. But the fact that I asked "what credit do you give to others?" and "can you tell me how to find a source for the exact history", and that your response is all you, you, you, shows clearly how you blow your own horn.


Reply #68
Do you know what the group effort was composed of?

Not exactly, that’s why I asked. But uh, see below.


I'm used to people who make far-reaching pronouncements and personal attacks based on total ignorance.

Quote
It really hasn't been discussed, but for example it included but was far from being limited to writing Dave Clark's AES convention paper and sequel JAES article. Where did I take credit for doing any of that?

In the author list, of course! Are you one of the authors? No? Why not, if you contributed so much?


Just another example of your ignorance. I am mentioned in both articles.

There was a hidden agenda to the ABX effort, which related to who was fully employed and who was not.

Quote
I know that you are overwhelmed by envy, have never done anything of consequence in your profession, and have already tried and convicted me in your mind of being a thief of other people's intellectual property.
Your wild shot in the dark is too far off mark to even comment. But the fact that I asked “what credit do you give to others?” and “can you tell me how to find a source for the exact history”, and your response is all you, you, you, shows clearly how you blow your own horn.


I think I found you on the web, and I was then correct - you have nothing that compares with the entire ABX project in terms of scope and impact on the world of audio.  You can only wish!

The entire project went miles beyond the basic test and comparator, and those who did it who want to take bows have done so.

Clark's part appears in publications under his name, Nousaine's part appears in publications under his name, but neither of them devised or built the part that I take credit for. 

Furthermore, it might be argued that there may have been some appropriation of other people's work but not by me. It was a sharing, a gift to those who really needed it. 

What really happened apparently went right over your head.

I take credit for what I did by myself, which was design and build both the experiment and the first machine that implemented it.  The two are IMO inseparable and came together at the same time.

The entire ABX Project had many participants and extended over several decades, but by then the experimental design was fully settled and every machine that implemented it was like Foobar2000/ABX - yet another re-implementation of the original ABX Comparator concept that I made happen by myself, only assisted by the people who helped make it fail until it was ready for prime time. There was no "Why don't you try this?" As soon as we had failed I knew exactly what to do next and it worked as far as it went, until the next innovation was needed and I supplied that, too.
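For readers who haven't used Foobar2000/ABX or a hardware comparator, the core of the interactive ABX procedure described above can be sketched in a few lines. This is an illustrative simulation only, not any actual comparator's code; the trial count and the simulated "pure guesser" listener are assumptions for demonstration.

```python
import random

def run_abx_session(trials=16):
    """Minimal sketch of an interactive ABX session: on each trial the
    hidden stimulus X is randomly assigned to A or B, the listener
    auditions freely and then commits to an answer, and correct
    identifications are tallied.  Here the 'listener' just guesses."""
    correct = 0
    for _ in range(trials):
        x_is_a = random.choice([True, False])   # hidden assignment of X
        answer_a = random.choice([True, False]) # listener's committed answer
        if answer_a == x_is_a:
            correct += 1
    return correct, trials

correct, trials = run_abx_session()
print(f"{correct}/{trials} correct")
```

A real comparator differs only in that the listener hears actual audio and switches A/B/X at will before answering; the bookkeeping is exactly this.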

History and Accreditation of ABX Testing/Invention?

Reply #69
So Arnold, you are the inventor of ABX?

History and Accreditation of ABX Testing/Invention?

Reply #70
So Arnold, you are the inventor of ABX?


Thanks for the question, as I can now answer it in a better state of being informed about the prior (ca. 1950) non-interactive ABX test.

I am the sole constructor of the first Interactive ABX Comparator, and the person who did the first Interactive ABX test.

That test compared the amplifier section of a Heath AR1500 stereo receiver to a Dyna ST-400 power amp while driving Ohm F speakers.

It came out as random guessing.
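"Random guessing" here has a precise statistical meaning: under the null hypothesis, each ABX trial is a coin flip, so the score follows a binomial distribution. A minimal sketch of how such a result is judged (the trial counts below are hypothetical, not from the Heath/Dyna test, whose trial count isn't given):

```python
from math import comb

def abx_p_value(correct, trials):
    """Probability of scoring at least `correct` out of `trials`
    by pure guessing (binomial tail, p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# 9/16 is entirely consistent with guessing (p ~ 0.40);
# 14/16 would be strong evidence of an audible difference (p ~ 0.002).
print(abx_p_value(9, 16))
print(abx_p_value(14, 16))
```

A score whose tail probability stays large, as in the first case, is what "came out as random guessing" means.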

History and Accreditation of ABX Testing/Invention?

Reply #71
I owned the AR15, predecessor of the AR1500. Unfortunately it kept blowing output transistors, so I got rid of it.

Before that I had the AJ15, which is just the tuner section of an AR15, and Dynaco preamp and amp.

History and Accreditation of ABX Testing/Invention?

Reply #72
So Arnold, you are the inventor of ABX?


Thanks for the question, as I can now answer it in a better state of being informed about the prior (ca. 1950) non-interactive ABX test.

I am the sole constructor of the first Interactive ABX Comparator, and the person who did the first Interactive ABX test.

That test compared the amplifier section of a Heath AR1500 stereo receiver to a Dyna ST-400 power amp while driving Ohm F speakers.

It came out as random guessing.


I'm not asking if you did the first test. I'm asking if you are the inventor of ABX. Not the switchbox.  Not the test. 

ABX.

Are you the inventor of ABX?

History and Accreditation of ABX Testing/Invention?

Reply #73
I think I found you on the web, and I was then correct - you have nothing that compares with the entire ABX project in terms of scope and impact on the world of audio.  You can only wish!

Karl Rove was quite good with the “attack the messenger to deflect from the message” method, but you are not so good. Your new shot in the dark still misses. And wait, are you trying to compare my personal career successes with yours (IT career), or are you now taking credit for “the entire ABX project”? I will give you credit for the hobbyist/self-promotion/ignore-your-collaborators area, you definitely beat me there.

Quote
Clark's part appears in publications under his name, Nousaine's part appears in publications under his name, but neither of them devised or built the part that I take credit for. 

Furthermore, it might be argued that there may have been some appropriation of other people's work but not by me.

But, anyone who says “I invented ABX” has appropriated other people’s work.
And the Clark articles:
Quote
Just another example of your ignorance. I am mentioned in both articles.

in the Acknowledgements, not the author list I asked about. Our director always acknowledges his secretary in his papers, because he never puts in commas and she fills them in. You did much more than put in commas, but your contribution didn't put you in the author list.

About the question-
So Arnold, you are the inventor of ABX?


Thanks for the question, as I can now answer it in a better state of being informed about the prior (ca. 1950) non-interactive ABX test.

but you were informed in 1982, right?
However, the ASA was publishing a great many papers based on the JASA version of ABX long before the AES even mentioned the AES version of ABX. People who read journals and don't just drop their names know such things. They also know the nature of the differences between the two. One of the items in the committee review of Clark's AES ABX paper included clarifying these things, because enough people in the AES knew about the ASA version of ABX, which came much earlier. So did we.
and you still wrote "I invented ABX" after 1982.

Unless you know everything, you have areas of ignorance, just like me. One area I admit ignorance is the exact nature of the ABX developments in the 70’s and 80’s. I only know what I find within the internet, and that is what I have put in this forum. You are in a special position to be able to fill in some gaps. We know what you did, but what is the rest of the story?
Can you suppress your ego, quit attacking the messenger, and tell us more about the “History and Accreditation of ABX Testing/Invention”? More than repeating about the box and first test?
I fear not-
Quote
The entire ABX Project had many participants and extended over several decades, but by then the experimental design was fully settled and every machine that implemented it was like Foobar2000/ABX - yet another re-implementation of the original ABX Comparator concept that I made happen by myself, only assisted by the people who helped make it fail until it was ready for prime time. There was no "Why don't you try this?" As soon as we had failed I knew exactly what to do next and it worked as far as it went, until the next innovation was needed and I supplied that, too.
Everybody owes you, you, you.
(sigh)

History and Accreditation of ABX Testing/Invention?

Reply #74
I'm not asking if you did the first test. I'm asking if you are the inventor of ABX. Not the switchbox.  Not the test. 

ABX.

Are you the inventor of ABX?

We can simplify the response:
a-yes, I did it all, everyone owes me credit.
b-no, I played a big role, but it was a group effort.
c-I don't like the choices, so I'll attack someone whose message I don't like.

EDIT: changed from simple yes/no
