
Transparency: Public Listening Tests vs Anecdotes and My Experiences

In my own listening tests (admittedly on only a few songs), I've found Opus at 96 kbps reliably transparent, and others' observations on HA seem to agree.  Yet in the public 96 kbps listening test, most samples were not found to be transparent at this bitrate.  I've also found Vorbis near-transparent at this bitrate: I can sometimes pick out subtle artifacts in critical listening, but I never notice anything obvious.  Yet Vorbis scored worse than Opus in the same listening test.

Similarly, 128 kbps AAC (FhG) seems transparent to me even with CBR, and even 96 kbps is close.  Yet the most recent listening test at this bitrate suggests that various AAC codecs perform similarly to Opus at 96 kbps.

Why do public listening tests seem so much more pessimistic than my experiences or, in the case of Opus, others' experiences on this forum?
  • Are exceptionally hard samples typically selected for listening tests?
  • Do listening test participants typically have an exceptionally good ear for subtle artifacts?
  • Have codecs gradually improved with time such that the listening tests I cite are outdated?
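
Since transparency claims like these ultimately rest on ABX results, here is a minimal sketch of the statistics behind a single ABX session: the exact binomial p-value that tools such as foobar2000's ABX comparator report.  The trial counts below are made up for illustration.

from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided exact binomial p-value: the probability of scoring
    at least `correct` out of `trials` by pure guessing (p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Hypothetical session: 14 correct answers in 16 trials.
correct, trials = 14, 16
print(f"{correct}/{trials} correct -> p = {abx_p_value(correct, trials):.4f}")
# A small p (conventionally < 0.05) means the difference was audible
# to this listener on this sample; a large p is only a failure to
# detect a difference, never proof of transparency.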

Re: Transparency: Public Listening Tests vs Anecdotes and My Experiences

Reply #1
Opus from 5 years ago and Opus today are two different beasts. Optimizations were made, so sound quality is even better. As far as I am aware, there have been no recent tests of codecs.

Re: Transparency: Public Listening Tests vs Anecdotes and My Experiences

Reply #2
Opus from 5 years ago and Opus today are two different beasts. Optimizations were made, so sound quality is even better.
The original poster is talking about 96 kbps.

During the last 5 years of Opus development:
5-48 kbps - big quality improvements
56-80 kbps - very small improvements
80-500 kbps - microscopic improvements that are hard to detect, plus bugfixes

  • Are exceptionally hard samples typically selected for listening tests?
It was a mixed bag, though hard samples were well represented.

  • Do listening test participants typically have an exceptionally good ear for subtle artifacts?
Yes; more than half of the results came from well-trained listeners.  In real-life scenarios, such listeners represent a low percentage of all people.

  • Have codecs gradually improved with time such that the listening tests I cite are outdated?
No. The last audible improvements were made in:

Apple AAC - approximately 10 years ago
Vorbis - 2011 (the last version, aoTuV beta 6)
MP3 LAME - 2011 (version 3.99, which contained quality improvements and not just bugfixes and miscellaneous changes; 3.100 and 3.100.1 exist only for maintenance and bugfixes)
Opus - the last measurable improvements for bitrates above 80 kbps were made in version 1.1, December 2013; since then only lower bitrates have been improved
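
For anyone who wants to check claims like these themselves, re-encoding the same clip at the relevant bitrates is easy to script.  Here is a rough sketch using the stock command-line encoders (opusenc from opus-tools and lame); the input file name is a placeholder, and only commonly documented flags are used.

import subprocess

SAMPLE = "sample.wav"  # placeholder: any short test clip (e.g. 16-bit/44.1 kHz)

# Opus at the bitrates discussed above; opusenc uses VBR by default.
for kbps in (64, 80, 96):
    subprocess.run(
        ["opusenc", "--bitrate", str(kbps), SAMPLE, f"opus_{kbps}.opus"],
        check=True,
    )

# LAME at roughly 128 kbps VBR (-V 5), the setting often used as a high anchor.
subprocess.run(["lame", "-V", "5", SAMPLE, "lame_v5.mp3"], check=True)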

Re: Transparency: Public Listening Tests vs Anecdotes and My Experiences

Reply #3
In my own listening tests (admittedly on only a few songs), I've found Opus at 96 kbps reliably transparent, and others' observations on HA seem to agree.  Yet in the public 96 kbps listening test, most samples were not found to be transparent at this bitrate.  I've also found Vorbis near-transparent at this bitrate: I can sometimes pick out subtle artifacts in critical listening, but I never notice anything obvious.  Yet Vorbis scored worse than Opus in the same listening test.

The thing about such public tests (96 kbps and higher) is that the participants are mainly trained listeners, not average users.
To obtain more realistic results by including more average listeners, tests should be conducted at lower bitrates, such as 48-64 kbps.

For example, this one: http://www.mp3-tech.org/tests/aac_48/results.html . Look how well MP3 LAME 128 kbps performed: it was used as the high anchor, and because it was a low-bitrate test there were plenty of average users.

Considering this, everything that performs as well as LAME MP3 128 kbps VBR will be in the transparent zone for the great mass of people.

 

Re: Transparency: Public Listening Tests vs Anecdotes and My Experiences

Reply #4

For example, this one: http://www.mp3-tech.org/tests/aac_48/results.html . Look how well MP3 LAME 128 kbps performed: it was used as the high anchor, and because it was a low-bitrate test there were plenty of average users.

Considering this, everything that performs as well as LAME MP3 128 kbps VBR will be in the transparent zone for the great mass of people.

Interesting!  I've definitely ABX'd LAME 128 kbps VBR (-V5) successfully on some songs where Opus 96 is perfectly transparent to me.  I listen to a lot of rock with lots of cymbals, and at non-transparent bitrates LAME can produce some very obvious nonlinear, robotic artifacts there.  When Opus becomes non-transparent, the artifacts sound more like subtle additive noise and stereo-image distortion; they're harder to describe and harder to pick out by listening closely to a specific instrument.

Also, Opus 96 scores ~0.4 MOS points higher than LAME 128 in the public 96 kbps listening test.  I guess the tl;dr is that Opus 96 is non-transparent on killer samples and/or to people who really know which subtle artifacts to listen for, but probably transparent in most other cases.
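
For context on what a ~0.4-point gap means, MOS differences are only meaningful relative to their confidence intervals.  Below is a rough sketch of the usual mean-and-95%-CI computation; the rating arrays are invented for illustration, not taken from the actual test.

from math import sqrt
from statistics import mean, stdev

def mos_with_ci(ratings, z=1.96):
    """Mean opinion score with a normal-approximation 95% confidence interval."""
    m = mean(ratings)
    half = z * stdev(ratings) / sqrt(len(ratings))
    return m, m - half, m + half

# Made-up 1-5 ratings for a single sample; real tests pool many
# listeners and samples and use better interval estimates.
opus_96  = [4.5, 4.8, 4.2, 5.0, 4.6, 4.4, 4.7, 4.9]
lame_128 = [4.0, 4.3, 3.9, 4.5, 4.1, 4.2, 3.8, 4.4]

for name, ratings in (("Opus 96", opus_96), ("LAME 128", lame_128)):
    m, lo, hi = mos_with_ci(ratings)
    print(f"{name}: MOS = {m:.2f} (95% CI {lo:.2f}..{hi:.2f})")
# If the two intervals overlap heavily, a ~0.4-point gap may not be
# statistically meaningful for that sample.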

Re: Transparency: Public Listening Tests vs Anecdotes and My Experiences

Reply #5
I guess the tl;dr is that Opus 96 is non-transparent on killer samples and/or to people who really know which subtle artifacts to listen for, but probably transparent in most other cases.
Agree

 