
Sample specific discussions: sample #1

As I said in the test results thread, I would like to discuss each sample separately. The overall results were tied and all encoders seemed to be equal. On closer inspection, it appears that each encoder had problems with at least some samples.

It would be useful to analyze each sample separately in order to find out what kind of problems the testers noticed and how severe they are. The discussion would help us understand the test results and probably also help the codec developers in their work. This has not been done before, but I think the outcome would be valuable.

Some testers added comments to the result files. Those comments are useful if the tester intends to revisit a saved session later. Unfortunately, the comments in the result files are quite hidden and cannot be easily evaluated and compared. That's why I didn't add comments to my results (except for some unfinished, partially wrong comments in one of my first result files - I meant to delete them, but forgot to do that).

This thread is for Sample #1. Please try to keep the discussion on topic. If you want to discuss any other sample, feel free to start a new thread for it. I am hoping that eventually we'll have 14 separate threads - one for each sample. I'll create them myself if others have not done so before me.


Sample #1 - finalfantasy

The overall results:

[results chart image no longer available]

The results from the individual testers:

[results chart image no longer available]
I sorted the testers so that the most critical tester is first on the left.

Since Sebastian has already removed the test sample links, I uploaded the first two sample packages to RapidShare so that anyone can listen to the actual samples: http://rapidshare.com/files/167567675/Samples_01_and_02.zip (7.5 MB)


I'll check my results and re-listen to the samples later today. I'll post my personal comments after that.



Reply #2
My results
[blockquote]iTunes: 3.80
LAME 3.98: 2.60
Low anchor: 1.00
FhG: 3.40
LAME 3.97: 1.80
Helix: 4.30
[/blockquote]Lame 3.97 has an obvious problem with this sample. The problem is similar to the known problems with harpsichord samples.
Lame 3.98 is better, but if you have just listened to 3.97 you can still clearly hear traces of the same problem.
FhG actually has a similar problem, but it is less pronounced.
iTunes and Helix are clearly better, but not transparent. They have a slight impurity in the string attacks and other high frequencies. Helix was closer to transparent.

EDIT

Here are the bitrates from Sebastian's bitrate table:
Code: [Select]
iTunes   LAME 3.98.2   l3enc (Low Anchor)   Fraunhofer   LAME 3.97   Helix
-----------------------------------------------------------------------------
118      107           128                  119          97          114

They may partially explain the results.
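As an illustrative sketch (not part of the original posts), the scores from this reply and the bitrates from Sebastian's table can be lined up side by side to make the score-versus-bitrate pattern easier to eyeball; all numbers below are copied verbatim from this reply, and the encoder labels are just shorthand:

```python
# Pair the subjective scores from this reply with the per-sample bitrates
# from Sebastian's table (both copied from the post above).
scores = {                      # rating scale: 1.0 (bad) .. 5.0 (transparent)
    "iTunes": 3.80,
    "LAME 3.98": 2.60,
    "Low anchor (l3enc)": 1.00,
    "FhG": 3.40,
    "LAME 3.97": 1.80,
    "Helix": 4.30,
}
bitrates = {                    # kbps on this sample
    "iTunes": 118,
    "LAME 3.98": 107,
    "Low anchor (l3enc)": 128,
    "FhG": 119,
    "LAME 3.97": 97,
    "Helix": 114,
}

# Sort encoders from best to worst score and print score next to bitrate;
# LAME 3.97's low bitrate (97 kbps) stands out next to its low score.
for name in sorted(scores, key=scores.get, reverse=True):
    print(f"{name:20s} score {scores[name]:.2f}  at {bitrates[name]:4d} kbps")
```

Note that the low anchor breaks the pattern by design: it gets the highest bitrate but the lowest score.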


Reply #3
My results:

iTunes: 1.7
Lame 3.98.2: 1.1
l3enc 0.99a: 1.0
Fraunhofer: 2.3
Lame 3.97: 1.3
Helix v5.1: 3.4

There are several problematic spots, but I will point out two.

At about 1.25 sec, distortion when 2 or 3 chords are hit at once:
  • Fraunhofer sounds better at this spot than all the others.
  • Lame 3.98.2 even has distortion in the echo and in some chords immediately after this; it sounds much worse than Lame 3.97.
  • Helix has less distortion than Lame 3.97.
At about 18.85 sec, a continuous falling violin-like sound is distorted:
  • Helix sounds much better at this spot than Fraunhofer and all the others.
  • With Lame 3.98.2 even the guitar sounds warble a lot; Lame 3.97 does much better and iTunes even better (but still worse than Helix & Fraunhofer).
I think this sample can contribute a lot to tuning Lame 3.98.2 at bitrates around 128 kbps.

This kind of analysis is time consuming and very tough for me to describe, since I have little knowledge of instrument sounds.
