Topic: Yalac - Comparisons

Yalac - Comparisons

Reply #200

Don't you think you are putting a bit too much work into error correction? I think I have never had a file with corrupt frames, and the only scenario I could imagine would be an audio stream over WLAN, which is not the best way to transport lossless files to be saved on another computer.


I store my lossless "archive" on CDs. I use Monkey's Audio for this, because it compresses well enough that I can always find two albums to put together on one CD. However, whenever I can, I put the files in a RAR archive with recovery information to fill up the rest of the available space on the CD, because I would hate to lose my lossless copies due to holes in the CD. If this protection can be provided by the audio codec itself, I'd consider that a point in favour of choosing that format.


Maybe Reed-Solomon codes could be put into some tag? Or here is another chance for me to suggest "special" frames - this time with error-correction codes. Anyway, it would be a very valuable feature.

Yalac - Comparisons

Reply #201
Maybe Reed-Solomon codes could be put into some tag? Or here is another chance for me to suggest "special" frames - this time with error-correction codes. Anyway, it would be a very valuable feature.

IMO, error correction measures (for a stream or file) are not within the scope of a specific (audio) codec. The place to implement such a feature would be the (generic) container format and its software components like muxer and parser/splitter.

@TBeck
I just have to ask again: 
Have you had a look at existing container formats like Ogg, Matroska and others? What's your reason for not using one of them?

Yalac - Comparisons

Reply #202
IMO, error correction measures (for a stream or file) are not within the scope of a specific (audio) codec. The place to implement such a feature would be the (generic) container format and its software components like muxer and parser/splitter.


I'm not sure about this. Yes, it's logical - but only the sound samples should be covered by ECC, not the tags or anything else. I don't know whether that can be done at the generic container level.

Yalac - Comparisons

Reply #203
- Some trauma from the release of the first evaluation version: one tester had nothing better to do than to damage the files and report errors and crashes...

That tester was me

As a side note, implementing an optional recovery record, like the PAR2 / RAR recovery record (redundancy) or CIRC, would be nice -- it would allow partial recovery of a compressed file, or total recovery if it isn't too badly damaged; this type of damage happens often on removable media like DVDs or CDs, or in the case of HDD failures.

Think about it.
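To make the recovery-record idea a bit more concrete, here is a minimal sketch (Python, purely illustrative and not Yalac's format): a single XOR parity block per group of blocks can rebuild any one lost block in that group. Real schemes like Reed-Solomon or PAR2 tolerate multiple losses; all sizes and names below are made up.

Code:
# Toy recovery record: one XOR parity block per group of equal-sized blocks.
# Any single damaged/missing block in the group can be rebuilt from the
# remaining blocks plus the parity. (Reed-Solomon / PAR2 do much better.)

BLOCK_SIZE = 4096  # hypothetical block size

def make_parity(blocks):
    """XOR all blocks of the group together."""
    parity = bytearray(BLOCK_SIZE)
    for block in blocks:
        for i, b in enumerate(block):
            parity[i] ^= b
    return bytes(parity)

def recover_missing(blocks, parity):
    """Rebuild the single block given as None from the others plus parity."""
    rebuilt = bytearray(parity)
    for block in blocks:
        if block is None:
            continue
        for i, b in enumerate(block):
            rebuilt[i] ^= b
    return bytes(rebuilt)

# Usage sketch:
#   parity = make_parity(group)                 # stored as extra data in the file
#   group[2] = None                             # block lost to media damage
#   group[2] = recover_missing(group, parity)   # restored bit-exactly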

Yalac - Comparisons

Reply #204

IMO, error correction measures (for a stream or file) are not within the scope of a specific (audio) codec. The place to implement such a feature would be the (generic) container format and its software components like muxer and parser/splitter.


I'm not sure about this. Yes, it's logical - but only the sound samples should be covered by ECC, not the tags or anything else. I don't know whether that can be done at the generic container level.


Why exclude the tags etc. from error correction? They are maybe 0.1% of the compressed file size, so excluding them won't save much space or gain much security.

However, I agree that error correction should be provided by the container format.

Yalac - Comparisons

Reply #205
Why exclude the tags etc. from error correction? They are maybe 0.1% of the compressed file size, so excluding them won't save much space or gain much security.

Because if someone changes the tags, it flags an error.

 

Yalac - Comparisons

Reply #206

Why exclude the tags etc. from error correction? They are maybe 0.1% of the compressed file size, so excluding them won't save much space or gain much security.

Because if someone changes the tags, it flags an error.

No, the tagging software must update the error correction codes when it changes the tags. Otherwise it is really damaging the file.

Yalac - Comparisons

Reply #207
@TBeck
I just have to ask again: 

Do you?

Have you had a look at existing container formats like Ogg, Matroska and others? What's your reason for not using one of them?

Well, that's me, somewhat lost in the new wonderful world of streaming and container formats...

I've become a bit lazy over the years. I am dealing only with things I need or find interesting. Unfortunately streaming and container formats have not been among them.

But because of Yalac I had to take a quick look at them, and then come to a quick decision.

Things that seem important for me:

- The container for Yalac should be easy for other developers to use.
- For audio purposes it can be quite simple and limited.
- Matroska seems to be an extremely ambitious approach, which I really appreciate. But I have often read that it is seldom used because of its complexity. I don't know enough about Matroska to say whether that is really true, but it seems to scare away some developers.
- FLAC and WavPack are successful although they use their own containers (yes, I know that you can always wrap another container around them).

My current plan (it may change once I know more):

Make a simple container format for Yalac that other developers can easily use. I doubt that this would prevent Yalac from being successful (there may be other, more important reasons for limited success).

And if my decision is wrong? Just throw away my container, use only the codec, and wrap the data in a container of your choice.

Yalac - Comparisons

Reply #208


Why exclude the tags etc. from error correction? They are maybe 0.1% of the compressed file size, so excluding them won't save much space or gain much security.

Because if someone changes the tags, it flags an error.

No, the tagging software must update the error correction codes when it changes the tags. Otherwise it is really damaging the file.


Shade has a good point, though. Updating the ECCs for every tagging operation might be really slow.
I think we're getting off topic again... This discussion should be in the file format thread.

Yalac - Comparisons

Reply #209
Because if someone changes the tags, it flags an error.
No, the tagging software must update the error correction codes when it changes the tags. Otherwise it is really damaging the file.

Generally, hashes cover the audio only. There is no reason to hash the tags together with the audio data, so if it is done properly there is no need to recompute the ECCs when you update the file's tags.
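As an illustration of that principle (a sketch only, not Yalac's actual mechanism, though FLAC stores an audio-only MD5 in the same spirit): hash the decoded samples, never the tag bytes, so retagging cannot invalidate the checksum.

Code:
# Sketch: checksum only the decoded PCM samples, never the tag bytes.
import hashlib

def audio_md5(pcm_blocks):
    """pcm_blocks: iterable of raw sample byte strings produced by the decoder."""
    md5 = hashlib.md5()
    for block in pcm_blocks:
        md5.update(block)
    return md5.hexdigest()

# Retagging rewrites only the metadata; the result of audio_md5() stays
# identical, so no audio checksums or ECC data need to be recomputed.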

Yalac - Comparisons

Reply #210
Wow... it's funny that I've been involved in these discussions about error detection, because I just had to make major use of that tonight.

I was doing a very large encode of hundreds of lossless albums, and I started noticing that foobar was giving me mismatched checksum errors, so I plunked my APE files into the Monkey's Audio GUI, and I found that I had several corrupt files.

It was quite a bit of work to do this, and took a lot of time.  The fact that I could still do it fairly easily was good though.  Fortunately, I had backups of my music, but if I didn't I'd hope that there would still be some way of salvaging what wasn't damaged, since these were lossless image files. (i.e. a whole CD in one file).

When developing YALAC, I hope that attention is given to some kind of a system that allows quick and easy detection of errors, and also can pinpoint the frame (time into the track, perhaps) where the error is.  If an error is found, it should be trivial to skip it, silence the bad frame, and go onto the next one.

As for my problem, I'm guessing that I had a bad disk drive, or a poor USB2 connection, or something like that, but whatever the case, I had corrupted files, and I would never have known if I hadn't tried to encode them all and noticed that foobar was popping up some mysterious error that I couldn't read. Some detective work allowed me to find out what was going on, and now I'm back to normal...


I thought my real-world scenario here would help to drive home the relevance of this topic, because I know how often this stuff just seems like abstract theory with no likely application to the real world.

Yalac - Comparisons

Reply #211
When developing YALAC, I hope that attention is given to some kind of a system that allows quick and easy detection of errors, and also can pinpoint the frame (time into the track, perhaps) where the error is.  If an error is found, it should be trivial to skip it, silence the bad frame, and go onto the next one.

Sounds a bit as if you were describing some of the new features of V0.10. They are done...

Just a quick summary of the new features of V0.10:

- Error detection, automatic repairing of damaged files, error protocol.
- 1 Hardware profile (a variation of the old TURBO preset). Yes, only 1 is better... Some of you told me so earlier.
- File format is ready for streaming. Currently not too useful without software that can use it.

I thought my real-world scenario here would help to drive home the relevance of this topic, because I know how often this stuff just seems like abstract theory with no likely application to the real world.

Thanks!

Yalac - Comparisons

Reply #212
Enough progress for V0.10?

Much has been done, but there is even more left to do. I could easily spend another 2 months or more implementing all the things on my to-do list. And it keeps growing...

I am tempted to release V0.10 soon (to the testers), because it is always very motivating to receive some feedback and because there may be enough new material to be tested.

Please read about the new features and tell me if you would like to test them.

File format

You know that most of my latest work has gone into the file format:

- Error detection and recovery of damaged files
- Streaming support

The first thing to test would be the recovery of damaged files. You may have read that I have built a tiny tool to damage files.

Hardware profiles and presets

We have talked about this earlier. First I wanted to implement the hardware profiles as additional presets. Some of you found this confusing (special thanks to Synthetic Soul) and I guess you are right.

Therefore V0.10 will again only provide the familiar presets TURBO to EXTRA (I dropped INSANE; part of it went into EXTRA). But TURBO, FAST and NORMAL are now being called hardware profiles with restrictions on the encoder options. You will see that some of the encoder options are now disabled in the options dialog if you select a restricted preset.

For the restricted presets you can now choose the evaluation level: Normal, Extra or Max. A higher evaluation level means lower encoding speed but better compression. It's your choice! Regardless of the evaluation level you will always stay within the set of allowed options for the selected preset.

Evaluation level Normal is the default and is mostly equal to the old presets. Level Extra has been configured to provide as much compression as possible at half the encoding speed of level Normal. Level Max sets all possible options to the maximum and compresses only slightly better than Extra.

Important: The evaluation level has nearly no effect on the decoding speed!

The following table provides some results from my primary test file sets "rw" and "songs". Test system: P3-800. All tests performed without file output.

Code:
        | Compression             | Encoding speed          | Decoding speed  |
        | Evaluation              | Evaluation              | Evaluation Max  |
Preset  | Normal  Extra   Max     | Normal  Extra   Max     | MMX     Pascal  |
--------+-------------------------+-------------------------+-----------------+
rw      |                         |                         |                 |
Turbo   |  58.29   57.90   57.84  |  57.54   29.53   17.68  |  71.05   61.03  |
Fast    |  57.32   57.03   56.93  |  38.56   19.86   11.79  |  66.45   47.96  |
Normal  |  56.75   56.58   56.55  |  20.29   10.94    7.85  |  61.06   37.38  |
N 0.09  |  56.66   56.45   56.41  |  18.03    7.94    5.67  |  54.86   32.59  |
--------+-------------------------+-------------------------+-----------------+
songs   |                         |                         |                 |
Turbo   |  49.47   49.16   49.11  |  54.69   29.87   17.97  |  72.85   62.20  |
Fast    |  48.49   48.19   48.09  |  39.24   19.82   11.77  |  67.73   47.20  |
Normal  |  47.79   47.67   47.63  |  19.68   10.78    7.72  |  60.31   33.19  |
N 0.09  |  47.69   47.54   47.50  |  17.50    7.86    5.56  |  54.89   28.83  |
--------+-------------------------+-------------------------+-----------------+

Preset

Restricted presets of V0.10. "N 0.09" is preset Normal of V0.09.

Evaluation

Normal/Extra/Max determines the evaluation depth of the encoder. This should only affect encoding but not decoding speed.

Compression

Compression ratio in percent.

Encoding / Decoding speed

Multiple of real time. Because decoding speed should not depend on the evaluation level, I only tested Max. MMX is the speed of the optimized assembler code, Pascal the speed of the pure Pascal code.

Two presets have been modified:

Turbo now uses 12 instead of 8 predictors. You may remember our discussion about 8 vs. 16 predictors for preset Turbo. Well, 12 lies in between, and FLAC -8 uses 12 predictors too, hence we may be able to perform some nice comparisons. The frame duration of Turbo has been increased from 63 to 94 ms to compensate for the compression penalty caused by the space requirements of the new file format additions (sync codes, CRC, streaming info).

Normal now uses smaller frames and only 96 instead of 128 predictors. Compression is about 0.1 percent worse, but encoding speed is more than 10 percent higher on my system. Decoding speed is more than 15 percent higher, which seems important for a (restricted) hardware profile.

Final question to the testers:

- Would you like to test it now?
- Or wait for more new features?
- Or quit testing? (Don't even think about it... please...)

Yalac - Comparisons

Reply #213
The preliminary report you give us and the progress you've made in organization are encouraging. I do not think that any supplemental optimizations would be required from different test groups, but you can always release the version... Most testers will give you feedback in the next 2 days anyways ;-)

As you've noticed, I've mainly concentrated (of late) on giving opinions -- I don't think my test bench is special in any way, so...

Anyways, your choice depends on whether you want to analyse the data that Joseph will give you ;-)

I'm sure, in any case, that positive feedback is welcome and encouraging.

Good luck,
Tristan.

Yalac - Comparisons

Reply #214
Quote
As you've noticed, I've mainly concentrated (of late) on giving opinions -- I don't think my test bench is special in any way, so...

And your opinion is important for me!

Perhaps you could be so nice as to damage some files later...

Quote
Anyways, your choice depends on whether you want to analyse the data that Joseph will give you ;-)

Good point. Probably he is already missing something.

Quote
I'm sure, in any case, that positive feedback is welcome and encouraging.

Oh yes!


Yalac - Comparisons

Reply #216
I'm all up for some testing Thomas.  I think the usual runs comparing 0.10 to 0.9 and other codecs plus some new error recovery testing is ample to be getting on with.

I may be slower with the error detection tests as that sounds like some manual process (e.g.: listening) will be required, and I don't get much time for such things.  My speed/compression tests look after themselves pretty much on the whole.

Yalac - Comparisons

Reply #217
I'm all up for some testing Thomas.  I think the usual runs comparing 0.10 to 0.9 and other codecs plus some new error recovery testing is ample to be getting on with.

Great!

I may be slower with the error detection tests as that sounds like some manual process (e.g.: listening) will be required, and I don't get much time for such things.  My speed/compression tests look after themselves pretty much on the whole.

Oh, "Listening"?  That would be really much work!

The test could consist of two parts:

1) Damage files with my damaging utility. Then decompress them and see whether Yalac crashes. This can all be performed automatically. If Yalac crashes, you can stop the test and let me correct the program.

2) Pick some files, load them into an audio editor and check whether the error positions (frames) reported by the error log file have been muted.

OK, a listening test on one or two files would be nice, but this is really far more than I would have expected you to do. And none of this is urgent.

It may take some days until V0.10 is done.

Edit: I am too dumb... There is no need for a listening test! Just perform a binary compare of the decoded file. If every difference (a muted frame) has been reported by the error protocol, everything is OK!
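For anyone automating this, a rough sketch of such a test run might look like the following (Python; every file name, tool name and command line switch here is a placeholder, since the real interfaces of the damaging utility and the Yalac command line aren't specified in this thread):

Code:
# Sketch: damage -> decode -> binary compare, with placeholder tool names.
import filecmp, subprocess, sys

ORIGINAL   = "album.wav"            # the source file
COMPRESSED = "album.yalac"          # its lossless encode
DAMAGED    = "album_damaged.yalac"
DECODED    = "album_decoded.wav"

# 1) Damage the compressed file (hypothetical damaging utility).
subprocess.run(["damage_tool", COMPRESSED, DAMAGED], check=True)

# 2) Decode it; a crash or non-zero exit code is the first thing to report.
result = subprocess.run(["yalac", "-d", DAMAGED, DECODED])
if result.returncode != 0:
    sys.exit("Decoder failed on the damaged file - stop and report this run.")

# 3) Binary compare with the original. Differences are expected, but every
#    differing region should correspond to a frame the error protocol
#    reports as muted; anything else means an undetected error.
identical = filecmp.cmp(ORIGINAL, DECODED, shallow=False)
print("Decoded file identical to source:", identical)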

Yalac - Comparisons

Reply #218
TBeck, I had not been too interested in running encoding tests, because it's kind of mundane, and I think the others can do a better job with some of the scripts they have.

I would like to see how your codec can handle damaged files though, so you can include me in your test group when you're ready.  Now that I've had some firsthand experience with damaged files, I'd like to see how your codec compares to others I've used.

BTW, is there a "test" mode in YALAC's command line utility?  I guess even if the file is repaired, it should have a flag of some sort in it indicating that it is different from the original encode, or that it is a repaired damaged file, and not identical to the original source.  That might be important for testing the integrity of a backup or something.

Yalac - Comparisons

Reply #219
- Error detection and recovery of damaged files
- Streaming support

Great!

Quote
Therefore V0.10 will again only provide the familiar presets TURBO to EXTRA (I dropped INSANE; part of it went into EXTRA). But TURBO, FAST and NORMAL are now being called hardware profiles with restrictions on the encoder options. You will see that some of the encoder options are now disabled in the options dialog if you select a restricted preset.

For the restricted presets you can now choose the evaluation level: Normal, Extra or Max. A higher evaluation level means lower encoding speed but better compression. It's your choice! Regardless of the evaluation level you will always stay within the set of allowed options for the selected preset.

Is it a bit confusing that Turbo Max encoding is slower than Normal?
If the speed is "Turbo" at decoding, I would like a preset name that suggests that. Maybe Fast--> Light, Turbo--> Superlight (Ultralight?)... Shows that it doesn't have heavy requirements for decoding. Encoding may be fast or slow depending on evaluation level.

Yalac - Comparisons

Reply #220
V0.10 is (nearly) done

Compression algorithms:

- Implementation of the classic channel decorrelation method "Mid-Side". Because of its high speed it's a welcome addition to the fast presets.
- A lower criterion for PreFilter sensitivity Low, which makes it a bit more sensitive.

Presets:

- TURBO uses 12 instead of 8 predictors and a frame duration of 94 instead of 63 ms. Because of this and the processing time needed for the new checksum calculation, it will be a bit slower.
- FAST uses a frame duration of 125 instead of 100 ms.
- NORMAL uses 96 instead of 128 predictors and a frame duration of 188 instead of 250 ms. It should encode 10 percent faster, but may lose about 0.1 percent compression.
- HIGH uses 160 instead of 192 predictors. It should encode 5 percent faster, but may lose about 0.05 percent compression.
- EXTRA uses an optimized implementation of the old frame partition search level Insane, which has replaced High. Encoding will be a bit slower; compression can be slightly better.
- INSANE is gone.
- Presets TURBO, FAST and NORMAL are now being called restricted hardware profiles. New evaluation levels let you vary their encoding efficiency (the encoding speed vs. compression tradeoff).

File format:

- Streaming support.
- Error detection.

Because the new streaming format is only partially implemented, only a subset of the possible sample rates is supported in this version: 8000, 11025, 16000, 22050, 32000, 44100, 48000 and 96000 Hz.

The full implementation will support any sample rate from at least 8 to 192 kHz and up to 16 channels.

Program functions:

- Test the integrity of a compressed file.
- Perform a test encode of a file without creating any output.
- Reconstruct damaged files.

Release:

I hope to send the new version to the following testers within the next 3 days:

Destroid
JohanDeBock
Josef Pohm
Shade[ST]
Supacon
Synthetic Soul

You may ask why it may take another 3 days instead of the usual 24 hours to release it. I always begin my final testing after this announcement, and this time it is very probable that I will find some errors that I will have to correct before the release.

What should be tested:

- Comparison with V0.09: speed and compression performance of the presets.
- Because TURBO, FAST and NORMAL now allow the selection of one of three evaluation levels, there would be 9 combinations. No need to test them all! I would only vary the evaluation level on preset TURBO, which is probably the most attractive preset for potential hardware support.
- Test the error detection and recovery:

1) Damage files with my damaging utility. Then decompress them and see whether Yalac crashes. This can all be performed automatically. If Yalac crashes, you can stop the test and let me correct the program.

2) Perform a binary compare of the decoded file. If every difference (a muted frame) has been reported by the error protocol, everything is OK!

3) If you can't get enough: pick some files, load them into an audio editor and check whether the error positions (frames) reported by the error log file have been muted.

Warning:

I never changed so much at once. This time program errors are very probable! Sorry in advance...

Plans for V 0.11:

- Better functionality: possibility to cancel encoding/decoding, and to specify what to do if an (output) file already exists.
- New switches for the command line version to give access to more options currently only available in the GUI version.
- Finalization of the streaming format.
- Possibly try some new ideas for some compression improvements... I need this from time to time...

Other important items from my to do list:

- File format should support tagging and metadata.

Yalac - Comparisons

Reply #221
Well, I've run a small test on a single file.  I tested out the error handling, and I'm impressed... it works very elegantly.  If an APE file had the slightest error, it'd just bork the whole file and you couldn't do much about it.  YALAC has a great deal of configurability to work for anyone's purposes, I suppose.

I'm not going to get into the detailed results of compression and such, but I noticed that the restricted "Fast" preset produced a file that seemed very slow to decode, exceeded only by Unrestricted Extra in the time it took to decode.  That's an odd anomaly.  Perhaps my computer was just busy, or I have a freak file?

From what I've seen, unrestricted High appears to be the most useful, with reasonable encoding speed (much faster than Extra, at least), excellent compression ratios, and very fast decoding speed.

Perhaps later I can convert several albums containing different genres of contemporary popular music using your codec for a more involved analysis.


I do have a question that I hadn't thought of before:

For seeking in audio players, does the stream info frame have anything to do with that?  It was implied in the documentation that you could only seek to a spot every 2 seconds in the file by default, but I presume that's only over a streamed connection, and seeking on a local disk or something in a program like foobar would be normal, correct?

Yalac - Comparisons

Reply #222
Well, I've run a small test on a single file.  I tested out the error handling, and I'm impressed... it works very elegantly.  If an APE file had the slightest error, it'd just bork the whole file and you couldn't do much about it.  YALAC has a great deal of configurability to work for anyone's purposes, I suppose.

Great news! Thanks.

I'm not going to get into the detailed results of compression and such, but I noticed that the restricted "Fast" preset produced a file that seemed very slow to decode, exceeded only by Unrestricted Extra in the time it took to decode.  That's an odd anomaly.  Perhaps my computer was just busy, or I have a freak file?

That's a bit strange. It cannot be caused by the calculations performed by the decoder. The only reasons I could think of:

1) Disturbances by other system activities.
2) The disk I/O of some systems seems to be sensitive to the block size Yalac uses for file I/O. Turbo and Fast use smaller blocks than the other presets - but usually this makes them faster!

I do have a question that I hadn't thought of before:

For seeking in audio players, does the stream info frame have anything to do with that?  It was implied in the documentation that you could only seek to a spot every 2 seconds in the file by default, but I presume that's only over a streamed connection, and seeking on a local disk or something in a program like foobar would be normal, correct?

You are right. No restrictions for the common software players.

A streamed connection could have a start-up latency (when joining the stream) of up to the info frame interval.

Seeking on some hardware players could be limited to the user-selected stream info interval (2 seconds is the default). But with a good decoder implementation they could overcome this limitation. Seeking to non-info frames is always possible (at least if the hardware has enough resources), but it will be a bit slower.
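For illustration only (this is not taken from the Yalac documentation): a constrained decoder with periodic info frames would jump to the last info frame at or before the target time and, if it can, decode forward to the exact position.

Code:
# Sketch: seeking with periodic info frames (interval is user-selected,
# 2 seconds by default according to the discussion above).
def seek(target_seconds, interval_seconds=2.0):
    info_frame_time = (target_seconds // interval_seconds) * interval_seconds
    decode_forward = target_seconds - info_frame_time  # extra decode work
    return info_frame_time, decode_forward

# seek(7.5) -> (6.0, 1.5): jump to the info frame at 6 s, decode 1.5 s ahead.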

Thank you for your fast feedback!


Yalac - Comparisons

Reply #224
How can streaming now be tested?  Can anyone advise on how to set up a remote streaming server somewhere, so I could see how effective YALAC is as a streaming format?

Perhaps it could be tested with some classical music, or something that compresses to low bitrates.