
Yalac - Comparisons

Reply #226
If an APE file had the slightest error, it'd just bork the whole file and you couldn't do much about it.


If an APE file is broken, at least Foobar2000 can play it after 1-2 seconds of silence, and you can convert it to another format without errors - but, of course, with the silent samples.

And the compression of the new version isn't better - it's even worse than 0.0.9, and even Extra is worse than APE High...

Yalac - Comparisons

Reply #227
Good test, JohanDeBock!

About APE: I had problems where I was decoding APE images with cuesheets and an error was encountered.  foobar2000 0.9 would usually just abort and go to the next file, and foobar2000 0.8 would play static for the remainder of the file, or jumble things up quite a bit... one time it also skipped ahead, and I heard the beginning of the next song at the end of what was supposed to be the current one.  There's some very inconsistent error handling with Monkey's.

Zergen, I wonder if the new error correction and stream info impact the compression?
Did you change much else about the codec, Tbeck, like which options are used in the presets?

Yalac - Comparisons

Reply #228
Zergen, I wonder if the new error correction and stream info impact the compression?
Did you change much else about the codec, Tbeck, like which options are used in the presets?

Yes to both questions.

You can find a detailed specification of the preset modifications in the release post (V0.10). Normal and High sacrifice some compression for a bit more speed.

Extra now uses an optimized variation of an older encoder option previously used by the removed preset Insane, which seems to perform slightly worse than before. Possibly I will go back to the old implementation.

The position of Yalac is a bit difficult. I want it to decode considerably faster than other codecs with similar compression ratios. Therefore I cannot use some better-performing but slower compression algorithms.

Another restriction: I don't want to use methods that could be patented.

Monkey's Audio, for instance, uses a range encoder for the compression of the prediction residuals. It is believed to be patent-free, but you cannot be totally sure.

I too have played with range coding (about 6 years ago), but later removed it because of the possible patent issues.

If I were to reactivate and optimize it a bit, any preset of Yalac could compress about 0.25 to 0.30 percent better with a very small speed penalty. On many test sets this would be sufficient to be on par with Monkey's.

But currently I want to avoid the possible patent issues. Maybe I will add it as an option that can be removed if it causes trouble.
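
To illustrate (just a rough sketch, not Yalac's actual code): a fast, patent-unencumbered Rice/Golomb-style bit coder for prediction residuals can be as simple as the following. The parameter k, the buffer size and the sample values are made up for the example.

Code: [Select]
/* Rice-style coding of prediction residuals (illustrative sketch).
 * Each residual is zigzag-mapped to an unsigned value, whose quotient
 * (by 2^k) is written in unary and whose remainder in k plain bits. */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint8_t buf[1024];   /* output buffer */
    size_t  pos;         /* current bit position */
} BitWriter;

static void put_bit(BitWriter *w, int bit) {
    if (bit)
        w->buf[w->pos >> 3] |= (uint8_t)(0x80u >> (w->pos & 7));
    w->pos++;
}

static void rice_encode(BitWriter *w, int32_t residual, unsigned k) {
    /* zigzag map: 0,-1,1,-2,2,... -> 0,1,2,3,4,... */
    uint32_t u = (residual >= 0) ? (uint32_t)residual * 2u
                                 : (uint32_t)(-(int64_t)residual) * 2u - 1u;
    uint32_t q = u >> k;
    for (uint32_t i = 0; i < q; i++) put_bit(w, 1);  /* unary quotient */
    put_bit(w, 0);                                   /* terminator */
    for (int i = (int)k - 1; i >= 0; i--)            /* k remainder bits */
        put_bit(w, (int)((u >> i) & 1u));
}

int main(void) {
    BitWriter w = { {0}, 0 };
    int32_t residuals[] = { 0, -1, 3, -7, 12 };      /* made-up residuals */
    for (int i = 0; i < 5; i++)
        rice_encode(&w, residuals[i], 2);            /* k = 2, assumed */
    printf("encoded 5 residuals into %u bits\n", (unsigned)w.pos);
    return 0;
}

A range coder, by contrast, can spend fractional bits per symbol according to a probability model, which is where the extra few tenths of a percent come from, at the price of more arithmetic per sample on both the encoder and the decoder side.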

Yalac - Comparisons

Reply #229
If you add range coding back and there is a patent on the technology, wouldn't the patented technique also be needed to decode that same info?  E.g. if you remove the range coding from the encoder, old files with range coding will still need an 'illegal' decoder to decompress them, no?

I suppose that's not your problem as long as you don't distribute all the decoders, but still...

 

Yalac - Comparisons

Reply #230
Monkey's Audio, for instance, uses a range encoder for the compression of the prediction residuals. It is believed to be patent-free, but you cannot be totally sure.

The "range coder" publication is quite dated and until now nobody has claimed anything. I think it's a safe thing.  (I'd make use of it for any future codec). Besides, in Germany you don't need to worry about that unless you plan to earn money with it, IIRC. (PPL who want to make commercial use of your codec would have to worry, though)

BTW: Reportedly, the company On2 uses a range coder for their VP6 video codec. I don't think they pay any fees to anyone because of that.

Sebastian

Yalac - Comparisons

Reply #231
The "range coder" publication is quite dated and until now nobody has claimed anything. I think it's a safe thing.  (I'd make use of it for any future codec). Besides, in Germany you don't need to worry about that unless you plan to earn money with it, IIRC. (PPL who want to make commercial use of your codec would have to worry, though)

Thanks for the encouragement!

But you know that there are strong forces in Europe who want to change this. You are right, though: often the patent trouble begins when someone is earning money with the code (and hence has something to pay the fees with...).

BTW: Reportedly, the company On2 uses a range coder for their VP6 video codec. I don't think they pay any fees to anyone because of that.

That's a very good point!

To be honest, besides the patent issues there is one other reason why I didn't use it: I did not understand every detail of the range coder... Shame... But that's no excuse! I will have to learn a bit more.

(Hm, I shouldn't write such things if I cared about my self-presentation...)

  Thomas

Yalac - Comparisons

Reply #232
Good test, JohanDeBock!

About APE: I had problems where I was decoding APE images with cuesheets and an error was encountered.  foobar2000 0.9 would usually just abort and go to the next file, and foobar2000 0.8 would play static for the remainder of the file, or jumble things up quite a bit... one time it also skipped ahead, and I heard the beginning of the next song at the end of what was supposed to be the current one.  There's some very inconsistent error handling with Monkey's.

Zergen, I wonder if the new error correction and stream info impact the compression?
Did you change much else about the codec, Tbeck, like which options are used in the presets?

Well, maybe there really is some very inconsistent error handling with Monkey's Audio. I have not had so many such errors, and in my case it did reasonably well.
About compression - I understand that a formatted stream cannot be as efficient as raw data, and I don't complain about this. For me, the ratio decrease due to the new format is fully acceptable, even though I cannot imagine a situation where I would stream lossless. But the new Normal and High don't make me happy (especially because I don't care much about decompression speed - as long as it can be played and written to CD on the fly, all is OK). But those are just different priorities...

Yalac - Comparisons

Reply #233
I too have played with range coding (about 6 years ago), but later removed it because of the possible patent issues.

If I were to reactivate and optimize it a bit, any preset of Yalac could compress about 0.25 to 0.30 percent better with a very small speed penalty. On many test sets this would be sufficient to be on par with Monkey's.

Well, I have to reply to myself to correct myself...

I should have known better.

Some weeks ago I played a bit with my old arithmetic encoding implementation and could only achieve about 0.150 percent better compression on average. But I was quite sure that I could improve it.

Now I have optimized my models and am achieving 0.165 percent better compression...

Then I switched to a range coder and it got even worse: only 0.123 percent left!

My current (very fast) bit coder seems to compare better than expected to range coding.

For me an advantage of 0.123 percent isn't enough to switch to the slower range coder.

It's a bit different with low-level files, which can be compressed to about 25 to 30 percent. Here the range coder can indeed achieve 0.30 to 0.35 percent better compression. But overall it's not good enough for me.
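
To put such numbers into perspective, here is a quick back-of-the-envelope check with purely hypothetical figures (a 700 MB CD image compressing to roughly 58 percent); it only shows how little 0.123 percentage points amounts to in absolute terms:

Code: [Select]
/* Hypothetical example: absolute saving of a 0.123 percentage-point
 * better compression ratio on a 700 MB source. */
#include <stdio.h>

int main(void) {
    double original_mb = 700.0;               /* assumed source size         */
    double ratio_a     = 0.58000;             /* current bit coder (assumed) */
    double ratio_b     = 0.58000 - 0.00123;   /* range coder variant         */
    printf("compressed size: %.1f MB, extra saving: %.2f MB\n",
           original_mb * ratio_a,                  /* ~406 MB  */
           original_mb * (ratio_a - ratio_b));     /* ~0.86 MB */
    return 0;
}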

Sorry for the misinformation.

Yalac - Comparisons

Reply #234
Then I switched to a range coder and it got even worse: only 0.123 percent left!

My current (very fast) bit coder seems to compare better than expected to range coding.

For me an advantage of 0.123 percent isn't enough to switch to the slower range coder.


Is it possible to implement range coding in one preset (HIGH, for example)?

Yalac - Comparisons

Reply #235
Is it possible to implement range coding in one preset (HIGH, for example)?

Yes, it could be implemented as an option. In fact it would have to be, because the current bit coder still has to be kept: the range coder does not perform too well on very small audio data blocks.
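
Just to illustrate how such an option might look (only a sketch with an assumed threshold and flag layout, not Yalac's real format): the encoder would pick a residual coder per frame and record the choice in the frame header, so bit-coded frames stay decodable even if the range coder is later dropped from the encoder.

Code: [Select]
/* Hypothetical per-frame coder selection: very small frames keep the
 * fast bit coder, larger frames may use the range coder. */
#include <stdio.h>
#include <stddef.h>

enum ResidualCoder { CODER_BITCODER = 0, CODER_RANGECODER = 1 };

static enum ResidualCoder choose_coder(size_t frame_samples,
                                       int rangecoder_enabled) {
    const size_t MIN_SAMPLES_FOR_RANGECODER = 2048;   /* assumed threshold */
    if (!rangecoder_enabled || frame_samples < MIN_SAMPLES_FOR_RANGECODER)
        return CODER_BITCODER;
    return CODER_RANGECODER;   /* choice stored as a flag in the frame header */
}

int main(void) {
    printf("1024 samples -> %s\n",
           choose_coder(1024, 1) == CODER_BITCODER ? "bit coder" : "range coder");
    printf("5512 samples -> %s\n",
           choose_coder(5512, 1) == CODER_BITCODER ? "bit coder" : "range coder");
    return 0;
}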

I am not sure if I will add it. Unfortunately I have to do a lot of work before I can say how much slower the range coder will be on decoding. It's not too nice to have all the work done and then find that it was useless because decoding is too slow.

Other reasons against using the range coder:

- Considerably more complex source code.
- There may be other optimizations possible, which don't affect decoding speed but provide the same compression improvements.
- And if I didn't care about decoding speed at all, there would be many better ways to improve compression, for instance the symmetric algorithms used by the better-compressing encoders. If I wanted maximum compression at any price (slow...), I would write a totally different compressor.

Well, I really don't know if I will implement it.

Yalac - Comparisons

Reply #236
Well, I really don't know if I will implement it.

You know what? Don't.  You're not trying to beat Monkey's.  Nobody uses the deprecated format anyway.  Your targets are FLAC, WavPack, and maybe OptimFROG's fast modes.  And you're doing fine -- much better than people would have fathomed 4 months ago, on April first.

Yalac - Comparisons

Reply #237
http://synthetic-soul.co.uk/comparison/lossless/

http://synthetic-soul.co.uk/comparison/yalac/

I hope to get some error testing done at the beginning of next week (in the next 3-4 days).

Thanks for your fast test and sorry for my late reply... The summer heat is making me a bit lazy.

What I find interesting:

- New Fast is encoding more than 10 percent slower. That's a bit strange, because I have not changed anything except the frame duration (125 instead of 100 ms). Possibly your disk I/O is sensitive to the larger block size.
- New Normal compresses only slightly worse but unfortunately cannot achieve the expected encoding speed-up of 10 percent. Possibly it's already a bit limited by disk I/O.
- New Extra is slower and compresses a bit worse. Possibly I will go back to the old version of frame partition search level High.

  Thomas

Yalac - Comparisons

Reply #238
Quote

Well, I really don't know if I will implement it.

You know what? Don't.  You're not trying to beat Monkey's.  Nobody uses the deprecated format anyway.  Your targets are FLAC, WavPack, and maybe OptimFROG's fast modes.  And you're doing fine -- much better than people would have fathomed 4 months ago, on April first.

Thanks. 

I guess you are right. There are other possible optimizations I could evaluate. But it's good that I have tried it. Now I can be sure that it isn't too important.

  Thomas

Yalac - Comparisons

Reply #239
What I find interesting:

- New Fast is encoding more than 10 percent slower. That's a bit strange, because I have not changed anything except the frame duration (125 instead of 100 ms). Possibly your disk I/O is sensitive to the larger block size.
I should have CPU-only figures for 0.10 and 0.09.  I'll try to post them soon.

I'm trying to find some time to play with Damage, but I'm getting nowhere fast.
I'm on a horse.

Yalac - Comparisons

Reply #240
As promised, CPU-only figures for Turbo:

Code: [Select]
             0.09       0.10      Extra        Max
==================================================
Encode    x122.71    x107.71     x61.66     x39.39
Decode    x122.45    x115.73    x114.85    x114.50
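
A quick check on the Turbo encode figures above (the speed values are taken straight from the table; the percentages are just derived from them):

Code: [Select]
/* Relative slowdown of Turbo encoding between V0.09 and V0.10,
 * using the figures from the table above. */
#include <stdio.h>

int main(void) {
    double v009 = 122.71, v010 = 107.71;    /* Turbo encode figures */
    printf("speed drop : %.1f %%\n", 100.0 * (1.0 - v010 / v009));  /* ~12.2 */
    printf("extra time : %.1f %%\n", 100.0 * (v009 / v010 - 1.0));  /* ~13.9 */
    return 0;
}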


Maybe I should try a few files with both encoders in the same run, to see whether the machine state is at fault?
I'm on a horse.

Yalac - Comparisons

Reply #241
Maybe I should try a few files with both encoders in the same run, to see whether the machine state is at fault?
I think machine state accounts for a lot at these speeds.  Do you disable your antivirus scanner?

(Nice new avatar, btw -- I like it much better than the old one.  Kind of the same style as dev0's, whom I complimented on his )

Yalac - Comparisons

Reply #242
As promised, CPU-only figures for Turbo:

Code: [Select]
             0.09       0.10      Extra        Max
==================================================
Encode    x122.71    x107.71     x61.66     x39.39
Decode    x122.45    x115.73    x114.85    x114.50


Maybe I should try a few files with both encoders in the same run, to see whether the machine state is at fault?

No, not for me, thanks. Maybe it's CPU-dependent, for instance: the smaller frames of V0.09 just fitted into the CPU cache and the slightly bigger ones of V0.10 don't always.
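
To give a feel for the sizes involved (assuming 44.1 kHz stereo 16-bit material; the numbers are illustrative, not taken from Yalac's internals): a 100 ms frame is 4410 samples per channel (about 17 KB of PCM), a 125 ms frame is about 5512 samples (about 22 KB), which could make the difference once the encoder's own tables are added on top.

Code: [Select]
/* Frame sizes for 100 ms vs. 125 ms at 44.1 kHz stereo, 16 bit. */
#include <stdio.h>

int main(void) {
    const double sample_rate = 44100.0;
    const int channels = 2, bytes_per_sample = 2;
    const double durations_ms[] = { 100.0, 125.0 };
    for (int i = 0; i < 2; i++) {
        double samples = sample_rate * durations_ms[i] / 1000.0;
        double bytes   = samples * channels * bytes_per_sample;
        printf("%.0f ms frame: %.0f samples/channel, %.1f KB PCM\n",
               durations_ms[i], samples, bytes / 1024.0);
    }
    return 0;
}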

The speed loss isn't too bad, is it? You know, I had to increase the frame size a bit to compensate for the new stream info and error check data.

BTW: What do you think about new NORMAL? (A bit more speed but 0.05 to 0.1 percent less compression.)

And for something different: I find JohanDeBock's comparison very interesting. Especially the compression performance of the new evaluation levels for NORMAL. They can come very close to High without losing decoding speed. I have to look into his comparison for such evaluations, because your file set does not benefit from higher predictor orders and therefore the advantage of HIGH over NORMAL is always quite small.
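
For anyone unfamiliar with the term, here is a rough sketch of what "predictor order" means (FLAC-style fixed polynomial predictors are used for illustration; Yalac's actual predictor is not shown here): an order-N predictor estimates each sample from the previous N samples, and only the residual is handed to the bit coder. Material whose residuals already shrink at low orders gains little from going higher.

Code: [Select]
/* Fixed linear predictors of order 1..3 and the residuals they leave
 * on a made-up, smoothly rising signal. */
#include <stdio.h>

static int predict(const int *x, int i, int order) {
    switch (order) {
        case 1:  return x[i - 1];
        case 2:  return 2 * x[i - 1] - x[i - 2];
        case 3:  return 3 * x[i - 1] - 3 * x[i - 2] + x[i - 3];
        default: return 0;
    }
}

int main(void) {
    const int x[] = { 10, 13, 17, 22, 28, 35 };   /* made-up samples */
    for (int order = 1; order <= 3; order++) {
        printf("order %d residuals:", order);
        for (int i = 3; i < 6; i++)
            printf(" %d", x[i] - predict(x, i, order));
        printf("\n");   /* order 1: 5 6 7, order 2: 1 1 1, order 3: 0 0 0 */
    }
    return 0;
}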

Yalac - Comparisons

Reply #243
I think machine state accounts for a lot at these speeds.  Do you disable your antivirus scanner?

(Nice new avatar, btw -- I like it much better than the old one.  Kind of the same style as dev0's, whom I complimented on his )
I don't disable antivirus, as I wouldn't if I was ripping/encoding/decoding/etc. in real life.  I just ensure that nothing else is active, like Outlook Express, Google Talk, etc.

(I'm glad you approve. )

The speed loss isn't too bad, is it?
No, it seems quite negligible to me.

BTW: What do you think about new NORMAL? (A bit more speed but 0.05 to 0.1 percent less compression.)
Very, very slight drop in compression but faster encoding and decoding.  I'm always happy to see Normal drift in that direction.

And for something different: I find JohanDeBock's comparison very interesting.
I can't view the OpenOffice document, and the PDF I can see doesn't appear to have Yalac (should we be using TAK now?) in it.
I'm on a horse.

Yalac - Comparisons

Reply #244
I can't view the OpenOffice document, and the PDF I can see doesn't appear to have Yalac (should we be using TAK now?) in it.


http://uclc.info/LossLess.pdf
http://uclc.info/LossLess.ods

The OpenOffice spreadsheet is interactive and contains macros for sorting. This is the first time I have actually used OpenOffice; it seems to be very good, and free.

Yalac - Comparisons

Reply #245
http://uclc.info/LossLess.pdf
http://uclc.info/LossLess.ods
Thank you.

The OpenOffice spreadsheet is interactive and contains macros for sorting. This is the first time I have actually used OpenOffice; it seems to be very good, and free.
It does look very nice, but I already have MS Office, and don't want to bloat my machine even further.

And for something different: I find JohanDeBock's comparison very interesting. Especially the compression performance of the new evaluation levels for NORMAL. They can come very close to High without losing decoding speed. I have to look into his comparison for such evaluations, because your file set does not benefit from higher predictor orders and therefore the advantage of HIGH over NORMAL is always quite small.
Ah yes, I see what you mean now.  Normal Max (59.49%) is very, very close to High (59.46%) with no difference in decompression speed compared to Normal.

It's good to see other users' results and compare.  Very good work, Johan.
I'm on a horse.


Yalac - Comparisons

Reply #247
Can't OpenOffice open Excel files just fine?


Yalac - Comparisons

Reply #249
Off-topic, anyone?  Synthetic, maybe you should split the OpenOffice discussion elsewhere.