Horrible performance of lossless codecs

http://sjeng.org/ftp/vorbis/Garf_Bl33p!.flac

Any idea why the performance of most lossless codecs is so horrible on this very simple signal?
I would expect prediction to be almost perfect for it.

But:

FLAC bitrate: 580 kbps
APE bitrate: 750 kbps
WavPack: also >500 kbps

ZIP bitrate: 21 kbps
7z/RAR bitrate: 2.5 kbps

Horrible performance of lossless codecs

Reply #1
I just tried it with optimfrog using the 'normal' compression setting, and the output size was 51.9 KB.

Horrible performance of lossless codecs

Reply #2
Quote
http://sjeng.org/ftp/vorbis/Garf_Bl33p!.flac

Any idea why the performance of most lossless codecs is so horrible on this very simple signal?
I would expect prediction to be almost perfect for it.

But:

FLAC bitrate: 580 kbps
APE bitrate: 750 kbps
WavPack: also >500 kbps

ZIP bitrate: 21 kbps
7z/RAR bitrate: 2.5 kbps

bzip2 bitrate: 131 bps

I think the problem is with the use of Rice coding.  My understanding is that it's tuned for Laplacian-like (geometrically distributed) residuals, and it's decidedly sub-optimal where you have mostly zeros with an occasional + or - 65535.
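
To put rough numbers on that, here's a minimal sketch of plain Rice coding (just the unary quotient plus k remainder bits, ignoring FLAC's partitioned parameter search and escape codes; Python, and the 200-zeros-plus-two-spikes residual block is only an illustration):

Code:
def rice_bits(v, k):
    """Bits to Rice-code one signed residual with parameter k:
    zigzag-map to non-negative, then unary quotient (q+1 bits) plus k remainder bits."""
    u = 2 * v if v >= 0 else -2 * v - 1
    return (u >> k) + 1 + k

# Mostly-zero residuals with one full-swing transition, as this signal produces:
residuals = [0] * 200 + [65535, -65535]

for k in (0, 4, 10, 16):
    total = sum(rice_bits(r, k) for r in residuals)
    print(f"k={k:2d}: {total:7d} bits for {len(residuals)} residuals")

A small k makes the rare spikes astronomically expensive, while a large k wastes k+1 bits on every zero; either way you stay far above the couple of bits per cycle the signal really needs.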

Good test sample!

--John

Horrible performance of lossless codecs

Reply #3
Those last few percent of compression that the best lossless compressors get come at the cost of an enormous amount of complexity in the prediction algorithms. They look at dozens of previous samples and have dozens of adapting coefficients, so a single transition will generate a whole train of non-zero residual values to encode. A version that simply used the previous sample as the prediction would encode this much better, but would virtually never work better on any sample of real music (in fact, WavPack's "fast" mode compresses this sample to about half the size of the "high" mode for this reason).
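
To see that effect in miniature, here's a sketch using FLAC-style fixed polynomial predictors (repeated first differencing) as a simple stand-in for the adaptive LPC described above; each extra order of prediction turns a single step into one more non-zero residual, with growing magnitude:

Code:
# Toy segment of the test signal: a flat run of +32767 followed by -32768.
x = [32767] * 10 + [-32768] * 10

def fixed_residuals(samples, order):
    """Residuals of an order-N fixed polynomial predictor, i.e. N rounds of
    first differencing (order 1 = 'predict the previous sample')."""
    e = list(samples)
    for _ in range(order):
        e = [e[i] - e[i - 1] for i in range(1, len(e))]
    return e

for order in (1, 2, 3):
    e = fixed_residuals(x, order)
    nonzero = [v for v in e if v]
    print(f"order {order}: {len(nonzero)} nonzero residuals, max |e| = {max(map(abs, e))}")

An adaptive predictor with dozens of coefficients behaves the same way, only with a much longer and messier tail after each edge.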

Also, no lossless audio compressor is going to take advantage of exactly repeating sequences of numbers the way a general data compressor does, because these never occur in audio data and require completely different coding algorithms (e.g. dictionary-based rather than Rice coding).

An "ideal" compressor could be made to try several different simple and complex algorithms to detect cases like this, but most people would not be willing to put up with the encoding time penalty unless it improved performance on any "real" samples.

BTW, your sample scared my cat! 

Horrible performance of lossless codecs

Reply #4
Quote
http://sjeng.org/ftp/vorbis/Garf_Bl33p!.flac

Any idea why the performance of most lossless codecs is so horrible on this very simple signal?
I would expect prediction to be almost perfect for it.

But:

FLAC bitrate: 580 kbps
APE bitrate: 750 kbps
WavPack: also >500 kbps

ZIP bitrate: 21 kbps
7z/RAR bitrate: 2.5 kbps

I think Bryant's right.  Note that for regular signals like this, tuning the blocksize in FLAC gets you a lot; e.g. "flac -8 --lax --blocksize=384" is about 169 kbps.  But BWT compressors like bzip2 will really kick on this signal; they'll probably use just a few bits to encode a single cycle, and a little for the dictionary.
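
If you want to reproduce that at home, here's a sketch using Python's stdlib zlib and bz2 as stand-ins for ZIP and bzip2 (the waveform is an assumption, taken from the byte pattern described later in the thread: 102 samples of +32767 followed by 100 samples of -32768, repeated):

Code:
import bz2
import struct
import zlib

# Assumed shape of the test signal: 102 samples of +32767 then 100 of -32768,
# packed as little-endian 16-bit PCM (no WAV header), repeated for ~2 MB.
cycle = struct.pack("<202h", *([32767] * 102 + [-32768] * 100))
pcm = cycle * 5000

print("raw :", len(pcm), "bytes")
print("zlib:", len(zlib.compress(pcm, 9)), "bytes")   # LZ77 + Huffman, ZIP-style
print("bz2 :", len(bz2.compress(pcm, 9)), "bytes")    # block-sorting, like bzip2

The block-sorting pass turns the identical cycles into enormous runs, which is exactly the case this post describes.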

Josh

Horrible performance of lossless codecs

Reply #5
Garf, I haven't looked at the WAV yet, but that FLAC file is very, very strange! It looks like a killer sample for RAR!

Original (FLAC) = 4'394'931 bytes
RAR (Best) => 80'162 bytes
RAR (Good) => 81'523 bytes
RAR (Normal) => 27'680 bytes 
RAR (Fast) => 55'240 bytes
RAR (Fastest) => 105'302 bytes

Just looking at the FLAC output with a hex editor shows that it's extremely redundant.

Horrible performance of lossless codecs

Reply #6
I win! B)

My packer project brings the .WAV down to 76 bytes.

Compression ratio: 1:139264 

Horrible performance of lossless codecs

Reply #7
Quote
Those last few percent of compression that the best lossless compressors get come at the cost of an enormous amount of complexity in the prediction algorithms. They look at dozens of previous samples and have dozens of adapting coefficients, so a single transition will generate a whole train of non-zero residual values to encode. A version that simply used the previous sample as the prediction would encode this much better, but would virtually never work better on any sample of real music (in fact, WavPack's "fast" mode compresses this sample to about half the size of the "high" mode for this reason).

Also, no lossless audio compressor is going to take advantage of exactly repeating sequences of numbers the way a general data compressor does, because these never occur in audio data and require completely different coding algorithms (e.g. dictionary-based rather than Rice coding).

An "ideal" compressor could be made to try several different simple and complex algorithms to detect cases like this, but most people would not be willing to put up with the encoding time penalty unless it also improved performance on "real" samples.

BTW, your sample scared my cat! 

Quote
Just looking at the FLAC output with a hex editor shows that it's extremely redundant.


Couldn't this problem be avoided by adding a function to lossless codecs that checks whether the output is redundant, as numlock noted, and switches the compression mode used?

Just my 0,0000002 euros


Edit: well, the problem of the time penalty would still be there...
Vital papers will demonstrate their vitality by spontaneously moving from where you left them to where you can't find them.

Horrible performance of lossless codecs

Reply #8
Quote
Garf, I haven't looked at the WAV yet, but that FLAC file is very, very strange! It looks like a killer sample for RAR!

Original (FLAC) = 4'394'931 bytes
RAR (Best) => 80'162 bytes
RAR (Good) => 81'523 bytes
RAR (Normal) => 27'680 bytes  
RAR (Fast) => 55'240 bytes
RAR (Fastest) => 105'302 bytes

Just looking at the FLAC output with a hex editor shows that it's extremely redundant.

If we're into RAR killer samples, I remember seeing some files (it was some Unreal Tournament mod IIRC) where RAR miserably lost to ZIP, even to its own ZIP compressor. I can dig them up if anyone's interested.
Microsoft Windows: We can't script here, this is bat country.

Horrible performance of lossless codecs

Reply #9
Quote
If we're into RAR killer samples, I remember seeing some files (it was some Unreal Tournament mod IIRC) where RAR miserably lost to ZIP, even to its own ZIP compressor. I can dig them up if anyone's interested.

Your file might make RAR's heuristics fail, and thus pick the wrong algorithm.

If you still have them, I'd be interested, yes.

Horrible performance of lossless codecs

Reply #10
Sorry, Garf, but why did you upload FLAC instead of ZIP then?

-Eugene
The  greatest  programming  project of all took six days;  on the seventh  day  the  programmer  rested.  We've been trying to debug the !@#$%&* thing ever since. Moral: design before you implement.

Horrible performance of lossless codecs

Reply #11
Quote
Quote
Just looking at the FLAC output with a hex editor shows that it's extremely redundant.


Couldn't this problem be avoided by adding a function to lossless codecs that checks whether the output is redundant, as numlock noted, and switches the compression mode used?

Had the same idea. Some ultrasuperduperhigh compression mode would run two compressors at once, the regular audio compressor and e.g. a ZIP compressor, and then, block by block, store whichever output is smaller.
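
A sketch of what that could look like per block (entirely hypothetical; this is not how any of these codecs actually work, and audio_encode is just a placeholder for the codec's normal path):

Code:
import zlib

def encode_block(block: bytes, audio_encode) -> bytes:
    """Hypothetical 'ultra' mode: try the codec's normal audio path and a
    general-purpose compressor, and store whichever result is smaller.
    One tag byte tells the decoder which path was taken."""
    audio = audio_encode(block)
    generic = zlib.compress(block, 9)
    if len(generic) < len(audio):
        return b"\x01" + generic    # tag 1: general-purpose path
    return b"\x00" + audio          # tag 0: normal audio path

# Dummy audio path that just stores the block verbatim, for illustration:
block = bytes(4096)
print(len(encode_block(block, lambda b: b)))   # the zlib path wins easily here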

Horrible performance of lossless codecs

Reply #12
Quote
Sorry, Garf, but why did you upload FLAC instead of ZIP then?


Horrible performance of lossless codecs

Reply #13
At least that sample made me understand the true performance of my dial-up modem (56k).

Horrible performance of lossless codecs

Reply #14
Your modem uses compression; you could say it zips on the fly.

I didn't realize it zipped so well until after I uploaded it.

Horrible performance of lossless codecs

Reply #15
Zero-knowledge compressors have a big advantage on this "audio" sample: they don't believe it's audio.

After seeing a short header, followed by a few alternations of:
FF7F (i.e. 32767 in hex, little-endian byte order) repeated 102 times
and:
0080 (i.e. -32768 in hex, little-endian byte order) repeated 100 times

... the packer will assume this continues for a long time.

For example, a well-tuned arithmetic coder can encode a whole alternation (2*102 + 2*100 bytes) in a matter of bits.

Using arithmetic coding, if you remove the WAV header, one could encode this whole file in less than 10 bytes, and still be able to handle any input file (or unexpected data at the end of this file) correctly.

Even if you know the repeating sequence has a very high probability of appearing, you still must reserve that remaining 0.000001% probability for all the other cases. That's also why the file would take about 10 bytes, and not zero bits.

Huffman encoding, on the other hand, can only assign whole bits, i.e. it can assign "0" to the most common sequence and "1xxxxxxxxxxxxxxx..." to all possible others.
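
To make that concrete, here's the arithmetic-coding arithmetic (the 0.999999 probability and the cycle count are illustrative assumptions, not measurements of the actual file):

Code:
import math

p = 0.999999               # assumed model probability of "one more repetition"
per_rep = -math.log2(p)    # arithmetic coding cost of the likely symbol
cycles = 26_000            # rough, assumed number of cycles in the file

print(f"arithmetic coding: {per_rep:.2e} bits per repetition, "
      f"~{cycles * per_rep:.3f} bits for all {cycles} cycles")
print("Huffman coding   : at least 1 bit per symbol, however likely it is")

So nearly all of those ~10 bytes go to the model set-up and the reserved escape probability, not to the repetitions themselves.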

Horrible performance of lossless codecs

Reply #16
Yes, and bzip is the winner here. It compressed the file to 983 bytes. WinRar (with PPM forced) compressed to 4755 bytes.

Such results are pretty obvious and very similar to those of Short_Block_Test_2, which is very sparse as well: http://eltoder.nm.ru/temp/Short_Block_Test_2.res

-Eugene
The  greatest  programming  project of all took six days;  on the seventh  day  the  programmer  rested.  We've been trying to debug the !@#$%&* thing ever since. Moral: design before you implement.

Horrible performance of lossless codecs

Reply #17
RAR 3.x can shrink the WAV further, to ~4756 bytes, when forcing Markov ("text") compression at the maximum order (99).

Edit:
LOL eltoder, what did you do to RAR to gain that extra byte?
About bzip2: cool, but that's still far from my 76 bytes.

Horrible performance of lossless codecs

Reply #18
Quote
RAR 3.x can shrink the WAV further, to ~4756 bytes, when forcing Markov ("text") compression at the maximum order (99).

Edit:
LOL eltoder, what did you do to RAR to gain that extra byte?
About bzip2: cool, but that's still far from my 76 bytes.

The maximum order is in fact 63, not 99, and the best results are obtained at order 60.
And could you share your great program with us?

-Eugene
The  greatest  programming  project of all took six days;  on the seventh  day  the  programmer  rested.  We've been trying to debug the !@#$%&* thing ever since. Moral: design before you implement.

Horrible performance of lossless codecs

Reply #19
Quote
The maximum order is in fact 63, not 99, and the best results are obtained at order 60.
And could you share your great program with us?

-Eugene

Oh. I should have tried all possible values then 

Well, why not... (thanks for the compliment, btw) but I'm a bit ashamed: it's awfully slow, and optimized for data, not audio (except for Garf ® © audio, of course). Also, the names of the input and output files are hardcoded in the source...

Edit: Btw, I absolutely love your signature!

Horrible performance of lossless codecs

Reply #20
Quote
it's awfully slow, and optimized for data, not audio (except for Garf ® © audio, of course). Also, the names of the input and output files are hardcoded in the source...

I absolutely need this program! 

And I like my sig too, but now I think it's a bit long.

-Eugene
The  greatest  programming  project of all took six days;  on the seventh  day  the  programmer  rested.  We've been trying to debug the !@#$%&* thing ever since. Moral: design before you implement.

Horrible performance of lossless codecs

Reply #21
OK, I'll send it to you tonight.

Be advised, though: to compress such files well I had to hack the probability modelling curve, so you (or we) would have to find the parameters again.

Also, it's a bit-based program (it compresses bits, not bytes). However, it uses heuristics and still tries to exploit byte, word and dword alignments where possible.

The probability estimation part is a mess: it uses variable-length PPM, hashtables, dynamic decay curves, etc. It looks more like a nuclear physics simulator than a packer. I think a complete rewrite should be done ASAP.

IIRC, I have an improved arithmetic coding backend and some crazy ideas lying around that I could use in the new project (when time allows).

Edit: I'll have a bit more time when my laptop's back from Shinjuku-Ku with a new HDD.

Horrible performance of lossless codecs

Reply #22
SBC Archiver 0950 beta:

sbc.exe c -m3 -b63 newarchive Garf_Bl33p!.wav

compresses to 217 bytes...

Horrible performance of lossless codecs

Reply #23
Very impressive. It blasts away RKIVE, 777... even though the latter uses arithmetic coding.

Edit: SBC is one of the first archivers to combine block sorting and arithmetic coding.

Horrible performance of lossless codecs

Reply #24
All of them do. Imagine PPM or BWT without arithmetic coding?

-Eugene
The  greatest  programming  project of all took six days;  on the seventh  day  the  programmer  rested.  We've been trying to debug the !@#$%&* thing ever since. Moral: design before you implement.