HydrogenAudio

Lossless Audio Compression => Lossless / Other Codecs => Topic started by: Garf on 2003-06-17 18:14:31

Title: Horrible performance of lossless codecs
Post by: Garf on 2003-06-17 18:14:31
http://sjeng.org/ftp/vorbis/Garf_Bl33p!.flac (http://sjeng.org/ftp/vorbis/Garf_Bl33p!.flac)

Any idea why the performance of most lossless codecs is so horrible on this very simple signal?
I would expect prediction to be almost perfect for it.

But:

FLAC bitrate: 580 kbps
APE bitrate: 750 kbps
WavPack: also >500 kbps

ZIP bitrate: 21 kbps
7z/RAR bitrate: 2.5 kbps
Title: Horrible performance of lossless codecs
Post by: k.m.krebs on 2003-06-17 18:39:56
I just tried it with optimfrog using the 'normal' compression setting, and the output size was 51.9 KB.
Title: Horrible performance of lossless codecs
Post by: rompel on 2003-06-17 21:12:33
Quote
http://sjeng.org/ftp/vorbis/Garf_Bl33p!.flac (http://sjeng.org/ftp/vorbis/Garf_Bl33p!.flac)

Any idea why the performance of most lossless codecs is so horrible on this very simple signal?
I would expect prediction to be almost perfect for it.

But:

FLAC bitrate: 580 kbps
APE bitrate: 750 kbps
WavPack: also >500 kbps

ZIP bitrate: 21 kbps
7z/RAR bitrate: 2.5 kbps

bzip2 bitrate: 131 bps

I think the problem is the use of Rice coding.  My understanding is that it was designed for geometrically distributed residuals, and it is decidedly sub-optimal where you have mostly zeros with an occasional + or - 65535.
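To put rough numbers on that, here's a sketch of textbook Rice coding with a zigzag sign mapping (not FLAC's exact variant) applied to residuals shaped like this sample's:

```python
def zigzag(r):
    # Map a signed residual to a non-negative integer: 0, -1, 1, -2, ... -> 0, 1, 2, 3, ...
    return 2 * r if r >= 0 else -2 * r - 1

def rice_bits(n, k):
    # A Rice code with parameter k spends (n >> k) + 1 bits on the
    # unary quotient (including the stop bit) plus k remainder bits.
    return (n >> k) + 1 + k

# 4096 residuals, almost all zero, with an occasional +/-65535 jump --
# roughly what prediction leaves behind on a full-scale square wave.
residuals = [0] * 4096
for i in range(0, 4096, 200):
    residuals[i] = 65535 if (i // 200) % 2 == 0 else -65535

for k in (0, 4, 8, 12, 16):
    total = sum(rice_bits(zigzag(r), k) for r in residuals)
    print(f"k={k:2d}: {total:>9d} bits")
```

No choice of k is good here: a small k makes each spike cost tens of thousands of unary bits, while a large k wastes k+1 bits on every one of the thousands of zeros.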

Good test sample!

--John
Title: Horrible performance of lossless codecs
Post by: bryant on 2003-06-17 22:59:56
Those last few percent of compression that the best lossless compressors get come at the cost of an enormous amount of complexity in the prediction algorithms. They look at dozens of previous samples and have dozens of adapting coefficients, so a single transition will generate a whole train of non-zero residual values to encode. A version that simply used the previous sample as the prediction would encode this much better, but would virtually never work better on any sample of real music (in fact, WavPack's "fast" mode compresses this sample to about half the size of the "high" mode for this reason).
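Here's a rough sketch of why the trivial predictor does so well, assuming the sample really is a bare full-scale square wave (the 102/100-sample shape is an assumption about this file):

```python
# A full-scale 16-bit square wave, the assumed shape of the test sample.
wave = ([32767] * 102 + [-32768] * 100) * 50

# "Previous sample" prediction: the residual is just the first difference.
residuals = [b - a for a, b in zip(wave, wave[1:])]

nonzero = [r for r in residuals if r != 0]
print(len(residuals), len(nonzero))  # only the transitions survive
```

A high-order adaptive predictor, by contrast, keeps ringing for dozens of samples after each step, turning every transition into that long train of non-zero residuals.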

Also, no lossless compressor is going to take advantage of exactly repeating sequences of numbers the way a general data compressor does, because these never occur in audio data and require completely different coding algorithms (i.e. dictionary-based, not Rice coding).

An "ideal" compressor could be made to try several different simple and complex algorithms to detect cases like this, but most people would not be willing to put up with the encoding-time penalty unless it also improved performance on "real" samples.

BTW, your sample scared my cat! 
Title: Horrible performance of lossless codecs
Post by: jcoalson on 2003-06-18 09:01:39
Quote
http://sjeng.org/ftp/vorbis/Garf_Bl33p!.flac (http://sjeng.org/ftp/vorbis/Garf_Bl33p!.flac)

Any idea why the performance of most lossless codecs is so horrible on this very simple signal?
I would expect prediction to be almost perfect for it.

But:

FLAC bitrate: 580 kbps
APE bitrate: 750 kbps
WavPack: also >500 kbps

ZIP bitrate: 21 kbps
7z/RAR bitrate: 2.5 kbps

I think Bryant's right.  Note that for regular signals like this, tuning the block size in FLAC gets you a lot; e.g. "flac -8 --lax --blocksize=384" is about 169 kbps.  But BWT compressors like bzip2 will really shine on this signal: they'll probably use just a few bits to encode a single cycle, and a little for the dictionary.
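You can see the effect with a synthetic stand-in (zlib and bz2 here stand in for ZIP and bzip2, and the 102/100-sample shape is an assumption about the file):

```python
import bz2
import struct
import zlib

# Synthetic stand-in for the sample: 102 samples of +32767 then 100 of
# -32768, 16-bit little-endian PCM, repeated a few thousand times.
cycle = struct.pack('<102h', *([32767] * 102)) + struct.pack('<100h', *([-32768] * 100))
pcm = cycle * 5000  # ~2 MB of raw audio data, no WAV header

deflated = zlib.compress(pcm, 9)  # ZIP's algorithm (LZ77 + Huffman)
bwt = bz2.compress(pcm, 9)        # block sorting + Huffman
print(len(pcm), len(deflated), len(bwt))
```

Both collapse the data by a factor of hundreds or more, because the 404-byte cycle repeats exactly, which is something a sample-by-sample predictor plus Rice coder never exploits.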

Josh
Title: Horrible performance of lossless codecs
Post by: NumLOCK on 2003-06-18 09:53:39
Garf, I haven't looked at the WAV yet, but that FLAC file is very, very strange! It looks like a killer sample for RAR!

Original (FLAC) = 4'394'931 bytes
RAR (Best) => 80'162 bytes
RAR (Good) => 81'523 bytes
RAR (Normal) => 27'680 bytes 
RAR (Fast) => 55'240 bytes
RAR (Fastest) => 105'302 bytes

Just looking at the FLAC output with a hex editor shows that it's extremely redundant.
Title: Horrible performance of lossless codecs
Post by: NumLOCK on 2003-06-18 10:05:34
I win!  B)

My packer project brings the .WAV down to 76 bytes.

Compression ratio: 1:139264 
Title: Horrible performance of lossless codecs
Post by: Atlantis on 2003-06-18 10:07:33
Quote
Those last few percent of compression that the best lossless compressors get come at the cost of an enormous amount of complexity in the prediction algorithms. They look at dozens of previous samples and have dozens of adapting coefficients, so a single transition will generate a whole train of non-zero residual values to encode. A version that simply used the previous sample as the prediction would encode this much better, but would virtually never work better on any sample of real music (in fact, WavPack's "fast" mode compresses this sample to about half the size of the "high" mode for this reason).

Also, no lossless compressor is going to take advantage of exactly repeating sequences of numbers the way a general data compressor does, because these never occur in audio data and require completely different coding algorithms (i.e. dictionary-based, not Rice coding).

An "ideal" compressor could be made to try several different simple and complex algorithms to detect cases like this, but most people would not be willing to put up with the encoding-time penalty unless it also improved performance on "real" samples.

BTW, your sample scared my cat! 

Quote
Just looking at the FLAC output with a hex editor shows that it's extremely redundant.


Couldn't this problem be avoided by adding a function to lossless codecs that checks whether the output file is redundant, as NumLOCK noted, and switches the compression mode used?

Just my 0,0000002 euros


Edit: well, the problem of the time penalty would still be there...
Title: Horrible performance of lossless codecs
Post by: Peter on 2003-06-18 10:08:23
Quote
Garf, I haven't looked at the WAV yet, but that FLAC file is very, very strange! It looks like a killer sample for RAR!

Original (FLAC) = 4'394'931 bytes
RAR (Best) => 80'162 bytes
RAR (Good) => 81'523 bytes
RAR (Normal) => 27'680 bytes  
RAR (Fast) => 55'240 bytes
RAR (Fastest) => 105'302 bytes

Just looking at the FLAC output with a hex editor shows that it's extremely redundant.

If we're into RAR killer samples, I remember seeing some files (it was some Unreal Tournament mod, IIRC) where RAR miserably lost to ZIP, even to its own ZIP compressor. I can dig them up if anyone's interested.
Title: Horrible performance of lossless codecs
Post by: NumLOCK on 2003-06-18 10:11:54
Quote
If we're into RAR killer samples, I remember seeing some files (it was some Unreal Tournament mod, IIRC) where RAR miserably lost to ZIP, even to its own ZIP compressor. I can dig them up if anyone's interested.

Your file might make RAR's heuristics fail, and thus pick the wrong algorithm 

If you still have it I'd be interested, yes.
Title: Horrible performance of lossless codecs
Post by: eltoder on 2003-06-18 12:01:25
Sorry, Garf, but why did you upload FLAC instead of ZIP then? 

-Eugene
Title: Horrible performance of lossless codecs
Post by: Tripwire on 2003-06-18 12:02:44
Quote
Quote
Just looking at the FLAC output with a hex editor shows that it's extremely redundant.


Couldn't this problem be avoided by adding a function to lossless codecs that checks whether the output file is redundant, as NumLOCK noted, and switches the compression mode used?

I had the same idea. Some ultrasuperduperhigh compression mode would run two compressors at once, the regular audio compressor and e.g. a ZIP compressor, and then store whichever blocks compressed best.
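A rough sketch of the idea, with zlib and bz2 standing in for the audio coder and the ZIP coder:

```python
import bz2
import zlib

def pack_block(block: bytes) -> bytes:
    """Compress a block with two back-ends and keep whichever output is
    smaller, prefixing one tag byte so the decoder knows which to undo."""
    z = zlib.compress(block, 9)
    b = bz2.compress(block, 9)
    return (b"Z" + z) if len(z) <= len(b) else (b"B" + b)

def unpack_block(packed: bytes) -> bytes:
    tag, body = packed[:1], packed[1:]
    return zlib.decompress(body) if tag == b"Z" else bz2.decompress(body)

block = bytes(range(256)) * 64
assert unpack_block(pack_block(block)) == block
```

The cost is the tag byte per block and roughly doubled encoding time, which is the time penalty mentioned above.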
Title: Horrible performance of lossless codecs
Post by: thop on 2003-06-18 12:31:34
Quote
Sorry, Garf, but why did you upload FLAC instead of ZIP then? 

Title: Horrible performance of lossless codecs
Post by: anza on 2003-06-18 12:59:37
At least that sample made me understand the true performance of my dial-up modem (56k):
Title: Horrible performance of lossless codecs
Post by: Garf on 2003-06-18 13:02:41
Your modem uses compression; you could say it zips on the fly.

I didn't realize it zipped so well until after I uploaded it.
Title: Horrible performance of lossless codecs
Post by: NumLOCK on 2003-06-18 13:19:09
Zero-knowledge compressors have a big advantage on this "audio" sample:  they don't believe it's audio.

After seeing a short header, followed by a few alternations of:
FF7F (i.e. 32767 in hex, reversed byte order) repeated 102 times
and:
0080 (i.e. -32768 in hex, reversed byte order) repeated 100 times

... the packer will assume this continues for a long time.

For example, a well-tuned arithmetic coder can encode a whole cycle (2*102 + 2*100 bytes) in a matter of bits.

Using arithmetic coding, if you remove the WAV header, you could encode this whole file in less than 10 bytes and still handle any input file (or unexpected data at the end of this file) correctly.

Even if you know the repeating sequence has a very high probability of appearing, you must still reserve a tiny probability, say 0.000001%, for all the other cases. That's also why the file would take 10 bytes, and not zero bits 

Huffman coding, on the other hand, can only assign whole numbers of bits, i.e. it can assign "0" to the most common sequence, and "1xxxxxxxxxxxxxxx..." to all the other possibilities.
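A quick sketch of the fractional-bit point (the probability and the cycle count are made-up ballpark figures, not my packer's actual model):

```python
import math

# The ideal code length for a symbol of probability p is -log2(p) bits.
# An arithmetic coder can approach this; Huffman can never spend less
# than 1 whole bit per symbol.
for p in (0.5, 0.99, 0.999999):
    print(f"p={p}: ideal {-math.log2(p):.8f} bits, Huffman >= 1 bit")

# So ~26,000 repetitions of a cycle each predicted with p = 0.999999
# cost an arithmetic coder well under one bit in total:
total = 26000 * -math.log2(0.999999)
print(f"{total:.3f} bits for all cycles")
```

That's why the whole body of the file can fit in a handful of bytes while a Huffman-based coder is stuck paying at least one bit per coded symbol.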
Title: Horrible performance of lossless codecs
Post by: eltoder on 2003-06-18 13:29:19
Yes, and bzip2 is the winner here. It compressed the file to 983 bytes. WinRAR (with PPM forced) compressed it to 4755 bytes.

Such results are pretty obvious, and very similar to those of Short_Block_Test_2, which is very sparse as well: http://eltoder.nm.ru/temp/Short_Block_Test_2.res (http://eltoder.nm.ru/temp/Short_Block_Test_2.res)

-Eugene
Title: Horrible performance of lossless codecs
Post by: NumLOCK on 2003-06-18 13:37:41
RAR 3.x can shrink the WAV further, to ~4756 bytes, when forcing Markov ("text") compression at maximum order (99).

Edit:
LOL eltoder, what did you do to RAR to gain that extra byte? 
About bzip2: cool, but that's still a far cry from my 76 bytes.
Title: Horrible performance of lossless codecs
Post by: eltoder on 2003-06-18 14:00:40
Quote
RAR 3.x can shrink the WAV further, to ~4756 bytes, when forcing Markov ("text") compression at maximum order (99).

Edit:
LOL eltoder, what did you do to RAR to gain that extra byte? 
About bzip2: cool, but that's still a far cry from my 76 bytes.

The maximum order is in fact 63, not 99. And the best results are obtained at order 60 
And could you share your great program with us?

-Eugene
Title: Horrible performance of lossless codecs
Post by: NumLOCK on 2003-06-18 14:09:24
Quote
The maximum order is in fact 63, not 99. And the best results are obtained at order 60 
And could you share your great program with us?

-Eugene

Oh. I should have tried all possible values then 

Well, why not... (thanks for the compliment, btw) but I'm a bit ashamed: it's awfully slow, and optimized for data, not audio (except for Garf ® © audio, of course). Also, the names of the input and output files are hardcoded in the source.. 

Edit: Btw, I absolutely love your signature!
Title: Horrible performance of lossless codecs
Post by: eltoder on 2003-06-18 14:19:26
Quote
it's awfully slow, and optimized for data, not audio (except for Garf ® © audio, of course). Also, the names of the input and output files are hardcoded in the source.. 

I absolutely need this program! 

And I like my sig too, but now I think it's a bit long

-Eugene
Title: Horrible performance of lossless codecs
Post by: NumLOCK on 2003-06-18 14:32:46
Ok, I'll send it to you tonite 

Be advised, though: to compress such files well I had to hack the probability-modelling curve, so you (or we) would have to find the parameters again.

Also, it's a bit-based program (it compresses bits, not bytes). However, it uses heuristics and still tries to exploit byte, word and dword alignment where possible.

The probability-estimation part is a mess: it uses variable-length PPM, hashtables, dynamic decay curves, etc. It looks more like a nuclear-physics simulator than a packer. I think a complete rewrite should be done ASAP 

IIRC, I have an improved arithmetic coding backend, and crazy ideas lying around, that I could use in the new project (when time allows).

Edit: I'll have a bit more time when my laptop's back from Shinjuku-Ku with a new hdd.
Title: Horrible performance of lossless codecs
Post by: JamesBond on 2003-06-18 15:22:40
SBC Archiver 0950 beta:

sbc.exe c -m3 -b63 newarchive Garf_Bl33p!.wav

compresses it to 217 bytes... 
Title: Horrible performance of lossless codecs
Post by: NumLOCK on 2003-06-18 15:36:21
Very impressive. It blasts away RKIVE, 777...  even though the latter uses arithmetic coding.

Edit: SBC is one of the first archivers to combine block sorting and arithmetic coding 
Title: Horrible performance of lossless codecs
Post by: eltoder on 2003-06-18 15:50:24
All of them do. Can you imagine PPM or BWT without arithmetic coding?

-Eugene
Title: Horrible performance of lossless codecs
Post by: NumLOCK on 2003-06-18 15:55:06
Quote
All of them do. Can you imagine PPM or BWT without arithmetic coding?

-Eugene

Well, bzip2 does use BWT and regular Huffman... but that's for performance and patent reasons, of course.

Btw, have you made a compressor yourself, Eugene?
Title: Horrible performance of lossless codecs
Post by: eltoder on 2003-06-18 16:29:47
Quote
Quote
All of them do. Can you imagine PPM or BWT without arithmetic coding?

-Eugene

Well, bzip2 does use BWT and regular Huffman... but that's for performance and patent reasons, of course.

Btw, have you made a compressor yourself, Eugene?

I thought bzip2 used some sort of range coding. It seems I was wrong. 
At least, most compressors do use range coding after BWT.

A compressor? I've tried several times, but never got anything useful. The LZW- and PPM-like attempts were rather funny, though 

-Eugene
Title: Horrible performance of lossless codecs
Post by: NumLOCK on 2003-06-18 16:36:52
I wrote an LZ77 or LZW program (I can't remember which) once, but its performance was catastrophic: it inflated files most of the time 
Title: Horrible performance of lossless codecs
Post by: rjamorim on 2004-01-05 18:01:39
Garf, could you please make this sample available again? I wanted to try it with WavPack 4's asymmetrical modes. (quoting David's readme: "Because the standard compression parameters are optimized for "normal" audio, this option works best with "non-standard" audio where it can often achieve enormous gains.")

also:
Quote
RAR (Best) => 80'162 bytes
RAR (Good) => 81'523 bytes
RAR (Normal) => 27'680 bytes 
RAR (Fast) => 55'240 bytes
RAR (Fastest) => 105'302 bytes


The reason is simple. RAR's audio compression routines automatically kick in at Best and Good modes. So at Best and Good it tries to compress it as audio, and at Normal, Fast and Fastest it compresses it as general data.

Quote
In "Auto" mode WinRAR will decide when to use the audio compression depending on source data and only if "Good" or "Best" compression method is selected.


Regards;

Roberto.
Title: Horrible performance of lossless codecs
Post by: bryant on 2004-01-07 06:20:31
Quote
Garf, could you please make this sample available again?

I still have that file and just ran it through a bunch of lossless audio compressors (plus the old and new WavPacks, of course):

Code: [Select]
             Original File
            -------------
10,584,044 garf.wav     (original file)

         Monkey's Audio 3.97
         -------------------
 5,642,404 garf-xh.ape  (extra high mode)
 5,494,040 garf-h.ape   (high mode)
 4,530,440 garf-n.ape   (normal mode)
 4,527,024 garf-f.ape   (fast mode)

                FLAC
                ----
 4,425,222 garf-1.flac  (mode 1)
 4,397,071 garf-5.flac  (mode 5)
 4,392,479 garf-8.flac  (mode 8)

           OptimFROG 4.507
           ---------------
    53,186 garf.ofr     (default mode)
     2,985 garf2.ofr    ("newbest" mode)

              RKAU 1.07
              ---------
   164,261 garf-1.rka   (fast mode)
   590,353 garf-3.rka   (high mode)

               LA 0.4
               ------
 3,343,564 garf-h.la    (high mode)
 2,364,142 garf.la      (default mode)

               WavPack
               -------
 2,134,878 garf-ff.wv    (3.97, very fast mode)
 1,476,068 garf-ff.wv    (4.0a2, very fast mode)
   766,124 garf-ffx4.wv  (4.0a2, very fast mode, extra processing)

 1,804,879 garf-f.wv     (3.97, fast mode)
 1,503,778 garf-f.wv     (4.0a2, fast mode)
   769,248 garf-fx4.wv   (4.0a2, fast mode, extra processing)

 4,034,834 garf-n.wv     (3.97, default mode)
 1,741,260 garf-n.wv     (4.0a2, default mode)
   770,792 garf-x4.wv    (4.0a2, default mode, extra processing)

 4,035,317 garf-h.wv     (3.97, high mode)
 2,675,828 garf-h.wv     (4.0a2, high mode)
   768,158 garf-hx3.wv   (4.0a2, high mode, extra processing)


The first interesting point is that WavPack, RKAU, LA and Monkey's Audio all have the characteristic that the "higher" modes do worse than the "faster" modes.

Second, the "extra" processing mode does significantly help WavPack's performance with this sample.

Finally, OptimFROG is pretty amazing. Good job, Florin!