Topic: How efficient is the Huffman encoding?

How efficient is the Huffman encoding?

Reply #25
Just a thought: would it be possible to compress MP3 files even further by first decompressing the Huffman-encoded data and then recompressing it with a more efficient algorithm? It's a common view that Huffman isn't as efficient as deflate (for example) is.

So say the Huffman part of the MP3 standard were replaced by deflate's or bzip2's compression scheme... wouldn't that result in significantly smaller "MP3s"? (I know the term is wrong, since MP3 is a standard and doesn't allow entropy coders other than Huffman encoding.)

EDIT: By the way, I suspect I'm wrong, but the above is probably what you could call a naive chain of thought about compression efficiency. I think Huffman is meant for compressing a set of values that can be classified by their frequency of occurrence alone, NOT for compressing an ordinary octet stream containing repeated runs of equal or similar values.
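
The intuition is easy to sanity-check without decoding an actual MP3 bitstream. Below is a minimal Python sketch using stand-in data (random bytes approximating payload that has already been entropy coded, plus a deliberately repetitive buffer); nothing in it is MP3-specific:

Code
import bz2
import os
import zlib

# Stand-ins for two extremes: random bytes approximate data that has
# already been through quantization + entropy coding, while the
# repetitive buffer still has obvious redundancy left in it.
noisy = os.urandom(64 * 1024)
redundant = b"example audio frame " * 3200

for label, data in (("noisy", noisy), ("redundant", redundant)):
    d = len(zlib.compress(data, 9))   # deflate
    b = len(bz2.compress(data, 9))    # bzip2 (BWT + Huffman)
    print(f"{label:9s} raw={len(data):6d}  deflate={d:6d}  bzip2={b:6d}")

# Typical outcome: both coders shrink the redundant buffer dramatically
# but slightly *expand* the noisy one -- they only win where redundancy
# remains, and a well-matched Huffman stage leaves very little behind.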

How efficient is the Huffman encoding?

Reply #26
Deflate and bzip2 use Huffman coding for the entropy-coding stage too; they just use different context models in front of it.

Menno
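
The difference the context model makes is easy to measure: a pure order-0 Huffman coder can never beat the byte-frequency entropy of its input, while deflate's LZ77 front end routinely lands far below that floor on repetitive data. A small Python sketch (the input is illustrative, nothing MP3-specific):

Code
import math
import zlib
from collections import Counter

def order0_entropy_bits(data: bytes) -> float:
    """Total Shannon entropy of the byte distribution -- the floor for
    any symbol-by-symbol coder such as a static Huffman code."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values()) * n

# Repetitive data: an order-0 coder sees only byte frequencies, while
# deflate also sees the repeated phrases before handing the resulting
# literals and match lengths to its Huffman stage.
data = b"the quick brown fox " * 500

huffman_floor = order0_entropy_bits(data) / 8   # bytes, lower bound
deflated = len(zlib.compress(data, 9))
print(f"raw: {len(data)} B, order-0 floor: {huffman_floor:.0f} B, deflate: {deflated} B")
# deflate lands far below the order-0 floor: the gain comes from the
# model in front of the Huffman stage, not from a better entropy coder.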

How efficient is the Huffman encoding?

Reply #27
Isn't Huffman coding the most efficient lossless coding method ever devised, provided that the signal statistics of the data match the particular Huffman codes?
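
For reference, here is a minimal sketch of how a Huffman code is matched to symbol statistics; the huffman_code helper is illustrative, not from any library. Among codes that assign each symbol a whole number of bits, the result is optimal for the given frequencies:

Code
import heapq
from collections import Counter

def huffman_code(freqs):
    # Each heap entry: [total count, unique tiebreaker, {symbol: code}].
    heap = [[count, i, {sym: ""}] for i, (sym, count) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)   # the two least frequent subtrees...
        hi = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], tiebreak, merged])  # ...get merged
        tiebreak += 1
    return heap[0][2]

freqs = Counter("abracadabra")    # the statistics the code is matched to
code = huffman_code(freqs)
for sym in sorted(code, key=lambda s: freqs[s], reverse=True):
    print(sym, freqs[sym], code[sym])   # frequent symbols get short codes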

How efficient is the Huffman encoding?

Reply #28
IIRC, Huffman is only optimal among codes that represent every input value with an integer number of bits. Arithmetic coding doesn't have this limitation and can do better for that reason.
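
The integer-bit penalty is easy to quantify for a skewed source; the probabilities below are chosen purely for illustration:

Code
import math

# A very skewed binary source: P(a) = 0.95, P(b) = 0.05.
p = {"a": 0.95, "b": 0.05}

# Shannon entropy: what an ideal coder needs per symbol; arithmetic
# coding approaches this because it can spend fractional bits.
entropy = -sum(q * math.log2(q) for q in p.values())

# A Huffman code for two symbols must spend a whole bit on each,
# no matter how lopsided the probabilities are.
huffman_avg = 1.0

print(f"entropy: {entropy:.3f} bits/symbol")   # ~0.286
print(f"huffman: {huffman_avg:.3f} bits/symbol")
# Arithmetic coding spends about -log2(0.95) = 0.074 bits on each 'a',
# landing within a rounding error of the entropy. (MP3 softens this
# penalty somewhat by Huffman-coding small groups of values together.)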

edit:
Somewhere here there's a good explanation by NumLock.

How efficient is the Huffman encoding?

Reply #29
Are there any sites where you can read about how MP3 compression works (with every step explained) and how psychoacoustic models work? A site on digital music in general would also be interesting.

How efficient is the Huffman encoding?

Reply #30
Quote
Just a thought: would it be possible to compress MP3 files even further by first decompressing the Huffman-encoded data and then recompressing it with a more efficient algorithm? It's a common view that Huffman isn't as efficient as deflate (for example) is.


There are two ways of doing that, and both were rejected in MPEG:

1. Use some other entropy coding algorithm, say arithmetic coding

While it might work better than Huffman under certain circumstances (and sometimes not), it has serious drawbacks when frame-by-frame audio coding is the target. Drop-in decoding (random seeking) and error protection are among the things that are hard to do with arithmetic coding in MP3/AAC/...

2. Use adaptive Huffman codebooks, calculated from the signal statistics of the source material

This might improve performance by ~1% over the current fixed codebooks, but unfortunately it puts a heavy burden on decoder complexity: additional memory would be required to store the custom codebooks, and it would break some optimization possibilities. In addition, the codebooks must be sent every time, which complicates "drop-in" streaming and encoder design as well (a back-of-envelope sketch of that overhead follows below).


It has been proven that a fixed set of Huffman codebooks is the best solution for MP3/AAC, taking into account all mainstream applications where these formats are used.
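
The streaming objection in point 2 can be made concrete with a back-of-envelope calculation. All figures below other than the frame duration are illustrative assumptions, not values from the MP3 specification:

Code
# Does shipping a custom Huffman table with every frame pay off?
BITRATE = 128_000           # bits per second (assumed stream bitrate)
FRAME_SEC = 1152 / 44100    # MP3 frame duration at 44.1 kHz (1152 samples)

frame_bits = BITRATE * FRAME_SEC      # ~3344 bits per frame
saving_bits = 0.01 * frame_bits       # the ~1% gain quoted above

# Assumed cost of describing one custom table: code lengths for a
# 256-entry alphabet at 4 bits each (canonical-Huffman style).
table_bits = 256 * 4

print(f"frame:  {frame_bits:.0f} bits")
print(f"saved:  {saving_bits:.0f} bits per frame")
print(f"table:  {table_bits} bits per frame")
# The table costs roughly 30x more than the bits the better code would
# save, before even counting decoder memory and the lost drop-in property.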