# HydrogenAudio

## Hydrogenaudio Forum => Scientific Discussion => Topic started by: Max A.K. on 2003-11-25 08:54:03

Title: Quantization
Post by: Max A.K. on 2003-11-25 08:54:03
How do I calculate the number of bits required to ensure that the quantization error for a tone stays within the minimum masking threshold, given the power of the tone and the masking threshold?
Title: Quantization
Post by: NumLOCK on 2003-11-25 09:18:20
Hi,

Seems simple.

Title: Quantization
Post by: Max A.K. on 2003-11-25 10:12:21
Quote
Hi,

Seems simple.

Title: Quantization
Post by: NumLOCK on 2003-11-25 10:29:26
Quote
Quote
Hi,

Seems simple.

Yes.  This might depend on the codec, but in general if you don't take the minimum value, the quantization noise may become perceptible.
Title: Quantization
Post by: Max A.K. on 2003-11-26 09:16:52
Quote
Quote
Quote
Hi,

Seems simple.

Yes.  This might depend on the codec, but in general if you don't take the minimum value, the quantization noise may become perceptible.

Maybe you can give me some references where I can read more about it?
Title: Quantization
Post by: NumLOCK on 2003-11-26 09:35:38
Quote
Maybe you can give me some references where I can read more about it?

My apologies, I can't, because I won't be home for some time.

Edit: Since your question is fairly fundamental, you can probably get away with any good book on psychoacoustics (though I don't know which ones off the top of my head).
Title: Quantization
Post by: Ivan Dimkovic on 2003-11-26 10:59:51
The measure we are talking about here is called "perceptual entropy", and the basic paper describing it is:

J. D. Johnston, "Estimation of perceptual entropy using noise-masking criteria," in Proc. ICASSP, 1988, pp. 2524--2527.

Basically, for a set of frequency lines, the number of bits required to code the bitstream can't be lower than:

PE = SUM (0 <= i < numlines) log2(Energy_i / MaskThr_i)

PE is a theoretical minimum - it can't be achieved in a real codec (you need some side bits to represent the coding information, etc...)

So, for a real coding system, the number of bits required to code a set of signals is most likely to be around:

BITS = p * PE + q

where p and q are codec-dependent parameters.  For example, for MPEG-4 AAC:

BITS = 0.6 * PE + 24 * sqrt(PE)

This is not an exact measure, of course, since Huffman codebooks are not perfect, signal statistics vary from frame to frame, etc.
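To make the estimate concrete, here is a small sketch of the PE sum and the AAC-style bit estimate from the formulas above. The spectral energies and masking thresholds are made-up example values, not from any real frame; lines already below their masking threshold contribute nothing, since they need no bits at all.

```python
import math

def perceptual_entropy(energies, mask_thresholds):
    """Perceptual entropy (in bits) for one frame, per Johnston (1988):
    PE = SUM over lines of log2(energy_i / mask_threshold_i),
    counting only lines whose energy exceeds the masking threshold."""
    pe = 0.0
    for energy, thr in zip(energies, mask_thresholds):
        if energy > thr:
            pe += math.log2(energy / thr)
    return pe

def aac_bit_estimate(pe):
    """Rough bit demand quoted above for MPEG-4 AAC:
    BITS = 0.6 * PE + 24 * sqrt(PE)."""
    return 0.6 * pe + 24.0 * math.sqrt(pe)

# Hypothetical frame: three spectral lines (values are invented for illustration).
# The third line is below its masking threshold, so it contributes 0 bits.
energies = [1.0e6, 4.0e4, 2.0e2]
thresholds = [1.0e3, 1.0e3, 1.0e3]

pe = perceptual_entropy(energies, thresholds)   # ~15.3 bits
bits = aac_bit_estimate(pe)                     # ~103 bits
```

The max(0, ...) behaviour via the `if energy > thr` check matters: a masked line has a negative log ratio, which would wrongly lower the total if included.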
Title: Quantization
Post by: Max A.K. on 2003-11-28 08:51:17
Quote
The measure we are talking about here is called "perceptual entropy", and the basic paper describing it is:

J. D. Johnston, "Estimation of perceptual entropy using noise-masking criteria," in Proc. ICASSP, 1988, pp. 2524--2527.

Basically, for a set of frequency lines, the number of bits required to code the bitstream can't be lower than:

PE = SUM (0 <= i < numlines) log2(Energy_i / MaskThr_i)

PE is a theoretical minimum - it can't be achieved in a real codec (you need some side bits to represent the coding information, etc...)

So, for a real coding system, the number of bits required to code a set of signals is most likely to be around:

BITS = p * PE + q

where p and q are codec-dependent parameters.  For example, for MPEG-4 AAC:

BITS = 0.6 * PE + 24 * sqrt(PE)

This is not an exact measure, of course, since Huffman codebooks are not perfect, signal statistics vary from frame to frame, etc.

So if I have one tone (TF) and I know the minimum masking threshold (MinT), do I need

log2(TF / MinT) bits to quantize this tone?
Title: Quantization
Post by: Ivan Dimkovic on 2003-11-28 16:31:01
Quote
log2(TF / MinT) bits to quantize this tone?

Yep.. actually, this should be the "theoretical minimum" - just like with ordinary entropy.

You also have to add the side data necessary to represent the compressed spectrum, such as Huffman table indexes or the tables themselves - and each Huffman codebook performs differently on different signal statistics, etc.

So, for a particular coding system (like MP3) the formula is a little bit more complicated.
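As a quick numeric check of the single-tone case discussed above: if the tone's power sits, say, 30 dB above its minimum masking threshold (a hypothetical figure chosen only for illustration), the theoretical minimum comes out just under 10 bits.

```python
import math

# Single tone: theoretical minimum bits = log2(tone_power / masking_threshold).
# The 30 dB signal-to-mask ratio here is an invented example value.
smr_db = 30.0                       # signal-to-mask ratio in dB (power ratio)
power_ratio = 10 ** (smr_db / 10)   # 30 dB -> a 1000x power ratio
min_bits = math.log2(power_ratio)   # log2(1000), just under 10 bits
```

A real codec would then round this up and add the side information (scalefactors, Huffman table selection, and so on) mentioned above.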