
Quantization

2003-11-25 08:54:03
How do I calculate the number of bits required to ensure that the quantization error for a tone stays within the minimum masking threshold, if I know the power of the tone and the masking threshold?

Hi,

Seems simple.

Quote
Hi,

Seems simple.

Quantization

Quote
Quote
Hi,

Seems simple.

Yes.  This might depend on the codec, but in general, if you don't use the minimum number of bits, the quantization noise may become perceptible.

Quantization

Quote
Quote
Quote
Hi,

Seems simple.

Yes.  This might depend on the codec, but in general, if you don't use the minimum number of bits, the quantization noise may become perceptible.

Maybe you can give me some references where I can read more about it?

Quantization

Quote
Maybe you can give me some references where I can read more about it?

My apologies, I can't right now, because I won't be home for some time.

Edit: Since your question is fairly fundamental, any good book on psychoacoustics should cover it (though I don't know which ones off the top of my head).

Quantization

The measure we are talking about here is called "perceptual entropy",  and the basic paper where it was described is:

J. D. Johnston, "Estimation of perceptual entropy using noise-masking criteria," in Proc. ICASSP, 1988, pp. 2524--2527.

Basically, for a set of frequency lines, the number of bits required to code the bitstream can't be lower than:

PE = SUM (i = 1 .. numlines) log2(Energy_i / MaskThr_i)

PE is a theoretical minimum - it can't be achieved in a real codec (you need some side bits to represent the coding information, etc.).

So, for a real coding system, the number of bits required to code a set of signals is typically around:

BITS = p * PE + q

where p and q are codec-dependent parameters.  For example, for MPEG-4 AAC:

BITS = 0.6 * PE + 24 * sqrt (PE)

This is not an exact measure, of course - Huffman codebooks are not perfect, signal statistics vary from frame to frame, etc.
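The two formulas above can be sketched in a few lines of Python. This is only an illustration under my own assumptions: Energy and MaskThr are per-line linear power values, and lines already below the mask are taken to contribute zero bits (the post's formula doesn't state this clamping explicitly).

```python
import math

def perceptual_entropy(energies, mask_thresholds):
    """PE in bits: sum over frequency lines of log2(energy / threshold).
    Lines whose energy is already below the masking threshold are
    assumed to need no bits (my assumption, not stated in the post)."""
    pe = 0.0
    for energy, thr in zip(energies, mask_thresholds):
        if energy > thr:
            pe += math.log2(energy / thr)
    return pe

def aac_bit_estimate(pe):
    """Heuristic bit demand quoted above for MPEG-4 AAC:
    BITS = 0.6 * PE + 24 * sqrt(PE)."""
    return 0.6 * pe + 24.0 * math.sqrt(pe)

# Toy example: four spectral lines, linear power, flat threshold of 10.
energies = [1000.0, 400.0, 50.0, 2.0]
thresholds = [10.0, 10.0, 10.0, 10.0]
pe = perceptual_entropy(energies, thresholds)      # last line is masked
bits = aac_bit_estimate(pe)
```

The last line (energy 2.0, below the threshold of 10) drops out of the sum, which is the behavior you would want from "noise below the mask is free".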

Quantization

Quote
The measure we are talking about here is called "perceptual entropy":

PE = SUM (i = 1 .. numlines) log2(Energy_i / MaskThr_i)

So if I have one tone with power TF, and I know the minimum masking threshold MinT, I need

log2(TF / MinT) bits to quantize this tone?
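That reading is just the single-line case of the PE sum. A minimal sketch of it, assuming TF and MinT are linear power values and rounding up to a whole number of bits (the rounding is my addition, not from the thread):

```python
import math

def bits_for_tone(tone_power, min_threshold):
    """Single-line case of the PE formula: log2(TF / MinT),
    rounded up to an integer bit count. Assumes linear power values."""
    if tone_power <= min_threshold:
        return 0  # tone is already below the mask; no bits needed
    return math.ceil(math.log2(tone_power / min_threshold))

# Example: a tone whose power is 8x the threshold needs log2(8) = 3 bits.
```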