Hi Techies,
I want to measure the quality of my AAC-LC encoder for a particular stream at a 16 kHz sampling frequency.
I have encoded and decoded the stream at a 32 kbps bitrate, mono channel.
As per my understanding, the algorithmic delay for a 48 kHz stream is 1024 samples (one frame of delay in the encoder) + 512 samples (half a frame in the decoder). I have a tool that can measure encoder quality for 44.1/48 kHz streams.
But since I want to measure my encoder's quality at a 16 kHz sampling frequency, I downsampled a 48 kHz stream to 16 kHz (using the GoldWave tool), and after processing (encoder + decoder) I upsampled the output back to 48 kHz.
I need to supply the algorithmic delay (in samples) to my quality measurement tool, but I am stuck on calculating the algorithmic delay in this scenario.
Please throw some light on this issue.
Thanks In Advance,
satish
Just manually time-align the two samples in an audio editor.
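If you'd rather not align by eye, a cross-correlation between the reference and the processed file gives the same answer programmatically. A minimal sketch using NumPy; the signals here are toy arrays standing in for your decoded audio:

```python
import numpy as np

def estimate_delay(reference, processed):
    """Return the lag (in samples) that best aligns `processed`
    with `reference`, found via full cross-correlation."""
    xcorr = np.correlate(processed, reference, mode="full")
    # In 'full' mode, zero lag sits at index len(reference) - 1.
    return int(np.argmax(xcorr)) - (len(reference) - 1)

# Toy check: a signal delayed by 7 samples.
ref = np.random.default_rng(0).standard_normal(1000)
proc = np.concatenate([np.zeros(7), ref])
print(estimate_delay(ref, proc))  # 7
```

The lag at the correlation peak is exactly the number you would feed the quality tool as the alignment offset.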
I want to know the actual logic for calculating the delay introduced by the encoder for a 16 kHz input.
Thanks & Regards,
satish
Search for the paper "A guideline to audio codec delay" by Manfred Lutzky et al. Was presented at an AES conference, I think, and might be available for free.
Chris
Hi Chris,
Thank you for your reply.
Actually, I have already gone through that paper, but I didn't find any calculation for a 16 kHz input stream. The AAC-LC delay calculations in that paper are given with respect to a 48 kHz stream.
Can you please let me know what extra factor comes into the picture when the input stream is at a 16 kHz sampling frequency?
Thanks & Regards,
satish
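One piece of the arithmetic can be sketched directly. The AAC-LC frame length is fixed in samples, so the codec delay in samples is the same at 16 kHz as at 48 kHz; what changes is its duration, and therefore the number of samples it spans once the output is upsampled back to 48 kHz for the measurement tool. A sketch using the 1024 + 512 figures from the thread; `d_down` and `d_up` are placeholders for the resampler filter delays, which depend on the resampling tool and are not known here:

```python
def codec_delay_at_tool_rate(delay_codec_samples, codec_fs, tool_fs,
                             d_down=0, d_up=0):
    """Total delay, in samples at tool_fs, as seen by a quality tool
    comparing the 48 kHz reference against the decoded signal after
    the downsample -> codec -> upsample chain.

    d_down / d_up: extra delay (in samples at tool_fs) added by the
    down- and up-sampling filters -- placeholders, tool-dependent."""
    # The codec delay is fixed in samples at codec_fs, so convert it
    # to seconds, then back to samples at the tool's rate.
    delay_sec = delay_codec_samples / codec_fs
    return round(delay_sec * tool_fs) + d_down + d_up

# AAC-LC at 16 kHz: 1024 (encoder) + 512 (decoder) = 1536 samples,
# i.e. 96 ms, which the 48 kHz tool sees as 4608 samples.
print(codec_delay_at_tool_rate(1536, 16000, 48000))  # 4608
```

Note the resampler delays are not optional in practice: any decent sample-rate converter uses a linear-phase filter whose group delay shifts the signal further, so the measured offset will exceed 4608 samples by the combined filter delays of the two conversions.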