HydrogenAudio

Lossy Audio Compression => Opus => Topic started by: Burrows22 on 2013-04-21 14:08:39

Title: Opus Encoder Latency Measurement
Post by: Burrows22 on 2013-04-21 14:08:39
Is there any way I can measure the latency of the Opus encoder for real-time streaming over a network, similar to how NetJack uses CELT?

I'm doing my university dissertation on the implications of the latency introduced by lossy audio compression over a network for live music performances. I've worked out a way to measure the latency of the decoder, but I need a way to measure the latency of the encoder so I can get the overall latency the codec introduces at different compression settings.

Is there any way to do this?

Thanks!
Title: Opus Encoder Latency Measurement
Post by: NullC on 2013-04-22 17:50:09
The decoder's latency is a constant (for a given frame size).

The OPUS_GET_LOOKAHEAD (https://mf4.xiph.org/jenkins/view/opus/job/opus/ws/doc/html/group__opus__encoderctls.html#gaf81b9e01501910adc67195ebb42b4a54) CTL returns the encoder's current lookahead. Adding this to the frame duration gives you the complete delay of the encoder and decoder. If you want to minimize the delay added by the encoder, run it with the application set to OPUS_APPLICATION_RESTRICTED_LOWDELAY.
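
For example, a minimal C sketch (the 48 kHz stereo setup and 10 ms frame size here are just illustrative choices, not requirements) that creates an encoder, queries the lookahead, and prints the total algorithmic delay:

#include <opus.h>
#include <stdio.h>

int main(void)
{
    int err;
    /* RESTRICTED_LOWDELAY avoids the modes that need extra lookahead */
    OpusEncoder *enc = opus_encoder_create(48000, 2,
        OPUS_APPLICATION_RESTRICTED_LOWDELAY, &err);
    if (err != OPUS_OK) {
        fprintf(stderr, "encoder init failed: %s\n", opus_strerror(err));
        return 1;
    }

    opus_int32 lookahead = 0;
    opus_encoder_ctl(enc, OPUS_GET_LOOKAHEAD(&lookahead));

    /* Total algorithmic delay = lookahead + one frame.
       A 10 ms frame at 48 kHz is 480 samples. */
    int frame_size = 480;
    printf("lookahead: %d samples (%.2f ms)\n",
           (int)lookahead, lookahead / 48.0);
    printf("total codec delay: %.2f ms\n",
           (lookahead + frame_size) / 48.0);

    opus_encoder_destroy(enc);
    return 0;
}

Build it against libopus, e.g. cc lookahead.c -lopus, and rerun with different frame sizes and applications to see how the delay changes.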
Title: Opus Encoder Latency Measurement
Post by: Burrows22 on 2013-04-23 15:57:50
Unfortunately I'm not much of a coding guru. How would I go about implementing this? Is there a particular language I should be using to call this CTL?

The way I figured out to measure the latency of the decoder involved running two instances of VLC, one playing a WAV file and the other the same audio as an Opus file. I then used AppleScript to tell both instances to start playback simultaneously while recording their outputs into Logic via JackPilot. To validate the setup, I first played identical WAV files in the two instances, which showed zero latency between them; I then tested with a WAV and an Opus file, which gave an offset of a varying number of samples.
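
As a software-only cross-check of the same idea, something like the following C sketch (pieced together from the libopus docs, so treat it as a rough idea rather than verified code) should push an impulse through opus_encode_float()/opus_decode_float() and report where it reappears. Since the lossy transform smears the impulse, the peak position is only approximate:

#include <opus.h>
#include <math.h>
#include <stdio.h>
#include <string.h>

#define FS      48000
#define FRAME   480         /* 10 ms frames at 48 kHz */
#define NFRAMES 20

int main(void)
{
    int err;
    OpusEncoder *enc = opus_encoder_create(FS, 1,
        OPUS_APPLICATION_RESTRICTED_LOWDELAY, &err);
    OpusDecoder *dec = opus_decoder_create(FS, 1, &err);

    float in[FRAME], out[FRAME];
    unsigned char packet[4000];
    float peak = 0.f;
    int peak_pos = -1;

    for (int n = 0; n < NFRAMES; n++) {
        memset(in, 0, sizeof(in));
        if (n == 0)
            in[0] = 1.f;    /* single impulse at input sample 0 */

        opus_int32 len = opus_encode_float(enc, in, FRAME,
                                           packet, sizeof(packet));
        opus_decode_float(dec, packet, len, out, FRAME, 0);

        /* Track the loudest sample seen in the decoded output */
        for (int i = 0; i < FRAME; i++) {
            if (fabsf(out[i]) > peak) {
                peak = fabsf(out[i]);
                peak_pos = n * FRAME + i;
            }
        }
    }

    printf("impulse reappeared at sample %d (%.2f ms)\n",
           peak_pos, peak_pos * 1000.0 / FS);

    opus_encoder_destroy(enc);
    opus_decoder_destroy(dec);
    return 0;
}

The number this prints should land close to lookahead + frame size from the OPUS_GET_LOOKAHEAD CTL described above.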

Are there any papers or articles you know of that measure the latency of the codec, which I may not have already read?

Thanks