
Topic: How would you test real-time encoding/decoding?

  • hellokeith
How would you test real-time encoding/decoding?
I've been gaining interest in real-time media playback, and I was thinking about what would be needed to test/compare various audio codecs (both lossless and lossy) at real-time encoding/decoding.

Which lossless/lossy codecs support streaming? I wouldn't actually perform any streaming, since the network component would introduce variables not tied to the encoders/decoders themselves. Instead I would run the encode and decode separately and record those metrics.

How do you determine/force real-time encoding and decoding? How much of a buffer/delay is acceptable for real-time?

If one can force real-time encoding/decoding in lossy codecs, what kinds of comparisons would you do? Compression efficiency? Quality? Could VBR, VBR-Peak, or ABR be used with confidence?
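
One rough way to get the encode and decode numbers separately is to time a whole-file pass through a command-line codec and divide the audio duration by the wall-clock time, giving a real-time factor (anything above 1.0x keeps up with playback). A minimal sketch in Python; the choice of the flac binary, test.wav, and the output names are just placeholder assumptions:

import subprocess
import time
import wave

def wav_duration_seconds(path):
    # Duration of a PCM WAV file = frames / sample rate.
    with wave.open(path, "rb") as w:
        return w.getnframes() / float(w.getframerate())

def timed_run(cmd):
    # Wall-clock time for one external encode or decode pass.
    start = time.perf_counter()
    subprocess.run(cmd, check=True, capture_output=True)
    return time.perf_counter() - start

audio_len = wav_duration_seconds("test.wav")  # hypothetical input file
enc_time = timed_run(["flac", "-f", "-8", "-o", "test.flac", "test.wav"])
dec_time = timed_run(["flac", "-f", "-d", "-o", "decoded.wav", "test.flac"])

print("encode real-time factor: %.1fx" % (audio_len / enc_time))
print("decode real-time factor: %.1fx" % (audio_len / dec_time))

The same harness works for any codec with a command-line front end, which at least keeps the "can it keep up with playback" question separate from any network/streaming variables.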

  • benski
  • Developer
How would you test real-time encoding/decoding?
Reply #1
By "real-time", do you mean

1) Able to encode/decode at least as fast as the media plays back.

2) Able to perform the calculations within a predictable amount of time. (classic definition of "real time" in computing terminology)
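
To make that distinction concrete: #1 only compares total processing time against total playback time, while #2 asks whether any single block ever misses its deadline. A small sketch of measuring both from the same run, where process_block stands in for whatever per-block encode or decode call is being tested (the block size and sample rate are arbitrary assumptions):

import time

SAMPLE_RATE = 44100
BLOCK_SIZE = 4096                            # samples per block (arbitrary)
BLOCK_DEADLINE = BLOCK_SIZE / SAMPLE_RATE    # playback time of one block, seconds

def measure(blocks, process_block):
    # Time process_block() on each block; report throughput and worst case.
    times = []
    misses = 0
    for block in blocks:
        start = time.perf_counter()
        process_block(block)                 # placeholder for the codec call
        elapsed = time.perf_counter() - start
        times.append(elapsed)
        if elapsed > BLOCK_DEADLINE:
            misses += 1
    total_playback = len(times) * BLOCK_DEADLINE
    # Definition 1: the sum of the times beats total playback time.
    print("keeps up with playback:", sum(times) <= total_playback)
    # Definition 2: the worst single block never exceeds its deadline.
    print("worst block: %.3f ms, deadline: %.3f ms, deadline misses: %d"
          % (max(times) * 1000, BLOCK_DEADLINE * 1000, misses))
    return times

A codec can pass the first test and still fail the second if a handful of blocks take much longer than the average.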

  • hellokeith
How would you test real-time encoding/decoding?
Reply #2
By "real-time", do you mean

1) Able to encode/decode at least as fast as the media plays back.

2) Able to perform the calculations within a predictable amount of time. (classic definition of "real time" in computing terminology)


#1 would be the minimum. Regarding #2, I suppose predictability beyond "fast enough" could yield a set of comparable metrics.

I'm having a hard time deciding on a test that would provide useful comparison data. Usually listening tests are organized around a kbps range.
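
Since listening tests are grouped by bitrate, one comparable number to record alongside the speed measurements is the achieved average bitrate of each encode, which is just file size over duration. A trivial helper, reusing the audio_len and test.flac placeholders from the earlier sketch:

import os

def average_kbps(encoded_path, audio_seconds):
    # Average bitrate in kbit/s, for bucketing results into the usual
    # kbps ranges that listening tests are organized around.
    return os.path.getsize(encoded_path) * 8 / 1000.0 / audio_seconds

# e.g. average_kbps("test.flac", audio_len)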