Opus on Cortex-M3 2014-12-03 19:02:57

First post on this forum. This is a great community, so hopefully you'll all be able to help me.

I'm working on a project where I want to get the Opus decoder running on an embedded ARM core. Basically, I want to decode and play back an encoded stream from an SD card on a Cortex-M3.

To test the Opus codec first, I'm setting up a fairly basic demo: I fill a buffer from a .WAV file on an SD card, encode it, immediately decode the result, and compare the input and output buffers. I'm leaving all of the settings in the encoder and decoder instances at their defaults aside from the sampling frequency and the channel count (both pulled from the .WAV header), in this case 192 kHz and stereo.

The issue I'm encountering is that the decoder is actually outputting twice the amount of data I expect it to. The input buffer size is 1920 samples (5 ms of 16-bit interleaved stereo data; the frame size is hard-coded to 960 samples to match this). The encoder processes the data just fine and typically writes ~85 bytes into its output buffer. When I pass the encoded data to the decoder, it processes it and writes the decoded data to its output buffer, with the opus_decode function returning 1960 (the number of bytes decoded, which is correct). However, 3920 bytes of data are actually written to the decoder's output buffer. Each sample is written to the buffer twice, so the 2i-th and (2i+1)-th samples are always identical.

I'm pretty sure endianness is not the issue in this case. I've set the codec's options appropriately, disabled the float API, and set it to run in fixed-point mode. There are no errors or warnings at compile time, and there are no run-time errors.

Any thoughts on what may be causing this?