
0.9.x and the dsp timing problem

I have been trying for two days and still have no idea how to solve the following DSP problem.

If I want my DSP to skip audio samples, as in the code below (which skips the first 5 seconds):

Code: [Select]
    // 'skipped' is a double member variable, initialized to 0 in the constructor
    virtual bool on_chunk(audio_chunk * p_chunk, abort_callback & p_abort)
    {
        if (skipped < 5)
        {
            skipped += p_chunk->get_duration();
            return false; // drop this chunk
        }
        return true; // pass the chunk through unchanged
    }

or if I want to increase the playback tempo of a song by some factor (4.4x faster):

Code: [Select]
    virtual bool on_chunk(audio_chunk * p_chunk, abort_callback & p_abort)
    {
        // raising the sample rate makes the same samples play back faster
        p_chunk->set_sample_rate((unsigned)(p_chunk->get_srate() * 4.4));
        return true;
    }

Both approaches work, but they share one problem: foobar2000 0.9.x does not update the seekbar and the elapsed time correctly. For example, if the first 5 seconds are skipped, the display should start at 0:05, yet foobar2000 shows 0:00 as if no samples had been skipped. Consequently, when the song ends, the seekbar and the displayed time are most likely still running.

This problem did not occur in 0.8.3, which jumped neatly to the correct seekbar position and displayed the correct time.

Is there any new mechanism in the 0.9 SDK to solve this? I have halted some DSP development, and the porting of several old 0.8.3 DSPs, for this reason.

Thanks

0.9.x and the dsp timing problem

Reply #1
The displayed playback position is determined by how much data comes out of the DSP chain, not by how much is sent in. According to Peter, changing this would be quite tricky, and it is not certain whether or when it will be changed.

0.9.x and the dsp timing problem

Reply #2
OK, thanks for the reply, but can you explain a bit more? I don't quite understand this part:
"is determined by how much data comes out of the DSP chain, not by how much is sent in"

Do you mean the displayed playback time is determined by the position reported by the input plugin, rather than by the DSP output that is sent to the output device?
If so, which class in the SDK, or what other means, would let me hook into that?

thanks again

0.9.x and the dsp timing problem

Reply #3
First, the 0.8.3 behavior wasn't correct either; it was just off by a smaller magnitude. Output latency was calculated from the data that came out of the DSP, which in the case of a pitch shift had a different rate than the decoded data.
I'm currently working on restoring the pre-0.9 behavior, at least so that skip-silence and similar cases behave correctly.

0.9.x and the dsp timing problem

Reply #4
Thanks so much, Peter. Looking forward to it.