
ov_read on Mac OS X giving 0s and -128

I've stumbled upon a rather strange problem, at least strange for me. I'm going crazy here, not knowing whether it's a compiler setting I need to change, something in the code, or something in the libraries I compiled.
The story goes like this: I'm trying to add Ogg Vorbis support to a multiplatform application I'm working on. The very same code works perfectly fine on Windows and on an iPhone device, but does not work on Mac OS X.
After a lot of testing and checking every little piece of code, I've come to the conclusion that the only difference is how the ov_read function behaves on those platforms.
On Windows and iPhone I'm getting proper values, but on Mac OS X I'm getting a steady stream of repeated "0, -128, 0, -128" values in the decoded buffer. I've tried changing the endianness parameter just in case, and tried changing the word size parameter, all for nothing. ov_read_float works fine on all three platforms, so if I can't figure out how to fix ov_read, falling back to ov_read_float is my last resort.
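For reference, here's a rough sketch of what that ov_read_float fallback could look like, assuming a stereo stream and converting the floats to interleaved signed 16-bit PCM (buffer names are just illustrative; vf and current_section are the same variables as in the snippet below):

/* Sketch of an ov_read_float fallback, assuming a stereo stream. */
float **pcm = NULL;
long samples = ov_read_float(&vf, &pcm, 1024, &current_section);
if (samples > 0)
{
    short out[1024 * 2];                      /* interleaved stereo, 16-bit */
    for (long i = 0; i < samples; ++i)
        for (int ch = 0; ch < 2; ++ch)
        {
            float v = pcm[ch][i] * 32767.0f;  /* scale [-1.0, 1.0] to 16-bit range */
            if (v >  32767.0f) v =  32767.0f; /* clamp to avoid wraparound */
            if (v < -32768.0f) v = -32768.0f;
            out[i * 2 + ch] = (short)v;
        }
}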
The strangest thing in all this is that if I let ov_read run, it seems to cause some memory corruption: after decoding the file, every other Ogg file I try to load gives me a -133 error (bad header) from ov_open_callbacks (or ov_open; I tried both). I'm hoping someone has experienced something similar and can point me in some direction. I'm not posting much code since it's pretty much copy-pasted from tutorials and works perfectly fine on the other two platforms, so I would guess it's either something compiler-specific in my Xcode setup, or I'm handling ov_read badly. Here's the snippet in question:
char myBuffer[4096];   /* byte buffer handed to ov_read */
int size = ov_read(&vf, myBuffer, 4096, 0, 2, 1, &current_section);
for (int i = 0; i < size; ++i)
    printf("%d,", myBuffer[i]);

This prints either values like "0,0,0,0,0,0,1,0,-1,-1,-1,-3,9,-15 (...)" (correct, on Windows/iPhone)
or
"0,0,0,-128,0,-128,0,-1,0,-2,-128,-128,-128,-128,-128,-128 (...)" (wrong, on Mac OS X).
Any advice would be helpful. Thanks!

 

Re: ov_read on Mac OS X giving 0s and -128

Reply #1
Hi!

I'm having the same problem. Have you found a solution?

Thanks!