There's not much documentation on the Decoding Speed Test (foo_benchmark) component, so I can't be sure what the intended behavior is, but I'd guess it isn't this:
System:
  CPU: AMD FX(tm)-8100 Eight-Core Processor, features: MMX SSE SSE2 SSE3 SSE4.1 SSE4.2
  App: foobar2000 v1.3.17
Settings:
  High priority: no
  Buffer entire file into memory: no
  Passes: 1
  Threads: 2
  Postprocessing: none
Stats by codec:
  FLAC: 1006.218x realtime
File: K:\example.flac
  Run 1:
    Decoded length: 1:07:19.253
    Opening time: 0:00.001
    Decoding time: 0:08.028
    Speed (x realtime): 1006.238
  Run 2:
    Decoded length: 1:07:19.253
    Opening time: 0:00.001
    Decoding time: 0:08.028
    Speed (x realtime): 1006.199
  Total:
    Opening time: 0:00.001 min, 0:00.001 max, 0:00.001 average
    Decoding time: 0:08.028 min, 0:08.028 max, 0:08.028 average
    Speed (x realtime): 1006.198 min, 1006.238 max, 503.145 average
Total:
  Decoded length: 2:14:38.507
  Opening time: 0:00.001
  Decoding time: 0:16.056
  Speed (x realtime): 1006.290
That's the full report from a test I just ran. How can the average speed (503.145x) be less than the minimum speed (1006.198x)?
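
The odd figure is almost exactly half of each per-run speed, which makes me suspect a stray factor of two in whatever divisor the averaging uses, possibly tied to the thread count, though I'm only guessing since the component's code isn't public. A toy sketch (the helper names and structure below are entirely made up, not the component's actual code) of how dividing by one factor too many yields an "average" below the minimum:

  # Purely illustrative guess -- not how foo_benchmark actually computes its stats.
  run_speeds = [1006.238, 1006.199]   # per-run speeds from the report above
  threads = 2                          # "Threads: 2" in the settings

  def correct_average(speeds):
      # A true mean always lies between the minimum and maximum of its inputs.
      return sum(speeds) / len(speeds)

  def buggy_average(speeds, extra_factor):
      # Dividing by len(speeds) * extra_factor (e.g. the thread count)
      # scales the result down and can push it below the minimum.
      return sum(speeds) / (len(speeds) * extra_factor)

  print(correct_average(run_speeds))          # ~1006.22, between min and max
  print(buggy_average(run_speeds, threads))   # ~503.11, close to the reported 503.145

For what it's worth, dividing the total decoded length by the summed decoding time (2:14:38.507 / 0:16.056) also comes out to about 503.1, so it could just as easily be a wall-clock vs. summed-per-thread-time mix-up. Either way, a real average of the two runs should land between 1006.199 and 1006.238.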