FLAC / Re: FLAC v1.4.x Performance Tests
Last post by Porcus - Because I didn't pay sufficient attention to my own testing, I ran something more: testing block sizes that nobody wants to use. (Reason for that: to make sure there is no partitioning of the residual, so there are no differences in that.)
Took the 219-minute CDDA .wav file and made 30 .flac files:
-0 and -5 at fifteen distinct block sizes, all odd numbers to make sure I wasn't testing different handling of partitioning.
Most are too low to be interesting! (Why? I'll explain at the end.)
17, 23, 31, 45, 65, 97, 147, 223, 341, 523, 803 <in between here is the -0 to -2 preset blocksize, 1152!> 1237, 1907, 2941 <in between here is the other presets' 4096> 4537 (ffmpeg uses the subset upper bound of 4608)
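Those encoding runs could be sketched roughly like this (the file names are my assumptions; the echo makes it a dry run, so drop it to actually encode):

```shell
#!/bin/sh
# Dry-run sketch: 15 odd block sizes x presets -0 and -5 = 30 .flac files.
# flac's -b/--blocksize accepts 16..65535, so all of these are legal.
# File names are assumptions; remove the echo to actually encode.
for b in 17 23 31 45 65 97 147 223 341 523 803 1237 1907 2941 4537; do
    for p in 0 5; do
        echo flac -f "-$p" -b "$b" -o "corpus-${p}b${b}.flac" corpus.wav
    done
done
```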
Those were encoded repeatedly using different FLAC releases (Xiph builds ... I hope I didn't get that wrong!). Timings for the -0b<N> and -5b<N>:
At the right end of both diagrams, you see that 1.4.3 has gotten rid of the slowdown around the 4096 blocksize, which is good because that is the default.
Then decoding. Note, here the flac versions are different! Because the results were a bit surprising, I dug up 1.3.1 too (a tiny-tiny bit slower than 1.3.4; I deleted the latter to get the number of curves down) and also 1.2.1. Also omitted is 1.4.2, which is as good as tied with 1.4.3. I kept both the 32-bit and 64-bit versions of 1.4.3 (it being the current release).
And here the block sizes are visible; I lost them from the encoding charts ...
1.3 is much faster at the small block sizes! At the very smallest, 17 samples, 1.3 decodes it in 22 seconds, 1.4 needs 40, and 1.4 32-bit needs 60. That is a little bit of a difference?!
That 1.4.3 32-bit crosses 1.2.1 this way also indicates that something got lost since 1.3.
Note that the difference between the 64-bit builds is pretty much nothing at the reasonable block sizes. Don't overinterpret this.
Imgur album: https://imgur.com/a/JDAccOD
* Comparison with ffmpeg - which spends bonkers amounts of time at small blocks - at https://hydrogenaud.io/index.php/topic,125791.msg1044102.html#msg1044102
* Why these block sizes? I tried the minimum block size over in that thread, and since ffmpeg did so horribly, I set out to check (1) where it would start to behave normally, and (2) what "normal" even is, using the reference implementation for comparison. And then it turned out there were differences between versions ... hence a new run.
Corpus: I took twenty-one of the CDs in my signature (7 from each broad subdivision), the first ten and a half minutes of each, merged into a 219-minute WAVE file.
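The corpus construction could be sketched like this with sox (the tool choice and file names are my assumptions; the post only says each rip's first 10.5 minutes were taken and merged):

```shell
#!/bin/sh
# Dry-run sketch (remove the echos to actually process audio).
# Trim each of the 21 rips to its first 10.5 minutes, then concatenate;
# sox concatenates when given several input files before one output file.
for i in $(seq -w 1 21); do
    echo sox "cd$i.wav" "trimmed-$i.wav" trim 0 10:30
done
echo sox 'trimmed-*.wav' corpus.wav
```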
Timing was done overnight with hyperfine: warmup + 5 runs with a 30-second cooldown in between (ffmpeg last), and times are the median of 5.
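One such timing run could look roughly like this with hyperfine (file names are my assumptions; hyperfine's --prepare hook gives the cooldown before each run, and the per-run times for taking a median end up in the JSON export):

```shell
#!/bin/sh
# One decode-timing measurement, as a dry-run sketch: echo prints the
# command; use eval "$cmd" to actually measure. File names are assumptions.
# --warmup 1: unmeasured warmup run; --runs 5: five timed runs;
# --prepare 'sleep 30': 30 s cooldown before each timed run.
cmd="hyperfine --warmup 1 --runs 5 --prepare 'sleep 30' \
--export-json decode-0b17.json \
'flac -d -c corpus-0b17.flac > /dev/null'"
echo "$cmd"
```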