Using par2 on my lossless CD-R backups
Reply #25 – 2004-02-19 10:15:00

Quote
    Peter - thanks for that test. I've seen similar results.

    So you agree that the sweet spot should be:
    (1) the highest multiple of 2048 that
    (2) gives the highest efficiency rating for
    (3) one file set

    In your example, that would be a block size in the range of 188,416 - 514,048. Does that look right to you? The process time would be really quick and the number of recovery blocks would be low. Would that still yield the optimal results?

There are several conflicting things you are trying to achieve when selecting a block size:

1) Getting as many recovery blocks as possible.
2) Getting as high an efficiency as possible.
3) Minimizing the time to create.

I've just done another test with a similar set of files (but this time I've adjusted the redundancy to get as near as possible to 100MB of PAR2 files), and gathered a little more information:

Code:
  block    redundancy   recovery   efficiency   compute
  size                  blocks                  time
 20,480      15.04%      4,675       91.2%      4h 11m
 40,960      15.79%      2,455       95.8%      1h 57m
 81,920      16.13%      1,255       97.8%        56m
122,880      16.21%        842       98.3%        37m
163,840      16.26%        634       98.6%        27m
204,800      16.28%        508       98.8%        22m
262,144      16.27%        397       98.8%        17m
393,216      16.23%        265       98.7%        11m
524,288      16.20%        199       98.4%         8m
786,432      16.18%        133       98.0%         6m

As you can see, the increased efficiency lets you use a slightly higher redundancy setting, but the recovery block count drops off very rapidly as you increase the block size towards the peak-efficiency value.

I would suggest that as long as you are within a few % of the peak efficiency, you are probably OK, and it may be more important to have a reasonably high recovery block count (there's a rough sketch of the arithmetic at the end of this post).

At some point in the future I will be working on features specific to CD/DVD usage, and when I do that I will set things up so that instead of specifying the redundancy setting to use, you will specify how much free space you want to fill up.
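If you want to play with the numbers yourself, here is a rough Python sketch of the arithmetic behind the table above. It is not taken from par2 or QuickPar; the file sizes and the 100MB budget are invented for illustration. The two key points it captures are that every file gets padded out to a whole number of blocks (which is where the efficiency loss comes from), and that a fixed PAR2 size budget divided by a bigger block size leaves you with fewer recovery blocks.

Code:
import math

def analyse(file_sizes, block_size, par2_budget):
    """Estimate the trade-offs for one candidate block size."""
    # Each file occupies a whole number of blocks, so the last block of
    # every file usually carries some wasted slack.
    source_blocks = sum(math.ceil(size / block_size) for size in file_sizes)
    data_bytes = sum(file_sizes)
    # Efficiency = real data / (blocks allocated * block size).
    efficiency = data_bytes / (source_blocks * block_size)
    # With a fixed PAR2 budget, bigger blocks mean fewer recovery blocks.
    recovery_blocks = par2_budget // block_size
    redundancy = recovery_blocks * block_size / data_bytes
    return source_blocks, efficiency, recovery_blocks, redundancy

if __name__ == "__main__":
    # Hypothetical file set: 40 lossless tracks of assorted sizes (~660MB).
    files = [14_000_000 + 123_456 * i for i in range(40)]
    budget = 100 * 1024 * 1024  # roughly 100MB of PAR2 files, as in the test
    print(f"{'block size':>10} {'src blocks':>10} {'efficiency':>10} "
          f"{'rec blocks':>10} {'redundancy':>10}")
    for bs in (20_480, 40_960, 81_920, 163_840, 262_144, 524_288):
        src, eff, rec, red = analyse(files, bs, budget)
        print(f"{bs:>10,} {src:>10,} {eff:>9.1%} {rec:>10,} {red:>9.1%}")

Note this ignores PAR2 packet header overhead, so the redundancy figures will come out slightly higher than what the real tool reports, but the shape of the curve (efficiency rising while recovery block count falls) is the same as in the table.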