Limits of Lossless Compression...?
Reply #37 – 2005-11-22 20:34:45
[quote]Basically, although pi is (as far as we know) maximum-entropy, it is actually immensely compressible, given the proper algorithm.[/quote]

[quote]IMO, pi is not compressible at all because of its "maximum entropy" feature. You basically have two choices when you want to use pi in a data decompressor: either store pi at "full" length somewhere or compute it when required. The latter looks as if pi were compressed to the size of the pi-calculation routine, but in fact you have to invest time and electricity in computing pi whenever you need a part of it. While doing this, you have to "throw away" huge portions of pi because you don't need them for that particular purpose. This "data loss" occurs at the same time as electricity is converted into heat - therefore entropy increases while pi is calculated.[/quote]

Ah ha. That kind of compressibility. In terms of total universal entropy, I guess you're right. However, I was talking about file-size compressibility: the amount of data that needs to be transmitted can be made very small, thereby putting the burden of creating all that extra entropy on the processor.

@Canar: The Chudnovsky formula adds ~14 digits of pi per term. However, the terms themselves are pretty complex:

[(6k)! (13591409 + 545140134k)] / [(k!)^3 (3k)! (-640320)^(3k)] for k from 0 to infinity.

[edit: Actually, I suppose that's not too complex, since you can save the factorials for use in the next iteration. You just need to throw around some REALLY big numbers.]
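For illustration, the save-the-factorials trick mentioned in the edit can be sketched in Python with the standard decimal module. This is my own sketch, not code from the thread: the function name, the "digits // 14 + 2" term count (based on the ~14 digits per term figure above), and the guard digits are all my choices. The scaling constant 426880*sqrt(10005) comes from the standard statement of the Chudnovsky series, pi = 426880*sqrt(10005) / sum(terms).

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Compute pi to roughly `digits` digits via the Chudnovsky series."""
    getcontext().prec = digits + 10  # a few guard digits for rounding
    C = 426880 * Decimal(10005).sqrt()
    # Running factorials: (6k)!, (3k)!, k! - carried between iterations
    # instead of being recomputed from scratch each time.
    k6_fact, k3_fact, k_fact = 1, 1, 1
    s = Decimal(0)
    terms = digits // 14 + 2  # each term adds ~14 digits
    for k in range(terms):
        s += (Decimal(k6_fact) * (13591409 + 545140134 * k)
              / (Decimal(k3_fact) * Decimal(k_fact) ** 3
                 * Decimal(-640320) ** (3 * k)))
        # Extend the factorials for the next k: these really do get
        # REALLY big, but Python integers handle arbitrary precision.
        for i in range(6 * k + 1, 6 * k + 7):
            k6_fact *= i
        for i in range(3 * k + 1, 3 * k + 4):
            k3_fact *= i
        k_fact *= k + 1
    return C / s
```

As a usage example, `chudnovsky_pi(50)` yields 3.14159265358979323846... - which also illustrates the point above: the "compressed" form of pi is tiny, but every digit you want has to be paid for in computation.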