Hmm, Halcyon's test was carried out with a special test CD with black stripes on it? Is this very realistic, or was that Sony drive just that bad?
With T&C you are just reading twice again, but EAC has had the chance to read it up to 82 times in the first (or only) run.
How big is the probability that EAC reads twice (no C2, secure mode), finds no mismatch and thinks all is well, but has actually read the same wrong value twice for that damaged sector?
First, let's assume that the chipset can't correct E32 errors, and that the errors come from unreadable parts of the CD. Errors will therefore occur in frames with at least 3 wrong bytes. If these are random, the probability that detection fails is 1/256^3 = 1/16,777,216: the probability that the two readings are wrong (three wrong bytes) and identical. At 7350 frames per second, if all frames were wrong there would be about two undetected errors per CD. But this can vary very much:

1 - Recent chipsets must have five wrong bytes per frame in order to output wrong data; that would lead to one undetected error about once every 32,000 CDs.

2 - Errors should be random if the CD is completely unreadable, but for the other bytes to be OK, the damage must not be total. So there should be some slightly damaged parts, and these are likely to cause problems, since some EFM data can suffer from pure jitter errors. I don't have the EFM table in text format, so I can't check whether misplacing one transition can lead to an error as small as one modified bit in the decoded 8-bit symbol. If I had set up the EFM table myself, I would have tried to arrange it so that jitter errors produce LSB errors where possible. If that is indeed the case, we can imagine three unstable bytes, each with 1 wrong bit. These bits being random, there would be one chance out of 8 of getting the same wrong data twice.

So the theoretical efficiency is somewhere between 1/8 (jitter errors, E32 chipset) and 1/1,099,511,627,776 (damaged CD, E52 chipset). That's quite an uncertainty, isn't it?
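A quick back-of-envelope check of the figures above (my own sketch, assuming a 74-minute CD and 7350 F1 frames per second, with fully random byte errors):

```python
# Sanity-check of the probabilities quoted above.
# Assumptions (mine, not from the post): 74-minute disc, 7350 frames/second.

FRAMES_PER_SEC = 7350
CD_SECONDS = 74 * 60
frames_per_cd = FRAMES_PER_SEC * CD_SECONDS          # 32,634,000 frames

# Two independent reads of a frame with 3 random wrong bytes agreeing by chance:
p_e32 = 1 / 256**3                                   # ~1 / 16,777,216

# Undetected errors per CD if every frame were damaged:
undetected_per_cd = frames_per_cd * p_e32            # ~1.9, i.e. "about two"

# Chipset that only outputs wrong data at 5 wrong bytes per frame:
p_e52 = 1 / 256**5                                   # ~1 / 1,099,511,627,776
cds_per_undetected = 1 / (frames_per_cd * p_e52)     # roughly 32,000-34,000 CDs

print(frames_per_cd, round(undetected_per_cd, 2), round(cds_per_undetected))
```

This reproduces the "about two undetected errors per CD" and "once every ~32,000 CDs" figures to within rounding.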
Which leaves more room for minor differences between the reads that aren't audible, but still exist.
Unfortunately, if the surface is damaged, the original data is changed.
I'm still a little worried about the results of the NEC with Test & Copy
Quote: "which leaves more room for minor differences between the reads which aren't audible, but still exist"

Why wouldn't they be audible?
*** About caching: let's say we want to be sure our drive does not cache, so we turn it off and start ripping (T&C). How many CDs should be ripped that way to be sure...? (I know this is a tricky question.)
Between 1/8 and 1/1,099,511,627,776... (per disc)
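To put those two per-disc extremes in practical terms, here is a small sketch (my own framing, treating each damaged disc as an independent trial, so the wait for the first miss is geometric):

```python
# Expected number of damaged discs ripped before Test & Copy first passes a
# wrong rip, for the two extreme per-disc miss probabilities quoted above.
# Assumption (mine): one damaged sector per disc, discs independent.

def expected_discs_until_miss(p_miss_per_disc):
    """Mean number of discs until the first undetected error (geometric mean = 1/p)."""
    return 1 / p_miss_per_disc

print(expected_discs_until_miss(1 / 8))        # jitter errors, E32 chipset: 8 discs
print(expected_discs_until_miss(1 / 256**5))   # damaged CD, E52 chipset: ~1.1e12 discs
```

So depending on which end of the range applies, "being sure" could take anywhere from a handful of discs to more discs than have ever been pressed.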
Hence my question: personal experiences??? Quite a few folks will use T&C if the disc has scratches, just "to be sure". So, what's your experience? (Note: one can only draw conclusions from a considerable number of such rips.)
So it's not 1/8 but 1/2.