Topic: DAE quality

DAE quality

Hi.

Recently I've performed the EAC DAE quality test with 2 of my drives.
As people here ask "What drive should I buy?" quite often, I think it's a good idea to start a thread containing DAE tests only (and a second thread for discussing the results).

There are several problems and questions about this I'd like to mention.

- Does something like this already exist somewhere else, so it'd be a waste of time? (I know CDRinfo.com, but they've only done DAE error correction tests in some of their reviews - and the method used changes now and then.)

- People who want to test their drives need to create a test CD themselves, so the results submitted by different people won't be comparable.

- BUT: Some parameters should be somewhat comparable as far as I understand, e.g. C2 accuracy.

- Maybe there could be a way to improve the test disc creation process to get results with better comparability, but OTOH it would probably make creating the test CD more complicated.

- Has anyone compared the results (s)he got with 2 (or more) different test CDs (different brand, 650 vs. 700 MB media etc.) on the same drive?

- At different (reduced) extraction speeds the results are different. -> What / how many speeds should be tested?

- Are there other similar programs that work with all (most) drives, not only a certain brand? (NeroCDSpeed ?)

- Should extraction of copy protected CDs be included?

- What parts of the results should be posted - only numbers, or also pictures?

- Would it be a good idea to post some error patterns to determine the chipset's error correction capabilities (E21, E22 etc.)?


There are probably many more points I've missed, but this post is long enough anyway.


So ... all input is appreciated, especially from "the experts" Pio2001, spath, JeanLuc (sorry if I forgot to mention YOU  ).

Cheers tigre
Let's suppose that rain washes out a picnic. Who is feeling negative? The rain? Or YOU? What's causing the negative feeling? The rain or your reaction? - Anthony De Mello

DAE quality

Reply #1
I think Pio already has some protocols for home-made tests of error correction
and error concealment. For C2 error accuracy there would be more to discuss though.

DAE quality

Reply #2
I don't know how true this really is, but somebody once told me that if I use EAC in secure mode with the cache and C2 disabled, the only thing that differs among various drives is the speed. Otherwise, the ripped wav files are identical in all respects. Is this really true? If it is, then I don't think there is any need for such a thread.

 

DAE quality

Reply #3
Quote
For C2 error accuracy there would be more to discuss though.

Especially when you consider most drives' varying C2 accuracy values with different numbers of C2 errors ...
The name was Plex The Ripper, not Jack The Ripper

DAE quality

Reply #4
Quote
Quote
For C2 error accuracy there would be more to discuss though.

Especially when you consider most drives' varying C2 accuracy values with different numbers of C2 errors ...

So this would mean testing C2 accuracy doesn't make sense, because even if a drive got 100% in a test, you can't be sure that no errors will remain unreported when extracting "real life" discs. The conclusion would be that for secure rips C2 should NEVER be used, except when using test & copy with CRC comparison. Correct?

In this case, testing C2 accuracy would be quite useless, but internal error recovery capabilities and error patterns (to determine what kind of error correction is used by the drive) would still be interesting.
Let's suppose that rain washes out a picnic. Who is feeling negative? The rain? Or YOU? What's causing the negative feeling? The rain or your reaction? - Anthony De Mello

DAE quality

Reply #5
Quote
I don't know how true this really is, but somebody once told me that if I use EAC in secure mode with the cache and C2 disabled, the only thing that differs among various drives is the speed. Otherwise, the ripped wav files are identical in all respects.

If no uncorrectable errors were reported, most likely the result is perfect (if correct secure mode settings were used), no matter what drive was used. But how much damage EAC can recover from before uncorrectable errors are reported probably depends on the drive and its internal error correction capabilities.
Let's suppose that rain washes out a picnic. Who is feeling negative? The rain? Or YOU? What's causing the negative feeling? The rain or your reaction? - Anthony De Mello

DAE quality

Reply #6
Exactly ... there are not many different chipsets out there IIRC ... and the drive's age (in terms of laser diode wear and/or spindle bearing tolerances during heat-up) has some effect as well ...
The name was Plex The Ripper, not Jack The Ripper

DAE quality

Reply #7
There is/was a section on CDR-Info, and in a few other places, but they were only concerned with ripping speed. I don't know if they are still being updated.

I think it would be interesting to have something like this. I'd guess the rate of errors, and how quickly they get handled, would affect the read rate to a greater or lesser degree. Standardizing is a problem when many people are making test discs by hand. But even just raw ripping speed might be useful for picking drives.

My ol' Plextor 10x12x24 is getting a little long in the tooth..  I'll be shopping someday soon..

DAE quality

Reply #8
Quote
- Has anyone compared the results (s)he got with 2 (or more) different test CDs (different brand, 650 vs. 700 MB media etc.) / the same drive?


Yes. Comparing the C2 accuracy graphs for my Memorex in the DAEquality and Old CDR sections of my analysis ( http://perso.numericable.fr/laguill2/dae/dae.htm ), we can see the missed errors being quite OK with the old CDR (between 7 and 8 lines below the reported errors, which means between 128 and 256 times below according to the logarithmic scaling, thus less than 1 %) and going through the roof with the DAEquality test CD (missed errors > reported errors, that is, more than 50 % of missed errors).
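The log-scale reasoning above is easy to check numerically. A small sketch, assuming (as described) that each horizontal line on the DAEquality graphs represents a factor of 2; the helper name is hypothetical:

```python
# Sketch of the log-scale reasoning above, assuming each horizontal line
# on the DAEquality graphs represents a factor of 2 (logarithmic scaling).

def factor_for_lines(lines_below: int) -> int:
    """How many times lower a curve is when it sits N lines below another."""
    return 2 ** lines_below

# Missed errors 7 to 8 lines below the reported errors:
for lines in (7, 8):
    factor = factor_for_lines(lines)
    print(f"{lines} lines below -> {factor}x fewer ({1 / factor:.2%})")
```

Seven to eight lines below gives factors of 128 to 256, i.e. 0.78 % down to 0.39 % of the reported errors, consistent with the "less than 1 %" figure above.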

Quote
- Should extraction of copy protected CDs be included?


Yes, but there too, results can vary from one test CD to another. I had consistent CRCs with Kraftwerk's Tour de France on some tracks, and small differences on other tracks. But when I wanted to retest with Doc Gyneco's CD, I got a majority of inconsistent errors (thus not mastered)... more than mastered ones! Needless to say, I couldn't get the same CRC twice on any track. And both CDs were CDS200...

Quote
- Would it be a good idea to post some error patterns to determine the chipset's error correction capabilities (E21, E22 etc.)?


I'd like it. The test I've done is limited, because it gives the error correction used in the case of an ideal burst error, and different strategies can be used for different kinds of errors.
But the Teac drive, which comes first in C2 correction ability (4 bytes), is also my best drive at reading dying CDRs. Thus the error correction ability might be a good indication of drive performance on poor CDs.

DAE quality

Reply #9
Yes, I think a proper test would be great... I found it difficult to find recommendations for an excellent DAE drive to use with EAC. I have now purchased a new Plextor 54x EIDE drive and an Asus 52x CD-ROM drive and am awaiting their arrival to give them a spin....

I gather that a lot of our Plextor 40x SCSI drives are starting to die and we need to find new good drives....

Kindest Regards,
Scott 

DAE quality

Reply #10
Thanks for your reply, Pio.

Quote
Quote
- Has anyone compared the results (s)he got with 2 (or more) different test CDs (different brand, 650 vs. 700 MB media etc.) / the same drive?


Yes. Comparing the C2 accuracy graphs for my Memorex in the DAEquality and Old CDR sections of my analysis ( http://perso.numericable.fr/laguill2/dae/dae.htm ), we can see the missed errors being quite OK with the old CDR (between 7 and 8 lines below the reported errors, which means between 128 and 256 times below according to the logarithmic scaling, thus less than 1 %) and going through the roof with the DAEquality test CD (missed errors > reported errors, that is, more than 50 % of missed errors).

I think there's a misunderstanding here. My question was about the DAE quality test CD. If I create 2 or more different DAE quality test CDs - how repeatable are the results? I.e. how different are the results if 2 different DAE test CDs are used? I ask this because if we start a database-like thread where different people report their results, it would be important to know how reliably we can compare them.

Quote
Quote
- Should extraction of copy protected CDs be included?


Yes, but there too, results can vary from one test CD to another. I had consistent CRCs with Kraftwerk's Tour de France on some tracks, and small differences on other tracks. But when I wanted to retest with Doc Gyneco's CD, I got a majority of inconsistent errors (thus not mastered)... more than mastered ones! Needless to say, I couldn't get the same CRC twice on any track. And both CDs were CDS200...

I rather meant "copy possible without (handling) problems" vs. "copy possible with tweaks (detect TOC manually etc.)" vs. "copy impossible", and "no audible problems" vs. "problems introduced by copy protection are audible".

I don't think the number of inconsistent errors is useful unless everyone uses exactly the same CD (-> different versions of CDS) in perfect condition for testing.


Quote
Quote
- Would it be a good idea to post some error patterns to determine the chipset's error correction capabilities (E21, E22 etc.)?


I'd like it. The test I've done is limited, because it gives the error correction used in the case of an ideal burst error, and different strategies can be used for different kinds of errors.

So this seems to be useful. How can it be performed for a test thread like this? Wave subtraction -> zoom in -> post image(s)? Or ask someone to write a program that does this automatically? And how do we interpret the error patterns reliably / reproducibly?
Let's suppose that rain washes out a picnic. Who is feeling negative? The rain? Or YOU? What's causing the negative feeling? The rain or your reaction? - Anthony De Mello

DAE quality

Reply #11
Quote
If I create 2 or more different DAE quality test CDs - how repeatable are the results?


They are not repeatable at all. The first DAEquality CD I created returned no errors on my Teac drive, or maybe one or two on the deepest scratch. The second one had so many errors that it crashed the drive.

Quote
"copy possible without (handling) problems" vs. "copy possible with tweaks (detect TOC manually etc.)" vs. "copy impossible"


We should look at CDFreaks first, they must have a lot of data about this already.

Quote
"no audible problems" vs. "problems introduced by copyprotection are audible".


This is more difficult. We don't know if all CDS200 CDs have clicks. This must be directly related to the error correction ability (E22, E32, E42) and the error concealment ability. I think that if we analyze these two parameters and compare them with the clicks on a given CDS200 CD, we can predict the clicks that this CD will give on another drive, knowing its abilities. Possible differences may come from different CDS200 versions. It is possible that they changed the mastered errors so as to follow the improvement in error correction in drives.

Quote
So this seems to be useful. How can it be performed for a test thread like this? Wave subtraction -> zoom in -> post image(s)? Or ask someone to write a program that does this automatically? And how do we interpret the error patterns reliably / reproducibly?


I don't think a program can reliably analyze the patterns. Well, it could - it is much easier than performing OCR - but it would still be difficult to program... and if an unexpected pattern appears, we should analyze it ourselves.
The images, or wav files (much smaller), should be posted, because the models of error correction I considered come from indirect sources, and I'm not completely sure of them (for example, I don't know whether errors flagged at the C2 stage were flagged by the EFM decoder or the C1 stage). Actually, I deduced the strategies used from the patterns I saw. Leaving the wavs available would allow further analysis and discussion.
The analysis itself is not difficult. I can post examples to show which patterns come from what strategy... this is the missing part of my webpage. Appendix 2 just shows prototypes of patterns.

DAE quality

Reply #12
Quote
Quote
If I create 2 or more different DAE quality test CDs - how repeatable are the results?
They are not repeatable at all. The first DAEquality CD I created returned no errors on my Teac drive, or maybe one or two on the deepest scratch. The second one had so many errors that it crashed the drive.
So this - combined with JeanLuc's statement
Quote
Especially when you consider most drives' varying C2 accuracy values with different numbers of C2 errors ...
- would make creating a test thread for drive comparison completely useless, because the results will not be comparable. The only remaining thing that would then make some sense is comparison of error patterns.

But what if we find a way to create test CDs in a more reproducible way, e.g. using drills to make small holes (e.g. 1 mm, 2 mm, 3 mm) instead of painting a triangle? Or by using a special tool with defined pressure to create scratches?

Quote
Quote
"copy possible without (handling) problems" vs. "copy possible with tweaks (detect TOC manually etc.)" vs. "copy impossible"
We should look at CDFreaks first, they must have a lot of data about this already.
Of course, for some drives information about some details is already available, but one important purpose of such a comparison thread would be to have all the information about 1 drive bundled in 1 post.

Quote
We don't know if all CDS200 CDs have clicks. This must be directly related to the error correction ability (E22, E32, E42) and the error concealment ability. I think that if we analyze these two parameters and compare them with the clicks on a given CDS200 CD, we can predict the clicks that this CD will give on another drive, knowing its abilities. Possible differences may come from different CDS200 versions. It is possible that they changed the mastered errors so as to follow the improvement in error correction in drives.
Sounds interesting. So that's one more reason to include / focus on error correction/concealment ability.
Let's suppose that rain washes out a picnic. Who is feeling negative? The rain? Or YOU? What's causing the negative feeling? The rain or your reaction? - Anthony De Mello

DAE quality

Reply #13
The ABEX test discs used by cdrinfo.com are industrially produced and thus should lead to comparable results with a high level of confidence (these discs are veeeery expensive, though) ... any home made solution will definitely not ... unless you own every DAE-capable drive available

Though I admit that the idea of drilling holes or applying scratches at predetermined positions with predetermined force would be a good way to start industrial production ...
The name was Plex The Ripper, not Jack The Ripper

DAE quality

Reply #14
Quote
But what if we find a way to create test CDs in a more reproducable way, e.g. using drills to make small holes (e.g. 1 mm, 2mm, 3mm) instead of painting a triangle?


Drilling a CD must be avoided in all cases and at all costs. The fast spinning speed, combined with the fragility of the CD near the hole and the completely out-of-balance shape, would make it explode in the drive.

Quote
So this - combined with JeanLuc's statement [...]- would make creating a test thread for drive comparison completely useless, because results will not be comparable.


These are two different statements.
I think that comparing the amount of errors obtained by different users with different test CDs is completely useless. I even got significantly different results with the same drive / same test CD, ripping twice (bottom of http://perso.numericable.fr/~laguill2/dae/...ldaecompare.htm )

But giving C2 accuracy reports can be interesting. If we look at all the results obtained, we can see that the big differences in C2 accuracy occurred only on very severe problems, with big dropouts in the wav, far beyond the point of unrecoverability. There is no example of C2 accuracy inconsistencies between tests for CDs that can be recovered. Thus some C2 accuracy measurements can be interesting for secure ripping, granted two conditions: the test CD must not be too hard for the drive, and the results won't be valid for CDs with big problems, be it a single big scratch.

DAE quality

Reply #15
C2 accuracy will always change a bit with the input error rate; it's a statistical (documented) property of the CIRC. Now if you see unexpected variations in C2 accuracy, it means your drive is behaving strangely.
As for your tests, comparing the error rates wouldn't make much sense, but you should be able to find out the correction strategy and C2 accuracy in a consistent way (if you're careful about making your test disc and interpreting the results).

DAE quality

Reply #16
Quote
Now if you see unexpected
variations in C2 accuracy, it means your drive is behaving strangely.

Here's the main problem that I got:

[image: C2 error graph from "DAEquality" - image missing]

It shows the number of returned C2 errors (green) and the numbers of errors that were not flagged by the drive (blue) versus time on the CD.
Horizontal scale unit : 1 minute
Vertical scale unit : x2 (logarithmic); top = 176400 wrong bytes per second.

The left part and the beginning of the right part are consistent: the blue curve is 6 to 7 lines below the green one, so the C2 accuracy is near 99 %.
But at the end of the CD, the drive went mad, and the C2 accuracy dropped (blue curve = green curve).

As long as we can tell that we're in a case where the drive operates properly, the C2 accuracy information is useful. The big problem is that with a test CD such as the DAE quality one, it is not possible. Look:

[image: second C2 error graph - image missing]

Among the five peaks on the right, we can tell that the drive operated properly in the first one (blue curve well below the green one), and improperly in the four others (blue curve on top). Thus, for a given test CD, if all peaks look the same, we can't tell whether the drive is stressed or operates properly.
Even more annoying is the cluster on the left. There were some bursts of errors where the drive didn't report C2 properly. The last blue peak shows a lack of C2 accuracy where the reported C2 was less than 6 lines below the top, that is, less than 176400/(2^6) ≈ 2756 per second. If we add the undetected errors, which are a bit fewer, we don't exceed 6000 errors per second here. This behaviour at such an error rate makes the C2 accuracy analysis very difficult, unless we use a test CD like mine, with a progressive error rate from 0 to 176400 per second. But this was a decaying CDR that must be dead by now.
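The graph arithmetic used here can be sketched in a few lines, assuming the stated convention (top of the graph = 176400 wrong bytes per second, one factor of 2 per horizontal line); the function name is hypothetical:

```python
# Converting a position on the C2 graphs to an error rate, assuming
# top = 176400 wrong bytes/s (44100 Hz * 2 channels * 2 bytes) and a
# factor of 2 per horizontal line (logarithmic scale).
TOP_RATE = 176400

def rate_at(lines_below_top: float) -> float:
    """Error rate (bytes/s) of a curve N lines below the top of the graph."""
    return TOP_RATE / 2 ** lines_below_top

print(rate_at(6))  # 2756.25 wrong bytes per second
print(rate_at(0))  # 176400: every byte wrong
```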

DAE quality

Reply #17
Hi Pio,

Why do you think the C2 accuracy could be so different between the black and the scratched regions of your second disc? Why would it be so different between the first scratched region and the next ones? This is against CIRC theory, and when I see such strange results, common sense makes me first question the software tool of a single student before blaming dozens of engineers from a drive manufacturer. You already know what I think of Andre Wiethoff, so as far as I'm concerned, until the source code of this tool is available for review, I consider its results meaningless. And if you plan to start your own DAE quality project, I really suggest you start from an open-source burning program, add what you need (which is not much), and then let others review your code/strategy and try it.

DAE quality

Reply #18
I've found the problem with the C2 accuracy reports!
The analysis program reports bursts of C2 inaccuracies when the drive misplaces some parts of the audio. If some audio is misplaced, it won't be flagged wrong, but it will be different from the reference file if the comparison is done without correction for skips.
Andre told me that the skip detection routines are different between EAC and Analyze.exe, and that one may detect skips unnoticed by the other.

Analyze.exe detected 93 misplaced chunks of audio in the Sony DDU1621 rip (top graph in the post above), and 34 in the Memorex DVDMaxx1648 rip (bottom graph in the post above).
We can see in the log files for the other rips ( http://perso.numericable.fr/laguill2/dae/a...l/astraldae.htm ) that there were zero detected skips when the C2 accuracy is consistent ( http://perso.numericable.fr/laguill2/dae/a...al/astralc2.htm )

There are two possibilities:
1 - When an extraction has skips, the program doesn't detect them all, and mistakes the remaining misplaced unflagged data for undetected errors.
2 - The program doesn't take skipping into account for error counting, and counts all correct but misplaced data as errors.
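Possibility 2 is easy to illustrate with a toy sketch (hypothetical data, not the actual Analyze.exe logic): a rip that drops one sector and later repeats another to resynchronize is byte-identical to the reference almost everywhere, yet a purely positional comparison flags every byte of the shifted region as an error:

```python
# Toy illustration (hypothetical data): a skip that shifts audio by one sector
# makes a naive positional comparison count correct-but-misplaced data as errors.
SECTOR = 2352  # bytes per CD audio sector

reference = bytes(range(256)) * 100  # stand-in for the reference rip

# Simulated drive behaviour: skip sector 1, then repeat sector 4 to resync.
rip = reference[:SECTOR] + reference[2 * SECTOR:5 * SECTOR] + reference[4 * SECTOR:]
assert len(rip) == len(reference)

# Positional comparison with no skip correction:
naive_errors = sum(a != b for a, b in zip(reference, rip))
print(naive_errors)  # 7056 = 3 sectors' worth of "errors" on mostly-correct audio
```

Almost all of those mismatching bytes are valid audio, just offset by one sector, which is exactly how unflagged-but-different data could be mistaken for undetected C2 errors.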


Besides, when I did my analysis of C2 accuracy, I checked the results of Andre's analyze.exe program, which computes the number of errors per second and draws the graphs, against a program of mine, in VB, that just computes the number of errors per second from the wav and the C2 file. It doesn't take skipping into account and just performs a sequential check of all data.
I've pointed out two minor bugs that were corrected (actually, Andre forgot to put the corrected version of his program (analyze.exe 1.4) online, but I can send it to you if you need it). Now the C2 / C2-on-right-data / errors-not-flagged counts match exactly between our programs for samples without skips. You can see the source of mine here, or, if you've got Visual Basic installed, get all the files here (1.8 MB, because of samples).
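The VB source isn't reproduced here, but the sequential check described is simple. A hypothetical Python equivalent (the function name and the one-flag-per-byte C2 layout are assumptions, not Analyze.exe's actual file format):

```python
# Hypothetical sketch of the sequential per-second check described above:
# compare ripped audio to a reference, byte by byte, against C2 flags.
def c2_stats(ripped, reference, c2_flags, bytes_per_second=176400):
    """Per-second tuples: (flagged, flagged_on_correct_data, wrong_but_unflagged)."""
    assert len(ripped) == len(reference) == len(c2_flags)
    stats = []
    for start in range(0, len(ripped), bytes_per_second):
        flagged = flagged_on_good = missed = 0
        for i in range(start, min(start + bytes_per_second, len(ripped))):
            wrong = ripped[i] != reference[i]
            if c2_flags[i]:
                flagged += 1
                if not wrong:
                    flagged_on_good += 1  # C2 reported on right data
            elif wrong:
                missed += 1               # error not flagged: C2 accuracy failure
        stats.append((flagged, flagged_on_good, missed))
    return stats
```

For instance, `c2_stats(b'\x01\x02\x09', b'\x01\x02\x04', b'\x01\x00\x00', bytes_per_second=3)` reports one flagged byte (on correct data) and one missed error. Like the VB tool, this deliberately ignores skips, so misplaced-but-correct audio would show up in the `missed` column.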

DAE quality

Reply #19
Great. So basically, thanks to Andre's ego, all C2 accuracy results published so far with his DAE tool are not to be trusted (which, if you remember, I had suggested to him on his board more than a year ago). Anyway, I don't know VB, but I'll try to understand your code and give you some feedback on it.

DAE quality

Reply #20
Quote
Great. So basically, thanks to Andre's ego, all C2 accuracy results published so far with his DAE tool are not to be trusted (which, if you remember, I had suggested to him on his board more than a year ago).


There have been very few of them... most are mine, and one or two other people used it and published something as well.

Actually, the roots of the C2 accuracy doubts were these:
First, Andre said so. He certainly found some drives that claimed to support it but didn't. He also said that some drives supported it, but badly. In fact, he might have stumbled on the same problem that we are facing here. For example, my Teac E540 firmware 3.0 drive is buggy (Feurio says that firmware 1.* versions are slow but OK, and that firmware 3.0 is fast but bad at reading audio). Ripping some CDs, it sometimes skips one whole sector, goes on ripping, and later repeats another sector and thus comes back in sync. Analyzing this data, someone not realizing that the data had been shifted by one sector for a while would simply conclude that it is wrong, because it is different from the reference file, and that the C2 support is broken, because the drive didn't report any error.

After this, when C2 support was added to EAC, most people reacted this way: "CRC OK" was always believed to be the ultimate proof of a perfect rip (because EAC was based on the assumption that if an error occurs, the data will automatically be different in a second rip... Fortunately, Bobhere showed that this was not always the case).

Enabling C2, there were sometimes different CRCs when "no errors occurred".
Disabling C2 in this case often resulted in uncorrectable errors.

Several reports of this kind managed to convince people that disabling the use of C2 was a good thing in order to get perfect error detection.

First problem: we can't know if the CRC mismatch comes from an error that was not flagged because of a C2 failure (this was assumed), or because the rereading process got the same error several times and wrongly assumed that the error well detected by C2 had been reread properly (thus counting the data as OK while it was still C2-flagged as wrong!).

Actually, from version 0.9beta1 to 0.9beta4, EAC featured a very good option that solved the above problem! Checking the second C2 box, the correction decision was based not on the similarity of rereadings, but on the C2 flag!
There I was at fault. As it had been advertised as allowing protected audio CDs to be ripped properly, I assumed that it interpolated the uncorrectable errors. This was the thing to do for protected CDs. The name was clear enough, though: "also use C2 for error correction". But my word was holy. Nobody questioned it (some did, but accepted my correction). I even posted that it didn't work well (of course, it didn't do what I expected it to!), and since I was nearly the only one to report on it, it was finally removed! Talk about confusion!
Worse: I reinstalled EAC 0.9 beta 4, and now I can't get the option to work properly (see http://www.hydrogenaudio.org/forums/index....howtopic=15571& . For clarification, with a CDS200 protected CD, I consider it "working" if I get full fast lines of error correction with "sync error" displayed, and "not working" when I get slow error correction, but one line only, and no error reported. Same CRC in all cases.)

Second problem: long after EAC became popular, the first and only test I know of about its real accuracy (without the use of C2) was done by Tigre ( http://www.hydrogenaudio.org/forums/index.php?showtopic=9049 ). Analyzing his results (10641 errors in file 1 + file 2, 1930 consistent ones, 22 undetected for sure, even though the comparisons are made between 27-sector blocks), we get:
10641 - 1930 = 8711 inconsistent errors, thus about
8711 / 2 = 4355.5 per file, thus
4355.5 + 1930 = 6285.5 errors per file, thus a detection accuracy between
(6285.5 - 1930) / 6285.5 = 69 % and
(6285.5 - 22) / 6285.5 = 99.6 %
...since the number of unknown errors remaining after detection and correction is between 22 and 1930.
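The arithmetic above can be replayed directly (same figures from Tigre's test, no new data):

```python
# Replaying the detection-accuracy bounds computed above from Tigre's figures.
total_errors = 10641       # errors in file 1 + file 2
consistent = 1930          # identical in both files
surely_undetected = 22     # undetected for sure

inconsistent = total_errors - consistent    # 8711
per_file = inconsistent / 2 + consistent    # 6285.5 errors per file
worst = (per_file - consistent) / per_file  # if all consistent errors were missed
best = (per_file - surely_undetected) / per_file

print(f"detection accuracy between {worst:.1%} and {best:.1%}")
```

This reproduces the 69 % and 99.6 % bounds.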

OK, but what about CRCs in this case? A consistent error is some data that is completely destroyed on the CD, a neat destruction that makes it never readable while the surrounding data is always readable. The errors always being there, the returned data is always the same, since interpolation always occurs at the same place. In this case, when EAC says "no errors occurred", then the whole CD is nearly consistent; at least it finally turned out consistent 8 times out of 16 for data that might have been inconsistent at the first reading, otherwise EAC would say "there were errors".
In this case, it is not uncommon to get CRC OK. Not always, because some inconsistent errors can occur, but quite often, since in the end all data turn out consistent more than 8 times out of 16.
Thus CRC OK is not proof that no errors occurred when C2 is not checked; on the contrary, it is likely to happen even if some data are unrecoverable.
How likely? To know, Tigre's test must be extended to 16 rips of the same CD in order to simulate EAC's error correction as well as its error detection.

Why does disabling C2 make the errors unrecoverable (inconsistent)? Maybe the fact of going back and reading the same block again increases the amount of errors compared to a continuous reading. Thus, the number of errors increasing, a part featuring only a few consistent ones gets inconsistent ones as well (remember that according to Tigre's test, inconsistent errors seem more common than consistent ones, thus a range featuring only consistent errors is rare and must have very few of them).


Besides all this, there are CDRinfo.com's C2 accuracy tests, made with ABEX test CDs and the Nero CDSpeed beta. The accuracy results cover the whole range from zero to 100 %. There too, we don't know if their problems could come from skipping during playback, which could explain many C2 "failures".


Quote
Anyway, I don't know VB but I'll try to understand your code and
give you some feedback on it.


Feel free to check it, but you know, it's nothing more than a tool made to check Analyze.exe's results, which I didn't understand when I looked at the wav files (actually, part of my problems came from a defective RAM stick: when I subtracted (Wav 1 - Wav 2) in SoundForge, the result was consistent and different from -(Wav 2 - Wav 1), also consistent, and Samplitude didn't have the problem... unbelievable where the effects of RAM bugs can be found!!!).
It's not a real program (it only accepts RAW audio).



Edit : added (without the use of C2)

DAE quality

Reply #21
Oof! Maybe the above post is too much for many people... here's an abstract:

Spath has good reasons to assume that most drives report C2 perfectly (within their CIRC implementation limits, which may explain the accuracies between 99 and 100 % ...?)
I analyzed all the tests we have done so far, and none proves him wrong! So he may well be right.