However, as a consumer of EAC, I've bought into the idea of this Holy Grail: 'bit-perfect' audio extraction, or as close to it as possible. Isn't that why EAC was developed?
Is all existing offset data based on what Andre has done with EAC? Has any come directly from the manufacturers, which could prove or disprove the existing data? For my own archiving, can anyone confirm the claim that Andre's EAC offsets are -30 samples off? I'm admittedly no expert, but I'd love to hear whether there's any validity to what IpseDixit says.
Couldn't those 30 samples create problems if a disc is ripped and then burned many times over? I know it would probably be a stupid thing to do, but it seems analogous to the photocopy-of-a-photocopy experience.
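For a rough sense of scale, here's a back-of-the-envelope sketch, assuming each uncorrected rip+burn generation shifts the audio by the full 30 samples (how much is actually lost would depend on the drives' combined read and write offsets):

```python
# Back-of-the-envelope sketch: cumulative drift from repeated rip/burn cycles.
# Assumes each generation shifts the audio by a fixed 30 samples; this is an
# illustration of the claimed discrepancy, not measured behavior.

SAMPLE_RATE = 44_100    # CD audio samples per second
OFFSET_PER_GEN = 30     # the claimed 30-sample discrepancy

for generation in (1, 5, 10, 100):
    drift_samples = OFFSET_PER_GEN * generation
    drift_ms = drift_samples / SAMPLE_RATE * 1000
    print(f"after {generation:3d} generations: {drift_samples:5d} samples "
          f"({drift_ms:.2f} ms) of drift")
```

Even after 100 generations that's only about 68 ms of shift at the disc edges; the audio data itself isn't degraded, so it's nothing like analogue generation loss, though samples at the very start or end of the disc could be truncated on each pass.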
If all the databases are affected by the bug, which is basically EAC-specific, isn't this ugly for other ripping programs that rely on AccurateRip?
I wonder if there could just be a checkbox in the offset options, on by default, that adjusts the offset by 30 samples. That way the existing database can stay, old EACs would behave the same, and new versions of EAC could easily address this issue while still relying on the existing database.
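Something like this minimal sketch of the idea (the function name, the constant, and its sign are my assumptions for illustration, not actual EAC settings):

```python
# Sketch of the proposed checkbox. The +30 correction and its sign are
# assumptions; the real value would depend on which direction the existing
# database offsets are claimed to be off.

CORRECTION_SAMPLES = 30  # the claimed 30-sample discrepancy

def effective_read_offset(database_offset: int, apply_correction: bool = True) -> int:
    """Offset actually used for extraction.

    database_offset: the drive's value from the existing (EAC) offset database.
    apply_correction: the proposed checkbox, enabled by default in new versions.
    """
    return database_offset + CORRECTION_SAMPLES if apply_correction else database_offset

# Example: a drive listed at +6 in the existing database would rip at +36
# with the box ticked, and at the old +6 with it unticked.
print(effective_read_offset(6))         # 36
print(effective_read_offset(6, False))  # 6
```

That way the correction lives entirely in the software, and nobody has to resubmit or rebuild the existing offset data.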
You have 8 million tracks submitted to the database; they are not going to go away.
However, I understand what a huge problem it would be to implement a new correction now!
Basically, I see two camps developing here: one in support of the standard EAC reference offset, and the other a small group of purists who'll integrate the new offset into their extraction process. In any event, Andre's EAC offset is the standard, like it or not. It's the measure against which all drives are calibrated by EAC to produce identical results. None of that is going to change. However, there is now an alternative, a way to get results closer to "perfection" by breaking away from the old convention. Use whichever method you see fit. I can't fault anyone for going either way.