
Curious about read and write offsets.

In short: Aren't the chunks of bits fed to the error detection and correction codes and modulation (or at least some of them) different in size from a single stereo sample (32 bits)? If this is correct, how can the CD drive's electronics fudge the alignment of the signal coming from the laser by arbitrary amounts of samples?

Can someone clear this up for me, or point me to a good explanation?

Re: Curious about read and write offsets.

Reply #1
Not exactly sure what you mean. Each CDDA frame is equal in size and consists of six complete 16-bit stereo samples: 24 bytes of audio (two bytes × two channels × six samples), eight CIRC error-correction bytes, and one subcode byte, for 33 bytes in total.
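To make the arithmetic concrete, here's a quick sketch that just recomputes the numbers above (plus the sector figures from the linked Wikipedia pages):

```python
# Back-of-the-envelope check of the CDDA frame layout described above:
# 6 stereo 16-bit samples per frame, 8 CIRC parity bytes, 1 subcode byte.

SAMPLES_PER_FRAME = 6
BYTES_PER_SAMPLE = 2 * 2                    # 16 bits x 2 channels

audio_bytes = SAMPLES_PER_FRAME * BYTES_PER_SAMPLE   # 24 audio bytes
frame_bytes = audio_bytes + 8 + 1                    # + CIRC parity + subcode

print(audio_bytes, frame_bytes)             # 24 33

# One sector is 98 frames, so it carries 98 x 6 = 588 samples,
# which is exactly 1/75 of a second at 44100 Hz.
samples_per_sector = 98 * SAMPLES_PER_FRAME
print(samples_per_sector, 44100 / samples_per_sector)   # 588 75.0
```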

https://en.wikipedia.org/wiki/Compact_Disc_Digital_Audio#Frames_and_timecode_frames
https://en.wikipedia.org/wiki/Track_(CD)#Sector_structure

If the checksum doesn't check out, the entire frame is rejected and usually re-read a couple of times before the system gives up on the media and stops playing, or skips to the next frame, where the process begins anew. The error correction on a CD is interleaved: it is spread out across the track, so that bursts of errors huddled in one section of the surface (like a deeper, wider scratch) can still be corrected.
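A toy illustration of why that interleaving helps: a run of damaged bytes on the disc turns into isolated single-byte errors after de-interleaving, which a per-block code can then correct. (The real CIRC delay-line interleaver is far more elaborate; this is just a block interleaver to show the principle.)

```python
# Write row by row, read column by column: a simple block interleaver.
def interleave(data, depth):
    rows = [data[i:i + depth] for i in range(0, len(data), depth)]
    return bytes(row[c] for c in range(depth) for row in rows)

def deinterleave(data, depth):
    rows = len(data) // depth
    cols = [data[i:i + rows] for i in range(0, len(data), rows)]
    return bytes(col[r] for r in range(rows) for col in cols)

original = bytes(range(64))
on_disc = bytearray(interleave(original, 8))
on_disc[10:14] = b'\xff' * 4                # a 4-byte "scratch" burst

recovered = deinterleave(bytes(on_disc), 8)
errors = [i for i in range(64) if recovered[i] != original[i]]

# The burst has been scattered into widely spaced single errors:
print(errors)                               # [17, 25, 33, 41]
```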

The process is explained in detail on this German Wikipedia page: https://de.wikipedia.org/wiki/Fehlerkorrekturverfahren#Compact_Disc_(CD) <--- unfortunately, this section doesn't exist on the English page.

The application of CIRC in audio CDs is explained very well here: https://de.wikipedia.org/wiki/Cross-interleaved_Reed-Solomon_Code Unfortunately, the English page doesn't contain this information either, so Google Translate to the rescue!

Re: Curious about read and write offsets.

Reply #2
I found this on the EAC website:
Quote
------------------------------------------------------------------------------------------------------------------
What is a read or write offset? When do they occur?
------------------------------------------------------------------------------------------------------------------
During extraction or writing of the audio data, nearly all CD-ROM/CD-R drives will add an offset to the position. This is usually around 500-700 audio samples (ca. 1/75 second) on reading and around 0-18 samples on writing (ca. 1/1000 second). So if a program queries a specific sector, it will not receive exactly that sector, but shifted with the number of samples of the offset.

This doesn't affect error correction but it does affect AccurateRip where you are comparing the whole ripped file with another whole ripped file.

Hello
 Hello

Those are the same words with a different starting point/offset. It's exactly the same if you read the word (or play the music), but the "data" doesn't match.
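In code, the same analogy looks like this: two rips of the same audio taken with different drive offsets disagree sample-for-sample, yet one is just a shifted copy of the other. (The offset values below are made up for illustration.)

```python
audio = list(range(1000))           # pretend master audio stream (sample values)

def rip(stream, start, length, offset):
    # A drive with a read offset returns data shifted by `offset` samples.
    return stream[start + offset : start + offset + length]

rip_a = rip(audio, 100, 500, offset=0)
rip_b = rip(audio, 100, 500, offset=6)

print(rip_a == rip_b)               # False: the files differ byte-for-byte
print(rip_a[6:] == rip_b[:-6])      # True: same stream, shifted by 6 samples
```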

Re: Curious about read and write offsets.

Reply #3
Well, yes, I understand the concept, but that doesn't address my question at all.
My confusion is not about what an offset is and why it would throw off comparison schemes like AccurateRip and CUETools. I understand that well enough.

My question is how CD drives can actually fudge data positions like that (resulting in these offsets), considering all the data integrity schemes that EVEN plain Red Book audio has, particularly when it comes to alignment. It is my understanding that, for these discrete sample-based offsets to occur, each stereo sample (32 bits) on the disc should be fully readable on its own. If it is instead grouped with other samples' data in a bigger block of encoded data (EFM, Reed-Solomon or whatever it is), then even if the drive were slightly misaligned, the data integrity scheme would give away the alignment anyway. I mean, you can't read an encoded block of data at all if you start reading it halfway through and continue until halfway through the next block: the block would not be decodable and no samples would be extracted.
I'm asking for someone more knowledgeable in the matter than me to tell me what I'm getting wrong or what I'm missing, because I just can't understand this with the information I have.

Obviously the phenomenon occurs, so that's out of the question. I just want to know how come it is even possible.

Re: Curious about read and write offsets.

Reply #4
You seem to be confusing audio CDs with hard disks. Audio CDs do not return individual samples on demand; they return a stream of audio data. Nothing is fudged, they just might send you audio a few ms later or earlier than you expected. Any fudge is in converting that to a digital file and forcing it to start at a particular point. The error detection and correction is relative to the rest of the stream because there is no absolute positioning. Disc alignment is irrelevant because there are no absolute positions (i.e. sectors) like on a hard drive. When you think you're positioning while reading a CD, you're actually guessing and then seeking through the stream, which is one reason CDs are so slow to seek.
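That guess-and-refine seeking can be sketched roughly like this, assuming a hypothetical read_subcode_timecode() helper that reports the timecode the pickup actually lands on (real drive firmware does something along these lines: jump, read the subcode, correct, repeat):

```python
SKEW = 37       # unknown-to-the-seeker mechanical landing error (invented value)

def read_subcode_timecode(position):
    # Simulated: what the pickup reads if it lands at `position`.
    return position + SKEW

def seek(target, tolerance=2):
    guess = target
    for _ in range(10):                 # bounded number of retries
        landed = read_subcode_timecode(guess)
        error = landed - target
        if abs(error) <= tolerance:
            return guess
        guess -= error                  # jump back/forward and read again
    return guess

# After a couple of hops the pickup lands within tolerance of the target:
print(read_subcode_timecode(seek(5000)))    # 5000
```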

Re: Curious about read and write offsets.

Reply #5
I didn't say or imply at any point that you can address discrete samples on a CD. If anything, I'm dubious that it would be possible. In fact, I don't think you can even DECODE a single sample off the disc all by itself without reading the whole EDC block that contains it, which is, as you can see, my whole point here. If you can't even READ it by itself, but need the whole block, how come you can be misaligned by arbitrary sample offsets?

https://en.wikipedia.org/wiki/Compact_Disc_Digital_Audio#Data_encoding

Clearly, data on disc is organized recursively in blocks that can only be properly decoded as a whole, so how come none of that can be used to hint at the alignment, and we get these arbitrary sample offsets? At some step of the whole process of reading a CD, some component has to be able to see the start and the end of these blocks, or it couldn't decode them. Are you telling me that none of that is even hinted at down the line to other steps in the process? How can the drive even start reading in the middle of one of these "channel data frames"? There are six samples in each, and then these are clustered together in bigger blocks with another encoding. Can you start reading these at an arbitrary point in the middle too? Don't you need to start reading them from the start and decode them in full, or they won't even pass the check?

[...] they just might send you audio a few ms later or earlier than you expected [...]
Is this right? That sounds more like ripping a CD with a standalone player connected through S/PDIF than using a DAE ripper on a computer. How does that even deal with reading speed changes, anyway?

Re: Curious about read and write offsets.

Reply #6
If you can't even READ it by itself, but you need the whole block, how come you can be misaligned by arbitraty sample offsets?
The audio data and the timing data are two interleaved streams, packed into frames. Each frame contains 32 bytes from the audio stream and one byte from the timing stream. The CD standard was designed so that after you read a frame from the disc, you can send these two streams to separate parts of the CD player, and never worry about how the two correspond to each other after that point. If you're just listening to the CD, it's no big deal if the processing of those two streams introduces an offset of a few milliseconds between them.

When you want to read a CD with a computer, you have to match up those two streams after decoding them. Since the audio CD standard allows both streams to have varying amounts of decoding delay, different CD drives can introduce different offsets between the two. And there's no requirement for the CD drive to decode an entire frame of audio at a time, either - it could use a decoder that outputs one sample at a time, which makes arbitrary offsets possible.
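A toy model of those two pipelines: audio and subcode come off the disc together, but each passes through its own decoding delay before the drive pairs them back up, and the difference between the two delays is the drive's read offset. (The delay values here are invented purely for illustration.)

```python
def drive_read(sector, audio_delay, subcode_delay, samples_per_sector=588):
    # The sector number comes from the subcode path; the audio paired with
    # it has been delayed by a different amount, so the returned window is
    # shifted by (audio_delay - subcode_delay) samples in the audio stream.
    offset = audio_delay - subcode_delay
    return sector * samples_per_sector + offset   # start index of the window

# Two drives with different internal pipelines reading the "same" sector:
start_a = drive_read(10, audio_delay=30, subcode_delay=24)
start_b = drive_read(10, audio_delay=12, subcode_delay=12)
print(start_a - start_b)    # 6: drive A returns audio 6 samples later than B
```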

(Some clever engineers have figured out the exact delays necessary to recombine the two streams with no offset. I recall reading somewhere that such drives are listed as a +6 offset in AccurateRip.)

Clearly, data on disc is organized recursively in blocks that can only be properly decoded as a whole,
Sort of.

Those six samples in each frame are actually spread across more than 100 consecutive frames by CIRC, which means you need to read quite a few frames from the CD before you can get any audio out of it. CIRC doesn't care about sector boundaries, either. (One sector is 98 frames.)
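A minimal delay-line interleaver in the spirit of CIRC's interleaving stage shows how that spreading works: byte lane i is delayed by i×D frames, so the bytes of one input frame come out scattered across (lanes−1)×D+1 consecutive output frames. (The real CIRC stage uses 28 lanes with D = 4, spreading a frame over 109 frames; this sketch uses small numbers so the effect is easy to see.)

```python
LANES, D = 4, 2     # toy values; CIRC itself uses 28 lanes with D = 4

def delay_line_interleave(frames):
    span = (LANES - 1) * D
    out = [[None] * LANES for _ in range(len(frames) + span)]
    for t, frame in enumerate(frames):
        for lane, byte in enumerate(frame):
            out[t + lane * D][lane] = byte   # lane i delayed by i*D frames
    return out

# Tag each byte with (frame number, lane) so we can trace where it lands.
frames = [[(t, lane) for lane in range(LANES)] for t in range(8)]
out = delay_line_interleave(frames)

# The four bytes of input frame 0 now appear in output frames 0, 2, 4 and 6:
print([i for i, f in enumerate(out)
       if any(b is not None and b[0] == 0 for b in f)])   # [0, 2, 4, 6]
```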

Re: Curious about read and write offsets.

Reply #7
@Octocontrabass
Thank you. It makes sense now.
So, if I'm getting it right, the conclusion is that it should technically be possible to figure out absolute positions from the data structures, but the hardware implementations in the drives aren't made that way because it's not needed for normal listening, which is what consumer drives are really intended for anyway. No exception is made for DAE either, which is, at best, guaranteed a consistent offset.
I guess there might be some super-expensive industrial-grade reader with this kind of low-level capability, intended for quality checks on pressed discs or something.

One last thing: I'm curious about that CIRC "spreading". Could you explain it in more detail, or point me to some explanation? I tried to find it after reading about it in that wiki article, but I couldn't find much.

Re: Curious about read and write offsets.

Reply #8
I'm not sure if it's the clearest explanation, but ECMA-130 annex C has some pretty nice diagrams that show how data from each frame is spread apart during the CIRC encoding/decoding process. The diagrams show it in terms of various different delays, based on how many frames the encoder/decoder must process before the byte input on one end comes out the other.

Re: Curious about read and write offsets.

Reply #9
@Octocontrabass
Thank you again.
Surely not light reading, but I guess that'll do.

 