Anyone Ever Solved The Mystery of Why MJ's "Billie Jean" Fails DoP Playback?
2020-04-22 23:58:07
The title says it all. For those that don't know: the SACD of Michael Jackson's Thriller was mastered a little too hot for the Scarlet Book spec, and it's been known to cause problems. It always played fine in my players... then I switched to DSD DAC playback (SMSL M8). That's when the issues began. The biggest one was that the beginning of "Billie Jean" was "corrupted"; the DAC would lose sync. I ripped the disc once, I ripped it twice, I made sure it still played in my ancient player, I made sure my PS3 was still ripping properly, and I even bought the DSD files from HDTracks. The result was the same: around the second beat of "Billie Jean", the DAC would go nuts. Oddly enough, I could convert the tracks directly to PCM with no issue. I don't listen to the album often, so I ignored the problem.

Fast forward to last September. A lightning strike took out a lot of stuff... including my DAC. So I bought a newer DAC... not the one I ultimately kept, but a cheap stop-gap (Sabaj DA3... not a good choice). The biggest improvement was that the chips/drivers now supported native DSD bitstream playback instead of DoP packing, so I reconfigured everything for native DSD. Guess what I put on? Thriller. Guess what happened? It played. Flawlessly. No desync, no dropped samples; it played the entire album as effortlessly as my disc player. I reconfigured back to DoP and the problem got worse, with the track not even wanting to attempt to sync for the first few seconds. Flip back to native DSD... works flawlessly. The DAC I ultimately bought (SMSL SU-8) is the same basic architecture, and it behaved exactly like the Sabaj.

The thing is, I can't figure out why this would happen. All DoP does is pack the DSD data into the lower 16 bits of each 24-bit sample of a 176.4 kHz PCM stream, using the upper 8 bits as the alternating DoP marker. The chipset should be extracting the exact same DSD data out of the stream.
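To make the point concrete, here's a minimal sketch of DoP framing per the DoP spec: each 24-bit PCM frame carries two DSD bytes in its lower 16 bits, with the marker byte (alternating 0x05 / 0xFA) in the top 8 bits. The helper names here are mine, not from any particular library; the round trip shows why DoP extraction should be bit-perfect.

```python
# Sketch of DoP framing: 24-bit PCM frames at 176.4 kHz, marker byte in
# the top 8 bits (alternating 0x05 / 0xFA), two DSD bytes below it.
DOP_MARKERS = (0x05, 0xFA)

def pack_dop(dsd_bytes):
    """Pack a per-channel DSD byte stream into 24-bit DoP frame values."""
    frames = []
    for i in range(0, len(dsd_bytes) - 1, 2):
        marker = DOP_MARKERS[(i // 2) % 2]
        # Oldest DSD byte lands in bits 15..8, the next byte in bits 7..0.
        frames.append((marker << 16) | (dsd_bytes[i] << 8) | dsd_bytes[i + 1])
    return frames

def unpack_dop(frames):
    """Recover the DSD byte stream -- the same extraction a DAC performs."""
    out = bytearray()
    for f in frames:
        out.append((f >> 8) & 0xFF)
        out.append(f & 0xFF)
    return bytes(out)

data = bytes(range(8))
assert unpack_dop(pack_dop(data)) == data  # round trip is bit-perfect
```

Since the packing is lossless, whatever the DAC is choking on has to be in the marker detection or sync logic, not the payload itself.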
I've often wondered if there's just a pattern in the DSD data that's causing the chipset to get confused. I'm asking because I've done some searching and have only found people mentioning the problem, not explaining it. Before I spend hours in a hex editor comparing loads and loads of data, I thought I'd ask whether someone has already done this so I can save myself some effort.