Anyone tried to repair clipping with interpolation? 2007-04-21 17:36:44

I've lately been thinking about the problem of clipping in 44.1 kHz/16-bit PCM audio. I've definitely noticed that contemporary recordings sound bad, and digitizing some old cassette tapes has also had me thinking about the mastering process.

It seems to me that it might be possible to undo ~some~ of the damage that clipping does to an audio recording. The idea is to go to a 20- or 24-bit representation and try to interpolate something reasonable for the clipped areas. Yes, information is permanently lost, but we can certainly predict something better than a 0x0000 or 0xFFFF value for those points.

One scheme I can imagine is dividing the signal into time blocks and estimating the Fourier components of each block, throwing out the clipped samples. This is an underdetermined problem -- more than one set of Fourier coefficients would reproduce the surviving, unclipped points. I'd select one of these with a maximum-entropy method -- basically, by maximizing some function that estimates the 'plausibility' of a solution, say by making the spectral envelope smooth.

I'd also imagine that something similar could be done using the linear predictive methods that are used in many lossless encoding schemes.

In the end we'd get a signal that sounds a bit better and that responds better to future signal processing such as lossy compression, phase rotation, and so on.

Is this a crazy idea -- has anybody tried it before?
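To make the block-wise Fourier idea concrete, here's a toy sketch in Python/NumPy. Instead of the maximum-entropy selection described above, it resolves the underdetermined system the simplest possible way -- a plain least-squares fit of a truncated Fourier basis to the surviving samples -- and the block size, clip threshold, and number of basis frequencies are all made-up parameters, not anything from the post:

```python
import numpy as np

def declip_block(x, clip_level, max_freq=16):
    """Replace clipped samples in one block by fitting a truncated
    Fourier series to the surviving samples.  Plain least squares
    stands in here for the maximum-entropy selection."""
    n = len(x)
    clipped = np.abs(x) >= clip_level
    # Real Fourier basis: DC plus cos/sin pairs up to max_freq cycles/block
    t = np.arange(n)
    cols = [np.ones(n)]
    for k in range(1, max_freq + 1):
        cols.append(np.cos(2 * np.pi * k * t / n))
        cols.append(np.sin(2 * np.pi * k * t / n))
    basis = np.column_stack(cols)
    # Fit only to the unclipped samples, then resynthesize the clipped ones
    coeffs, *_ = np.linalg.lstsq(basis[~clipped], x[~clipped], rcond=None)
    y = x.copy()
    y[clipped] = basis[clipped] @ coeffs
    return y

# Demo: two sines, hard-clipped at 0.9, then repaired
t = np.arange(256)
clean = 0.8 * np.sin(2 * np.pi * 3 * t / 256) + 0.4 * np.sin(2 * np.pi * 7 * t / 256)
damaged = np.clip(clean, -0.9, 0.9)
repaired = declip_block(damaged, clip_level=0.9)
```

The demo signal lies exactly in the basis span, so recovery is essentially perfect there; real music isn't block-periodic, which is exactly why some regularizer (like the smooth-envelope criterion) would be needed to pick among solutions.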
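And a rough sketch of the linear-predictive direction, in the same spirit: fit a short autoregressive (LPC) predictor by least squares to the samples just before each clipped run, then extrapolate forward across the run. The order, context length, and guard values are arbitrary choices of mine, and a serious version would also predict backward from the far side and cross-fade:

```python
import numpy as np

def lpc_fill(x, clipped, order=8):
    """Fill each run of clipped samples by forward prediction with an
    AR (LPC) model fitted by least squares to the context just before
    the run.  Runs too close to the start are left untouched."""
    y = x.astype(float)
    n = len(y)
    i = 0
    while i < n:
        if not clipped[i]:
            i += 1
            continue
        j = i
        while j < n and clipped[j]:
            j += 1                        # clipped run is [i, j)
        ctx = y[max(0, i - 8 * order):i]  # training context before the run
        if len(ctx) > 2 * order:
            # Least-squares AR fit: ctx[t] ~ a . ctx[t-1 .. t-order]
            rows = np.array([ctx[t - order:t][::-1]
                             for t in range(order, len(ctx))])
            a, *_ = np.linalg.lstsq(rows, ctx[order:], rcond=None)
            for t in range(i, j):         # extrapolate across the run
                y[t] = y[t - order:t][::-1] @ a
        i = j
    return y

# Demo: a sine that overshoots full scale, clipped at 1.0, then filled
t = np.arange(400)
clean = 1.1 * np.sin(2 * np.pi * 3 * t / 400)
damaged = np.clip(clean, -1.0, 1.0)
restored = lpc_fill(damaged, np.abs(damaged) >= 1.0)
```

This is essentially what a lossless codec's predictor does, just run open-loop over the missing span instead of being corrected by a residual each sample -- which is also why long clipped runs would drift without the backward pass.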