
Topic: Click and/or noise removal -- should I dither afterwards?

  • Dario
Click and/or noise removal -- should I dither afterwards?
Hi guys,

As per the topic's title: if I have 16-bit files which I process with audio-editing software to remove clicks and/or noise, do they need to be dithered afterwards? I suppose they do.

Thank you.

  • AndyH-ha
Reply #1
No, any damage is already done. Dither is applied before, not after, and is generally handled automatically by the program doing the transform.

Click removal is local: it affects only a relatively few samples per click, so there isn't much opportunity for the systematic quantization errors that produce the kind of distortion dither is meant to overcome.

NR is applied more extensively, to the entire file, so dithering the transform might make a theoretical difference. However, if the source is LPs, or any earlier form of disc, the background noise is more than enough dither anyway.
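For what it's worth, the kind of dither being discussed is easy to sketch. A minimal, hypothetical illustration in Python (standard library only; the function names are my own, not from any particular editor) of reducing a sample to 16 bits with and without TPDF dither:

```python
import math
import random

def quantize_16bit(x, dither=True):
    """Quantize a float sample in [-1.0, 1.0) to a 16-bit integer.

    With TPDF dither (roughly +/-1 LSB of triangular-PDF noise added
    before rounding), the quantization error becomes benign noise
    instead of signal-correlated distortion."""
    scaled = x * 32768.0
    if dither:
        # TPDF: sum of two independent uniform [-0.5, 0.5) values
        scaled += random.uniform(-0.5, 0.5) + random.uniform(-0.5, 0.5)
    # round to nearest code and clip to the 16-bit range
    return max(-32768, min(32767, math.floor(scaled + 0.5)))
```

The point of the thread, of course, is that the noise-removal step itself is where any such dithering would matter, not a pass applied afterwards.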

Reply #2
What I would do is increase the bit depth to 32-bit (or whatever),
then denoise,
then dither while decreasing the bit depth back to 16-bit.

Or generally:
increase bit depth -> apply DSP effects -> decrease bit depth to normal, with dither
  • Last Edit: 30 July, 2013, 04:48:15 PM by extrabigmehdi
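That pipeline is simple enough to sketch. A hypothetical illustration in Python (standard library only; the `effect` callback here stands in for whatever denoiser or DSP you actually use):

```python
import random

def process_with_headroom(samples, effect):
    """Hypothetical pipeline: 16-bit ints -> floats ("higher bit
    depth") -> DSP -> TPDF-dithered 16-bit ints."""
    # 1. increase bit depth: 16-bit integers to floats in [-1.0, 1.0)
    floats = [s / 32768.0 for s in samples]
    # 2. apply dsp effects at the higher precision
    processed = [effect(x) for x in floats]
    # 3. decrease bit depth back to 16-bit, with TPDF dither
    out = []
    for x in processed:
        v = x * 32768.0
        v += random.uniform(-0.5, 0.5) + random.uniform(-0.5, 0.5)
        out.append(max(-32768, min(32767, round(v))))
    return out

# e.g. a trivial "effect": 6 dB of attenuation
halved = process_with_headroom([1000, -2000, 30000], lambda x: x * 0.5)
```

All the intermediate arithmetic happens in float, so rounding error only enters once, at the final dithered conversion.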

  • DVDdoug
Reply #3
Quote
Or generally:
increase bit depth -> apply DSP effects -> decrease bit depth to normal, with dither
That's probably unnecessary...
1. Most audio software works internally at 32-bit floating-point anyway.
2. It's unlikely that you are doing anything that would require dithering (or re-dithering).
3. It's very unlikely that you can hear the effects of dither at 16 bits.

Reply #4
Quote
Or generally:
increase bit depth -> apply DSP effects -> decrease bit depth to normal, with dither
That's probably unnecessary...
1. Most audio software works internally at 32-bit floating-point anyway.
2. It's unlikely that you are doing anything that would require dithering (or re-dithering).
3. It's very unlikely that you can hear the effects of dither at 16 bits.



Well, if you apply multiple DSP effects, you still accumulate error between each step.
Also, what's the point of commercial products like "iZotope MBIT+ Dither"?
I suppose the dither offered by iZotope would be better than the one built into the DSPs.
Why would Sony bother to include "iZotope MBIT+ Dither" in Sound Forge if there's no advantage?

  • AndyH-ha
Reply #5
For years I made the challenge whenever the topic came up:
Show me a sample of real music (not test signals) in which you can distinguish the difference between dithering and not dithering when reducing to 16 bit. That is the main purpose of dither: to eliminate quantization distortion when reducing the bit depth.
No one has yet provided a sample. If you can, it will be interesting to hear it.

Dithering was also done at recording time, when recording live in 16 bit (or any lesser bit depth). Was is the operative word: I don't know of any equipment now being made that applies dither to the input when recording, since 24-bit is generally available and doesn't need it. Of course, microphone preamps do provide some random noise by virtue of not being able to avoid doing so, particularly at high gain levels, but I don't know how much that counts.

Lack of dither might be audible in 16 bit if one can obtain a quiet enough recording environment, good enough equipment, and rather low-level source audio. There is always a difference in the data, dither vs. no dither, but hearing that difference is another matter.
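The difference in the data, at least, is easy to demonstrate. A hypothetical sketch in Python (standard library only, with values expressed in LSB units): a constant signal of 0.4 LSB simply vanishes when rounded without dither, while with TPDF dither its level survives in the average, smeared into noise:

```python
import random

def quantize(v, dither):
    """Round a value in LSB units to an integer code, optionally
    adding TPDF dither (two uniform [-0.5, 0.5) values) first."""
    if dither:
        v += random.uniform(-0.5, 0.5) + random.uniform(-0.5, 0.5)
    return round(v)

def mean_code(v, dither, trials=100000):
    """Average quantized output for a constant input of v LSBs."""
    return sum(quantize(v, dither) for _ in range(trials)) / trials

# Without dither, a 0.4 LSB input rounds to zero every single time --
# the signal is simply gone.  With dither, the quantizer is linearized
# on average and the 0.4 LSB level is preserved (as level plus noise).
```

Whether that difference is audible on real music at 16 bits is, as above, another matter entirely.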