Well, if you divide the size of the smallest quantization step (normalized to 1) by the maximum slew rate of the input signal, you directly get the jitter level at which the timing error equals one step.
Ok, so the maximum slew rate of a full-scale 22kHz sine quantized to 16 bits is about 4.5e9 quantization steps per second (peak amplitude 2^15 steps times 2*pi*f). Since rounding only flips a code once the error exceeds half a step, peak jitter below ~0.1ns will not cause a bit to flip. Similarly, about 0.5ps for 24 bits. Does that sound sensible?
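As a sanity check, here's that threshold worked out in a few lines of Python. The model assumptions are mine: a full-scale sine with peak amplitude 2^(bits-1) steps, and a half-step error as the point where rounding can flip a code. It lands on the same ~0.1ns / ~0.5ps figures.

```python
import math

def jitter_threshold(bits, freq_hz):
    """Peak jitter (seconds) at which a full-scale sine at freq_hz
    produces half a quantization step of error -- the point where
    rounding can start to flip the lowest code."""
    # Full-scale sine: amplitude 2^(bits-1) steps, max slew = A * 2*pi*f
    max_slew = 2 ** (bits - 1) * 2 * math.pi * freq_hz  # steps/second
    return 0.5 / max_slew

print(jitter_threshold(16, 22000))  # roughly 1.1e-10 s (~0.1 ns)
print(jitter_threshold(24, 22000))  # roughly 4.3e-13 s (~0.5 ps)
```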
I put together some quick-and-dirty MATLAB code to attempt to simulate the effects of jitter in an ADC. It has a whole pile of shortcomings, but it still gives some interesting results. The code should work unaltered in GNU Octave 2 and 3.
% Approximately simulate the effects of jitter on a signal
% x         - Original (unquantized) samples
% fs        - Sample rate (Hz)
% rmsjitter - RMS jitter value (seconds)
% bits      - Number of bits
% jy        - Quantized signal with simulated jitter
% qy        - Quantized original signal
function [jy, qy] = simulatejitter(x, fs, rmsjitter, bits)
  lx = length(x);
  jitter = randn(1, lx)*rmsjitter*fs;  % Jitter in samples, white Gaussian
  y = zeros(size(x));
  for ii = 1:lx
    for jj = 1:lx
      % Sinc interpolation: resample x at the jittered instant ii + jitter(ii)
      y(ii) = y(ii) + x(jj)*sinc((ii - jj) + jitter(ii));
    end;
  end;
  % Quantize to the given number of bits
  jy = quantize(y, bits);
  qy = quantize(x, bits);
end

% Round to 2^(bits-1) steps per unit, i.e. full scale is [-1, 1]
function q = quantize(s, bits)
  step = 2^(bits - 1);
  q = round(s*step)/step;
end