
What is the pre-amp input window range for dynamic microphone voltages

First, thanks for your time!
I'd like to stay away from the decibel scale if at all possible; that is, stick to voltages.  Thanks!
Also, I'd like to ignore impedance.
Also, I'd like to ignore frequency response, noise, etc.
All I want to focus on is the voltage (pressure) at any given point in time, let's say at 48 kHz sampling.
(Let me know if you think the above is impossible or just plain dumb.)

A dynamic microphone outputs, let's say, a variable voltage between -x and y.  I still can't figure out, in all my research, what this range is: -2 mV to 2 mV?  -1 mV to 1 mV?  0.1?  I believe the number is SMALL!  I understand impedance is involved, etc., but I want to simplify as much as possible so the question can be answered as broadly as possible (if possible  :D  ).  Perhaps if you have a specific mic in mind, that would be helpful.  Anyway...

I assume when I raise or lower the analog gain on my ADC I cannot amplify EVERY voltage (change) input from the microphone equally.

Because of all that stuff above we can't talk about  :'( , I must pick a range to amplify.  The min and max I effectively choose will determine how much noise or clipping I might experience.

Any answers would be helpful.  Again, keep all this general.  Do most pre-amps take a window of 50% of the voltages, 60%, 90%?  That is, if we have the gain set at zero, I would assume we are taking X% of the voltages, from the microphone's maximum output voltage (or close to it) down to 0.  If we're at max gain, then we're starting as close to 0 as possible and working our way up.

(If we do use dBs, and the mic has 60 dB of sensitivity and our audio interface has 120 dB of dynamic range--wait, how does that make sense?  We shouldn't need a gain knob at all?)

Or let me put the question another way: how much of a dynamic microphone's voltage can an ADC accurately amplify at any given time?

Sorry for the confusing question. 

 







Re: What is the pre-amp input window range for dynamic microphone voltages

Reply #1
Quote
What is the pre-amp input window range for dynamic microphone voltages
It depends on the particular preamp.    Different preamps have different overload limits and different gains.

The most popular microphone of all time, the Shure SM57/58, puts out 1.6 mV at 1 Pa (94 dB SPL).  It's a dynamic mic, which is similar to a speaker: a coil, magnet, and diaphragm... an electrical generator.

Studio condenser mics have a built-in "head amp" which gets 48 V "phantom power" from the interface or preamp.  Condenser mics tend to put out 10 or 20 times more voltage than a dynamic mic.


Line level (the output from a preamp) is about 1 V, so gains of 100-1000 are common.
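As a rough back-of-the-envelope check, here is a minimal Python sketch; the 1.6 mV and 1 V figures are just the round numbers from above, not the spec of any particular preamp:

```python
import math

# Assumed round numbers from the discussion above, not a spec:
mic_voltage = 0.0016   # volts from an SM57/58-style dynamic mic at 94 dB SPL (1 Pa)
line_level = 1.0       # volts, a nominal "line level" target

gain_linear = line_level / mic_voltage
gain_db = 20 * math.log10(gain_linear)
print(f"Required gain: x{gain_linear:.0f} (about {gain_db:.0f} dB)")
# -> Required gain: x625 (about 56 dB)
```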

With an interface we don't know the sensitivity of the ADC or the actual gain or output-voltage from the preamp. 

Most audio  interfaces are optimized for condenser mics and they may not have enough gain for a dynamic mic, depending on how loud the sound is.

Quote
I assume when I raise or lower the analog gain on my ADC I cannot amplify EVERY voltage (change) input from the microphone equally.
An amplifier is simply a linear voltage multiplier.     Digital amplification/attenuation is also done as multiplication.  Each sample (44100 samples per second, etc.) is amplified by the same factor (greater than 1 for amplification and less than one for attenuation).
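A minimal sketch of that idea (made-up sample values, with full scale taken as ±1.0):

```python
# Each sample is multiplied by the same factor; nothing else happens.
samples = [0.02, -0.015, 0.5, -0.49, 0.001]   # hypothetical floating-point samples
gain_db = 12.0
factor = 10 ** (gain_db / 20)                 # +12 dB is a factor of about 3.98

amplified = [s * factor for s in samples]
# A fixed-point (integer) output would additionally be limited at full scale,
# so the 0.5 and -0.49 samples would clip to +1.0 and -1.0 after the boost:
clipped = [max(-1.0, min(1.0, s)) for s in amplified]
print(clipped)
```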

Once the data is digitized/quantized there are "steps".    At 16-bits the data goes from −32,768 to +32,767 so the steps are very small.    With 8 bits you can only count to 255* (bigger steps) and if you make an 8-bit file the low-resolution results in audible quantization noise.   24-bits can hold values from −8,388,608 to 8,388,607.

0 dBFS (zero decibels full-scale) is defined as the highest number you can "count to" with a given number of bits.  With floating-point representation, a numerical value of 1.0 represents 0 dBFS, and for all practical purposes there are no upper or lower limits (as long as you remain in the digital domain).  The numbers in a 24-bit file are bigger than those in a 16-bit file, but when you play the file, everything is automatically scaled to match the DAC (digital-to-analog converter).
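If it helps to see the arithmetic, here is a small sketch (assuming plain signed-integer PCM; dither and real converter behaviour are ignored):

```python
import math

def full_scale_counts(bits):
    """Signed sample range for a given integer bit depth."""
    return -(2 ** (bits - 1)), 2 ** (bits - 1) - 1

def dbfs(sample, bits):
    """Level of a single sample value relative to full scale (0 dBFS)."""
    return 20 * math.log10(abs(sample) / 2 ** (bits - 1))

print(full_scale_counts(16))       # (-32768, 32767)
print(full_scale_counts(24))       # (-8388608, 8388607)
print(round(dbfs(16384, 16), 1))   # -6.0, i.e. half of full scale is about -6 dBFS
```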

Introduction To Digital Audio

Quote
Because of all that stuff above we can't talk about  :'( , I must pick a range to amplify.  The min and max I effectively choose will determine how much noise or clipping I might experience.

There are two sources of noise.    There is acoustic noise in the room, and electrical noise from the preamp.   You can increase the acoustic signal-to-noise ratio by making the sound (signal) louder or getting closer to the microphone.  These will also improve the electrical signal-to-noise ratio.

The gain control in most preamps (including the preamps built into interfaces) comes after the amplifying stage, so turning down the gain control usually turns down the signal and noise together and doesn't affect quality.

Preamps have a voltage limit and they will clip if you try to go over.  Usually that voltage is high enough that it "never happens".  The preamp built into an interface has enough headroom that the ADC (analog-to-digital converter) clips first.  It will always clip at exactly 0 dBFS.  There is a direct correlation between dB SPL (and voltage levels from the mic) and dBFS, but there is no standard calibration; i.e., if the SPL goes up by 3 dB, the digital level will go up by 3 dB (as long as you're not clipping).

Quote
(If we do use dBs, and the mic has 60 dB of sensitivity and our audio interface has 120 dB of dynamic range--wait, how does that make sense?  We shouldn't need a gain knob at all?)

Microphone sensitivity is given as a voltage at a specified SPL.  94 dB SPL (1 Pascal) is the "standard".
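From that one sensitivity figure you can estimate the mic's output voltage at other levels.  A rough sketch, assuming the sensitivity is quoted at 94 dB SPL and the mic stays linear:

```python
def mic_voltage_mv(sensitivity_mv_per_pa, spl_db):
    """Rough mic output in mV at a given SPL, for a sensitivity quoted at 94 dB SPL (1 Pa)."""
    return sensitivity_mv_per_pa * 10 ** ((spl_db - 94) / 20)

# Using the 1.6 mV/Pa figure from earlier in the thread:
print(round(mic_voltage_mv(1.6, 94), 2))    # 1.6  mV at the reference level
print(round(mic_voltage_mv(1.6, 74), 2))    # 0.16 mV, 20 dB quieter = 1/10 the voltage
print(round(mic_voltage_mv(1.6, 114), 2))   # 16.0 mV, 20 dB louder = 10x the voltage
```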

Digital recording levels aren't critical, and pros often record at -12 to -18 dB (at 24 bits).  The main thing is that you don't want to clip!!!  Low levels are often an indication of an analog problem: you might have a poor signal-to-noise ratio, and the noise can become a problem when you amplify to "normal levels".  But turning down the knob on your interface doesn't hurt anything as long as you have a good signal-to-noise ratio.

At very low digital levels the quantization noise becomes a problem (again, when amplified).  16-bit audio has a dynamic range of about 96 dB, and if you go below that, the "numbers" are zero and you have pure digital silence.  24-bit audio has a dynamic range of about 144 dB.
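Those dynamic-range figures come from the usual rule of thumb of roughly 6 dB per bit:

```python
import math

# Idealized dynamic range of an N-bit converter: 20*log10(2**N), about 6.02 dB per bit.
# (Real converters and dithered files behave a little differently.)
for bits in (8, 16, 24):
    print(f"{bits} bits: {20 * math.log10(2 ** bits):.1f} dB")
# 8 bits: 48.2 dB, 16 bits: 96.3 dB, 24 bits: 144.5 dB
```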



* 8-bit WAV files are biased/offset and there are no negative values.  Of course, that bias is removed when it's played.



Re: What is the pre-amp input window range for dynamic microphone voltages

Reply #2
Quote
Quote
I assume when I raise or lower the analog gain on my ADC I cannot amplify EVERY voltage (change) input from the microphone equally.
An amplifier is simply a linear voltage multiplier.     Digital amplification/attenuation is also done as multiplication.  Each sample (44100 samples per second, etc.) is amplified by the same factor (greater than 1 for amplification and less than one for attenuation).
Thanks for the long, detailed reply!  Sure, with a strong enough voltage an amplifier can effectively become a linear voltage multiplier; the loss in resolution isn't noticed.  But I don't believe any pre-amp can perform accurate linear voltage multiplication with a microphone voltage/current because the current is too weak.  Or it can be too weak, if a person is speaking too far away.  Or, if they're speaking too loudly, forcing the amplifier to focus on the strong end while the weak end falls off the sensitivity of the pre-amp.

So, mustn't a microphone pre-amp pick a range of voltage to amplify?  It can do that through automatic gain control electronics, or you can do it manually with a gain knob.

If that wasn't true, we wouldn't need a gain knob, in the same way there isn't a gain knob in a high-end stereo system amplifier, because the amplifier assumes it's receiving a voltage optimized for amplification.




MOD edit: Nest quotes [[original] reply]

Re: What is the pre-amp input window range for dynamic microphone voltages

Reply #3
Quote
But I don't believe any pre-amp can perform accurate linear voltage multiplication with a microphone voltage/current because the current is too weak.
Linearity isn't a problem until clipping.  It's super easy to build a highly linear amplifier.  Non-linearity will show up in distortion measurements, and most audio amplifiers (even cheap ones) have distortion below audibility unless overdriven.

There is a type of distortion in class A/B power amplifiers called "crossover distortion", and it happens at the zero crossing, where the signal goes from positive to negative or negative to positive.  Crossover distortion is worse (when measured as percent distortion) at lower levels.  But it's normally below audibility, and it's harder to hear distortion at lower levels even if it's a higher percentage.  Power amplifier output stages are about the only place you'll find class A/B circuits.  (This is one reason some "crazy audiophiles" think class A is better.)
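If it helps, here is a toy Python illustration of why the percentage rises at low levels.  The dead zone is hypothetical and grossly exaggerated; it just stands in for a fixed-size error around the zero crossing:

```python
import math

def crossover(x, dead_zone=0.005):
    """Toy class-B style stage that loses the first few millivolts around 0 V."""
    if abs(x) <= dead_zone:
        return 0.0
    return x - dead_zone if x > 0 else x + dead_zone

def rms(xs):
    return math.sqrt(sum(v * v for v in xs) / len(xs))

n = 1000
for amplitude in (1.0, 0.05):   # a loud sine and a quiet sine
    sig = [amplitude * math.sin(2 * math.pi * k / n) for k in range(n)]
    err = [crossover(s) - s for s in sig]
    print(f"amplitude {amplitude} V: error is about {100 * rms(err) / rms(sig):.1f}% of the signal")
```

The absolute error is roughly the same in both cases; it is simply a much larger fraction of the quiet signal.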

At low levels, noise is the problem.  Electrical noise is "instability": you could have a 1 V signal, and if the noise is 0.1 V, the actual voltage will randomly fluctuate between 0.9 V and 1.1 V.  If it's an AC audio signal, the noise and signal are summed.  If the noise is random, it will be white noise.
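A quick sketch of that summing (made-up numbers: a 1 kHz sine with 0.1 V RMS of random noise added):

```python
import math, random

random.seed(0)
n = 48000                                                           # one second at 48 kHz
signal = [math.sin(2 * math.pi * 1000 * k / n) for k in range(n)]   # ~0.71 V RMS sine
noise = [random.gauss(0, 0.1) for _ in range(n)]                    # ~0.1 V RMS white noise
noisy = [s + v for s, v in zip(signal, noise)]                      # noise and signal are summed

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

print(f"SNR = {20 * math.log10(rms(signal) / rms(noise)):.1f} dB")  # about 17 dB here
```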


Quote
forcing the amplifier to focus on the strong end while the weak end falls off the sensitivity of the pre-amp.
It's not "focusing" on anything.  Remember, it's just multiplication.  You can multiply 1 x 10 = 10, or 1001 x 10 = 10,010, and you're not losing any resolution with bigger numbers.  Also, when you have a "wave" there is a zero crossing twice per cycle, and near the zero crossing the amplitude becomes infinitesimally small, so at that point you are amplifying very small voltages.

Quote
So, mustn't a microphone pre-amp pick a range of voltage to amplify?
Yes.  You are limited by noise at the low end and by clipping at the high end.  With very loud sounds (like a gunshot) you can sometimes use a dynamic microphone and bypass the preamp.
 
Quote
If that wasn't true, we wouldn't need a gain knob, in the same way there isn't a gain knob in a high-end stereo system amplifier, because the amplifier assumes it's receiving a voltage optimized for amplification.
There are certain "things", like the Behringer UCA202 audio interface or most USB turntables, that don't have a recording-level control.  They are usually calibrated for lower digital levels so they don't easily clip.  Most inexpensive USB mics don't have a control either, and they also put out low digital levels with a "regular voice", but they will overload if you stick them in front of a loud guitar amp or kick drum.

You have to cover a super-wide range:
20 dB is a factor of 10
40 dB is a factor of 100
60 dB is a factor of 1,000
80 dB is a factor of 10,000

If you are getting 0.001 mV from the mic and you go 60 dB louder, you'll be getting 1 mV.

There are some new 32-bit floating-point interfaces.  I assume they will become more common.  From what I recall, this gives you a digital range of around -1000 dB to +1000 dB (infinite for all practical audio purposes), and Zoom claims "no need to set gain".  But there is still analog noise and analog clipping.  You might get 20 dB more usable dynamic range with a floating-point interface, but I'm just guessing.

Re: What is the pre-amp input window range for dynamic microphone voltages

Reply #4
Quote
There are some new 32-bit floating-point interfaces.  I assume they will become more common.  From what I recall, this gives you a digital range of around -1000 dB to +1000 dB (infinite for all practical audio purposes), and Zoom claims "no need to set gain".  But there is still analog noise and analog clipping.  You might get 20 dB more usable dynamic range with a floating-point interface, but I'm just guessing.
I've been kicked off the ASR forum for insisting 32-bit float cannot reduce clipping in any way.  So I'd best put that in another thread after this.  I wrote an essay explaining why, if you want to polish your knives  :))

Please allow me to put what you said another way.  Let me know how I'm confused/wrong.

1. When I plug a dynamic mic into an audio interface (for example), it will amplify the voltage changes near-perfectly, x to y, at whatever linear factor one wants.
2. There will be noise, as in anything electrical, but it is either buried in a strong voltage or distorting a low, unimportant voltage.
3. When gain is applied, it amplifies all the voltage by a greater factor, resulting in more noise in the final quantization (and potential clipping if the voltage is above what the ADC can handle as an input).
4. However, if I don't create a clipping voltage, I will have the convenience of a signal closer to where I'll normalize.

I've tried to test this a bit, so I might not be surprised by your answer.

There is no difference in noise between a low-gain recording and a high-gain recording (assuming no clipping).  The only difference is that you'll notice it immediately with the high-gain recording, but you'll only notice it in the low-gain recording if you have to boost the levels.

Thanks again for your time and patience!  Is that correct?

 

Re: What is the pre-amp input window range for dynamic microphone voltages

Reply #5
Quote
Quote
There are some new 32-bit floating-point interfaces.  I assume they will become more common.  From what I recall, this gives you a digital range of around -1000 dB to +1000 dB (infinite for all practical audio purposes), and Zoom claims "no need to set gain".  But there is still analog noise and analog clipping.  You might get 20 dB more usable dynamic range with a floating-point interface, but I'm just guessing.
I've been kicked off the ASR forum for insisting 32-bit float cannot reduce clipping in any way.
I agree. There is always a level at which an input is too hot to handle, and as far as I know there is no hardware that can meaningfully fill more than 24 bits.

Anyway, I do not fully agree with DVDdoug. Let me explain.

If you have a dynamic microphone, there is usually no amplifier in the mic, unlike a condenser microphone. So let us assume there are only two components in the chain: the pre-amp and the ADC. If a very good ADC is paired with a mediocre pre-amp, it can be that the THD+N (distortion plus noise) introduced by the pre-amp is (much) louder than the THD+N of the ADC, even at the lowest gain setting. If that is the case, there is no point in turning up the gain: if the lowest gain and the highest gain result in an equal amount of noise (relative to the signal, i.e., SNR), then using the lowest gain will give the most headroom.

However, if the ADC is good and the pre-amp is good too, then usually the THD+N of the ADC will be higher than the THD+N of the pre-amp at lowest gain. If that is the case, turning up the gain will result in a higher SNR, until the point that the THD+N of the pre-amp gets louder than the THD+N of the ADC. From that point, turning up the gain doesn't improve SNR.
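A toy model of that trade-off, with all figures assumed for illustration (none are measurements of a real preamp or ADC): the preamp's noise is referred to its input and gets amplified along with the mic signal, while the ADC's noise floor stays fixed.

```python
import math

def snr_db(signal_uv, preamp_noise_uv, adc_noise_uv, gain):
    """SNR at the ADC: amplified preamp noise and ADC noise add as uncorrelated sources."""
    total_noise = math.sqrt((preamp_noise_uv * gain) ** 2 + adc_noise_uv ** 2)
    return 20 * math.log10(signal_uv * gain / total_noise)

# Hypothetical numbers: 1600 uV of mic signal, 1 uV input-referred preamp noise,
# 20 uV ADC noise floor.
for gain_setting_db in (0, 20, 40, 60):
    gain = 10 ** (gain_setting_db / 20)
    print(f"{gain_setting_db:2d} dB of gain -> SNR {snr_db(1600, 1.0, 20.0, gain):.1f} dB")
```

With these assumed numbers the SNR climbs from about 38 dB to about 64 dB as the gain goes up, then flattens out once the amplified preamp noise swamps the ADC's noise floor; that is the point where more gain stops helping.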
Music: sounds arranged such that they construct feelings.

Re: What is the pre-amp input window range for dynamic microphone voltages

Reply #6
However, if the ADC is good and the pre-amp is good too, then usually the THD+N of the ADC will be higher than the THD+N of the pre-amp at lowest gain. If that is the case, turning up the gain will result in a higher SNR, until the point that the THD+N of the pre-amp gets louder than the THD+N of the ADC. From that point, turning up the gain doesn't improve SNR.
If we can ignore the ADC (?), the way I understand amplification is, to simplify, if you have a really weak current you're going to need to seed or bias that current to get the amplification going.  There are many ways to do this of course.  That helper current is noise.

If you don't help the weak current, it just warms up the amplifier, so to speak, without getting it to work.  At the other end, you've optimized your amplifier to work with a known voltage.  That becomes the sweet spot.  The more a voltage goes above that the more the current slops out of its buckets, so to speak.   

That's all theory, which I believe holds because of thermodynamics, but the question in practical terms is: are those things a real issue?  I suspect they are, because if they weren't we wouldn't have a gain knob.  I believe the gain knob increases the output voltage one way or another, which has a cost in fidelity.  But if it's more important to resolve the weak voltage, you pay it?

In a perfect world, you'd build a specific amplifier for every mic, and it would flash yellow if the current were too low, green if good, and red if too hot.  Since we don't have that world, we have amplification kludges, controlled by gain knobs.  Am I off?

Re: What is the pre-amp input window range for dynamic microphone voltages

Reply #7
the way I understand amplification is, to simplify, if you have a really weak current you're going to need to seed or bias that current to get the amplification going.
No.  That is simply incorrect.  Amplifiers are internally biased to their "centre point"; the input signal disturbs it away from that centre point, whether the input is positive or negative.  That is the reason the signal path is AC-coupled, so that DC offsets do not affect the centre point.  I hope I've made that understandable.

("Centre Point" is not a standard technical term, I've used it to try to explain.)

You may have been confused by the tricks used in power amplifier output stage design, where extra voltages are needed to ensure there is no cross-over distortion, or the bias signal required to overcome hysteresis in magnetic recording.  These are compensated for in the design of the overall unit, do not affect the input or output (especially on properly-designed modern kit), and are not relevant to the likes of a pre-amp.

Something else which might be confusing you is the input bias current specification for op-amp ICs.  These are a consideration for the designer of the amplifier (if op-amps are used in the design) and not for the user of the amplifier.  If you are trying to design a high-performance pre-amp, then good luck with that!

No (audio) signal input should draw a DC current from the signal source.  Frankly, if the input did require "seeding to get the amplifier going", it would be a terrible amplifier!
It's your privilege to disagree, but that doesn't make you right and me wrong.

Re: What is the pre-amp input window range for dynamic microphone voltages

Reply #8
You may have been confused by the tricks used in power amplifier output stage design, where extra voltages are needed to ensure there is no cross-over distortion
I believe we're talking past each other.  My point is that ALL amplification must make a trade-off.  It's a matter of thermodynamics... friction... noise... whatever you want to call it.  When one uses extra voltage to "ensure there is no cross-over distortion", that isn't, and never was, desirable in itself.  It's a kludge.  If transistors were 100% efficient it wouldn't be needed, right?

Every component of electronics has a side-effect, so to speak.  So another component is added, not to remove it completely (which it can't) but to reduce it to the smallest cost possible.

Let me put it another way: I use a microphone.  I can't get the signal I want.  I ask why, and how much I can't get.  I get answers about all kinds of "this is the way the world (electronics) works", but seldom, how do I put it, a straight answer.  So I say there must be an answer, because we have "laws" for how to measure currents, so we must know where and how much isn't coming through: limitations in the materials used to make microphones, the amplifiers, etc.

It's like some people feel I'm attacking electrical engineering or something.  That I'm questioning their knowledge. 

Recording from microphones is OBVIOUSLY difficult, as anyone who tries it understands.  There's a gain knob.  In effect, it is only a TRADEOFF knob: you trade off one problem for the other.

My question is: what are the tradeoffs, as a percentage of theoretically perfect amplification and quantization?  Thoughts?

Re: What is the pre-amp input window range for dynamic microphone voltages

Reply #9
You seem to be shifting the goal posts.

First you claim there is some initial current required to prime the amplifier input; then, when I say there isn't, you shift your argument onto quantization noise.

If you're talking about the fundamental physics, then at the very lowest levels you could be dealing with individual electrons and what's known as "shot noise"... but no real-life application is anywhere near that low a level and you would need specialist "instrumentation" amplifiers.

Microphones are essentially mechanical.  You will find specifications for their minimum sound pressure level sensitivity, and that is effectively a hysteresis.  Note this has nothing to do with the amplifier.  "Consumer grade" microphones will have a high minimum sound pressure level and only be suitable for relatively high volumes, but studio microphones (expensive!), e.g. ribbon mics, are to all intents and purposes linear.  Once the microphone transfer function irregularities are below the background noise SPL it makes no difference.  The problem is not the minimum SPL, but the dynamic range.  It is practically impossible to build a microphone sensitive enough to collect the tiniest sound above the noise floor but not clip the loudest sounds.

Now: are you discussing a real-life problem you have with amplifying low level sounds, or is this just blue sky stuff?  If this is a real problem with low-level sounds, you just have to get hold of the most sensitive microphone you can afford, and use a proper studio microphone pre-amp.  Ordinary consumer electronics is not fitted with proper microphone inputs (even if it is marked as such).

To answer one of your basic questions, an unamplified microphone signal output is of the order of 10 mV, but it very much depends on what technology the microphone is based on.  Signals of that level require very good noise management – screening, no hum loops, balanced lines, etc. – hence consumer equipment uses high-output microphones (ceramic) and/or amplification at the microphone end (batteries, or sometimes phantom power).  Only well-designed dedicated studios are suitable for the very best unamplified microphones.

What you call "kludges" are engineering out the imperfections of real-world components when trying to get more linearity and more output power.  Getting the best performance from things we can actually make in practice and within cost is the difference between engineering and science.  You might think of it as a compromise, but the alternative is nothing.  Engineers have been doing this stuff, very successfully, ever since the vacuum tube was invented – if you think you have a better idea, put your money where your mouth is and see if it flies in practice.

And as for not wanting to deal in dB – sorry, but that's what sound engineers work in.  Human aural perception is logarithmic, and by using a logarithmic scale we can deal with numbers in the range 1-100 (or so) instead of 0.00001 to 10.  Learn it and get used to it.
It's your privilege to disagree, but that doesn't make you right and me wrong.

Re: What is the pre-amp input window range for dynamic microphone voltages

Reply #10
if you think you have a better idea, put your money where your mouth is and see if it flies in practice.
That's offensive.  When did I ever say I was trying to build a better anything?  I am asking a rather simple question and you are taking it personally.  Why?