Quote:
>> I assume when I raise or lower the analog gain on my ADC I cannot amplify EVERY voltage (change) input from the microphone equally.
>
> An amplifier is simply a linear voltage multiplier. Digital amplification/attenuation is also done as multiplication. Each sample (44,100 samples per second, etc.) is amplified by the same factor (greater than 1 for amplification, less than 1 for attenuation).

Thanks for the long, detailed reply! Sure, with a strong enough input voltage an amplifier can effectively act as a linear voltage multiplier, and the loss in resolution isn't noticeable. But I don't believe any pre-amp can perform accurate linear voltage multiplication on a microphone signal when the current is too weak, e.g. when the person speaking is too far from the microphone. Or, if they're speaking too loudly, the amplifier has to accommodate the strong end of the signal while the weak end falls below the pre-amp's sensitivity.
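For what it's worth, the "each sample is multiplied by the same factor" idea from the quote is easy to sketch in code. This is just an illustration (the function name and the 16-bit clamp are my own choices, not from any particular library), showing both uniform scaling and what happens when the result exceeds full scale:

```python
# Sketch of digital gain: every sample is scaled by the same constant
# factor, then clamped to the signed 16-bit range to model clipping.
# Names (apply_gain, samples) are illustrative, not a real API.

def apply_gain(samples, factor):
    """Multiply each integer sample by a constant gain factor."""
    out = []
    for s in samples:
        v = int(round(s * factor))
        out.append(max(-32768, min(32767, v)))  # hard clip at full scale
    return out

quiet = [100, -200, 300]           # weak signal, scaled uniformly
print(apply_gain(quiet, 2.0))      # -> [200, -400, 600]
print(apply_gain([30000], 2.0))    # -> [32767] (clipped at full scale)
```

Every sample gets the same multiplier, which is why digital gain can't rescue detail that was already below the pre-amp's sensitivity before conversion.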
So mustn't a microphone pre-amp pick a range of voltages to amplify? It can do this through automatic gain control circuitry, or you can do it manually with a gain knob.
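The automatic-gain-control idea can be sketched the same way. This is a minimal toy version under my own assumptions (samples normalized to -1.0..1.0, gain chosen so the block's peak hits a target level; the names agc_gain and target_peak are made up for the example), not how any real AGC circuit is implemented:

```python
# Minimal sketch of automatic gain control: choose a gain factor so
# the loudest sample in a block reaches a target level. Real AGC
# circuits smooth this over time; this toy version works per block.

def agc_gain(samples, target_peak=0.5):
    """Return a gain factor scaling the block's peak to target_peak.
    Samples are assumed normalized to the range -1.0..1.0."""
    peak = max(abs(s) for s in samples)
    if peak == 0.0:
        return 1.0               # silence: leave the gain unchanged
    return target_peak / peak

block = [0.25, -0.1, 0.05]       # peak is 0.25
g = agc_gain(block)              # 0.5 / 0.25 -> gain of 2.0
boosted = [s * g for s in block]
```

The manual gain knob does the same job, except a person picks the factor instead of the circuit.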
If that weren't true, we wouldn't need a gain knob at all, in the same way a high-end stereo amplifier has no gain knob: it assumes it's receiving a line-level voltage already optimized for amplification.
MOD edit: Nest quotes [[original] reply]