[linux-audio-user] [OT] recording levels (mismatch)

Anahata anahata at treewind.co.uk
Wed Oct 13 03:39:33 EDT 2004

On Tue, Oct 12, 2004 at 02:42:59PM -0700, Mark Knecht wrote:

> Much of the 'consumer' equipment out there uses high resistance output
> drivers since they are less expensive to build. Pro equipment will
> almost always use low impedance outputs.

I'd expect most consumer equipment to use op-amp based gain blocks, which
are cheap enough and have very low output impedance. Usually the output
has a resistor (100 ohm or so) in series for protection against short
circuits.

[snip explanation of impedance mismatching]

> 1V amp PP output  -> 1000 Ohm output <---cable----> 100 Ohm input

100 ohms is very unlikely for an input. Even a low-Z mic input has an
impedance of over 1k, and a consumer line level input is likely to be
10k ohms or more.
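The effect of output and input impedance is just a voltage divider: the
input impedance is the lower leg, so the delivered fraction of the signal
is Zin / (Zout + Zin). A quick sketch (the impedance values are the ones
discussed above, purely illustrative, not measurements of any real gear):

```python
def delivered_fraction(z_out_ohms, z_in_ohms):
    """Fraction of the source's open-circuit voltage that appears
    across the input, treating the two impedances as a divider."""
    return z_in_ohms / (z_out_ohms + z_in_ohms)

# Op-amp output with a 100 ohm protection resistor driving a typical
# 10k consumer line input: essentially no loss.
print(delivered_fraction(100, 10_000))   # ~0.99

# The quoted worst case, 1k output into a 100 ohm input: most of the
# signal is dropped across the source's own output impedance.
print(delivered_fraction(1_000, 100))    # ~0.09
```

Which is why the 100 ohm input in the quoted example would be a genuine
problem, but is also very unlikely in practice.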

The actual voltage levels used by consumer equipment do vary, though.
There is a semi-professional standard of -10 dBV for RCA connectors, but
domestic equipment does not necessarily follow it. You often see this
in a stereo 'separates' system, where you routinely have to adjust the
volume when switching between tuner, CD player, tape etc.
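Taking the -10 dB figure relative to 1 V RMS (the usual dBV convention
for this consumer/semi-pro standard), converting the reference level to
an actual voltage is a one-liner:

```python
def dbv_to_volts(dbv):
    """Convert a dBV figure (dB relative to 1 V RMS) to volts RMS."""
    return 10 ** (dbv / 20)

# The semi-professional RCA reference level:
print(dbv_to_volts(-10))  # ~0.316 V RMS
```

So two pieces of gear that are each "a few dB off" the nominal 0.316 V
can easily differ by enough to force a volume adjustment when switching.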

I suspect this may actually not be too much of a problem.
25% of the way up a VU meter (VU meters are linear) means about 12 dB
below full scale, so a 16 bit digitized signal will effectively use
only 14 bits. So boost the gain digitally (e.g. with normalize, or I'm
sure sox or ecasound can do it too) after it's in the computer. There
will in theory be some loss of signal-to-noise ratio, but as the original
poster was using a Sound Blaster, we're not talking about professional
audio here.
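The arithmetic behind those figures is simple: a linear amplitude
fraction converts to dB as 20*log10(fraction), and each bit of a linear
PCM word is worth about 6 dB. A quick check of the 25% / 12 dB / 14 bit
claims above:

```python
import math

def fraction_to_db(fraction):
    """Linear amplitude fraction -> dB relative to full scale."""
    return 20 * math.log10(fraction)

def effective_bits(db_below_fs, total_bits=16):
    """Bits actually exercised when peaks sit db_below_fs under
    full scale; one bit is worth ~6.02 dB for linear PCM."""
    return total_bits - (-db_below_fs) / 6.02

db = fraction_to_db(0.25)          # ~ -12 dB
print(db, effective_bits(db))      # ~14 bits actually used
```

Boosting the gain afterwards shifts the signal up but cannot restore
the two bits of resolution lost at capture time, hence the (small)
signal-to-noise penalty mentioned above.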

A professional-grade sound input device would either have variable
analogue gain to match the input level precisely, or accurately
specified input level requirements so the user knew exactly what level
to feed it.
anahata at treewind.co.uk       Tel: 01638 720444
http://www.treewind.co.uk    Mob: 07976 263827
