Input sensitivity

triode17

Super Member
Why do all tube amplifiers seem to have a different input sensitivity? I find a large range, from 0.2 V to 1.6 V, vintage to new designs.
 
Just about the reason that Peter states. Each different designer has his or her own ideas of what works for themselves and their companies. Since there is no standard, only an expected range, each one is free to choose their own design.

Shelly_D
 
Yes, but a standard for consumer equipment was later established as -10 dBV, or 0.316 V. Why not tube amps?
 
A standard was developed because of all the different sensitivity specifications.

The issue largely goes back to the two major competing high-fidelity power amplifier designs of the '50s: the original Williamson amplifier required about 2.0 V RMS to drive it to full power output, while the Mullard design only required about 0.5 V RMS, and often less. Different manufacturers championed different basic designs: Heath patterned their equipment after the Williamson circuit, while Eico went after the Mullard design, as but two examples. Other manufacturers blended these two circuits and sensitivity levels as they thought most appropriate.

With sensitivity levels from all the various designs all over the map, most preamps (but not all) were designed to handle the worst case: the Williamson amplifier. That's why most of the better preamps of the day used about 20 dB of line-stage gain (a voltage gain of 10X), so that they would work for owners of Williamson amplifiers. But that much gain was far too much for (say) an Eico power amplifier, so Eico (as but one example) included level controls on their power amps so the owner could match the preamp's output to the power amp's sensitivity. At the other end of the scale, Leak preamps had at best unity gain (or less), requiring all the sensitivity the Eico power amps offered. Other manufacturers like Dynaco tried to split things down the middle, choosing a power amp sensitivity that would generally work well with most preamps (except the Leak) without the need for level controls, and a preamp that would work well with most power amps, whether they had level controls or not.
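As a rough sketch of that gain-matching arithmetic (the source level, gain, and sensitivities below are just assumed round figures, not measurements of any particular gear):

```python
import math

line_stage_gain_db = 20.0                      # "about 20 dB of line stage gain"
gain_ratio = 10 ** (line_stage_gain_db / 20)   # = 10x in voltage terms

source_v = 0.2                                 # assumed nominal source level
preamp_out_v = source_v * gain_ratio           # 2.0 V -- just enough for a Williamson

for name, sensitivity_v in (("Williamson-style", 2.0), ("Mullard-style", 0.5)):
    surplus_db = 20 * math.log10(preamp_out_v / sensitivity_v)
    print(f"{name}: {surplus_db:+.1f} dB relative to full-power drive")

# The Mullard-style amp sees ~12 dB more signal than it needs, which is roughly
# the attenuation a power-amp level control has to throw away.
```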

The end game (at the time) was really an effort to have the volume control operate within a reasonable part of its range, so that:

1. On the one hand, it wasn't too touchy to operate (or operating down in the weeds), and

2. For most sources, the loudness control function could work properly at lower volume settings.

Other issues such as speaker sensitivity and power amplifier output capability play into it as well. For a given power level, 80 dB speakers will produce a soft sound level, whereas 100 dB speakers with the same power applied will in fact be loud. And a 75 watt amplifier with a 0.5 volt sensitivity inherently has a heck of a lot more gain built into it than a 10 watt amplifier with the same sensitivity (about 2.7 times the voltage gain, or roughly 9 dB more). So the situation was all over the place, and a standard was developed as a result.
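A quick check of those gain figures (assuming an 8 ohm load purely for illustration; the ratio comes out the same for any common load):

```python
import math

def voltage_gain(power_w, load_ohms, sensitivity_v):
    """Voltage gain needed to reach rated power from the rated input sensitivity."""
    v_out = math.sqrt(power_w * load_ohms)   # RMS output volts at full power
    return v_out / sensitivity_v

g75 = voltage_gain(75, 8, 0.5)   # ~49x
g10 = voltage_gain(10, 8, 0.5)   # ~18x
print(g75 / g10)                    # ~2.7x the gain
print(20 * math.log10(g75 / g10))   # ~8.8 dB more gain
```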

For me, I dislike power amplifier level controls because of the frequency response errors they most often introduce, so I prefer a sliding sensitivity scale: lower powered amplifiers of about 20 watts get a sensitivity of about 1.3 V RMS for full power output, while bigger amplifiers of 70 watts or so require about 2.25 V RMS to develop full power. With that approach, both amplifiers produce about the same sound level at a given volume control setting, all else being equal in a given setup, but the high power amp can keep going beyond the point where the lower powered amp runs out of gas. And a sensitivity of over 1 volt also helps minimize interconnect noise and related issues.
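To illustrate why those two sensitivity figures track each other (again assuming an 8 ohm load just for the arithmetic):

```python
import math

def gain_db(power_w, sensitivity_v, load_ohms=8):
    """Overall voltage gain, in dB, implied by rated power and input sensitivity."""
    v_out = math.sqrt(power_w * load_ohms)
    return 20 * math.log10(v_out / sensitivity_v)

print(gain_db(20, 1.3))    # ~19.8 dB
print(gain_db(70, 2.25))   # ~20.4 dB
# Nearly identical gain, so both amps play at about the same level for a given
# volume setting; the 70 W amp simply has more headroom before it clips.
```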

My 2¢ worth.

Dave
 
The other "standard" that has popped up a lot, is that most modern equipment that deals with unbalanced (RCA input, etc) sources, are made to deal with between .775v and 2v line level, for max output.

I try to stay within that range when building amps: toward the lower side with low power amps (15 watts or less), and higher for the larger amps.
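For reference, the figures quoted in this thread are the usual dB references in disguise: -10 dBV is referenced to 1 V, and 0.775 V is the 0 dBu reference (1 mW into 600 ohms). A minimal sketch of the conversion:

```python
def dbv_to_volts(dbv):
    """dBV: 0 dB reference is 1.0 V RMS."""
    return 10 ** (dbv / 20)

def dbu_to_volts(dbu):
    """dBu: 0 dB reference is 0.7746 V RMS (1 mW into 600 ohms)."""
    return 0.7746 * 10 ** (dbu / 20)

print(dbv_to_volts(-10))   # ~0.316 V, the "-10 dBV" consumer level
print(dbu_to_volts(0))     # ~0.775 V, the common pro reference level
```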

The nice thing about amps with 0.775 V sensitivity is that, with a passive attenuator, they can most often run OK from a standard line-out source (a tape deck or CD player or such), without the need for additional gain from a preamp. Of course, as Dave mentioned, such a setup must be designed so that Miller capacitance doesn't upset the frequency response (the source impedance a potentiometer adds ahead of the input can cause a loss of HF response if not properly provided for)... but if that's factored in (a low capacitance input tube, such as a 12AX7, 12AU7 or pentode of some sort, being one common way), then they can work fine...
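A rough sketch of that concern (the pot value and capacitance figures are just assumed ballpark numbers): a volume pot's worst-case source impedance is about a quarter of its total resistance at mid-rotation, and that impedance works against the input stage's Miller and stray capacitance to form a low-pass filter.

```python
import math

def hf_corner_hz(pot_ohms, input_capacitance_f):
    """-3 dB corner of the RC low-pass formed by pot source impedance and input capacitance."""
    worst_case_source_z = pot_ohms / 4            # pot near mid-rotation
    return 1 / (2 * math.pi * worst_case_source_z * input_capacitance_f)

# Assumed example: a 100k pot into roughly 100 pF of Miller + stray capacitance.
print(hf_corner_hz(100e3, 100e-12))   # ~64 kHz -- no audible rolloff
print(hf_corner_hz(100e3, 300e-12))   # ~21 kHz -- starting to shave the top end
```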

Regards,
Gordon.
 
I like stuff with a lot of sensitivity if I'm not running an active preamp. 0.7 V is about the high end for passive stuff, I've found, and if you want to use it off an iPod or the like, those are happier with about 0.5 volts. The little German deathtrap amp I own maxes out at 0.35 V and it's absolutely fantastic off my iPod.
 
The British standard was, IIRC, 0.5 V RMS; the US was ca. 1.5-2.0 V RMS. Vintage Mac amps could accommodate all comers.
 