The Bias Regulator
So with the secrets of the driver section unlocked, the bias regulator circuit is up to bat now.
A few commercial designs, as well as more than a few seasoned experimenters, have toyed with or executed some form of bias regulation in the equipment they've worked with. After all, it makes sense, right? Stabilizing the bias voltage must stabilize the operating point of the output stage, which must make for better performance. In reality, however, unless such schemes are planned out very carefully, they can actually do more harm than good.
If you stabilize just the negative bias voltage, but not the (positive) voltages going to the other elements of a tube, it can make for radical shifts in the operating point depending on the prevailing AC line conditions: When the B+ voltage rises but the bias voltage doesn't, output tube quiescent current can rise significantly, potentially overheating the output tubes. If the B+ falls but the bias doesn't, quiescent current can fall significantly, leading to increased distortion and reduced power output. But that appears to be exactly what Fisher is doing here, right?
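To put some rough numbers on that, here is a little Python sketch using a toy 3/2-power triode law. Every constant in it (the mu of 8, the perveance, the 430 V / -45 V operating point) is invented purely for illustration and is not taken from the Fisher schematic, but it shows the shape of the problem: hold the bias fixed while B+ moves with the line, and the quiescent current swings wildly; let the bias track the line, and it stays in the neighborhood.

    # Toy model of output-stage operating point vs. AC line voltage.
    # All constants are made-up illustration values, NOT measurements from
    # the Fisher circuit: a 3/2-power triode law with mu = 8 and a
    # perveance chosen to give ~60 mA at the nominal operating point.
    MU = 8.0        # hypothetical amplification factor
    K  = 2.32e-3    # hypothetical perveance, A / V^1.5

    def quiescent_ma(b_plus, bias):
        """3/2-power law: I = K * (Vgk + Vak/mu)^1.5, zero below cutoff."""
        drive = bias + b_plus / MU
        return 0.0 if drive <= 0 else K * drive ** 1.5 * 1000.0

    for line in (0.90, 1.00, 1.10):                   # -10%, nominal, +10% line
        b_plus  = 430.0 * line                        # B+ follows the line...
        fixed   = quiescent_ma(b_plus, -45.0)         # ...but bias is "regulated"
        tracked = quiescent_ma(b_plus, -45.0 * line)  # bias follows the line too
        print(f"line {line:4.2f}: fixed bias {fixed:6.1f} mA, "
              f"tracking bias {tracked:6.1f} mA")

With these toy numbers, a 10% line rise more than doubles the quiescent current when only the bias is held fixed, while the tracking case drifts just a modest amount in the same direction as B+.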
Well, no, they're not. The concerns regarding regulation of the bias voltage as outlined above apply to any amplifier. And while it may appear, against the backdrop just presented, that Fisher made a big boo-boo, that's not what this particular bias regulator is doing.
The case for bias regulation is virtually always made from the standpoint of regulating the SOURCE of the bias voltage. But in that regard, this regulator is actually quite invisible: As the B+ rises and falls, so will the bias voltage presented to the output stage. So what is the regulator actually doing then? In these amplifiers, the regulator is actually SINKING current -- current generated by the driver stage!
As long as the amplifier is operating in either Class A or Class AB1, the bias voltage is simply whatever the regulator sets it at, and from that point it fluctuates with line voltage movement just as all the other operating voltages within the amplifier do -- the point being that the regulator does not react to the changing input voltage applied to it. However, when the amplifier shifts into Class AB2 mode, the regulator goes to work.
Remember that in Class AB2 mode, the driver stage is developing POWER, because of the current drawn by the output tube grids during that mode. Power is a product of voltage AND current. In Class A or Class AB1 mode, no current is drawn by the grids, and therefore no power is produced by the driver stage; there is only a simple voltage swing applied to the output tube grids. So when the driver stage is actually dumping power into the output stage, what circuit path does that current take?
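Here's a quick sketch of that voltage/current distinction. A real grid looks like a nonlinear, fairly low impedance once it is driven positive; modeling it as a fixed 1K resistor and assuming a -45 V bias point are both simplifications of my own, but the behavior is the right shape: the driver delivers exactly zero power until the swing carries the grid above 0 V, and real power from then on.

    # Why the driver suddenly has to deliver POWER in AB2.
    # Assumed numbers: -45 V bias, and a grid that looks like roughly
    # 1K to the driver once driven positive -- illustration values only.
    import math

    BIAS   = -45.0    # quiescent grid voltage, volts
    R_GRID = 1000.0   # rough grid-cathode "resistance" when grid > 0, ohms

    def driver_power_w(peak_swing):
        """Average power the driver delivers into the grid over one cycle."""
        n, total = 1000, 0.0
        for k in range(n):
            vg = BIAS + peak_swing * math.sin(2 * math.pi * k / n)
            if vg > 0:                     # grid conducts only when positive
                total += vg * (vg / R_GRID)
            # below 0 V: no grid current, so no power -- AB1 territory
        return total / n

    for peak in (30.0, 45.0, 60.0):
        print(f"{peak:4.0f} V peak drive: {driver_power_w(peak)*1000:6.1f} mW")

The 30 V and 45 V cases never push the grid positive and cost the driver nothing; the 60 V case crosses into AB2, and the driver is suddenly on the hook for tens of milliwatts.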
There are three elements in the grid current circuit path: There is the grid-cathode path within the output tube, with the cathode effectively being grounded. Then, there is the driver circuit itself, which supplies the current. And finally, there is the bias regulator, whose output (the plate) is grounded. So there is now a complete circuit path, with all three of these elements connected in series. But why is the bias regulator needed?
As with any series circuit, if any element in the circuit path represents a high impedance, it impedes the flow of current through the complete circuit. We saw that happen in the driver circuit, where the losses in the driver transformer were great enough that it took C10 and C11 to lower the driving impedance to the point where the driver could develop enough current in the circuit path to push the output tube grids positive and commence Class AB2 operation. That's why I spelled out the difference in driver stage internal impedance both with C10/C11 in the circuit and out of it.
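Because it really is just a plain series circuit, the arithmetic is Ohm's law. The impedance figures below are placeholders I picked for illustration, not measurements from these amplifiers, but they make the point: one high-impedance element anywhere in the loop starves the whole loop of current.

    # The AB2 grid-current loop treated as a simple series circuit:
    # driver source impedance + grid-cathode path + bias supply, in series.
    def loop_current_ma(v_drive, z_driver, z_grid, z_bias):
        # Ohm's law for the whole loop; series impedances simply add.
        return v_drive / (z_driver + z_grid + z_bias) * 1000.0

    V_DRIVE = 15.0  # volts of drive available above the bias point (assumed)
    print(loop_current_ma(V_DRIVE, z_driver=500, z_grid=1000, z_bias=50))     # ~9.7 mA flows
    print(loop_current_ma(V_DRIVE, z_driver=500, z_grid=1000, z_bias=50000))  # ~0.3 mA -- starved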
And so it is with the bias supply as well. If, as a source, it represents a high impedance, it will impede the flow of current in the grid current circuit, preventing current from flowing in the grids of the output tubes and thus preventing Class AB2 operation. In such a scenario, as Class AB2 operation tried to commence, the negative bias voltage supplied by the bias supply would actually increase (become more negative) due to the high impedance the supply represents. The bias supply would effectively absorb the driver output voltage that was intended to appear as a positive voltage across the grid/cathode elements within the output tubes. The grid/cathode elements would then simply become rectifiers, rectifying the excess driver voltage so that the resulting negative voltage augments the negative voltage generated by the bias supply itself, causing a greater negative voltage to appear across the supply.
THAT is exactly what V1B and the bias regulator circuit are designed to prevent. The regulator provides a low impedance path at the output of the bias supply, one that can sink (or pass, if you will) the current supplied by the driver stage during Class AB2 operation to ground. With the impedance of the driver stage reduced to a small value, and the impedance of the bias supply reduced to a small value as well, a low impedance path now exists from the output of the driver stage to the grid/cathode elements within the output tubes, allowing current to flow through them and Class AB2 operation to take place.
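Extending the same toy series loop from above, you can watch where the driver's positive swing actually ends up. With a stiff, regulator-backed bias supply, nearly all of the drive lands across the grid/cathode gap where it belongs; with a high-impedance supply, nearly all of it is dropped across the supply itself, pumping the bias line more negative exactly as described above. Again, the impedance values are assumptions for illustration only.

    # Same series loop, now asking how the drive voltage divides across
    # the elements. Assumed illustration impedances, as before.
    def divide(v_drive, z_driver, z_grid, z_bias):
        i = v_drive / (z_driver + z_grid + z_bias)   # loop current, amps
        return i * z_grid, i * z_bias                # V across grid, V across supply

    for z_bias, label in ((50.0,    "regulator sinking, low-Z supply"),
                          (50000.0, "no regulator, high-Z supply   ")):
        v_grid, v_supply = divide(15.0, 500.0, 1000.0, z_bias)
        print(f"{label}: {v_grid:5.2f} V reaches the grid, "
              f"{v_supply:5.2f} V dropped across the bias supply")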
So with the operation of all the unique design elements of these amplifiers laid bare, is there anything that can be addressed in them today to improve on the performance that sprang from the drawing board over 60 years ago? It turns out there are a few things, and one of them is quite significant. It doesn't change the character of the amplifier at all. It just gives you more of the character it already has.
But that's next time.
Dave