I'm building a new amp with a 6L6 push-pull output stage. The output stage parameters:

- Push-pull, into a 6.6K primary load (plate-to-plate)
- 8 ohm dummy load resistor on the 8 ohm secondary winding
- 300V screens (regulated through my bench power supply)
- Fixed bias (but with 10 ohm current sense resistors in the cathode circuit)
- -30 VDC grid bias
- Plate voltage 440V at full power, 460V at idle
- Brand new Tung Sol 6L6-GC STR tubes

I have installed a 150 ohm resistor on each screen so I can sense screen current, and for screen stability/protection.

I drive the amp to just under clipping (35 watts) from my signal generator with a 1 kHz sine wave. With -30V of grid bias, I *should* see right around a 30V peak drive signal on each control grid at max power output, which I do, as verified on my scope. In other words, everything is working as expected.

Now the question: I measure 8.5 mA of DC screen current across each 150 ohm screen resistor and 8.33 mA of AC current across each. In both cases I calculate the current by measuring the DC or AC voltage drop across the resistor and dividing by the 150 ohm resistance. The AC drop is measured with my digital voltmeter, which approximates RMS readings.

Double-checking against the 6L6-GC data sheet: on the "average transfer characteristics" graph, which plots grid #1 voltage on the X axis and screen current on the Y axis at a 400V plate voltage (a little lower than my 440V, but in the ballpark), I read roughly 17 or 18 mA of screen current per tube at 0V grid for a 300V screen voltage. In actuality I am getting pretty much half that, whether I look at the DC or the AC current. See this graph from the 1959 GE 6L6-GC data sheet.
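For clarity, the per-tube screen current figures above are just Ohm's law across the 150 ohm sense resistor. A quick sketch of the arithmetic (the voltage drops shown are back-computed from my meter readings, not separate measurements):

```python
# Screen current from the voltage drop across each 150-ohm sense resistor.
R_SENSE = 150.0  # ohms

def screen_current_ma(v_drop: float) -> float:
    """Convert a measured voltage drop (volts) across R_SENSE to current in mA."""
    return v_drop / R_SENSE * 1000.0

i_dc = screen_current_ma(1.275)  # DC drop of 1.275 V -> 8.5 mA
i_ac = screen_current_ma(1.25)   # RMS AC drop of 1.25 V -> ~8.33 mA
print(f"DC: {i_dc:.2f} mA, AC: {i_ac:.2f} mA")
```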
The red circled area is what I am focused on: the graph says I should be getting about 18 mA of screen current per tube at 0V on the grid, with 300V on the screen and 400V on the plate. I am trying to figure out why the graph shows roughly double the screen current I am actually measuring.

One thought: perhaps the graph assumes DC (static) conditions, whereas my test measures the tube under AC drive.

I'm not sure if this helps decipher things, but here is the voltage waveform seen on the scope under full-power conditions (AC coupling on the scope probe), with a 10X probe on the screen pin of one of the output tubes. The vertical scale is 0.1V per division, so in reality 1V per division after accounting for the 10X probe.

Anyway, I am trying to reconcile the transfer characteristics graph with my actual measured screen current, and at the moment they don't coincide. What is wrong with my mental model of screen current?
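To put a number on the discrepancy I'm describing (the datasheet value is read off the graph, so it's approximate):

```python
# Rough comparison of datasheet vs. measured per-tube screen current.
datasheet_ma = 18.0  # ~17-18 mA at 0V grid, 300V screen, 400V plate (from the graph)
measured_ma = 8.5    # DC screen current I actually measure per tube

ratio = datasheet_ma / measured_ma
print(f"datasheet / measured = {ratio:.2f}")  # roughly a factor of 2
```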