I realize real-world bias is a bit hard to predict, given low bias currents and varying supply voltages. But I'm trying to design an idealized bias adjust system. For those who don't know, Doug has a great
page on bias circuits, which shows how you can use anything from a 10K to a 50K pot.
Let's say you want to bias an amp for a single power tube type -- say 6V6s -- and let's assume your PT has a 50V bias tap.
Merlin notes a 10K pot is ideal, and that the other bias resistors (bias divider, bias tail) should be smallish too, with the tail resistor (his 'idiot resistor' or ground shunt) possibly the same size as the pot. He then notes that
the most negative voltage you'd need is the screen voltage divided by the triode mu (about 10 for the 6V6), which takes the tube to near cutoff. So for 6V6s with a 400V screen voltage, you'd want a max negative of about -40V.
If I'm right, the math is like this (skip if you want). The voltage divider formula is V(out) = V(source)*R2/(R1+R2), where here R1 is the divider resistor and R2 is the tail leg -- in this case, the tail resistor plus the pot. Note that with a 10K pot, R2(max) will be the tail + 10K (maximum pot resistance shunts the least bias voltage to ground, giving the most negative output).
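To make that divider math concrete, here's a quick Python sketch. All the component values are hypothetical examples I picked so the most negative end lands at -40V, not a recommendation:

```python
# Voltage divider: V(out) = V(source) * R2 / (R1 + R2),
# where R2 = tail resistor + the in-circuit portion of the pot.
V_SOURCE = -50.0  # volts from the bias supply -- hypothetical round number
R1 = 5e3          # bias divider resistor (hypothetical value)
R_TAIL = 10e3     # tail / ground-shunt ("idiot") resistor (hypothetical)
POT = 10e3        # 10K bias pot

def bias_voltage(pot_fraction):
    """Divider output with the pot at pot_fraction (0.0 to 1.0) of its travel."""
    r2 = R_TAIL + POT * pot_fraction
    return V_SOURCE * r2 / (R1 + r2)

print(bias_voltage(0.0))  # least negative end: about -33.3 V
print(bias_voltage(1.0))  # most negative end: -40.0 V
```

With those made-up values you get roughly -33V to -40V of adjustment, about a 6.7V range.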
The result? A big R1 (bias divider) gets you a narrow adjustment range -- as little as ±3V. In some Fender layouts, if you center this range on the schematic bias voltage, the most negative voltage available falls far short of the -40 Merlin wants. Of course, if you just swap in a small R1, you can widen the adjustment range to say 12V, with the most negative end at or near -40 -- but at the cost of making bias adjustment much less precise (let's assume a single-turn pot).
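To see the squeeze, here's the same divider formula swept over a few hypothetical R1 values (again assuming a -50V source, 10K tail, and 10K pot -- a real Fender bias supply and topology will differ):

```python
# Sweep the bias divider resistor R1 and report the adjustment range.
V_SOURCE, R_TAIL, POT = -50.0, 10e3, 10e3  # hypothetical values

def adjustment_range(r1):
    """Return (least negative, most negative, span) for a given divider R1."""
    def v(pot_fraction):
        r2 = R_TAIL + POT * pot_fraction
        return V_SOURCE * r2 / (r1 + r2)
    return v(0.0), v(1.0), abs(v(1.0) - v(0.0))

for r1 in (5e3, 47e3, 200e3):
    least, most, span = adjustment_range(r1)
    print(f"R1={r1/1e3:.0f}K: {least:.1f} V to {most:.1f} V (range {span:.1f} V)")
# R1=5K:   -33.3 V to -40.0 V (range 6.7 V)
# R1=47K:   -8.8 V to -14.9 V (range 6.2 V)
# R1=200K:  -2.4 V to  -4.5 V (range 2.2 V)
```

Note the double whammy in this simple model: as R1 grows, the range not only narrows but the whole window also drifts away from -40.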
Yes, I know many people just run a 25K or 50K pot and get a wider range, I know you can use 10-turn pots, and yes, many people want to be able to bias for two or more tube types.
But for a single tube type (say 6V6) *what is a useful, real-world range of bias adjustment*? 3V? 12V? or???