> How does -37 VDC convert to mA once I set the voltage?
Ask the tube.
Every tube-type is different.
Every tube of the "same type" is a little different, maybe 20%.
Attached is the published curve for 6V6. It is plotted for 250V and g-amps tend to run higher, so the bias won't be the same.
Note that the line is NOT straight. You can't calculate this. You have to have the curve.
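For anyone who wants to automate the "read the curve" step: here is a minimal sketch that interpolates plate current from a handful of (bias, current) points digitized off a published curve. The numbers are illustrative stand-ins in the general neighborhood of 6V6 behavior at 250V, not datasheet-exact values.

```python
# Minimal sketch: look up plate current from digitized curve points.
# The (Vg1, Ia) pairs below are illustrative, NOT exact 6V6 data.
import numpy as np

vg1_volts = np.array([-25.0, -20.0, -16.0, -14.0, -12.0, -10.0, -7.5])
ia_ma     = np.array([  4.0,  14.0,  29.0,  40.0,  52.0,  66.0, 90.0])

def plate_current_ma(bias_volts):
    """Interpolate plate current (mA) for a given grid bias (V)."""
    # np.interp needs ascending x; vg1_volts ascends (most negative first)
    return np.interp(bias_volts, vg1_volts, ia_ma)

print(plate_current_ma(-14.0))  # ~40 mA, the design point discussed below
print(plate_current_ma(-18.0))  # note the curvature: a straight-line guess is off
```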
Let us say the designer liked 40mA of current in the tube.
OK, so the designer looks at the curve and finds 40mA takes -14V of bias.
That works if the tube(s) in the hand are "exactly average".
Now build with a thousand tubes. (Or today, a handful from different years and factories.) They will lie all over the blue-shade ±20% zone (and worse). A "correct" -14V bias may give, not 40mA, but 32mA or 48mA.
Do we care?
If it is Class A, this is moderately critical, because the tube can't suck more than it is biased at. (More is hot; less is early distortion.) If it is Class AB, the exact bias is not very critical. The tube *will* suck more current when needed; a too-cold bias may be "rough" on small sounds.
Since Class A should be run Cathode Bias Whenever Possible, we know this is an AB design. The exact current is not very critical. We aim so that a 20%-high tube won't melt real quick, and a 20%-low tube won't slack off on very small sounds.
We work with bias Voltage because it is easier to measure than current. We do not have to break the circuit, or build test-resistors in. Test resistors (1 Ohm jobs) were not real viable in 1958 because voltmeters of the day did not read 40mV very well on a 1.5V (1500mV) minimum scale. If you got the bias voltage to about 14V, the amp was good to go. We did not fret about 32mA or 48mA actual current.
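If you do use the modern 1-Ohm cathode-resistor trick, the arithmetic is just Ohm's law, sketched below; the meter numbers are examples, not measurements.

```python
# Sketch of the 1-Ohm test-resistor arithmetic: with R = 1 Ohm in the
# cathode, millivolts across it read directly as milliamps (I = V/R).
R_TEST_OHMS = 1.0

def cathode_ma(measured_mv):
    """Cathode current (mA) from mV measured across the 1-Ohm resistor."""
    return measured_mv / R_TEST_OHMS  # mV / Ohm = mA

print(cathode_ma(40.0))  # 40 mV across 1 Ohm -> 40 mA

# The 1958 problem in numbers: 40 mV on a meter whose most sensitive
# range was 1.5 V (1500 mV) full-scale is under 3% of the scale.
print(40.0 / 1500.0 * 100, "% of full scale")
```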
> Fender considered -37V to be a safe bias point for 6V6s in that amp. They don't care how many mA is flowing through the cathode. That's why they don't... show the actual tube current.
The Designer probably had a hint if it was 10mA or 100mA. He also diddled G2 voltage up over the model years and had to diddle bias down to avoid occasional red-plating in final test. He may even have breadboarded one and cycled a crate of tubes through it to see if any behaved badly. (Many seasoned factory techs saved a few extreme tubes to try in new designs, which avoided trouble in the factory and in the field.)
We now have good meters and new-kids (yes, some not young) get crazy about the current. We see that _IF_ the amp had to run AT 40.00mA, we might need -12V or -16V for different tubes from a pile.
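To put numbers on that, here is a sketch that scales the illustrative curve from earlier by ±20% (standing in for a hot or cold tube) and inverts it to find the bias that lands on exactly 40mA. Again, the curve points are made-up stand-ins, not real 6V6 data.

```python
# Sketch: what bias does a cold / average / hot tube need for exactly 40 mA?
# Same illustrative (Vg1, Ia) points as before -- not real 6V6 data.
import numpy as np

vg1_volts = np.array([-25.0, -20.0, -16.0, -14.0, -12.0, -10.0, -7.5])
ia_ma     = np.array([  4.0,  14.0,  29.0,  40.0,  52.0,  66.0, 90.0])

def bias_for_current(target_ma, tube_scale=1.0):
    """Grid bias (V) giving target_ma, with the curve scaled by tube_scale."""
    # Current ascends as bias gets less negative, so np.interp is happy.
    return np.interp(target_ma, ia_ma * tube_scale, vg1_volts)

for scale in (0.8, 1.0, 1.2):  # 20% cold, average, 20% hot
    print(scale, round(bias_for_current(40.0, scale), 1), "V")
# Prints roughly -12.3, -14.0, -15.2 V: the "-12V or -16V" spread above.
```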
My feeling is that, if you know the amp, you read the plate temperature and see if it seems hot or cool. Using your hand leaves burns. I used to put my face in the amp: lips are sensitive to radiant heat. Today there are these dandy remote thermometers, which are not quite accurate on hot glass but will consistently tell you warmer/cooler. And if you repair 7 amps an hour in a shop, a quick scan with an IR thermo instantly tells if a tube is hardly-alive or just short of red-plating, before it disappoints the customer.
One thing to remember: if you change the amp's Screen voltage, you want to change the G1 bias voltage at least as much. If you have a vintage design with 5K to the screens and change to a choke (500 Ohm screen drop), giving say a 25% rise of Vg2, you want 25%-30% more Vg1.
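As a quick worked example of that rule (voltages invented for illustration; only the ratios matter):

```python
# Rough arithmetic for the screen-supply change above. Voltages are
# made up for illustration; only the percentages matter.
vg2_old = 250.0            # screen volts behind the vintage 5K dropper
vg2_new = vg2_old * 1.25   # ~25% rise after the 500-Ohm choke swap
vg1_old = -14.0            # old grid bias

# Rule of thumb from the text: 25%-30% more (more negative) Vg1
vg1_new = (vg1_old * 1.25, vg1_old * 1.30)
print(vg2_new, vg1_new)    # 312.5 (-17.5, -18.2)
```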
> Do we really have to be so precise? After all, it's just a simple audio amp!
Right. And it is a TUBE amp. Tubes are soft. Nothing happens suddenly. 40.00mA does not ensure a Grammy any more than 35mA gets you run out of the club in disgrace.
Much over 14W Pdiss in 6V6 increases the chance of a tube-death mid-set. (But Fender runs 17W in one model.) Way-cold bias is audibly "hoarse" on very soft sounds (so play loud!).
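For reference, the dissipation arithmetic behind those numbers, as a sketch. The operating points are invented, and using cathode current slightly overstates true plate dissipation because it includes screen current.

```python
# Quick plate-dissipation check. Approximation: cathode current includes
# screen current, so this slightly overstates true plate dissipation.
P_MAX_6V6 = 14.0  # published 6V6 limit, per the text

def pdiss_watts(plate_volts, cathode_ma):
    return plate_volts * cathode_ma / 1000.0

for va, ima in ((350.0, 40.0), (425.0, 40.0)):  # hypothetical operating points
    p = pdiss_watts(va, ima)
    print(f"{va:.0f} V x {ima:.0f} mA = {p:.1f} W "
          f"({100 * p / P_MAX_6V6:.0f}% of 14 W)")
# 350 V at 40 mA is right at 14 W; 425 V at 40 mA is ~17 W,
# like the hot-running Fender mentioned above.
```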