... from what I got from Jeff, cathode bias is essentially adjusting current draw in reference to the cathode. Fixed bias is adjusting in reference to ground. ...
No, this is a wrong way to think about the bias methods.
Fixed Bias: Apply a fixed, negative voltage to the grid. The grid is negative relative to the cathode (which is often, but not always, at 0V).
Cathode Bias: Use the cathode current to create a voltage drop across a resistor to raise the cathode "above ground." The grid is negative relative to the cathode (which is not at 0V).
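To make those two definitions concrete, here's a minimal arithmetic sketch. In both schemes the thing that matters is the grid-to-cathode voltage; the component values (40V supply, 50 mA, 250 ohms) are just illustrative numbers, not from any particular amp:

```python
# Sketch: grid-to-cathode voltage under each bias scheme.
# All values are illustrative, not from any particular amp.

def fixed_bias_vgk(grid_supply_v):
    """Fixed bias: cathode sits at 0 V, a fixed negative voltage goes on the grid."""
    cathode_v = 0.0
    return grid_supply_v - cathode_v

def cathode_bias_vgk(cathode_current_a, cathode_resistor_ohms):
    """Cathode bias: grid sits at 0 V, cathode is lifted by I_k * R_k (Ohm's law)."""
    cathode_v = cathode_current_a * cathode_resistor_ohms
    grid_v = 0.0
    return grid_v - cathode_v

print(fixed_bias_vgk(-40.0))          # -40.0 V grid-to-cathode
print(cathode_bias_vgk(0.050, 250))   # -12.5 V grid-to-cathode
```

Either way the grid ends up negative relative to the cathode; the difference is just which end of the relationship you move.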
(not arguing, just trying to get an understanding)
So, what do we have here?
This is a cathode biased amp because we are using the current through the cathode resistor to get a voltage difference between grid and cathode...
But also...
This is a fixed bias amp, because we are putting a negative voltage (relative to the cathode) on the grid.
What is this?
Say you had a cathode biased amp that was biased cold. How would you adjust the bias? To my understanding, you would decrease the value of the cathode resistor to get less of a voltage drop, thus changing the voltage at the cathode, thus changing the relative bias voltage difference between grid and cathode.
Would you not?
In order to adjust bias on a cathode biased amp you would adjust the value of the cathode resistor, right?
Why wouldn't you take that same approach if we wanted an adjustABLE cathode biased amp?
In other words, there are two ways to change the voltage relationship between cathode and grid. One is to make the cathode resistor smaller; the other is to introduce a negative voltage (or a 'less positive' voltage) on the grid.
If you had a cold biased cathode biased amp, how would you adjust it?
Smaller cathode resistor
OR
Add a voltage divider to the cathode (which is too positive) to take a small percentage of that voltage to put on the grid, so the relationship of cathode to grid is lower?
Say you've got a cathode biased amp where the cathode is 30V and the grid is at ground. If you wanted to bias it hotter, say 20V, would you actually add a voltage divider to the 30V cathode to get 10V, put the 10V on the grid, creating a 30V cathode to 10V grid for a 20V bias?
Or would you just use a smaller cathode resistor?
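Just to put numbers on the two routes from the 30V example: here's a quick sketch. The 60 mA cathode current is an assumed, illustrative value, and it's held constant, which a real tube wouldn't do (hotter bias means more current, so the actual resistor value would land somewhere else), but the arithmetic of the two routes is the point:

```python
# Sketch of the two routes from the 30 V example above.
# Idealized: cathode current is held constant. A real tube draws
# more current as the bias gets hotter, shifting these numbers.

I_K = 0.060  # assumed cathode current, amps (illustrative)

# Route 1: voltage divider from the 30 V cathode onto the grid.
cathode_v = 30.0
divider_ratio = 10.0 / 30.0           # tap one third of the cathode voltage
grid_v = cathode_v * divider_ratio    # 10 V on the grid
bias_1 = grid_v - cathode_v           # 10 - 30 = -20 V grid-to-cathode

# Route 2: shrink the cathode resistor so the drop is only 20 V.
rk_old = cathode_v / I_K              # 500 ohms gave the 30 V drop
rk_new = 20.0 / I_K                   # ~333 ohms gives a 20 V drop
bias_2 = 0.0 - (I_K * rk_new)        # -20 V grid-to-cathode

print(bias_1, bias_2)                 # both land at -20 V
```

Both routes hit the same -20V grid-to-cathode relationship, which is exactly the question: why prefer the divider over the smaller resistor?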
If you would say smaller resistor, then why not use a variable cathode resistor as a bias adjustment?
That's the part I don't get.
Both ways would work, right?
So, what is the advantage of biasing it cold with a bigger cathode resistor, using a voltage divider to get a smaller positive voltage, and putting that relatively smaller voltage on the grid?
You wouldn't do that to adjust a cathode biased amp's bias, so why take that approach for an adjustABLE bias control?
Just 'cuz a pot's cheaper than a rheostat?
Or is there something wrong with using a rheostat to create an adjustable cathode resistor?
Seems like replacing the cathode resistor with a smaller resistor in series with a rheostat would be the easiest, most direct way to add an adjustable bias and keep the amp cathode biased, without introducing a mixed cathode/fixed bias scheme.
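The series arrangement described above sketches out like this. The 250-ohm fixed resistor and 250-ohm rheostat are assumed, illustrative values (the fixed part acts as a safety floor so the cathode resistance can never be dialed to zero), and again the current is idealized as constant:

```python
# Sketch: adjustable cathode bias via a fixed resistor in series
# with a rheostat, as described above. Values are illustrative,
# and the cathode current is idealized as constant.

I_K = 0.060          # assumed cathode current, amps
R_FIXED = 250        # safety floor: total resistance can never reach zero
RHEOSTAT_MAX = 250   # rheostat wired as a variable series resistance

for wiper in (0.0, 0.5, 1.0):              # rheostat at min, middle, max
    rk = R_FIXED + wiper * RHEOSTAT_MAX    # total cathode resistance
    print(rk, -(I_K * rk))                 # total ohms, grid-to-cathode volts
```

With these numbers the bias sweeps from about -15V (rheostat at minimum) to about -30V (rheostat at maximum), all while staying purely cathode biased.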
No?
I must be missing something
Peace