Hey folks,
Is there any need (or benefit) to designing a phase inverter into a low-wattage, single-output-tube amp? I'm toying around with a 1-watt-ish "bedroom" amp, probably using a 12AT7 as the output... I understand phase inverters are usually used with larger tubes (pentodes and such), but a dual triode is two tubes in one envelope (sorta), so would there be any benefit in adding one to such an amp? Or does this fall under the heading of "needless complexity with no real benefit conferred"? Or is it just a plain bad idea, or totally unnecessary? Thanks for your thoughts.
You can make it a "self-split" power amp. The Firefly is a good example of a DIY project with a self-split 12AU7 power amp.
Basically, for self-split you ground the grid of the second triode, and both triodes share an unbypassed cathode resistor. When the first triode pulls more current, the cathode resistor drops more voltage, so the second triode passes less current, and vice versa. It's really not unlike how a long-tailed-pair phase inverter works.
It's not very efficient, though; for one thing, you can't bypass the cathode resistor. Of course, that can help you if you're trying to keep the output level down.
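If it helps to see the split action in numbers, here's a rough small-signal sketch (assuming an idealized transconductance-only triode model; the gm and Rk values below are placeholders, not taken from any particular amp):

```python
# Linearized sketch of a self-split pair: V1's grid sees the input signal,
# V2's grid is grounded, both cathodes tie to one shared, unbypassed Rk.
def self_split_currents(gm, rk, vin):
    # Signal currents: i1 = gm*(vin - vk), i2 = gm*(0 - vk),
    # and the shared cathode voltage is vk = rk*(i1 + i2).
    # Solving for vk:
    vk = gm * rk * vin / (1 + 2 * gm * rk)
    i1 = gm * (vin - vk)   # first triode: driven directly
    i2 = gm * (0 - vk)     # second triode: driven only via the cathode
    return i1, i2

gm = 2e-3    # ~2 mA/V, ballpark small-triode figure (assumed)
rk = 1000.0  # shared, unbypassed cathode resistor (assumed value)
i1, i2 = self_split_currents(gm, rk, 1.0)  # 1 V input swing
```

The two signal currents come out with opposite signs (there's your phase split), but the ratio |i2|/|i1| works out to gm·Rk/(1 + gm·Rk), always less than 1 — the drive is inherently unbalanced, which is part of why self-split isn't very efficient.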
It works... but I've never been a fan of it. Or at least, I've never played a self-split amp I liked.
Another thing to think about is that a dual-triode power amp won't pull much current, so it works very well with a cathodyne PI (with big grid stoppers, like 470k big). Many 1-5W Marshall amps use a cathodyne PI. That only requires one triode.
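For a quick feel of what the cathodyne gives you, here's the textbook split-load gain estimate (a sketch only; the mu and ra numbers are ballpark 12AX7 datasheet figures, and the 47k load value is just an assumption):

```python
def cathodyne_gain(mu, ra, r):
    # Unloaded small-signal gain from grid to EACH output of a split-load
    # (cathodyne) inverter with equal plate and cathode resistors r.
    # Plate current: i = mu*vin / (ra + 2*r + mu*r); each output = i*r.
    return mu * r / (ra + (mu + 2) * r)

gain = cathodyne_gain(mu=100, ra=62.5e3, r=47e3)  # ballpark 12AX7 (assumed)
```

The per-side gain lands just under unity: the cathodyne splits phase but adds essentially no voltage gain, which is fine ahead of a low-drive dual-triode output stage.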
You can also run that dual triode as parallel single-ended stages, in which case no phase inversion is required.