Hoffman Amplifiers Tube Amplifier Forum

Amp Stuff => Tube Amp Building - Tweaks - Repairs => Topic started by: dude on May 22, 2018, 04:53:40 pm

Title: Biasing a cathode amp with VVR Question
Post by: dude on May 22, 2018, 04:53:40 pm
Two EL84s (18 watt TMB), VVR wide open: 334v on plates, 12v across the cathode R (160 ohms) = 11.3 watts dissipation. (Too high?)
VVR at half power: 221v on plates, 8v across the cathode R = 5.1 watts dissipation.
I never checked bias with the VVR turned down. I guess when using a VVR to lower volume you also lower dissipation and the tubes run very cold...? Is this normal?
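For anyone following along, the dissipation arithmetic above can be sketched like this. This is just a back-of-the-envelope check, not necessarily dude's exact method: it treats all cathode current as plate current, so it reads slightly high, and his 11.3 W / 5.1 W figures are consistent with also subtracting a few mA of screen current.

```python
# Rough per-tube dissipation from the measurements above.
# Assumes the 160 ohm cathode resistor is shared by both EL84s
# and ignores screen current, so the result reads a bit high.

def el84_dissipation(v_plate, v_cathode, r_cathode=160.0, n_tubes=2):
    i_total = v_cathode / r_cathode   # total cathode current (A)
    v_pk = v_plate - v_cathode        # plate-to-cathode voltage (V)
    return v_pk * i_total / n_tubes   # watts per tube

print(el84_dissipation(334, 12))  # VVR wide open  -> ~12.1 W
print(el84_dissipation(221, 8))   # VVR half power -> ~5.3 W
```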


al
Title: Re: Biasing a cathode amp with VVR Question
Post by: sluckey on May 22, 2018, 05:06:59 pm
Hey dude, that's just how it works! (Always wanted to say that!)  :icon_biggrin:
Title: Re: Biasing a cathode amp with VVR Question
Post by: dude on May 22, 2018, 05:58:35 pm
Good answer. If I drink five beers I get drunk, why...? That's just the way it is.... :l2:
Title: Re: Biasing a cathode amp with VVR Question
Post by: PRR on May 22, 2018, 09:12:13 pm
> Is this normal?

Less voltage, less current, less power.... is that not what you asked for?

Tubes are non-linear. As you come down to 1/10th voltage (nominally 1/100th power) the current is dropping more than the voltage. I once tried to self-correct this. However 1% power is usually low enough; if you truly want microwatts you probably want an amp FOR that use, not a strangulated big amp.
Title: Re: Biasing a cathode amp with VVR Question
Post by: HotBluePlates on May 23, 2018, 09:19:58 am
... VVR wide open ... 12v across cathode R ... 11.3 watts dissipation. ...
VVR on half power ... 8v across Cathode R ... 5.1 watts dissipation.
... I guess when using a VVR to lower volume ... tubes runs very cold...? Is this normal?

Turning the VVR down does lower tube dissipation.  You don't need the tubes idling at 100% when shooting for less output power.  Those tubes will last a whole lot longer.

But notice your bias voltage went down...  When the drive signal peak pushes the grid-to-cathode voltage to 0v (peak drive signal = bias voltage), you're pretty much guaranteed distortion.  So less dissipation & smaller bias voltage also means less drive required to push the tubes into distortion, and at lower power output.

So it all works the way you'd want without any extra work from you!
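The drive-headroom point above can be put in numbers. In a cathode-biased stage the grid-to-cathode bias equals the measured cathode voltage, so the clean drive peak at the grid is roughly that bias voltage. A sketch using the cathode voltages from the first post:

```python
import math

# Peak clean drive ~= bias voltage (Vgk swings from -bias up to 0,
# where grid-limiting and distortion set in). Dropping the bias from
# 12 V (VVR wide open) to 8 V (VVR at half) means the amp clips with
# roughly 3.5 dB less drive, on top of the lower supply voltage.
full_bias, half_bias = 12.0, 8.0
headroom_change_db = 20 * math.log10(half_bias / full_bias)
print(headroom_change_db)  # ~ -3.52 dB
```

So the tubes not only dissipate less, they also distort earlier, which is exactly what you want from a power scaler.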
Title: Re: Biasing a cathode amp with VVR Question
Post by: dude on May 23, 2018, 12:32:40 pm
So, I take it that when the voltage is lowered, the low dissipation is no big deal, since you're not playing. But when playing at lower VVR settings the cathode voltage goes up; maybe this is why I don't hear any bad distortion...? Of course, turning the VVR way down, as PRR mentioned some do, will cause unwanted distortion at very low dissipation.


I never liked the so-called "bedroom level" using a VVR. IMO, once the voltage gets below 200v the amp doesn't sound that good; it loses highs and some mojo. I'm not a bedroom player. I use the VVR to cut about 100v or so to get a nice sweet spot at a lower output, using the volume, gain and VVR pots to taste according to the playing venue. I find that doing the whole amp, cathode biased, sounds best for me, especially when playing in smaller jam rooms. On stage I rarely lower the VVR much more than a quarter turn, if that, unless the amp is miked.


So for me, the lower dissipation at about 240v is ideal, as it goes way up when playing; I can see the cathode voltage jumping from 6v to over 12v and higher. Maybe that's why I hear no bad distortion...?
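That cathode-voltage jump is easy to translate into current. Since the same 160 ohm resistor carries the current for both tubes, Ohm's law says the average current through the pair roughly doubles under drive (a simple sketch, using dude's reported 6 V idle and 12 V playing readings):

```python
# Average cathode current through the shared 160 ohm resistor,
# from dude's observed voltages: 6 V at idle, 12+ V under drive.
R_CATHODE = 160.0  # ohms, shared by both EL84s

for label, v in [("idle", 6.0), ("playing", 12.0)]:
    print(label, v / R_CATHODE * 1000)  # mA: ~37.5 idle, ~75 playing
```

That doubling of average current is why the idle dissipation can sit comfortably low: the tubes work much harder the moment you play.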


Just my 2 cents, probably not worth much...?  :icon_biggrin: