Starting this thread to discuss this topic and also to help others looking at adding VVR voltage scaling to an AB763-type amp.
As background - I love the sound of an overdriven, cranked-up Deluxe Reverb, but I can't always make that much noise! So I wanted to build an amp that could also be used at low volumes. Accepting that part of that 'cranked' sound comes from the speaker, the OT and the sheer movement of air, I knew I wasn't going to get the same tone at low volumes as I would at high volumes, but I wanted to try to get as close as I could to that chimey, bell-like tone with the gritty overdriven distortion on top. Think SRV.
I'd already added VVR to a 5E3 Tweed Deluxe clone I'd built, so using that knowledge and a bit of research I built a single-channel, non-vibrato AB763 amp as an amp head, using a 12" Jensen P12Q speaker for testing.
I'll post the schematics and details shortly (I'm currently travelling). The amp is an exact copy of the original AB763 circuit, vibrato channel only, but with the vibrato circuit removed and a master volume added just before the phase inverter tube. I've used a new Princeton chassis as the basis of the build.
Since the majority of the distortion in a DR seems to come from the power output stage, and since in my tests clipping the phase inverter sounded really horrible, I decided to apply VVR voltage scaling to the 6V6 output stage only (not to the preamp or phase inverter stages). The idea was to drive the output stage with a fairly clean signal, controlled by the master volume, and reduce the plate voltage on the 6V6s to the point where they clipped.
I then added a two-part VVR circuit (schematic to follow) to control the B+ to the 6V6 tubes only and, in parallel, to track the fixed bias as the B+ is reduced.
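For anyone unfamiliar with the tracking idea, here's a minimal sketch of the relationship in Python, so the numbers are easy to play with. The 410V supply and -34V bias figures are typical assumptions for an AB763 Deluxe, not measurements from my build:

```python
# Sketch of how a tracking VVR scales the fixed bias along with B+.
# All numbers here are illustrative assumptions, not measurements.

FULL_B_PLUS = 410.0   # assumed AB763 Deluxe 6V6 plate supply, volts
FULL_BIAS   = -34.0   # assumed 6V6 fixed-bias grid voltage at full B+

def tracked_bias(scaled_b_plus):
    """Bias voltage if the tracking side simply scales it with B+."""
    return FULL_BIAS * (scaled_b_plus / FULL_B_PLUS)

for b_plus in (410, 300, 200, 110):
    print(f"B+ = {b_plus:3d}V -> tracked bias = {tracked_bias(b_plus):6.1f}V")
```

As you'll see below, straight proportional tracking like this turned out to be part of my tone problem.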
My first tests sounded OK, but nothing like the tone I was looking for. I could wind the voltage on the 6V6 tubes down to around 110V to get a low volume, then increase the master volume until the output stage was just being pushed into clipping. However, the resulting tone was poor: the distortion sounded really harsh, nothing like the warm tube tone I'd hoped for.
To cut a long story short, I tried to find some info about the correct bias point when voltage scaling a 6V6 output stage, but couldn't find any. So I messed around with the bias point on the 6V6s myself. Most VVR circuits I'd seen online reduced the 6V6 bias to the point that, when scaled down to 100V or so, the tubes were only drawing a few mA of idle current. I experimented with this and found that, just as the bias needs to be set at around 20-25mA at normal voltage levels to get a good warm tone, exactly the same applies under VVR. I ended up with a bias current of 20mA per 6V6 tube and suddenly the tone was amazing and exactly what I was looking for.
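If you want to see why the 'few mA' happens, the old space-charge (3/2-power) law gives a decent back-of-envelope: scale all the electrode voltages by the same factor k and the cathode current falls by roughly k^1.5. A rough sketch with assumed numbers:

```python
# Why proportionally tracked bias runs the 6V6s cold at low B+:
# scaling all electrode voltages by k scales current by roughly k**1.5
# (3/2-power law). Numbers below are assumptions, not measurements.

I_FULL   = 0.022            # assumed idle current per 6V6 at full B+, amps
V_FULL   = 410.0
V_SCALED = 110.0

k = V_SCALED / V_FULL
print(f"Tracked idle current ~ {1000 * I_FULL * k**1.5:.1f} mA per tube")

# Re-biasing 'hot' at the scaled voltage is still very easy on the tubes:
p_plate = V_SCALED * 0.020  # 110V at my 20mA setting
print(f"Idle plate dissipation ~ {p_plate:.1f} W (vs ~12W 6V6 max rating)")
```

So at 110V the tubes are loafing even at 20mA; the heat problem just moves to the MOSFET instead.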
The drawback is that the MOSFET in the VVR circuit now has to drop around 340V across it at 20mA per tube (40mA total for the pair), which equates to approx 14 watts of dissipation at idle... that is dumping a whole load of heat, which means a proper heatsink is needed.
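The arithmetic, for anyone checking, including what the cold-biased version would dissipate instead:

```python
# MOSFET idle dissipation in the VVR pass element.
V_DROP = 340.0            # approx volts dropped across the MOSFET
I_HOT  = 2 * 0.020        # two 6V6s at my 20mA 'hot' setting
I_COLD = 2 * 0.003        # two 6V6s at the few-mA 'tracked' setting

print(f"Hot bias:  {V_DROP * I_HOT:5.1f} W at idle")   # ~13.6W, call it 14W
print(f"Cold bias: {V_DROP * I_COLD:5.1f} W at idle")  # ~2W, barely warm
```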
I'm wondering if some VVR circuits intentionally bias the 6V6 cold so as to avoid this power/heat issue?
Edited to be more accurate >> The main reason I'm posting this thread is that I'm guessing there may be people who've built fixed-bias amps with VVR, not realising that the bias current needs to be set as 'hot' as in normal (non-VVR) operation? If I hadn't messed around with the 6V6 bias point at low plate voltages, I would have still used the amp, but just accepted it didn't sound great at low volumes.
Anyway, the end result was that I added copper heatsink plates to the chassis (one inside and one outside, connected via copper rivets) to dissipate the heat. This approach is fine for a self-build amp, but might be difficult to retrofit into a shop-bought amp (?)
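For anyone sizing a heatsink for this, a rough thermal budget (all the thermal resistances below are assumed, typical datasheet-style values, not measurements from my build):

```python
# Rough heatsink budget for ~14W continuous in the VVR pass MOSFET.
# Thermal resistances are assumed/typical values, not measurements.

P       = 14.0    # watts dissipated at idle
T_J_MAX = 150.0   # typical silicon junction limit, deg C
T_AMB   = 40.0    # assumed air temperature inside a warm chassis, deg C
R_JC    = 1.0     # assumed junction-to-case, deg C/W (TO-220-ish part)
R_CS    = 0.5     # assumed case-to-sink, insulator pad plus compound

# Highest sink-to-ambient thermal resistance that keeps the junction safe:
r_sa_max = (T_J_MAX - T_AMB) / P - R_JC - R_CS
print(f"Heatsink must be better than ~{r_sa_max:.1f} degC/W "
      f"(and you'd want real margin below that)")
```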
Anyone have any similar experiences or info to add?
More info/photos/schematics of amp and VVR to follow