...seems to be the norm in guitar amps! what gives? how closely do you adhere to that rating in your own builds, and how do you divide up that resistance between grid stoppers, grid leaks, and the bias circuit?
just about every commonly used power tube i know of, besides the EL34, has a max grid circuit resistance rating of no more than 100K with fixed bias. but most grid circuits i see are made up of 220K grid leak resistors, plus anywhere from 15-100K to ground in the bias circuit... considering the bias load resistance is shared by all the power tubes, and the grid leaks are shared by parallel pairs when there are 4 power tubes, that works out to a grid circuit resistance per tube of 250K at the absolute best (2 tubes, 15K bias), and at the worst, with 4 power tubes and the highest bias circuit resistance, 840K! exceeding the maximum rating by anywhere from 2.5 to 8.4 times.
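for anyone who wants to check my math, here's a rough python sketch using the same counting convention as above (any resistance shared by N grids gets counted N times, since every one of those tubes can push grid current through it). the function name and the "typical" part values are just there for illustration:

```python
def grid_circuit_r_per_tube(r_leak, tubes_per_leak, r_bias, tubes_total):
    # total DC resistance each grid "sees" back toward the bias supply,
    # counting shared resistors once per tube sharing them
    return r_leak * tubes_per_leak + r_bias * tubes_total

# best case: 2 tubes, individual 220K leaks, 15K bias feed shared by 2
best = grid_circuit_r_per_tube(220e3, 1, 15e3, 2)

# worst case: 4 tubes, 220K leaks shared by pairs, 100K bias feed shared by 4
worst = grid_circuit_r_per_tube(220e3, 2, 100e3, 4)

print(f"best:  {best / 1e3:.0f}K, {best / 100e3:.1f}x the 100K rating")
print(f"worst: {worst / 1e3:.0f}K, {worst / 100e3:.1f}x the 100K rating")
```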
my understanding is that this limit is there to protect the power tubes when the grid is driven positive past 0V, ensuring there's a path of low enough impedance for reverse grid current to flow, so that grid current limiting kicks in and keeps the grid from being driven further positive. seems pretty crucial. i appreciate that datasheets are just a starting point for what a tube can actually handle, that plenty of tried and true circuits use these values (and exceed other ratings too), and that how much it matters depends on whether or not the power amp is even being overdriven. but i still wonder how much this actually contributes to redplating and/or premature wear.
i'm designing a push-pull output section with 2x KT88's driven by a 12AT7 LTP, trying to make it stay tight with minimal blocking and crossover distortion, but with a low end that'll knock you off your feet, and a circuit that'll let the KT88's go the distance under very heavy overdrive (yes, the homie i'm designing it for DOES play that loud lol). i'm looking for less phase inverter distortion, more power amp distortion, and the ability for the LTP to really overdrive the living hell out of the KT88's without any adverse effects.
KT88 plates are at 580V, screens at 450V with 1.5K screen stoppers. the LTP has the usual 82K/100K anode resistors running from a 440V supply, center biased with a 40V tail. the bias circuit has 20K max resistance to ground, which presents 40K to each grid circuit... technically leaving only 60K per tube, split between the grid leak and grid stopper, to still meet the maximum rating.
the grid leaks give the LTP the nicely centered AC loadline i like when they're at about 47K, which leaves only 13K for the grid stoppers. i'd love to make the grid stoppers large as hell to control blocking distortion as much as possible though.
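spelled out with my own numbers, the per-tube budget looks like this if i hold myself to the 100K rating exactly:

```python
r_max  = 100e3       # datasheet max grid circuit resistance, fixed bias
r_bias = 20e3 * 2    # 20K bias network shared by 2 tubes -> 40K per grid
r_leak = 47e3        # the grid leak value that gives the LTP loadline i want

r_stopper_budget = r_max - r_bias - r_leak
print(f"{r_stopper_budget / 1e3:.0f}K left for the grid stopper")   # 13K
```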
how large would you be comfortable making the grid stoppers here? if i were to go with the conservative side of traditional design, meaning a total resistance 2.5x the recommended value, i could bump the grid stoppers up to about 160K, which would be awesome. but i'm really expecting this power section to get heavily overdriven, more than those traditional designs likely accounted for, so i'm not sure how far i should push it...
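to make the trade-off concrete, here's the same budget with the rating relaxed by a few different factors. the multipliers are just knobs to show where the stopper budget lands, not me claiming any of them are safe:

```python
r_bias_per_grid = 20e3 * 2   # 40K, counted the same way as above
r_leak          = 47e3

for multiple in (1.0, 1.5, 2.0, 2.5):
    allowed = 100e3 * multiple
    stopper = allowed - r_bias_per_grid - r_leak
    print(f"{multiple:.1f}x the rating -> about {stopper / 1e3:.0f}K of grid stopper budget")
```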
last random question: does one take the bias circuit's resistance to ground into account when calculating the AC loadline impedance for the LTP? i'm using such low-value grid leaks that, if it does count, it would make a significant difference. my guess is that it does not factor in, because the out-of-phase AC signals cancel each other out after the grid leaks, making the negative bias supply a virtual ground of sorts for AC signals.
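for reference, here's roughly how much it would matter either way, treating only the resistive loads and ignoring the KT88 grid until it draws current (whether the bias node really behaves as an AC ground is exactly what i'm asking):

```python
def parallel(a, b):
    return a * b / (a + b)

r_leak = 47e3   # KT88 grid leak
r_bias = 40e3   # bias resistance per grid, same figure as above, worst case

for r_anode in (82e3, 100e3):
    bias_is_ac_ground = parallel(r_anode, r_leak)           # bias node ignored
    bias_adds_in      = parallel(r_anode, r_leak + r_bias)  # bias resistance in the AC path
    print(f"{r_anode / 1e3:.0f}K side: "
          f"{bias_is_ac_ground / 1e3:.1f}K vs {bias_adds_in / 1e3:.1f}K")
```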