Thank you guys, you have been a lot of help to me and this is what I came up with.
Another thing is that it's essential to provide a dc path between a tube's control grid and cathode, ie the grid must be 'referenced' to the cathode. Usually that's done via a gain pot or grid leak resistor; with cathode bias, both would be referenced to circuit common (0V).
Your first and second stages' grids are missing such a reference.
The 100k input grid stopper will cause a lot of hiss (a series resistance that large adds significant thermal noise right at the input); pentode input stages generally don't need a grid stopper.
Regarding the coupling cap values: they work with the resistance in the circuit to form a high-pass filter, whose -3dB corner frequency is set by the cap's value C and the (total effective) resistance R, such that f = 1 / (2 x pi x R x C).
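If you'd rather plug numbers in than do it by hand, here's a minimal Python sketch of that formula (the 0.022uF / 1M pairing is just an arbitrary example, not from any particular circuit):

```python
import math

def corner_freq_hz(c_farads, r_ohms):
    # -3dB corner of a simple RC high-pass: f = 1 / (2 * pi * R * C)
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# eg a 0.022uF coupling cap feeding a 1M grid leak
print(round(corner_freq_hz(0.022e-6, 1e6), 1))  # ~7.2Hz
```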
In the old days, a useful 'rule of thumb' (ad hoc guideline) was that for 'full bandwidth' music reproduction (50Hz-15kHz), C (in uF) x R (in k ohms) should equal about 10 (to my recollection, the 20Hz-20kHz 'hifi' standard was more of a 70s transistor-era thing).
That puts the filter's -3dB corner at about 16Hz, so there's minimal attenuation of any guitar bottom end (a bass amp may need a bit more, eg CxR = 20).
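The arithmetic behind that is worth seeing once: C in uF times R in k ohms gives the RC time constant directly in milliseconds, so CxR = 10 means 10ms. A quick Python check:

```python
import math

c_uF, r_k = 0.01, 1000               # eg 0.01uF into a 1M (1000k) load
tau_ms = c_uF * r_k                  # uF x kohm = time constant in ms; here 10ms
f3 = 1000.0 / (2 * math.pi * tau_ms)
print(round(f3, 1))                  # ~15.9Hz, ie the ~16Hz corner mentioned above
```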
These days, using an online calculator is easy, eg
http://www.sengpielaudio.com/calculator-RCpad.htm, but the old CxR=10 is still a pretty handy 'hack'.
So if a stage is feeding a 1M pot (ie 1000k), a 0.01uF coupling cap may be a good place to start. If, having tried that, the bottom end needs 'tightening up', the cap value might be reduced to 0.001uF; maybe that's too far and 0.0022uF gets settled on, or maybe there's still too much bass and 470pF works out best.
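For what it's worth, here's what those trial values work out to into a 1M load (same formula as above, just looped over the caps mentioned):

```python
import math

R = 1e6  # the 1M pot as the (total effective) load
for c in (0.01e-6, 0.0022e-6, 0.001e-6, 470e-12):
    f3 = 1.0 / (2 * math.pi * R * c)
    print(f"{c * 1e12:.0f}pF -> {f3:.0f}Hz")
# 10000pF -> 16Hz, 2200pF -> 72Hz, 1000pF -> 159Hz, 470pF -> 339Hz
```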
With guitar amps, stages tend to be overdriven, and if the above CxR value is too high there's a big problem: when the stage being fed gets pushed into grid conduction (which typically defines the onset of overdrive), the ac signal gets rectified and the grid's dc reference gets pulled negative, below 0V. This is referred to as 'bias shift'; some degree of it is inevitable and sounds fine, but with too much, the bias creeps closer and closer to cut-off, the tone gets farty, until the whole signal gets momentarily choked out, which is referred to as 'blocking distortion'.
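If it helps to see the mechanism, here's a toy Python sketch of that rectification: a big sine through a coupling cap into a 1M grid leak, with the grid crudely modelled as a fixed low resistance (a guessed 1k) whenever it swings positive. All the values and the on/off grid model are assumptions for illustration only:

```python
import math

C, R_leak, R_cond = 0.022e-6, 1e6, 1e3  # coupling cap, grid leak, assumed grid-conduction resistance
f, amp = 100.0, 30.0                    # 100Hz sine, 30V peak: enough to drive the grid positive
dt, v_cap = 1e-6, 0.0                   # timestep, initial voltage across the cap

for n in range(int(0.1 / dt)):          # simulate 100ms
    v_in = amp * math.sin(2 * math.pi * f * n * dt)
    v_grid = v_in - v_cap               # grid-side voltage
    r = R_cond if v_grid > 0 else R_leak  # grid conducts hard on positive swings
    v_cap += (v_grid / r) * dt / C      # that current charges the cap

print(round(v_cap, 1))  # well above 0V, so the grid's dc level sits correspondingly below 0V
```

The asymmetry is the whole mechanism: the cap charges quickly through the conducting grid on positive peaks but only bleeds off slowly through the 1M leak in between, so the grid's average dc level gets dragged negative.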
Old Fender designs often used a CxR of about 20, and perhaps up to 50; hence eg a Tweed Deluxe can get farty, and typical mods include reducing the coupling cap values.
And over the 60s, as Marshall gradually tweaked the 5F6a-based design they started out with, some key coupling caps dropped in value, eg the CxR of the coupling caps to the power tubes fell from 20 to 5 (JTM45 to JMP 1987 / 1959).
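Translating those CxR figures into corner frequencies (same uF x k = ms shortcut as above):

```python
import math

for cxr in (50, 20, 10, 5):          # C (uF) x R (k), ie the time constant in ms
    f3 = 1000.0 / (2 * math.pi * cxr)
    print(f"CxR={cxr} -> ~{f3:.0f}Hz")
# CxR=20 (JTM45) ~8Hz; CxR=5 (1987/1959) ~32Hz: tighter bottom end, less blocking
```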
Aiken's tech info pages are a really useful reference source, written by a proper EE but reasonably easy reading / not too daunting, eg
http://www.aikenamps.com/index.php/designing-common-cathode-triode-amplifiers