Seems to be overly complicated just to heat the tubes on a breadboard.
I wanted to play with the standoff DC voltage to hear for myself what benefit, if any, there is in going as high as 70 to 80Vdc as opposed to the usual 30 to 40Vdc.
Do the test for experiential learning. I think it won't make a difference in the end.
D.C. standoff is most useful
*when/if* you have tubes with heater-to-cathode leakage. In that case, for half of each AC heater cycle the heater sits negative with respect to the cathode. The heater emits electrons which leak to the cathode during those negative half-cycles, and you get half-wave rectified hum injected at the tube's cathode (and through to the plate). Biasing the heater supply positive keeps the heater at a higher voltage than the cathode at all times, reverse-biasing this "leakage diode" and stopping current flow between those elements.
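As a back-of-the-envelope sketch of the usual elevation scheme: a two-resistor divider off a B+ node sets the DC level, and the heater winding's center tap (real or artificial) is tied to that node, typically with a bypass cap to keep it quiet. All component and supply values below are illustrative assumptions, not from any particular amp:

```python
# Hedged sketch: DC elevation voltage from a simple two-resistor divider
# off B+. All values here are illustrative assumptions.

def divider_elevation(v_bplus: float, r_top: float, r_bottom: float) -> float:
    """DC voltage at the divider tap (the heater center tap connects here,
    usually with a bypass cap across r_bottom to keep the node quiet)."""
    return v_bplus * r_bottom / (r_top + r_bottom)

# Example: a 325V B+ node with 470k over 100k gives roughly 57V of
# elevation, in the 30-80V range being discussed.
elevation = divider_elevation(325.0, 470e3, 100e3)
print(f"{elevation:.1f} V")  # about 57 V
```

This ignores divider loading by heater-to-cathode leakage, which is normally tiny compared to the divider current.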
{As an aside, I think the super-large value cathode bypass cap in the tweed Bassman was a case where heater standoff wasn't used, and hum due to leakage was objectionable with some tubes, so the bypass was made 10x bigger to make the cathode appear more like ground for all audio, and especially that hum.}
30-40Vdc should be enough for most circumstances. Where it might not be is when you have a gain stage whose cathode sits at 45V, 50V, 60V, etc., such as a cathode follower or a phase inverter. Then it makes sense to bias those tubes' heaters to a voltage higher than their cathode voltage. However, these kinds of stages are usually later in the amp, where signal levels are higher and a little leakage might not create enough hum to be an issue. Which also explains why you usually see most of the heavy-duty hum elimination tricks in the first stage or two of the preamp.
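Picking an elevation for a string of stages on a shared heater supply boils down to simple arithmetic: sit the heater above the highest cathode by some margin, while keeping the worst-case heater-to-cathode voltage (measured against the lowest cathode, often a grounded-cathode first stage at a volt or two) inside the tube's rating. A minimal sketch with assumed numbers; the 180V limit is a placeholder for a typical small-signal triode Vhk(max), so check your datasheet:

```python
# Hedged sketch: choose a heater elevation given the cathode voltages of
# the stages sharing the heater supply. All numbers are assumptions.

def pick_elevation(cathode_volts: list[float], margin: float = 10.0,
                   vhk_max: float = 180.0) -> float:
    """Elevation that keeps the heater positive of every cathode by
    `margin`. vhk_max is a placeholder rating; check the datasheet."""
    elevation = max(cathode_volts) + margin
    # Worst case Vhk is against the lowest cathode (often ~1-2V in a
    # cathode-biased first stage).
    worst_vhk = elevation - min(cathode_volts)
    if worst_vhk > vhk_max:
        raise ValueError(f"Vhk {worst_vhk:.0f}V exceeds {vhk_max:.0f}V rating")
    return elevation

# Example: first stage cathode at 1.5V, cathode follower cathode at 60V.
print(pick_elevation([1.5, 60.0]))  # 70.0
```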