I've noticed and tried to diagnose this phenomenon for a long time. It isn't just limited to tube amps ...
I think what you're hearing is really just a product of how waveforms add. ...
Yes to both, but I'd add: "how waveforms add in a nonlinear impedance/amplifier."
I looked up all this stuff again. If you had a perfectly linear (0.0000000% distortion) mixer/amplifier, two input frequencies would simply be summed, and the output would contain the same two frequencies, only louder or softer depending on the gain of the mixer/amplifier.
But if you have any non-linearity (distortion), you get harmonics of the original 2 frequencies plus sum/difference frequencies and their harmonics. How bad the intermodulation (IM) is depends on how much non-linearity there is.
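If you'd rather see it than take my word for it, here's a quick NumPy scratch. Nothing in it is a real amp model: the bent transfer curve (x + 0.3x^2 + 0.2x^3), the 1 kHz / 1.5 kHz pair, and the -60 dB cutoff are all numbers I made up just to show which frequencies pop out.

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs                        # 1 second of audio, 1 Hz FFT bins
f1, f2 = 1000.0, 1500.0                       # the two input notes
x = 0.5*np.sin(2*np.pi*f1*t) + 0.5*np.sin(2*np.pi*f2*t)

def linear_amp(x):
    return 2.0 * x                            # pure gain: no new frequencies

def bent_amp(x):
    # made-up gentle curvature standing in for a tube/transistor transfer curve
    return x + 0.3*x**2 + 0.2*x**3

def loud_components(y, floor_db=-60):
    y = y - np.mean(y)                        # ignore the DC shift from even-order terms
    spec = np.abs(np.fft.rfft(y)) / len(y)
    freqs = np.fft.rfftfreq(len(y), d=1/fs)
    keep = spec > spec.max() * 10**(floor_db/20)
    return sorted(float(f) for f in freqs[keep])

print("linear:", loud_components(linear_amp(x)))
# -> [1000.0, 1500.0]   same two notes, just louder
print("bent:  ", loud_components(bent_amp(x)))
# -> [500, 1000, 1500, 2000, 2500, 3000, 3500, 4000, 4500]:
#    the two notes, their harmonics, the difference (500), the sum (2500),
#    and the third-order mixes (3500, 4000)
```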
There's an odd effect I re-read about, which originally led me to think there was some old method to minimize it: if you apply negative feedback to the non-linear stage, then depending on how the feedback is applied, the total harmonic distortion may be reduced while the intermodulation is left untouched. That leaves the IM more apparent and ugly by comparison.
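If anyone wants to poke at that in software, here's a toy way to measure it: a made-up tanh stage, once open loop and once wrapped in simple resistive feedback, comparing the 3rd harmonic on a single note against the 500 Hz product on a two-tone. The gain of 4, the feedback fraction of 0.2, and the 1.8x drive compensation are all arbitrary, and a real amp's loop is frequency-dependent with non-memoryless stages, so treat whatever numbers it prints as illustration only.

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs

def stage(u):                              # made-up saturating curve
    return np.tanh(u)

def open_loop(x, gain=4.0):
    return stage(gain * x)

def with_feedback(x, gain=4.0, beta=0.2, iters=60):
    # memoryless loop y = stage(gain*(x - beta*y)); gain*beta < 1 here,
    # so plain fixed-point iteration settles
    y = np.zeros_like(x)
    for _ in range(iters):
        y = stage(gain * (x - beta * y))
    return y

def level_db(sig, freq):                   # level of one FFT bin (1 Hz resolution)
    spec = np.abs(np.fft.rfft(sig - sig.mean())) / len(sig)
    return 20*np.log10(spec[int(freq)] + 1e-15)

one = 0.3*np.sin(2*np.pi*1000*t)                               # single note
two = 0.3*np.sin(2*np.pi*1000*t) + 0.3*np.sin(2*np.pi*1500*t)  # two notes

# the 1.8x drive on the feedback version roughly restores the output level the
# loop gave away (small-signal closed-loop gain = 4/(1 + 4*0.2))
for name, amp, drive in [("open loop", open_loop, 1.0),
                         ("feedback ", with_feedback, 1.8)]:
    y1, y2 = amp(drive * one), amp(drive * two)
    print(name,
          "| 3rd harmonic:", round(level_db(y1, 3000) - level_db(y1, 1000), 1), "dB",
          "| 500 Hz IM:",    round(level_db(y2, 500)  - level_db(y2, 1000), 1), "dB")
```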
So you will not get rid of IM unless you get rid of distortion. But it doesn't necessarily increase at the same rate as harmonic distortion. You might reach a balance where it doesn't matter or you don't notice it. If you have to cut the subharmonics, and can't cut gain, you'll have to strip out low end to reduce their effective volume.
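Roughly what "strip out low end" buys you, on the same toy two-tone and bent curve as the first snippet: a high-pass after the dirt knocks the 500 Hz difference product down relative to the 1 kHz note. The 800 Hz corner only makes sense for this example; on a real rig it's just the bass knob or a tight/resonance control.

```python
import numpy as np
from scipy.signal import butter, lfilter

fs = 48000
t = np.arange(fs) / fs
x = 0.5*np.sin(2*np.pi*1000*t) + 0.5*np.sin(2*np.pi*1500*t)
y = x + 0.3*x**2 + 0.2*x**3                      # same made-up bent stage as before

def db_at(sig, freq):                            # level of one FFT bin, in dB
    spec = np.abs(np.fft.rfft(sig - sig.mean())) / len(sig)
    return 20*np.log10(spec[int(freq)])

# high-pass after the distortion; corner picked for this toy 1 kHz / 1.5 kHz pair
b, a = butter(4, 800/(fs/2), btype="highpass")
y_hp = lfilter(b, a, y)

print("500 Hz product vs 1 kHz note, before:", round(db_at(y, 500) - db_at(y, 1000), 1), "dB")
print("500 Hz product vs 1 kHz note, after: ", round(db_at(y_hp, 500) - db_at(y_hp, 1000), 1), "dB")
```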
If you think about it, this is why a single note sounds fuller/thicker with distortion than when played cleanly. There's harmonic distortion of the single note, plus probably a small amount of IM created from the difference frequencies between the note and its natural overtones (which on a real string aren't perfectly harmonic, so those products land slightly off the existing partials and thicken the sound).
This is also why octaves/fifths sound better with insane levels of distortion than 3rds/2nds. If you play a 1k and 2k octave, the main IM subharmonic is the difference, or 1k, which is the same as the lower note.
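You can do the same bookkeeping for any interval. Here's the sum/difference arithmetic for the octave versus a 1 kHz / 1.25 kHz major third (the third is just a comparison pair I picked; nothing here is measured, it's only where the low-order products land):

```python
def im_products(f1, f2):
    # 2nd- and 3rd-order mix products of two tones
    raw = {abs(f2 - f1), f1 + f2,
           abs(2*f1 - f2), 2*f1 + f2,
           abs(2*f2 - f1), 2*f2 + f1}
    return sorted(p for p in raw if p > 0)       # drop anything landing at DC

print("octave (1000, 2000):   ", im_products(1000, 2000))
# -> [1000, 3000, 4000, 5000]  every product sits on the low note or its harmonics

print("major 3rd (1000, 1250):", im_products(1000, 1250))
# -> [250, 750, 1500, 2250, 3250, 3500]  the 250/750 Hz growl belongs to neither note
```

So with the octave, all the extra energy reinforces what you're already playing; with the third, a chunk of it lands well below and between the notes, which is the mud you hear.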