My question is this: Is there any reason to over-bias (bias cold) a tube in a guitar amp simply because the plate voltages run higher than normal?
We forget that output tubes provide voltage amplification at their plates (because we typically focus on plate current and on how that current interacts with the load impedance to produce the plate voltage swing).
If you bias the tubes to idle at less current, you're applying a more-negative grid voltage.
With a more negative grid voltage, the tube can accept a larger input signal before the grid is momentarily driven to 0V on the positive peak of the input signal.
If the tube can accept a larger signal swing before the grid reaches 0V, there is a greater difference in plate current between the zero-signal condition and the maximum-signal condition.
This should yield more output power: bigger voltage swing x bigger current swing = more power, assuming the amp's power supply and load impedance are designed to allow all of this to occur.
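To put rough numbers on that chain of reasoning, here is a minimal back-of-the-envelope sketch in Python. Every value in it (supply voltage, load impedance, stage gain, bias points) is hypothetical, and the sine-wave power estimate is a deliberate simplification rather than a model of any real output stage.

```python
# Rough, simplified estimate of how idle bias affects maximum output power.
# All numbers here are hypothetical and chosen only to illustrate the idea
# that a colder (more negative) bias permits a larger signal swing.

def max_output_power(grid_bias_v, plate_supply_v, plate_load_ohms):
    """Crude estimate: the grid can swing from its bias point up to 0 V,
    so the usable drive signal is limited by the bias voltage. Assume the
    plate swing scales with that grid swing (via an assumed stage gain),
    is capped by the supply, and delivers P = Vrms^2 / R into the load."""
    grid_headroom = abs(grid_bias_v)          # peak drive before the grid hits 0 V
    stage_gain = 8                            # hypothetical grid-to-plate gain
    plate_swing_peak = min(grid_headroom * stage_gain, 0.9 * plate_supply_v)
    v_rms = plate_swing_peak / 2 ** 0.5       # sine-wave RMS from the peak swing
    return v_rms ** 2 / plate_load_ohms

# Compare a hotter and a colder bias point into the same supply and load.
for bias in (-40, -50):
    watts = max_output_power(bias, plate_supply_v=450, plate_load_ohms=4000)
    print(f"bias {bias} V -> roughly {watts:.0f} W (illustrative only)")
```

The absolute numbers are meaningless; the point is simply that the colder bias point tolerates a larger grid swing before clipping at 0V, which shows up as a larger plate swing and more power, up to whatever the supply and load will actually support.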
We forget that an output stage is designed in its entirety to function in a specific way, generally to hit some power goal at a given amount of distortion. Since we usually don't mind gaining or losing a few watts of output and care more about the nature of the output-stage distortion, we've all ignored the fact that the designer specified a particular bias voltage to yield the designed performance. The catch is that they had to assume the tubes all had some nominal set of characteristics (plate curves or dynamic curves) which they could design around.