Hi,
I am trying to understand how DC watts relate to AC watts.... I have heard things like:
"... You must, at least, supply 2.8 times DC watts for X AC watts to obtain proper frequency reproduction without audible distortion on low end..."
Others have suggested at least 4 times DC watts for X AC watts...
So, for a practical example, a 100W tube amp would need:
1) 100W AC = 280W DC
2) 100W AC = 400W DC
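Just to put those two rules of thumb in numbers (the 2.8x and 4x factors are only the hearsay figures quoted above, nothing from a data sheet):

```python
# Rule-of-thumb DC supply sizing as quoted above; the factors are hearsay, not spec values.
def dc_supply_watts(ac_watts, factor):
    """DC input watts supposedly needed for a given AC audio output."""
    return ac_watts * factor

for factor in (2.8, 4.0):
    print(f"100W AC with the {factor}x rule -> {dc_supply_watts(100, factor):.0f}W DC")
```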
I must say it's all non-technical talk, i.e., no references... I googled DC watts to AC watts with no luck either.
As far as tubes go, could DC watts be considered the heat generated at the plate/screen?
I grabbed a typical 6550 operating point from the G.E. data sheet; values are for two tubes:
Plate supply = 450VDC @ 0.150A = 67.5W @ zero signal
Screen supply = 310VDC @ 0.009A = 2.8W @ zero signal
Plate + Screen = 70.3W @ quiescence.
When pushed, data sheet says it can be:
Plate supply = 450VDC @ 0.295A = 132.75W
Screen supply = 310VDC @ 0.038A = 11.78W
Plate + screen = 144.53W
According to the data sheet, it should be capable of providing 77W. It turns out we seem to need 144W of DC input to produce 77W of audio power?
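For what it's worth, here is a quick sanity check of the data-sheet numbers above. It assumes the rated 77W actually leaves as audio at full signal, so only the remainder of the DC input ends up as heat in the tubes (energy conservation, nothing more):

```python
# G.E. 6550 typical-operation figures quoted above (two tubes, push-pull).
plate_q  = 450 * 0.150   # quiescent plate input, W
screen_q = 310 * 0.009   # quiescent screen input, W
plate_m  = 450 * 0.295   # max-signal plate input, W
screen_m = 310 * 0.038   # max-signal screen input, W

audio_out = 77.0                # rated audio output power, W
dc_in     = plate_m + screen_m  # total DC input at full signal
heat      = dc_in - audio_out   # whatever isn't delivered as audio is dissipated

print(f"quiescent input:   {plate_q + screen_q:.1f} W")   # ~70.3 W
print(f"full-signal input: {dc_in:.2f} W")                # ~144.5 W
print(f"efficiency: {audio_out / dc_in:.0%}, heat in tubes: {heat:.1f} W")
```

So by that accounting the 144W isn't all heat at full signal; roughly half of it is the audio itself.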
So, when designing a power supply, must I anticipate abuse and size it to withstand max-signal conditions? How much headroom (extra current capability) should be given for real-world use and abuse? Is bigger necessarily better?
I know it's a helluva question, but I have been scratching my head over such crazy statements...
Also, if you have a reference, I'd be glad if you could share it.
Thanks in advance.
Best Regards
Rzenc