I'm just trying to resolve impedance calculations on a real amp: why does the JCM800 use 3.2K instead of ~7K like the JTM45?
Ohm's Law: Resistance = Volts / Current
Or ----> Impedance = Volts / Current
Or ----> Impedance is a ratio of (AC) Volts to (AC) Current
Forget "Idle Bias" & "Idle Plate Dissipation" as those are mostly irrelevant to the output transformer.
The Output Transformer primary Impedance is about AC Volts and AC Current present when we play the amp.
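(If it helps, here is that ratio idea as a tiny Python sketch; the "measured" values are made up, purely to show that the primary impedance is whatever ratio of AC Volts to AC Current the load reflects back.)

```python
# Impedance is just the ratio of AC Volts to AC Current (Ohm's Law applied to AC).
# These "while playing" measurements are hypothetical, for illustration only.
v_ac = 100.0   # AC volts (RMS) across the full primary
i_ac = 0.020   # AC amps (RMS) through the primary

z_primary = v_ac / i_ac   # reflected primary impedance, in ohms
print(f"Primary impedance = {z_primary:.0f} ohms")   # 5000 ohms with these made-up numbers
```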
What AC Volts do we need across each Primary Impedance to develop 50 Watts?
Volts = √(Power x Impedance)
JTM45: Volts = √(50w x 7kΩ) = 591.6v AC ----> ~418v Peak per side
JCM800: Volts = √(50w x 3.4kΩ) = 412.3v AC ----> ~291.5v Peak per side
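If you want to check that arithmetic yourself, here is a short Python sketch of the same formula. The "per side" figure assumes each tube drives half of the full primary swing (plate to center tap), which is how the ~418v and ~291.5v numbers above work out.

```python
from math import sqrt

def primary_volts(power_w, z_primary_ohms):
    """RMS volts across the full primary needed to develop a given power."""
    v_rms = sqrt(power_w * z_primary_ohms)   # Volts = sqrt(Power x Impedance)
    v_peak = v_rms * sqrt(2)                 # full-primary peak
    v_peak_per_side = v_peak / 2             # each tube drives half the primary (plate to center tap)
    return v_rms, v_peak_per_side

for name, z in [("JTM45", 7000), ("JCM800", 3400)]:
    v_rms, v_pk_side = primary_volts(50, z)
    print(f"{name}: {v_rms:.1f} V RMS across primary ----> ~{v_pk_side:.1f} V peak per side")
# -> JTM45:  591.6 V RMS across primary ----> ~418.3 V peak per side
# -> JCM800: 412.3 V RMS across primary ----> ~291.5 V peak per side
```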
With both "Volts" and "Impedance" lower for the JCM800 output transformer, how did "Current" change?
Ohm's Law
Current = Volts / Impedance
JTM45: Current = 591.6v AC / 7kΩ = 84.5mA ----> 119.5mA Peak
JCM800: Current = 412.3v AC / 3.4kΩ = 121.3mA ----> ~171.5mA Peak
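Same kind of check for the Current side, as a Python sketch:

```python
from math import sqrt

def primary_current(power_w, z_primary_ohms):
    """RMS and peak current through the primary for a given power and impedance."""
    v_rms = sqrt(power_w * z_primary_ohms)   # same Volts as above
    i_rms = v_rms / z_primary_ohms           # Current = Volts / Impedance
    i_peak = i_rms * sqrt(2)
    return i_rms, i_peak

for name, z in [("JTM45", 7000), ("JCM800", 3400)]:
    i_rms, i_peak = primary_current(50, z)
    print(f"{name}: {i_rms*1000:.1f} mA RMS ----> ~{i_peak*1000:.1f} mA peak")
# -> JTM45:  84.5 mA RMS ----> ~119.5 mA peak
# -> JCM800: 121.3 mA RMS ----> ~171.5 mA peak
```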
For the same power output, the JCM800 must pull more Current through the output transformer while the voltage drop developed across the Primary Impedance is smaller. The ratio of Volts to Current changed.
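You can see the changed ratio directly: dividing Volts by Current gets you back the Primary Impedance, while Volts times Current stays at the same 50 Watts for both amps. A quick check with the numbers above:

```python
# Same 50 W, different ratio of Volts to Current.
jtm45  = (591.6, 0.0845)   # (V RMS, A RMS) from the calculations above
jcm800 = (412.3, 0.1213)

for name, (v, i) in [("JTM45", jtm45), ("JCM800", jcm800)]:
    print(f"{name}: V/I = {v/i:.0f} ohms, V*I = {v*i:.1f} W")
# -> JTM45:  V/I ~ 7000 ohms, V*I ~ 50.0 W
# -> JCM800: V/I ~ 3400 ohms, V*I ~ 50.0 W
```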
Where this matters is output tube screen voltage, and how hot the tube runs when driven.
The screen voltage needs to be high enough to support the peak plate current pulled by the tube.
The higher peak plate current means the tube runs hotter when driven with audio.
Because the tube runs hotter (with high peak plate current) when driven, its Idle Bias needs to be set cooler, so the tube spends a little longer in cut-off and the plate gets a chance to cool. Simple Bias Calculators that work only from idle plate dissipation usually don't account for this.
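For reference, this is roughly what those simple calculators do; the 70% target, the 25 W max dissipation, and the 460v plate figure below are illustrative assumptions only, not recommendations:

```python
# A minimal sketch of a "simple bias calculator" of the kind mentioned above:
# it only looks at idle plate dissipation and knows nothing about the peak
# plate current the output transformer Primary Impedance will demand when driven.
# All numbers here are illustrative assumptions, not recommendations.
def idle_bias_current_ma(plate_volts, max_dissipation_w=25.0, target_fraction=0.70):
    """Idle current (mA per tube) that lands at target_fraction of max plate dissipation."""
    return target_fraction * max_dissipation_w / plate_volts * 1000

print(f"{idle_bias_current_ma(460):.1f} mA per tube at idle")   # ~38 mA with these assumed numbers
```

The point is only that a calculator like this says nothing about how much harder the tube works into a 3.4kΩ primary than into a 7kΩ one once the amp is actually being driven.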
Using lower or higher output transformer Primary Impedance can also change how the output tube distorts.