Having read a number of posts from various audio geeks about the supposed "special something" that transformers can lend to an audio signal, distorted or no, a thought occurred to me:
I haven't played around with driving, say, a microphone input transformer to the point of saturation and distortion in any meaningful capacity.
I've (understandably?) yet to see an amplifier that uses an interstage coupling transformer between resistance-coupled preamp stages, where the primary winding is fed through a coupling capacitor so it doesn't take the brunt of the DC plate voltage.
Granted, my knowledge on the finer points of transformers is still in its infancy, but I've got a few lying around that hold no great value to me, so I want to run some tests on a 1:1 or 1:2 mic input transformer and see how it reacts to some healthy signal levels.
I found a mention of this setup (attached below) in an article by W. G. Morley from circa 1967, though his example shows a fixed-bias triode on the secondary side:
http://www.r-type.org/articles/art-129c.htm

So my question is:
Are there any pitfalls to using a coupling transformer between two cathode-biased triode stages, with the primary safely behind a suitable coupling capacitor?
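One pitfall worth checking before flipping the switch: with a capacitor feeding the primary (essentially a parafeed-style connection), the coupling cap in series with the primary inductance forms a series-resonant circuit, which can produce a low-frequency response peak or ringing. A quick sketch of the math, using made-up example values (5 H primary, 0.1 uF cap) rather than anything measured:

```python
import math

# Parafeed-style coupling: coupling cap C in series with primary
# inductance L forms a series-resonant circuit at f = 1/(2*pi*sqrt(L*C)).
# Both values below are illustrative assumptions, not measurements.

L_primary = 5.0      # H, assumed primary inductance of a small mic transformer
C_coupling = 0.1e-6  # F, assumed 0.1 uF coupling capacitor

f_res = 1.0 / (2.0 * math.pi * math.sqrt(L_primary * C_coupling))
print(f"Series-resonant frequency: {f_res:.1f} Hz")  # ~225 Hz with these values
```

If that resonance lands in the audio band (as it does here), it may be worth sizing the cap larger, or damping the resonance with a series resistor, before judging how the transformer itself "sounds."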
Audio fidelity is absolutely not the goal here, so no worries there. I'd just like to avoid a premature meltdown when flipping the switch for the first time.