The best approach is to ask how much signal voltage/current the preamp must deliver at its output. Then design the preamp back-to-front to hit that output at or under whatever amount of distortion you think is acceptable.
Make a survey of available power transformers, noting volts RMS and current capacity. Decide on a rough power-supply approach, as it will help you figure out how much B+ voltage you're likely to get from each available PT.
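To get a feel for the numbers, here's a minimal sketch of that B+ estimate, assuming a full-wave, capacitor-input supply: peak is roughly secondary RMS × √2, minus rectifier drop, minus some sag under load. The 20 V drop and 10% sag figures are hypothetical placeholders, not design rules.

```python
import math

def estimate_b_plus(v_secondary_rms, rectifier_drop=20.0, sag_fraction=0.10):
    """Rough B+ estimate for a full-wave, capacitor-input supply.

    v_secondary_rms: PT secondary voltage (RMS; one half if center-tapped)
    rectifier_drop:  assumed drop across the rectifier (placeholder value)
    sag_fraction:    assumed loss to ripple/sag under load (placeholder 10%)
    """
    v_peak = v_secondary_rms * math.sqrt(2)
    return (v_peak - rectifier_drop) * (1.0 - sag_fraction)

# e.g. a 250 V RMS secondary:
print(round(estimate_b_plus(250.0), 1))  # → 300.2
```

Run it over your list of candidate PTs and you'll quickly see which ones can plausibly deliver the supply voltage your tube choices will want.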
Once you do that, the tubes you choose will tend to imply how much supply voltage is required. Design each stage, starting with the output stage: figure out the output swing possible while staying at or under your target distortion level, and you'll find out whether the anticipated supply voltage is high enough for that stage. Doing this also tells you how much drive signal the final stage needs to hit its output-swing target.
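The arithmetic for one stage is simple enough to sketch. Assuming you already know (or have estimated from load lines) the stage's gain and target swing, the required drive is just swing divided by gain; the headroom check below uses a hypothetical 50 V bias/margin allowance, not a design rule.

```python
def required_drive(v_out_peak, stage_gain):
    """Peak input signal needed for a stage to hit its output swing target."""
    return v_out_peak / stage_gain

def headroom_ok(v_supply, v_out_peak, margin=50.0):
    """Crude check: supply should cover the peak-to-peak swing plus a
    bias/margin allowance (margin here is a placeholder, not a rule)."""
    return v_supply >= 2 * v_out_peak + margin

# Hypothetical numbers: a stage with gain ~60 must swing 25 V peak
# from a 300 V supply:
print(round(required_drive(25.0, 60.0), 3))  # → 0.417
print(headroom_ok(300.0, 25.0))              # → True
```

If `headroom_ok` fails, either the supply voltage goes up or the swing target comes down; either way you find out now, on paper, rather than after the chassis is drilled.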
Rinse & repeat for each earlier stage. The next earlier stage has to be able to swing a signal at least as big as your calculated drive for the final stage, plus any extra signal to make up for losses between stages. This ordering also matters because you can't accurately predict how a stage will perform until you know everything about its load, which amounts to any circuitry between this stage and the final stage, plus what the final stage's input circuit looks like.
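The rinse-and-repeat loop can be sketched the same way. This is a simplification, assuming you've already settled each stage's gain under its real load; the flat 10% interstage loss is a hypothetical stand-in for whatever the actual coupling networks cost you.

```python
def back_to_front(final_swing, stage_gains, interstage_loss=0.9):
    """Work back-to-front through the chain.

    final_swing:  required peak output swing of the last stage
    stage_gains:  gains listed first stage to last stage
    Returns (required output swing of each stage, first to last;
             required input signal at the first stage).
    """
    swings = [final_swing]
    swing = final_swing
    for gain in reversed(stage_gains[1:]):
        # Drive the next stage needs, inflated for interstage loss:
        swing = swing / gain / interstage_loss
        swings.append(swing)
    swings.reverse()
    return swings, swings[0] / stage_gains[0]

# Hypothetical two-stage chain: gains 50 and 30, 25 V peak out the end:
swings, v_in = back_to_front(25.0, [50.0, 30.0])
print([round(s, 3) for s in swings])  # → [0.926, 25.0]
print(round(v_in, 4))                 # → 0.0185
```

That last number is your input sensitivity, which closes the loop on the question the whole process started with: whether the signal source you actually have can drive the chain you just designed.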
You can build a preamp (or any amp) without designing this way, by assuming higher voltage equals bigger, cleaner output. However, you may wind up paying for bigger/heavier/costlier transformers, higher-voltage caps, etc. that you didn't need to get the job done.