What pdf64 said +
Assuming the supply was 300V (and let's assume it remains constant for this example), the voltage divider by itself would draw about 20mA and produce about a 40V drop across the 2K series resistor to the screen node. Let's assume the front end and screens consume about 10mA at idle; that extra 10mA through the 2K resistor produces an additional 20V drop, so the screen node voltage at idle would be about 240V (let's assume this was the design target). Now let's assume that at full output the front end and screens draw about 25mA. With 45mA total through the 2K resistor, the drop is 90V, so the screen voltage at full output would be about 210V. Keep in mind that the difference in voltage drop between idle and full output is generated across the 2K resistor.
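If it helps, here's a quick Python sketch of that arithmetic. The 2K series value follows from 40V at 20mA; like the figures above, it treats the bleed current as a constant 20mA (strictly, the bleed through the shunt falls a little as the node sags, so the exact full-output figure would be a few volts higher):

```python
# Divider-fed screen node: 300V supply, 2K series resistor, with the
# bleed current through the shunt simplified to a constant 20mA.
SUPPLY = 300.0     # V, assumed constant for this example
R_SERIES = 2000.0  # ohms, series resistor of the divider
I_BLEED = 0.020    # A, divider bleed current (treated as constant)

def screen_node_divider(i_load):
    """Screen node voltage for a given front end + screen current (A)."""
    return SUPPLY - (I_BLEED + i_load) * R_SERIES

print(screen_node_divider(0.010))  # idle, 10mA load   -> 240.0 V
print(screen_node_divider(0.025))  # full output, 25mA -> 210.0 V
```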
Now let's assume the designer wanted the same 240V screen node voltage at idle, but used a typical dropping resistor instead. Without the divider's 20mA bleed, the only current through the resistor is the front end's 10mA, so the resistor would need to be 6K to drop the 60V. At full output, the drop at 25mA would be 150V, producing a screen node voltage of 150V, about 60V lower than with the divider.
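And the same check for the plain dropping resistor, sized here for the same 240V idle point:

```python
# Plain dropping resistor, sized to give 240V at the 10mA idle current:
# (300V - 240V) / 10mA = 6K.
R_DROP = 6000.0  # ohms

def screen_node_plain(i_load):
    """Screen node voltage with no shunt: all current is load current."""
    return 300.0 - i_load * R_DROP

print(screen_node_plain(0.010))  # idle, 10mA        -> 240.0 V
print(screen_node_plain(0.025))  # full output, 25mA -> 150.0 V
```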
The voltage divider produces less of a voltage swing between idle and full output because it lets the designer hit the same idle voltage with a smaller series resistor (2K instead of 6K), so the same 15mA change in load current costs only 30V instead of 90V.
Of course there is no way of knowing what the designer’s intent was, but this example may suggest one possible explanation.