It seems to me that Wombat is drawing a correlation between the maximum output level of the preamp and the maximum output power of the amplifier. According to his statement, the amp only needs 1 volt from the preamp to reach maximum output, about 210 watts. Any further increase in the voltage from the preamp to the amplifier drives the amp into clipping. The preamp can supply a maximum of 2 volts to the amp.
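Just to put rough numbers on that, here is a quick back-of-the-envelope sketch. The 8-ohm load and the exact gain figure are my assumptions, not anything Wombat stated; the point is only that once the input exceeds the sensitivity, more preamp voltage buys nothing but clipping.

```python
import math

# Assumed numbers for illustration: 210 W rated into an 8-ohm load,
# full output reached with 1 Vrms at the amplifier input.
RATED_POWER_W = 210.0
LOAD_OHMS = 8.0
INPUT_SENSITIVITY_V = 1.0

# Output voltage at rated power: P = V^2 / R  ->  V = sqrt(P * R)
v_out_max = math.sqrt(RATED_POWER_W * LOAD_OHMS)   # ~41 Vrms
voltage_gain = v_out_max / INPUT_SENSITIVITY_V     # ~41x
gain_db = 20 * math.log10(voltage_gain)            # ~32 dB

def output_power(v_in_rms):
    """Crude model: output voltage tracks the input until it hits the rail,
    then stays pinned there (real clipping flattens the waveform, but the
    idea is the same)."""
    v_out = min(v_in_rms * voltage_gain, v_out_max)
    return v_out ** 2 / LOAD_OHMS

print(f"gain ~{voltage_gain:.0f}x ({gain_db:.1f} dB)")
for v_in in (0.5, 1.0, 2.0):
    note = " (clipping)" if v_in > INPUT_SENSITIVITY_V else ""
    print(f"{v_in:.1f} Vrms in -> ~{output_power(v_in):.0f} W{note}")
```

With those assumed numbers, 0.5 V in gives about 52 W, 1 V gives the full 210 W, and 2 V just drives the amp harder into clipping at the same 210 W.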
Regardless of what some people seem to think, the AC line voltage directly determines the maximum output power of the amplifier. All of the upgrades and modifications don't change that, and that includes adding a farad of electrolytic capacitors. Under continuous sine-wave operation, the transformer is the limiting factor. A boatload of capacitors will maintain maximum output power for hundreds of milliseconds, but the maximum output power eventually decreases slightly because the transformer secondary voltage drops under heavy load. The term used to describe this is voltage regulation, and it is expressed as the percent change from no load to full load. I have yet to see anybody specify the AC line voltage when measuring output power, but it makes a slight difference.
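Here is what that regulation figure looks like in practice. The secondary voltages below are made up purely for illustration, not measurements of any particular amp.

```python
# Hypothetical transformer secondary voltages, no load vs. full load.
V_SECONDARY_NO_LOAD = 60.0    # Vrms with no load
V_SECONDARY_FULL_LOAD = 56.0  # Vrms sagging under continuous full load

# Regulation, expressed as percent change from no load to full load.
regulation_pct = (V_SECONDARY_NO_LOAD - V_SECONDARY_FULL_LOAD) / V_SECONDARY_FULL_LOAD * 100

# Maximum output power tracks the square of the available rail voltage,
# so a sagging secondary trims the continuous power by a few percent
# compared with a short burst that the capacitor bank can carry.
power_ratio = (V_SECONDARY_FULL_LOAD / V_SECONDARY_NO_LOAD) ** 2

print(f"regulation: {regulation_pct:.1f} %")
print(f"continuous power is ~{power_ratio * 100:.0f}% of the short-burst figure")
```

With those made-up numbers the regulation works out to about 7 percent, and continuous power lands around 87 percent of what the amp can do on a brief burst while the capacitors are still holding the rails up.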
For example, if the AC line voltage is only 112 volts, it is likely the continuous output power of the amp is less than 200 watts, and there is nothing wrong with that. If the home is next to a substation (like mine) and the voltage at the wall outlet is 125 volts, the amp would probably produce upwards of 220 watts. At the end of the day it matters very little whether the output power is 190 watts or 220 watts. Both are plenty loud.
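The arithmetic behind those figures: with an unregulated linear supply, the rails (and therefore the clipping point) roughly track the line voltage, so output power scales with the square of the line voltage. Taking 210 W at a nominal 120 V line as an assumed reference:

```python
import math

NOMINAL_LINE_V = 120.0
POWER_AT_NOMINAL_W = 210.0   # assumed reference point

def power_at_line(line_v):
    # Max output voltage scales with the line, power with its square.
    return POWER_AT_NOMINAL_W * (line_v / NOMINAL_LINE_V) ** 2

for line_v in (112.0, 120.0, 125.0):
    print(f"{line_v:.0f} V line -> ~{power_at_line(line_v):.0f} W")

# And the audible difference between the two extremes:
print(f"190 W vs 220 W = {10 * math.log10(220 / 190):.2f} dB")
```

That gives roughly 183 W at 112 V and 228 W at 125 V, and the gap between 190 W and 220 W is only about 0.6 dB, which is why it hardly matters.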