Here's one that always amuses me.
Higher pre-out voltages DO NOT make your system louder. They DO NOT make your amplifier work less. The benefit of a higher pre-out voltage is that any noise or distortion picked up along your RCAs will be a smaller percentage of your original signal, and that if you are running a large number of amps (i.e. more than one off each pre-out) you will have a higher voltage signal at each amp.
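To put a rough number on that noise benefit, here's a quick sketch. The 10mV induced-noise figure is just an illustrative assumption, not a measured value from any real install:

```python
# Why a hotter pre-out helps noise rejection: noise induced on the RCA
# run is roughly a fixed voltage, so it is a smaller fraction of a
# larger signal. The 10 mV noise figure is an assumed example value.
NOISE_V = 0.010  # assumed induced noise on the RCA cable, in volts

for preout_v in (2.0, 5.0):
    pct = 100 * NOISE_V / preout_v
    print(f"{preout_v} V pre-out: noise is {pct:.2f}% of the signal")
```

Same cable, same noise, but the 5V deck buries it more than twice as deep relative to the music.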
Here's why: your amplifier can only output up to a certain voltage before it clips. That's a more or less "set in stone" property of the amplifier itself. For the sake of argument, we'll say it's 20V. The gain on your amplifier is used to match the amplifier's output voltage to its maximum by compensating for different input voltages. So if you have an input voltage of 2V, you would set the gain to 10; if you had a 5V pre-out, you would set it to 4. If you had a 2V pre-out, then upgraded to a 5V pre-out without adjusting your gains, you would be trying to make your amplifier put out up to 50V, which is MEGA clipping.
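The gain-matching arithmetic above can be sketched in a few lines. The 20V ceiling and the 2V/5V pre-outs are the example figures from this post, not specs of any real amp:

```python
# Compute the gain needed to match a head unit's pre-out voltage to an
# amplifier's maximum unclipped output voltage. All voltages here are
# the illustrative figures used in the post.

def required_gain(amp_max_volts: float, preout_volts: float) -> float:
    """Gain that drives the amp to its ceiling at full pre-out swing."""
    return amp_max_volts / preout_volts

AMP_MAX = 20.0  # amp clips above this output voltage (example figure)

for preout in (2.0, 5.0):
    print(f"{preout} V pre-out -> set gain to {required_gain(AMP_MAX, preout):g}x")

# Swapping a 2 V deck for a 5 V deck WITHOUT re-adjusting the old 10x
# gain asks the amp for 5 * 10 = 50 V, far past the 20 V ceiling:
print(f"unadjusted: 5 V * 10x = {5.0 * 10:g} V (heavy clipping)")
```

Same amp, same ceiling; the gain setting just changes how much input voltage it takes to get there.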
It will not make your amp "work less" or run cooler. Your amp is only capable of a certain voltage. If you want to make your amp work less, turn down your volume. A higher pre-out voltage with unadjusted gains will actually make your amp work harder and run hotter, as it will reach its maximum output voltage faster.
Also, your gain knob is NOT a volume knob!