My 2 cents regarding your first question... I have a couple of questions regarding this subject:
1: How important is the line-out impedance on the head unit with regard to noise? I never quite understood that specification. I am using an Eclipse CD8051 whose line-out impedance is lower than that of any other head unit I've ever seen. But, like I said, I just don't understand the importance of the line-out impedance specification.
When it comes to line-level signals, lower output impedance is better, but in most cases doesn't make a big difference.
With line-level connections, we are trying to keep the RMS signal voltage out of the HU well above any noise that might get picked up along the way to the next device in the signal chain.
The logic behind wanting a lower output impedance is simple. Think voltage-divider.
Consider a simple single-ended connection scenario:
HU is modeled as a perfect voltage source followed by a series resistor equal to the output impedance.
Downstream device input(s) are modeled as a shunt resistor to ground equal to the device input impedance.
The higher the output impedance, the lower the signal voltage on the RCA cable due to the voltage divider formed by the source's output impedance and the end-device's input impedance.
For example: If the source output impedance was equal to the load input impedance then the RCA signal voltage would be cut in half.
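That divider math is easy to sketch in a few lines of Python (the 4 V and 10 kΩ figures below are just illustrative numbers, not specs from any particular head unit):

```python
def divider_out(v_src: float, z_out: float, z_in: float) -> float:
    """Signal voltage at the load for a simple single-ended connection:
    source modeled as an ideal voltage source in series with z_out,
    load modeled as a shunt resistance z_in to ground."""
    return v_src * z_in / (z_out + z_in)

# Matched impedances: signal is cut exactly in half
print(divider_out(4.0, 10_000, 10_000))   # 2.0 V on the RCA cable

# Low output impedance (100 ohms into 10k): almost no loss
print(divider_out(4.0, 100, 10_000))      # ~3.96 V
```

With a typically low source impedance the loss is around 1%, which is why the spec rarely matters in a simple one-source, one-amp system.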
In practical application you don't need to worry too much about line-level output/input impedance unless you are driving a bunch of downstream devices via RCA splitters. In that case, the downstream devices' input impedances combine in parallel, lowering the effective load impedance. If the source device also has a non-ideal (high) output impedance, you may attenuate the RCA signal voltage down too close to the noise floor.
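The splitter scenario can be sketched the same way. Here I assume, purely for illustration, a head unit with a fairly high 1 kΩ output impedance feeding four amplifier inputs of 10 kΩ each:

```python
def parallel(*impedances: float) -> float:
    """Equivalent impedance of loads wired in parallel (e.g. via RCA splitters)."""
    return 1.0 / sum(1.0 / z for z in impedances)

# Four hypothetical 10k-ohm amp inputs fed through splitters
z_load = parallel(10_000, 10_000, 10_000, 10_000)   # 2500 ohms effective

# Divider formed with an assumed 1k-ohm source output impedance
v_rca = 4.0 * z_load / (1_000 + z_load)
print(round(v_rca, 2))   # ~2.86 V instead of 4 V
```

With one 10 kΩ load the same source would lose under 10%; with four in parallel the loss grows to nearly 30%, which is exactly the eating-into-your-noise-margin effect described above.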