DiyMobileAudio.com Car Stereo Forum banner

21 - 40 of 53 Posts

·
Registered
Joined
·
464 Posts
A gain control is used to match the input curve to the output curve for maximum unclipped output with reasonably low noise. Turning the amp gain (or other processor gains) higher will actually result in reduced output.
 

·
Registered
Joined
·
54 Posts
Your first statement is correct.
However, after the gains are set properly, increasing the gain further does not reduce output as you stated. Instead, the output gradually reaches a plateau and stops increasing.
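That plateau can be sketched numerically. A minimal example with hypothetical numbers: a 1 V-peak sine input into an amp whose output hard-clips at 30 V rails. Once the amplified signal hits the rails, extra gain barely raises the RMS output; it approaches the RMS of a 30 V square wave and flattens out.

```python
import math

RAIL = 30.0  # hypothetical output rail (volts): the amp cannot swing past this

def rms_out(gain, vin_peak=1.0, n=1000):
    """RMS of a sine input amplified by `gain` and hard-clipped at the rails."""
    total = 0.0
    for i in range(n):
        v = gain * vin_peak * math.sin(2 * math.pi * i / n)
        v = max(-RAIL, min(RAIL, v))  # clipping at the supply rails
        total += v * v
    return math.sqrt(total / n)

# Below clipping, output tracks gain; past it, output creeps toward 30 V RMS.
for g in (20, 30, 60, 120, 1000):
    print(g, round(rms_out(g), 2))
```

Doubling the gain from 60 to 120 buys less than 2 V of extra RMS output here; it mostly just squares off the waveform.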
 

·
Registered
Joined
·
182 Posts
I swapped out my 1200-watt MTX Elite amp for a 300-watt MTX XD Thunder. The old amp worked fine at a gain of 0, but with the Thunder I noticed I had to turn the gain up at least 3/4 of the way just to hear it (there is no sound below half gain). I cranked it almost all the way just to match my fronts, which run off the head unit's 14 watts per channel. The HU has a sub-out RCA rated at 4 V. There is no high-level input switch on the XD amp, so it's not as if it's set for a high-level input.

Why is that?
 

·
Registered
Joined
·
17 Posts
Running multiple amps drops RCA amperage more than RCA voltage correct?
1) It's "current", not "amperage".

2) Sort of, yes, but that's irrelevant until taken to extremes. These aren't speaker loads we're talking about: it's a milliamp or two.

If you split an output between multiple inputs, all of the inputs are in parallel and all see the same voltage. The problem arises when you have too many inputs being fed and their combined impedance drops below what the output can source. Effectively, you're partly short-circuiting the output, and it's likely to distort or show varying frequency response.

This is true whatever the preamp output voltage. A more robust output need not have a higher voltage, just be able to drive a lower impedance load.

Think of it like the difference between a 100W 1-ohm capable power amp (a low output voltage but capable of lots of current) and a 200W 4-ohm capable amp (high output voltage, but can't put out as much current). If you have a 1-ohm sub, it'll only work with the 100W amp.
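To put rough numbers on the loading effect described above, here is a minimal sketch with hypothetical values: a 2 V preamp output with 200-ohm output impedance feeding N amplifier inputs of 10k ohms each, wired in parallel. The source and load form a simple voltage divider, so the delivered voltage sags as inputs are added.

```python
# Hypothetical values for illustration only.
V_SOURCE = 2.0     # open-circuit preamp voltage (volts)
Z_OUT = 200.0      # preamp output impedance (ohms)
Z_IN = 10_000.0    # one amplifier input impedance (ohms)

def delivered_voltage(n_inputs):
    """Voltage actually seen by the paralleled inputs (voltage divider)."""
    z_load = Z_IN / n_inputs          # N equal impedances in parallel
    return V_SOURCE * z_load / (z_load + Z_OUT)

# With a few inputs the sag is negligible; by 50 inputs the combined load
# equals the source impedance and half the voltage is lost.
for n in (1, 5, 20, 50):
    print(n, round(delivered_voltage(n), 3))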
 

·
Registered
Joined
·
1,436 Posts
Wow. This thread is supposed to dispel myths?

There is a lot of misinformation here!

Ideal source impedance is zero ohms (someone mentioned this). In practice, the best source units are around 50 ohms, with most in the 200 ohm to 1k range.

A typical amplifier input impedance (Z) is 10k-50k ohms per channel. Do the math: with a 200-ohm source impedance and 10k input impedance, you can drive on the order of 10k/200 = 50 channels before the loading seriously drags the signal down. But with a 1k output Z and 10k input Z, that ratio is only 10, so signal loss shows up far sooner. For the SPL crowd with 30 amplifiers... this matters! They need line drivers. A line driver isn't supposed to make everything louder; it is simply a current amplifier for line-level voltage. It keeps the output voltage up so that it can drive a large number of amplifier inputs!
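That ratio argument can be made concrete. A small sketch (hypothetical numbers, and an arbitrary 1 dB loss threshold) counting how many paralleled 10k-ohm inputs a given source impedance can feed before the divider loss passes the threshold:

```python
import math

def max_channels(z_source, z_in, max_loss_db=1.0):
    """Largest N such that N paralleled z_in inputs cost at most max_loss_db."""
    n = 1
    while True:
        z_load = z_in / (n + 1)  # would adding one more input exceed the budget?
        loss_db = -20 * math.log10(z_load / (z_load + z_source))
        if loss_db > max_loss_db:
            return n
        n += 1

print(max_channels(200, 10_000))    # low-impedance source: several channels
print(max_channels(1_000, 10_000))  # higher-impedance source: far fewer
```

The exact counts depend on the loss budget you pick, but the scaling is the point: channel capacity tracks the input-to-source impedance ratio.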

So where do HV output headunits actually make a difference?

It depends on how crappy your amp is.

Think about it this way: You are simply transferring the gain increase to another place. If your headunit is quieter than your amplifier at full gain, then get that HV output and turn the amplifier gain down. Vice versa? Save your pennies and keep that 1V output HU.
 

·
Registered
Joined
·
393 Posts
Another thing I would like to point out: most "4 volt" output head units I have dealt with are no different from their regular "2 volt" cousins. It's mostly marketing BS at this point. Also, the main benefit I found with the AudioControl line driver I have was the ground isolation (adjustable jumpers inside). It would quiet things down quite a bit when installed on old Fosgate Power amps back in the day. I still have that thing in the closet; I may throw it in my truck just for old time's sake ;)
 

·
Premium Member
Joined
·
6,214 Posts
Here's what I've read from Ray @ LP:
I am not sure you will be able to tell from the manufacturer's specs, but by tearing them down or looking at the schematics as I do every day, more and more radios are doing this to save money instead of building a high-end preamp section with its own power supply to get the high output voltage. It's much easier and cheaper for them to steal it from the high-level output signal of the audio IC.

The problem here is that any time you increase the signal (raise the voltage or power) and then reduce it again, noise enters the equation. Also, audio ICs are well known not to have the high fidelity that you want for SQ, so it's not as clean a sound as a well-done preamp voltage section gives; that's problem number one. Problem two is that you have amplified the signal to 20 volts or higher through a less-than-ideal set of electronics, and then you are dividing the voltage back down through a cheap set of resistors to get it to a usable line voltage of, say, 3 to 8 volts. This adds noise and takes away dynamics. It's just a nasty way to get what appears to be a high-end signal very cheaply for the manufacturer.

Almost all radio manufacturers are doing this on low- to mid-level units to save a buck and to give the lower-end audio customer what he thinks is a high-end feature. Your better, high-end radio units will still have true preamp sections producing a very clean, high-voltage signal.
Kelvin
 

·
Registered
Joined
·
75 Posts
Well, most amps are rated to reach full power with as little as 200 mV. As for the comment that "this is not like wiring speakers": actually, it is. The same thing happens as with your speaker outputs, with the same limitations. Your source will put out X amount of voltage, say 2 V. If you hook up 10 different inputs, they each still get 2 V; the current simply increases via the reduced resistance. The only thing that would drop that 2 V is if the resistance gets low enough that the source is tapped out and begins to "sag," exactly like the output of an amp. And "current" vs. "amperage": amperage is a measuring unit of current, so either term works fine, IMHO.

As for Class D vs. Class A/B, I would like to point out that the efficiency, etc., of the two is nearly the same with regard to the power supply. You can have a Class A/B-style power supply and run it in a Class D amp, since what makes it Class D is how the output signal is converted into a high-frequency square wave that averages out to the lower-frequency signal. They typically use more logic-regulated power supplies, since that tech is also more modern, from about the same period as Class D output tech, though it was actually in use before Class D.

Now, although a higher-voltage signal does not make your amp run differently, it is usually the user who makes the amp run differently. If the user is given a clean high-voltage signal to work with and turns it down, rather than pushing a low-voltage signal into clipping, then the end result is cleaner, cooler, less-clipped amp operation, though it is end-user action causing the effect, not the equipment. As stated, the higher the signal voltage, the less the low-level noise will be played. That is the point of high voltage.

There is also one other byproduct: I have seen/installed/worked on a handful of amps that needed about 3-5 V of input on the most sensitive gain setting to actually reach full output; without that, the closest you can get is feeding them a clipped signal from the head unit. Years back, I modified the input and bias sections of low-power non-HCCA amps to crank out their real potential, before all this new tech made that 100x more complicated. As an example, I pulled 600 watts out of an Alphasonic 2035 ([email protected]) until it shredded the cones on a pair of old-school 4-ohm Punch 12s. More recently, I ran into a Cadence amp that would not reach full power with 2 V and bass boost on. It made for bad sound, so the system got turned down to a reasonable sound quality, gain still maxed. One day I'm going to get that truck back and put in a line driver to compensate, but I digress... lol
 

·
Registered
Joined
·
75 Posts
Is that why they sound like $hit?
They don't all sound like shiz. Which ones have you listened to? I can tell you some of the first Class D amps sounded awesome. There are some that had power supplies that were not quite as responsive to the output, giving a slightly muddy sound with the fluctuations of music, and then there is bad signal processing, but that's the manufacturer's fault, not Class D's.
 

·
Registered
Joined
·
8,524 Posts
[url=http://nwavguy.blogspot.com/2011/02/testing-methods.html]NwAvGuy: Testing Methods[/URL]

"HIGH-END BENCH DMM (revised 4/15): A surprising number of people are trying to make audio measurements with typical portable DMM’s. And the readings are often grossly wrong without even realizing it. True RMS measurements are not trivial. In effect, the meter has to accurately measure the “area under the curve” and time average it—see True RMS Measurements for more information. This proves to be rather difficult across a wide range of frequencies if you want to maintain reasonable accuracy at high frequencies and not have the reading “hunt” up and down at low frequencies. The fact is, most DMM’s priced under a few hundred dollars that claim “True RMS” are really only accurate around 60hz—i.e. power line frequencies. Some will measure sine waves accurately across the audio band, but many will not even do that. I have a $150 “True RMS” Extech meter--a relatively well regarded brand--that’s off by nearly 6 dB at 20 Khz compared to 60 hz on a sine wave and is a joke above 1 Khz on non-sinusoidal waveforms. And really complex rapidly changing waveforms like white/pink noise or real music drive such meters crazy. To do it right, you need expensive true RMS circuitry and the ability to optimize the sample rate and averaging for the waveform being measured. Good high end bench DMM’s, like the Agilent 344xx series, let you set these parameters. They also read directly in dB. I use a 6 1/2 digit Agilent true RMS bench DMM that's extremely accurate and flat from 10 hz - 100 Khz for exact levels and other measurements. It has resolution down to 0.1 microvolts so it can even be used to measure noise."
 

·
Registered
Joined
·
75 Posts
^^^ And this is why you go for the $800+ Fluke. There is also the option on many better meters of calibration to keep them accurate, especially one BNIB.
 

·
Registered
Joined
·
231 Posts
I am using a modern Nakamichi headunit, an MB-X, with 5 Volt Pre-Outs.
I intend on using a mid 1980s era Nakamichi amplifier for the active part of the front stage...2 x midrange drivers and 2 x tweeters.

A highly respected Nakamichi aficionado and friend told me that the mid 1980s Nak amplifiers were designed for their TD cassette head units of the 1980s and that I may have an issue with input sensitivity.

Thoughts?

If this is the case, should I have signal reach my other amplifier first, and then the vintage 1980s Nak amplifier, so that it sees a lower voltage being "last in line"?

Thank you.
 

·
Registered
Joined
·
256 Posts
The only thing I want to add to this is that if you're using a line driver to boost the voltage, it HAS to be as close to the signal source as possible to keep the signal-to-noise ratio up. Too often I see them in the back, 6" from the amp, and then all you are doing is boosting the noise that was already picked up.
 

·
Registered
Joined
·
1,200 Posts
HU~~~~CABLE~~~~AMP
HU~LINE DRIVER~~~~CABLE~~~~AMP

In Summary:
Most amps (all?) can output full power when fed with "normal" or "high voltage" line level signals.

High voltage line outputs do NOT increase volume/power if gains are tuned properly

High Voltage outputs *can* help reduce the effects of *noise induced in the cable* between the HU and the AMP by keeping the signal level above the *induced* noise level and requiring less gain at the amp input. But shielded cable will help here as well.

True balanced differential pair connections help reduce common-mode noise induced into the cable between the HU and the AMP. Both the HU and the AMP must have balanced differential outputs and inputs. The principle is simple: identical noise is induced onto the + and - wires in the cable between HU and AMP. The amp's input circuit sums the + signal with the inverted - signal, effectively doubling the desired signal and cancelling out the common-mode induced noise.
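That summing can be sketched numerically. Hypothetical signal and induced hum below; the receiver here divides by two rather than keeping the doubled amplitude, which changes nothing about the cancellation.

```python
import math

n = 1000
signal = [math.sin(2 * math.pi * i / n) for i in range(n)]
hum = [0.3 * math.sin(2 * math.pi * 7 * i / n) for i in range(n)]  # induced noise

# The cable carries the signal on + and its inverse on -, and the same
# common-mode noise lands identically on both conductors.
plus = [s + e for s, e in zip(signal, hum)]
minus = [-s + e for s, e in zip(signal, hum)]

# Differential receiver: subtract the - leg from the + leg. The common-mode
# hum cancels exactly; dividing by 2 restores the original amplitude.
recovered = [(p - m) / 2 for p, m in zip(plus, minus)]

err = max(abs(r - s) for r, s in zip(recovered, signal))
print("max error:", err)
```

In practice the rejection is limited by how well the two legs are matched (the receiver's common-mode rejection ratio), not by the arithmetic.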

High voltage outputs and/or balanced differential connections will not fix a noisy source. They also will not fix noisy circuits in a cheap amp.

Did I miss anything?
 

·
Registered
Joined
·
526 Posts
I've a couple of questions regarding this subject:

1: How important is the line-out impedance of the head unit with regard to noise? I never quite understood that specification. I am using an Eclipse CD8051, whose line-out impedance is lower than that of any other head unit I've ever seen. But, like I said, I just don't understand the importance of the line-out impedance specification.

2. I also have an Eclipse CD8053 without the BLA. I was considering getting the BLA, but I also started considering upgrading my amp. One of the amps I was considering was an old-school Zapco SymbiLink amp. Strictly for noise purposes, would I be better off going with the BLA and a non-SymbiLink Zapco amp, or getting the SymbiLink amp and not the BLA?

Can anybody here educate me?
 