or, from Andy at Audiofrog:
Andy's tech tip for today:
Setting gains with a DSP.
After doing tech support for someone yesterday, it occurred to me that this process may require a little explanation, because it's a little more complicated than the standard process we're all used to with analog gear.
With analog gear and a scope, we're used to finding the maximum undistorted output of a piece of gear, setting the input sensitivity of the next piece of gear to provide maximum undistorted output for it, and then, ultimately, setting the "gain" of the amplifier with some "overlap" so that the system sounds loud and isn't noisy. A little clipping isn't a big deal in ANALOG devices. This process was originally super important because we had a long chain of gain stages, each with its own noise floor, and head unit output was really low, like 100 mV. Systems were noisy.
In most cases, products don't have input sensitivity AND output gain controls, with the exception of AudioControl preamp processors. Most only have an input sensitivity control.
With DSP, there is an additional consideration. For DSPs, we have to be concerned with analog input gain when we are using analog input. We also have to be concerned with digital gain. They aren't the same.
When an analog signal goes into a DSP, it's converted to a digital signal. ADCs (analog-to-digital converters) are designed to convert analog signals of up to some maximum voltage into digital signals, and the bit depth (16 bit, 24 bit, 32 bit, etc.) determines the number of discrete steps available to describe the input voltage when it's converted to digital.
The ADC's maximum input voltage isn't adjustable. If the maximum signal sent from the radio to the ADC uses only a small part of that range, then the resolution of those steps is compromised. For a 16 bit signal, there are 65,536 steps (2^16). For a 24 bit signal, there are 16,777,216 steps (2^24).
If our ADC has a max input voltage of 4 V and our input signal is 2 V max, we waste half of those steps--we reduce the resolution with which the ADC can describe the signal. If our input signal is higher than the maximum input, we "clip" the input, and that produces distortion.
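If you want to sanity-check that arithmetic, here's a quick sketch in Python (just the math above, not any particular DSP's software; the 4 V / 2 V numbers are the same example values):

    import math

    # Steps available at a given bit depth.
    def steps(bits):
        return 2 ** bits

    # Each halving of the input signal relative to the ADC's maximum
    # throws away one bit of resolution (about 6 dB).
    def bits_wasted(adc_max_v, signal_max_v):
        return math.log2(adc_max_v / signal_max_v)

    print(steps(16))              # 65536
    print(steps(24))              # 16777216
    print(bits_wasted(4.0, 2.0))  # 1.0 -- a 2 V signal into a 4 V ADC wastes one bit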
Digital distortion isn't like analog clipping. It isn't smooth and it becomes nasty sounding instantly.
So, we don't want to overdrive the input to the ADC. In some DSPs, this is set with standard jumpers or pots to match the input voltage to the ADC's maximum. This is the case in the Helix Pro and it's well documented in the manual. For many other DSPs, this isn't so obvious. If the bit depth of the ADC is 16 bit, matching these is pretty important. If it's a 24 bit ADC, it's less important because there's plenty of resolution to throw away. It's ALWAYS important to be sure that we don't OVERDRIVE the input.
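To make that concrete, here's a minimal sketch of an idealized ADC (assuming 4 V full scale and 16 bits just for the example--real converters differ):

    # Idealized ADC: maps -full_scale..+full_scale volts onto signed
    # integer codes. Anything past full scale is clamped -- that hard
    # clamp is the instant, nasty digital clipping described above.
    def adc_convert(volts, full_scale=4.0, bits=16):
        max_code = 2 ** (bits - 1) - 1
        min_code = -(2 ** (bits - 1))
        code = round(volts / full_scale * max_code)
        return max(min_code, min(max_code, code))

    print(adc_convert(2.0))  # 16384 -- only about half of the 32767 available codes used
    print(adc_convert(5.0))  # 32767 -- over full scale, clamped: clipped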
Once the signal has been converted and sent to the processor, we often have input and output level controls. The input level control in the digital chain is NOT the same as the input sensitivity control for the analog signal before the ADC and the distinction is important.
In digital audio, there's no gain past full scale. There are no values above zero (0 dBFS). Maximum signal is defined as "all high bits". There are no numbers to describe a signal greater than that, and if it's exceeded, the numbers are sort of undefined. That's digital clipping and it sounds horrible--nasty--mechanical--kind of like a square wave, but it happens instantly.
So, in the DSP UI, if you have your input level control at 0 dB (or 100% in some GUIs), you have no room for any boost. In that case, a signal that comes from the radio, reaches maximum in the ADC, and is then sent to the DSP as all high bits is as loud as it can be. If you boost the EQ by 1 dB...BAM...clipping. If you use a crossover filter with a Q higher than .707...BAM...clipping.
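Here's how abrupt that is, in numbers (normalized so 1.0 represents all high bits):

    # One sample already at digital full scale (1.0 = all high bits).
    sample = 1.0
    boost_db = 1.0
    boosted = sample * 10 ** (boost_db / 20)  # 1.122... -- a value that doesn't exist in the format
    clipped = max(-1.0, min(1.0, boosted))    # what actually comes out: 1.0, flat-topped
    print(boosted, clipped)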
It matters where in the digital chain these level controls are placed. Attenuating the digital input is important if you're going to do any boosting of the signal in the EQ or crossover or if you're going to boost the output of the sub. If the input reaches all high bits and you boost the level of the sub channel in the DSP...BAM...clipping--nasty clipping.
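A rough way to budget for this: add up the worst-case boost you plan to apply anywhere downstream and attenuate the digital input by at least that much. A sketch with made-up example numbers (not recommendations):

    # Hypothetical worst-case boosts downstream of the digital input.
    planned_boosts_db = {
        "EQ boost (largest band)": 3.0,
        "crossover peaking (Q above 0.707)": 1.5,
        "sub channel output boost": 4.0,
    }

    # Attenuating the input by the sum guarantees a full-scale input
    # can never reach all high bits after processing.
    needed_attenuation_db = sum(planned_boosts_db.values())
    print(f"Set the digital input level to at least -{needed_attenuation_db:.1f} dB")

Summing everything is conservative--the boosts may not all land at the same frequency--but it's the safe starting point.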
So, when you are tuning with a DSP and you hear nasty distortion, it's important to try to figure out where it's coming from. If you attenuate the DIGITAL input and it goes away, then it's in your EQ/Crossover/Output level settings. If you attenuate the digital input and it doesn't go away, then you're overdriving the ADC and you should adjust the analog input sensitivity. If there's no analog input sensitivity adjustment on your DSP, then you have to use the radio's volume control judiciously or you need to build a voltage divider to attenuate the signal.
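If you do need a voltage divider, it's just two resistors: Vout = Vin x R2 / (R1 + R2). A sketch with example values (assuming a 4 V source and a 2 V ADC maximum; the actual resistor values should suit your head unit's output):

    # Two-resistor voltage divider: Vout = Vin * R2 / (R1 + R2).
    def divider_vout(vin, r1, r2):
        return vin * r2 / (r1 + r2)

    # Equal 1 kOhm resistors halve the signal (about -6 dB),
    # bringing a 4 V source down to the 2 V example maximum.
    print(divider_vout(4.0, 1000.0, 1000.0))  # 2.0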
My suggestion for this is to set the analog sensitivity to give you the maximum resolution if possible. Then, attenuate the digital input level in the UI. Then, tune the DSP. Then, with the amp gains turned way down, adjust the output levels of the DSP so that you hear NO distortion on any channel.
Then, adjust your amp gains all by the same amount to set the system level. A scope and a sine wave track aren't going to be all that helpful unless you know precisely which frequency in your signal is at the highest level and you have a sine wave track for that particular frequency.
Just get close doing this by ear so the system plays loud and so that noise is at a minimum. Or, if you MUST use a scope or a DD1 and a sine wave disc, set the gains with the input and output levels in the DSP at -3 dB or so, with the EQ set to flat, and with all crossovers at a Q of .707 or lower. Then, turn the digital input and output gains down, do your tuning, and then turn them up proportionately until the system is loud and there's still no distortion. When you do that, you are ONLY optimizing the resolution of the digital signal.
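"Proportionately" just means moving the digital input and output levels by the same number of dB, so the tune's relative balance never changes. For reference, dB maps to a linear factor like this (plain math, not any DSP's UI):

    # Moving input and output by the same dB preserves every channel's
    # relative level; only the overall digital level (and resolution) changes.
    def db_to_linear(db):
        return 10 ** (db / 20)

    print(db_to_linear(-3.0))  # ~0.708 -- the -3 dB starting point above
    print(db_to_linear(3.0))   # ~1.413 -- turning both back up by 3 dB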