Noah Santer
New Member
Hello.
I am a DAC designer and was long ago swayed by the argument that higher bit depths (i.e., greater than 16) provide no benefit, because the increase in dynamic range doesn't amount to anything useful (assuming the PCM data is mixed to occupy the full range). But during recent simulations of a sigma-delta 1-bit DAC, I graphed the results for 24 bits on a whim, and was shocked to find a significant improvement in several metrics.
The most obvious was that the noise floor dropped by about 48 dB (!), which I can talk myself into believing is both a) irrelevant, as it was already absurdly low, and b) an artifact of simulation unlikely to show up in real life, where electrical (mostly supply) noise is already the dominant factor. However, the usual harmonic spikes -- which before were the only form of noise above the 16-bit/96 dB limit -- were reduced as well, though to a lesser extent.
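For what it's worth, the 48 dB figure matches what the textbook ~6.02 dB/bit rule predicts for the 8 extra bits (8 × 6.02 ≈ 48.2 dB). Here's a minimal NumPy sketch I put together to sanity-check that -- the `quantize` helper is my own (hypothetical) mid-tread rounding, not anything from my actual simulation setup:

```python
import numpy as np

def quantize(x, bits):
    # Mid-tread quantizer: round to the nearest level of a
    # signed fixed-point representation with the given bit depth.
    levels = 2 ** (bits - 1)
    return np.round(x * levels) / levels

fs = 48000
n = 1 << 16
t = np.arange(n) / fs
x = 0.9 * np.sin(2 * np.pi * 1000 * t)  # near-full-scale 1 kHz tone

def snr_db(x, bits):
    # SNR of the quantized signal relative to the ideal input.
    err = quantize(x, bits) - x
    return 10 * np.log10(np.mean(x ** 2) / np.mean(err ** 2))

snr16 = snr_db(x, 16)
snr24 = snr_db(x, 24)
print(f"16-bit SNR:  {snr16:.1f} dB")
print(f"24-bit SNR:  {snr24:.1f} dB")
print(f"difference:  {snr24 - snr16:.1f} dB")  # roughly 48 dB
```

So the noise-floor drop itself is exactly what ideal quantization predicts; the more interesting question to me is the harmonic behavior.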
Is it possible that the effects of quantization, as amplified by the DAC, are ameliorated by the increased bit depth? Or is this just a quirk of ideal circumstances that won't have any impact on a real-world system? I'm going to do A/B testing on it in the next week or so, when I get a chance -- as well as, hopefully, some Klippel analysis -- but until then I thought I'd ask here.
I attached an example spectrum.
Full disclosure: I am in the employ of Harman International (although not as a DAC designer).
This is my first time posting, so I'm eager to hear what people have to say!