Voltage vs Current DACs

Gmm213

New Member
I came across a website which mentioned both but didn't explain the differences. I tried searching, but there's almost nothing on Google, and the search here (as far as I can tell) won't let me require certain words, so it returns every post that contains "dac" or "current" or "voltage", but not all three.
 
I see the difference as mostly a matter of convenience. Typically, the element creating the stairstepped analog signal has a current output. With current-output DAC chips, the very next stage is an I/V converter so that the stairstep becomes a voltage. Chip makers can opt to include that I/V converter inside the DAC package, thus turning a current-output DAC into a voltage-output DAC and saving space and cost.

Some designers of DAC units (the things audio enthusiasts purchase to put on their shelf) feel that using another op-amp as the I/V converter is just one more veil between the listener and nirvana. So they may choose to use a single resistor on the current-output DAC chip as the I/V converter. A simple single resistor has got to sound better than an entire op-amp, right? The problem here is that a current-output DAC chip has a certain voltage compliance its output must stay within, typically less than a diode drop (around 0.5 V or less). In that case, the resistive I/V converter has to be followed by a voltage gain stage of about 12 dB to get to a 2 V output. The net result is that you need either an I/V stage or a gain stage, so the total benefit of a resistive I/V converter is usually a wash.

The simple solution is for the chip manufacturer to put the I/V converter inside the DAC chip and just go from there. Audio purists will say the voltage-output DAC is better because you don't need the extra I/V converter stage. The reality is that it's still in the signal path, just hidden. What you can't see can't hurt you, right?
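To make the compliance/gain trade-off concrete, here's a quick back-of-the-envelope calculation. The numbers are illustrative, not taken from any particular DAC chip: a hypothetical current-output DAC with a 1.2 mA peak output swing and a 0.5 V compliance limit, feeding a resistive I/V converter, then a gain stage to reach a 2 V peak line-level output.

```python
import math

# Illustrative values (not from any specific datasheet):
full_scale_current_a = 1.2e-3   # peak DAC output current swing, amps
compliance_v = 0.5              # max voltage the DAC output pin may see, volts
target_v = 2.0                  # desired line-level output, volts peak

# Largest I/V resistor that keeps the output inside compliance
r_iv = compliance_v / full_scale_current_a   # ohms

# Peak voltage across that resistor (0.5 V by construction)
v_peak = full_scale_current_a * r_iv

# Gain (in dB) the following stage must add to reach the target
gain_db = 20 * math.log10(target_v / v_peak)

print(f"R_IV = {r_iv:.0f} ohms, gain needed = {gain_db:.1f} dB")
# Going from 0.5 V to 2 V is a factor of 4, i.e. ~12 dB
```

So the "single resistor" I/V still drags a roughly 12 dB gain stage along behind it, which is the wash described above.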
 