I am not debating the fact--yes, fact--that the DAC chipsets themselves have "improved" and expanded their capabilities, but I will still remind everyone that the biggest AUDIBLE difference among DAC units is the final analogue preamp stage that is necessary to take that "newly converted" analogue signal up to line-level output. I have harped on this many times. Once you convert the digital signal to an analogue one--how is it amplified up to a usable line-level output? A single cheap IC, one or two op-amps, discrete solid-state components, or even a tube or tube-hybrid preamp stage?
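For a rough sense of what that stage has to do, here is a back-of-the-envelope sketch in Python. The numbers are assumptions for illustration only: a DAC chip putting out roughly 1 Vrms at full scale, and the common ~2 Vrms consumer line-level target (the Red Book CD convention). Real chips vary, and current-output chips need an I/V stage before any of this even applies.

```python
import math

# Assumed, illustrative figures -- not the spec of any particular DAC:
dac_out_vrms = 1.0    # assumed full-scale voltage output of the DAC chip itself
line_out_vrms = 2.0   # common consumer line-level target (~Red Book CD convention)

gain_linear = line_out_vrms / dac_out_vrms   # 2.0x voltage gain needed
gain_db = 20 * math.log10(gain_linear)       # ~6 dB

# One way to get there: a non-inverting op-amp stage, where
# gain = 1 + Rf/Rg, so Rf = Rg gives exactly 2x.
print(f"Output stage needs {gain_linear:.1f}x gain ({gain_db:.1f} dB)")
```

The point isn't the arithmetic--it's that this modest bit of analogue gain is exactly where a cheap IC, an op-amp, a discrete stage, or a tube stage stamps its own character on the sound.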
Everyone knows (or at least, I think, acknowledges) that preamp stages in ANY system are a critical element in the "sonic signature" of the entire system. A crappy phono stage can make a $2K cart on a $5K table sound like sh*t--expensive sh*t, but sh*t nonetheless. A crappy preamp stage (overall, for the entire system)--whether it be the preamp stage of a receiver, an integrated, or a stand-alone unit--can make the best of the rest of the whole system sound like sh*t.
So are "old" DACs obselete, and new ones "better"? IMO--not necessarily. Much of it depends on that final analogue preamp stage component of the DAC unit. I have heard many "old", but higher-end (for their tme) DACs eat "modern" DACs for lunch and still want dessert. Low-end back then was low-end. High-end back then was high-end. The same applies now.