DAC (Total noobie warning!)

Njord Noatun

Super Member
I got this cheapo Sony DVD player that I mostly intend to use for playing DVD movies. For audio (and right out of the box) it sounds pretty shrill, although I am hoping that giving it a proper burn-in might help some. I figure it might be a good candidate to hook up to a DAC such as the much-touted Entech Number Cruncher.

This is where my understanding of these things is pretty much exhausted:

How do I hook up the DAC to the DVDP? Using the digital out/coax output on the DVDP? No need to mod or bypass anything internally in the DVDP?

Can I hook up more than one digital output device to one DAC (e.g., another DVDP) using some kind of digital switch hub? Which other devices have digital out that can be fed through a DAC - do any iPod docks have it?

Sorry for all the random questions - as I said, I am the DVDP/DAC noobie poster boy!

Thanks for any answers - random or not!

All the best,
 
Njord,

The Entech 205.2 has two coax "digital in" jacks and one optical "digital in," so you can hook up three devices: two via coax digital, one via optical. Not 100% sure if the 203.2 has three as well - check before you buy.

If the Sony has either an optical or digital coax "out," all you need to do is get a cable.

Regarding cables, I'd recommend going for a good one.

Normally I don't pay much attention to this, but I've read extensively that an optical cable's quality has a big effect on the end result. Probably somewhat so for the digital coax, too.

Since your Sony has a coax digital out, you'd use that.

Other devices that use it are MiniDisc decks, CD/DVD players, stand-alone CD transports, and although the iPod doesn't have an optical or digital out, there's a company that will modify it to add one.
 
How do I hook up the DAC to the DVDP? Using the digital out/coax output on the DVDP? No need to mod or bypass anything internally in the DVDP?

No, but you may need to choose a menu setting on the DVDP that sets its digital output to PCM. External DACs that I know of won't accept a DD/DTS signal. Your DVDP manual should describe this.

Can I hook up more than one digital output device to one DAC (e.g., another DVDP) using some kind of digital switch hub?

Yes. Some DACs have digital input switching, but there are also external digital switch boxes available from Audio Authority, Inday, and others. Sorry, don't know about iPod docks.
 
I've actually read that an optical cable's quality is mostly irrelevant. Being a digital signal, there is no way to have any sound quality losses - it's all or nothing: you receive the full digital signal or silence. It's like HDTV: you can get a better antenna to pull in more stations, but you can't get a better picture on the ones you are already receiving. Assuming you are receiving a station, there is no ghosting, snow, or noise that is not already in the actual video.
But then again, this is just what I've read, and while it seems to make sense, I can't back it up with any detailed understanding. I will say that my HDTV quality never changed no matter what antenna I used, once I had a station tuned in.
I have also read that a coaxial S/PDIF cable's quality can affect the sound, but I don't really get that either. It is also digital, so I'd assume it's all or nothing: if you are only getting part of the digital information, you should be getting something totally unintelligible to your receiver.
 
I've actually read that an optical cable's quality is mostly irrelevant. Being a digital signal, there is no way to have any sound quality losses - it's all or nothing: you receive the full digital signal or silence. It's like HDTV: you can get a better antenna to pull in more stations, but you can't get a better picture on the ones you are already receiving. Assuming you are receiving a station, there is no ghosting, snow, or noise that is not already in the actual video.
But then again, this is just what I've read, and while it seems to make sense, I can't back it up with any detailed understanding. I will say that my HDTV quality never changed no matter what antenna I used, once I had a station tuned in.
I have also read that a coaxial S/PDIF cable's quality can affect the sound, but I don't really get that either. It is also digital, so I'd assume it's all or nothing: if you are only getting part of the digital information, you should be getting something totally unintelligible to your receiver.


This makes total sense to me as well, and it was with disdain that I turned my cheek at fellas who pooh-poohed bargain cables for the sake of bragging they had multi-dollar cables tucked behind their systems, all the while a cardboard toilet paper roll and a piece of twine could have fooled 'em.

It was only after listening to the arguments of a few logical folk - some here, one a friend who writes for A$$A, and a few others out in the community - that I started to believe any of it.

There's a big thing to consider: the law of diminishing returns. One can quickly spend a lot of money for a nominal improvement. So be forewarned.

And as to cables, I (think I) can tell the difference between those cheap thin Radio Shack cables and, say, the beefier but still bargain-priced MCM cables, which are better shielded, etc.

There's a big difference in silver vs. copper, I'm told. I'll concur with that, in theory.

But the optical cable- that's a tough one. What makes it better?

Does a better cable offer less light diffraction due to stringent quality standards and control?

Does it keep the pulse brighter, by reducing absorption in its cable path due to higher-quality materials?

Is it a brighter, cleaner signal, because of optics that are polished and treated to a higher degree?

I dunno.

But when a few frugal friends who have a great ear convince me that all of the above may apply, what's the harm in spending a little more than the $10 Best Buy cable costs?

Worth a shot for me.
 
I believe that digital cable quality does make a difference, perhaps even more so for optical cables. My understanding is that optical cables can differ both in bandwidth and connector quality; certainly coax cables can differ in shielding and connector quality. Why does this matter? In terms of the bitstream, anything that compromises the ideal square-wave shape of the waveform makes it harder to receive the digital signal with accurate timing. Electrical shielding on coax cables can affect the extent to which other cables/components pick up RF interference radiated from the digital signal. Of course, components can be designed so that cables matter less, but in the real world, to my ears, they do make a difference.
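One way to see the "square-wave shape affects timing" point is a back-of-the-envelope sketch: a slower edge means any amplitude noise shifts the moment the receiver's threshold is crossed. All the numbers below are assumptions for illustration, not measurements of any actual cable or player.

```python
# Illustrative only: how a degraded (slower) edge converts amplitude
# noise into timing jitter at the receiver's decision threshold.
# The voltage swing, rise times, and noise level are assumed values.

def jitter_ns(swing_v, rise_time_ns, noise_v):
    """Approximate peak timing error: noise divided by edge slew rate."""
    slew = swing_v / rise_time_ns   # volts per nanosecond
    return noise_v / slew           # nanoseconds of threshold shift

# A fast edge: 0.5 V swing in 5 ns, with 20 mV of noise.
fast = jitter_ns(0.5, 5.0, 0.02)    # -> 0.2 ns of jitter
# The same signal after a lossy cable rounds the edge to 25 ns.
slow = jitter_ns(0.5, 25.0, 0.02)   # -> 1.0 ns of jitter
```

Same noise, five times the timing uncertainty, purely because the edge got slower.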
 
One of the gurus can weigh in with the technical info, but in very general terms, a digital coax cable is supposed to conform to a standard called the "75 ohm standard". The standard has many specifications, designed to make the cable perform correctly between the two transponders so that there is a minimal amount of reflection of the digital stream back down the cable. If too much reflection occurs, you get jitter, or worse, you get data corruption. IIRC, the only way to really measure performance of a digital coax cable is with some damn fancy toy called a "Time Domain Reflectometer".

Anyone wanting to theorize that there can't be any difference need only go back to the early days of computer networking, before 10BaseT came out. Coax cable was used, and when the installer was done with each run, they used a "Time Domain Reflectometer" to test the link. If the link was too short, too long, kinked, too near a 110 volt line, or made of cheap cable, the computer actually couldn't communicate, even though there was full continuity in the electrical connection.

It's also true that tests have shown that some very expensive digital cables don't meet the 75 ohm standard, and accordingly don't sound right in many systems. It depends on length, and the types of transponders on either end.

Now, do I understand why all this is? No. Nor do I understand what the "75 ohm standard" is all about. I am clear however that the quality of a digital connection isn't as simple as sending ones and zeroes down a line.
 
The bit stream is the key; if it is not perfect, the error correction system must kick in and "redo" the signal. In case this is not possible, there will be an artefact in the sound - it might just be a small piece of sound being pulled out, so the tune becomes a little shorter. This will not be noticed by most listeners, since we are talking fractions of a second here, but it shows how important the cable is to the quality of the sound. The more missing 1's or 0's, the lower the quality of the sound.

This is the same for both optical and copper/silver/gold - whatever is being used to transfer the signal.
 
It's also true that tests have shown that some very expensive digital cables don't meet the 75 ohm standard, and accordingly don't sound right in many systems. It depends on length, and the types of transponders on either end.

Now, do I understand why all this is? No. Nor do I understand what the "75 ohm standard" is all about. I am clear however that the quality of a digital connection isn't as simple as sending ones and zeroes down a line.

It's just a 75 ohm transmission line. CATV coax would work fine, though you'd have to use connector adaptors to get from RCA to F (CATV is a 75 ohm impedance system). Or use good cables meant for baseband video, which is also a 75 ohm impedance system.

As for reflections caused by bad (or wrong-impedance) cables, the length of the cable will also have an impact. You probably could get away with a very short wrong-impedance cable (like a foot long) between, say, the CD player digital output and the DAC box input, and never notice anything bad about the sound. But use the same kind of wrong cable for a 20-foot run between, say, the HDTV receiver and the DAC box, and it may get choppy or just suffer severe jitter issues from the reflections having enough time delay to mess up the "eye" pattern timing of the bits. In digital transmission systems, the "eye pattern" is a measure of how much margin you have between the lowest 1 bits and the highest 0 bits, and also of the clarity of unambiguous 1 and 0 time intervals. Get enough reflections, and the previous 0 could reflect enough to muddle up the 1 you are to receive right now...
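To put rough numbers on the "short cable gets away with it" point, here is a sketch comparing a reflection's round-trip time against the shortest S/PDIF cell. The 0.66 velocity factor is a typical assumed figure for coax (real cables vary), and the cell time follows from S/PDIF's 64-bit frames with biphase-mark coding (two cells per bit).

```python
# Rough timing comparison: reflection round-trip delay vs. the
# shortest S/PDIF cell. Velocity factor 0.66 is a typical assumed
# value for coax; actual cables vary.

C = 3.0e8                   # speed of light, m/s
VELOCITY = 0.66 * C         # signal speed in the cable, m/s

def round_trip_ns(length_m):
    """Time for a reflection to travel to the far end and back."""
    return 2 * length_m / VELOCITY * 1e9

def spdif_half_cell_ns(sample_rate_hz):
    """Shortest biphase-mark cell: 64 bits/frame, 2 cells/bit."""
    return 1e9 / (sample_rate_hz * 64 * 2)

cell = spdif_half_cell_ns(44100)   # ~177 ns at 44.1 kHz
short = round_trip_ns(0.3)         # ~1 ft run: about 3 ns - negligible
long_run = round_trip_ns(6.1)      # ~20 ft run: about 62 ns, a big
                                   # fraction of that 177 ns cell
```

On the one-foot cable, the bounce comes back while the edge is still settling; on the twenty-foot run, it lands a third of a cell later, right where it can smear the eye pattern.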
 
With the coax cable, my understanding is that the connectors are the key. This is where the 75 ohm spec falls flat on a typical RCA connector. This includes the jack on the receiving unit. I have some RG-59 with soldered, gold-plated RCA connectors. I have used it for stereo, S/PDIF, and subwoofer cable. Except for a broken solder connection, it has performed flawlessly, to my ears. Now, I don't know if it is accurate or not. We still have some of the coax network connections at work. Now that's cheap - 12 years and still running, though. Maybe there is a reflectometer lying around. I'll have to check.
 
FYI, the coax used for networking applications was 50 Ohm, rather than 75 Ohm, and used RG-58/U cable. This is not suitable for 75 Ohm audio and video applications.
 
FYI, the coax used for networking applications was 50 Ohm, rather than 75 Ohm, and used RG-58/U cable. This is not suitable for 75 Ohm audio and video applications.

True. My point was only that the standards don't refer to a particular resistance per meter or anything as simple as that. The plugs are different, and the length matters, and the shielding is a big damn deal.

I remember reading that 2m digital cables usually sound better than 1m, and that made me recall the old network coax standards in which connections couldn't be less than 4 ft or longer than... some other number that I've forgotten. Frankly, 10baseT was a huge improvement, but it won't work for music stuff.
 
I wasn't trying to pick nits, David.... just didn't want someone getting the idea to try some old network coax they might find lying discarded somewhere at the office.
 