DVI has no audio pass-through capability as yet, period.
As DVI and HDMI connections become more and more widely used, we are often asked: which is better, DVI (or HDMI) or component video? The answer, as it happens, is not cut-and-dried.
First, to clear away one element that can be confusing: DVI and HDMI are exactly the same as one another, image-quality-wise. The principal differences are that HDMI carries audio as well as video, and uses a different type of connector, but both use the same encoding scheme, and that's why a DVI source can be connected to an HDMI monitor, or vice versa, with a DVI/HDMI cable, with no intervening converter box.
The upshot of this article--in case you're not inclined to read all the details--is that it's very hard to predict whether a digital DVI or HDMI connection will produce a better or worse image than an analog component video connection. There will often be significant differences between the digital and the analog signals, but those differences are not inherent in the connection type and instead depend upon the characteristics of the source device (e.g., your DVD player) and the display device (e.g., your TV set). Why that is, however, requires a bit more discussion.
What are DVI, HDMI and Component Video?
DVI/HDMI and Component Video are all video standards which support a variety of resolutions, but which deliver the signal from the source to the display in very different ways. The principal difference is that DVI/HDMI deliver the signal in a digital format, much the same way that a file is delivered from one computer to another along a network, while Component Video is an analog format, delivering the signal not as a bitstream, but as a set of continuously varying voltages representing (albeit indirectly, as we'll get to in a moment) the red, green and blue components of the signal.
Both DVI/HDMI and Component Video deliver signals as discrete red, green, and blue color components, together with sync information which allows the display to determine when a new line, or a new frame, begins. The DVI/HDMI standard delivers these along three data channels in a format called T.M.D.S., which stands for "Transition Minimized Differential Signaling." Big words aside, the T.M.D.S. format basically involves a blue channel to which horizontal and vertical sync are added, and separate green and red channels.
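To make the encoding a little more concrete, here is a minimal Python sketch of the transition-minimizing stage that gives T.M.D.S. its name. It is simplified for illustration: the real encoder adds a second, DC-balancing stage that produces the final 10-bit symbol, and the function name and example byte here are our own.

```python
def tmds_transition_minimize(byte):
    """Simplified first stage of TMDS encoding: turn an 8-bit value into a
    9-bit code word with few 0->1 / 1->0 transitions.  The real encoder adds
    a second, DC-balancing stage (omitted here) to produce the 10-bit symbol
    that actually goes down the cable."""
    d = [(byte >> i) & 1 for i in range(8)]   # d[0] is the least significant bit
    ones = sum(d)
    q = [d[0]]
    if ones > 4 or (ones == 4 and d[0] == 0):
        # XNOR chain keeps transitions down for "busy" bytes
        for i in range(1, 8):
            q.append(1 - (q[i - 1] ^ d[i]))
        q.append(0)   # bit 8 records that XNOR was used
    else:
        # XOR chain for bytes that already have few ones
        for i in range(1, 8):
            q.append(q[i - 1] ^ d[i])
        q.append(1)   # bit 8 records that XOR was used
    return q

# Example: count transitions before and after encoding 0b10101010
raw = [(0b10101010 >> i) & 1 for i in range(8)]
enc = tmds_transition_minimize(0b10101010)
count = lambda bits: sum(a != b for a, b in zip(bits, bits[1:]))
print(count(raw), "transitions raw ->", count(enc), "transitions encoded")
```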
Component Video is delivered, similarly, with the color information split up three ways. However, component video uses a "color-difference" type signal, which consists of Luminance (the "Y", or "green," channel, representing the total brightness of the image), Red Minus Luminance (the "Pr," or "Red," channel), and Blue Minus Luminance (the "Pb," or "Blue," channel). The sync pulses for both horizontal and vertical are delivered on the Y channel. The display calculates the values of red, green and blue from the Y, Pb, and Pr signals.
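For illustration, here is a rough Python sketch of the arithmetic involved, assuming the standard-definition Rec. 601 coefficients (HD equipment uses Rec. 709, which changes only the constants); the function names are ours.

```python
def rgb_to_ypbpr(r, g, b):
    """Convert normalized R'G'B' (0.0-1.0) to Y, Pb, Pr using Rec. 601
    coefficients.  Y is a weighted sum of red, green and blue; Pb and Pr
    are scaled "blue minus luminance" and "red minus luminance" signals."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    pb = 0.5 * (b - y) / (1.0 - 0.114)
    pr = 0.5 * (r - y) / (1.0 - 0.299)
    return y, pb, pr

def ypbpr_to_rgb(y, pb, pr):
    """The display's side of the bargain: recover R, G, B from Y, Pb, Pr."""
    r = y + pr * (1.0 - 0.299) / 0.5
    b = y + pb * (1.0 - 0.114) / 0.5
    g = (y - 0.299 * r - 0.114 * b) / 0.587   # green falls out of the luminance equation
    return r, g, b

# Round-trip an orange-ish pixel: the display recovers the original values
print(rgb_to_ypbpr(1.0, 0.5, 0.0))
print(ypbpr_to_rgb(*rgb_to_ypbpr(1.0, 0.5, 0.0)))
```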
Both signal types, then, are fundamentally quite similar; they break up the image in similar ways, and deliver the same type of information to the display, albeit in different forms. How they differ, as we'll see, will depend to a great extent upon the particular characteristics of the source and display devices, and can depend upon cabling as well.
Isn't Digital Just Better?
It is often supposed by writers on this subject that "digital is better." Digital signal transfer, it is assumed, is error-free, while analog signals are always subject to some amount of degradation and information loss. There is an element of truth to this argument, but it doesn't hold up well against real-world considerations. First, there is no reason why any perceptible degradation of an analog component video signal should occur even over rather substantial distances; the maximum runs in home theater installations do not present a challenge for analog cabling built to professional standards. Second, it is a flawed assumption to suppose that digital signal handling is always error-free. The DVI and HDMI video signals carry no error correction; once information is lost, it's lost for good. That is not a consideration with well-made cable over short distances, but can easily become a factor at distance.
So What Does Determine Image Quality?
Video doesn't just pass directly from source material to the display, for a variety of reasons. Very few displays operate at the native resolutions of common source material, so when you're viewing material in 480p, 720p, or 1080i, there is, of necessity, some scaling going on. Meanwhile, the signals representing colors have to be accurately rendered, which depends on black level and "gamma," the relationship between signal level and the brightness actually rendered on screen. Original signal formats don't correspond well to display hardware; for example, DVD recordings have 480 lines, but non-square pixels. What all of this means is that some signal processing has to occur along the signal chain.
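As a simplified illustration of the kind of resampling that has to happen somewhere, here is a toy Python sketch that rescales a 720x480 storage frame to a 1280x720 fixed-pixel display using nearest-neighbor selection. Real scalers use far more sophisticated filtering, and the gradient "frame" here is just a stand-in for picture content.

```python
def scale_nearest(frame, new_w, new_h):
    """Resample a frame (a list of rows of pixel values) to new dimensions by
    nearest-neighbor selection.  Real video scalers use multi-tap polyphase
    filters, but the point is the same: output pixels have to be manufactured
    from input pixels that don't line up one-to-one."""
    old_h, old_w = len(frame), len(frame[0])
    return [
        [frame[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

# A stand-in "DVD frame": 720x480 storage resolution (non-square pixels),
# filled with a simple gradient instead of real picture content.
dvd_frame = [[(x + y) % 256 for x in range(720)] for y in range(480)]

# Rescale it for a 1280x720 fixed-pixel display
hd_frame = scale_nearest(dvd_frame, 1280, 720)
print(len(hd_frame), "rows x", len(hd_frame[0]), "columns")
```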
The argument often made for the DVI or HDMI signal formats is the "pure digital" argument--that by taking a digital recording, such as a DVD or a digital satellite signal, and rendering it straight into digital form as a DVI or HDMI signal, and then delivering that digital signal straight to the display, there is a sort of a perfect no-loss-and-no-alteration-of-information signal chain. If the display itself is a native digital display (e.g. an LCD or Plasma display), the argument goes, the signal never has to undergo digital-to-analog conversion and therefore is less altered along the way.
That might be true, were it not for the fact that digital signals are encoded in different ways and have to be converted, and that these signals have to be scaled and processed to be displayed. Consequently, there are always conversions going on, and these conversions aren't always clean. "Digital to digital" conversion is no more a guarantee of signal quality than "digital to analog," and in practice may be substantially worse. Whether it's better or worse will depend upon the circuitry involved--and that is something which isn't usually practical to figure out. As a general rule, with consumer equipment, one simply doesn't know how signals are processed, and one doesn't know how that processing varies by input. Analog and digital inputs must either be scaled through separate circuits, or one must be converted to the other to use the same scaler. How is that done? In general, you won't find an answer anywhere in your instruction manual, and even if you did, it would be hard to judge which scaler is better without viewing the actual video output. It's fair to say, in general, that even in very high-end consumer gear, the quality of signal processing and scaling circuits is quite variable.
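To illustrate why "digital to digital" is no guarantee, here is a toy Python example, using the same Rec. 601 constants as above and 8-bit rounding as an assumption, showing that even a simple YCbCr-to-RGB-and-back hop does not return every pixel unchanged.

```python
def rgb_to_ycbcr8(r, g, b):
    """Full-range 8-bit RGB -> 8-bit YCbCr (Rec. 601 constants), with the
    rounding that any real 8-bit pipeline has to perform."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 + 0.5 * (b - y) / (1.0 - 0.114)
    cr = 128 + 0.5 * (r - y) / (1.0 - 0.299)
    return round(y), round(cb), round(cr)

def ycbcr8_to_rgb(y, cb, cr):
    """The inverse conversion, again rounded and clipped to 8 bits."""
    r = y + (cr - 128) * (1.0 - 0.299) / 0.5
    b = y + (cb - 128) * (1.0 - 0.114) / 0.5
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

# Round-trip a sample of RGB values and count the ones that come back changed
changed = total = 0
for r in range(0, 256, 17):
    for g in range(0, 256, 17):
        for b in range(0, 256, 17):
            total += 1
            if ycbcr8_to_rgb(*rgb_to_ycbcr8(r, g, b)) != (r, g, b):
                changed += 1
print(f"{changed} of {total} sampled pixels altered by one digital-to-digital round trip")
```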
Additionally, it's not uncommon to find that the display characteristics of different inputs have been set up differently. Black level, for example, may vary considerably from the digital to the analog inputs, and depending on how sophisticated your setup options on your display are, that may not be an easy thing to recalibrate.
The Role of Cable and Connection Quality
Cable quality, in general, should not be a significant factor in the DVI/HDMI versus Component Video comparison, as long as the cables in question are of high quality. There are, however, ways in which cable quality issues can come into play.
Analog component video is an extremely robust signal type; we have had customers run analog component up to 200 feet, with no boosters, relays or other special equipment, and with no signal quality problems at all. However, at long lengths, cable quality can be a consideration--in particular, impedance needs to be strictly controlled to a tight tolerance (ideally, 75 +/- 1.5 ohms) to prevent problems with signal reflection which can cause ghosting or ringing.
DVI and HDMI, unfortunately, are not so robust. The problem here is the same as the virtue of analog component: tight control over impedance. When the professional video industry went to digital signals, it settled upon a standard--SDI, the serial digital interface--which was designed to be run in coaxial cable, where impedance can be controlled very tightly; consequently, uncompressed, full-bandwidth HD signals can be run hundreds of feet over SDI with no loss of information. For reasons known only to the designers of the DVI and HDMI standards, this very sound design principle was ignored; instead of coaxial cable, the DVI and HDMI signals are run balanced, through twisted-pair cable. The best twisted-pair cables control impedance to about +/- 10%. When a digital signal is run through a cable, the edges of the bits (represented by sudden transitions in voltage) round off, and the rounding increases dramatically with distance. Meanwhile, poor control over impedance results in signal reflections--portions of the signal bounce off of the display end of the line, propagate back down the cable, and return, interfering with later information in the same bitstream. At some point, the data become unrecoverable, and with no error correction available, there's no way to restore the lost information.
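As a rough worked example of what those tolerances mean, the standard transmission-line formula for the fraction of a signal reflected at a mismatch is gamma = (ZL - Z0) / (ZL + Z0). The Python sketch below compares the worst cases for 75 +/- 1.5 ohm coax and for twisted pair held to +/- 10% of a nominal 100 ohm differential impedance; the 100 ohm figure is our assumption about typical TMDS pairs, not something stated above.

```python
def reflection_coefficient(z_load, z_nominal):
    """Fraction of the incident voltage reflected at an impedance mismatch:
    gamma = (ZL - Z0) / (ZL + Z0)."""
    return (z_load - z_nominal) / (z_load + z_nominal)

# Component video coax: 75 ohms nominal, held to +/- 1.5 ohms
print("coax, worst case:         {:.1%}".format(abs(reflection_coefficient(76.5, 75.0))))

# TMDS twisted pair: assumed 100 ohms (differential) nominal, held to about +/- 10%
print("twisted pair, worst case: {:.1%}".format(abs(reflection_coefficient(110.0, 100.0))))
```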
DVI and HDMI connections, for this reason, are subject to the "digital cliff" phenomenon. Up to some length, a DVI or HDMI cable will perform just fine; the rounding and reflections will not compromise the ability of the display device to reconstruct the original bitstream, and no information will be lost. As we make the cable longer and longer, the difficulty of reconstructing the bitstream increases. At some point, unrecoverable bit errors start to occur; these are colloquially described in the home theater community as "sparklies," because the bit errors manifest themselves as pixel dropouts which make the image sparkle. If we make the cable just a bit longer, so much information is lost that the display becomes unable to reconstitute enough information to even render an image; the bitstream has fallen off the digital cliff, so called because of the abruptness of the failure. A cable design that works perfectly at 20 feet may get "sparkly" at 25, and stop working entirely at 30.
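The abruptness of that failure is easy to see in a toy model. The attenuation and noise numbers below are invented purely for illustration, not measurements of any real cable; the model simply assumes the received "eye" shrinks exponentially with length and that a bit is mis-read when Gaussian noise pushes it across the decision threshold.

```python
import math

def bit_error_rate(length_ft, attenuation_per_ft=0.06, noise=0.02):
    """Toy model of a digital link: the received eye amplitude shrinks
    exponentially with cable length, and a bit is mis-read when additive
    Gaussian noise carries it across the decision threshold.  The constants
    are illustrative only."""
    amplitude = math.exp(-attenuation_per_ft * length_ft)
    return 0.5 * math.erfc(amplitude / (noise * math.sqrt(2)))

for feet in (10, 20, 30, 40, 50, 60):
    print(f"{feet:3d} ft: about {bit_error_rate(feet):.1e} errors per bit")
```

With these made-up constants, the error rate is effectively zero through 40 feet, becomes "sparkly" around 50, and is hopeless by 60: a small increase in length, a catastrophic change in result.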
In practice, it's very hard to say when a DVI or HDMI signal will fail. We have found well-made DVI cables to be quite reliable up to 50 feet, but HDMI cable, with its smaller profile, is a bit more of a puzzle. Because the ability to reconstitute the bitstream varies depending on the quality of the circuitry in the source and display devices, it's not uncommon for a cable to work fine at 30, 40, or 50 feet on one source/display combination, and not work at all on another.
The Upshot: It Depends
So, which is better, DVI or component? HDMI or component? The answer--unsatisfying, perhaps, but true--is that it depends. It depends upon your source and display devices, and there's no good way, in principle, to say in advance whether the digital or the analog connection will render a better picture. You may even find, say, that your DVD player looks better through its DVI or HDMI output, while your satellite or cable box looks better through its component output, on the same display. In this case, there's no real substitute for simply plugging it in and giving it a try both ways.