I knew VGA is an analog standard while DVI and HDMI are digital standards, but I didn't know how they compared in terms of video quality, so I searched online for information when I got home. I found an informative article, DVI vs. HDMI vs. Component Video -- Which is Better?, at eCoustics.com Electronics Forums.
I also found another interesting article at HDMI vs DVI, which had the following information for DVI and HDMI as well as additional information on their differences and compatibility with one another:
Digital Visual Interface (DVI) is a digital standard introduced in 1999 by the Digital Display Working Group (DDWG). It is designed primarily for carrying uncompressed digital video data to a display. Originally the display was a computer monitor but DVI is now commonly used for television as well. One of the main areas of confusion with DVI is the number of different connectors available, which represent different functionality. There are three main connection types for DVI, DVI-D (digital only), DVI-A (analog only) and DVI-I (digital & analog).
High-Definition Multimedia Interface (HDMI), released late in 2002, is an all-digital audio/video interface capable of transmitting uncompressed streams of data similar to DVI. However HDMI also provides the ability to carry audio signals, in addition to video, as well as incorporating HDCP, which is a Digital Rights Management technology.
I was also interested in the maximum cable lengths for the different standards. The article mentions "neither HDMI or DVI work well over distances greater then 15 feet. If you need a cable longer then 10 feet you will definitely want to consider top quality cables. For anything greater then 15 feet, some companies offer amplifiers, equalizers and repeaters that can help bridge longer distances." A DVI Inline Repeater & Booster is available from DataPro International Computer Cables. That company states "Although the mandated DVI spec is 5 meters, we do carry cables up to 25 feet, and have succesfully extended them even longer than that (although results do vary depending on hardware). For guaranteed signal quality on long runs, you should consider using a powered DVI signal booster."
The author of the article, James Unterreiner, is the editor and publisher of Home Theater, Automation and Electronics. He provides additional information on the differences and similarities between HDMI and DVI on his own website at HDMI vs DVI.
I knew there were different types of DVI connectors, but didn't know the differences between them. He explains the differences and has a diagram, which I've included below, that shows the connector differences:
DVI-I (integrated - digital & analog)
DVI-D (digital only)
DVI-A (analog only)
Quoting from his webpage on HDMI vs DVI:
One of the main areas of confusion with DVI is the number of different connectors available, which represent different functionality. There are three main connection types for DVI.
DVI-D (digital only)
DVI-A (analog only)
DVI-I (digital & analog)
In addition, the digital connections also add to the confusion by adding Dual Link, creating an additional connection. Dual Link simply adds additional pins for a second set of data signals.
He notes that HDMI doesn't create the same possibility for confusion, since it uses a single connector type. On the other hand, there are different versions of the HDMI specification, and one has to keep that in mind when interconnecting devices, since later versions support capabilities not present in earlier ones.
An excellent webpage providing detailed information on the types of DVI connectors and when each is used is All About DVI, which appears on the website of a cable seller, DataPro International Computer Cables. The webpage provides diagrams that allow one to easily determine the type of DVI cable from the arrangement of pins on the connectors. The DVI-D connectors do not have any contacts around the flat blade.
DVI Connector Guide
DVI-D Single Link (Digital Only): two sets of nine pins, and a solitary flat blade
DVI-A (Analog Only): one set of eight pins and one set of four pins, with four contacts around the blade
DVI-I Single Link (Digital & Analog): two sets of nine pins and four contacts around the blade
DVI-D Dual Link (Digital Only): three rows of eight pins and a solitary flat blade
DVI-I Dual Link (Digital & Analog): three rows of eight pins and four contacts around the blade
For the digital versions, DVI-D and DVI-I, there are single link and dual link standards. The dual link versions support a higher maximum resolution. These DVI cables send information using a digital signaling format called TMDS (transition-minimized differential signaling). Single link cables use one 165 MHz TMDS transmitter, while dual link cables use two. The extra dual link pins effectively double the transmission bandwidth, providing an increase in speed and signal quality; e.g., over a single link a 60 Hz LCD can be driven at a resolution of 1920 x 1200, while a dual link connection can drive a display at 2560 x 1600.
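The link limits above can be checked with a back-of-the-envelope calculation. This is only a sketch: the ~11% blanking overhead is an assumed figure loosely based on reduced-blanking timings, and actual overhead varies with the video mode.

```python
# Back-of-the-envelope check of the single-link vs. dual-link limits.
# The ~11% blanking overhead is an assumed figure loosely based on
# reduced-blanking timings; actual overhead varies with the video mode.

SINGLE_LINK_MAX_MHZ = 165                    # one TMDS transmitter
DUAL_LINK_MAX_MHZ = 2 * SINGLE_LINK_MAX_MHZ  # two transmitters

def approx_pixel_clock_mhz(width, height, refresh_hz=60, overhead=1.11):
    """Approximate pixel clock in MHz, including blanking overhead."""
    return width * height * refresh_hz * overhead / 1e6

for w, h in [(1920, 1200), (2560, 1600)]:
    clock = approx_pixel_clock_mhz(w, h)
    link = "single link" if clock <= SINGLE_LINK_MAX_MHZ else "dual link"
    print(f"{w}x{h} @ 60 Hz: ~{clock:.0f} MHz pixel clock -> {link}")
```

With these rough figures, 1920 x 1200 at 60 Hz lands just under the 165 MHz single link ceiling, while 2560 x 1600 needs roughly 270 MHz, which is why it requires dual link.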
I was mainly concerned about the quality of the video between the two standards. From what I read, there doesn't seem to be a quality difference. The main differences are that the later HDMI standard supports audio as well as video and incorporates a content protection scheme called High-bandwidth Digital Content Protection (HDCP), which is not a selling point for me. Compatibility exists between the two standards, since they use the same encoding for video, which allows a simple conversion cable to be used to interconnect two devices when one has an HDMI connector and the other a DVI connector. The Wikipedia article on HDMI states "A DVI signal is electrically compatible with an HDMI video signal; no signal conversion is required when an adapter or asymmetric cable is used, and consequently no loss in video quality occurs."
While I was looking at monitors and LCD TVs, I also checked the prices for HDMI cables. The Best Buy where I was shopping carried HDMI cables from Monster Cable, but they seemed too expensive to me. From what I've read, people don't find any noticeable improvement in audio or video quality when buying Monster Cable cables. I'm sure many people assume their cables provide higher quality audio/video simply because they cost more. The company also seems litigious and not averse to filing frivolous lawsuits against other companies in unrelated fields that use "Monster" in their company name (see Monster Cable learns nothing, sues Monster Transmission). As a counterpoint, though, James Unterreiner, the editor and publisher of Home Theater, Automation and Electronics, recommended Monster Cable in his HDMI vs DVI article for HDMI cables over 15 feet long, where the quality of the cable used is more important. Several of the posters commenting on DVI vs. HDMI vs. Component Video -- Which is Better? recommended MonoPrice.com as a good source for inexpensive cables.
As for the difference between the digital standards, such as DVI and HDMI, and the older VGA standard, a CNET review from 2004 titled LCD connections: analog vs. digital states "the advantage of digital signals for LCDs is of somewhat less importance now than it was a few years ago. Analog signal processing has improved to the point where major differences in image quality can be difficult to detect. Unless you're a pro photographer, a prepress professional, or someone else who needs superprecise, top-notch image quality, you should be fine using a CRT or an LCD on an analog signal." And though I'd previously heard claims that a digital interface between a computer and monitor is better than an analog one, because with digital interfaces on both ends there is no analog-to-digital conversion, the author of DVI vs. HDMI vs. Component Video -- Which is Better? has this to say on the matter:
That might be true, were it not for the fact that digital signals are encoded in different ways and have to be converted, and that these signals have to be scaled and processed to be displayed. Consequently, there are always conversions going on, and these conversions aren't always easy going. "Digital to digital" conversion is no more a guarantee of signal quality than "digital to analog," and in practice may be substantially worse. Whether it's better or worse will depend upon the circuitry involved--and that is something which isn't usually practical to figure out. As a general rule, with consumer equipment, one simply doesn't know how signals are processed, and one doesn't know how that processing varies by input. Analog and digital inputs must either be scaled through separate circuits, or one must be converted to the other to use the same scaler. How is that done? In general, you won't find an answer to that anywhere in your instruction manual, and even if you did, it'd be hard to judge which is the better scaler without viewing the actual video output. It's fair to say, in general, that even in very high-end consumer gear, the quality of circuits for signal processing and scaling is quite variable.
Chris Pirillo, who runs Lockergnome, has a video, DVI or VGA, in which he talks about the differences between the two. There's also information on the pros and cons of these and other video and audio cabling standards at How to connect your HDTV and Home Theater.
In addition to looking in the PC monitor section of Best Buy, I checked what was available in the LCD TV section, thinking I might also be able to connect other video devices besides her PC to it. I thought a Dynex 26" model DX-L26-10A LCD TV, which supports 720p, might work. 720p has a widescreen aspect ratio of 16:9, a vertical resolution of 720 pixels and a horizontal resolution of 1280 pixels, or 1280x720, for a total of 921,600 pixels. The front of the Dynex DX-L26-10A box stated "720p 1366 x 768 High Definition Resolution". The Wikipedia article on 720p states the following:
Actual 1280x720 flat-panel native resolution is uncommon. Displays with 1280x720 resolution include the Gateway FPD1775W, Westinghouse LCM-27w4, HP w15v, a small number of notebooks, and some Toshiba and Sharp televisions. Most TV's capable of 720p but not 1080p are 1280x768, 1280x800, 1360x768, 1366x768, 1440x900, or 1680x1050 resolution and display 720p with letterboxing or scaling. For resolutions with 1280 horizontal resolution, but higher than 720 vertical resolution, 1280x720 can be displayed with letterboxing.
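The pixel arithmetic above is easy to verify, and a quick calculation also shows what letterboxing a true 720p frame on a 1366x768 panel would look like. The unscaled 1:1 centering below is an assumption for illustration; many TVs scale the image instead.

```python
# Verify the pixel counts mentioned above, and work out the letterbox
# bars left if a 1366x768 panel centered a 1280x720 frame unscaled.
# (Unscaled 1:1 centering is an assumption; many TVs scale instead.)

def pixels(width, height):
    return width * height

print(pixels(1280, 720))       # 921600 -- true 720p
print(pixels(1366, 768))       # 1049088 -- a common "720p" panel

side_bar = (1366 - 1280) // 2  # black bar width, left and right
top_bar = (768 - 720) // 2     # black bar height, top and bottom
print(side_bar, top_bar)       # 43 24
```

So a 1366x768 panel actually has about 14% more pixels than a 720p frame, which is why such sets must either scale or letterbox 720p content.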
The person for whom I was going to buy the unit had an LG Flatron L1930B (analog) monitor connected to the NVIDIA GeForce 6150LE graphics built into her computer's motherboard. The GeForce 6150LE provides only a VGA connection. She had her resolution set to 1280 x 1024. I wasn't sure whether she might see some diminution of display quality. I found some people replying to an LCD TV as a Computer Monitor posting indicating that text might look poor on an LCD TV. I also found posters at LCD Monitor Vs LCD HD TV suggesting an LCD monitor would be preferable to an LCD TV for use with a PC because of the higher resolution an LCD monitor can provide. That made me worry that purchasing a 26" LCD TV for someone who wanted a larger monitor, primarily for her PC connection, might be a poor choice. Though I did find someone responding to another post, LCD TV HDTV Compatibility w/ PCs at Tom's Hardware Forums, indicating that he uses a 30" LCD TV with a PC and finds it acceptable.
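One more wrinkle, sketched here using only the resolutions mentioned in the post: her 1280 x 1024 desktop has a squarish 5:4 shape, while the TV panel is roughly 16:9, so filling the TV's screen would stretch the image.

```python
# Compare the aspect ratio of her 1280x1024 desktop with the TV panel's.
from math import gcd

def aspect(width, height):
    g = gcd(width, height)
    return f"{width // g}:{height // g}"

print(aspect(1280, 1024))  # 5:4 -- her current monitor setting
print(aspect(1280, 720))   # 16:9 -- true 720p
print(aspect(1366, 768))   # 683:384, approximately 16:9

# Stretching a 5:4 image to fill a 1366x768 panel widens it by ~42%:
print(round((1366 / 768) / (1280 / 1024), 2))  # 1.42
```

So even apart from text sharpness, she would have to choose between distorted stretching, black side bars, or running the PC at a widescreen resolution the TV supports.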