Display Ports Question
Why would you use the VGA, when there are DVI to HDMI interface cables?
As to the resulting image quality, I couldn't say.
Because I didn't know those adaptors existed?
Still, it gives me something to look for. Does anyone else know about the throughput question, though? My curiosity's still piqued, and I might need to know for plan B.
DVI = HDMI
Seriously, it's the exact same wiring; HDMI is just a more compact connector, plus a couple of extra wires for sound. You should be able to easily find a DVI<->HDMI adapter or cable. That's more likely to work and give good results with an HDTV than a VGA connection is.
Good luck!
EDIT: Some more technical details.
DVI and HDMI both carry a digital video stream. VGA is, of course, analog. The DVI specification also includes (though it's optional) around half the pins as analog VGA-style output. So that little DVI to VGA connector is just an adapter that lets you connect your old analog monitor to the DVI port and still get a signal.
HDMI's digital part is identical to DVI, but it doesn't include the extra pins for an analog signal.
In practice, the signal degradation from going through an analog (VGA) step will vary based on the quality of your cabling and connectors, and how much interference you get. It will be more obvious at high resolutions than at lower ones. You may also encounter phasing problems with the VGA cable if the HDTV has a fixed pixel grid, such as an LCD. You can work around that by using a special test pattern and letting the TV re-sync to the signal, but in general it's easier and more bulletproof to just use the digital connection.
EDIT Again: More on the phasing issue. The test pattern is here:
http://www.techmind.org/lcd/phasing.html
If you go to that page on an LCD that's connected by an analog cable, you'll see lots of motion and jumping around.
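To make the phasing issue concrete, here's a minimal sketch (not the techmind.org pattern itself, just the same idea) that generates a one-pixel checkerboard: every pixel alternates black/white, which is the worst case for an analog link's pixel-clock and phase recovery. The filename and dimensions are arbitrary; it writes a plain grayscale PGM using only the Python standard library.

```python
def write_checkerboard_pgm(path, width=256, height=256):
    """Write a 1-pixel black/white checkerboard as a binary PGM file.

    On a digital (DVI/HDMI) link this displays as a crisp checkerboard;
    on a badly-phased analog (VGA) link it shimmers and smears, because
    the display's ADC samples between pixel transitions.
    """
    pixels = bytearray()
    for y in range(height):
        for x in range(width):
            # Alternate 255 (white) and 0 (black) every single pixel.
            pixels.append(255 if (x + y) % 2 == 0 else 0)
    header = f"P5\n{width} {height}\n255\n".encode("ascii")
    with open(path, "wb") as f:
        f.write(header + bytes(pixels))

write_checkerboard_pgm("phase_test.pgm")
```

Display the resulting image full-screen at the panel's native resolution; if the TV's auto-adjust can't lock onto it cleanly over VGA, that's the phasing problem in action.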
It helps if you know what HDMI is; namely, it's DVI. Quite literally. Sure, DVI has extra pins for extra stuff (like VGA passthrough and a second set of links for really high resolutions), but the two are electrically identical. A DVI to HDMI adaptor just re-routes some wires to make everything fit.
When the people who make TV cables came to draw up HDMI, they used DVI as a basis for it. They took out the VGA analog bits, removed the second set of links, made the cable thinner, designed a plug that could be blindly inserted (for when you fumble around behind the TV), and added copy protection to it.
Just remember, don't try to go VGA > DVI > HDMI, as I heard someone once did. DVI carries the analog VGA picture and the digital DVI/HDMI picture on separate wires. This doesn't matter when you use a graphics card with a built-in DVI output, but it does mean you can't chain a bunch of adaptors onto an older VGA-only graphics card. Analog doesn't magically turn into digital.
tl;dr, get an HDMI cable, DVI adaptor, HDMI adaptor or whatever. Your HDTV will work with it.
Necrobond - 50 BS/Inv Scrapper made in I1
Rickar - 50 Bots/FF Mastermind
Anti-Muon - 42 Warshade
Ivory Sicarius - 45 Crab Spider
But yes, of course Hans is wet; he is standing under a waterfall.
Here, DVI to HDMI dongles and cables.
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
I would just buy a TV that has a DVI input. I find that connecting a DVI port to an HDMI port through an adaptor doesn't give as nice an image as connecting DVI to DVI, unless of course the HDMI port on the TV is PC-enabled.
So, to make sure I understand the answer (I'm a bit thick, so bear with me), there'd still be some degradation of the digital signal as it's fed through an analog port? (Specific details may vary, please make sure you live in a participating state, etc.)
Thanks muchly to Father Xmas for the converter link; assuming the card I end up getting doesn't have HDMI output, I'll just stick one of them onto the end and let it rip.
Thanks everyone!
So, to make sure I understand the answer (I'm a bit thick, so bear with me), there'd still be some degradation of the digital signal as it's fed through an analog port?

DVI-A (A for Analog) uses some of the connector's pins to send an analog signal; this is basically the same signal an old-school VGA ("blue") plug carries, and you can get a simple cable or passive adapter that will convert between the two. Some cheap devices claim a "DVI interface" but only support DVI-A, and are therefore no better quality than VGA.
DVI-D (D for Digital) uses a different set of pins to send a digital signal; this is basically the same as the video-only portion of a basic HDMI 1.0 signal, except that HDMI devices have much higher minimum requirements. You can get a simple cable or passive adapter that will convert between the two.
DVI-I (I for Integrated) uses all or most of the pins to handle both DVI-A and DVI-D on the same physical plug. If you use the proper cable to plug it into a DVI-D-capable device, you get the better signal, and it's still backwards compatible with older devices if you simply use a different cable or adapter. Hopefully most quality devices with a DVI interface are using this by now, but buyer beware.
The best results are from end-to-end HDMI, with all devices HDCP compliant and supporting at least HDMI 1.3a (HDMI 1.4 is nominally out, but still very rare). Barring that, you'll get nearly as good basic video quality out of a DVI-D or DVI-I in digital mode to HDMI, but you may lose out on some of your HDTV's advanced features (deep color support, Dolby TrueHD, etc.) and you'll have to run your audio separately. Putting anything analog in the chain will basically destroy the advantage of having an HDTV, and depending on the size and resolution it may not work at all.
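On the throughput side, a rough way to sanity-check whether a video mode fits over a single-link DVI/HDMI connection is to compare its pixel clock against the 165 MHz single-link TMDS limit. The sketch below uses the standard CEA-861 total timings for 1080p60 (2200 x 1125 pixels per frame including blanking, which I'm confident of); totals for other modes would be assumptions you'd need to look up.

```python
# Single-link DVI (and early HDMI) tops out at a 165 MHz pixel clock.
SINGLE_LINK_MAX_HZ = 165_000_000

def pixel_clock_hz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (including blanking) x refresh rate."""
    return h_total * v_total * refresh_hz

# 1920x1080 @ 60 Hz with CEA-861 timings: 2200 x 1125 total.
clock = pixel_clock_hz(2200, 1125, 60)
print(clock / 1e6, "MHz")           # 148.5 MHz
print(clock <= SINGLE_LINK_MAX_HZ)  # True: 1080p60 fits in single-link
```

So even a 1080p HDTV sits comfortably inside what a plain single-link DVI output can push; you'd only need dual-link DVI for resolutions beyond that.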
In your particular situation, I would suggest getting a male DVI to female HDMI adapter to plug into your video card, and then running an ordinary male-male HDMI cable from the adapter to your HDTV. HDMI cables are thinner and (theoretically) cheaper than full DVI cables of the same quality, particularly for longer runs, since they don't need to carry the old-style analog wires. This way, if you later go true HDMI, you won't need a different cable. (You'll also need to separately wire your PC's audio into either your HDTV's audio inputs or your stereo / surround system.)
Side note: Buying HDMI cables at someplace like Best Buy is a bad idea; many A/V places make much of their profit on overpriced cables and accessories. It's not unusual to see lower-grade cables priced at 3x to 4x the cost of higher-grade cables ordered online.
DVI (M) to HDMI (F) adapter examples:
http://www.dealextreme.com/details.dx/sku.1279
http://www.ramelectronics.net/audio-...HDMIDVIR2.html
http://www.newegg.com/Product/Produc...82E16812186016
Miuramir, Windchime, Sariel the Golden, Scarlet Antinomist...
Casino Extortion #4031: Neutral, Council+Custom [SFMA/MLMA/SLMA/FHMA/CFMA]
Bad Candy #87938: Neutral, Custom [SFMA/MLMA/SLMA/FHMA/HFMA]
CoH Helper * HijackThis
Let me preface this by warning that it's going to be very vague; my new rig is only just in the planning stages right now, which means I haven't decided on any actual hardware just yet. Once the devs let us know the requirements for GR, that will change.
The situation is: I'm going to be hooking up the new rig to an HDTV to use as a monitor, and I just saw a potential snag. See, my current card (which is attached to a normal monitor) only has DVI output. I'm assuming this is going to be a trend for newer cards, so I can't guarantee that the new rig will have the necessary HDMI output for use with the TV. However, said existing card did come with a DVI-to-VGA adaptor for those with older monitors.
So, the question is: will the DVI-quality output from the card still display in all its glory? Or does the VGA-style plug have some inherent limitation that will step image quality down no matter how pretty it should be?