
DVI vs VGA


Slowtrain


Do you lose picture quality if you run from a DVI video card through a DVI-VGA adaptor to the VGA input on the monitor, instead of just running from the DVI out of the graphics card to the DVI in of the monitor?


Most definitely. DVI is intended to maximize picture quality of a digital display. Converting it over to analog loses that. I can't give you any technical reason for it, but a) yes DVI has better picture quality than VGA and b) no, you do not retain that quality converting to VGA.

 

From what I'm grasping from the wiki entries on the two, the 29 pins of DVI transmit quite a bit more information than the 15 pins of VGA.



 

 

Thanks, Tale.

 

For some reason my DVI connection has died, though my VGA is working. I will replace the cable this weekend. Hopefully that is the problem and not something on the monitor itself. That would totally suck.


Most definitely. DVI is intended to maximize picture quality of a digital display. Converting it over to analog loses that. I can't give you any technical reason for it, but

technically, any signal is transmitted in analog in at least some sense, since a truly "digital" signal is only a mathematical construct. DVI signals are differential signals similar to LVDS (low voltage differential signalling). essentially, each of the three twisted pairs (RGB) has two possible phases, {1, -1} or {-1, 1}, one of which represents a "1" and the other a "0". this is very similar to binary phase-shift keying (BPSK). the signal also has some coding on it (similar to 8b/10b).
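a minimal python sketch of that idea, one bit per differential symbol, using one phase for a 0 and the other for a 1 (the "+ difference = 0" convention mentioned below). it is only an illustration and leaves out the 8b/10b-style TMDS coding a real DVI link applies:

```python
# toy model of one DVI-style differential pair, one bit per symbol.
# convention: {+1, -1} carries a 0, {-1, +1} carries a 1
# (positive difference = 0, negative difference = 1).
# illustration only -- real DVI adds TMDS/8b-10b-style coding on top.

def encode_bit(bit):
    """map a bit onto the two lines of the pair."""
    return (-1.0, +1.0) if bit else (+1.0, -1.0)

def decode_pair(line_a, line_b):
    """the receiver only looks at the difference between the two lines."""
    return 0 if (line_a - line_b) > 0 else 1

bits = [1, 0, 0, 1, 1, 0]
recovered = [decode_pair(*encode_bit(b)) for b in bits]
assert recovered == bits
```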

 

the "gain" is due to the fact that since only 1s and 0s are being sent differentially, even when there is noise the comparator at the receive end, it only sees the difference between the two lines in a pair, and decodes it as its respective 1 or 0 output. for example, say a {1, -1} was transmitted but there is a bunch of noise. since both lines in a pair are together, they both see approximately the same noise. even if the noise is high, say +1 and the comparator sees {2, 0}, it will still decode the difference as +2 (first minus last), which is the same as a {1, -1} (usually, a + difference is a 0, and a - difference is a 1, but that's another story, and it depends on other things).

 

barring extremely high-noise environments, the data is received exactly as it is transmitted, so there is no loss, i.e. you see exactly what is intended to be seen. the only errors come from the standard bit error rate of transmitting such signals, which is pretty low over short distances (maybe one out of a million bits in error, probably less than that).
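for a sense of scale, a rough back-of-the-envelope sketch (the resolution and colour depth below are assumptions for illustration, and the one-in-a-million figure is the pessimistic upper bound from above; a short cable should do considerably better):

```python
# rough illustration of what a given bit error rate means per frame.
# all figures here are assumptions for the example, not spec values.
width, height = 1280, 1024        # assumed resolution
bits_per_pixel = 24               # 8 bits each for R, G, B
ber = 1e-6                        # "maybe one out of a million bits" upper bound

bits_per_frame = width * height * bits_per_pixel
print(bits_per_frame * ber)       # expected errored bits per frame at that BER (~31)
```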

 

that said, converting to analog means that each of the RGB components is converted to a number, 8 bits per line, and transmitted as an amplitude-modulated signal (essentially; in reality it is more complicated than that). now you have to detect the level and determine what number it represents. while not necessarily more difficult, it is more prone to error, particularly in the presence of noise, since a subtle shift in voltage will appear as a different value on the respective line, altering the picture. another thing that has an impact is imperfect load matching, but that's yet another story.
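a matching toy sketch of the analog side, assuming a ~0.7 V video swing (the swing and the 10 mV noise figure are assumptions for illustration only):

```python
# toy model of an analog (VGA-style) channel: an 8-bit value becomes a
# voltage level, and the receiver has to guess the level back. a small
# offset on the wire turns directly into a different decoded value.
FULL_SCALE_V = 0.7                # assumed video swing

def to_voltage(value):
    """encode an 8-bit channel value (0-255) as an analog level."""
    return value / 255 * FULL_SCALE_V

def from_voltage(volts):
    """decode a received level back to the nearest 8-bit value."""
    return max(0, min(255, round(volts / FULL_SCALE_V * 255)))

sent = 200
noise_v = 0.010                   # ~10 mV shift on the line
print(sent, from_voltage(to_voltage(sent) + noise_v))   # 200 vs 204
```

the same 10 mV riding equally on both lines of the differential pair above would change nothing, which is the whole point of the comparison.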

 

a) yes DVI has better picture quality than VGA and b) no, you do not retain that quality converting to VGA.

 

From what I'm grasping from the wiki entries on the two, the 29 pins of DVI transmit quite a bit more information than the 15 pins of VGA.

yes, yes and yes.

 

taks


well, I tested with a DVI cable I borrowed from a friend, but it didn't solve the problem.

 

About 50% of the time, it takes about five minutes for the monitor to display any image (which by then is the Windows desktop); the other 50% of the time, it starts showing an image the moment the system begins booting. If I switch over to the VGA input, the problem disappears.

 

As long as the problem doesn't get worse I can live with it.


as i reread your post, it sounds like the DVI input port on the monitor has died or is in the process of dying. it could simply be a bad connector; maybe one of the pins broke away from its solder point. hard to say without looking at the connector.

 

really, the difference between DVI and VGA is not that huge, particularly if you have a crappy monitor. :thumbsup:

 

taks


