DVI and TFT / Advantages?
I bought an 18" Samsung TFT. I am thinking of a new video card as well. The monitor supports DVI. Now my question: is there an advantage of using DVI on both sides (video card and monitor)? Is the picture better or the monitor faster? Or does it change simply nothing? If not I don't have to buy a new video card with DVI.
Responses to this topic
With an analogue (sub-D type connector), the red, green and blue channels are combined and transmitted as a composite analogue signal which then has to be split back by the monitor into the individual R, G & B signals. This conversion process causes a degradation in the signal; analogue signals are also subject to noise and interference, which causes further degradation.
With DVI, the monitor signal is digital, and is split into distinct red, blue and green channels. This means that there is no degradation of the signal quality, which ultimately means that the monitor has a clearer picture - in practice, less blurry, with a more distinct boundary between areas of different colour.
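If you want a rough feel for the difference, here is a toy sketch in Python (purely an illustration with made-up numbers, nothing like a real transmission chain): each 8-bit colour value becomes a voltage on the analogue cable, picks up a little noise, and is read back by the panel, while the digital link just hands the bits over unchanged.

import random

def send_analogue(values, noise_mv=8.0, full_scale_mv=700.0):
    # Toy analogue link: 8-bit code -> voltage -> cable noise -> re-quantised 8-bit code.
    received = []
    for v in values:
        volts = v / 255.0 * full_scale_mv            # DAC: code to voltage
        volts += random.gauss(0.0, noise_mv)         # noise and interference on the cable
        code = round(volts / full_scale_mv * 255.0)  # panel ADC: voltage back to a code
        received.append(min(255, max(0, code)))
    return received

def send_digital(values):
    # Toy DVI link: the bits arrive exactly as they were sent.
    return list(values)

scanline = [0, 0, 0, 255, 255, 255, 128, 64]
print("sent     :", scanline)
print("analogue :", send_analogue(scanline))
print("digital  :", send_digital(scanline))

Run it a few times and the analogue values wander around the originals (which on screen shows up as shimmer and soft edges), while the digital values never change.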
Most video cards produced in the last year or so come with DVI outputs, but it's safer to check before you buy.
Rgds
AndyF
I have bought an HP 15" TFT as well, and I'm using the analog VGA connector on my GF2MX. I would like to upgrade, though, and I was looking at the ATI Radeon 8500 (LE). The problem with Nvidia cards is that they support DVI only on their 128MB models (GF3 & GF4), which pushes them out of my price range. The Radeon has one of the best, if not the best, display quality in both 2D and 3D. I am seriously thinking of ditching the Nvidia train and catching the ATI bandwagon.
On the other hand, I will ditch the AMD wagon and jump on the Intel train.
My GF2 Ultra (Gainward) certainly had a DVI output on it.
My GF3 Ti200 (Hercules) 64MB certainly doesn't; I think it's down to the individual cards rather than a blanket "no Nvidia 64MB cards have DVI output".
According to Gainward's site, their GF3 Original and GF3 Ti500 both have DVI outputs too.
Quote:
but as far as I know their drivers are not the best and can cause some problems
I have an ATI 8500 and use it for gaming and dual-monitor support. As far as I am concerned, ATI drivers are great, and have been for about 6 months or so now. I was totally pro-Nvidia, then, well... I worked at a PC repair shop, and I have seen a lot of Nvidia-equipped computers with graphics problems. I am so happy with my 8500, and at $274 CDN I am saving a lot over the $400+ for a GF3 or GF4 (non-MX).
Well Vasco, you could wait until Tuesday 14th when Matrox announce their new card.
If rumours are true it's going to be an amazing piece of kit, with the usual Matrox image quality, and available to market soon after the announcement.
Quote:
With an analogue (sub-D type connector), the red, green and blue channels are combined and transmitted as a composite analogue signal which then has to be split back by the monitor into the individual R, G & B signals. This conversion process causes a degradation in the signal; analogue signals are also subject to noise and interference, which causes further degradation.
With DVI, the monitor signal is digital, and is split into distinct red, blue and green channels. This means that there is no degradation of the signal quality, which ultimately means that the monitor has a clearer picture - in practice, less blurry, with a more distinct boundary between areas of different colour.
Most video cards produced in the last year or so come with DVI outputs, but it's safer to check before you buy.
Rgds
AndyF
I'm fairly certain analogue VGA uses discrete red, green and blue signals...
I'm also fairly certain that, since flat panels are by their very nature digital devices, having to convert the incoming analogue signal to digital (after it's already been converted from digital to analogue by the RAMDAC in the video card) is the source of much of the signal degradation...
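To put that double conversion in picture terms, here is a little sketch (again just illustrative, with made-up numbers): the card's RAMDAC turns pixel codes into an analogue waveform, the cable and amplifiers smear it slightly, and the panel's ADC samples it back into codes. A sharp black-to-white edge picks up in-between grey values - exactly the soft boundary you see on screen - whereas a DVI link never leaves the digital domain, so the edge comes through untouched.

def vga_round_trip(pixels, smear=0.25):
    # Toy VGA path: DAC -> band-limited analogue waveform -> panel ADC.
    # 'smear' stands in for limited analogue bandwidth: each sample leaks
    # a little of its neighbours into the voltage the panel ends up sampling.
    analogue = [p / 255.0 for p in pixels]                    # RAMDAC output
    sampled = []
    for i, v in enumerate(analogue):
        prev = analogue[i - 1] if i > 0 else v
        nxt = analogue[i + 1] if i < len(analogue) - 1 else v
        blurred = (1 - smear) * v + smear / 2 * (prev + nxt)  # cable smear
        sampled.append(round(blurred * 255))                  # panel ADC
    return sampled

def dvi_path(pixels):
    # Toy DVI path: the pixel codes go straight through.
    return list(pixels)

edge = [0, 0, 0, 0, 255, 255, 255, 255]    # one sharp vertical edge
print("original :", edge)
print("via VGA  :", vga_round_trip(edge))  # grey values appear around the edge
print("via DVI  :", dvi_path(edge))        # the edge stays perfectly sharp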
Quote:
I'm also fairly certain that, since flat panels are by their very nature digital devices, having to convert the incoming analogue signal to digital (after it's already been converted from digital to analogue by the RAMDAC in the video card) is the source of much of the signal degradation...

That's why it is much better to get a video card and flat panel with DVI connectors - then it stays digital the whole way...