DVI and TFT / Advantages?



10 Posts · Joined 2002-05-10
I bought an 18" Samsung TFT. I am thinking of a new video card as well. The monitor supports DVI. Now my question: is there an advantage of using DVI on both sides (video card and monitor)? Is the picture better or the monitor faster? Or does it change simply nothing? If not I don't have to buy a new video card with DVI.


Responses to this topic


748 Posts · Joined 2001-05-21
With an analogue (sub-D type connector), the red, green and blue channels are combined and transmitted as a composite analogue signal which then has to be split back by the monitor into the individual R, G & B signals. This conversion process causes a degradation in the signal; analogue signals are also subject to noise and interference, which causes further degradation.
 
With DVI, the monitor signal is digital, and is split into distinct red, blue and green channels. This means that there is no degradation of the signal quality, which ultimately means that the monitor has a clearer picture - in practice, less blurry, with a more distinct boundary between areas of different colour.
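To make that concrete, here is a toy Python sketch of the two paths. The noise level and pixel value are made-up assumptions for illustration, not measurements of any real cable:

import random

# Toy model: one 8-bit pixel value sent over a noisy analogue link
# vs. a digital (DVI-style) link. The noise figure is an invented assumption.

def analogue_path(pixel, noise_sigma=1.5):
    """Video card DAC -> noisy cable -> monitor samples the voltage back."""
    voltage = pixel / 255.0                          # DAC: 0..255 -> 0..1 "volts"
    voltage += random.gauss(0, noise_sigma / 255.0)  # cable noise / interference
    return max(0, min(255, round(voltage * 255)))    # monitor re-quantises

def digital_path(pixel):
    """DVI: the 8-bit value arrives bit-exact (ignoring rare link errors)."""
    return pixel

random.seed(1)
sent = 200
print("sent:    ", sent)
print("analogue:", [analogue_path(sent) for _ in range(5)])  # jitters around 200
print("digital: ", [digital_path(sent) for _ in range(5)])   # always exactly 200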
 
Most video cards produced in the last year or so come with DVI outputs, but it's safer to check before you buy.
 
Rgds
AndyF

723 Posts · Joined 2000-02-05
I have bought an HP 15" TFT as well, and I'm using the analogue VGA connector on my GF2MX. I would like to upgrade, though, and I was looking at the ATI Radeon 8500 (LE). The problem with Nvidia cards is that they support DVI only on their 128MB models (GF3 & GF4), which pushes them out of my price range. The Radeon has one of the best, if not the best, display quality in both 2D and 3D. I am seriously thinking of ditching the Nvidia train and catching the ATI bandwagon.
 
On the other hand, I will also ditch the AMD wagon and jump on the Intel train.

671 Posts · Joined 2000-05-04
Quote:
The problem with Nvidia cards is that they support DVI only on their 128MB models (GF3 & GF4), which pushes them out of my price range.
That's odd. My 64MB Hercules GF2U has a DVI port, and I'm fairly sure my Leadtek GF3 (again 64MB) has one, but I'm not 100% certain on that.

1207 Posts · Joined 2000-03-27
My GF2 Ultra (Gainward) certainly had a DVI output on it.
My GF3 Ti200 (Hercules) 64MB certainly doesn't; I think it's down to the individual cards rather than a blanket "no Nvidia 64MB cards have DVI output".

723 Posts · Joined 2000-02-05
Yeah, OK, I did more research... the 64MB GF3 range does not have DVI, and neither do the 64MB GF4 Tis. The 128MB models (both GF3 and GF4) do have DVI, and probably the GF2 Ultras as well.

83 Posts · Joined 2000-04-15
Not to be awkward, but my 64MB Creative GF3 Ti200 has a DVI port - and they're under £200 sterling now, I think.

1207 Posts · Joined 2000-03-27
According to Gainward's site, their GF3 Original and GF3 Ti500 both have DVI outputs too.

10 Posts · Joined 2002-05-10
OP
Thanks for the input. I think I will go for a GF4 with DVI, then. About ATI's Radeon: they have good hardware, but as far as I know their drivers are not the best and can cause some problems. For image quality I would have to buy a Matrox, but then I can forget about UT2003.

757 Posts · Joined 2000-10-14
Quote:
but as far as I know their drivers are not the best and can cause some problems

I have an ATI 8500 and use it for gaming and dual-monitor support. As far as I am concerned, ATI drivers are great, and they have been for about six months or so now. I was totally pro-Nvidia, but then, well... I worked at a PC repair shop, and I saw a LOT of Nvidia-based computers with graphics problems. I am so happy with my 8500, and at $274 CDN I am saving a lot over the $400+ for a GF3 or GF4 (non-MX).

1207 Posts · Joined 2000-03-27
Well Vasco, you could wait until Tuesday the 14th, when Matrox announce their new card.
If the rumours are true, it's going to be an amazing piece of kit: the usual Matrox image quality, and available to market soon after the announcement.

430 Posts · Joined 2001-04-09
My GF3 Ti200 64MB has DVI on it. I don't use it, but it's there.

10 Posts · Joined 2002-05-10
OP
You made me curious about the new Matrox card, Blade. I will wait and see what the magazines etc. say about it...

989 Posts · Joined 2001-08-14
Quote:
With an analogue (sub-D type connector), the red, green and blue channels are combined and transmitted as a composite analogue signal which then has to be split back by the monitor into the individual R, G & B signals. This conversion process causes a degradation in the signal; analogue signals are also subject to noise and interference, which causes further degradation.

With DVI, the monitor signal is digital, and is split into distinct red, blue and green channels. This means that there is no degradation of the signal quality, which ultimately means that the monitor has a clearer picture - in practice, less blurry, with a more distinct boundary between areas of different colour.

I'm fairly certain analogue VGA uses discrete red, green and blue signals, each on its own pin of the connector...

I'm also fairly certain that, since flat panels are by their very nature digital devices, having to convert the incoming analogue signal back to digital (after it has already been converted from digital to analogue by the RAMDAC in the video card) is the source of much of the signal degradation...
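As a minimal Python sketch of that double conversion (the gain and offset miscalibration figures are made-up assumptions standing in for real-world sampling error):

# Toy model of the digital -> analogue -> digital round trip that a
# DVI link avoids entirely. Calibration error figures are invented.

def round_trip(pixel, gain=0.98, offset=0.01):
    voltage = pixel / 255.0                # RAMDAC in the video card
    sampled = voltage * gain + offset      # panel's ADC, slightly miscalibrated
    return max(0, min(255, round(sampled * 255)))

errors = [abs(round_trip(p) - p) for p in range(256)]
print("max per-pixel error:", max(errors))                     # > 0: levels shift
print("values preserved exactly:", errors.count(0), "of 256")  # the rest drift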

1117 Posts · Joined 2000-01-23
Quote:
I'm also fairly certain that, since flat panels are by their very nature digital devices, having to convert the incoming analogue signal back to digital (after it has already been converted from digital to analogue by the RAMDAC in the video card) is the source of much of the signal degradation...

That's why it is much better to get a video card and flat panel with DVI connectors - then the signal stays digital the whole way...