DVI Output Nvidia vs ATI

This is a discussion about DVI Output Nvidia vs ATI in the Slack Space category.




723 Posts
Location -
Joined 2000-02-05
Hi ladies,
I just read something on Tom's Hardware that the Nvidia DVI implementation is a bit non-standard; it has something to do with bandwidth. That's why they use a second DVI chip or something... if I find the link again, I'll post it.
 
ATI, however, has the better DVI implementation, according to the article. I'm interested in all this because I will need to buy a DVI-capable card, and I'm torn between an ATI Radeon 8500LE and a GeForce4 Ti 4200. And YES, I checked: only the 128MB model has DVI out.
 
OK, I found the review; it was AnandTech, not Tom's Hardware:
 
http://www.anandtech.com/video/showdoc.html?i=1577
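The bandwidth issue mentioned above comes down to single-link DVI's 165 MHz TMDS pixel-clock ceiling; a second TMDS transmitter is how a card drives modes beyond it. A rough sketch of the arithmetic, where the 1.25 blanking factor is an assumption for illustration, not a real VESA timing:

```python
# Rough check of whether a display mode fits within single-link DVI's
# 165 MHz TMDS pixel-clock limit (the bandwidth constraint the article
# discusses). Blanking overhead is approximated; real timings come from
# the VESA GTF/CVT formulas.

SINGLE_LINK_MAX_HZ = 165_000_000  # single-link DVI pixel-clock ceiling

def pixel_clock(width, height, refresh_hz, blanking_factor=1.25):
    """Estimate the pixel clock a mode needs, including blanking overhead."""
    return width * height * refresh_hz * blanking_factor

for w, h, hz in [(1280, 1024, 60), (1600, 1200, 60), (1920, 1440, 60)]:
    clk = pixel_clock(w, h, hz)
    fits = clk <= SINGLE_LINK_MAX_HZ
    print(f"{w}x{h}@{hz}Hz needs ~{clk / 1e6:.0f} MHz -> "
          f"{'OK on single link' if fits else 'needs a second TMDS link'}")
```

By this estimate, 1600x1200@60Hz still fits on a single link, while 1920x1440@60Hz does not, which is roughly where the era's dual-TMDS designs came in.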

This subject has been archived. New comments and votes cannot be submitted.

Responses to this topic



989 Posts
Location -
Joined 2000-05-12
That talks about the GeForce 2 and 3, not the GF4.
 
There may be a difference... probably is, due to the dual-monitor ability.


723 Posts
Location -
Joined 2000-02-05
OP
"With the release of the GeForce2 GTS, NVIDIA had integrated a TMDS transmitter into their GPU that would remain there even to this day in the current Titanium line of cards"
 
Read the whole article, not just the title....


989 Posts
Location -
Joined 2000-05-12
I did
 
The thing was written in January.
 
 
That's talking about the GF3 and GF2, just like I said.
 
 
The GF4 has been said to have a MUCH better picture than the GF3... and the GF3 was much better than the GF2.
 
I can't back that up until Friday, most likely.
 
Finally put in the order for a GB of Corsair XMS3200... the RAM was all that was missing in my system.


3087 Posts
Location -
Joined 2001-01-21
It also depends on who makes the GF2/GF3 card. Hercules and Gainward make very good cards in the image-quality department. MSI is middle of the road. Gainward ownz j00!


723 Posts
Location -
Joined 2000-02-05
OP
The improvements in 2D quality with the GF4 are noticeable at resolutions of 1280x1024 and above... that's when things start getting blurry. How fast is the GF4's RAMDAC? 350MHz?
 
And I'm talking about the DVI implementation on NVIDIA cards, not the 2D quality of the desktop at high resolutions, or TV-out quality.
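For what the RAMDAC number is worth on the analog side: the clock bounds the refresh rate a card can drive at a given resolution. A back-of-the-envelope sketch using the 350 MHz figure asked about above, where the 1.32 blanking factor is an assumed GTF-style overhead, not a measured value for any specific card:

```python
# Estimate the highest refresh rate a RAMDAC can sustain at a resolution.
# The 1.32 blanking factor is an assumption (VESA GTF-style overhead),
# not a figure for any particular card.

def max_refresh(ramdac_hz, width, height, blanking_factor=1.32):
    """Approximate highest refresh rate (Hz) the RAMDAC can drive."""
    return ramdac_hz / (width * height * blanking_factor)

ramdac = 350_000_000  # the 350 MHz figure discussed in the thread
for w, h in [(1280, 1024), (1600, 1200), (2048, 1536)]:
    print(f"{w}x{h}: ~{max_refresh(ramdac, w, h):.0f} Hz max")
```

This only bounds the analog (VGA) output, of course; as the post notes, DVI quality is a separate question governed by the TMDS transmitter, not the RAMDAC.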