DVI Output Nvidia vs ATI
Hi ladies,
I just read something on Tom's Hardware saying that the Nvidia DVI implementation is a bit non-standard; it has something to do with bandwidth. That's why they use a second DVI chip or something... if I find the link again, I'll post it.
According to the article, ATI has the better DVI implementation, however. I'm interested in all this because I will need to buy a DVI-capable card, and I'm torn between an ATI Radeon 8500LE and a GeForce4 Ti 4200. And YES, I checked... only the 128MB model has DVI out.
OK, I found the review; it was Anandtech, not Tom's Hardware:
http://www.anandtech.com/video/showdoc.html?i=1577
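
For context on the "bandwidth" part: single-link DVI caps the TMDS pixel clock at 165 MHz, and the knock on the integrated transmitters is that they get marginal well below that ceiling. Here's a rough Python sketch of the arithmetic (the ~30% blanking overhead is an assumed ballpark; real VESA timings vary per mode, e.g. 1600x1200@60 is actually defined at 162 MHz):

# Rough check of whether a mode fits under the single-link DVI
# limit of a 165 MHz TMDS pixel clock (per the DVI 1.0 spec).
DVI_SINGLE_LINK_MHZ = 165.0
BLANKING_OVERHEAD = 1.30  # assumed total-vs-active pixel ratio

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock in MHz, including blanking."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1280, 1024, 60), (1600, 1200, 60), (1920, 1440, 60)]:
    clock = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clock <= DVI_SINGLE_LINK_MHZ else "needs dual link"
    print(f"{w}x{h}@{hz}Hz -> ~{clock:.0f} MHz ({verdict})")

The point being: anything near 1600x1200 is already pushing the single-link limit, so a transmitter that rolls off early will look soft exactly where you'd want DVI most.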
Responses to this topic
That talks about the GeForce 2 and 3... not the GF4.
there may be a difference...probably is, due to the dual monitor ability.
OP
"With the release of the GeForce2 GTS, NVIDIA had integrated a TMDS transmitter into their GPU that would remain there even to this day in the current Titanium line of cards"
Read the whole article, not just the title....
I did
The thing was written in January.
That's talking about the GF3 and GF2, just like I said.
The GF4 has been said to have a MUCH better picture than the GF3... and the GF3 was much better than the GF2.
I can't back that up until Friday, most likely.
Finally put in the order for a GB of Corsair XMS3200... the RAM was all that was missing in my system.
It also depends on who makes the GF2/3 card. Hercules and Gainward make very good cards in the image quality department. MSI is middle of the road. Gainward ownz j00!
OP
The improvements in 2D quality with the GF4 show up at resolutions of 1280x1024 and above... that's where things start getting blurry. How fast is the GF4 RAMDAC? 350 MHz?
And I'm talking about the DVI implementation on NVIDIA cards, not the 2D quality of the desktop at high resolutions, or TV-out quality.
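
Worth separating the two limits, by the way: the RAMDAC only bounds the analog VGA output, while DVI goes through the TMDS transmitter. Taking the 350 MHz figure at face value, the same back-of-envelope arithmetic (same assumed ~30% blanking overhead) gives the analog refresh ceiling per resolution:

# Max refresh the analog output can drive at a given resolution,
# assuming a 350 MHz RAMDAC (the figure quoted above) and ~30%
# blanking overhead; exact VESA timings vary per mode.
RAMDAC_MHZ = 350.0
BLANKING_OVERHEAD = 1.30

def max_refresh_hz(width, height, dac_mhz=RAMDAC_MHZ):
    return dac_mhz * 1e6 / (width * height * BLANKING_OVERHEAD)

for w, h in [(1280, 1024), (1600, 1200), (2048, 1536)]:
    print(f"{w}x{h}: up to ~{max_refresh_hz(w, h):.0f} Hz analog")

So the analog side has headroom the single DVI link doesn't, which is why the DVI implementation is the thing to scrutinize.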