GeForce 4 MX 4000 - slowness issues
Dell Dimension 2400, Win2k, 2.6 GHz, 256 MB RAM.
EVGA GeForce 4 MX 4000, PCI, 64 MB.
I've installed a GeForce 4 MX 4000 (I know it sucks, but I'm not trying to run anything newer than Halo). After having some fun with drivers, the current state is as follows:
The integrated video card is disabled, and in BIOS -> Integrated Devices, Auto is selected for the graphics card (the two options are Integrated and Auto).
The integrated video card's drivers have been uninstalled.
The GeForce drivers are installed (ForceWare version 77.77).
The GeForce card is, as far as I can tell, active and working.
I have seen significant speed increases in Diablo II and Counter-Strike. Everything else runs as slowly as it did with the integrated video card, or more slowly. In Halo, my frame rate is 10-20 fps on the lowest graphics settings, about 1/4 of what it is on a similar computer with a GeForce 2 MX 400. Since I have heard from two separate sources that the GeForce 4 MX 4000 is better than the GeForce 2 MX (series), I have reason to believe that this is a problem.
Anti-aliasing is off. Anisotropic filtering is off. Image settings are high performance. Vertical sync is off.
EDIT: There's what looks like a two-prong power plug on the card, but the installation directions don't say to plug anything in, so I didn't.
An older version of the ForceWare drivers (56.72) allowed me to view "advanced options" in the Nvidia settings control panel. Under those advanced options, there was one labeled "hardware acceleration". It was marked Off in the right column, and there was no way to turn it on. Whether or not this is normal, I do not know. The newer drivers don't allow one to view advanced options.
If anyone has any bloody idea what - or if - the problem is, I would surely appreciate the help.
Responses to this topic
First off, PCI cards are not meant for gaming, there's no way around that.
Is it better than a GF2 MX200? Probably. An MX400, AGP, 64mb/128bit, will smoke the MX4000 though.
Your MX4000 is PCI, and that's the first problem: the PCI bus is slower than the AGP bus. Next, your card has 64mb of memory on a 32-bit memory bus. For today's games, a minimum of 128mb/128bit is required. That 32-bit figure is the width of the card's memory bus, so per clock it moves a quarter of the data a 128-bit card does. That's very slow.
For example, I bought a BFG FX5500 OC, not knowing it was only a 64bit card. It was to replace my 5200. The 5200 smoked it, as it was a 128bit card. That's how much of a difference there is.
Top of the line MX4000 AGP is something like 128mb/64bit, and I do believe an MX400 64mb/128bit would give it a run for its money. An MX440 AGP, 128/128, would smoke it.
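To put rough numbers on that, here's a minimal sketch of the usual theoretical-peak bandwidth arithmetic (bus width in bytes x memory clock x transfers per clock). The clock speeds in it are assumed round-number examples, not measured specs for these particular cards:

    #include <stdio.h>

    // Theoretical peak memory bandwidth in MB/s:
    //   (bus width in bits / 8) * memory clock in MHz * transfers per clock
    // The clocks below are assumed, illustrative figures only.
    static double bandwidth_mb_s(int bus_width_bits, double mem_clock_mhz, int transfers_per_clock) {
        return (bus_width_bits / 8.0) * mem_clock_mhz * transfers_per_clock;
    }

    int main(void) {
        printf("32-bit bus  @ 166 MHz SDR: %.0f MB/s\n", bandwidth_mb_s(32, 166.0, 1));
        printf("128-bit bus @ 166 MHz SDR: %.0f MB/s\n", bandwidth_mb_s(128, 166.0, 1));
        printf("128-bit bus @ 166 MHz DDR: %.0f MB/s\n", bandwidth_mb_s(128, 166.0, 2));
        return 0;
    }

Whatever the exact clocks are, the 4x difference in bus width carries straight through to the bandwidth, which is why the narrow cards fall over as soon as a game pushes a lot of pixels.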
I feel for you; you were given some questionable advice. It happens to everyone somewhere along the line, and the hard lessons are the ones we remember.
The 2-pin connector is for a fan, used on models with "active cooling". You could add a fan to the heatsink, but it doesn't need one; it won't get hot.
I have more to say about the drivers, back in a bit, must drive my 14 year old to work now.
Vidcard manufacturers tend to "tweak" drivers for their cards, and that doesn't always work for your system. Get the nVidia drivers; I know they work better for me. The 77.77s can be had here: http://downloads.guru3d.com/download.php?det=1145
To get a few more options in your settings, download NVTweak: http://downloads.guru3d.com/download.php?det=911
When you swap out drivers, you should make sure you clean them out totally, so after uninstalling the drivers, run Driver Cleaner Pro in safe mode: http://majorgeeks.com/Driver_Cleaner_Professional_d3214.html
Trying them driver things (56K modem....)
I have found that in OpenGL mode, everything works perfectly.
Counter-Strike and UT (the original) get 40+ fps at 1024x768 with the graphics settings all the way up. Programs that use Direct3D seem to be the ones that experience problems.
To test this, I ran Diablo II in Direct3D mode. Instead of the 75 fps it got previously, it averaged 20 fps.
Therefore, it seems likely that Direct3D is the root of the problem. To test this, I will downgrade to DirectX 8.1, which is supposedly the latest version that the MX 4000 supports, and see what happens.
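For what it's worth, besides dxdiag there's a way to double-check which DirectX runtime is actually installed: as far as I know, the runtime setup writes a version string to the registry (e.g. 4.09.00.0904 for 9.0c). Here's a minimal sketch of reading it; treat the exact string it returns as informational only:

    #include <windows.h>
    #include <stdio.h>

    int main(void) {
        HKEY key;
        // The DirectX runtime records its version under this key.
        if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, "SOFTWARE\\Microsoft\\DirectX",
                          0, KEY_READ, &key) != ERROR_SUCCESS) {
            printf("Could not open the DirectX registry key.\n");
            return 1;
        }
        char version[64] = "";
        DWORD size = sizeof(version);
        DWORD type = REG_SZ;
        if (RegQueryValueExA(key, "Version", NULL, &type,
                             (LPBYTE)version, &size) == ERROR_SUCCESS) {
            // e.g. "4.09.00.0904" corresponds to DirectX 9.0c
            printf("DirectX version string: %s\n", version);
        } else {
            printf("No Version value found.\n");
        }
        RegCloseKey(key);
        return 0;
    }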
I believe that the MX 4000 supports pixel shader version 1.4 (available in DirectX 8.1) but not version 2.0 (DX 9.0, I think) or 3.0 (DX 9.0c, I think). The box says it has "optimised DX 8.1 and OpenGL 1.3 acceleration"....
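If anyone wants to check that directly instead of trusting the box, a tiny Direct3D 9 program can ask the driver what shader versions it actually reports. This is just a sketch of the standard caps query, nothing specific to this card, and it needs the DirectX SDK headers and d3d9.lib to build:

    #include <d3d9.h>
    #include <stdio.h>
    // Link against d3d9.lib

    int main() {
        // Create the Direct3D 9 object and query the default adapter's HAL capabilities.
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) {
            printf("Direct3D 9 runtime is not available.\n");
            return 1;
        }
        D3DCAPS9 caps;
        if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
            // The driver reports the highest shader models it supports (0.0 means none).
            printf("Vertex shader: %lu.%lu\n",
                   D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
                   D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion));
            printf("Pixel shader:  %lu.%lu\n",
                   D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                   D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
        }
        d3d->Release();
        return 0;
    }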
Transparencies seem to be a significant component of the problem. When much of the screen is taken up by a transparent effect, such as the Diablo 2 automap in D3D mode or a flaring plasma weapon in Halo, the frame rate drops by about three fifths.
Update: Downgrading to DX 8.1 (I used DX Eradicator to remove 9.0c) had no effect. Prior to the downgrade, dxdiag indicated no problems with DX. I'll test drivers tomorrow....
[Edited by Asc on 2006-01-02 06:20:16]
Did Eradicator actually work? Have you checked with DX diagnostic to make sure that you're now actually using DX 8.1? If it did work, bonus; if not, try "Happy Uninstaller". Some of these DX9 uninstallers don't work, and I forget which one I used.
If the problems are only in D3D, try DX Tweaker; maybe you can tweak your D3D settings so you get it working right, or at least better anyhow. http://majorgeeks.com/DirectX_Tweaker_d4575.html
Just an opinion here, but I don't think DX9 is your problem; if anything, your 4000 should do okay with DX9. I had (still have, in my desk drawer) a Ti4200, and it did better in DX9 games than my FX5200, which supports DX9. The 5200 was too slow for gaming, but it tried to use all the available eye-candy in DX9. My Ti4200, on the other hand, did not support DX9, so all that extra eye-candy could not be used. Thus, my Ti4200 smoked the 5200 in the same DX9 games. The games still looked very good, better than what the 5200 could run at, as the 5200 had to have the eye-candy turned off to play a game.
Hopefully you can get some decent frame rates playing with the D3D settings.
It did uninstall DX correctly - I checked under the Nvidia control panel several times during the process.
Interestingly, DX Tweaker doesn't seem to be working. I re-upgraded to 9.0c and installed the .NET framework (and restarted...), but after following the directions (activating the configuration, checking the modules, starting from DX Tweaker, and so forth) none of the modules seem to work. For example, I've set "," to display the pixel shader type and "." to display wireframe, but pressing them in-game doesn't do anything.