3DFX + GeForce = GeForce FX ?? Hmmm
So, how many of you out there noticed the new (old) tag Nvidia is sporting? Looks like a hybrid of the last 3DFX project with GeForce technology. I can't wait to get this card. Opinions, please...
Responses to this topic
Ok, i saw this over at megagames:
Quote:As far as speculation about the name is concerned, nVidia chose the FX as a tribute to 3DFX, whose former engineers played a large part in the production of the new card, but they will include a 5 in the model number. The 5 will indicate that the card is the fifth in the GF range. The first card to be released, although not officially announced, will be called the GeForce FX 5800 and will cost a mere USD 399, same as the GF 4 TI4600 did upon release. If you think that's high-end then consider the revved-up version, the GeForce FX 5800 Ultra, which will set its lucky owner back USD 499.
nVidia with a Hint of 3DFX!
The GFFX requires an external source of power, doesn't it? It's not hard to see where the 3Dfx influence comes in, is it?
Seriously though, with a USD $399 minimum price tag (and a two-year lead before the technology becomes even remotely useful) they can forget it. Anyone who buys one of those at that price is a retard.
Honestly I never liked ATI cards other than for TV tuner/capture. Back in the day it was 3DFX, Nvidia and Matrox. It seems as though ATI is always plagued with some sort of problem or incompatibility with hardware/software etc. It's not good when you have the fastest GPU (currently) and the slowest driver/compatibility time tables. I was a die-hard 3DFX fan, and with this newer GeForce coming out, I couldn't even ponder switching yet again.
Nvidia should just say f*ck it and try to purchase ATI.
Quote:Honestly I never liked ATI cards other than for TV tuner/capture. Back in the day it was 3DFX, Nvidia and Matrox. It seems as though ATI is always plagued with some sort of problem or incompatibility with hardware/software etc. It's not good when you have the fastest GPU (currently) and the slowest driver/compatibility time tables. I was a die-hard 3DFX fan, and with this newer GeForce coming out, I couldn't even ponder switching yet again.
Nvidia should just say f*ck it and try to purchase ATI.
This coming from an AMD/Via fan?
Quote:This coming from an AMD/Via fan?
I got no problem with AMD...
But I had NO IDEA there was such a thing as a VIA fan... :x
I am going to wait for the 6-month refresh of the GeForce FX (NV35?); the refresh product is worth waiting an additional 6 months for.
Hopefully with a 256-bit mem bus and 256 MB of memory. I wonder how Nvidia will handle that with their crossbar memory controller... 32 bit x 8 or 64 bit x 4? I wanna see 256-bit memory with true DDR2 performance (4 reads per clock, not 2 like the GeForce FX uses)... 256-bit, 256 MB @ 2.0 GHz (effective). Memory bandwidth problems would be a NON-ISSUE for some time to come.
i am too lazy to do the math, but i know that equals god d@mn fast.
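Rough back-of-the-envelope math, for the curious. This sketch assumes the widely reported 128-bit bus on the announced GeForce FX 5800; the 256-bit / 2.0 GHz figures are just the wish-list from the post above:

```python
# Peak theoretical memory bandwidth: (bus width in bits / 8) bytes per transfer,
# multiplied by the effective (DDR) clock rate.
def peak_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

# GeForce FX 5800 Ultra as announced: 128-bit bus, 1 GHz effective DDR2
print(peak_bandwidth_gb_s(128, 1000))   # 16.0 GB/s

# The hoped-for NV35 refresh: 256-bit bus @ 2.0 GHz effective
print(peak_bandwidth_gb_s(256, 2000))   # 64.0 GB/s -- four times the announced card
```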
Don't get me wrong... geforce fx is very nice... but for $500+ i am gonna wait a little longer.
i wanna play doom 3 at 1600 x 1200, all bells and whistles on with 60+ fps.
i think (hope!) nv35 will do this.
What he said.
Quote:Does that mean I'm a retard for buying a 9700 Pro?
yes.
j/k
It's nice to have the fastest thing on the block, but I have a really hard time justifying much over $200 for any single component.
GFFX sounds cool, but the cooling requirements seem to be getting a little nuts. I haven't really read up on it, but I swear I've seen something about it needing a cooling solution like Abit's OTES cooler. If that's the case, that's gonna kinda suck for people that need every last available expansion slot.
I hope the cooling market gets some better solutions out there, as we're really gonna need it.
Quote:GFFX sounds cool, but the cooling requirements seem to be getting a little nuts. I haven't really read up on it, but I swear I've seen something about it needing a cooling solution like Abit's OTES cooler. If that's the case, that's gonna kinda suck for people that need every last available expansion slot.
Ya, it's definitely beefy.
Click for more pics and AnandTech's GeForce FX preview
Oh, and on the topic of 3dfx, here is the avatar I use on other sites. Too bad this one doesn't have them. (It's an animated GIF; you may have to wait a few seconds for the 3dfx bit.)
Good god! That is huge! I usually don't have enough PCI cards to necessitate using every PCI slot---and PCI 1 is usually left open...but still, the loss of a PCI slot isn't terribly attractive. I hope nVidia will do something to deal with the heat better.
Generally, it's best to leave PCI1 unpopulated anyway, as the IRQ can be shared with the AGP slot.
Quote:Good god! That is huge! I usually don't have enough PCI cards to necessitate using every PCI slot---and PCI 1 is usually left open...but still, the loss of a PCI slot isn't terribly attractive. I hope nVidia will do something to deal with the heat better.
I'm waiting to see if they justify the ATX case and motherboard form factors!!!!
I love ATI's All-in-Wonder series. Recently ATI has been good with drivers and support. Besides, I could call them anytime during the business day without paying a long-distance charge!!!
I have problems with AMD. Somehow any AMD Athlon older than a year needs a bigger heatsink and fan than what it already has. In my experience they just overheat by themselves. I'm happy with my P4 for now.
Quote:This cooling does have an advantage:
The GPU won't put as much heat into the case. Overclockers, rejoice!
The memory clock is 1000 MHz and the GPU clock is 500 MHz.
Question: why don't they just use a Pentium 3 CPU instead? It is going to be the same!!! Maybe even cheaper!!!
why would they do something as dumb as that? ;(
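For anyone actually wondering why a 500 MHz GPU isn't "the same" as a 1 GHz CPU: the GPU fills many pixels per clock in parallel, while a CPU rendering in software handles them essentially one at a time. A rough sketch, assuming nVidia's advertised 8 pixels per clock for the NV30 and a purely hypothetical 20 CPU cycles per pixel for a software renderer:

```python
# Peak fill rate = core clock * pixels processed in parallel per clock.
gpu_clock_hz = 500e6        # GeForce FX core clock
pixels_per_clock = 8        # nVidia's advertised NV30 figure (assumption)
gpu_fill_rate = gpu_clock_hz * pixels_per_clock      # 4.0e9 pixels/s

cpu_clock_hz = 1.0e9        # a ~1 GHz Pentium III
cycles_per_pixel = 20       # hypothetical software-renderer cost, illustration only
cpu_fill_rate = cpu_clock_hz / cycles_per_pixel      # 5.0e7 pixels/s

print(gpu_fill_rate / cpu_fill_rate)   # ~80x -- the parallelism, not the clock, is the point
```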