3DFX + GeForce = GeForce FX ?? Hmmm




645 Posts
Location -
Joined 2000-09-16
So, how many of you out there noticed the new (old) tag Nvidia is sporting? Looks like the last 3DFX project, a hybrid with GeForce technology. I can't wait to get this card.
 
Opinions Please....


Responses to this topic



1915 Posts
Location -
Joined 2000-03-30
I'll buy one only if it comes in a 256MB version


238 Posts
Location -
Joined 2002-04-25
OK, I saw this over at MegaGames:

Quote:As far as speculation about the name is concerned, nVidia chose the FX as a tribute to 3DFX, whose former engineers played a large part in the production of the new card, but they will include a 5 in the model number. The 5 will indicate that the card is the fifth in the GF range. The first card to be released, although not officially announced, will be called the GeForce FX 5800 and will cost a mere USD 399, same as the GF4 Ti 4600 did upon release. If you think that's high-end then consider the revved-up version, the GeForce FX 5800 Ultra, which will set its lucky owner back USD 499.

 
nVidia with a Hint of 3DFX!


989 Posts
Location -
Joined 2001-08-14
The GFFX requires an external source of power, doesn't it? It's not hard to see where the 3Dfx influence comes in, is it?
 
Seriously though, with a USD 399 minimum price tag (and a two-year lead before the technology becomes even remotely useful) they can forget it. Anyone who buys one of those at that price is a retard.


1015 Posts
Location -
Joined 2001-06-29
Does that mean I'm a retard for buying a 9700 Pro?


645 Posts
Location -
Joined 2000-09-16
OP
Honestly I never liked ATI cards other than for TV tuner/capture. Back in the day it was 3DFX, Nvidia and Matrox. It seems as though ATI is always plagued with some sort of problem or incompatibility with hardware/software etc. It's not good when you have the fastest GPU (currently) and the slowest driver/compatibility timetables. I was a die-hard 3DFX fan, and with this newer GeForce coming out, I couldn't even ponder switching yet again.
 
Nvidia should just say f*ck it and try to purchase ATI.


1915 Posts
Location -
Joined 2000-03-30
Quote:Honestly I never liked ATI cards other than for TV tuner/capture. Back in the day it was 3DFX, Nvidia and Matrox. It seems as though ATI is always plagued with some sort of problem or incompatibility with hardware/software etc. It's not good when you have the fastest GPU (currently) and the slowest driver/compatibility timetables. I was a die-hard 3DFX fan, and with this newer GeForce coming out, I couldn't even ponder switching yet again.

Nvidia should just say f*ck it and try to purchase ATI.

This coming from an AMD/Via fan?


163 Posts
Location -
Joined 2000-02-26
Quote:This coming from an AMD/Via fan?

I got no problem with AMD...

But I had NO IDEA there was such a thing as a VIA fan... :x

I am going to wait for the six-month refresh of the GeForce FX (NV35?); the refresh product is worth waiting an additional six months for.
Hopefully with a 256-bit memory bus and 256 MB of memory. I wonder how Nvidia will handle that with their crossbar memory controller... 32-bit x 8 or 64-bit x 4? I wanna see 256-bit memory with true DDR2 performance (4 reads per clock, not 2 like the GeForce FX uses)... 256-bit, 256 MB @ 2.0 GHz (effective). Memory bandwidth problems would be a NON-ISSUE for some time to come.

I am too lazy to do the math, but I know that equals god d@mn fast.
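
For what it's worth, a rough back-of-the-envelope sketch of that math in Python. The 128-bit bus width for the GeForce FX as announced comes from contemporary previews rather than from this thread, so treat it as an assumption; the other numbers are the ones above.

# Theoretical peak memory bandwidth: bus width (in bytes) times effective clock.
def bandwidth_gb_s(bus_bits, effective_mhz):
    # Peak bandwidth in GB/s, counting 1 GB as 10**9 bytes.
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# GeForce FX 5800 as announced: 128-bit bus (assumed from previews), 1000 MHz effective DDR2.
print(bandwidth_gb_s(128, 1000))   # 16.0 GB/s

# The hoped-for 256-bit refresh @ 2.0 GHz effective.
print(bandwidth_gb_s(256, 2000))   # 64.0 GB/s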
Don't get me wrong... the GeForce FX is very nice... but for $500+ I am gonna wait a little longer.

I wanna play Doom 3 at 1600x1200, all bells and whistles on, with 60+ fps.

I think (hope!) NV35 will do this.


163 Posts
Location -
Joined 2000-02-26
Actually, I am kinda hoping Nvidia purchases AMD...


645 Posts
Location -
Joined 2000-09-16
OP
No way man, Macs rule all the way!!!! :x


3087 Posts
Location -
Joined 2001-01-21
Quote:Does that mean I'm a retard for buying a 9700 Pro?

yes.


j/k

It's nice to have the fastest thing on the block, but I have a really hard time justifying much over $200 for any single component.

GFFX sounds cool, but the cooling requirements seem to be getting a little nuts. I haven't really read up on it, but I swear I've seen something about it needing a cooling solution like Abit's OTES cooler. If that's the case, that's gonna kinda suck for people that need every last available expansion slot.
I hope the cooling market gets some better solutions out there, as we're really gonna need it.


484 Posts
Location -
Joined 2001-10-23
Quote:GFFX sounds cool, but the cooling requirements seem to be getting a little nuts. I haven't really read up on it, but I swear I've seen something about it needing a cooling solution like Abit's OTES cooler. If that's the case, that's gonna kinda suck for people that need every last available expansion slot.
Ya, it's definitely beefy.

Click for more pics and AnandTech's GeForce FX preview


Oh, and on the topic of 3dfx, here is the avatar I use on other sites. Too bad this one doesn't have 'em. (It's an animated GIF, you may have to wait a few seconds for the 3dfx bit.)



3087 Posts
Location -
Joined 2001-01-21
Good god! That is huge! I usually don't have enough PCI cards to necessitate using every PCI slot, and PCI 1 is usually left open... but still, the loss of a PCI slot isn't terribly attractive. I hope nVidia will do something to deal with the heat better.


2172 Posts
Location -
Joined 2002-08-26
Generally, it's best to leave PCI1 unpopulated anyway, as the IRQ can be shared with the AGP slot.


484 Posts
Location -
Joined 2001-10-23
This cooling does have an advantage:
 
The GPU won't put as much heat into the case. Overclockers, rejoice!


316 Posts
Location -
Joined 2001-07-27
Quote:Good god! That is huge! I usually don't have enough PCI cards to necessitate using every PCI slot, and PCI 1 is usually left open... but still, the loss of a PCI slot isn't terribly attractive. I hope nVidia will do something to deal with the heat better.


I'm waiting to see if they justify the ATX case and motherboard form factors!!!!

I love the ATI All-in-Wonder series. Recently ATI has been good with drivers and support. Besides, I can call them anytime during the business day without paying a long-distance charge!!!


I have problems with AMD. Somehow any AMD Athlon older than a year needs a bigger heatsink and fan than what it already has. In my experience they just overheat on their own. I'm happy with my P4 for now.


316 Posts
Location -
Joined 2001-07-27
Quote:This cooling does have an advantage:

The GPU won't put as much heat into the case. Overclockers, rejoice!

The memory clock is 1000 MHz and the GPU clock is 500 MHz.

Question: why don't they just use a Pentium 3 CPU instead? It is going to be the same!!! Maybe even cheaper!!!
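
For context, a rough sketch of what those clocks actually have to cover, using the 1600x1200 @ 60 fps target mentioned earlier in the thread. The per-pixel cost is a loose assumption for illustration, not a measured figure.

# How many pixels per second does 1600x1200 @ 60 fps need, and how many cycles
# per pixel would a 500 MHz scalar CPU have to spend to keep up?
WIDTH, HEIGHT, FPS = 1600, 1200, 60
CPU_CLOCK_HZ = 500e6  # Pentium 3-class clock, same number as the GPU core clock

pixels_per_second = WIDTH * HEIGHT * FPS              # 115,200,000
cycles_per_pixel = CPU_CLOCK_HZ / pixels_per_second   # ~4.3 cycles

print(f"{pixels_per_second / 1e6:.1f} Mpixels/s required")
print(f"{cycles_per_pixel:.1f} CPU cycles available per pixel")

# Texturing, filtering, lighting and z-testing a pixel costs far more than ~4
# scalar cycles, and that ignores overdraw. A GPU reaches the target by running
# many pixel pipelines in parallel at that clock, which a general-purpose CPU
# at the same MHz cannot do.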


34 Posts
Location -
Joined 2002-04-25
Quote:Quote:This cooling does have an advantage:

The GPU won't put as much heat into the case. Overclockers, rejoice!

The memory clock is 1000 MHz and the GPU clock is 500 MHz.

Question: why don't they just use a Pentium 3 CPU instead? It is going to be the same!!! Maybe even cheaper!!!

why would they do something as dumb as that? ;(


316 Posts
Location -
Joined 2001-07-27
Quote:

why would they do something as dumb as that? ;(

Because it is going to end up the same way anyway. They want faster GPUs just as they needed faster CPUs.
Why not just modify the product that is already there instead of building a totally new one!!! Why not? That would be a revolution!