Buying a new video card...
Hey all
Well, I am finally going all out on a video card. Since I'm down in Antigua you can't get anything decent here, and I'll be in Toronto this Saturday for a week, so I'm grabbing a card up there. I'm sick of this TNT2; it can't even play Spider-Man!
It was between the GeForce4 Ti 4600 and the ATI Radeon 8500 128MB, but I just found out that ATI now has a 128MB All-in-Wonder out! (Yes, I will pay the price, and I am not going down to a lower GeForce or ATI card.)
(Canadian prices)
Asus V8460 Ultra GeForce4 Ti 4600 w/ 128MB DDR & TV-Out, AGP retail box: $624.99
ALL-IN-WONDER RADEON 8500 128MB AGP, Eng/Fr: $629. I just found this one, and it makes my life harder!
I have read mixed reviews, and apparently ATI is actually better in a few ways (that is the ATI Radeon 8500 128MB, not the All-in-Wonder, but they use the same GPU).
I have not read any reviews of the new 128MB All-in-Wonder yet, but I had a 32MB All-in-Wonder before and loved it: the tuner, the remote control, and the rest.
Some specs:
GeForce4 Ti 4600, NVIDIA reference driver v28.32, 300/650 MHz (core/effective memory), 128-bit DDR
High speed 128MB DDR video memory with 10.4 GB/sec bandwidth
Uses "Immersion Gold Coating" for enhanced reliability and circuit connections
Special "Copper Fan Sink" cooling solution for superior overclocking and performance
DVI output for easy connectivity to digital flat-panel displays
Supports multiple displays
High quality TV output for big screen gaming or presentations
ALL-IN-WONDER RADEON 8500 128MB AGP
http://www.ati.com/products/pc/aiwradeon8500/images/broAIW8500.pdf
PIXEL TAPESTRY II technology increases memory bandwidth up to 11 GB/sec
Radio frequency remote control
plus the usual All-in-Wonder features and more.
Is 1 GB/sec more a huge difference over the GeForce?
On Tom's Hardware's VGA charts the GF4 Ti 4600 blows away the ATI 8500 128MB (not the All-in-Wonder) in frame rates, but in Jedi Knight II they are almost the same, because ATI has better DirectX 8.1 integration and is already working toward DirectX 9.
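For what it's worth, those bandwidth numbers are easy to sanity-check: peak theoretical bandwidth is just bus width times effective memory clock. A quick sketch (assumptions: the Ti 4600's 650 MHz is the DDR-effective clock on a 128-bit bus; ATI's "up to 11 GB/sec" is likely an effective figure that counts bandwidth-saving tricks like HyperZ II rather than raw bus bandwidth):

```python
# Peak theoretical memory bandwidth = (bus width in bytes) x (effective transfers per second).
def bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    """Return peak memory bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return (bus_bits / 8) * (effective_mhz * 1e6) / 1e9

# GeForce4 Ti 4600: 128-bit DDR bus, 325 MHz memory clock = 650 MHz effective.
print(bandwidth_gb_s(128, 650))  # 10.4, matching the quoted 10.4 GB/sec
```

So the raw gap between the two cards is well under 1 GB/sec, and in practice drivers and the game's API usage matter more than that last half gigabyte.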
Now, I can't find any really detailed specs on the All-in-Wonder. I plan on playing a fair amount of games, as well as doing 3D graphics and movie watching/editing, so I am definitely leaning toward the All-in-Wonder, if I can get it.
Anyway, what do you think? Can you provide more details: technical specs, numbers, etc.?
Thanks for the input!
Responses to this topic
Matrox is a terrible company. They make dumb cards.
1) They do not support their cards!
All the people who forked out $500 a few months ago will be happy to know that Matrox will not give them drivers for Windows XP: not today, not tomorrow, not ever.
This is how they operate. This is their style.
It will be the same for the next operating system, which is going to be a 64-bit OS from MS. Do some research...
So if you actually knew how they operate, and were one of the customers they screwed over without a flinch, you'd be foolish to buy anything from them again.
Good luck.
2) Their drivers are so incredibly bad it is not even funny. Those cards are a pain to use.
3) The image clarity is hardly special! I know what I am talking about; don't argue with me! Buy a good monitor instead. If you think you can notice the image difference between a Matrox Millennium G400 or G450 and an original TNT1 from a decent company such as Elsa, you are kidding yourself. Bottom line: let's not exaggerate here. Get back to reality.
4) Matrox cards are always overpriced, meaning performance/price is always one of the worst on the market, at least in the past. It's all hype. When the G400 came out, a lot of people who thought they knew something about computers thought it was the new "path", the "light", or whatever.
I said from the start it was nonsense. That card is nothing special, and NVidia is still the best and will continue to be the best by far. Look now: was I right? Damn straight I was. And you know what? Nothing has changed.
Matrox are good marketers and put on a good parade. They pay off the right people to make buzz, but they always disappoint, and their products become substandard within a few months.
If you really want dual monitors, get ATI. NVidia does not have real dual-monitor support; what they have is not the real thing.
Right now, there is nothing to rival the Radeon and Radeon DV. This is not only due to raw performance, but also price and features. The NVidia cards are way overpriced for no performance benefit worth mentioning, and offer hardly any extra features. ATI is full-featured:
- the best dual-monitor support in the industry
- the best TV out
- the best DVD hardware
- the best video capture on a graphics card, at pretty much studio quality
I mean capture at DVD resolution at 30 fps. Put that in perspective! That is power: real power, something you can hold in your hand, not some extra 3 fps in Quake III.
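To put that capture claim in rough numbers (a back-of-the-envelope sketch only; it assumes uncompressed NTSC DVD resolution at 2 bytes per pixel for YUV 4:2:2, while the All-in-Wonder actually compresses in hardware, so treat this as an upper bound on the raw stream):

```python
# Uncompressed data rate for video capture at a given resolution and frame rate.
def capture_rate_mb_s(width: int, height: int, fps: int, bytes_per_px: float) -> float:
    """Return raw capture data rate in MB/s (1 MB = 10^6 bytes)."""
    return width * height * bytes_per_px * fps / 1e6

# NTSC DVD frame: 720x480 at 30 fps, YUV 4:2:2 packing (2 bytes/pixel).
print(capture_rate_mb_s(720, 480, 30, 2))  # 20.736, i.e. ~20.7 MB/s before compression
```

That is a serious amount of data to move and encode in real time on 2002-era hardware, which is the point being made.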
It will be years before NVidia comes up with something like this in cards that are triple the price. But hey, NVidia targets kids. ATI is the really cool thing right now, and the people who actually know what they're talking about are switching to ATI for the first time, because at this point in time they really are the best, and it takes brains and research to recognize these things.
To the people who bought ATI in the past I say: what were you thinking? Will they be good buys in the future? Probably not, but who cares; we are living in the present.
I would be surprised if ATI can hold the winning streak for long.
Ciao
There are some discussions on these boards where I really enjoy getting involved.
Sometimes I'm all for the original post, other times against it.
I'm strongly anti software-theft and enjoy those threads too.
This thread has become one I'm enjoying, because Matrox and their products are something I respect and know to be good.
For that reason I feel more than compelled to reply to the "trash" I've just read.
1) They do not support their cards!
This is total rubbish; of course they support their cards.
Are you talking about the lack of WinXP driver support for the original Marvel range of cards?
WinXP was not even thought about when that card was released; it is fully supported by the OSes that were available at the time. Future compatibility is a risk you always take when buying hardware.
Not one single item inside your PC is guaranteed to work with any future OS.
ATI had the same problem: do you remember the Fury MAXX, and the fact that it was and still is unsupported under Win2k?
Matrox first released Win2k drivers some six months before the official Win2k release; those drivers were updated regularly, and by the time Win2k shipped Matrox already had a mature driver.
That is what I call support, not the way most other companies responded with "We do not support beta operating systems."
It will be the same for the next operating system, which is going to be a 64-bit OS from MS. Do some research...
So, we appear to have a seer on the boards. Anything else you'd like to tell us about the future? Six numbers between 1 and 49 for this Saturday would be cool.
I've been an MS beta tester since Win95, so I'm pretty clued up on future MS products, but if you know something I don't, please tell.
So if you actually knew how they operate, and were one of the customers they screwed over without a flinch, you'd be foolish to buy anything from them again.
I've owned three Matrox cards at home (Millennium, Mystique 220 and G400 MAX), so yes, I am a Matrox customer.
I've bought at least 15 PCs over the past year, all with Matrox G550 cards in them, so even more so I am a Matrox customer.
Are you insulting me now?
I feel being a Matrox customer is no worse than being an NVidia or ATI drone.
2) Their drivers are so incredibly bad it is not even funny. Those cards are a pain to use.
As I said above, driver support from Matrox is excellent. They were the first to have Win2k drivers available for their range, and a lot of companies copied them by releasing early WinXP drivers.
The cards are plug in and away you go, no more difficult than any other brand of graphics card out there; the statement above is nonsense.
3) The image clarity is hardly special! I know what I am talking about; don't argue with me...
I will argue with you, because once again you are talking complete and utter nonsense.
The image quality on a Matrox card cannot be touched; a three-year-old G400 still offers better image quality than a one-month-old GF4.
Sure, NVidia have been trying to sort out their image-quality issues for ages, and sure, there was an improvement with the GF3; the GF4 is marginally better still, but it cannot touch a Matrox.
ATI is the closest you will get, and they still cannot compare.
If you have a beat-up old car with a dodgy engine, do you think getting the dents knocked out and a respray will make the car go any better? No!
A good monitor does not make the image quality; it's the filters on the card itself. Read around: nobody with even half a brain cell will compare NVidia image quality with that of a Matrox.
Do not even start comparing original TNT1 cards with a Matrox; you need your eyes tested, seriously.
4) Matrox cards are always overpriced...
And you feel that the latest, greatest, top-of-the-line GF4s are well priced, then?
Every new card, every new technology, costs a lot at the beginning.
At the time of the G400's release it was the best card available: the best image quality, brilliant 3D speed, and the first card ever to support hardware EMBM. Of course you will pay for something that new and advanced.
When the new card is released it too will be expensive, because it is going to contain features that nothing else on the market currently has.
Sure, the G400 lost its crown as the best card available. News flash: every GF card you buy loses its crown every six months, when NVidia release their next chipset.
If you really want dual monitors, get ATI. NVidia does not have real dual-monitor support; what they have is not the real thing.
No, if you want dual-monitor support you will buy Matrox.
Matrox again led the industry by being the first to bring a dual-port card to the masses.
Because they were a year ahead of everybody else, their technology is superior to everyone else's.
All other companies will always be 12 months behind Matrox; ATI still are.
Matrox are good marketers and put on a good parade. They pay off the right people to make buzz, but they always disappoint, and their products become substandard within a few months.
What rubbish you talk; every single Matrox card has more than delivered on its promise.
The Mystique was untouchable when it was first released.
The G400 and G400 MAX again were leaders in their field; they offered features you could not get on any contemporary chipset from the likes of NVidia, ATI, S3, etc.
Of course their marketing is good, because their teams have a lot of positive points to push.
ATI is the really cool thing right now, and the people who actually know what they're talking about are switching to ATI for the first time, because at this point in time they really are the best, and it takes brains and research to recognize these things.
I don't know which "people who know what they are talking about with respect to computers" you are talking to, but it's certainly not the professionals.
ATI has been plagued with extremely bad driver support in the past, and this has not changed with their current line of cards; some of the driver releases have offered lacklustre performance at best.
People in the know are using either Matrox, for workstations that do not require massive amounts of 3D power but demand excellent image quality and stable drivers, or NVidia, for workstations where performance is paramount. Currently there is no middle ground between image quality and performance: where performance rises, quality drops. This will change with the G1000.
The bottom line here is that no Matrox product has ever failed.
They have all delivered exactly what they promised, and at the time of their release they were at the top of their field.
They are not big enough to keep up with the six-month product cycle that the likes of NVidia and ATI use; then again, releasing a new product every six months does nothing but screw the customer.
Please do not get me started on image quality.
I have done the tests; as a systems administrator, part of my job is making sure we have the best combinations of hardware.
I have seen cards from ATI, NVidia and Matrox running on identical PCs, each attached in turn to the same make and model of monitor.
There is absolutely no comparison between them: Matrox come out very much on top, ATI follow behind, and at the bottom of the pile is NVidia.
I find the whole of your Matrox bashing amusing and unfounded.
I cannot find one statement you have made that is true with respect to Matrox products.
When the new card from Matrox is released, if it turns out to be technically inferior to other cards currently on the market, yet at double the price, then I shall come back and apologise.
As that is not going to happen, I just pity somebody who is so hell-bent on brand loyalty that they won't even look at what the competitors offer.
Man, *cough*, the rumble is on!
First, the driver situation:
Point a)
I have the Matrox G400 Marvel. For those who don't know (which is most people reading this thread), this Matrox card cost twice as much as the one BladeRunner has.
You tell me they have good driver support? Are you nuts? No offense.
Even for the G450 and later cards, Matrox has officially said they will never be supported on Windows XP. If you spent $600 on a video card anywhere from four months to a year and a half ago, and they tell you that if you want to use WinXP you can't use your card, that is hardly good driver support. I would like to hear your arguments explaining how this is a responsible policy and how it shows good driver support.
This is as bad as it gets!
Point b)
A big reason I do not like Matrox is that their driver interface in Windows is designed very thoughtlessly. It is messy and hard to use, a real pain. ATI has the best design by far, and it is by far the easiest to use for dual-monitor support; you can also control very easily what you see on each monitor.
On the Matrox card it is so incredibly complicated and troublesome to use that nobody uses the feature.
Regarding image quality:
I am absolutely shocked that you said getting a good monitor has nothing to do with image clarity. Absolutely shocked.
Image quality is overwhelmingly a result of the monitor in question. You can have the greatest video card ever made, but if you use a sub-par monitor it is all for nothing. I recommend investing at least $1000 in a decent monitor, and THEN get back to me. And I do not mean an LCD, but a real monitor: a quality CRT.
Once you do that, you can run a little test, much as I have. I had an Elsa TNT1 16MB, and I can tell you there was absolutely no image-quality difference between that card and the G400 Marvel that cost five times as much (actually almost six times as much).
That extra $400 could have bought an $800 monitor instead of a $400 monitor, and you would have improved your image quality tenfold if not more. That is the real issue. If you do not understand this point, I suggest you not make buying recommendations to others, as you are clueless. I would like to hear your comments on this point in detail, so we can all see how what I said above is not true.
(In fact you are effectively claiming that a $279 monitor coupled with a G400 or G550 gives superior image quality compared to an Elsa TNT1 paired with a super-clear, high-resolution, high-quality monitor for, say, $800. This is what your position implies, but some people, namely "m4carbine", fail to understand it; instead they just look at your post count and say yep, yep, yep to everything you say.)
Please address this issue and clarify your statement.
The bottom line is that the image-quality difference is so small that to the naked eye it is no issue at all. It is all about the monitor quality.
Regarding pricing:
If you had read my post carefully, you would know that I also agreed the Ti 4600 is overpriced, so do not use that against me.
Please list, specifically, the features the Matrox cards have, and the price.
After you do that, I will list the features the ATI Radeon has, and the price.
This will settle our argument on this issue quite nicely. There is not much to debate here; obviously I am right. But if you do not give in, I challenge you to list those features. We can settle this issue very quickly without any misunderstandings.
Regarding the best company:
I never said ATI is a great company; I do not believe they are. I also do not like Matrox. NVidia is in fact the best company, no question about it. If you buy a card today, you can be assured they will support you with new drivers for the next six years, no matter what.
They are still coming out with TNT1 driver updates almost monthly. Fantastic support, just great. They are a model for the industry to emulate. Hopefully other companies will offer such support in the future, rather than taking your money (at a huge premium, at that) and then telling you to get lost a year later, as Matrox did with their customers.
Finally:
I would like to apologize to those I may have insulted in my previous post by saying you were dumb to buy ATI cards in the past. I meant it as a joke. Fortunately BladeRunner seems to have a good sense of humor, but others may have taken it seriously.
It was uncalled for; I didn't need to be so rude, so I am sorry.
I do think it was a mistake to buy an ATI card in the past, as they never made a great card for its time considering the alternatives.
For certain people, however, Matrox was the only real choice back then, so that was not a mistake: if you wanted quality video-in and similar features, Matrox was basically it.
The point is that we should not live in the past. ATI has outdone themselves this time around, and that is all I am saying. They are really good at this point in time: by far the best deal, packed with performance and features that can't be matched, and nothing comes even close at its price point.
````````````````````` First, the driver situation:
point a)
I have the Matrox G400 MARVEL. For those who don't know (probably most people reading this thread), this Matrox card cost twice as much as the one BladeRunner has.
You tell me that they have good driver support? Are you nuts? No offense.
Even for the 450 and so on, Matrox has officially said that those cards will never be supported by Windows XP. If you spent $600 on a video card anywhere from four months to a year and a half ago and they tell you "sorry, if you want to use Win XP you can't use your card", that is HARDLY good driver support. I would like to hear your arguments explaining how this is a responsible policy and how this shows good driver support...
This is as bad as it gets!!
point b)
A big reason I do not like Matrox is that they designed the driver interface in Windows very thoughtlessly. It is messy and hard to use, a real pain in the arse. ATI has the best design by far and is by FAR the easiest to use for dual monitor support. You can also very easily control what you see on each monitor.
On the Matrox card it is so incredibly complicated and troublesome to use and control that no one uses the feature.
``````````````````````````````` Regarding image quality
I am absolutely shocked that you said getting a good monitor has nothing to do with image clarity. I am ab-so-f'n-lutely shocked.
Image quality is 99.9999999% the result of the monitor in question. You can have a Superman 9811 video card, but if you use a sub-par monitor it is all for nothing. I recommend investing at least $1000 in a decent monitor and THEN getting back to me. And I do not mean an LCD, but a real monitor... a quality CRT.
Once you do that, you can do a little test, very much as I have. I had an Elsa TNT1 16MB, and I've got to tell you that there was absolutely no image quality difference between that card and the G400 Marvel, which cost 5x as much, actually almost 6x as much.
That extra $400 could have been used to buy an $800 monitor instead of a $400 monitor, and you would have increased your image quality tenfold if not more. That is the real issue. If you do not understand this point, I suggest you not make buying recommendations to others, as you are clueless. I would like to hear your comments regarding this point in detail so we can all see how what I said above is not true.
(In fact, as you have stated above, a $279 monitor coupled with a G400 or G500 gives superior image quality compared to an "ELSA" TNT1 and a super-clear, high-res, fine-dot-pitch, high-quality monitor for, let's say, $800.) This in fact is what you claim, but some people, namely "m4carbine", fail to understand. Instead they just look at your POST COUNT and say yep, yep, yep to everything you say.
Please address this issue and clarify your statement.
The bottom line is that the image qual diff is so small that to the naked eye it is totally of no issue at all. It is all about the monitor quality.
..............................regarding pricing
If you have read my post carefully u would know that i also agreed that the 4600 is overpriced so do not use this against me.
Please list the features that the Matrox cards have SPECIFICALLY and the PRICE
Then after u do that I will list the features the ATI radeon has and the PRICE
THis will settle quite nicelly our arguments regarding this issue... There is not much to debate here. Obviously i am right. But if u do not give in, please i challenge u to list those features. We can settle this issue very fast without any missunderstandings...
................................regarding best company
I never said ATI is a great company. I do not belive so. I also do not like Matrox. Nvidia in fact is the best company no question about it. If you buy card today u can be asssured they will suport you for the next 6 years no matter wha with new drivers.
They r still coming out with TNT1 driver updates almost monthly. Fantastic support. Just great. They r a model to emulate for the industry. Hopefully other co's will have such support in future. Not just give me your money (huge premium price at that) and then tell you to ****-off a year later as Matrox did with their customers.
............................................finally
I would like to appoligize to those that I may have insulted in previous post buy saying u were dumb to buy ati cards in past. I meant it as a joke. Fortunately BladeRunner seems to have a good sense of humor but others may have taken it seriously.
I was uncalled for, i didn't need to be so rude so i am sorry.
I think it was a mistake to buy an ATI card in the past as they have never made a great card for its time considering alternatives.
matrox however for certain people in the past was the only choice. So it was not a mistake. if u wanted qality video in and such feauter Matrox was the only choice basically.
The point is we should not live in th past. ATI has out done themselves this time arround and that is all i am saying. they r really good in this point in time today. By fa the best deal, packed with crazy perf/features that can't be matched and nothing comes even close at its price point.
Right, first back to the driver support.
I shall start by not talking about the Marvel, I shall move onto that a little bit later.
Let's first post a link to the page containing Matrox's latest driver releases:
http://www.matrox.com/mga/support/drivers/latest/home.cfm
Let's take a look under the WinXP column and see which of Matrox's cards have had specific WinXP drivers released for them.
G550, G450 eTV, G450, G400, G400-TV, G400MAX, G200 MMS, G200 with TV Tuner, G200, Mystique G200, MGA G200, Productiva G100 Range, Millennium II, Mystique 220, Mystique & Millennium.
We are talking about a lot of old cards here having specific WinXP support, some of these cards are ancient, the Mystique for example was originally a 2MB PCI card.
Don't you think that the above link kind of puts to shame the following statement:
"For even the 450 and so on Matrox has officially said that those cards will never be supported by Windows XP"?
Now, the Marvel G400 was a different animal: Matrox admitted really early on that it was unlikely they would be able to sort out full driver support or video tools for this card.
Why?
I really do not know, I am not a driver writer and I don't work for Matrox but there has to be a reason behind it.
Companies do not just decide to discontinue support of a device for no reason unless the card is coming towards the end of its life, something the Marvel G400 isn't doing.
Maybe that question would be best aimed at Matrox directly.
However, if I remember correctly Matrox did offer a very generous rebate on the purchase of a Marvel G450 for Marvel G400 owners, I cannot remember the full amount, but it wasn't too bad at all.
A big reason I do not like Matrox is that they design the driver interface in Windows very thoughtlessly
It still amazes me when somebody clicks "No" when closing a Word document asked "Do you wish to save changes?" and then moans because they did want to save them.
It still amazes me when people find their PCs riddled with viruses because they love to open e-mails from unknown sources.
The point I'm trying to get at here is that there really is no accounting for how people will cope with things on a PC.
I personally have never spoken to a Matrox owner who has found the dual-head software difficult and awkward to use.
Sure, initially it can look confusing, as there are just so many more options available to a Matrox owner than to an ATI or NVidia one.
Maybe Matrox should release a "Simple version" driver release for their dual-head cards for those who find the current ones a little bit difficult.
I am absolutely shocked that you said getting a good monitor has nothing to do with image clarity...........
I totally stand by this statement.
In fact, it has totally shocked me to think that anybody feels the monitor is what makes all the difference with respect to image quality.
Actually, the more I think about this the more my head starts to hurt. The monitor making all the difference?
If so, why don't all these professional workstations have 8MB ATI cards costing about £25 each paired with really expensive monitors?
Because the thought that the monitor makes 99.9% of the difference is just crazy!!
Now, you seem to be questioning whether I have tested this, which is fair enough, your call; we live in a world where proof is always required.
At work I'm more or less exclusively using Iiyama monitors.
The Iiyama units offer great overall build quality, their support is great and touch wood, I've hardly seen any problem units.
These range from 17" basic units, through the Pro models, then on to 19" in both standard and Pro, finishing off with 22" units.
The cards that I've used to drive these monitors have ranged from onboard Intel, through cheap ATI's, then Matrox G400's, G450's & G550's.
Also NVidia GF2's, GF3's, one that originally had a GF4 and CAD stations with Quadro Pro's.
I was first alerted to how poor the NVidia image quality is when some users were upgraded to new PCs.
They had P2 systems with G400s in them and were upgraded to P3s with GF2 cards.
I got a few support calls about how bland the screen was looking: slight flicker in the corners of the display, fuzzy lines and the like.
I went to see the displays and, to be honest, couldn't see any major problems.
To please the users we set things up in the lab: we put two PCs next to each other, one with a G400 and the other with a GF2, each connected to an identical 22" Iiyama monitor.
Once the two were next to each other there was no comparison, it was like night and day, like black and white.
The Matrox based system was giving a clear, sharp image from corner to corner.
All graphics were sharp, nothing was blurred, no fuzzy lines, as near perfect as could be.
A quick swap of the two monitors showed that neither monitor was faulty at all, it was purely the GF2's lack of decent filters.
The issue was further shown with the purchase of a whole batch of PC's containing GF3 cards.
The non-Matrox users who were upgraded were happy; the Matrox users demanded that I take a look at things myself.
End result, we sent all the GF3's back and had them replaced with G550's - users are now happy to look at their screens all day.
I am not in any way saying that the monitor itself does not make a difference, saying that is as crazy as saying that the monitor makes up 99.9% of the image quality.
However, what I am saying is that in any monitor and graphics card combination, the card accounts for at least 50% of the overall quality.
Just do a search on your favourite search engine for "Image Quality" or "2D Image Quality" and things like that, you will get thousands of pages listed that will all tell you that Matrox are still unbeatable with respect to these issues.
The truth is, as you increase resolution on non-Matrox cards the image quality becomes noticeably worse; this is not the case on Matrox cards.
If you want to use 800x600 or even 1024x768 as the basis for this argument, then true, the differences are not as noticeable, but once we hit 1280x1024 and above, a Matrox, even one many years old, is untouchable.
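One reason higher resolutions separate the cards: the RAMDAC and its analog output filters have to pass a much higher pixel clock. A rough back-of-the-envelope sketch (the ~32% blanking overhead is a typical CRT rule of thumb, not an exact spec for any of these cards):

```python
def pixel_clock_mhz(h, v, refresh_hz, blanking_overhead=1.32):
    """Approximate pixel clock (MHz) needed to drive h x v at refresh_hz.

    The overhead factor accounts for horizontal/vertical blanking time;
    ~1.32 is a common CRT rule of thumb, not a measured value.
    """
    return h * v * refresh_hz * blanking_overhead / 1e6

print(round(pixel_clock_mhz(1024, 768, 85)))    # roughly 88 MHz
print(round(pixel_clock_mhz(1280, 1024, 85)))   # roughly 147 MHz
print(round(pixel_clock_mhz(1600, 1200, 85)))   # roughly 215 MHz
```

The higher the clock, the more any weakness in the analog output stage shows up, which fits the differences being most visible at 1280x1024 and above.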
Please list the features that the Matrox cards have SPECIFICALLY and the PRICE
Not sure what to tell you here. The G1000/Parhelia is not going to be announced until next week, and as Matrox like to keep things secret until the actual announcement, I can give you neither specifications nor price.
Taking the G550: we have 32MB of DDR RAM, a 360 MHz RAMDAC... and one of these cards would set you back about £89.
However, the point here is that £89 for the world's best image quality output is what I and a lot of computer people are willing to pay.
It would appear that the bottom line here is that you are a pissed off Marvel G400 owner.
That is the whole basis for you thinking Matrox are a bunch of wankers: you bought a card that was quite capable of running under all currently available OSes yet wasn't able to run under a then-unannounced one.
At least the G400 Marvel can still be used as a basic graphics card under WinXP, something the ATI Fury MAXX users were not able to do under Win2k.
You have obviously not done image quality tests, either professionally or with the naked eye; if you had, you would be agreeing with 99% of all the professional testing sites and tech web sites, which will all tell you that the Matrox image quality is still unbeatable.
OK BladeRunner,
Good post, good comments and a nice comeback.
-------------------------------------------------------------------
Now, the Marvel G400 was a different animal: Matrox admitted really early on that it was unlikely they would be able to sort out full driver support or video tools for this card.
Why?
I really do not know, I am not a driver writer and I don't work for Matrox but there has to be a reason behind it.
Companies do not just decide to discontinue support of a device for no reason unless the card is coming towards the end of it's life, something the Marvel G400 isn't doing.
Maybe that question would be best aimed at Matrox directly.
However, if I remember correctly Matrox did offer a very generous rebate on the purchase of a Marvel G450 for Marvel G400 owners, I cannot remember the full amount, but it wasn't too bad at all.
---------------------------------------------------------------------
This is what I am talking about^^... The video capture and all that jazz, the "Marvel" part of the G400 Marvel, does not work.
^Not true either: there is a workaround, you can buy 3rd-party programs... So I am not really pissed off, just not impressed at all. They could easily fix it; it's a software thing, not hardware.
Next.------------------------Price-------------------------------
PART A)
My card
http://www.matrox.com/mga/products/pricing/home.cfm
Please note that the G400 Marvel is still their top product and also their most expensive one to this day.
PART B)
http://www.matrox.com/mga/products/marv_g450_etv/home.cfm
-Matrox G450-eTV chip
-32 MB(DDR)
-AGP 4x
-"High-quality" DVD and video playback
-TV tuner with Personal Video Recorder
-capture video at 320x240 VCD resolution ONLY
-S-video and composite video input
-Timeshifting with picture-in-picture
-DX7
-UltraSharp 360 MHz RAMDAC
- and
-DVI output
-TV output
-for games (roughly equal to a geforce1MX)
PRICE===============================>>230
ATI RADEON 8500 DV
http://www.ati.com/products/pc/aiwradeon8500dv/index.html
-It has everything that the Matrox has above and then some
-Stereo TV tuner
-Remote control!!!
-FireWire ports
-Time shifting
-Integrated interactive program guide
-64 to 128MB DDR
-HYDRAVISION (dual head)
-"Highest quality" DVD playback in the "industry"!!
-Capture digital video at 720x480 @ 30fps (DVD quality)!!!!!!!!!!!!!!!
Awesome 3D gaming and graphics
Powered by the revolutionary RADEON 8500 GPU and 64MB DDR memory for the most advanced 3D graphics in its class
ATI's innovative TRUFORM, SMARTSHADER and SMOOTHVISION technologies make 3D characters and objects more realistic
ATI's HYPER Z II technology conserves memory bandwidth for improved performance in demanding applications
ATI's latest 3D rendering technologies, CHARISMA ENGINE II and PIXEL TAPESTRY II, power incredible 3D processing capabilities leading to unbelievable graphics quality
(in short, roughly equal to a GeForce3 Ti 500)
and more!!!
PRICE================================>>>$180
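To put the 320x240 vs. 720x480 capture gap in perspective, here is a hypothetical calculation of the uncompressed data rate each mode implies (assuming 4:2:2 YUV at 2 bytes per pixel; real captures are usually compressed, so these are upper bounds):

```python
def capture_rate_mb_s(width, height, fps, bytes_per_pixel=2):
    """Uncompressed capture data rate in MB/s (4:2:2 YUV, 2 bytes/pixel, assumed)."""
    return width * height * fps * bytes_per_pixel / 1e6

vcd = capture_rate_mb_s(320, 240, 30)   # 320x240 VCD-res capture: ~4.6 MB/s
dvd = capture_rate_mb_s(720, 480, 30)   # 720x480 DVD-res capture: ~20.7 MB/s
print(round(vcd, 1), round(dvd, 1), round(dvd / vcd, 1))
```

Roughly 4.5x the raw data per second, which is why DVD-resolution capture was the headline feature here.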
hmmm.
I don't want to argue; I think we did a good job. People should be able to make a pretty good decision here based on our little battle. I guess Matrox cards could have a better image, but to be perfectly honest I personally can't see any difference. Some people are more sensitive than others for these things, I guess.
Quote:Originally posted by plato
Powered by the revolutionary RADEON 8500 GPU and 64MB DDR memory for the most advanced 3D graphics in its class
ATI's innovative TRUFORM, SMARTSHADER and SMOOTHVISION technologies make 3D characters and objects more realistic
ATI's HYPER Z II technology conserves memory bandwidth for improved performance in demanding applications
ATI's latest 3D rendering technologies, CHARISMA ENGINE II and PIXEL TAPESTRY II, power incredible 3D processing capabilities leading to unbelievable graphics quality
More marketing babble than you can read in an evening. Incredible that people fall for it.
Any PC parts manufacturer's pages are stuffed with superlatives and SUPER MEGAs that do not really mean a thing. Independent testing is what you should follow when selecting parts.
As Dilbert said: "Marketing - two drinks minimum"
H.
Jeez... some excitement.
I don't know how you can possibly say that you cannot see the difference in 2D quality on a Matrox card vs. almost any other card on the market.
If there wasn't any difference in quality, then why have so many magazines/websites/forums spoken so highly of Matrox cards?
Quote: matrox are good marketers and put on a good bullshit parade. they pay off the right people to make buzz, but they always disappoint and their products become substandard within a few months
OK... so let's assume Matrox is paying off some companies that own websites/magazines and whatnot. Heh... ain't no way that this is gonna fly. If I'm gonna go out and buy a card expecting to see visual quality in 2D apps, I sure as hell am gonna want to see that quality. And if my old ATI card is visually better, then I'm obviously gonna post this problem in a respectable forum like this one. Since I got interested in computers (when I was 13) I have known about Matrox cards being VERY impressive from the get-go.
Now... as far as you saying that the monitor makes 99.9% of visual quality: you're fooling yourself, man. When I upgraded from an ATI Radeon 64MB DDR ViVo to an ATI Radeon 8500 64MB card, the improvement in visual quality was apparent to me. The monitor's whites looked like they'd been washed in Tide, and text was way sharper. Now... don't get me wrong, a monitor does play a large role in quality (the tube mostly). BUT, and don't argue with me because I know a lot about filters, filtering has a lot to do with the configuration of electronic parts; different companies have different ideas. As far as filter quality goes, it comes down to the math and to matching the components in a filter (depending on what TYPE of filter you are configuring). If you want a filter to cut out certain frequencies, there is a math aspect to this as well as part quality.
As far as I am concerned, Matrox knows how to design great filters and implement them properly. Something as trivial as stray capacitance or a minor phase difference can have a huge effect on something like detail on a CRT, especially with all the DAC and ADC conversion going on.
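As an illustration of that point, the -3 dB cutoff of even a simple first-order RC low-pass shows how little stray capacitance it takes to start eating into video-frequency detail (the 75-ohm and 10 pF values below are illustrative, not measured from any actual card):

```python
import math

def rc_cutoff_hz(r_ohms, c_farads):
    """-3 dB cutoff frequency of a first-order RC low-pass: 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# A 75-ohm video line with 10 pF of stray capacitance rolls off around 212 MHz,
# uncomfortably close to the pixel clocks a high-resolution CRT mode needs.
f_c = rc_cutoff_hz(75.0, 10e-12)
print(round(f_c / 1e6))   # roughly 212 MHz
```

So sloppy layout or component matching in the output stage really can blur fine detail at high resolutions, which is consistent with what the Matrox-vs-GF2 comparison above showed.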
I have had the pleasure of working on a few workstations with Matrox cards in them, and they have performed admirably. I wasn't playing Q3 Arena at work, but I was using applications like Photoshop and the odd bit of AutoCAD. Even TextPad looked defined and sharp. Now, I was running this box on an old monitor and it looked all right; granted, anything looks better on a Sony 21", but I would buy a Matrox card if I had the money to build a real 'workstation' (for me, this constitutes no games and the like, just powerful 2D content-creation applications).
I dont know how you can possibly say that you cannot see the difference in 2d quality on a matrox card vs almost any other card on the market.
If there wasnt any difference in quality, then why have so many magazines/websites/forums spoke so highly of matrox cards?
Quote:matrox are good marketers and put on a good bulshit parade. they pay off the right people to make buzz, but they always disapoint and their products become substandard within a few months
OK....so lets assume matrox is paying of some companies that own websites/magazines and what not. Heh...aint no way that this is gonna fly. If im gonna go out and buy a card expecting to see visual quality in 2d appl - I sure as hell am gonna want to see the quality. And if my old ati card is visually better, then I'm obviously gonna post this problem in a respectible forum like this one. Since I got interested in computers (when I was 13) I have known about matrox cards being VERY impressive cards from the get go.
Well... the new Matrox card is out... looks very appealing.
http://www.matrox.com/mga/products/parhelia512/home.cfm
Check out the new Creative card too.
Anyways
I ended up getting the GeForce4 Ti 4600 Ultra; I could not find the ATI 128MB All-In-Wonder ANYwhere!! and I only had a week to get it.
The card is good so far; I will know for sure once I get it into my good computer back home.
As for ATI, I have used them all the time (this is my first GeForce card), and I have never used Matrox, same as my friend. He swears by ATI; we have never had any driver problems, they always worked.
ATI All-In-Wonder 128 Pro 32MB (4x AGP)
ATI All-In-Wonder 128 32MB (2x AGP)
Rage Fury MAXX 64MB (first 64MB card out)
TV Wonder
Radeon 7000 64MB x 2
Radeon 8500 64MB
Radeon 8500 DV (64MB)
ATI TV Tuner
My friend tends to buy new cards when they come out; he already wants to grab the 128MB All-In-Wonder.
When I first got XP, my ATI All-In-Wonder Rage 128's TV tuner was not supported, but it is now, so all is good.
Anyways
I am more looking forward to Creative's new card, which no one has mentioned yet.
Supposed to have something like 20 GB/sec memory bandwidth.
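For scale, peak memory bandwidth is just bus width times effective transfer rate, which is why the quoted figures fall out the way they do. A quick sketch (the 256-bit/312.5 MHz combination for a ~20 GB/s part is my guess at one configuration that hits the number, not a confirmed spec):

```python
def memory_bandwidth_gb_s(bus_width_bits: int, clock_mhz: float, ddr: bool = True) -> float:
    """Peak bandwidth in GB/s: bytes per transfer * effective transfer rate."""
    bytes_per_transfer = bus_width_bits // 8
    effective_mhz = clock_mhz * (2 if ddr else 1)  # DDR moves data on both edges
    return bytes_per_transfer * effective_mhz * 1e6 / 1e9

# GeForce4 Ti 4600: 128-bit DDR at 325 MHz (650 MHz effective)
print(memory_bandwidth_gb_s(128, 325))              # → 10.4
# One way to reach ~20 GB/s: a 256-bit DDR bus at ~312.5 MHz
print(round(memory_bandwidth_gb_s(256, 312.5), 1))  # → 20.0
```

So roughly doubling the Ti 4600's figure means either a much faster clock or, more plausibly, a wider bus.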
3Dlabs took the unusual step of pulling forward the announcement of its next generation graphics architecture, the P10 visual processing unit (VPU), a product we didn't expect to be talking about for another two weeks. While this architecture is going to find its way initially into the Oxygen line of workstation graphics cards from 3Dlabs, it also heralds some of what we can hope to see coming from Creative Labs this Christmas.
I have to say that there wasn't as much information to back up the P10 launch as you would normally expect with a major chip launch, and this is an announcement that is two years overdue from 3Dlabs, but like I said the announcement was extremely hurried. Nevertheless, it's interesting to see the direction 3Dlabs is taking and reflect on the influence of Longhorn on the P10's specs. The next killer app for 3D may be the operating system, which I think is a significant development. It also means that we can predict what to expect in next gen products from other graphics chip vendors, and do geek gossip over coffee.
Therefore, I surmise, 3Dlabs has done us all a big service rushing out their announcement. By making the P10 public the company is giving us a glimpse into the issues that are going to drive 3D graphics hardware architectures in the coming months for almost all the graphics industry, and it's damn good stuff.
Some of the key points of damn goodness came up at WinHEC 2002 this year and P10's feature list reflects the directions Microsoft was giving hardware developers at their conference. Things like,
Full programmability - While Nvidia and ATi have mature programmable graphics products on the market, it's worth noting that they still retain some level of support for the old fixed function pipeline with some form of integrated T&L circuitry. The next step for the graphics chip industry is to move to a fully programmable pipeline and to remove those pesky transistors for fixed function graphics. Graphics is going to need all the silicon real estate it can muster, but every chip will use those extra transistors differently.
Multi-tasking Graphics - Microsoft's next-generation operating system, Longhorn, is pushing the industry to create graphics processors that will offload almost all of the typical functions of managing windowed displays. This means that every window on your desktop becomes a 3D texture, whether it is running a game, a digital video, or an Office application. The CPU has to handle all of Longhorn's open apps, videos, and games running in multiple windows, and Microsoft is working on determining how much graphics hardware it should ask for as a minimum to keep its OS humming. The graphics processor becomes a true partner processor for the CPU, but the question is, how low will Microsoft keep the bar on graphics performance and features? Will Microsoft open up the PC and graphics markets by demanding a significantly higher level of 3D graphics performance for base level Longhorn systems than what we are seeing today, or will it try and hedge its bets by staying a generation or two behind the curve?
Bye bye VGA - We have to say bye bye to VGA, and the sooner the better. VGA is the last of the big legacy items remaining on the PC. It makes ISA look nimble and hip. With no VGA, graphics processors get to ditch the lowest common denominator.
Just in case you are unfamiliar with the nuances of the programmable 3D graphics pipeline, I suggest you give Tom's excellent review of the GeForce3's technologies a look:
High-Tech And Vertex Juggling - NVIDIA's New GeForce3 GPU
http://www.tomshardware.com/graphic/01q1/010227/index.html
The above article is a great place to get a good grounding on where the programmable 3D graphics pipe got its big start in the mainstream. And Tom does a good job of explaining terminology and how pixels flow through the pipeline. I could have cut and pasted the stuff, but I believe that's illegal.
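The "full programmability" point in the article boils down to this: fixed-function hardware bakes in one transform-and-light routine, while a programmable pipeline runs whatever per-vertex program the application supplies. A toy illustration (not any vendor's actual API; the names here are made up):

```python
def mat_vec(m, v):
    """Multiply a 4x4 matrix by a 4-component vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def fixed_function_tnl(vertex, mvp):
    # Fixed function: the one and only behavior is "transform by the matrix".
    return mat_vec(mvp, vertex)

def programmable_pipeline(vertices, vertex_shader):
    # Programmable: the hardware just runs the supplied program per vertex.
    return [vertex_shader(v) for v in vertices]

# Example "vertex shader" written by the application, not wired into silicon:
wobble = lambda v: [2 * v[0], v[1], v[2], v[3]]
print(programmable_pipeline([[1, 2, 3, 1]], wobble))  # → [[2, 2, 3, 1]]
```

Dropping the fixed-function path frees up the transistors that `fixed_function_tnl` represents; the trade-off is that even trivial transforms now go through the general-purpose shader units.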
Quote: Supposed to have something like 20 GB/sec memory bandwidth
Same with the new matrox card.
I've read the now official specifications on the new Matrox card and all I can say is wow!
No wonder 3Dlabs was forced to bring the P10 announcement forward, because at the current time that is the only card likely to rival the Matrox.
Matrox will have a card on the market every bit as good as the P10 will be, but a good few months ahead; there is a lot to be said for three-year product cycles over the six-month ones NVidia works on.
Matrox promised, and the specifications would appear to show they delivered too.
Apparently the new Matrox cards will be available en masse from June, and that is when they will be getting a huge wad of cash from me.
I cannot see myself buying the very top card; 256MB of RAM would be impressive but not exactly needed, so I shall be looking at the 128MB version.
A card with those specifications will be good for a couple of years while the rest of the industry plays catch-up.
Triple-head display *Drool*. Seriously worth considering some cheap 17" monitors for that kind of setup.
Quote: Triple-head display *Drool*. Seriously worth considering some cheap 17" monitors for that kind of setup.
I am with you on that one... dual is nice... but three... well, that's getting a little bit serious.
A couple suggestions from a budding graphics programmer
I've been watching the Big Four compete with their different technologies for some time now. nVidia (who also may be using some technologies from 3dfx, whom they acquired in 2000) produce really fast cards, high clock cycle speeds, and so on. ATI produces (from what I have seen) robust technologies which lack in speed and driver support (heard LOTS about their flaky drivers).
But there are 2 technologies that may yet overtake the above two, in terms of superior graphics technology; only problem is they are only evolving now into something that could eventually be "Big". And the reason they may pull this off is because they provide intelligent technologies.
Matrox's new Parhelia GPU is scary... lifelike graphics from a mediocre model, given a texture for color, a texture for bump, and a texture for polygon interpolation from the original. Take a look at http://matrox.com/mga/products/parhelia512/technology/disp_map.cfm, they have amazing demos there. The main reason why this card will rock? Less bandwidth usage for textures and polygon data (and less of a need for blazing fast RAM when there's not as much data to push across the bus).
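The displacement-mapping idea on that page reduces to: move each coarse-mesh vertex along its normal by a height sampled from a texture. A minimal sketch, with made-up mesh data (nothing here is Matrox's actual implementation):

```python
def sample(height_map, u, v):
    """Nearest-neighbor lookup into a 2D list of heights, u/v in [0, 1]."""
    rows, cols = len(height_map), len(height_map[0])
    return height_map[min(int(v * rows), rows - 1)][min(int(u * cols), cols - 1)]

def displace(vertices, normals, uvs, height_map, scale):
    """Move each vertex along its normal by the sampled height value."""
    out = []
    for (vx, vy, vz), (nx, ny, nz), (u, v) in zip(vertices, normals, uvs):
        h = sample(height_map, u, v) * scale
        out.append((vx + nx * h, vy + ny * h, vz + nz * h))
    return out

# A flat 2-vertex "mesh" with upward normals and a 2x2 height map:
hmap = [[0.0, 1.0], [0.5, 0.25]]
verts = displace([(0, 0, 0), (1, 0, 0)], [(0, 1, 0), (0, 1, 0)],
                 [(0.0, 0.0), (0.9, 0.0)], hmap, scale=2.0)
print(verts)  # → [(0.0, 0.0, 0.0), (1.0, 2.0, 0.0)]
```

The bandwidth win the post mentions follows directly: you ship a small mesh plus a compact height texture instead of the fully tessellated geometry.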
And PowerVR produces probably the most intelligent cards out there; problem is that their GPUs are not up to snuff in terms of capabilities and programmability. http://www.powervr.com dictates that the fastest card is the one that only renders what is seen... and they do the best job of the listed 4 technologies here, in terms of hidden surface removal.
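"Only render what is seen" means resolving visibility first and shading exactly one fragment per pixel, instead of shading everything in draw order and letting later fragments overwrite earlier work (overdraw). A toy sketch of that idea, with invented data (real PowerVR hardware does this per tile in silicon):

```python
def resolve_then_shade(fragments, shade):
    """fragments: list of (pixel, depth, material). Shade each pixel once.

    First pass keeps only the nearest fragment per pixel; the expensive
    shade() call then runs once per visible surface, not once per fragment.
    """
    nearest = {}
    for pixel, depth, material in fragments:
        if pixel not in nearest or depth < nearest[pixel][0]:
            nearest[pixel] = (depth, material)
    return {pixel: shade(material) for pixel, (depth, material) in nearest.items()}

frags = [((0, 0), 5.0, "wall"), ((0, 0), 2.0, "crate"), ((1, 0), 3.0, "floor")]
print(resolve_then_shade(frags, str.upper))  # → {(0, 0): 'CRATE', (1, 0): 'FLOOR'}
```

With three fragments but two pixels, a naive renderer would shade three times; this shades twice. Scale that to a scene with heavy overdraw and the savings are why the approach is attractive despite the less capable shading hardware.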
Right now, the best graphics card to buy is one which (a) has a programmable GPU, because this means it's more than just a "purchase", it's more like an "investment", where for months and years to come, more and more games will take advantage of its capabilities (currently, few do: Doom 3, and that's about it), and (b) has crisp, clear rendering without a lot of washed-out image quality. Take any game, max out the graphics, and see which technology shows up best... and which one scrambles faster to make sure that the drivers work properly. That is your best choice.
(Of course you could be like me, and hold out with your Voodoo3 because you're too damn bullheaded to buy an nVidia card, and too damn poor to buy an ATI! )