Ultra-mode video card shopping guide


5th_Player

 

Posted

Quote:
Originally Posted by Balorn View Post
but I do wonder if Ultra-mode will allow it to do something with the other 3/4 of my i7.
I had a chance to profile CoX in more detail on my i7-860. I'm assuming you either have an 8xx or 9xx (4 cores hyperthreaded to 8), in which case I doubt GR is going to come anywhere close to maxing out your processor. But there is a lot of potential headroom.

The best I could do was run around in a high-density mission (x8) with a lot of stuff going on, while demorecording, with 3D sound, and with high graphics settings and a high particle count but low intrinsic resolution (low res because the 4350 I have at the moment is not suitable for high-performance graphics and bottlenecks the system at high resolution). I can get the primary compute threads to approach 10% total system utilization (which is approaching one core's worth of workload), but things like the PhysX thread and the sound thread are only using very small amounts of CPU even under high load conditions.

If they added features to Ultra Mode that, say, doubled the maximum CPU load for the game client under the top performance conditions, it would probably benefit from an i7 over, say, a Core 2 Duo (or even a Core 2 Quad). But you'd probably still not have the CPU be a significant bottleneck.

So far, the multi-threading and multi-core overheads appear to be trivial.

Just a reminder: it's not a simple case of CoX using two threads and therefore two cores. CoX is actually using 15 threads, most with very little CPU usage relative to the primary game client. But starting a demorecord, say, kicked a previously idle thread into just under 1% utilization, as did turning on 3D sound. I did have six threads with measurable CPU usage running simultaneously under some test conditions. Win7, as I mentioned previously, seemed to be doing a good job of isolating those six threads to the four even-numbered CPUs (which are the non-hyperthread ones) without pegging any of them (the busiest one, CPU 0, only reached about 70% utilization at maximum, and I did have other things running at the time).
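If anyone wants to poke at their own per-thread numbers, here's a rough sketch of the kind of sampling I'm describing, in Python with the third-party psutil library (the process name is just a guess at what the client executable is called; adjust to taste):

Code:
import time
import psutil

def sample_threads(process_name="cityofheroes.exe", interval=5.0):
    # Find the game client by executable name (the name here is an assumption).
    proc = next(p for p in psutil.process_iter(["name"])
                if (p.info["name"] or "").lower() == process_name)

    # Cumulative per-thread CPU time before and after a sampling interval.
    before = {t.id: t.user_time + t.system_time for t in proc.threads()}
    time.sleep(interval)
    after = {t.id: t.user_time + t.system_time for t in proc.threads()}

    logical_cpus = psutil.cpu_count(logical=True)
    deltas = {tid: after[tid] - before.get(tid, 0.0) for tid in after}

    for tid, busy in sorted(deltas.items(), key=lambda kv: kv[1], reverse=True):
        pct_core = 100.0 * busy / interval      # percent of one logical CPU
        pct_system = pct_core / logical_cpus    # percent of the whole box
        print(f"thread {tid}: {pct_core:5.1f}% of a core, {pct_system:5.2f}% of system")

if __name__ == "__main__":
    sample_threads()

Process Explorer will show you the same thing with less typing; the point is just that you're looking at per-thread CPU time over an interval, not instantaneous load.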


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

Quote:
Originally Posted by Positron View Post
Also, a video card is not the ONLY thing you should take into consideration. Your PC having a decent processor and memory will also enhance your performance. As we continue to test and get more information available we will update you. I hope this helps, and happy holidays!
Ya know, I have to ask "What constitutes a decent processor these days?"

While I've no doubt about i5 and i7's, I was more curious about AMD's offerings these days. I'm debating upgrading my poor old C2D 6300 @ 1.8ghz, and the Athlon X4 620 is looking good.


 

Posted

Ya, that's the board I currently have. I was just wondering what the PCIe 2.0 meant over it just being an x16 PCIe slot.

I would hate to buy an expensive card and not use it to its full potential.




Quote:
Originally Posted by je_saist View Post
I have one of the D975XBX motherboards: http://www.intel.com/products/motherboard/D975XBX/

As far as I can tell, Intel doesn't actually specify what's different between the D975XBX and D975XBX2. Never mind, found it. The 2 model added several processors that really should have been added in a BIOS update to the first motherboard...

D975XBX 1 Processor List

D975XBX 2 Processor List

The only reason you'd buy one is if you wanted to run a really early Core 2 Duo with Crossfired Radeons, since this was one of the first, if not the first, Intel boards to carry Crossfire support. It does not, however, support SLI as later and more modern Intel motherboards do.

I'd also have a hard time suggesting this board on its own merits. The BIOS for the original model is rubbish, and since it looks like Intel just re-released the board with an updated BIOS for new processors instead of... you know... actually releasing the BIOS update so that owners of the original board could use the new processors... I'm somewhat doubtful that the 2 model is any better.

Getting a SATA drive to boot from the system was difficult. And it chewed through more power than it really should have given its feature set.

***

If I misread this and you actually have one rather than just looking at buying one: yes, its PCI-Express slots will support Nvidia cards.

Just not in SLI.


 

Posted

Quote:
Originally Posted by Psygon_NA View Post
Ya, that's the board I currently have. I was just wondering what the PCIe 2.0 meant over it just being an x16 PCIe slot.

I would hate to buy an expensive card and not use it to its full potential.
There's really not much performance difference between a PCIe x16 and a PCIe 2.0 x16 slot. As much as I detest referencing Wikipedia, I have no desire to scrounge through PCI-SIG's site for the correct data, so take a look here: http://en.wikipedia.org/wiki/PCI_Exp...CI_Express_2.0
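If you just want the back-of-the-envelope numbers: PCIe 1.x moves roughly 250 MB/s per lane in each direction and PCIe 2.0 doubles that to roughly 500 MB/s, so (rough sketch, ignoring protocol overhead):

Code:
# Rough per-slot bandwidth, per direction, ignoring protocol overhead.
def pcie_gb_per_s(lanes, mb_per_lane):
    return lanes * mb_per_lane / 1000.0

print(f"PCIe 1.x x16: {pcie_gb_per_s(16, 250):.1f} GB/s")  # 4.0 GB/s
print(f"PCIe 2.0 x16: {pcie_gb_per_s(16, 500):.1f} GB/s")  # 8.0 GB/s
print(f"PCIe 2.0 x8 : {pcie_gb_per_s(8, 500):.1f} GB/s")   # 4.0 GB/s, same as a 1.x x16 slot

Current single cards don't come close to saturating even the 1.x x16 figure, which is a big part of why the difference rarely shows up in game benchmarks.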

Quote:
Ya know, I have to ask "What constitutes a decent processor these days?"

While I've no doubt about i5 and i7's, I was more curious about AMD's offerings these days. I'm debating upgrading my poor old C2D 6300 @ 1.8ghz, and the Athlon X4 620 is looking good.
Really depends on who you ask and when.

For gaming at resolutions of 1280*720 (720p) or 1440*900, two common resolutions for low-end LCDs, even Socket 754 chips or late Pentium 4s can deliver enough punch for most games.

For gaming at resolutions like 1680*1050, 1920*1080 (1080p), or 1920*1200, processors from the Intel Core 2 Duo lineup and the AMD Socket 939 and AM2 dual-core CPUs can deliver enough back-end power.

Right now, with most shipping games on the Socket AM3 Phenoms or Intel i5/i7 processors, you'll run into a limit on your graphics card long before you run into the limit on your processor.

From a price-for-performance standpoint I have a hard time suggesting people buy Intel processors and motherboards. Earlier in the thread I was talking about the Intel D975XBX. This board was a contemporary of the Asus M2R32-MVP, as both were launched in 2006. The M2R32-MVP cost... well... quite a lot less than the Intel board. I think I paid around $130 for the Asus board, and the Intel board was somewhere in the neighborhood of $250 if memory serves correctly. The Asus board had more PCIe bandwidth for each PCIe slot (both slots were 16x; the slots on the Intel were 4x, 8x, then 16x, and if you ran in Crossfire it was 8x/8x). The Asus board used less power. And so on and so on.

What really got me though is that as AMD introduced new processors, Asus pretty much kept updating the BIOS to the literal physical limits of what the board could handle. That M2R32-MVP of mine is currently running with a Phenom 9600 that's overclocked.

The Intel board, as noted earlier, never had a BIOS update for new processors. Rather, as I found out when researching the board name for this thread, if I wanted to use newer Intel processors, I would have to buy a completely new $200+ motherboard... with the exact same physical hardware. I'm sure some accountant somewhere is sitting there, nodding his head, saying "Yeah, that's how we make money." I'm sitting here going "That's how you *CENSORED* your customers!"

***

Now, we do know that AMD intends to keep the ongoing socket compatibility that they have right now for future processors. The upcoming Bulldozer will have a new socket, but reportedly will also work on currently shipping Socket AM3 motherboards.

So if you bought an AM3 motherboard today, there's some assurance that just like Socket AM3 processors will work on most Socket AM2+ motherboards, consumer oriented Bulldozer designs will work on most Socket AM3 motherboards.

You don't get that kind of... assurance with Intel. Rather, you get the opposite. Case in point: the recent release of LGA 1156. Rather than re-using the existing LGA 1366 and maintaining socket compatibility across processors, Intel simply released a new socket design. Which is rather rotten if you want to stop and think about it for a minute.

***

So if you were buying now, I'd push you towards a Socket AM3 motherboard with a Socket AM3 processor.

I'm just not sure which motherboard I'd push you to.

DFI burnt me pretty badly with a couple of their Intel i7 motherboard offerings that refused to run in triple-channel memory mode, so I'm a little hesitant to point out the nice-looking DFI LANParty DK 790FXB-M3H5. Sure, all of the DFI AMD boards I've had have been awesome, so maybe it was just an Intel thing or something. Asus makes a wonderful Quad-Crossfire board... but it's almost $200, which is Intel board territory. MSI has the 790FX-GD70... but it's an MSI... and every single MSI board I've had with an AMD chipset has been complete and utter *CENSORED*.

I'd actually be tempted to live with the ASUS M4A79XTD EVO. Yeah, it's only got dual 8x PCIe slots, so you do lose a bit of bandwidth if you run in Crossfire mode. It's also available for around $110, so it's a bit of a deal for its feature set.


 

Posted

I rebuilt most of my 'puter over the summer and I have to say the difference between my old GeForce 7600 and the new GTX 260 is startling. I also upgraded my motherboard from an Asus M2V to an MSI K9N SLI V2 (for future upgrades), went from 1GB of RAM to 2GB, and swapped my power supply from a 200-watt weakling to a 750-watt octopus on steroids. >8) I also added a 2TB external HD. I should be good to go.

My next upgrades will be Windows 7 Ultimate 64-bit, 4GB of RAM, a quad-core processor, and a Blu-ray burner.


 

Posted

Quote:
Originally Posted by je_saist View Post
So if you were buying now, I'd push you towards a Socket AM3 motherboard with a Socket AM3 processor.
If you're looking to build a true low-end budget PC (that will still play games okay), this is a fine suggestion. But given that you can find Core i7 920s for $200 (as of four months ago), and they run circles around any AMD processor, I'd get the Core i7 920 if you're not looking at a total budget system.


 

Posted

Quote:
Originally Posted by Smurch View Post
Give it a few months. Prices for things like this drop very rapidly. By the time GR is released, these cards will probably cost significantly less.
Don't count on it!

The price of graphics cards is usually driven perpetually downward by the back-and-forth between Nvidia and ATI. Right now, though, that situation doesn't apply to either company.


Nvidia is in a very nasty developmental position right now. Their new G300 processor is delayed until at least late spring/early summer of next year; it was supposed to have debuted before the ATI 5xxx cards a few months ago. Early sample reports about their performance are not encouraging either.

Meanwhile, the company is bleeding money on the GTX 2xx series cards introduced around this time last year. The G200 processor is fairly huge, larger than the contemporary ATI processor in the 4xxxs, and is relatively expensive to produce because of that. But the 4xxxs not only equaled or outperformed the GTX 2xxs, they undercut them on total price as well. Nvidia had to reduce the prices on the GTX 2xxs to the point they were basically unprofitable in order to maintain their leading market share. Worse, Nvidia has a sales guarantee with its card producers wherein Nvidia will pay the difference if the producer ends up having to sell their cards for less than Nvidia's reference MSRP. I've seen it described as Nvidia having to wrap each chip in a $20 bill for someone to be willing to take them out of the warehouse.

Now the 5xxxs have hit and Nvidia still doesn't have an equivalent offering. To avoid losing even *more* money to a price decrease in response to the flatly superior 5xxxs, Nvidia told its card producers that they were "having production difficulties" and started to taper off distribution of GTX 2xx chips. The GTX 2xxs have been made "artificially scarce" to keep their price up. Then Nvidia announced a few weeks ago that they were End-Of-Lifing the GTX 260, 275, and 285; what's in the production chain *right now* is all there will ever be of these cards.

So Nvidia has to keep prices of their current cards up while they struggle for a new product, have constructed a situation to make that happen, and are now stopping production of the cards entirely to staunch the bleeding.

So no, the price of Nvidia cards isn't likely to drop between now and next spring.


For ATI's part, they are currently dominating the market with superior products, and Nvidia has no equivalent competing product on the immediate horizon. The trouble is that while they have a superior product... there aren't any around to sell. The same company (Taiwan Semiconductor Manufacturing Co.) that is fabricating Nvidia's G300 chips is also making ATI's 5xxxs on the same manufacturing process... and it's suffering similar production problems. There are mid-range 57xx boards available, but there are virtually no 58xxs. Indeed, the price for the 5850s has gone UP $50 since they were released in September. There's a rumor that supply will get better around Dec 15th, but we'll have to wait and see.

ATI has a free run at the market, with Nvidia sitting on its hands, and they can't take full advantage of it because of supply problems! Add in that ATI is carrying a large amount of structured debt they have to service, and they have no reason to reduce prices, every reason to keep them up, and real logistical problems forcing them to keep it that way.

So no, ATI won't be dropping prices on their cards any time soon either. (They will, however, be dropping low-range versions of their new cards on the market in the first quarter of next year.)

If you want a cheaper card, I recommend trying to catch a holiday sale of some sort.


As a related note, I'd like to mention that Nvidia has just announced the release of a G310 card. Do not buy this thing thinking it is part of a new high-performance line and will be better than a 2xx card! The G310 is a renamed G210 (which was a piece of junk) that Nvidia hopes will sound attractive because "300" is higher than "200". It's the same thing they did with the GTS 250 actually being a renamed 9800 GTX+: still using an older G92 chip rather than one of the newer G200s despite the name. The more I learn about Nvidia's business practices, the less I like the company.


 

Posted

Quote:
Originally Posted by Hallowed View Post
If you're looking to build a true low-end budget PC (that will still play games okay), this is a fine suggestion. But given that you can find Core i7 920s for $200 (as of four months ago), and they run circles around any AMD processor, I'd get the Core i7 920 if you're not looking at a total budget system.
Where are you seeing this? i7 920s on Newegg are going for $280.

However, the i5 750 gives most of the performance of the i7 920, costs $200 currently, and will require much less money for its motherboard and RAM than an i7 920. That's also a possibility if you're working inside a strict budget.


 

Posted

Quote:
Originally Posted by Psyte View Post
Ya know, I have to ask "What constitutes a decent processor these days?"

While I've no doubt about i5 and i7's, I was more curious about AMD's offerings these days. I'm debating upgrading my poor old C2D 6300 @ 1.8ghz, and the Athlon X4 620 is looking good.

I hate to get involved in religious wars but here is some objective information for cpu comparison

for general performance

http://www.tomshardware.com/charts/2...tage,1394.html

here we go for gaming related

http://www.tomshardware.com/charts/2....0.2,1396.html


 

Posted

Quote:
Originally Posted by Human_Being View Post
Don't count on it!


As a related note, I'd like to mention that Nvidia has just announced the release of a G310 card. Do not buy this thing thinking it is part of a new high-performance line and will be better than a 2xx card! The G310 is a renamed G210 (which was a piece of junk) that Nvidia hopes will sound attractive because "300" is higher than "200". It's the same thing they did with the GTS 250 actually being a renamed 9800 GTX+: still using an older G92 chip rather than one of the newer G200s despite the name. The more I learn about Nvidia's business practices, the less I like the company.
LOL, that's the entire computer business in a nutshell. It's hardly new with Nvidia. I doubt you can find a computer company that hasn't participated in the sport of fleecing the customer.


 

Posted

Ok, I've been thinking about it, and have decided to post the main equipment I plan to purchase for my new system (CPU, GFX, ETC) to get a feel for the performance it will have from someone more experienced in this area. I'm a diagnostician, not a power PC builder.

So with that in mind:

CPU: AMD Phenom II x4

http://www.newegg.com/Product/Produc...82E16819103471

GFX Card: EVGA 02G-P3-1185-AR GeForce GTX 285

http://www.newegg.com/Product/Produc...82E16814130486

GIGABYTE GA-MA790X-UD4P AM3/AM2+/AM2 AMD 790X ATX AMD Motherboard

http://www.newegg.com/Product/Produc...82E16813128387

Any thoughts? Will this system stack up to GR's Ultra Mode and handle the load? I know that, according to Posi, the card will handle Ultra Mode at full tilt, but I want to make sure before making any purchases that the rest can handle it.

And, of course, I have a compatible case, PSU, etc. picked out, so if you need more info, just ask!


"Iron defenses and a crappy attitude do not, a tanker, make."

Proud Leader and founder of The Gangbusters Super Group and The Madhouse Villain Group: Ask me about becoming a member!

 

Posted

Quote:
Originally Posted by Another_Fan View Post
I hate to get involved in religious wars but here is some objective information for cpu comparison

for general performance

http://www.tomshardware.com/charts/2...tage,1394.html

here we go for gaming related

http://www.tomshardware.com/charts/2....0.2,1396.html
Appreciate it. It's a nice spot to be in, in that anything these days is a nice improvement for me. I spent some time last night trying to find/figure out the price difference if I bought or built an Athlon II X4 system or an i5 (felt the inner geek getting stronger!). It's just that I found the comment "a decent processor" to be, well, vague, you know? i5s and i7s are lovely to be sure, but they're also a bit expensive. The $100 I'd save by going from an i5 to an X4 could be used for other necessary components, or for going ahead and buying a second Radeon 4670 1GB card (which, when Crossfired, should get me close to 4850 performance). I just wish I knew what the cut-off was for a "decent processor" (even though I'm quite sure my C2D is well under it).


 

Posted

Quote:
Originally Posted by Flarecrow View Post
Ok, I've been thinking about it, and have decided to post the main equipment I plan to purchase for my new system (CPU, GFX, ETC) to get a feel for the performance it will have from someone more experienced in this area. I'm a diagnostician, not a power PC builder.

So with that in mind:

CPU: AMD Phenom II x4

http://www.newegg.com/Product/Produc...82E16819103471

GIGABYTE GA-MA790X-UD4P AM3/AM2+/AM2 AMD 790X ATX AMD Motherboard

http://www.newegg.com/Product/Produc...82E16813128387

Any thoughts? Will this system stack up to GR's Ultra Mode and handle the load? I know that, according to Posi, the card will handle Ultra Mode at full tilt, but I want to make sure before making any purchases that the rest can handle it.
That should be fine.

Quote:
GFX Card: EVGA 02G-P3-1185-AR GeForce GTX 285

http://www.newegg.com/Product/Produc...82E16814130486
You might hold off on this purchase though, unless you want it Real Soon(TM). Positron said they don't have any data on the ATI 58xxs yet. Not surprising, since no one can get one. On the review sites, however, ATI 5850s were delivering equivalent-or-superior performance to the GTX 285 at a price point of $300; $115 less than the card you have there. The 5850 also runs cooler, quieter, and consumes less power than the GTX 285.

If price is no object, then you could also grab an ATI 5870 for the same money ($400-$420) and get something that flatly outstrips the GTX 285 head-to-head. Since the 5850 and 5870 use the same chip, the available dies are all going to the 5870, and it's a little easier to actually find a 5870 for sale.

Unless there's something bizarre about the implementation, I would expect the ATI cards to run at their maximum strength in CoX; as Bill Z pointed out to me, the ATI sticker has replaced the Nvidia one at the bottom of the CoH homepage. We don't have empirical data that the 58xxs will perform that well with CoX, but since the devs are using 48xxs as a baseline, the 58xxs are a scale-up of similar architecture, and ATI is now the official graphics partner for CoX, it's quite likely to be so. Supposedly, new 58xxs are hurtling towards market as we speak and will appear magically on Dec 15th, just in time to be sold for Christmas. If they don't appear then, they'll certainly show up in January.

So if you want the card now, it would be easier to get a GTX 285. If you wait a bit, you could probably get the same performance for less, or better performance for the same price.


With all that said, EVGA is an excellent company. They're kind of the Cadillac of Nvidia board makers. If you decided to go with the Nvidia card, that's likely to be a quality one.


 

Posted

Thank you very much for the info, HB. I can't actually make the purchases right now, so waiting and keeping an eye out for the cards you suggested is something I will definitely be doing. That, and I would like to see if Posi or another dev will eventually come forward with ATI information.

However, I do have a few questions.

Firstly, if I do end up going for the ATI cards, it will be my first ATI card. Is there anything I would need to know about differences between nVidia and ATI?

Second, what company should I shoot for with the card's board? I'm leaning towards the ASUS board, since I do have some experience with their products, but any information you could give there would be helpful.

And, Lastly, the ATI card has 1GB memory while the nVidia card I selected has 2GB. Will this difference be noticeable in the performance of the card?


"Iron defenses and a crappy attitude do not, a tanker, make."

Proud Leader and founder of The Gangbusters Super Group and The Madhouse Villain Group: Ask me about becoming a member!

 

Posted

Quote:
Originally Posted by Another_Fan View Post
I hate to get involved in religious wars but here is some objective information for cpu comparison

for general performance

http://www.tomshardware.com/charts/2...tage,1394.html

here we go for gaming related

http://www.tomshardware.com/charts/2....0.2,1396.html
Excellent link. I believe that is a good place to look to keep things in perspective. Looking at the gaming-focused 3DMark table, it shows that, ignoring the lowest scorer (which almost seems like a data error), the spread between the best and the worst is just over 6%. Given that, my guess would be that hardly anyone would notice the difference in performance (assuming they are only playing a game and aren't doing a dozen things on the side, which would result in different requirements).
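Just to show the arithmetic I'm doing (with made-up scores standing in for the chart values, since I'm not going to retype the table):

Code:
# Made-up, chart-like scores; only the best-to-worst spread calculation matters.
scores = [4470, 4420, 4390, 4350, 4310, 4250, 4210]

spread = (max(scores) - min(scores)) / min(scores) * 100
print(f"best-to-worst spread: {spread:.1f}%")  # about 6% with these stand-in numbers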


 

Posted

The problem I'm having with ATI cards is the fact that half the games I play besides CoH have a page-long list of issues with ATI cards. Yeah, the new cards run rings around nVidia cards, but at least nVidia cards run a lot more stably than the ATI cards. And the sticky for playing CoH with ATI cards still scares me away from buying a new one.


 

Posted

Quote:
Originally Posted by Another_Fan View Post
I hate to get involved in religious wars but here is some objective information for cpu comparison

for general performance

http://www.tomshardware.com/charts/2...tage,1394.html

here we go for gaming related

http://www.tomshardware.com/charts/2....0.2,1396.html
Just commenting on this, but I wouldn't really cite Tom's site as a reference on how processors perform. THG has been busted multiple times in the past for accepting vendor money to slant reviews.

Quote:
The problem I'm having with ATI cards is the fact that half the games I play besides CoH have a page-long list of issues with ATI cards. Yeah, the new cards run rings around nVidia cards, but at least nVidia cards run a lot more stably than the ATI cards. And the sticky for playing CoH with ATI cards still scares me away from buying a new one.
Question for you: how many of those games are from Nvidia's The Way It's Meant to Be Played program? I think you'll find that most shipping games with ATi card issues have been built with money/assistance from Nvidia. Which is actually technically illegal and grounds for a class-action lawsuit, although Nvidia has enough of those types of problems as it is.

Quote:
Firstly, if I do end up going for the ATI cards, it will be my first ATI card. Is there anything I would need to know about differences between nVidia and ATI?
Well, ATi has better drivers. The hardware specifications and programming guide are actually available if you are interested. On the hard issues of performance and image quality ATi and Nvidia typically trade blows. As I referenced with an earlier comment in this response, Nvidia has been caught out / busted for paying game developers to sabotage games on AMD graphics cards.

Quote:
Second, what company should I shoot for with the card's board? I'm leaning towards the ASUS board, since I do have some experience with their products, but any information you could give there would be helpful.
I'd have an easier time telling you who NOT to buy from. Don't buy a Diamond Multimedia card. Other than that, most of the AMD vendors, like Sapphire, Asus, PowerColor, HIS, and XFX, are pretty much the same in physical quality.

Quote:
And, Lastly, the ATI card has 1GB memory while the nVidia card I selected has 2GB. Will this difference be noticeable in the performance of the card?
Yes / no / not really. AMD redesigns the memory controllers on their GPUs about once every 5 minutes, and the efficiency is... pretty good. AMD has also been using GDDR5 memory on their mid-range to high-end graphics cards for a while now. Nvidia won't be using GDDR5 until Fermi... so Nvidia sort of needs that extra memory to make up the difference in speed.


 

Posted

Would anyone know how well CoH does with Crossfire using AFR these days? I couldn't find anything, but I don't know if that's my search-fu being weak or there just not being much out there.


 

Posted

Quote:
Firstly, if I do end up going for the ATI cards, it will be my first ATI card. Is there anything I would need to know about differences between nVidia and ATI?
They're different companies with different architectures, and each will behave a little differently in a given situation. There are two significant things to note, however.

Firstly, in the current implementation of the game, ATI cards have a number of "graphic glitches" in CoX. (See Bill Z's aforementioned sticky thread in the Technical Issues and Bugs section.) We've had no official word on whether these are going to be remedied in Going Rogue. It is likely that Ultra Mode will not see these problems with ATI, since they were demoing it on ATI boards and ATI is the official graphics partner. However, we can't know for certain if these oddities will go away entirely when Going Rogue is live, or if they will only disappear in Ultra Mode but remain in the standard version of the game.

Heck, for all we know, Nvidia cards might end up sporting "graphical oddities" in Ultra Mode =P.

For now though, be aware that there are quirks with using ATI in the current game.


Secondly, ATI does not support hardware-accelerated PhysX.

Wait, what does that mean? For CoX, pretty much nothing.

Ageia was purchased by Nvidia a while ago, and a proprietary physics engine got even more proprietary. All Nvidia cards of 8800 or newer vintage (except the "brand new" G210/G310 =P) will run the PhysX calculations on their GPU. If you *don't* have an Nvidia card, the PhysX gets done entirely by the CPU.

In CoX, that's not a huge burden. I've never heard of ATI users complaining of massive framerate drops when flying debris appears on their screen in-game. The PhysX calculations are light enough that the CPU can handle them as implemented, which makes sense considering the minimum hardware specifications Paragon wants to maintain. Given the processor you've suggested, you shouldn't even notice the "lack of hardware PhysX".

Where it might become important, though, is in other games. There aren't very many that use PhysX, and even fewer that use it intensively. The only one I can think of, though it's definitely significant, is Batman: Arkham Asylum. The PhysX implementation in that game is very intense when set to "High". You can be using an ATI card in B:AA with PhysX effects turned on, zoom along at 60 frames per second, and as soon as you encounter volumetric smoke or flying sheets of folding paper, the PhysX will crush the frame rate until it passes. Even a high-end Nvidia card will see a dip in frame rate under those circumstances.

Some people have taken to using a second Nvidia card in another motherboard slot as a dedicated PhysX processor; there's an option in the Nvidia driver control panel to enable that when there are two graphics cards present. They don't have to be the same card either. People have been taking low-end 9600 GTs or 9800 GTs, pairing them with GTX 260s, and seeing a boost in overall performance since neither the CPU nor the main graphics processor has to deal with the PhysX work. The weaker, older cards are plenty to handle that specialized task.

Here's the kicker though: you can do the same thing with an ATI card as your main GPU! In fact, I've seen some reviews that say an ATI card outperforms the equivalent Nvidia one in B:AA while using a 9x00 GT as a dedicated PhysX unit. You don't need an SLI or Crossfire motherboard (the two companies' proprietary dual-card standards). You do have to be using either Win XP or Win 7 though; both have the required "Driver Test" mode wherein you can run two graphics drivers at the same time. Vista doesn't have this capability.

A little while ago, Nvidia caught on to this and disabled dedicated PhysX operation in their drivers unless there was an Nvidia card in the other motherboard slot. Amongst the antics of an increasingly erratic company, I find this particularly asinine. Nvidia has essentially abandoned the high-end graphics card market and all they have to compete on are their lower-class boards. Those same boards are going to become even less competitive in Jan/Feb when the new ATI bargain cards arrive. If people are choosing ATI performance cards for quality, but could still pick up a cheap Nvidia card to run alongside them for specialized purposes, Nvidia might still make some money. But I'm not in marketing, so what do I know =P.

You can still do the different-cards trick if you use older (pre-190.xx, I think) drivers. If you are using Win 7, there is also a homemade patch (predictably) floating around that will let you use the newest Nvidia drivers while telling them "No no, that Radeon chip in the other PCIe slot is really a GeForce! Really!"


Off the top of my head, those are the two things you need to know: as currently implemented, CoX has graphics quirks with ATI cards, and a small handful of games (but not CoX) can see drastic performance fluctuations with PhysX enabled in the absence of an Nvidia card.


Quote:
Second, what company should I shoot for with the card's board? I'm leaning towards the ASUS board, since I do have some experience with their products, but any information you could give there would be helpful.
I know ASUS makes solid motherboards, but I don't know what their reputation in graphics cards is, though I would be surprised if it wasn't at least decent.

I do know that Sapphire is well respected for ATI cards. Sapphire is basically to ATI what EVGA is to Nvidia. Sapphire also has a good line of custom, non-reference-standard versions of the cards they sell, like the Vapor-X line with better cooling/quieter fan/slight overclocking.

Another good company is XFX. XFX is the never-quite-overtaking-them closest quality competitor to EVGA in the Nvidia line. About a year ago, though, they looked at Nvidia and went "these guys are nuts!" and started setting up to build ATI cards as well. From looking at Newegg after following your links, I also noticed that XFX had (past tense) both ATI 5850s and 5870s available for a $60 premium today; apparently no other company had them stocked with Newegg for Black Friday. Definitely no flies on these guys.


Quote:
And, Lastly, the ATI card has 1GB memory while the nVidia card I selected has 2GB. Will this difference be noticeable in the performance of the card?
I will refer you to Father Xmas' superior answer to that question.


 

Posted

Quote:
Originally Posted by je_saist View Post
Just commenting on this, but I wouldn't really cite Tom's site as a reference on how processors perform. THG has been busted multiple times in the past for accepting vendor money to slant reviews.
... Good to know.


 

Posted

Just out of interest, any opinions on the Geforce 295? I'm looking to build a new hi-spec system (yay tax deductions!) and have seen that going as part of certain deals.


Always remember, we were Heroes.

 

Posted

Quote:
Originally Posted by Dr_Darkspeed View Post
Just out of interest, any opinions on the Geforce 295? I'm looking to build a new hi-spec system (yay tax deductions!) and have seen that going as part of certain deals.
A GTX 295 is two GTX 275-class chips stuck on the same board and slightly under-clocked in a desperate attempt to stay below 300 Watts TDP. It's how the two companies compete for the "fastest single graphics card in the world" title. Despite lackluster performance in the rest of the GTX 2xx line this year, the GTX 295 barely managed to hold onto the title against the ATI 4870 X2 (two 4870s built into the same package). However, they lost that standing just last week against the ATI 5970 (two 5870s on the same board and underclocked to 5850-speeds in a desperate attempt to stay under 300 Watts TDP =P).

These cards are the "best discrete cards in the world". They also are huge, run hot, and tend to make noise like a Basset Hound being sucked into a vacuum cleaner. (Of course, if you've got the money to buy one or more double-chip cards, you probably have the money to drop on a hefty water-cooling system).

I would note that I think there have been three revisions of the GTX 295, one of them being two distinct circuit boards with one chip each inside the cowling instead of two chips on one board. I'm not sure what the dissatisfaction with them was, but apparently not all GTX 295s are built the same.


 

Posted

Quote:
Originally Posted by Dr_Darkspeed View Post
Just out of interest, any opinions on the Geforce 295? I'm looking to build a new hi-spec system (yay tax deductions!) and have seen that going as part of certain deals.
Wait, I just reread that... Okay, I expect you meant tax refunds, not that you were able to write off a gaming PC as a tax deduction, but that makes me think of one of my favorite Penny Arcade strips. (Warning: some language)

You can click the "News" link to read the story behind the strip.


 

Posted

real quick:

Quote:
Here's the kicker though: you can do the same thing with an ATI card as main GPU!
No, you can't, or at least not anymore. Nvidia disabled this with recent driver updates. It's one of the reasons why Intel and AMD are pushing OpenCL as a gaming physics solution.

(still reading the rest of the thread since my last post)

... and I see you actually mentioned this as I read further.

Quote:
Would anyone know how well CoH does with Crossfire using AFR these days? I couldn't find anything, but I don't know if that's my search-fu being weak or there just not being much out there.
I've been running CoH under XP / Vista with Crossfired RadeonHD 3870s and 4850s, and under Vista on a GTS 250 triple-SLI setup. Honestly, at the resolutions I run (1920*1200 on the Radeons, 1680*1050 on the Nvidia), there's not really a heap of difference in speed between the cards in single-GPU mode and multi-GPU mode.

This will likely change as of Going Rogue, as the new engine supposedly will take advantage of multi-GPU rendering.

Quote:
Another good company is XFX.
I'm sort of torn on XFX. My first couple of cards from them were the AGP GeForce 6600 GTs I had... which had bloody awful heatsink designs. However, it seems that Nvidia's 6600 AGP design was something vendors couldn't deviate from, so I really couldn't blame XFX for those cards. I've been meaning to try a couple of their Radeon offerings, but every time I went to buy, Sapphire was cheaper.

Sooo... quick note on Sapphire. Sapphire's traditionally been a bit closer to ATi than other vendors. Back when ATi was selling their own branded cards, Sapphire was the actual manufacturer. One of the things to keep in mind is that for a long time ATi didn't sell chips to third-party vendors.

However, I did pay for that cheapness on the RadeonHD 4850s I picked up. They actually have the most annoying fans since the 6600 GTs I had, and remind me quite a lot of an FX 5800 Ultra I borrowed. At full tilt it's like a banshee howling fest.

Quote:
Just out of interest, any opinions on the Geforce 295? I'm looking to build a new hi-spec system (yay tax deductions!) and have seen that going as part of certain deals.
Human Being beat me to this.

A GTX 275 starts around $240.

A good one with 1.7GB of memory is around $320.

A GTX 295 starts around $530.... well, the only one Newegg has is $530.

So you're basically paying a $50 premium to use up two slots in your computer rather than four... since I'm not aware of any single-slot GTX 275s.

Now, if you've got the cash, yeah, the GTX 295 is a freaking monster. And the one on Newegg is actually cheaper than a 4870 X2 I found on Dell.

But I can get a stock 4870 from XFX for around $154. Two of those are only $308...

I can also find RadeonHD 4890s for around $200... two of those in Crossfire are going to set you back around $400... and you'd be running away from GTX 285s just on single cards.
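Putting those rough street prices side by side (same numbers as above, nothing new; they drift week to week):

Code:
# Approximate street prices quoted above, cheapest configuration first.
configs = {
    "GTX 295 (one dual-GPU card)":    530,
    "2 x GTX 275 in SLI":             2 * 240,
    "2 x RadeonHD 4890 in Crossfire": 2 * 200,
    "2 x RadeonHD 4870 in Crossfire": 2 * 154,
}

for name, price in sorted(configs.items(), key=lambda kv: kv[1]):
    print(f"{name:33s} ~${price}")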

So, I wouldn't be buying a GTX 295. I can save a lot more money by being willing to sacrifice space in the case.


 

Posted

Quote:
Originally Posted by Flarecrow View Post
Thanks for that info, load off of my mind.

All I have to do now is see if the brand-new power supply I bought to replace my burned-out oldie can power the awesome nVidia card I picked out, as well as the rest of the system.
You may have to consider how many amps your PSU puts out depending on what card you get. I bought an nVidia 280 and a PSU with 300 more watts than it recommended. I installed everything, powered it up, and it kept using on-board video. Typically, it would switch to the PCIe x16 card automatically with the card in place. Lights on the card came on, and I thought to myself how quiet it was 'cause I couldn't hear the fans.

Found that I needed 42 amps and I had 35 amps on the 12V rail. So I had to take back the PSU and find another one with enough amps and watts.
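If you want to see how those numbers translate into actual wattage on the 12V rail, it's just amps times volts (the 42/35 figures are mine from above; the rest is plain arithmetic):

Code:
# The 12V rail capacity is usually printed on the PSU label in amps.
def rail_watts(amps, volts=12.0):
    return amps * volts

card_needs_amps = 42   # what the card's spec called for
psu_gives_amps = 35    # what my first PSU's 12V rail was rated for

print(f"card wants {rail_watts(card_needs_amps):.0f} W on the 12V rail")  # 504 W
print(f"PSU offers {rail_watts(psu_gives_amps):.0f} W on the 12V rail")   # 420 W
print("OK" if psu_gives_amps >= card_needs_amps else "under-spec: swap the PSU")

So a big total wattage number on the box doesn't help if the 12V rail itself is under-spec.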

Just something you should think about before buying, like I didn't.