How Important is Crossfire/SLI?


Cade Lawson

 

Posted

I'm debating finally getting a rig primarily for gaming/media, and I'm dithering over the video card. As I understand it, if I plan to go the multi-vid-card route I have to get a mobo that supports it from the get-go. Question is, is it really worth it? Two cards mean a beefier power supply (not much of an issue, just an annoyance), better thermal regulation (definitely an issue), and possibly other things that I don't even know about (biggest issue).

So, is the gain in graphics shininess worth juggling the above factors? Or should I just get a decent single card that's near cutting edge?


 

Posted

First obvious question: How many displays are you planning to run?


 

Posted

Quote:
Originally Posted by stoneheart View Post
I'm debating finally getting a rig primarily for gaming/media, and I'm dithering over the video card. As I understand it, if I plan to go the multi-vid-card route I have to get a mobo that supports it from the get-go. Question is, is it really worth it? Two cards mean a beefier power supply (not much of an issue, just an annoyance), better thermal regulation (definitely an issue), and possibly other things that I don't even know about (biggest issue).

So, is the gain in graphics shininess worth juggling the above factors? Or should I just get a decent single card that's near cutting edge?
Really depends on who you ask, and what you play.

Some games don't benefit from multi-GPU setups. Some games do. Some games only show benefits with Nvidia SLI. Some games shine best with Crossfire.

The advantage of multi-GPU setups is that you can often get equivalent rendering power for less cash. For example, a Radeon HD 5870 will cost you around $400. Two Radeon HD 5770s, at around $160-$170 each, will offer rendering that gets you close to, if not past, a single 5870: http://www.xbitlabs.com/articles/vid..._14.html#sect0

Two Radeon HD 5750s, which run around $130-$140 each, can keep up with the $300 5850.

So you can spend a bit of cash now on a single graphics card, then add an identical card a couple of months down the line for even more rendering power.

Then there's the added advantage of processing technologies like Nvidia PhysX or OpenCL that can utilize "extra" processors. The scope of OpenCL is wider than that of PhysX, and game developers are already planning how to write code that can use additional GPUs to deliver better physics, AI, or even better positional audio. With both Intel and AMD delivering GPU/CPU combinations over the next 6 to 8 months, developers will have a growing platform base on which OpenCL acceleration can be useful.

Personally, I don't build a computer now without making plans for a multi-GPU setup... but that's just me.


 

Posted

I'm not a fan of multiple graphics cards, even though my current $1200 build is designed for someone to upgrade with a second card, as well as for limited overclocking of the CPU.

The problem stems from cheaper, higher-resolution monitors and game engines designed with hardware requirements that run on the high side. Add in gamers who believe the only way to enjoy a game is with all the quality knobs turned to 11, but who then complain about the poor performance.

I'm seeing this already with tons of PMs and threads about Going Rogue's Ultra-Mode graphics. The game will be the same game with or without the new effects, but there is an almost bizarre impression that if you can't run them, the game won't play right. Frankly, we don't know. Positron says that the 9800GT could run all the new ultra options on the lowest setting. Higher settings will require higher-end cards, and possibly SLI or Crossfire in extreme cases, but we just don't know.

Once the Open Beta starts, we can all find out how each of the new settings impacts current performance.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Quote:
Originally Posted by Arcbinder View Post
First obvious question: How many displays are you planning to run?
A valid question, and an important point I probably should have mentioned. I'm only going to be running one display, but it's a large one; about 47 inches, if I remember right. (Gotta love HDMI ports.)

As I found out the hard way when upgrading my multi-purpose system, larger monitors need larger cards. So if I do go the single-card route, I would have to get something pretty beefy to start with.

For multi-card rigs, though, do all of the cards have to be equivalent, like with RAM? Or can I get a decent card now and link it up with a newer card down the line? And can you link up the on-board graphics processor into the loop?


 

Posted

Quote:
Originally Posted by stoneheart View Post

For multi-card rigs, though, do all of the cards have to be equivalent, like with RAM? Or can I get a decent card now and link it up with a newer card down the line? And can you link up the on-board graphics processor into the loop?
That used to be simple to answer. >.< Use equivalent cards (a 9800 and a GTX 260 do not get SLI'd together). Now, though, ATI has something that can use the onboard graphics together with a new card, tri-card setups with one card dedicated to physics are showing up, and MSI (I believe) has released a board with the Fuzion's "Hydra" chipset, which doesn't particularly care (in theory you could mix ATI and nVidia cards; I'd hate to see - or more specifically, troubleshoot - what happens).

Generally, though, I'd still assume the answer to be "yes": use the same card.


 

Posted

Quote:
Originally Posted by stoneheart View Post
For multi-card rigs, though, do all of the cards have to be equivalent, like with RAM? Or can I get a decent card now and link it up with a newer card down the line? And can you link up the on-board graphics processor into the loop?
Assuming a motherboard that can handle nVidia SLI, the cards have to be the same model, but they can be from different manufacturers. The problem is that, on occasion, the life span of a particular nVidia model can be quite short if you're planning to add a second card at a later date. nVidia has also been on a renaming kick recently; the problem is the drivers won't allow SLI to work even if, other than the name, the cards are identical (the GTS 250 and the 9800 GTX+).

For a motherboard that can handle ATI Crossfire, the cards don't need to be the same model. For example, Xbit Labs did a test mixing an HD 5770 with an HD 5750.

There is also the motherboard chip mentioned above that can make nVidia cards work together with ATI cards, but there is only one motherboard with it so far, and the testing wasn't all that favorable.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet