Ultra-mode video card shopping guide
OK, I am not sure what planets aligned... but the spousal unit has tentatively approved a new rig.
I am examining some Alienware setups in my comparison shopping.
Dual 1GB GDDR5 ATI Radeon HD 5670 CrossFireX Enabled
or
Single 1.8GB NVIDIA® GeForce® GTX 295
or
Single 1GB GDDR5 ATI Radeon HD 5870
(also an option for Dual 1GB GDDR5 ATI Radeon HD 5870, but I don't think I can go that far)
Alienware, the King of overpriced Hardware!
You could build the exact same spec system for roughly two-thirds of what they charge.
The 5870, hands down. Dual cards, at this point, won't benefit you a great deal for gaming (though that may not be far off).
My time is worth a lot as well, and frankly the last thing I want is to deal with all of that, especially if I screw something up. But your point is taken; after all, I said I was shopping around, and I am comparing many avenues.
I would go with the HD 5870. The GTX 295 is really an SLI setup on one card, and it's unclear yet whether SLI or Crossfire will make a difference.
Plus, two HD 5670s (2x 400 SPs at 775MHz) add up to less than one HD 5770 (800 SPs at 850MHz). The HD 5870 is 1600 SPs at 850MHz.
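To see the comparison in numbers, here's a quick back-of-the-envelope sketch (my own, not from the thread): stream processors times clock speed as a very naive throughput proxy. It ignores architecture, memory bandwidth and CrossFire scaling losses, so treat it only as a rough sanity check on the figures above.

```python
# Naive throughput proxy: stream processors x clock speed (MHz).
# This ignores real-world factors, so it's only a rough comparison.
def shader_throughput(stream_processors: int, clock_mhz: int) -> int:
    return stream_processors * clock_mhz

dual_5670 = 2 * shader_throughput(400, 775)   # two HD 5670s in CrossFire
one_5770 = shader_throughput(800, 850)        # single HD 5770
one_5870 = shader_throughput(1600, 850)       # single HD 5870

print(dual_5670)             # 620000
print(one_5770)              # 680000: beats the dual-5670 setup on paper
print(one_5870)              # 1360000: roughly double again
```

Even before CrossFire scaling losses, the dual 5670s come out behind a single 5770 on this crude metric, and the 5870 doubles that.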
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
Yes it's me again...
I've been trying to get the dimensions of the HIS HD 5750 iCooler IV 1GB (128-bit) GDDR5 PCIe (DirectX 11 / Eyefinity), particularly where the cooler assembly is located on the card relative to the ends of the card. However, trying to get answers out of the manufacturers is like getting blood out of a stone.
Anyone able to give me some pointers or able to supply the dimensions? I just need to know if it'll fit my case before I tell the shop to ship it.
Sig characters appear in the Château Rouge thread starting from post #100
I Support Nerd Flirting! - Story/Discussion/Sequel
If you're looking at that card, I should point out that Newegg has the (higher quality) Sapphire Vapor-X 5750 listed at the same price as the HIS card ($145), but with free shipping. It also has the standard version from Sapphire on sale for $135; again with free shipping and with a limited-time checkout promo-code for an additional $15 off.
The card's length is 19.7cm, width of 4cm. Now if you click on the picture of the card that shows the card from the top edge (next to the last one on the right), it will blow up and display a picture that's almost as wide as your browser window. Capture that image and print it out (landscape mode). If you don't have a printer then you will be forced to measure off the screen.
Now break out your calculator and ruler. Measure the length of the card (board part, exclude the bracket). Remember that it should be 19.7cm. So calculate the scale factor of actual to measured (19.7/measured). Now go ahead and take measurements of what you are interested in. Then multiply it by the calculated scale factor, round up and you get a pretty good idea of the actual measurements.
Edit: What I get is that the cooler stops at around 17.5cm (measured from the bracket); at that point it's 2.5cm wide. It widens to 3.0cm at 15.0cm, where the fan cage starts, which widens to 4cm until you're 6cm from the bracket and then drops back to 3cm. You should still do your own calculations.
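The scale-factor arithmetic above can be wrapped in a couple of lines of Python; a small sketch of my own (the 19.7cm board length is from the post, the 9.85cm and 8.75cm printout measurements are made-up examples):

```python
# Estimating real card dimensions from a printed product photo,
# using the known board length as the reference.
KNOWN_LENGTH_CM = 19.7  # HIS HD 5750 board length, excluding the bracket

def scale_factor(measured_length_cm: float) -> float:
    """Ratio of actual size to the size measured on the printout."""
    return KNOWN_LENGTH_CM / measured_length_cm

def actual_size(measured_cm: float, factor: float) -> float:
    """Convert a measurement taken off the picture to real centimetres."""
    return measured_cm * factor

# Hypothetical printout: the board measures 9.85cm end to end,
# and the cooler appears to stop 8.75cm from the bracket.
f = scale_factor(9.85)       # 2.0
print(actual_size(8.75, f))  # 17.5: cooler ends ~17.5cm from the bracket
```

Measure the full board first, compute the factor once, then multiply every other measurement by it.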
Well, just by eyeballing it, it appears that it extends about the same distance on either side of the PCIe x16 slot. You can measure from the edge of the slot to the back of the case and then add the same amount (plus the length of a 6-pin plug on your power supply) on the opposite side of the slot. Regardless, it should be less than the 9.6 inches of a standard ATX/micro-ATX motherboard; probably around 7.5 inches.
Shader.
Been thinking about getting a card for the wife so she can go nuts with Ultra Mode. Hoping this would do it for low-mid: http://www.newegg.com/Product/Produc...82E16814143186 (BFG Tech GeForce 9800 GT 512MB).
If you can recommend a mid- to high-end card for $100-200 that isn't HUGE (I need a card that only takes up one slot), that would be appreciated.
Caveats: I tend to shop Newegg for their customer service (been burned on PriceWatch too often), and I shop Nvidia.
You bring up a good point: some top-end video cards require their own power cable in your case, and some even require two. Make sure your PC's power supply can handle what you are planning on putting in. In addition, some of these cards are physically huge. I know the card I ended up putting in my home PC -barely- fit and I had to remove several other things just to get it into the slot. If you find yourself in this position and are not comfortable with the inside of your PC, have a friend do the install for you, or have it professionally done.
It takes a little while longer to boot up from the BIOS; not sure why. However, once I'm in Windows, I'm gravy.
Jumped from 8800 GTX to HD 5850 and my framerate doubled. (using sapphire's overclocked model though, which is more like a 5870).
The 9800 GT takes up one PCIe slot, but it CAN be as physically tall as two slots.
So be careful to check the specs on the physical size. It really depends on the manufacturer.
The 9800 GTX is guaranteed to be as big as two slots.
That's why you look at the pictures such as the one that's the bracket/ports view. Wider coolers will stick out and be all obvious like.
Kry8ter, sorry, can't think of one. The more powerful the card, the more power it uses, the more heat it generates, the bigger the cooler it needs. The 9800GT is pretty much the limit when it comes to low profile, single slot coolers.
It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.
As stated by some others, if you're building a new rig solely for CoH/V, then, even with Ultra mode coming down the pipe, a dual GPU system probably won't be much help to you, as SLI/CrossfireX tends to yield less impressive results in MMORPGs than in most other games.
Now I understand the idea of wanting a dual-card setup, I have one myself, but SLI or CrossFireX really needs to be treated one of two ways: either as the starting point for a very high-end gaming system, or as an upgrade path on a mid-range system. I see people looking at building something new with dual cards at the mid-range, which is something that doesn't really make a lot of sense. For example, the Radeon 5670s mentioned earlier in an Alienware build would be outperformed by a 5770 or 5830 for less money, with lower power consumption and heat generation. But if you already have a card like a 5670 and you're just looking to add some extra power to your graphics processing, then adding in a second 5670 makes a lot of sense, as it's probably still cheaper than selling off the used mid-range card and then buying a new high-end card.
However, I feel this thread represents a pretty large failure on the devs' part: by February, they should have had a pretty exact specification needed to run Ultra Mode, instead of all this guesswork that we players have been making. And while I understand the need to stay flexible while this is under development, considering that many of their customers are considering a fairly expensive upgrade to coincide with this graphics update, I really think it was their responsibility to nail down and post an exact minimum spec.
While everyone is sort of panicking with the whole "can my system run Ultra Mode?", I'm pretty sure that Ultra Mode's requirements are going to be far less hefty than most are scared of: a dual-core CPU at about 2.6GHz and a GeForce 8600 or Radeon 3850 are probably all that will be required for minimum spec (most game developers shoot for much lower minimum specs, and we're talking about very popular games like Mass Effect or Call of Duty; it's stupid to eliminate 70% of potential customers with obscenely high requirements). Cards in the current mid-range like the GTS 250 or Radeon 5750 will probably be all that's needed to run the game at relatively high settings with a good frame rate.
The bulk of worried posts in this thread could have been largely avoided by simply making a statement along the lines of "You WILL need at least CPU XX, X amount of RAM and card XXX from Nvidia or XXXX from ATI in order to run Ultra Mode: we are still shooting for a slightly lower minimum spec, but this is the guaranteed minimum spec required for Ultra Mode." This hasn't shown up yet, and so now we have people running around trying to shop for an unknown future, quite possibly spending money that they could have put to better use somewhere else, simply because the devs have not nailed down a guaranteed minimum spec for Ultra Mode and haven't posted that spec up in lights with arrows pointing to it for everyone to see.
So what is the absolute best card out there to run CoX with all the Ultra mode options on the highest settings?
If it matters, I have a 2.85GHz quad-core system with 8GB of RAM, 64-bit Windows Vista Ultimate, and a 1000W power supply, not to mention a fairly huge case, so the size of the card or the power it takes isn't a great concern.
My Mission Architect arcs:
Attack of the Toymenator - Arc # 207874
Attack of the Monsters of Legend - Arc # 82060
Visit Cerulean Shadow's Myspace page!
Beyond that, while they haven't specified a CPU/memory requirement for UM, Posi has listed a range of what cards they expect to do what level of UM, so it can be taken that the 9800 GT and its equivalents are the current minimum spec for UM. Also keep in mind that no one is being run off: if you can play now, you'll still be able to play under I17. UM is an *optional* feature. Anyway, until I17 at least leaves Closed Beta, I think you're expecting too much.
What I can't figure out is why I can run Crysis, Fallout 3, and Mass Effect 2 with all the bells and whistles on my GTX 260, but CoH is a lag fest if I turn on all the effects. And I have good net access.
I was thinking of getting the GeForce GTX 285 because Positron mentioned that one in his initial post in this thread, so thanks for the heads up re: DirectX 11. I guess money is an object to a point - I don't want to spend $700+ for the Radeon 5970 - but the Radeon 5870 or Nvidia GTX 480 look like possibilities.
Well first off, for the moment, SLI and Crossfire are stated not to work with UM, so until they do, a dual-GPU setup is less than optimal for that reason alone. Presumably they will get it working at some point, but we don't know when. Until then, single-GPU upgrades are the way to go.
I think you might have misread my statement about running off 70% of customers due to high system requirements: I was actually referring to why even the most recent cutting-edge games tend to have relatively low minimum system requirements. And as I stated, I expect the minimum demands of UM to be along the lines of what it would take to run a DirectX 10-only game (despite the fact that CoH/V actually uses OpenGL), which would be what was considered mid-range hardware circa 2007.
But my point is that even as an optional graphical upgrade, a guaranteed minimum spec should have been posted at least a month ago; all we've seen are "well, you can expect this sort of result with this..." statements, which are not exactly concrete information: I can throw a Radeon 5970 into an eight-year-old system with a Pentium 4 CPU and will not see anywhere near the same result as someone with the same card running with a Phenom II quad-core CPU. It's fine for the exact specification to still be in downward flux at this stage of development, but we should have been told a CPU, RAM and graphics spec that is guaranteed to meet the minimum requirements for UM; it gives them room to work on a less demanding spec, but it also would give their customers a firm idea of what they should be looking for if they are considering an upgrade to allow for UM. Honestly, I'm not concerned about my rig's ability to run UM at maximum settings (or any currently existing game), but not everyone has the money available to go out and get a Core i7 system with a Radeon 5870 or GTX 480, and that's why a guaranteed minimum spec should have been posted long before now.
OpenGL has evolved as well as DirectX, both taking advantage of the hardware available in Dx10/10.1/11 cards. Actually OpenGL had features found in Dx10 before there was a Dx10.
Apple announced their new i7 MacBook Pro computers today with the following video card info:
The graphics chip has been bumped to NVIDIA GeForce GT 330M, with a choice of graphics memory options. This replaces the NVIDIA 9400M and 9600M GT cards of the previous generation.
Thanks,
Buxley
"Don't cry because it's over, smile because it happened." -- Dr. Seuss
Any ideas/guesses as to how good the NVIDIA GeForce GT 330M would be at handling Ultra Mode?
However, I'd honestly stay away from the current crop of mobile Core i7s. The clock speeds on them are very low. Yes, they have a great Turbo Boost feature, but that only boosts the speed of the first two cores when multi-threaded applications aren't coming heavily into play: once you get into a program optimized for multiple threads, or into multi-tasking, any advantage from Turbo Boost disappears and you're left with a very slow quad-core processor (it's not difficult for a gamer to kill Turbo Boost's advantage: run the game, a web browser and a music program at the same time). I personally feel the mobile Core i5s are a much better value: higher default clock speeds on a hyper-threaded dual-core CPU give you the power of a high-speed quad core for a much lower price, plus they still have Turbo Boost for when you're not running anything demanding.