Ultra-mode video card shopping guide
One point also: graphics cards drop in price quickly when new cards come out. GR is slated for Q2 next year, roughly six months away; by that time the ATI 5xxx series will have had at least one price drop, maybe more. Nvidia is also supposed to come out with its 300 series of cards, and if so, the high-end 295 will be reduced in price.
And also, the new GPU can't do much for you if your CPU can't feed it fast enough.
Fluffy Bunny 1 Person SG
Rabid Bunny 1 Person VG
Both on Pinnacle
Hobbit's Hole 1 Person SG
Spider's Web 1 Person VG
Both on Freedom
A note to all Nvidia 8800 owners: do not weep! If your 8800 card has the G92 core, then it is identical to the 9800 range (which are nothing but renamed 8800s), and thus you will be able to run Ultra Mode (though apparently at reduced quality settings).
Seeing as my old 8800GT can throw things like Crysis around at 50+ fps, and Dragon Age: Origins at max quality at 50+ fps, I'm not worried about reduced quality settings...
@FloatingFatMan
Do not go gentle into that good night.
Rage, rage against the dying of the light.
<noob>
So, is it worth me buying a new PC with a GTX 295 or waiting for the 300s? I'd really like one sooner rather than later, but if I can get something better for a similar (or lower) price when they come out, I may just wait.
Also - Does anyone know how well CoH runs on 64bit Windows 7? I'm considering moving to 64bit, or possibly dual booting between 64bit for some games and 32bit for general use.
</noob>
[CENTER]Euro side: [B]@Orion Star[/B] & [B]@Orions Star[/B][/CENTER]
You may have to consider how many ohms your PSU puts out, depending on what card you get. I bought an Nvidia 280 and a PSU with 300 more watts than it recommended. I installed everything, powered it up, and it kept using on-board video. Typically it would use the PCIe x16 slot automatically with the card in place. The lights on the card came on, and I thought to myself how quiet it was 'cause I couldn't hear the fans.
Found that I needed 42 ohms and I had 35 ohms on the 12V rail. So I had to take back the PSU and find another with enough ohms and watts. Just something you should think about before buying, like I didn't. |
An ohm is a measure of resistance or impedance; you're more likely to run across that when selecting speakers for a home theater system than when looking at power supply ratings. What a PSU label actually quotes for each rail is amperage (amps), and volts times amps gives you watts.
Also be aware that many higher-end power supplies have a PAIR of 12V rails, to try to distribute the load. That can be really handy for folks with a lot of hard drives, which draw high power when spinning up. I'm not sure in that case if you can (presuming the card has two 12V connectors) connect both rails to the video card in order to provide it with enough power, or if that would be a 'bad idea'.
Used to be I had to worry about the power drawn by the CPU, and keeping it cool. These days that's small potatoes compared to the power and cooling needed by the video card. Of course, most video cards currently represent a massively parallel computing capacity that would put many older mainframe systems to shame, so I guess that's not entirely unexpected.
(Times like this I shake my head and wonder how the hell I ever got by on an 8086, 640K of memory, and a 20MB hard disk. Then again, my little Sansa media player probably has more computing power than that old Compaq Deskpro did. It's enough to make a geek feel ancient, it is...)
My GeForce 8800GT just died and I replaced it with a Radeon HD 4670. I know the game is currently more geared towards Nvidia cards. Any word on whether or not the new Ultra Mode will be more ATI friendly?
Yes, that was me. I'm not quite ready to order a funerary floral arrangement for Nvidia, but they are in a very nasty position.
And that's where Taiwan Semiconductor Manufacturing Company's bungling becomes important... The new 40 nm resolution manufacturing process that both Nvidia and ATI are using from TSMC has turned out to be more problematic than advertised. ATI was getting very low yields of functional chips per batch of silicon. Fermi, being half-again more complex than Cypress, is even more vulnerable to random fatal defects. Reports were that the original test production runs resulted in 1.7% functional chips. Those that functioned turned out to do so at much lower clock rates than expected. If you have a chip with massive capabilities that works at a relatively low frequency, it may not outperform a less elaborate chip that can run faster... Nvidia is in better shape than ATI to take a(nother) financial hit. The question is what, if anything, the company will be able to do *afterwards*? So yeah, Nvidia? Deep trouble. |
Great info BTW, thanks for taking the time to type that up. For the uninitiated, the material for silicon chips is basically a giant crystalline structure that is 'grown' by being pulled out of molten doped silicon (http://en.wikipedia.org/wiki/Czochralski_process), and there's pretty much a limit to the diameter of the stuff (currently around 30cm). That's then sliced into thin polished 'wafers' that are the raw material used to produce the chips. The smaller the traces on the chip (currently measured in nanometers), the smaller each chip will be (and also the less power it consumes, heat it generates, etc.). The more transistors on the chip, the larger it will be. And of course, the smaller the scale you are trying to work at (e.g. 40nm vs 55nm), the more precise everything has to be and the smaller the margin of error in order to produce working chips.
The 'per wafer' fab expenses and fab time tend to be relatively constant, so the more (working!) chips you can get per wafer (the yield), the cheaper your production cost per chip, and the higher your potential output. So the drive to get to a viable 40nm process is huge, but it doesn't pay off price-wise (and capacity-wise) until your process is reliable enough to get more working chips per wafer, especially when you consider that you have to test each chip and don't get paid for the ones that don't work. 40nm has thus far 'backfired' because the yields have been so low that the costs have been higher and the output lower, something nobody is happy about.
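If it helps to see why yield matters so much for price, here's a back-of-the-envelope Python sketch with completely made-up wafer numbers (real costs and die counts vary a lot):
[CODE]
# Rough yield economics (all numbers invented for illustration).
def cost_per_good_chip(wafer_cost, dies_per_wafer, yield_rate):
    """Cost of each *working* chip when the whole wafer is paid for up front."""
    good_dies = dies_per_wafer * yield_rate
    return wafer_cost / good_dies

# Same hypothetical wafer, two very different yields:
print(cost_per_good_chip(5000, 400, 0.90))  # mature-process yield -> ~$13.89 per working chip
print(cost_per_good_chip(5000, 400, 0.30))  # struggling new-process yield -> ~$41.67 per working chip
[/CODE]
Same wafer cost, same die count, but the bad yield roughly triples the cost of every chip you can actually sell.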
Both Nvidia and ATI need 40nm process chips in order to keep cards inside reasonable limits for power consumption and heat output, and the new stuff has all been designed around that, so it's likely very impractical for them to 'punt' and just use 55nm process stuff until the bugs are worked out of the other fab lines.
OTOH, the older 55nm stuff is reliable fab-wise, and since that process is mature, the yields per wafer are pretty good, meaning it's probably a lot cheaper (until those 40nm yields get better), especially if you've managed to pay off your investment in the design of those chips.
The result of that is that I'd personally expect both companies to hold the prices of the newer 40nm-process stuff a lot higher, especially if there's not currently capacity to produce those parts in quantity anyway (no point lowering the price to sell more if you don't have the 'more' to sell). Meanwhile, I'd expect the prices of the older stuff to stay low and represent a majority of their sales.
So, is it worth me buying a new PC with a GTX 295 or waiting for the 300s? I'd really like one sooner rather than later, but if I can get something better for a similar (or lower) price when they come out, I may just wait. |
Thing is though, ATI is now the official graphics partner for Paragon Studios, rather than Nvidia. The only hesitance I have in recommending someone go with an ATI board is that we don't yet have confirmation that the regular graphics engine (the standard one, not the Ultra Mode) will have the ATI-bugs in it fixed with GR.
If those remain in the regular version, you have the possibility of someone "upgrading" to an ATI board on a budget and chancing to put themselves just below what they personally consider playable for Ultra Mode and stuck with ATI graphical oddities in regular mode.
If you have the money to spend on a GTX 295 you're shopping for something in the class well above that grey area. I'd go with a Radeon HD 5850 or 5870, depending on your preference.
As to waiting, the most reliable estimates I've seen of Fermi/G300 hitting actual market shelves is April/May of 2010. Assuming Fermi turns out to be a competent offering, then ATI boards might drop a bit in price. I don't see a reason for them to do so any earlier than that.
One point also: graphics cards drop in price quickly when new cards come out. GR is slated for Q2 next year, roughly six months away; by that time the ATI 5xxx series will have had at least one price drop, maybe more. Nvidia is also supposed to come out with its 300 series of cards, and if so, the high-end 295 will be reduced in price.
And also, the new GPU can't do much for you if your CPU can't feed it fast enough. |
As to your last point.. very very valid. The overall system performance is going to tend to be constrained by the first bottleneck. Improvements outside of that area will result in very little if any change in performance. Improvements TO the bottleneck will have a very direct and measurable result, right up until something else becomes the bottleneck. This is basic "Theory of Constraints". The tricky part is figuring out where the constraint is, especially when it's not always easy to instrument things like how busy the GPU is.
My general rules of thumb (with a rough check sketch after the list) are:
1) If most of your memory is used when CoX is running, you ought to close down other stuff that's running, or get more RAM (up to the limit of what your OS supports, which is just over 3 GB for most 32-bit OSes).
2) If at least one CPU core is at or very near 100% when CoX is running, then getting a faster CPU will probably help.
3) If neither CPU nor memory is maxed when CoX is running, then faster video will probably help.
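If you want to eyeball that while the game is up, here's a rough Python sketch using the third-party psutil package; the thresholds are my own guesses, not official figures:
[CODE]
# Rough bottleneck check to run while CoX is running.
# Requires psutil (pip install psutil); thresholds are guesses, tweak to taste.
import psutil

mem = psutil.virtual_memory()
per_core = psutil.cpu_percent(interval=5, percpu=True)  # sample each core for 5 seconds

if mem.percent > 90:
    print("Memory is nearly full -> close other programs or add RAM.")
elif max(per_core) > 95:
    print("At least one core is pegged -> a faster CPU will probably help.")
else:
    print("CPU and memory have headroom -> a faster video card will probably help.")
[/CODE]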
Could you guys with a lot of video card know-how and experience drop by this thread and give me a hand? I'm trying to decide between an HD 5770 and a GTX 260. Any insight is appreciated.
www.SaveCOH.com: Calls to Action and Events Calendar
This is what 3700 heroes in a single zone looks like.
Thanks to @EnsonsDeath for the GVE code that made me VIP again!
Wanted to catch a quick comment on something I didn't see addressed:
So if TSMC has so badly messed up with both AMD/ATI and Nvidia, it sounds a lot more like a situation where the first of them to come to agreements with a new Fab partner might be the winner in this whole debacle. Or is TSMC the only game in town for cranking out silicon in the 40nm process? |
Intel
Chartered Semiconductors
Taiwan Semiconductor Manufacturing Company (TSMC)
GlobalFoundries
IBM
UMC
Well, Intel's out of the question. They are not about to produce Nvidia or ATi graphics chips when they have their sights set on the add-in market with Larrabee.
In a similar manner, GlobalFoundries is partly out of the question. Most of their business is taken up by AMD central processors. They would need to increase their capacity in order to also handle producing ATi chips.
TSMC is also out of the question, which is why AMD is having trouble keeping 58xx cards in stock, and they are partially responsible for Nvidia's Fermi being put to the side.
Chartered might be both out of the question and the solution for AMD. They've been bought by ATIC, which also owns GlobalFoundries: http://www.charteredsemi.com/newsroo.../20091104.aspx However, Chartered already had its own list of customers it was producing chips for, so there's not a lot of headroom.
That leaves the United Microelectronics Corporation and IBM as the only foundries who have the capacity, and the technology, to possibly fill the orders for ATi and Nvidia.
UMC is already working for Nvidia, and was partially responsible for helping to deliver the G92 graphics processors. However, UMC is in the same position as TSMC... they are only getting around to 40nm support.
That leaves IBM as one of the few foundries that can possibly deliver on 40nm in large quantities. I suspect that AMD is probably trying to coax a deal out of IBM for that reason.
That depends on how much money you have and how much you value your sanity and hearing. In operation, reviewers describe the sound coming from dual-chip boards as "uncomfortable". If you've got the money to watercool the board or are someone who never plays without headphones on, that could be different.
Thing is though, ATI is now the official graphics partner for Paragon Studios, rather than Nvidia. The only hesitance I have in recommending someone go with an ATI board is that we don't yet have confirmation that the regular graphics engine (the standard one, not the Ultra Mode) will have the ATI-bugs in it fixed with GR. If those remain in the regular version, you have the possibility of someone "upgrading" to an ATI board on a budget and chancing to put themselves just below what they personally consider playable for Ultra Mode and stuck with ATI graphical oddities in regular mode. If you have the money to spend on a GTX 295 you're shopping for something in the class well above that grey area. I'd go with a Radeon HD 5850 or 5870, depending on your preference. As to waiting, the most reliable estimates I've seen of Fermi/G300 hitting actual market shelves is April/May of 2010. Assuming Fermi turns out to be a competent offering, then ATI boards might drop a bit in price. I don't see a reason for them to do so any earlier than that. |
Is the GTX 285 2048mb a double chip board too? If not I'll go for that one - The noise of my current machine is one of my main reasons for upgrading.
Thanks
[CENTER]Euro side: [B]@Orion Star[/B] & [B]@Orions Star[/B][/CENTER]
Waiting 'till April is a bit too long, to be honest. Swapping to an ATI card isn't an option really, as I'm a huge fan of 3D Vision.
Is the GTX 285 2048mb a double chip board too? If not I'll go for that one - The noise of my current machine is one of my main reasons for upgrading. Thanks |
And no, the GTX 285 is a single chip board. It's also bloody loud.
I'll throw a few cents in for everyone. As of my last check, the 5800 cards from ATI were generally not available. From what the devs have said, a 4800-series card from ATI is just about where the line is drawn as to the grey area: better cards will handle Ultra Mode better, and the 4870, which can handle the current graphics well, will be able to handle some but not all of the graphical updates.
This means that if you are looking to upgrade, for it to be a reasonable improvement you'll want to be moving from a 4870 to a 5800, or from something in the 8800 series to something 4870 or better. If you are running something lower on the totem pole than an 8800, I'd dare say that any modern card would be a good idea, but do your research first.
If you are upgrading to something in the x8xx range (either brand), you are more than likely going to need a power supply with a PCIe connector. A single +12V rail in the 500-600W range is your best bet. Double rails are OK but less desirable; more than that and you should be wary. You should be able to look at the power sticker on the device to see how many rails there are: the number of times you see +12V in the main power table is the number of rails. I know it sounds confusing, but believe in yourself and you can puzzle out what that sticker is telling you. Voltage times amperage equals watts, so under the +12V rail you'll see an amperage of, say, 30 amps: 12 * 30 = 360W. As always, do your research, read reviews, read customer reviews; it takes time, but you won't be shortchanged or sold a lemon.
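Here's that arithmetic as a tiny Python sketch, with hypothetical sticker numbers (note that many PSUs also print a combined +12V figure that's lower than the simple sum, so trust the label over the math):
[CODE]
# Watts = volts x amps. Rail ratings below are made up for illustration.
rails_12v_amps = [30, 18]                    # a hypothetical dual-rail PSU: 30A and 18A on +12V
watts_per_rail = [12 * amps for amps in rails_12v_amps]

print(watts_per_rail)                        # [360, 216]
print(sum(watts_per_rail))                   # 576W of +12V capacity before any combined-rating cap
[/CODE]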
hope this helps you with your purchases.
Roxy On DA...Finally!
Waiting 'till April is a bit too long, to be honest. Swapping to an ATI card isn't an option really, as I'm a huge fan of 3D Vision.
Is the GTX 285 2048mb a double chip board too? If not I'll go for that one - The noise of my current machine is one of my main reasons for upgrading. Thanks |
While the GTX 285 will be quieter than the 295, you're still going to be quite aware that it's there in operation. Again, if you've got the money to go to a water cooler, then the noise will be less of an issue with the GTX 295 and it's the most powerful Nvidia offering you can find right now. If water cooling isn't an option, you might look at a GTX 285 and something like this. There is a GTX 285 compatible version of it. (Do read the review of the other version though.)
Before buying one of those, be aware of several things. Firstly, it's going to make a large graphics card even larger. Break out a ruler and measure if it will (A) fit in your case and (B) you're willing to give up that much space. Secondly, it's not going to vent the heated air out the back of your case, it's going to spread it ambiently in your system. While not ideal, you can deal with that if you've got decent exhaust fan ventilation to begin with. If not, you really should and need to be looking at some fan modifications anyway*. Thirdly, you may have to hunt around for one of these things and it's going to cost you around $70 on top of the card. That's still less than a stock GTX 295 is going to cost though. Finally, you have to feel confident in your ability to operate a screwdriver and cleaning cloth. However, the MX-2 that comes pre-applied on the cooler for you is quite spiffy thermal interface material and you won't have to worry about spreading it correctly (just seat it right the first time you put it on the board).
Thermalright also sells a video card cooler, but from reviews it performs less efficiently than the Accelero, is more sprawling in your case, and is even more "do it yourself" in installation.
(*If you're looking for good, quiet fans, the various Noiseblocker Multiframe models are awesome in operation. They'll cost you about $25 apiece, but you literally can't get better ones, you would still be under the cost of a single GTX 295, and the manufacturer gives a Mean Time Between Failures for the bearing of about 20 years =P (no, really).)
One other thing that I'd love to know..
If indeed ATI is becoming the official 'logo-on-the-splash-screen' video vendor, then is there any thought being offered to support for ATI's "Eyefinity" multi-display feature?
For those not familiar, here's a review; the video in particular is very enlightening:
http://hardocp.com/article/2009/09/2...nology_review/
The prospect of something like, say, 3 widescreens (perhaps even in portrait mode?) all at once in a 'wrap-around' configuration is, well... drool inspiring.
Can you say 'immersive'? I knew you could.
*head tilts*
Since I don't have an HD 5x00 card on hand, and since my old Matrox triple-head setup is sitting somewhere in a dumpster, I'm not actually able to test Eyefinity. I am pretty sure from reading the documentation that the game itself doesn't have to explicitly support the technology.
Theoretically, Eyefinity resolutions are specified at the driver level and work across all versions of the Catalyst driver (Linux, NT5, and NT6). So you should be able to set Eyefinity resolutions with the current game engine.
For some games though, you might need the widescreen fixer : http://imk.cx/pc/widescreenfixer/
***
Edit: also, bezel management for Eyefinity should be arriving sometime next year: http://hardocp.com/news/2009/12/01/e...are_next_year/
There's also supposed to be an "Eyefinity Edition" of some 5xxx cards coming out "later". We don't know which cards will be built to the specification, but at least the 5870 will. The Eyefinity Edition will have 6 mini-Displayport jacks instead of 2 DVI, 1 Displayport, and 1 HDMI connectors like the other 5xxx cards. Presumably this means you can run 6 monitors off of one card, or twelve in crossfire. Incidentally, the 6 mini-Displayports also mean that second-slot venting on the card goes all the way across instead of being blocked by the DVI ports; letting the card breathe and cool itself better.
Given the trouble they've had getting standard 58xxs to market, I expect "later" for the Eyefinity Edition will be significantly later.
Everything I've seen from AMD says that the 5750, 5770, 5850, and 5870 will all support Eyefinity. Albeit, for most of those you would be limited to 3 screens max (2 DVI + 1 DisplayPort), at least until we see the 6-DisplayPort model(s?) arrive.
Yeah, my understanding is that it looks like 'one monitor' to the game, but I've no idea if the game is going to properly present me with the choice of a 5760x1080 or 5760x1200 'display', nor if the game engine itself is able to deal with calculating for that many pixels. I mean, 5760x1200 is well beyond 'widescreen' or 'landscape' and well into the land of 'panoramic'.
Even trickier would be to see if it would allow me to do 3240x1920 or 3600x1920 if you have monitors that can be flipped to 'portrait' mode.
And yeah, those old Matrox cards were the bomb; nice to see the technology resurfacing in a way.
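In case it helps anyone sanity-check where those combined resolutions come from, here's a tiny Python sketch (just multiplying panel dimensions, nothing game-specific; the panel sizes assumed are the usual 1920x1080 and 1920x1200):
[CODE]
# Combined desktop size for a grid of identical panels.
def grid(cols, rows, panel_w, panel_h, portrait=False):
    if portrait:                      # rotating each panel swaps its width and height
        panel_w, panel_h = panel_h, panel_w
    return cols * panel_w, rows * panel_h

print(grid(3, 1, 1920, 1080))                 # (5760, 1080) landscape
print(grid(3, 1, 1920, 1200))                 # (5760, 1200) landscape
print(grid(3, 1, 1920, 1080, portrait=True))  # (3240, 1920) portrait
print(grid(3, 1, 1920, 1200, portrait=True))  # (3600, 1920) portrait
[/CODE]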
Everything I've seen from AMD says that the 5750, 5770, 5850, and 5870 will all support Eyefinity. Albeit, for most of those you would be limited to 3 screens max (2 DVI + 1 DisplayPort), at least until we see the 6-DisplayPort model(s?) arrive.
Yeah, my understanding is that it looks like 'one monitor' to the game, but I've no idea if the game is going to properly present me with the choice of a 5760x1080 or 5760x1200 'display', nor if the game engine itself is able to deal with calculating for that many pixels. I mean, 5760x1200 is well beyond 'widescreen' or 'landscape' and well into the land of 'panoramic'. Even trickier would be to see if it would allow me to do 3240x1920 or 3600x1920 if you have monitors that can be flipped to 'portrait' mode. And yeah, those old Matrox cards were the bomb; nice to see the technology resurfacing in a way. |
As far as Eyefinity is concerned, until someone definitively states that CoX is working with it, I would presume no. From everything I've seen so far, part of Ultra's upgrade is to get in line with proper device driver support so things like Eyefinity and other additions won't have to be explicitly coded for.
Besides, the maximum field of view of CoX has been reduced since its inception, so extra wide views do NOT get you any more peripheral vision, it actually just enlarges the graphics to match the maximum allowed width. This means extra wide screens actually cut down on the vertical field of view in CoX. I would guess they did this for PvP to ensure a 'level' playing field.
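To put a rough number on that vertical squeeze, here's a small Python sketch; the 90-degree horizontal cap below is a placeholder for illustration, not the game's actual FOV limit:
[CODE]
# With a capped horizontal FOV, a wider aspect ratio only shrinks the vertical FOV.
import math

def vertical_fov(h_fov_deg, width, height):
    h = math.radians(h_fov_deg)
    return math.degrees(2 * math.atan(math.tan(h / 2) * height / width))

print(vertical_fov(90, 1920, 1080))   # single 16:9 screen -> ~58.7 degrees
print(vertical_fov(90, 5760, 1080))   # 3x1 landscape      -> ~21.2 degrees
print(vertical_fov(90, 3240, 1920))   # 3x1 portrait       -> ~61.3 degrees
[/CODE]
Which lines up with the portrait suggestion a couple of posts down.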
As far as Eyefinity is concerned, until someone definitively states that CoX is working with it, I would presume no. From everything I've seen so far, part of Ultra's upgrade is to get in line with proper device driver support so things like Eyefinity and other additions won't have to be explicitly coded for.
|
Besides, the maximum field of view of CoX has been reduced since its inception, so extra wide views do NOT get you any more peripheral vision, it actually just enlarges the graphics to match the maximum allowed width. This means extra wide screens actually cut down on the vertical field of view in CoX. I would guess they did this for PvP to ensure a 'level' playing field. |
This means your best bet with an Eyefinity rig would be a three-across portrait or three-by-two portrait setup, and you'd want portrait or pivot monitors.
[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]
In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)
Is there a good forum section to ask for help in putting a system together? I am looking to get something new, with a price point under $950 delivered.
Intel and Nvidia preferred. No monitor needed. Win 7 as the OS.
Is there a good forum section to ask for help in putting a system together? I am looking to get something new, with a price point under $950 delivered.
Intel and Nvidia preferred. No monitor needed. Win 7 as the OS. |
You could ask in the Technical Issues and Bugs section of the Forum. You could also go to any Father Xmas post and click the links in the sig for builds that bracket what you are looking for ($600 and $1200). Also in his sig is a thread with suggestions for how to judge components and explanations for why he picked the components he did in the current versions of the two builds.
[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]
In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)
http://www.hardocp.com/article/2009/..._card_review/1
has a good review of AMD's ATI Radeon HD 5770 and compares it against the NVIDIA GeForce GTX 260 and an ATI Radeon HD 4870.
Some of the results were interesting.
"The Bottom Line
If you currently own an ATI Radeon HD 4870 or a NVIDIA GeForce GTX 260, don’t look for a huge gameplay improvement in today’s games, it just isn’t there right now. But if you are currently in the market for something new, supporting forward looking DX11 games, and the ability to run triple-display gaming at an affordable price, then the PowerColor HD5770 is where you want to be looking. The PowerColor HD5770 provides an excellent gameplay experience for the price of $165 with the future in mind.
"
from the article.