Let's Talk Graphics Cards..
I'm no technician, but I'm gonna take a flying leap and say: not well. High-quality real-time shadows are murderously difficult to render, and they seem to be the main component of Ultra Mode. A 6200, unless I'm ill-informed, is a rather elderly card now and probably won't stand much of a chance.
Character references! Artz! Whatnot!
Have a 6200 myself and I just know it isn't up to the task of Ultra Mode. Currently it'll run CoH fine, as long as I keep the shader quality on low (the low that's one up from off, that is!) and turn off water effects. If I turn those settings up one notch, it's really sluggish to play.
I think the card would melt with all the extra shadows and reflections of Ultra Mode. I've been looking at getting a GTS 250, but I'm getting a bit confuddled with everything I'm supposed to look for; a lot of sites conflict with one another. Might be better off just getting a new PC!
I'm afraid 'badly' is the answer. I'm assuming you have an AGP slot, so you're very limited in your choices.
Defiant 50's
Many and varied!
@Miss Chief
By the way, I have ATI right now - but would it be better to switch to Nvidia for GR? Something like two 9800s together? Or would two GTX be better?
@Golden Girl
City of Heroes comics and artwork
I have a GeForce 8800 GT. Just curious how you think that would hold up...
131430 Starfare: First Contact
178774 Tales of Croatoa: A Rose By Any Other Name ( 2009 MA Best In-Canon Arc ) ( 2009 Player Awards - Best Serious Arc )
By the way, I have ATI right now - but would it be better to switch to Nvidia for GR? Something like two 9800s together? Or would two GTX be better?
|
http://hardocp.com/article/2009/09/3...o_card_review/
http://hardocp.com/article/2009/09/2...o_card_review/
If you are really trying to future-proof, then until Nvidia can show that Fermi isn't just a pipe dream, the HD 5x00 series is the only real option.
And don't think that the problems currently affecting ATi card users are going to carry over to the upcoming engine improvements. The demonstration at HeroCon was run on Crossfired 4870 X2's... but according to the developers, even the RadeonHD 4850 1GB can handle just about everything Ultra-Mode does and maintain a playable frame rate. http://boards.cityofheroes.com/showp...5&postcount=25
So yes, those issues have finally been fixed.
***
Now, onto the original post. A 6200 hasn't got a prayer. If the RadeonHD 4850 is the baseline reference for the minimum card that can handle Ultra-Mode, you'd be looking at a GeForce GTX 260 or better.
So isn't Nvidia as good as ATI? And aren't double cards any better than single ones?
@Golden Girl
City of Heroes comics and artwork
So isn't Nvidia as good as ATI? And aren't double cards any better than single ones?
|
First things first. Nvidia's GTS / GTX cards use almost exactly the same architecture as the Geforce 8800 released in 2006. http://www.techarp.com/showarticle.aspx?artno=88&pgno=5 Now, don't get me wrong, the 8800 was an astoundingly good design when it launched, and it's proven to have extremely long legs. However, from a feature standpoint, the line-up is rather long in the tooth.
Basically, the only reason Nvidia's still selling cards is because of inertia. People buy Nvidia cards because Nvidia used to have the best cards. They used to have the best drivers. They used to have the best price-points.
However, that's all... used to
For the past several months Nvidia's been bleeding its bread-and-butter gamer market to AMD cards. Fact is, when it came time to actually buy, most gamers found that the RadeonHD 3x00 series, RadeonHD 4x00 series, and the new RadeonHD 5x00 series offered better performance at a better price point.
Nvidia's drivers used to be the baseline standard for good; now they're the baseline standard for awful. A good case in point is their 64-bit drivers. I've got a couple of Athlon64 systems, one with a Triple-SLI GTS 250 setup and another with an older factory-overclocked 7900 GT. I've gotten used to applications that work in 32-bit Linux, 32-bit XP, 32-bit Vista, and 32-bit Vista SP2 (Win7) featuring corrupted graphics, or no graphics at all, in the 64-bit versions of those OS's. On the surface this might not sound like such a big deal... till you realize that the most memory a 32-bit OS can address is about 4 gigs, and if you hand, say, 4 gigs of RAM to 32-bit Windows 7, it will only be able to use around 2.5GB of it. This means that on a 32-bit OS, the fastest practical memory configuration is a dual-channel 2-gig setup. Yes, that's a lot of memory, but really, if you're buying a new computer... why only 2 gigs?
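If it helps to see the arithmetic, here's a rough sketch. The "reserved for devices" figure below is purely an illustrative guess; the actual chunk carved out for video memory and other device I/O varies by system, which is why the usable number lands anywhere from roughly 2.5 to 3.5 gigs in practice:

```c
#include <stdio.h>

int main(void) {
    /* A 32-bit OS has 2^32 byte addresses in total: 4 GiB, full stop. */
    unsigned long long address_space = 1ULL << 32;

    /* Part of that range is claimed by the video card's memory aperture and
       other device I/O, not by RAM. ~1 GiB here is an illustrative
       assumption; the real figure depends on the hardware in the box. */
    unsigned long long reserved_for_devices = 1ULL << 30;

    printf("Total addressable : %llu MiB\n", address_space >> 20);
    printf("Left over for RAM : %llu MiB\n",
           (address_space - reserved_for_devices) >> 20);
    return 0;
}
```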
When I look at my Radeon cards though, going back to a Radeon 9600, I've yet to run into a graphics problem or corruption caused by the drivers across any of the 64-bit OS's I've used.
There are also other market practicalities to consider.
First off, Nvidia's pretty much abandoned multi-platform SLI.
If you want SLI support on AMD's Socket AM3 platform, you're pretty much stuck with buying an MSI board:
http://www.pricewatch.com/search?q=S...n=Motherboards
Newegg Socket AM3 + SLI... and you'd top out at Triple SLI.
If you wanted, say, Quad-Crossfire though, an Asus Socket AM3 board is under $190... and an ASRock Intel Core i7 motherboard is a little over or under $200 (that one's also limited to Triple SLI).
Motherboard support isn't the only place Nvidia plays a less than pleasant game.
Right now AMD, Apple, Intel, and other vendors are pushing game developers to use OpenCL for hardware-accelerated physics: http://www.khronos.org/#tab-opencl
The only other vendor with a hardware physics solution is Nvidia, and they offer PhysX. As gamers found out with recent Nvidia driver updates, if you aren't actually using an Nvidia card to render graphics, PhysX is turned off. Yes, right now that's only a problem for gamers using an Nvidia card for PhysX and an ATi card to render graphics. However, Intel's looking to enter the add-in board market with the mid-range Larrabee, slated for next year... and that could get very interesting.
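For anyone curious what "vendor-neutral" buys you in practice, here's a minimal sketch using the stock OpenCL C API (not from any game, just an illustration) that asks every installed OpenCL driver what GPUs it exposes. The same code runs unchanged against AMD, Nvidia, or Intel implementations, which is exactly what a proprietary lock-in like PhysX can't offer:

```c
#include <stdio.h>
#include <CL/cl.h>   /* build with: gcc clquery.c -lOpenCL */

int main(void) {
    cl_platform_id platforms[8];
    cl_uint platform_count = 0;

    /* Ask the OpenCL runtime which vendor implementations are installed. */
    clGetPlatformIDs(8, platforms, &platform_count);

    for (cl_uint i = 0; i < platform_count; ++i) {
        char name[256] = "unknown";
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof name, name, NULL);

        /* Count the GPU devices each platform exposes -- the same call works
           whether the driver underneath is AMD's, Nvidia's, or Intel's. */
        cl_uint gpu_count = 0;
        clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU, 0, NULL, &gpu_count);

        printf("Platform %u: %s -- %u GPU device(s)\n", i, name, gpu_count);
    }
    return 0;
}
```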
Nvidia's reputation for slimy behavior also extends to game development. Nvidia was caught pretty much paying off Rocksteady to not allow some filtering effects, such as anti-aliasing, on non-Nvidia graphics cards. Again, this pretty much only affected ATi users, since no current Intel GPU owner was likely trying to run Batman: Arkham Asylum.
These artificial knock-downs behind the scenes only contribute to a poor picture of Nvidia. Investors were left more than a little shaky after Nvidia's GPU Technology Conference: right now the Fermi card doesn't actually exist, and Nvidia reportedly could not even show the architecture working in IKOS boxes. Couple that with extremely bad relations with Original Design Manufacturers (ODMs) and Original Equipment Manufacturers (OEMs) over the exploding-laptop fiascos...
And it's actually questionable whether or not Nvidia is going to exist as the company we know it as next year.
***
Now, this all does stand to change.
Right now ATi cards are the clear choice, on price, on performance, on driver support, on everything that matters.
However, it wasn't that long ago that ATi was pushing out the RadeonHD 2600 series... which was overly warm... while Nvidia was pumping out its first of many 8800 derivatives... which romped all over the Radeon cards.
It is very possible that another company, maybe Apple, could come in and buy Nvidia up and put the company back on track. It is possible that Nvidia could actually deliver a successor to Fermi that makes good on gaming and computational performance.
It is possible that AMD stumbles, that they follow up the RadeonHD 5x00 series with a Freedom Pug. It is possible that Intel's Larrabee architecture will be more price- and performance-competitive than initial reports suggest.
In the graphics realm, the position of each vendor can change really quickly.
Yes, AMD is the best... right now.
That doesn't mean they'll be the best in 6 months... or even 4 months.
That doesn't really help very much - I have to decide quite soon, so I can make sure I get my new computer by Christmas.
@Golden Girl
City of Heroes comics and artwork
I have a GeForce 8800 GT. Just curious how you think that would hold up...
|
However, "should work" and "should work WELL" are two different things entirely. At this point, the only way we're going to be able to tell is if the devs tell us how well it'll run on an 8800, or we wait for the beta.
@FloatingFatMan
Do not go gentle into that good night.
Rage, rage against the dying of the light.
I think the card would melt with all the extra shadows and reflections of Ultra Mode. I've been looking at getting a GTS 250, but I'm getting a bit confuddled with everything I'm supposed to look for; a lot of sites conflict with one another. Might be better off just getting a new PC!
|
------->"Sic Semper Tyrannis"<-------
Take a look at Tom's hardware for reviews of the cards in your price bracket.
They are very in-depth, but the interesting stuff is at the end (if you're not interested in the techie detail). Also take a look at the reviews for higher-end cards of the same type; they inevitably draw comparisons with the lesser cards, giving you a better idea of what to go for.
I splashed out on a GTX260 when it arrived. At the time it gave the biggest bang for my buck, and as they come down in price it becomes increasingly attractive to add another card in SLI mode. So far it's not disappointed.
Oh, and remember that it's not just the graphics card that can slow you down. The CPU is a major component in getting the most out of your machine.
General rule of thumb: set your budget, stick to it, and don't faff around waiting for the next big thing to arrive, because it will, about 3 weeks after you buy something.
Union: @Ban-Sidhe
Samuel_Tow is the only poster that makes me want to punch him in the head more often when I'm agreeing with him than when I'm disagreeing with him.
|
SLI/Crossfire is a stupid waste of money and doesn't give anywhere near the performance boost you'd expect from running multiple video cards.
Better to get one of the multi-GPU video cards and save on electricity.
@FloatingFatMan
Do not go gentle into that good night.
Rage, rage against the dying of the light.
I'm not running in SLI (only have one card), so I don't know how much of a difference it makes. There are loads of people on the forums with SLI setups, though. FFM has 8800s, if I remember correctly.
GR's requirements, so far, seem to be geared to making better use of OpenGL features that have been in the spec for some time now (dread to think how the Macs are going to deal with it). SLI will probably make a difference, but I don't know.
There were ruminations that NVIDIA would be supporting COX in 3D mode with their glasses as well. So that might be VERY cool, but will probably lay waste to many gaming rigs across the world. So a high-end NVIDIA card might be worth it if you want the gimmicky 3D. I think they said March 2010, which is around the time GR is expected to land anyway.
Union: @Ban-Sidhe
I'm slightly guessing, but I suspect the 8/9000 series NVidia cards and their rough equivalents should be able to handle whatever Ultra Mode throws at them.
I'll try to find out what my current PC is running (I haven't a clue, offhand) since I know that handles real-time shadows in LotRO. That may give some kind of clue as to what we need.
I can definitely say that a 6800 is probably too low-spec, just for the record.
Disclaimer: The above may be humorous, or at least may be an attempt at humour. Try reading it that way.
Posts are OOC unless noted to be IC, or in an IC thread.
It would be SO much easier to work it out if CoH used DirectX. The 8800 can handle DX10 fine, and runs Crysis at 1440x900 at about 40ish fps on my PC. Alas, I have no OpenGL 3.0 games at all, so have no idea how it performs on the card...
Hell, I don't even know what games use OpenGL 3.0. Anyone got any ideas?
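If anyone at least wants to check what OpenGL version their driver claims to support (which says nothing about how fast it runs, only what features are exposed), here's a minimal sketch using freeglut to get a context up. It's purely illustrative, not anything CoH itself does:

```c
#include <stdio.h>
#include <GL/glut.h>   /* build with: gcc glcheck.c -lglut -lGL */

int main(int argc, char **argv) {
    /* glGetString only works once a GL context exists, so create a
       throwaway window first. */
    glutInit(&argc, argv);
    glutCreateWindow("GL version check");

    printf("Vendor   : %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer : %s\n", (const char *)glGetString(GL_RENDERER));
    printf("Version  : %s\n", (const char *)glGetString(GL_VERSION));
    return 0;
}
```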
@FloatingFatMan
Do not go gentle into that good night.
Rage, rage against the dying of the light.
General rule of thumb: set your budget, stick to it, and don't faff around waiting for the next big thing to arrive, because it will, about 3 weeks after you buy something.
|
It's always hard to resist the new shiny.
------->"Sic Semper Tyrannis"<-------
It is soooo hard to resist the new stuff. But like I said, establish your budget and stick to it. Shop around, get the most bang for your buck and don't bemoan the fact that there is a more expensive card. You can wait for the darn thing to come down in price, but when it reaches your price bracket it'll be out of date and the new spangly stuff will be vying for your attention. So it's not worth the hassle. Any card over £100 will probably be able to run Ultra mode, and most modern cards under £100 will probably be playable too.
Noticed FFM has studiously missed my comment about him having SLI'd 8800s; that, or I misinterpreted his views on SLI being a complete waste of time. Obviously I was wrong.
Union: @Ban-Sidhe
I currently run a single 512MB Radeon HD 4850. I will probably have to upgrade, but I was wondering which of the new ATI cards are more energy efficient, as I only have a Corsair 450W PSU. I'm able to Crossfire, but I'm not sure if my PSU can handle that.
I currently run a single 512MB Radeon HD 4850. I will probably have to upgrade, but I was wondering which of the new ATI cards are more energy efficient, as I only have a Corsair 450W PSU. I'm able to Crossfire, but I'm not sure if my PSU can handle that.
|
Anyways, the HD 5x00 series generally has the same thermal and power envelope as the previous 4x00 card at each branding tier, so you should theoretically be able to manage a 5850 on the same PSU.
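If you want a rough sanity check before committing, the sketch below just adds up ballpark draw figures against the PSU rating. Every wattage number in it is a placeholder I made up for illustration, so substitute the real TDP of whatever card you're eyeing and your own CPU:

```c
#include <stdio.h>

int main(void) {
    /* All of these wattages are illustrative placeholders, not measured
       figures -- look up the actual TDP of your own parts. */
    int psu_watts      = 450;  /* the Corsair 450W in question           */
    int cpu_watts      = 125;  /* placeholder quad-core under load       */
    int gpu_watts      = 170;  /* placeholder single mid/high-end card   */
    int rest_of_system = 75;   /* board, drives, fans, RAM (rough guess) */

    int load = cpu_watts + gpu_watts + rest_of_system;

    printf("Estimated load : %d W of %d W\n", load, psu_watts);
    printf("Headroom       : %d W\n", psu_watts - load);

    /* Rule of thumb: leave a healthy margin. A second card in Crossfire
       roughly doubles the GPU line, which is where a 450W unit gets tight. */
    return 0;
}
```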
I read, around the time of the HeroCon news, that the ATI 4xxx series cards would run Ultra Mode with just a couple of the features disabled. I can't be any more specific than that, and I can't find the direct quote at the moment.
@Golden Girl
City of Heroes comics and artwork
.. For Going Rogue.
At the moment, I'm running off an Nvidia GeForce 6200:
- DirectX 9.0
- 64-bit
- AGP x8
I'm just curious as to how well this will perform running Ultra Mode in GR?