Ultra-mode video card shopping guide


5th_Player

 

Posted

Here's to hoping my system can handle it :O


 

Posted

Quote:
Originally Posted by Zikar View Post
I wonder what my GTS 250 (1gig) will be able to handle...
A GTS 250 is basically a die-shrunk version of the 9800, which ranks about the same as a RadeonHD 4850 in real-world performance. That is, again, about the starting point specified during HeroCon for Ultra Mode graphics. If you check out comparisons of graphics cards, like this one: http://hardocp.com/article/2009/10/1..._5750_review/9 , you'll see that the GTS 250 plays in about the same price bracket as the RadeonHD 5750, but the two tend to trade off performance in existing games. Although this does seem to be more a case of deliberate code sabotage in games built under Nvidia's "The Way It's Meant to Be Played" program, e.g. Borderlands, Batman: Arkham Asylum, and Resident Evil 5.

Off the cuff then, I'd say you shouldn't have anything to worry about. Actually, given Nvidia's recent history with game developers, and the comments from Positron at the start of the thread, I am a bit concerned about Nvidia's involvement and what it could mean for Radeon cards in the expansion next year. I'd rather not see a game artificially limited to running at 20 fps on Crossfired RadeonHD 3870s and Crossfired RadeonHD 4850s... which is what Borderlands does. And don't get me started on the shading issues if you use XP.


 

Posted

Quote:
Originally Posted by Mystic_Fortune View Post
Well, as much as I want to cry now.. I kind of figured as much. I imagine tax time will come around again just before beta for GR starts. Looks like I'll be doing a complete desktop upgrade in 2010.
I figured I was going to do an upgrade next summer. Might just need to do it a wee bit sooner. I'm good on Proc but could use a GFX and memory upgrade. Looks like I know when I'll be doing that.


Quote:
Originally Posted by eltonio View Post
This is over the top mental slavery.

 

Posted

Sweet. Looks like I'm good with my current machine.


 

Posted

So for us poor people who don't have a strong power supply, my little old 8600 GT card will have to make do with maxed regular settings.

Darn






-Hawk


@Hawkeyed
P.E.R.C. Senior Pinnacle Rep


[url="http://www.guildportal.com/Guild.aspx?GuildID=217406&TabID=1833355"]PERC Site[/url]


"Nothing grabs your attention like a pink fluffy bunny with imps dancing around it" -Kranny

 

Posted

Quote:
Originally Posted by Hawkeyed View Post
So for us poor people who don't have a strong power supply, my little old 8600 GT card will have to make do with maxed regular settings.

Darn
I'll note that power supplies don't *have* to be expensive... and may just be on sale now. >.> (Just snagged a Corsair 650W, normally $159, for $90, with another $30 in rebates on top over at Newegg. Gives the planned build a good bit of headroom, actually.)


 

Posted

Excellent advice, and thanks for addressing this issue. One of the big topics for us at HeroCon was what each of us would need to upgrade for GR. Having the OP and the feedback of other players with more knowledge than I have is very helpful.


The Marshmallow spectrum
Gud - Peeps
| Rice Krispy Treats
| Smores

Ebil - Circus Peanuts

 

Posted

Hey Posi, will GR support multi-core processors?


 

Posted

Quote:
Originally Posted by Evilanna View Post
Hey Posi, will GR support multi-core processors?

mmm yah, I'm wondering too...probably tho, can't see why not...


 

Posted

Quote:
Originally Posted by Evilanna View Post
Hey Posi, will GR support multi-core processors?
Doesn't COH already do that?


 

Posted

Quote:
Originally Posted by Psyte View Post
Doesn't COH already [support multi-core processors]?
As I mentioned earlier, I believe it will use one core for main processing and a second for physics (if you don't have a dedicated card), but I do wonder if Ultra-mode will allow it to do something with the other 3/4 of my i7.


Tech support IRL, CLR/DRU/MED/WHM/PRI/DEF. Hmm, I sense a pattern...
S 80% E 80% A 40% K 0%
A few of my alts

 

Posted

Can somebody speak to what this means for Mac users?

I've got a 2007 iMac with an ATI RadeonHD2600, which according to the handy chart posted earlier, is several tiers below the NVidia 9800 GT.
If I want to stick with an iMac, I would NEED to go with the high end 27", as its ATI Radeon HD 4850 is the only one that's better than the NVidia 9800 GT.

I'm sure Mac Pro, Mac Mini, and MacBook users also will at some point want this question addressed.

Thanks for the other info, though.


 

Posted

Quote:
Originally Posted by Balorn View Post
As I mentioned earlier, I believe it will use one core for main processing and a second for physics (if you don't have a dedicated card), but I do wonder if Ultra-mode will allow it to do something with the other 3/4 of my i7.
City of Heroes seems to have three main active threads while running: two that burn a lot of CPU, which are probably the main thread and the physics thread, and a third that burns minor CPU and is probably related to networking. There also appears to be a fourth thread that looks to be related to the main processing of OpenGL, which is probably system dependent (depending on the OpenGL libraries on your system).

However, City of Heroes seems to open between 14 and 15 threads total; most of them are just idle most of the time.
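For anyone curious what that kind of split looks like in code, here's a bare-bones sketch of a main loop feeding off a separate physics thread. It's purely my own illustration of the general pattern, not anything from CoH's actual source:

[code]
// Minimal sketch of a main-thread + physics-thread split, the general
// shape of what's described above. Not CoH's actual code.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <mutex>
#include <thread>

struct World {
    std::mutex lock;
    double position = 0.0;   // stand-in for simulation state
    double velocity = 1.0;
};

int main() {
    World world;
    std::atomic<bool> running{true};

    // Physics thread: steps the simulation at roughly 60 Hz.
    std::thread physics([&] {
        while (running) {
            {
                std::lock_guard<std::mutex> guard(world.lock);
                world.position += world.velocity * 0.016;
            }
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
    });

    // "Main" thread: a pretend render loop that reads the shared state.
    for (int frame = 0; frame < 300; ++frame) {
        double p;
        {
            std::lock_guard<std::mutex> guard(world.lock);
            p = world.position;
        }
        std::printf("frame %d, position %.2f\n", frame, p);
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }

    running = false;
    physics.join();
    return 0;
}
[/code]

Everything else (networking, sound, driver-side OpenGL work) tends to land on additional, mostly idle threads, which fits the 14-15 thread count above.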


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

Quote:
Originally Posted by Balorn View Post
As I mentioned earlier, I believe it will use one core for main processing and a second for physics (if you don't have a dedicated card), but I do wonder if Ultra-mode will allow it to do something with the other 3/4 of my i7.
That'll pretty much only happen if the game switches to OpenCL for physics. Nvidia's behavior with PhysX leads me to doubt that they'll... actively... develop acceleration on CPUs.

Quote:
Doesn't COH already do that?
Yes / No / Not really.

One of the big problems with game development coming off of the Xbox, Gamecube, and PS2 systems onto the Xbox 360, PS3, and Wii consoles is that the current crop of consoles is built to be multi-threaded with multiple cores. If you want to get technical, the Gamecube was actually ahead of that curve with its split 32/32-bit and 64-bit processor pipeline. The change-over produced ill feelings in a development industry trained on single-threaded production techniques, which had to start developing techniques and code that could utilize a larger number of instruction threads rather than just a faster instruction thread.

The basic problem is that software written to perform well with multiple instruction threads generally doesn't work too well on single-threaded / single-core processors. This is one of the reasons why Microsoft Windows systems do so poorly in server-related tasks: since Microsoft's main money-making market is the desktop, they've never really been able to optimize the NT kernel for multi-threaded environments. You'll actually find that Microsoft tends to maintain a separate kernel, the HPC kernel, for tasks that require multi-threading. One of the big deals about Windows 7 is that the kernel's ability to handle additional processing cores and threads has been improved for consumers, which theoretically means that those with triple, quad, hex, or oct cores / processors will see larger performance gains in basic tasks.
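To make the distinction concrete, here's a toy C++ example of code written to scale with however many cores are present; on a single-core machine it simply collapses to one worker. It's only a sketch of the general idea, nothing to do with the NT or HPC kernels themselves:

[code]
// Toy example: split a big sum across however many hardware threads exist.
// On a single-core box, hardware_concurrency() reports 1 and the same code
// just runs one worker. Illustration only.
#include <cstdint>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const std::uint64_t N = 100000000ULL;
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 1;   // the call is allowed to return 0 if unknown

    std::vector<std::uint64_t> partial(cores, 0);
    std::vector<std::thread> workers;
    const std::uint64_t chunk = N / cores;

    for (unsigned i = 0; i < cores; ++i) {
        const std::uint64_t lo = i * chunk;
        const std::uint64_t hi = (i + 1 == cores) ? N : lo + chunk;
        workers.emplace_back([&partial, i, lo, hi] {
            std::uint64_t sum = 0;
            for (std::uint64_t v = lo; v < hi; ++v) sum += v;
            partial[i] = sum;   // each worker writes only its own slot
        });
    }
    for (auto& t : workers) t.join();

    const std::uint64_t total =
        std::accumulate(partial.begin(), partial.end(), std::uint64_t{0});
    std::printf("workers: %u, sum: %llu\n", cores,
                static_cast<unsigned long long>(total));
    return 0;
}
[/code]

The catch, as noted above, is overhead: the thread management and locking that lets this scale on four or eight cores is pure dead weight on a single-core, Pentium III-class machine.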

If you are actually interested in the coding side of making SMP and Single Thread instructions work together, and how to optimize for each, you can check out the Linux Kernel Mailing list and look up the work that went into making SMP-enabled Linux kernels work properly on single thread-systems.

If you aren't interested and just want to know how this relates to CoH, keep reading.

One of the points of development we can take away from HeroCon is that the developers are interested in maintaining the existing playerbase and system specifications, which are largely based around processors in the 800 MHz range from the Pentium III and original Athlon line-ups. One of the theorized reasons is that NCSoft wants the game to be viable on netbook-type systems, where something like an Intel Atom has around the same instructions per clock (IPC) as the years-old PIII and Athlon designs. Looking ahead, the netbook / notebook market is going to do nothing but grow, and there's the huge problem for Microsoft that Google's getting into the act...

No, I'm not saying that we'll see a downloadable app for ChromeOS on launch that installs CoH to an SD card or USB memory stick and runs it through Transgaming Cedega... although Transgaming is working with both Google and Paragon Studios in different capacities. But as a future option to keep the customer base growing, it's something that NCSoft, as a publisher, and Paragon Studios, as a developer, would have a hard time not considering.

Since the developers seem to be interested in maintaining that older performance profile, there's only so much that can be done with the base engine to harness multiple cores or processing threads. Thus, while CoH can take some advantage of multiple cores, short of an update that enables a new underlying CPU engine, it's never really going to be able to take advantage of quad-cores, hex-cores, or oct-cores.


 

Posted

Quote:
Originally Posted by Hypatia View Post
Can somebody speak to what this means for Mac users?

I've got a 2007 iMac with an ATI RadeonHD2600, which according to the handy chart posted earlier, is several tiers below the NVidia 9800 GT.
If I want to stick with an iMac, I would NEED to go with the high end 27", as its ATI Radeon HD 4850 is the only one that's better than the NVidia 9800 GT.

I'm sure Mac Pro, Mac Mini, and MacBook users also will at some point want this question addressed.

Thanks for the other info, though.
It's... pretty much already been answered. Hardware is hardware. If the RadeonHD 4850 is the hardware performance starting point... that's the hardware performance starting point, regardless of what OS you are using.


 

Posted

Quote:
Originally Posted by je_saist View Post
It's... pretty much already been answered. Hardware is hardware. If the RadeonHD 4850 is the hardware performance starting point... that's the hardware performance starting point, regardless of what OS you are using.
But it hasn't been answered. We don't know if the limiting factor is the pixel-pushing ability of the card (in which case the playability depends on your tolerance for low framerates), or if it's the presence of specific hardware features (in which case a sub-par graphics card can't do Ultra Mode at any framerate).


 

Posted

Quote:
Originally Posted by Katie V View Post
But it hasn't been answered. We don't know if the limiting factor is the pixel-pushing ability of the card (in which case the playability depends on your tolerance for low framerates), or if it's the presence of specific hardware features (in which case a sub-par graphics card can't do Ultra Mode at any framerate).
This actually has been answered as well, although not directly.

CoH uses OpenGL as its rendering API, which is why the graphics in Going Rogue aren't tied to NT6 / DirectX 11. We also know that neither Nvidia nor ATi has shipped identically branded graphics cards with different feature sets for years. The last time Nvidia put different feature sets in cards branded the same was back during the Geforce 4 years: the Geforce 4 Ti series were DirectX 8 cards, while the Geforce 4 MX cards were simply overclocked Geforce 2 MXs with DX7 support.

There are a couple of oddities in AMD's mobile lineup... for example, the Mobile Radeon 2100 and X2300, which were DirectX 9.0c cards; they weren't branded with the RadeonHD tag that signifies DirectX 10 support.

Anyways, since the RadeonHD 4850 was named by Ghost Falcon and the 9800 GT was named by Positron, we can compare the technical specifications of those cards. For the purposes of OpenGL, most of the features of each card should be exposed to a developer through OpenGL 3.0, although it's possible that the developers are building against OpenGL ES 2.0.

Since the feature set in the cards listed by the developers is pretty much identical to that of lower-end cards, like the Radeon 46xx series or the Geforce 9500 and 9600 cards, we can say that the limiting factor is performance, not features.
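If you want to see for yourself which OpenGL version and renderer your card's driver actually exposes, a quick probe along these lines will print them. It assumes the freely available GLFW library just to get a context up; it's purely illustrative and has nothing to do with how Paragon Studios detects anything:

[code]
// Prints the OpenGL version, renderer, and vendor your driver exposes.
// Assumes GLFW is installed (e.g. link with -lglfw). Illustration only.
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) {
        std::fprintf(stderr, "failed to initialize GLFW\n");
        return 1;
    }
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);   // no window on screen
    GLFWwindow* win = glfwCreateWindow(64, 64, "gl-probe", nullptr, nullptr);
    if (!win) {
        std::fprintf(stderr, "failed to create an OpenGL context\n");
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);

    std::printf("GL_VERSION : %s\n", glGetString(GL_VERSION));
    std::printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));
    std::printf("GL_VENDOR  : %s\n", glGetString(GL_VENDOR));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
[/code]

Any RadeonHD or Geforce 8-series-or-later card with reasonably current drivers should report at least OpenGL 3.0 there, which is the point: the feature floor is low, and the performance floor is what the developers are really quoting.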

***

edit: also, Positron uses the terms "best bet" and "recommend", which also reinforces the point that it's about performance more than features.

If the developers are using OpenGL 3.0, all of the RadeonHD series and all of the Geforce 8x00 series onwards should be able to drive Going Rogue's graphics, although maybe not at 30 fps. Crossfired 3870's, for example, should be able to match a single 4850 since they can do so in most other shipping games.

If the developers are using OpenGL ES 2.0, theoretically you could go back to the Radeon 9700 series or the Geforce 6x00 series and still have all of Going Rogue's visuals... I... somewhat doubt... that OpenGL ES 2.0 is the break point, though.

**

double edit: doh. I forgot that ATi also did the renaming / rebranding thing with the Radeon 8500 series. They kept re-releasing the 8500 as the Radeon 9000, then the 9200, and used the Radeon 8500 in the integrated chipset, the Radeon 9100. Again though, that was also years ago.


 

Posted

Hoping with the GeForce GTX 285 I can run it at 1920x1200 max settings with at least 30 frames a second.


Playstation 3 - XBox 360 - Wii - PSP

Remember kids, crack is whack!

Samuel_Tow: Your avatar is... I think I like it

 

Posted

Quote:
Originally Posted by bAss_ackwards View Post
Hoping with the GeForce GTX 285 I can run it at 1920x1200 max settings with at least 30 frames a second.
Well, for me, one thing I noticed is that no matter how stable your system is, and however good your RAM, CPU, and GPU are, you'll always get those pesky FPS drops in areas with long sight lines redside, like Nerva, Cap, and St. Martial, especially with visscale 4, even if your system is more than capable of owning CoH requirement-wise.

Like, I usually average 60 FPS 99% of the time, but in areas like the ones mentioned above I get drops anywhere from 30 down to 15 FPS, and this is on a system that's capable of playing CoH three times over.


 

Posted

Quote:
Originally Posted by Hypatia View Post
Can somebody speak to what this means for Mac users?

I've got a 2007 iMac with an ATI RadeonHD2600, which according to the handy chart posted earlier, is several tiers below the NVidia 9800 GT.
If I want to stick with an iMac, I would NEED to go with the high end 27", as its ATI Radeon HD 4850 is the only one that's better than the NVidia 9800 GT.

I'm sure Mac Pro, Mac Mini, and MacBook users also will at some point want this question addressed.

Thanks for the other info, though.
Probably the best person to PM about this would be Ghost Falcon. If you are keenly interested in finding out whether your Mac can handle Ultra mode, perhaps you can ask if you qualify for Mac beta testing.

From what I've seen on Mac performance of CoX, there is significant overhead from using a non-native port under Cider. However, this doesn't mean that the game will be unplayable/unenjoyable.

Ultra will likely be the best implementation of a Cider port so far, since Paragon is basically rewriting the graphics engine at this point, and they have the Mac port in mind instead of trying to fit the Mac in as an after-the-fact implementation.

Right now I would estimate that the current mac implementation runs at 40%-70% GPU efficiency of an identical windows platform. With Ultra you'll probably see 70%-85% GPU efficiency. (The first numbers are seat of the pants numbers by taking my 8800GTX off my windows box and plunking the same card into my hackintosh - identical hardware otherwise).
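To put those percentages into frame-rate terms, the arithmetic is just the efficiency range multiplied by whatever you get natively. A throwaway example, assuming a 60 fps Windows baseline (my assumption, not a measurement):

[code]
// Back-of-the-envelope conversion of the efficiency estimates above into
// frame rates, assuming a 60 fps native Windows baseline. Not measured data.
#include <cstdio>

int main() {
    const double native_fps = 60.0;               // assumed Windows baseline
    const double cider_now[2]   = {0.40, 0.70};   // current Cider estimate
    const double cider_ultra[2] = {0.70, 0.85};   // guess for the Ultra-era port

    std::printf("current port  : %.0f-%.0f fps\n",
                native_fps * cider_now[0], native_fps * cider_now[1]);
    std::printf("Ultra-era port: %.0f-%.0f fps\n",
                native_fps * cider_ultra[0], native_fps * cider_ultra[1]);
    return 0;
}
[/code]

So a card comfortable at 60 fps under Windows lands somewhere around 24-42 fps under the current Cider port by these estimates, and roughly 42-51 fps if the Ultra-era port improves as hoped.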

Therefore I would guess that the 27" iMac with the 4850 would be the minimum for Ultra mode on Mac OS, and it would probably struggle a bit at its native resolution. You would probably want to use Boot Camp to switch to a native Windows environment to run Ultra CoX, or switch to a lower resolution under the Mac.

Mac Pros would have to use 4870s or GTX 285s, and most likely there would be significant benefit from running natively under Boot Camp as well. I don't think there is any native SLI support for the Mac (and/or Cider), so it will have to be Fermi or 58x0 cards only as a potential upgrade path for Mac Pros (over what is currently available).


 

Posted

Quote:
Originally Posted by PumBumbler View Post
Right now I would estimate that the current mac implementation runs at 40%-70% GPU efficiency of an identical windows platform. With Ultra you'll probably see 70%-85% GPU efficiency. (The first numbers are seat of the pants numbers by taking my 8800GTX off my windows box and plunking the same card into my hackintosh - identical hardware otherwise).
*head tilts*

Um... the Cedega overhead on Nvidia cards is closer to offering 85%-95% of the same performance on the same card at the same detail levels with CoH, and I'm basing that on Geforce 6200, Geforce 6600 GT, Geforce 6800, Geforce 7900 GT KO, Geforce 9500 GT, and Geforce GTS 250. I'm sort of curious as to why you'd only be getting 40% of that same performance on Cider. Although given my propensity for knocking Nvidia on the quality, or lack-thereof, on their drivers, that could be why.


 

Posted

I can't wait to play with these new graphical settings. I've got a GTX 275 with an OC'd 3.4GHz Q6600.


 

Posted

Quote:
Originally Posted by Psygon_NA View Post
So with this mother board http://www.motherboards.org/reviews/...ds/1690_2.html

Will it run one of the gtx 285's ? I see they are pci-e 2 ?
I have one the D975XBX motherboards: http://www.intel.com/products/motherboard/D975XBX/

As far as I can tell, Intel doesn't actually specify what's different between the D975XBX and the D975XBX2. Never mind, found it: the 2 model added several processors that really should have been added in a BIOS update to the first motherboard...

D975XBX 1 Processor List

D975XBX 2 Processor List

The only reason you'd buy one is if you wanted to run a really early Core 2 Duo with Crossfired Radeons, since this was one of the first Intel boards, if not the first, to carry Crossfire support. It does not, however, support SLI, as later and more modern Intel motherboards do.

I'd also have a hard time recommending this board on its own merits. The BIOS for the original model is rubbish, and since it looks like Intel just re-released the board with an updated BIOS for new processors instead of... you know... actually releasing that BIOS so owners of the original board could use the new processors, I'm somewhat doubtful that the 2 model is any better.

Getting a SATA drive to boot on the system was difficult, and it chewed through more power than it really should have given its feature set.

***

If I mis-read this and you actually have one and aren't just looking at buying one: yes, its PCI-Express slots will support Nvidia cards.

Just not in SLI.


 

Posted

Quote:
Originally Posted by je_saist View Post
*head tilts*

Um... the Cedega overhead on Nvidia cards is closer to offering 85%-95% of the same performance on the same card at the same detail levels with CoH, and I'm basing that on Geforce 6200, Geforce 6600 GT, Geforce 6800, Geforce 7900 GT KO, Geforce 9500 GT, and Geforce GTS 250. I'm sort of curious as to why you'd only be getting 40% of that same performance on Cider. Although given my propensity for knocking Nvidia on the quality, or lack-thereof, on their drivers, that could be why.
There are specific areas in the game which slow it down severely, such as the RWZ base and Grandville. Of course, those may not necessarily be graphics-constrained; they may be Cider bottlenecks. This issue is covered in CuppaManga's guide here, although I believe the discussion of the specific details of the performance degradation was in the beta Mac section of the now-defunct old forums. So the 40% can be more like 20% for very short stretches, but it feels like 40% when 'averaged' by gut feel over a few seconds. Also, in extreme cases like Rikti ship raid herding or Cim herding (where you get 50+ mobs in a small area), I think the Mac client suffers worse than the Windows client, although I haven't tested this in a long time.

As for performance overall, I can play smoothly at 2560x1600 on the same hardware config on the Windows side, but on the Mac side 1920x1200 is more realistic for smooth performance. I haven't tested much at lower resolutions, since going much lower is too drastic a change in the gaming experience for me to bother with, although the frame rate improves dramatically.

Additionally, there is no anti-aliasing or other goodies available on the Mac client currently, although I'm crossing my fingers that AA etc. will be available with Ultra on the Mac.

I could try a more definitive test to get at the wheres and hows of CoX underperforming on the Mac side, as I do have a fair bit of Mac hardware here as well as Windows stations, but since the Mac version has performed well enough for my limited needs (I just use Windows 7 / a GTX 280 for my primary CoX), I've been too lazy to look at submitting any further performance profiles.

This may change if I get early enough access to the Mac Ultra port (if any; has anyone actually confirmed Ultra support for the Mac client?).