Ultra-mode video card shopping guide
After browsing/skimming/reading through all 11 pages of this thread, it looks like je_saist or Arcanaville may be the people to ask for this:
I have dual nVidia Quadro FX 3500 cards set up in SLI (512MB of video memory).... They're pretty old cards in "computer months/years," but they still handle everything I throw at them... I originally put them in my self-built computer because I'm a video editor, and that series of cards is wonderful for both HD video editing and any sort of AutoCAD tools.... Needless to say, it doesn't matter what sort of game I install, these cards kick butt. Unreal Tournament III? No problem. Modern Warfare 2? No issues. Fear 2? Resident Evil 5? Fallout 3? They wish they could drag my system down (haha)....
So, my question is... Is this "ultra mode" going to be what finally breaks my system and causes me to get rid of these wonderful little beasts?
Right now, I'm running Windows Vista 64-bit with 8 gigs of memory...
What's the verdict, guys? Have my cards finally reached the end of the line as far as handling high graphics in modern gaming? Or will I be able to handle this new "Ultra mode?"
"Alien"
76 characters and Twenty-four 50s later, I still love this game.
AlienOne's Human-Form Warshade Guide (Old guide+New guide = 12,000+ views!)
I know that some things matter more than others when it comes to performance, but there are a few things I'm still not so sure about...
I have an Intel D975XBX2 series motherboard with the absolute highest-end processor it will take (2.5GHz Core 2 Quad), 4GB of DDR2 memory, and a separate hard drive dedicated solely to running CoH and a couple of other games. The weak link in my system is obviously my super-cheap GeForce 8600, as I've been informed by Windows 7's built-in performance tester (and honestly, I've noticed the card is kind of crappy, but it was the best one in my price range after my previous card burned out).
I know that PCIe 2.0 video cards are backwards compatible to PCIe 1.0 slots (which is what my mobo has), and that v2.0 vs v1.0 is basically just a bus speed increase.
My conundrum is this: Will I get enough of a performance increase just replacing the video card for one of the recommended cards in the $250 price range?
Or will I need to replace the motherboard (and thus the processor and memory as well) to get the performance necessary out of the new card?
Main Hero: Chad Gulzow-Man (Victory) 50, 1396 Badges
Main Villain: Evil Gulzow-Man (Victory) 50, 1193 Badges
Mission Architect arcs: Doctor Brainstorm's An Experiment Gone Awry, Arc ID 2093
-----
Chad: For the most part, you'll only crunch the bandwidth on a PCIE 16x 1.0 slot if you are trying to use Crossfire or SLI without a bridge connector, e.g. something like RadeonHD 4650s in Crossfire, or a Hybrid Crossfire setup. In these situations the bandwidth constraints will really hurt multi-GPU processing.
If you are using a multi-gpu setup with an external bridge or a single card, there's more than enough bandwidth. I've also got one of the older D975XBX boards, and I can tell you from personal testing that most modern cards don't give a flip about whether or not they are on PCIE 16x 1.0 or PCIE 16x 2.0.
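If you want to sanity-check that on your own slot, a rough upload probe will tell you how much of the link a game could even use. This is just a quick-and-dirty sketch in C with freeglut; the 16 MiB texture, the repeat count, and timing via glFinish() plus glutGet(GLUT_ELAPSED_TIME) are my own arbitrary choices, not an official benchmark:

    /* Rough host-to-GPU texture upload probe -- a quick-and-dirty sketch,
       not an official benchmark. Build (Linux): gcc up.c -o up -lglut -lGL */
    #include <stdio.h>
    #include <stdlib.h>
    #include <GL/glut.h>

    #define DIM  2048   /* 2048x2048 RGBA = 16 MiB per upload */
    #define REPS 64

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutCreateWindow("upload-probe");   /* creates a GL context */

        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);

        void *pixels = calloc((size_t)DIM * DIM, 4);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, DIM, DIM, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        glFinish();

        int t0 = glutGet(GLUT_ELAPSED_TIME);
        for (int i = 0; i < REPS; i++)
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, DIM, DIM,
                            GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        glFinish();   /* wait until the driver has actually moved the data */
        int ms = glutGet(GLUT_ELAPSED_TIME) - t0;

        double mib = (double)REPS * DIM * DIM * 4.0 / (1024.0 * 1024.0);
        printf("%.0f MiB in %d ms => %.1f MiB/s\n",
               mib, ms, ms > 0 ? mib * 1000.0 / ms : 0.0);
        free(pixels);
        return 0;
    }

For reference, a PCIE 16x 1.0 slot tops out around 4 GB/s in theory; a single card running a game rarely streams anywhere near that per frame, which is why the 1.0-versus-2.0 difference doesn't show up.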
je_saist, do you know anything about the cards I have? Or no?
"Alien"
76 characters and Twenty-four 50s later, I still love this game.
AlienOne's Human-Form Warshade Guide (Old guide+New guide = 12,000+ views!)
Quadro FX 3500 |
I'll point you over to xbitlabs for a couple reasons why:
http://www.xbitlabs.com/articles/vid...x-firepro.html
http://www.xbitlabs.com/articles/vid...vs-firegl.html
http://www.xbitlabs.com/articles/vid...uadrofx_5.html
The basic problem is that workstation card drivers are not optimized for fast first-pass rendering. You'll be hamstringing your performance compared to a stock 7900 GTX in most common games.
The other aspect is that the drivers certified for the Quadro cards... aren't the same drivers that consumer Geforce users would be using: http://www.nvidia.com/object/Quadro_...1.78_whql.html
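If you want to put a number on that difference yourself, a tiny frame-rate probe is enough to compare against published GeForce results. This is only a sketch (C with freeglut, fixed-function drawing; disable vsync in the driver control panel first, or it will just report your monitor's refresh rate):

    /* Minimal frame-rate probe -- a sketch for comparing a Quadro against a
       GeForce on identical trivial work. Real games stress far more. */
    #include <stdio.h>
    #include <GL/glut.h>

    static int frames = 0, t_last = 0;

    static void draw(void)
    {
        glClear(GL_COLOR_BUFFER_BIT);
        /* a little fixed-function work so the driver isn't totally idle */
        glBegin(GL_TRIANGLES);
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
        glEnd();
        glutSwapBuffers();

        frames++;
        int t = glutGet(GLUT_ELAPSED_TIME);
        if (t - t_last >= 1000) {
            printf("%.1f fps\n", frames * 1000.0 / (t - t_last));
            frames = 0;
            t_last = t;
        }
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
        glutCreateWindow("fps-probe");
        glutDisplayFunc(draw);
        glutIdleFunc(draw);
        t_last = glutGet(GLUT_ELAPSED_TIME);
        glutMainLoop();   /* close the window to exit */
        return 0;
    }

Run the same binary on the Quadro and on the hardware-equivalent GeForce and the driver-level gap shows up directly.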
Now, as to whether or not they'll work in Ultra Mode at all? I doubt it.
The presumption right now is that Going Rogue leverages OpenGL 3.0... and the Geforce 7900 is not an OpenGL 3.0 part, which means that your Quadro variants are not OpenGL 3.0 either.
If Going Rogue is leveraging OpenGL 2.0, 2.0 ES, or 2.1, then your cards will support the rendering API.
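Until a developer confirms which one it is, you can at least check what your own driver exposes. A minimal probe (C with freeglut; glGetString() only returns something useful once a context exists):

    /* OpenGL version/renderer probe -- a sketch, assuming freeglut is
       installed. Build (Linux): gcc glprobe.c -o glprobe -lglut -lGL */
    #include <stdio.h>
    #include <GL/glut.h>

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutCreateWindow("gl-probe");   /* glGetString needs a live context */

        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
        /* A version string starting with "3." (or higher) means the driver
           exposes OpenGL 3.0; a G71-era Quadro FX 3500 should report 2.x. */
        return 0;
    }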
Thanks for the info....
One question though... Why do they seem to have no problem handling any sort of modern game right now (Crysis, UTIII, etc., etc.), if they're that bad for gaming? I thought that if they could handle something like Crysis, which is notorious for system bog-down, then I could handle a "higher graphics mode" for CoH...
Guess I was wrong.
"Alien"
76 characters and Twenty-four 50s later, I still love this game.
AlienOne's Human-Form Warshade Guide (Old guide+New guide = 12,000+ views!)
Saist: Thanks for the very quick reply. Can't beat personal experience! Glad to hear the more-or-less good news.
Main Hero: Chad Gulzow-Man (Victory) 50, 1396 Badges
Main Villain: Evil Gulzow-Man (Victory) 50, 1193 Badges
Mission Architect arcs: Doctor Brainstorm's An Experiment Gone Awry, Arc ID 2093
-----
Thanks for the info....
One question though... Why do they seem to have no problem handling any sort of modern game right now (Crysis, UTIII, etc., etc.), if they're that bad for gaming? I thought that if they could handle something like Crysis, which is notorious for system bog-down, then I could handle a "higher graphics mode" for CoH... Guess I was wrong. "Alien" |
The most recent entries I can find on Google point to the Tom's Hardware forums: http://www.tomshardware.com/forum/26...er-video-cards
or TechPowerUp: http://forums.techpowerup.com/showthread.php?t=67824
There is also the aspect that you have to look at the problem in the aspect of rendering modes.
The 7900 GTX was a monster in DirectX 9 rendering, and even if you knocked 30% of the performance off, it's still bloody fast. Because OpenGL 2.1 / DirectX 9 is that card's rendering limit, that's what games like Crysis are going to default to. You are not going to be running the same code path with the same quality settings as somebody with a Geforce 8800 or a RadeonHD 2900.
The reason why they seem good for modern gaming is that... they aren't running modern games. They are using the old(er) API rendering paths.
Oh. They'll probably work. Your performance will just be less than the hardware-equivalent GeForce card.
|
If we can get a developer to comment on which OpenGL API Going Rogue uses, that would settle whether it will work. If it's OpenGL 3.0, the cards won't work.
***
Saist: Thanks for the very quick reply. Can't beat personal experience! Glad to hear the more-or-less good news. |
Thanks for the info....
One question though... Why do they seem to have no problem handling any sort of modern game right now (Crysis, UTIII, etc., etc.), if they're that bad for gaming? I thought that if they could handle something like Crysis, which is notorious for system bog-down, then I could handle a "higher graphics mode" for CoH... Guess I was wrong. "Alien" |
It's not so much that they're "bad" for gaming. Simply that the hardware and software have been optimized for accurate final-pass rendering rather than first-pass like you see in a lot of games. They're for applications where the final product has to be RIGHT, PERIOD. If a certain frame out of umpty-bajillion in CoH has a blown pixel or texture someplace, no biggie, it's gone in under a second.
As it is, your card is probably dealing with just the stuff its drivers know how to handle (DX7/8/9, OGL 2.0, etc.) and ignoring directives from anything else. So it's giving you acceptable performance because the image is degraded.
If, as Je_Saist suspects, they're leveraging OGL 3.0 for CoH, you're going to get mostly identical performance in CoH simply because it's not rendering the Going Rogue graphical enhancements. At worst, your performance will fall off because it's having to spend time discarding commands that the hardware and drivers won't support.
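To make that concrete: engines generally probe the driver's capabilities up front and pick a code path, rather than firing unsupported commands at it and hoping. Here's a hypothetical sketch of that pattern in C with freeglut; GL_EXT_framebuffer_object is just an example of a feature a post-processing path might gate on:

    /* Hypothetical illustration of capability-based path selection:
       probe first, never issue commands the driver doesn't advertise. */
    #include <stdio.h>
    #include <string.h>
    #include <GL/glut.h>

    static int has_ext(const char *name)
    {
        const char *exts = (const char *)glGetString(GL_EXTENSIONS);
        /* strstr is a simplification; robust code matches whole
           space-delimited tokens to avoid prefix collisions. */
        return exts && strstr(exts, name) != NULL;
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutCreateWindow("path-probe");

        if (has_ext("GL_EXT_framebuffer_object"))
            puts("FBO path available: enable render-to-texture effects");
        else
            puts("no FBOs: fall back to the older fixed-function path");
        return 0;
    }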
It's not so much that they're "bad" for gaming. Simply that the hardware and software have been optimized for accurate final-pass rendering rather than first-pass like you see in a lot of games. They're for applications where the final product has to be RIGHT, PERIOD.
|
Therefore, I won't be "upgrading" (lol) from two $1,000 professional cards to a $300 "gaming" card just because I want better performance in a single game's update... It'd just be nice for Ultra Mode to work. I guess I'll just have to wait and see...
"Alien"
76 characters and Twenty-four 50s later, I still love this game.
AlienOne's Human-Form Warshade Guide (Old guide+New guide = 12,000+ views!)
After demoing Ultra-mode at HeroCon, one of the questions we have received most frequently is: what video cards do you recommend to get the most out of Ultra-mode? With the holiday season coming up, we asked the programmers in charge of implementing Ultra-mode's features what they would recommend. We would like to pass this information on to you so you can better plan your holiday shopping lists.
If you are looking to spend under US$100, then an NVidia 9800 GT is your best bet. For AMD (ATI/Radeon), we don't have enough of these cards at this price point to get you good data. This would be the minimum card for enabling all the features, though only at reduced quality settings.

If you are looking to spend between $100 and $200, the Radeon HD 4890 and GeForce GTX 260 will do you well. We don't have numbers from the Radeon 57xx series yet to verify whether it is better or worse. This would end up somewhere in the middle of Ultra-mode quality.

Finally, if you are going to spend over $200, the GeForce GTX 285 is an excellent choice. We don't have numbers from the Radeon 58xx series yet. This would be able to run Ultra-mode with the quality maxed out (except we have no data on anti-aliasing, so caveat emptor when it comes to that specific feature).

Note: This is based on our current internal testing and might change by the time Going Rogue is released. Also, a video card is not the ONLY thing you should take into consideration: a decent processor and memory will also enhance your performance. As we continue to test and get more information, we will update you. I hope this helps, and happy holidays! |
Arc: 378122 "Tales of the Terran Space Marines -The Apocalypse Initiative" 5stars!
http://www.youtube.com/watch?v=6Rgl4...687B0FC89F142C
Arc: 481545 "Twilight of the Gods - The Praetorian conflict"8000+ hits!
http://www.youtube.com/watch?v=XxJ6S...848B21E2350DCC
When do we expect to get numbers on the Radeon 57xx and 58xx (and even the lower-end models)?
|
Video card manufacturers aren't likely to just say, "Hey, you guys are putting out a new expansion to your game with some new graphics additions, so why don't we give you one of each of our series of video cards so you can test how they perform?" It's possible, but not probable.
Which means that, most likely, it's a matter of Paragon Studios purchasing the cards themselves. That makes it a budget item to be prioritized, and getting a full spread of every series isn't necessarily high on the list.
Now, it's possible that someone on the staff may purchase a new graphics card for their personal system and it's possible that they may bring that system to the PS offices and say "Hey, I got a new XXXX video card so why don't we test it for Ultra Mode so that we can have a bit more information for the players?" It's also possible that there may be company policies that prevent bringing in personal systems or loading future code on personal systems.
If the game spit out 20 dollar bills people would complain that they weren't sequentially numbered. If they were sequentially numbered people would complain that they weren't random enough.
Black Pebble is my new hero.
True. I'm deciding whether I should wait to order a new PC, order now with whatever GPU comes with it (with end-of-year deals/rebates), or hold off and hope the processor price comes down anyway (looking at an i5-750), assuming I will need to upgrade later and replace the power supply... or just wait on everything. I haven't purchased a PC in 6 years, and the options and changes are staggering (who would have thought I'd pay almost as much for the GPU as the main system itself?).
Arc: 378122 "Tales of the Terran Space Marines -The Apocalypse Initiative" 5stars!
http://www.youtube.com/watch?v=6Rgl4...687B0FC89F142C
Arc: 481545 "Twilight of the Gods - The Praetorian conflict"8000+ hits!
http://www.youtube.com/watch?v=XxJ6S...848B21E2350DCC
True. I'm deciding whether I should wait to order a new PC, order now with whatever GPU comes with it (with end-of-year deals/rebates), or hold off and hope the processor price comes down anyway (looking at an i5-750), assuming I will need to upgrade later and replace the power supply... or just wait on everything. I haven't purchased a PC in 6 years, and the options and changes are staggering (who would have thought I'd pay almost as much for the GPU as the main system itself?).
|
=================================================================
HP 2159m 21.5" diagonal Full HD widescreen LCD monitor (FV585AA#ABA), qty 1: $239.99 list, $147.77 sale, ships free
HP Pavilion Elite e9250t PC
* Genuine Windows 7 Home Premium 64-bit
* Intel(R) Core(TM) i5-750 processor [2.66GHz, 1MB L2 + 8MB shared L3 cache]
* FREE UPGRADE! 6GB DDR3-1333MHz SDRAM [3 DIMMs] from 4GB
* FREE UPGRADE! 640GB 7200 rpm SATA 3Gb/s hard drive from 500GB
* 512MB ATI Radeon HD 4350 [DVI, HDMI, VGA adapter]
* LightScribe 16X max. DVD+/-R/RW SuperMulti drive
* 16x max. DVD ROM (player)
* Premium Wireless-N LAN card
* 15-in-1 memory card reader, 1 USB, 1394, audio
* No TV Tuner
* Integrated 7.1 channel sound with front audio ports
* No speakers
* HP multimedia keyboard and HP optical mouse
* Microsoft(R) Works 9.0
* Norton Internet Security(TM) 2010 - 15 month
* HP Home & Home Office Store in-box envelope
I figure with the 4350 I can at least make the current game play a little better vs. my GeForce card from 6 years ago (smile); but I'll wait for the 5850s to (hopefully) come down in price for GR.
Arc: 378122 "Tales of the Terran Space Marines -The Apocalypse Initiative" 5stars!
http://www.youtube.com/watch?v=6Rgl4...687B0FC89F142C
Arc: 481545 "Twilight of the Gods - The Praetorian conflict"8000+ hits!
http://www.youtube.com/watch?v=XxJ6S...848B21E2350DCC
Well, I just hit the submit button on my computer... fingers crossed... hope I can change the power supply later...
|
I figure with the 4350 I can at least make the current game play a little better vs. my GeForce card from 6 years ago (smile); but I'll wait for the 5850s to (hopefully) come down in price for GR. |
EDIT: Oh yeah, and those system specs do look solid save for the graphics card. That should be fine.
Aargh, my faithful Asus 8800 GTS just bit the dust tonight... video corruption even on the BIOS screen and only functioning as a basic VGA adapter in Windows. Device manager claims that the card can't start; uninstall/reinstall does nothing.
Since the problem persists on the boot screen before Windows even starts loading, I think the card's probably dead.
Well, I was looking to buy a new card in March... I guess my timetable got moved up, although it really could have picked a better time. Just ordered a replacement from Newegg, a BFG GTX 275. Unfortunately I'm stuck with the laptop, probably until Tuesday.
COH has just been murdered by NCSoft. http://www.change.org/petitions/ncso...city-of-heroes
If worst came to worst, your "problem" would be mounting the new power supply in the case. The actual electrical connectors are going to be the same regardless. So if the screw holes are in the wrong place to mount a new power supply or such, that can be remedied with someone's Dremel tool.
As discussed further up the thread, you probably shouldn't hold your breath for either ATI or Nvidia to reduce their prices in the next few months. Hopefully supply will loosen up a bit though. EDIT: Oh yeah, and those system specs do look solid save for the graphics card. That should be fine. |
Arc: 378122 "Tales of the Terran Space Marines -The Apocalypse Initiative" 5stars!
http://www.youtube.com/watch?v=6Rgl4...687B0FC89F142C
Arc: 481545 "Twilight of the Gods - The Praetorian conflict"8000+ hits!
http://www.youtube.com/watch?v=XxJ6S...848B21E2350DCC
Thanks Human, and you are right. I was checking prices on the cards again, and they just keep going up. I read somewhere that NVidia is supposed to come out with new cards in Feb/March; hopefully that will put some pressure on ATI to reduce theirs a bit. Then again, with so much new functionality coming out with these cards, they can keep the supply low knowing the games demand them, so we will pay :-( Once my box comes in I'm going to go for the power supply and then, maybe after my tax return comes in... dig deeper and get the card :-( ... and I'll keep up with Tom's Hardware lol
|
http://firingsquad.com/news/newsarti...searchid=22425
TSMC has reportedly fixed its 40nm problems with the RadeonHD cards. However, the RadeonHD chips are significantly less complex than the Fermi chips Nvidia is trying to make (~2.15 billion versus 3-billion-plus transistors), so although the equipment is now working for AMD's parts orders, it might not be working entirely for Nvidia's.
Would Ultra Mode work on a Radeon X1600/1650 series graphics card? 'Cause that's what I got.
Would Ultra Mode work on a Radeon X1600/1650 series graphics card? 'Cause that's what I got.
|
The x1x00 series was the last generation of AMD's DirectX 9 / OpenGL 2.0 line-up of cards. If Going Rogue is utilizing OpenGL 2.0 ES, then yes, the x1x00 series should technically be capable of implementing all of the graphical effects called by the graphics engine.
However, given how an x1650 I have right now performs in today's games based on the Unreal 3 and CryTek engines, I seriously doubt it could push Going Rogue's graphics any faster than 2 or 3 frames a minute.
If the Going Rogue graphics engine is built against OpenGL 3.0, then no, you will not be able to use the new engine at all.
The good news for you is that the OpenGL shaders (presumably written by Nvidia years ago) that currently break certain graphics on ATi cards, like water effects and anti-aliasing, have been re-written as part of the Going Rogue improvements. When the Ultra-Mode updates hit, you should find those existing graphical errors on ATi cards fixed in the current engine as well.
When it comes to technical computer stuff I know just enough to be dangerous. I recently purchased a new computer, a quad-core Asus 5270, and had a BFG Nvidia GT220 graphics card installed in it. I know it's after the fact, but is that a decent card? I got the whole thing at Best Buy on sale for $90.
I've got an 8800 GTS, but I can't find anything but the core speed in the specs:
http://www.newegg.com/Product/Produc...82E16814130071