Ultra-mode video card shopping guide


5th_Player

 

Posted

Quote:
Originally Posted by Arcanaville View Post
On the other hand, if you aren't interested in building your own, a comparable system that gets close would be something like the Dell XPS 8000 which starts off around $849 for the base system. The advantage of the Dell: you don't have to assemble it, you can use their configurator to move the price point up and down, and it comes with a Windows 7 Home Premium license (which is not cheap if you want to be legit on the Windows license). The disadvantage: the Dell seems to top out videocard-wise at the 260, while with a do-it-yourself system the sky's the limit. Positron suggests a 260 will likely be able to run Ultra Mode only at its medium quality settings, while something like a 285 will run it at max settings. FatherXmas' recommended 275 is much closer in performance to a 285 than a 260, and probably has a shot at running near max settings in Ultra Mode.
Also note: the GTS 220 and 240, which are the other two options for graphics card on that order form, operate below the level of a 9800 GT (Positron's Ultra Mode "entry point"). The GTS 240 is closer to an older 9600 GSO, and depending on manufacturer modifications might reach 9800 GT performance; but I wouldn't bet on such overclocking from Dell.


 

Posted

Quote:
Originally Posted by Human_Being View Post
Also note: the GTS 220 and 240, which are the other two options for graphics card on that order form, operate below the level of a 9800 GT (Positron's Ultra Mode "entry point"). The GTS 240 is closer to an older 9600 GSO, and depending on manufacturer modifications might reach 9800 GT performance; but I wouldn't bet on such overclocking from Dell.
On that subject I configured my XPS 8000 with the ATI 4350, which is basically good for playing minesweeper, but with the intent of getting a 5850 when someone finally manages to breed them in captivity. With the 4350, I'm already Catalyst-loaded so it'll be slightly easier to just swap and go, and the 4350 becomes my backup card.


(Actually, the 4350 seems to be significantly faster than the Radeon X850 that was in my old system, which at the time cost nearly as much as the 5850 was supposed to cost at release. Moore's law strikes again.)


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

Quote:
Originally Posted by Arcanaville View Post
...but with the intent of getting a 5850 when someone finally manages to breed them in captivity.
XD


 

Posted

I am looking at a possible upgrade. I was running two 8800 GTXs with 768 MB of RAM each, backed by a 1000 W PSU, but one of my cards died on me.

I need to see what my motherboard can handle, since it is an old EVGA nForce 680i.


 

Posted

Quote:
Originally Posted by Positron View Post
After demoing the Ultra-mode at HeroCon, one of the most frequently asked questions we have been receiving is “what are the video cards you recommend to get the most out of Ultra-mode?” With the holiday season coming up, we asked the programmers in charge of implementing the features of Ultra-mode what they would recommend. We would like to forward this information on to you so you can better work on your holiday shopping lists.

If you are looking to spend under US$100, then an NVidia 9800 GT is your best bet. For AMD (ATI/Radeon), we don’t have enough of these cards at this price point to get you good data. This would be the minimum card for enabling all the features, only at reduced quality settings.

If you are looking to spend between $100 and $200, the Radeon HD 4890 and GeForce GTX 260 will do you well. We don’t have numbers from the Radeon 57xx series yet to verify if that is better or worse though. This would end up somewhere in the middle of Ultra-mode quality.

Finally if you are going to spend over $200, the GeForce GTX 285 is an excellent choice. We don’t have numbers from the Radeon 58xx series yet. This would be able to run Ultra-mode with the quality maxed out (except we have no data on Anti-aliasing, so caveat emptor when it comes to that specific feature).

Note: This is based on our current internal testing and might change by the time Going Rogue is released. Also, a video card is not the ONLY thing you should take into consideration. Your PC having a decent processor and memory will also enhance your performance. As we continue to test and get more information available we will update you. I hope this helps, and happy holidays!
I have a 9800 GT right now. My birthday is the day after Christmas, so upon request, I can get one big expensive combined present instead of two. I think I am going to be begging for a GTX 285. I'm pushing 30, but my aunt still likes to spoil me.

Luckily, the rest of my family is easier to shop for (got stuff for my infant nephew too!)



I'm only ladylike when compared to my sister.

 

Posted

Quote:
Originally Posted by Arcanaville View Post
I would presume yes, but with a caveat. I don't know if CoX will properly detect the Eyefinity single view resolution, but I do know that CoX works fine even on non-Eyefinity multiple monitor setups in Windowed mode. I can make a windowed (circa) 3840 x 1200 (1920x1200 x 2) CoX window. You just need the horsepower to drive resolutions that high. So if CoX doesn't properly handle the ultralarge eyefinity display, it almost certainly will allow for a windowed client across the eyefinity display.
I've windowed CoX across multiple monitors on my GTX 280, but Eyefinity also has a peripheral distortion adjustment to correct for field of view (when looking at the central monitor). I'm not certain whether CoX works with that, so I'm reserving judgment on compatibility until someone has run a test. Call me a pessimist if you will.


 

Posted

Quote:
Originally Posted by PumBumbler View Post
I've windowed CoX across multiple monitors on my GTX 280, but Eyefinity also has a peripheral distortion adjustment to correct for field of view (when looking at the central monitor). I'm not certain whether CoX works with that, so I'm reserving judgment on compatibility until someone has run a test. Call me a pessimist if you will.
I believe that is irrelevant to CoX because, as previously mentioned, CoX will not allow increased angular perspective. The only thing larger monitors (or composite displays) can do is make the picture bigger, not more panoramic.

Only if Ultra Mode eliminates that restriction is there any chance of panoramic distortion, and the lack of accounting for it won't make Eyefinity "incompatible" with Ultra Mode. It might just fail to remove some of the distortion that most games have when played on an ultrawide screen (and it's questionable whether removing the distortion is even the "correct" thing to do in all cases).
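The distortion being argued about falls out of simple projection math. As an illustration only (how CoX's engine actually handles FOV is not public), this sketch assumes a renderer that holds vertical FOV fixed and derives horizontal FOV from the aspect ratio:

```python
import math

def horizontal_fov(vertical_fov_deg, aspect_ratio):
    """Horizontal FOV (degrees) implied by a fixed vertical FOV
    and a display aspect ratio (width / height)."""
    half_v = math.radians(vertical_fov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half_v) * aspect_ratio))

# A single 16:10 monitor vs. a triple-wide 48:10 Eyefinity surface,
# both rendered with the same 60-degree vertical FOV:
print(horizontal_fov(60, 16 / 10))  # single monitor
print(horizontal_fov(60, 48 / 10))  # triple-wide surface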


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

Quote:
Originally Posted by Arcanaville View Post
I believe that is irrelevant to CoX because, as previously mentioned, CoX will not allow increased angular perspective. The only thing larger monitors (or composite displays) can do is make the picture bigger, not more panoramic.

Only if Ultra Mode eliminates that restriction is there any chance of panoramic distortion, and the lack of accounting for it won't make Eyefinity "incompatible" with Ultra Mode. It might just fail to remove some of the distortion that most games have when played on an ultrawide screen (and it's questionable whether removing the distortion is even the "correct" thing to do in all cases).
There's no guarantee that enabling Eyefinity wouldn't attempt to insert the perspective-altering code into the rendering pipeline, regardless of what the field-of-view restriction is; this might interact in strange ways with the current CoX graphics engine. That's why I was saying that until someone has tried Eyefinity on the current CoX, I would rather be pleasantly surprised than disappointed.

Given the current state of ATI/AMD card support for CoX features, I wouldn't gamble on any feature compatibility until it was demonstrated.


 

Posted

Quote:
Originally Posted by PumBumbler View Post
There's no guarantee that enabling Eyefinity wouldn't attempt to insert the perspective-altering code into the rendering pipeline, regardless of what the field-of-view restriction is; this might interact in strange ways with the current CoX graphics engine. That's why I was saying that until someone has tried Eyefinity on the current CoX, I would rather be pleasantly surprised than disappointed.

Given the current state of ATI/AMD card support for CoX features, I wouldn't gamble on any feature compatibility until it was demonstrated.
Anything is possible, but this is incredibly unlikely because perspective correction is application-specific. Not all 3D applications are games, and not all games have a first-person or third-person viewpoint where this would even make sense. It is a million times more likely that if CoX *wanted* perspective correction, it wouldn't be able to enable it, than the reverse.


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

I am very content with my 9800... but a more powerful card does not hurt. I always get a new one annually anyway.

I'm eyeing a 275... but I would need a new case, and possibly a much more powerful PSU.

I'll wait till I'm done with Final Fantasy XIII.


http://www.seventhsanctum.com/index-anim.php
Can't come up with a name? Click the link!

 

Posted

Thanks for the help, guys.

And as for Eyefinity: run the game in windowed mode and you can drag the window across as many screens as you want; nothing extra required.


[CENTER]Euro side: [B]@Orion Star[/B] & [B]@Orions Star[/B][/CENTER]

 

Posted

Quote:
Originally Posted by Orion_Star_EU View Post
Thanks for the help, guys.

And as for Eyefinity: run the game in windowed mode and you can drag the window across as many screens as you want; nothing extra required.
Was the Nvidia driver ever fixed so you didn't have to run in single-monitor acceleration mode when hooked up to multiple monitors?


 

Posted

Quote:
Originally Posted by Human_Being View Post
In other words, TSMC has absolutely no idea what is messing up their 40nm process. If true, we can toss that "chamber matching issue" out the window. That never had the ring of truth to me anyway.


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

Hmmm, three 285s in SLI mode .. check
i7 processor .. check
More memory than I need .. check
1200 W PSU .. check

Cold in Ohio during winter, so use the computer as a space heater .. check

Can't wait to try this Ultra Mode out.

If you're going Nvidia, stay away from the X3/X4 PSUs; I had to replace one to get my system to behave.

Enjoy!


LvL 50's Inv/Em Tank, Katana/Regen Scrapper, Merc/Traps MM, Ninja/Dark MM, Crab
@Torell Guardian, Liberty & Freedom

 

Posted

Quote:
Originally Posted by PumBumbler View Post
Was the Nvidia driver ever fixed so you didn't have to run in single-monitor acceleration mode when hooked up to multiple monitors?
All I know is that it works for me at the moment on a GTS 250:



[Screenshot: CoX windowed across two monitors.]

The black space at the bottom right is due to the right-hand screen being smaller.


[CENTER]Euro side: [B]@Orion Star[/B] & [B]@Orions Star[/B][/CENTER]

 

Posted

So what does this mean for my Nvidia GeForce 8600 GTS?


In the Arena of Logic, I fight unarmed.

 

Posted

Or my dual 7950 GTs in SLI?


Protector Server
Woeful Knight (BS/Regen/Body Scrapper)
Kevin Christian (MC/FF/Primal Controller)
SilverCybernaut (Eng/Dev/Munitions Blaster)
Apixie OhNo (Fire/Fire/Pyre Tanker)
Y'ru Glowen (Rad/Rad/Psy Defender)

 

Posted

Quote:
Originally Posted by Gulver View Post
So what does this mean for my Nvidia GeForce 8600 GTS?
An 8600 GTS will underperform a 9800 GT.


 

Posted

Quote:
Originally Posted by WoefulKnight View Post
Or my dual 7950 GTs in SLI?
That's a harder question because of the SLI, but if it did function in Ultra Mode, I would expect it to do so at the lower end of performance.


 

Posted

It sounds like I'll be okay then.

I'm running a 2.5 GHz quad core system, 8 GB RAM, and Dual nVidia GeForce 9800GTX 512MB GDDR3 PCI-Express Video Cards in SLI Configuration.

Can't wait to see it!


My Mission Architect arcs:

Attack of the Toymenator - Arc # 207874

Attack of the Monsters of Legend - Arc # 82060

Visit Cerulean Shadow's Myspace page!

 

Posted

Is overkill better in the long run? I'm on the verge of buying a new computer, and I tend not to replace parts; I'd rather have something I know is going to last a while. My system was 'top of the line' four years ago, and if it weren't for the fact that I can't put any more RAM on the motherboard, I probably wouldn't be getting a new one at all (a 2.6 GHz processor isn't BAD, after all!).

I was looking at getting SLI, or even triple SLI. I'm not a big twitch gamer; I pretty much only play LOTRO and CoH. However, I do tend to work with really gigantic graphics files when I'm doing Photoshop and Illustrator work. The fact that I can't run Photoshop from Windows 7 on my current computer drives me nuts (I can run it and CoH at the same time when I boot into XP. Go figure). Should I just go all out?


 

Posted

Quote:
Originally Posted by Aisynia View Post
I have a 9800 GT right now. My birthday is the day after Christmas, so upon request, I can get one big expensive combined present instead of two. I think I am going to be begging for a GTX 285. I'm pushing 30, but my aunt still likes to spoil me.

Luckily, the rest of my family is easier to shop for (got stuff for my infant nephew too!)
Is your aunt single? I'm not, but I could be for a new computer with a GTX 285.