Originally Posted by Arcanaville
On the other hand, if you aren't interested in building your own, a comparable system that gets close would be something like the Dell XPS 8000 which starts off around $849 for the base system. The advantage of the Dell: you don't have to assemble it, you can use their configurator to move the price point up and down, and it comes with a Windows 7 Home Premium license (which is not cheap if you want to be legit on the Windows license). The disadvantage: the Dell seems to top out videocard-wise at the 260, while with a do-it-yourself system the sky's the limit. Positron suggests a 260 will likely be able to run Ultra Mode only at its medium quality settings, while something like a 285 will run it at max settings. FatherXmas' recommended 275 is much closer in performance to a 285 than a 260, and probably has a shot at running near max settings in Ultra Mode.
|
Ultra-mode video card shopping guide
Also note: the GTS 220 and 240, which are the other two options for graphics card on that order form, operate below the level of a 9800 GT (Positron's Ultra Mode "entry point"). The GTS 240 is closer to an older 9600 GSO, and depending on manufacturer modifications might reach 9800 GT performance; but I wouldn't bet on such overclocking from Dell.
|
(Actually, the 4350 seems to be significantly faster than the Radeon X850 that was in my old system, which cost then nearly as much as the 5850 was supposed to cost at release. Moore's law strikes again.)
[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]
In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)
I am looking at a possible upgrade. I was running two 8800 GTX cards with 768 MB of RAM each, backed by a 1000 W PSU, but one of my cards died on me.
I need to see what my motherboard can handle, since it is an old EVGA nForce 680i.
After demoing Ultra-mode at HeroCon, one of the most frequently asked questions we have been receiving is: what video cards do you recommend to get the most out of Ultra-mode? With the holiday season coming up, we asked the programmers in charge of implementing the features of Ultra-mode what they would recommend. We would like to forward this information on to you so you can better work on your holiday shopping lists.
If you are looking to spend under US$100, then an NVIDIA 9800 GT is your best bet. For AMD (ATI/Radeon), we don't have enough cards at this price point to give you good data. This would be the minimum card for enabling all the features, though only at reduced quality settings.
If you are looking to spend between $100 and $200, the Radeon HD 4890 and GeForce GTX 260 will do you well. We don't have numbers from the Radeon 57xx series yet to verify whether it is better or worse. This would end up somewhere in the middle of Ultra-mode quality.
Finally, if you are going to spend over $200, the GeForce GTX 285 is an excellent choice. We don't have numbers from the Radeon 58xx series yet. This would be able to run Ultra-mode with the quality maxed out (except we have no data on anti-aliasing, so caveat emptor when it comes to that specific feature).
Note: This is based on our current internal testing and might change by the time Going Rogue is released. Also, a video card is not the ONLY thing you should take into consideration; a decent processor and enough memory will also enhance your performance. As we continue to test and get more information, we will update you. I hope this helps, and happy holidays! |
Luckily, the rest of my family is easier to shop for (got stuff for my infant nephew too!)
I'm only ladylike when compared to my sister.
I would presume yes, but with a caveat. I don't know if CoX will properly detect the Eyefinity single-view resolution, but I do know that CoX works fine even on non-Eyefinity multiple-monitor setups in windowed mode. I can make a roughly 3840x1200 (two 1920x1200 panels) windowed CoX window; you just need the horsepower to drive resolutions that high. So if CoX doesn't properly handle the ultra-large Eyefinity display, it will almost certainly still allow a windowed client stretched across it.
|
I've windowed CoX across multiple monitors on my GTX 280, but Eyefinity also has a peripheral distortion adjustment to correct the field of view (when looking at the central monitor). I'm not certain CoX works with that, so I'm reserving judgment on compatibility until someone has run a test. Call me a pessimist if you will.
|
Only if UltraMode eliminates that restriction is there any chance for panoramic distortion, and the lack of accounting for it won't make Eyefinity "incompatible" with UltraMode. It might just fail to remove some of the distortion that most games have when played on an ultrawide screen (and it's questionable whether removing the distortion is even the "correct" thing to do in all cases).
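To put the "bigger picture versus wider view" point in concrete terms: under so-called Hor+ scaling, a wider aspect ratio translates into a wider horizontal field of view, while a capped FOV just scales the same view up to fill the extra pixels. A rough sketch of the relationship (the 60-degree vertical FOV is purely an illustrative number, not anything CoX actually uses):
[CODE]
import math

def horizontal_fov(vertical_fov_deg, aspect_ratio):
    """Horizontal FOV implied by a fixed vertical FOV under Hor+ scaling."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect_ratio))

# One 16:10 monitor versus a 48:10 triple-wide Eyefinity span,
# both assuming the same illustrative 60-degree vertical FOV:
print(round(horizontal_fov(60, 16 / 10), 1))  # ~85.5 degrees
print(round(horizontal_fov(60, 48 / 10), 1))  # ~140.3 degrees
[/CODE]
If the engine never widens the horizontal FOV past its cap, the second case simply renders the first view larger, which is why a bigger display gives you a bigger picture rather than a more panoramic one.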
[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]
In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)
I believe that is irrelevant to CoX because, as previously mentioned, CoX will not allow an increased angular perspective. The only thing larger monitors (or composite displays) can do is make the picture bigger, not more panoramic.
Only if UltraMode eliminates that restriction is there any chance for panoramic distortion, and the lack of accounting for it won't make Eyefinity "incompatible" with UltraMode. It might just fail to remove some of the distortion that most games have when played on an ultrawide screen (and it's questionable whether removing the distortion is even the "correct" thing to do in all cases). |
Given the current state of ATi/AMD card support with CoX features, I wouldn't gamble on any feature compatibility until it was demonstrated.
There's no guarantee that enabling Eyefinity wouldn't attempt to insert the perspective-altering code into the rendering pipeline regardless of the field-of-view restriction, and that might interact in strange ways with the current CoX graphics engine. That's why I'm saying that until someone has tried Eyefinity on the current CoX, I would rather be pleasantly surprised than disappointed.
Given the current state of ATi/AMD card support with CoX features, I wouldn't gamble on any feature compatibility until it was demonstrated. |
[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]
In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)
I am very content with my 9800... but a more powerful card doesn't hurt. I always get a new one annually anyway.
I'm eyeing a 275... but I would need a new case, and possibly a much more powerful PSU.
I'll wait till I'm done with Final Fantasy 13.
Can't come up with a name? Click the link!
Thanks for the help guys
And as for Eyefinity: run the game in windowed mode and you can drag the window across as many screens as you want; nothing extra required.
[CENTER]Euro side: [B]@Orion Star[/B] & [B]@Orions Star[/B][/CENTER]
[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]
In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)
Hmmm, 3 285's in SLI mode .. check
i7 processor .. check
More memory than I need .. check
1200W PSU .. check
Cold in Ohio during winter, so the computer can double as a space heater .. check
Can't wait to try this Ultra Mode out.
If you're going NVIDIA, stay away from the X3/X4 PSUs; I had to replace one to get my system to behave.
Enjoy!
LvL 50's Inv/Em Tank, Katana/Regen Scrapper, Merc/Traps MM, Ninja/Dark MM, Crab
@Torell Guardian, Liberty & Freedom
Was the NVIDIA driver ever fixed so you didn't have to run in single-monitor acceleration mode when hooked up to multiple monitors?
|
Click for a bigger version.
The black space at the bottom right is due to the right hand screen being smaller.
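For what it's worth, the size of a window spanned across monitors is just the sum of their widths at the height of the tallest panel, which is why a shorter second screen leaves that black strip. A quick sketch with assumed panel sizes (not necessarily the ones in the screenshot):
[CODE]
# Assumed example panels: a 1920x1200 primary next to a smaller 1680x1050 secondary.
monitors = [(1920, 1200), (1680, 1050)]

span_width = sum(width for width, height in monitors)
span_height = max(height for width, height in monitors)

print(f"spanned window: {span_width}x{span_height}")  # spanned window: 3600x1200
# The 1680x150 region below the shorter panel is the unusable black area.
[/CODE]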
[CENTER]Euro side: [B]@Orion Star[/B] & [B]@Orions Star[/B][/CENTER]
So what does this mean for my Nvidia GeForce 8600 GTS?
In the Arena of Logic, I fight unarmed.
Or my dual 7950 GTs in SLI?
Protector Server
Woeful Knight (BS/Regen/Body Scrapper)
Kevin Christian (MC/FF/Primal Controller)
SilverCybernaut (Eng/Dev/Munitions Blaster)
Apixie OhNo (Fire/Fire/Pyre Tanker)
Y'ru Glowen (Rad/Rad/Psy Defender)
On my shopping list...
http://www.newegg.com/Product/Produc...82E16814130469
It sounds like I'll be okay then.
I'm running a 2.5 GHz quad-core system, 8 GB of RAM, and dual NVIDIA GeForce 9800 GTX 512 MB GDDR3 PCI-Express video cards in SLI configuration.
Can't wait to see it!
My Mission Architect arcs:
Attack of the Toymenator - Arc # 207874
Attack of the Monsters of Legend - Arc # 82060
Visit Cerulean Shadow's Myspace page!
Is overkill better in the long run? I'm on the verge of buying a new computer, and I tend not to replace parts; I'd rather have something I know is going to last a while. My system was 'top of the line' four years ago, and if it weren't for the fact that I can't put any more RAM on the motherboard, I probably wouldn't be getting a new one at all (a 2.6 GHz processor isn't BAD, after all!)
I was looking at getting an SLI setup, or even triple SLI. I'm not a big twitch gamer; I pretty much only play LOTRO and CoH. However, I do tend to work with really gigantic graphics files when I'm doing Photoshop and Illustrator work. The fact that I can't run Photoshop from Windows 7 on my current computer drives me nuts (I can run it and CoH at the same time when I boot into XP. Go figure). Should I just go all out?
I have a 9800 GT right now. My birthday is the day after Christmas, so upon request, I can get one big expensive combined present instead of two. I think I am going to be begging for a GTX 285. I'm pushing 30, but my aunt still likes to spoil me.
Luckily, the rest of my family is easier to shop for (got stuff for my infant nephew too!) |