Ultra-mode video card shopping guide
So the HD 4850 is the baseline for the newest graphical update coming to the game at some future point?
Excellent, that means I won't have to worry about adding anything to this beast* for another year, except maybe RAM. That means more money to spend elsewhere, like preordering Going Rogue
*Beast = Gateway FX6800-e
Ok, so here's my comp:
Core 2 Duo 2.8 GHz
2GB RAM (soon to be 4GB)
Nvidia GeForce 9400 GT 512MB video card
Now, I can run Oblivion and CO on fairly high settings with the current components in my comp. I don't think Going Rogue will give me trouble, but any thoughts, anyone?
Nice to know my
Intel(R) Core(TM) i7 CPU 920 @ 2.67GHz
6GB RAM
Windows 7 64-bit and
ATI Radeon HD 5700 1024MB graphics card
will be fine with GR. Hate to think it wouldn't be, lol.
Will it still work well, though? Or does it mean it won't work for Ultra Mode?
I think you asked the following questions:
Will playing on Windows 7 work well?
Will playing on Windows 7 not work for Ultra Mode?
To reiterate what Chad said, City of Heroes leverages the OpenGL Application Programming Interface. OpenGL is the graphics industry standard for rendering 3D graphics, and I do mean industry. OpenGL is controlled by the Khronos Group, which is staffed by representatives who have Promoter Membership. You can view the companies that have seats on Khronos's Board of Directors here: http://www.khronos.org/members/promoters
The list reads as a who's who of the technology world, populated by the likes of AMD, Apple, ARM, Sony Ericsson, Intel, Motorola, Nvidia, Nokia, Qualcomm, Samsung, Sony Entertainment, and Texas Instruments. The only two you might not recognize on the Promoter member list are Freescale and Imagination.
The contributors list is where you'll find other companies you've probably heard of, like IBM, Autodesk, Activision / Blizzard, Broadcom, Creative, Dell, Electronic Arts, Fujitsu, FutureMark, General Electric, Google, LG, Mitsubishi, NEC, Opera, Palm, Panasonic, Via / S3, Toshiba, and Transgaming, the guys who do the Macintosh / Linux version of CoH.
In fact, the only technology company you've probably heard of that is not invested in Khronos... is Microsoft.
So, your questions are actually valid questions.
Microsoft does not like OpenGL, OpenCL, or any of the other open standards backed by Khronos. Microsoft has its own proprietary technologies that it would rather developers use. The fact is, if you use Microsoft technology, you are locked into Microsoft platforms. Okay, that sounds somewhat good on the desktop, where Microsoft has maybe 65%-75% of the consumer market. It's not so good anywhere else... and it's even worse when you realize that writing a game against OpenGL ES 2.0 would let you target the iPhone, PlayStation 3, Xbox 360, Microsoft Windows NT5 (2000/XP), Microsoft Windows NT6 (Vista/Win7), Wii, Linux (desktop Linux, Android, ChromeOS, MontaVista Linux, SplashTop), OSX, and smartphones running Symbian.
So... one code-base equals pretty much every platform a consumer can use. Another code-base means... a very limited selection.
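To make the one-code-base point concrete, here's a minimal sketch (my illustration, not the game's actual code): the core OpenGL ES 2.0 shader calls are the same ones desktop OpenGL 2.0 exposes, so an identical compile path builds everywhere; only the header / loader differs.

```c
/* Minimal sketch, not CoH's code: the same shader-compile routine
 * works unchanged on desktop OpenGL 2.0 and OpenGL ES 2.0. */
#ifdef USE_GLES
#include <GLES2/gl2.h>   /* phones and other embedded targets */
#else
#include <GL/glew.h>     /* desktop Windows / Linux / OSX via a loader */
#endif
#include <stdio.h>

GLuint compile_shader(GLenum type, const char *src)
{
    GLuint s = glCreateShader(type);
    glShaderSource(s, 1, &src, NULL);
    glCompileShader(s);

    GLint ok = GL_FALSE;
    glGetShaderiv(s, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[512];
        glGetShaderInfoLog(s, sizeof log, NULL, log);
        fprintf(stderr, "shader compile failed: %s\n", log);
    }
    return s;
}
```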
Incidentally, this is why so many games that are lead developed on the Xbox 360 have problems on the Playstation 3, and why Valve won't release games for the Playstation 3 platform. Many game developers target DirectX and are at a loss when presented with anything that isn't dummy-moded. (and yes, I went there. DirectX is dummy mode).
Another fact is that Microsoft has been caught sabotaging OpenGL performance on their operating systems. Vista, for example, deliberately broke OpenGL support during alpha testing, and the programmer backlash forced Microsoft to unbreak it: http://www.opengl.org/pipeline/article/vol003_7/ :: http://blog.wolfire.com/2010/01/Why-...nd-not-DirectX
As of right now, none of Microsoft's published operating systems are known to break OpenGL compatibility. So, in theory, City of Heroes: Ultra Mode should work as well as your graphics cards and drivers will allow under Windows 7.
Does this mean that you'd want to use Windows 7, or that it would work well?
Well, that's a subjective question. If you look through the technical support section you'll see lots of users having problems with various installs on Windows 7, across both AMD/ATi and Nvidia graphics cards.
Now, I can tell you this. If Transgaming's upcoming graphics engine rewrite works like it's supposed to, you'll get the exact same image running under OSX or Linux that you will running under a Windows system. If the game works properly, you'll also get the exact same image regardless of which version of Windows you run, be it 2000, XP, Vista, or Win7. You should also have the exact same image across ATi, Nvidia, Intel, or any graphics chip that supports OpenGL 3.0 (presuming that's the API in use).
Now, that's theory. We don't know yet if that's reality. We don't know yet if XP's lighter system overhead will give it a performance advantage, at the same image quality, over Vista / Win7. We don't know yet if Nvidia's and ATi's drivers will be up to the task of rendering Ultra Mode on all platforms equally.
For that, we do have to wait for the beta.
***
Does this answer the questions I think you asked?
Ok .. I gotta ask .. My comp is a custom'd Shuttle, Windows XP Pro, fully updated, 3.0 gig Wolfdale Core 2 Duo CPU, 3 gigs of 800 memory, and an Nvidia 9600 GT. Monitor is 1280 x 1024.
Current CoV runs VERY smooth at max settings, will it be enough for the upcoming 'Ultra Mode'?
Note, I've tried Champions Online as well, and was able to max out the settings without straining there as well.
And no advertising ATI or AMD to me PLEASE, I've had my best luck with Intel and Nvidia, End of Story.
Shigeru Miyamoto "A delayed game will eventually be good, a bad game is bad forever."
While the baseline for Going Rogue on the Nvidia side is the 9800 / GTS 250, we don't know yet what the resolution scaling is. If the developers' resolution scaling is based against 720p, then a 9800 / GTS 250 would be the minimum for 1280*720 or 1440*900 (the common PC monitor size), and your 9600 GT would not be powerful enough.
If the resolution scaling is based against 1080p, then a 9800 / GTS 250 would be the minimum for 1920*1080 or 1920*1200, and your 9600 GT would probably be powerful enough to drive Going Rogue at your monitor's limited resolution.
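Rough numbers, my arithmetic rather than anything official: 1280*720 is about 0.92 million pixels, while 1920*1080 is about 2.07 million, roughly 2.25 times the fill work. Your 1280*1024 screen sits at about 1.31 million pixels, so a card rated as "minimum" for a 1080p target has real headroom at your resolution, while a card rated as minimum for a 720p target does not.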
I hope they release more information on the 5xxx Radeon cards' Ultra Mode performance soon. The graphics card in my wife's machine died, and I'm looking to upgrade to either the 4890 or the 5770. *taps foot*
Maybe we can get him to answer...
Here's a link to a pretty good review of the 5000 series Radeon cards, including overclocked and crossfire setups.
My Deviant Art page link-link
CoH/V Fan Videos
I don't know if this question has been answered, but I use a GeForce 9800 GT, so will I be able to run ultra-mode? I don't care either way, I'm just curious.
Yeah, I'm looking at a 5000 series also, though I'll hope that my HP power supply can be easily upgraded (just got the box over Christmas).
Arc: 378122 "Tales of the Terran Space Marines -The Apocalypse Initiative" 5stars!
http://www.youtube.com/watch?v=6Rgl4...687B0FC89F142C
Arc: 481545 "Twilight of the Gods - The Praetorian conflict"8000+ hits!
http://www.youtube.com/watch?v=XxJ6S...848B21E2350DCC
------snip-----------
If you are looking to spend under US$100, then an NVidia 9800 GT is your best bet. For AMD (ATI/Radeon), we don't have enough of these cards at this price point to get you good data. This would be the minimum card for enabling all the features, only at reduced quality settings.
----------snip----------
Note: This is based on our current internal testing and might change by the time Going Rogue is released. Also, a video card is not the ONLY thing you should take into consideration. Your PC having a decent processor and memory will also enhance your performance. As we continue to test and get more information available we will update you.
I hope this helps, and happy holidays!
If the game spit out 20 dollar bills people would complain that they weren't sequentially numbered. If they were sequentially numbered people would complain that they weren't random enough.
Black Pebble is my new hero.
CoX uses OpenGL, and NVidia consistently writes better OpenGL drivers than ATI. ATI supports DirectX pretty well, but they've historically been sloppy with their implementation of OpenGL.
Hardware performance is important without a doubt, but don't overlook the quality of the software controlling it all.
As an example, I was playing around with OpenGL framebuffer objects, doing some post-processing effects (glow, depth of field). No issues on NVidia in XP or Linux (the proprietary binary drivers). Completely broken with the ATI card in my Lenovo. It seems ATI only supports a selection of possible pixel modes (# bits/color/alpha) and integer/float modes for framebuffers, and has some odd caveats about copying buffers and the final display buffer.
Even with my novice mucking around with OpenGL it's pretty obvious ATI cards eat up more developer time and probably end up adding duplicated effort to build the shaders and post processing filters.
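For anyone curious what that probing looks like in practice, here's a minimal sketch, assuming the old GL_EXT_framebuffer_object entry points via GLEW; the completeness check at the end is where one vendor's driver says yes and another's comes back unsupported:

```c
/* Minimal sketch: build an FBO with a half-float color texture and
 * ask the driver whether it actually supports that combination.
 * Assumes glewInit() has run and EXT_framebuffer_object plus
 * ARB_texture_float are advertised. */
#include <GL/glew.h>
#include <stdio.h>

int make_hdr_fbo(int w, int h)
{
    GLuint fbo, tex;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F_ARB, w, h, 0,
                 GL_RGBA, GL_FLOAT, NULL);

    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, tex, 0);

    /* This is where the vendor differences bite: a format combination
     * that is complete on one driver comes back UNSUPPORTED on another. */
    GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
    if (status != GL_FRAMEBUFFER_COMPLETE_EXT) {
        fprintf(stderr, "FBO incomplete: 0x%04x\n", status);
        return 0;
    }
    return 1;
}
```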
AMD also shipped OpenGL 3.0 support in the Catalyst 8.9 driver set, released on 9/17/2008.
Nvidia didn't get around to it until December 16th: http://news.developer.nvidia.com/200...0-drivers.html
Nvidia's 64bit OpenGL drivers have also been broken for years: http://www.mepisguides.com/Mepis-7/h...ati-64bit.html
As users of Windows 7 64bit are finding out today, things haven't really improved on the Nvidia side.
So do us a favor, get out of 2006 and please rejoin 2010.
***
Okay, for some background data here. The support of ATi and Nvidia in OpenGL can pretty much be tied to the history of how OpenGL was handled.
Just a few years ago the OpenGL specification was directionless. One of the developer advantages to OpenGL is that the specification could accept various proprietary vendor extensions, but nobody in the OpenGL Architecture Review Board was really setting a path for which extensions should be standardized and which should be discarded. Nvidia made great use of OpenGL's proprietary extensions with technologies like UltraShadow in Doom 3. Nvidia used the proprietary extensions that OpenGL could support in order to make their hardware produce images that were better than competitors', while achieving better frame-rates. That's all well and good, but as CoH users have found out the hard way, when you let Nvidia write your shaders and background graphics library, you'll wind up with a graphics system that resembles Sluggy's Torg after going through Riff's paper-cut bomb. (Too lazy to go find that comic.)
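To make "proprietary vendor extensions" concrete: an OpenGL app discovers them at runtime by scanning the extension string, and only then enables the vendor-specific path. A minimal illustrative sketch, not CoH's or id's actual code (GL_NV_depth_bounds_test is the real extension behind UltraShadow):

```c
/* Illustrative sketch: runtime detection of a vendor extension. The
 * strstr check is the classic, if slightly substring-sloppy, way apps
 * of that era did it. */
#include <GL/gl.h>
#include <string.h>

int has_extension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;
}

void pick_shadow_path(void)
{
    if (has_extension("GL_NV_depth_bounds_test")) {
        /* Nvidia-only fast path: skip shadow fragments outside the
         * light's depth bounds. */
    } else {
        /* portable path using only core OpenGL features */
    }
}
```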
Starting in 2004, 3Dlabs started pushing OpenGL 2.0 in response to the stagnation and aimless direction of OpenGL's development. Through 2006 and 2007 the Khronos Group took over the Architecture Review Board and pushed out OpenGL ES 2.0, a stripped version of OpenGL that, much like 3DFX's old Glide API, provided little more than the features necessary for fast-path texturing and shading... aka the gaming-related functions of OpenGL.
ATi, and other vendors like S3, Trident, and Matrox, weren't known for creating proprietary OpenGL extensions. Rather, they were known for relying on specifications. ATi's reputation for being closely tied to DirectX came from the launches of their Radeon cards, which were originally named after DirectX levels, and the launch of the 9700, in which ATi had worked closely with Microsoft to determine the specifications of DirectX 9, as well as producing a reference card to accelerate DirectX 9.
As OpenGL has become more standardized and regained a direction, as well as an overseeing body that isn't going to let the API sit idle for years, it has been easier for vendors that build cards and drivers against published specifications to get involved with OpenGL. With these background changes, AMD has interacted more closely with the Khronos Group, and its OpenGL support has dramatically improved.
AMD, however, isn't in a luxury position to do what they want to do, when they want to do it. They do have to follow the money, and unfortunately, right now, the big money for PC gaming is on DirectX. Personally, I think DirectX is a multi-billion dollar mistake that game developers can't afford to keep making... and the point is made clear across consoles like the PS3 and Xbox 360, as well as mobile platforms like Android and iPhone. Porting Xbox 360 games built on DirectX to the PS3's OpenGL ES 2.0 often results in poorer games. However, porting games built against OpenGL ES 2.0 on the PS3 first, then running the exact same code on the Xbox 360, generally results in good game ports. In the same way, developers focused on multiple markets are finding that DirectX is a pitiful way to reach the largest number of players. OpenGL... well. An app written against OpenGL will run on Android, iPhone, Windows, Linux, Apple OSX, PS3, Wii, Xbox 360... you name it.
So, as portable devices become more popular, we'll hopefully see the shift from DirectX to OpenGL that needs to happen, and we'll see AMD taking a larger and more active role in pushing OpenGL support.
Ok, as a Win7 64 bit user, I have yet to have a problem with NVidia's drivers, and I know plenty of others who have no problems with them.
If you want to stop FUD on the other side of things that would be nice.
It has seemed to me that ATI has always had more issues with CoH, from Alt-Tab problems on up. Just what I have noticed over the years.
Both makers have nice cards out there. Pick which you like.
Defcon 0 - (D4 lvl 50),DJ Shecky Cape Radio
@Shecky
Twitter: @DJ_Shecky, @siliconshecky, @thecaperadio
When you air your dirty laundry out on a clothesline above the street, everyone is allowed to snicker at the skid marks in your underoos. - Lemur_Lad
looking to spend my tax return
Should this do ultra mode cranked up to 11?
CPU: Intel® Core™ i7-860 2.80 GHz 8M L2 Cache LGA1156
HDD: Single Hard Drive (1TB (1TBx1) SATA-II 3.0Gb/s 16MB Cache 7200RPM HDD)
MOTHERBOARD: [CrossFireX/SLI] EVGA P55 TR Intel P55V Chipset DDR3 Socket 1156 mATX Mainboard w/ 7.1 HD Audio, GbLAN, USB2.0, SATA-II RAID, 2 Gen2 PCIe, & 2 PCIe X1
MEMORY: 8GB (2GBx4) DDR3/1600MHz Dual Channel Memory Module
SOUND: HIGH DEFINITION ON-BOARD 7.1 AUDIO
VIDEO: NVIDIA GeForce GTX285 2GB 16X PCIe Video Card (EVGA Powered by NVIDIA)
I reject your reality and substitute my own!
--Adam Savage from "Mythbusters"
I have updated the first post in this thread with the information I got today regarding a couple things.
Read it here
Positron
Follow me on Twitter
When I spoke to the tech guys at IBuyPower.com (where I got my gaming rig), they told me the Radeon 5850 is virtually identical to the GTX 285, FWIW.
*happy dance*
I may actually be able to afford the video card upgrade to make vids with Ultra Mode... thanks to a small but devoted group of my fans. Halfway there...
Michelle
aka
Samuraiko/Dark_Respite
THE COURSE OF SUPERHERO ROMANCE CONTINUES!
Book I: A Tale of Nerd Flirting! ~*~ Book II: Courtship and Crime Fighting - Chap Nine live!
MA Arcs - 3430: Hell Hath No Fury / 3515: Positron Gets Some / 6600: Dyne of the Times / 351572: For All the Wrong Reasons
378944: Too Clever by Half / 459581: Kill or Cure / 551680: Clerical Errors (NEW!)
What does SLI not being supported mean? If you have two GTX 260s in SLI will ultra mode not work at all, or will you just not see a benefit from the SLI? (As in, would it just be like a single 260)
@Bengal Fist - Freedom - Authority SG
Bengal Fist (SS/EA) - Thirt3en (Time/Elec) - Aussi (Elec/Shield) - Potamoi (Water/Time) - Parkr (Staff/Ela)
Nice - my new computer has a pair of high-end ATIs, so I hope you can get Crossfire to work with UM
@Golden Girl
City of Heroes comics and artwork
Woohoo! This is great news. I can justify the purchase of a pair of 5770's. They're like 50 bucks cheaper than the GTX260's.
Hello, my name is @Caligdoiel and I'm an altoholic.
The motherboard has one PCIe/16 slot and 2 PCI slots.
2 Gig Ram - From Dell... that'll be A$100 plus delivery.
Video card: ATI HD5750 PCI-E 2.0 1GB from one different supplier - A$200
and then the power supply... 650W power supply ... A$180 plus delivery
... so I got some saving to do and some bargain hunting to do.
(650W is to allow for the burner I'll eventually add and better regulation of regional power)
I've heard that even with the shipping costs, and the 5% tack-on, you will come in under what you would pay from an Australian site, which would have a smaller inventory to choose from.
Edit: The order form will allow you to input the cost of the item, and it will calculate an estimated quote for you.
Thank you, Champion.