Ultra-mode video card shopping guide


5th_Player

 

Posted

Quote:
Originally Posted by Gulver View Post
How can I find that out?
Look up the specs on your PSU, if you know what it is. Elsewise, you'll have to open the case and check (the information should be listed on the physical unit).


 

Posted

Hmmm... well, it is going to be interesting to see how many people only upgrade their video card and wind up bottlenecked at the processor. I know that will more than likely be my bottleneck, as I am using a four-year-old Athlon 64 X2 4200+ dual-core machine.

That being said, I am curious to see how different machines take to the Ultra-Graphics Mode overall.

After re-reading the post from Posi, there are some cards, such as the GTS 250, with a price point in the mid-$100s that seem to defy the scale Posi set up (the 250 is basically a 9800), and the 260s are running just over $200 right now. So the price points are a little off.


Defcon 0 - (D4 lvl 50),DJ Shecky Cape Radio
@Shecky
Twitter: @DJ_Shecky, @siliconshecky, @thecaperadio
When you air your dirty laundry out on a clothesline above the street, everyone is allowed to snicker at the skid marks in your underoos. - Lemur_Lad

 

Posted

Quote:
Originally Posted by DJ_Shecky View Post
Hmmm... well, it is going to be interesting to see how many people only upgrade their video card and wind up bottlenecked at the processor. I know that will more than likely be my bottleneck, as I am using a four-year-old Athlon 64 X2 4200+ dual-core machine.

That being said, I am curious to see how different machines take to the Ultra-Graphics Mode overall.

After re-reading the post from Posi, there are some cards, such as the GTS 250, with a price point in the mid-$100s that seem to defy the scale Posi set up (the 250 is basically a 9800), and the 260s are running just over $200 right now. So the price points are a little off.
Well, those who can't afford a full new box, or those with already decent systems, will likely go the GPU route and maybe end up bottlenecked at the processor. The rest will either not run UM, can already run UM well with no upgrades, or will buy a new rig that can run UM with no issues. That's my guess, anyway. As for the price points Posi uses, keep in mind the original post was done in November. Markets and such change, so it shouldn't be too surprising that his numbers are off a bit.

The best thing for anyone looking to upgrade is to figure out their budget and then do some homework: find whatever gives them the best performance at a price they can afford and buy that. Plenty won't, and will end up disappointed, but really, think before you spend and you'll do fine.


It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.

 

Posted

Hey, for you folks wanting to help push Nvidia for SLI capability for this update, I managed to find a thread on Nvidia's website.

http://forums.nvidia.com/index.php?showtopic=159310

I wasn't able to post due to some odd error, but for you Nvidia SLI people, let's get this issue looked at!



50's (Only most important)
Zuraq (53 EM/SR Brute)
Stagmalite (50 Granite/Fire TanK)
(Couple of others I don't care about.)

 

Posted

Quote:
Originally Posted by Bill Z Bubba View Post
Correct me if I'm wrong, but you seem to be stating that there's nothing the devs can do to the engine that the OS isn't already handling in order to "spread the pain" across all the cores on a proc.
Not exactly. What I'm saying is that the current game doesn't seem to generate enough "pain" on even two i7 cores to make spreading that load across four or eight i7 cores worth it. If you have a single-processor machine, the many threads of the game all have to run on that one core, and going to two cores would help. The exception would be if you had some special massively overclocked super-processor that was 50x faster than a P4 per core: then you'd have so much processing power that whether the game used one core or two wouldn't make a difference to your game. Similarly, Nehalem cores seem to be fast enough that two of them deliver more power than the game ever seems to ask for. Threading makes sense when you have more load than one processor core can comfortably handle and you have more than one core to spread it across.
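
(To make that last point concrete, here's a toy sketch in C++ - nothing from the CoX engine, and the "load" number is entirely made up - that runs a busy-loop on one thread and then splits the same work across two. Splitting roughly halves the wall-clock time, but if the single-thread time already fits comfortably inside a frame budget, the player never notices the difference.)

// Toy sketch, not game code: a made-up "per-frame load" run on one thread,
// then split across two. Splitting only matters if one core was struggling.
#include <chrono>
#include <iostream>
#include <thread>

void burn(long long iterations) {            // stand-in for real work
    volatile long long sink = 0;
    for (long long i = 0; i < iterations; ++i) sink += i;
}

int main() {
    const long long load = 200000000;        // hypothetical total load

    auto t0 = std::chrono::steady_clock::now();
    burn(load);                              // everything on one core
    auto single = std::chrono::steady_clock::now() - t0;

    t0 = std::chrono::steady_clock::now();
    std::thread a(burn, load / 2), b(burn, load / 2);
    a.join(); b.join();                      // same load, two cores
    auto dual = std::chrono::steady_clock::now() - t0;

    using ms = std::chrono::milliseconds;
    std::cout << "one thread:  " << std::chrono::duration_cast<ms>(single).count() << " ms\n"
              << "two threads: " << std::chrono::duration_cast<ms>(dual).count() << " ms\n";
    // If the one-thread number is already well under your frame budget
    // (say 16 ms for 60 fps), halving it buys you nothing you can see.
}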


Quote:
Granted, my understanding of processor engineering is nonexistent, but if that's the case, why did the devs have to enable the renderthread flag so that it could use the second thread/core in the first place?
I'm not an expert on the renderthread flag, but I believe it specifically signalled the game to separate two different rendering tasks, which ordinarily run in one thread, into two different threads. The unit of processing that is scheduled by the operating system is the thread. If you have only one thread, then Windows (and most other operating systems) can only use one core. Individual threads can be put on different cores to balance the load. Older versions of Windows used to have incredibly dumb schedulers: the classic problem was a single high-CPU-utilization thread "bouncing" back and forth between two cores in a dual-proc system. Windows saw the load on CPU A as very high and CPU B as very low, so it scheduled the highest-load thread onto B. Which made A low and B high. Which caused Windows to schedule that thread onto A. Rinse, repeat. This actually made execution *slower* than if it had just left the thread alone, because switching cores has a certain amount of overhead (which is why we have an "affinity" switch).
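
(For the curious, that "affinity" switch is just a mask of which logical CPUs a thread is allowed to run on. A minimal Windows sketch - again, nothing to do with the game's own code - looks something like this:)

// Toy sketch: pin the current thread to logical CPU 0 so the scheduler
// can't bounce it between cores. Bit N of the mask = logical CPU N.
#include <windows.h>
#include <cstdio>

int main() {
    DWORD_PTR previous = SetThreadAffinityMask(GetCurrentThread(), 1);
    if (previous == 0) {
        std::printf("SetThreadAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    std::printf("Pinned to CPU 0 (old mask was 0x%llx)\n",
                (unsigned long long)previous);
    // ... any CPU-heavy work here now stays on CPU 0 ...
    return 0;
}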

CoX seems to have a couple of high-load threads, plus one for OpenGL and one for 3D sound, that burn at least measurable amounts of CPU. Everything else burns almost nothing, at least relative to the power of a Nehalem core (except demorecords, which seem to burn a percent or two on a Nehalem core, which is at least measurable).


Quote:
Why is it locked down to two cores as it is? Or is it? Resource Monitor is showing me that cityofheroes.exe is using 17 threads for 13% CPU utilization, with most of the action appearing to happen on CPUs 0 and 6 and half as much working on 2 and 4. All the odd-numbered CPUs are labeled "parked" and show almost no activity at all.

Oh. I guess it doesn't matter.
Windows is doing the right thing. It's taking the two most computationally intensive threads and putting them onto two different physical processors. It's then taking the rest of the threads and sprinkling them onto the other two physical processors. And it's noting that when it does, none of those processors is anywhere near full, so it specifically only uses one core per processor, because it doesn't need to use the virtual core on any of those four. If it attempted to use any of the virtual cores, it would do so at the expense of slowing down one of the four physical cores.
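
(If you want to watch this yourself rather than trust Resource Monitor, here's a small Windows sketch - again, nothing to do with the game - that spins up a few busy threads and asks each one which logical CPU it ended up on. On an otherwise idle Hyper-Threaded i7 you'll typically see them land on different physical cores first, as described above.)

// Toy sketch: four busy threads report which logical CPU they ran on.
// On an idle HT-capable i7, Windows usually spreads them across
// different physical cores before touching the "virtual" siblings.
#include <windows.h>
#include <cstdio>
#include <thread>
#include <vector>

void worker(int id) {
    volatile long long sink = 0;
    for (long long i = 0; i < 500000000; ++i) sink += i;   // busy work
    std::printf("thread %d ended on logical CPU %lu\n",
                id, GetCurrentProcessorNumber());
}

int main() {
    std::vector<std::thread> threads;
    for (int i = 0; i < 4; ++i) threads.emplace_back(worker, i);
    for (auto& t : threads) t.join();
}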

Think of the four i7 processors as four pickup trucks, each of which can pull a trailer. You have four of them driving from LA to San Francisco and you need to haul a certain amount of cargo. The first thing you're going to do is try to fill the beds of the four trucks before resorting to trailers. If you can, there's no need for the trailers. You could attach the trailers to all four trucks and move some stuff out of the beds and into the trailers, but you'd then be hauling around the trailers for no reason: you aren't doing any more work.

However, if you have more stuff than the four truck beds can hold, it's obviously a lot better to start using the trailers, even if it adds weight and slows the trucks down, than to leave some stuff behind and have to make two trips.

Now, if you have just one really big crate to take, even if you have four trucks you can only put that object in one truck. You have three empty trucks and one overloaded truck that has to drive very slowly. Recognizing this, if you can split that shipment into two crates half the size, you end up with two trucks with a comfortable load (and two empty trucks) that can both drive at comfortable highway speed.

In this case, you have four pickup trucks that can each pull a trailer, and you're being asked to haul a case of beer. Your question boils down to whether or not it makes sense to take the cans out of the plastic rings, place two cans in each truck, add the trailers, and put one can in each trailer. It's possible, but not really beneficial.


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

So my GeForce GT220 1GB vid card should be good to go?


Global: @All-American Teen
70 toons across 11 servers.

Top hero -All-American Teen lvl 50 eng/invul Tanker (01:10 EST; 1/24/09)
Top villain -Poisoned Plum lvl 30 robots/devices Mastermind

 

Posted

Quote:
Originally Posted by Dave_67 View Post
So my GeForce GT220 1GB vid card should be good to go?
No. Not even close.

The GT220 is basically a rebadged GeForce 9400 GT. That's several steps below the starting point of a 9800 GTX / GTS 250.


 

Posted

Quote:
Originally Posted by Gulver View Post
And for the slag comment, I've heard... "Stories" about people buying the most expensive graphics card they could and ending up frying their motherboard and losing everything due to incompatibility or too much power.

Not sure if I'm recalling it correctly, or how accurate said stories are.
I popped a GTX 260 into my 4-year-old computer and had a fried motherboard less than a week later. It should have been compatible and it should have had more than enough power. The timing seemed a little suspicious, but I don't know for sure that the card caused the problem.

I didn't really have the cash for a dream system build but I have a pretty good in-betweener that ought to keep me happy for 3 years or so. I'm kind of glad it happened really except for the few things I can't get off my old hard drives for love or money.


Attache @ deviantART

Attache's Anti-401k Art Collection

 

Posted

Quote:
Originally Posted by je_saist View Post
No. Not even close.

The GT220 is basically a rebadged GeForce 9400 GT. That's several steps below the starting point of a 9800 GTX / GTS 250.
No, it's not a rebadge of anything; it's an entirely new chip. Technically it's a mash-up of the 9500GT and the 9600GSO, and the better ones (with GDDR3 memory) fall right in between those two in performance.

je_saist is still right that the GT 220 is far below the 9800GT, somewhere around half the performance of the 9800GT.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Quote:
Originally Posted by Father Xmas View Post
No, it's not a rebadge of anything; it's an entirely new chip. Technically it's a mash-up of the 9500GT and the 9600GSO, and the better ones (with GDDR3 memory) fall right in between those two in performance.

je_saist is still right that the GT 220 is far below the 9800GT, somewhere around half the performance of the 9800GT.
Okay. Re-implemented, then, at a smaller die size.

To me it's sort of like comparing the Radeon 9700 to the 9800 or, I think more accurately, the 9800 to the X800. Yes, it's technically different... but really, it's the same basic architecture underneath.


 

Posted

Quote:
Originally Posted by je_saist View Post
Okay. Re-implemented, then, at a smaller die size.

To me it's sort of like comparing the Radeon 9700 to the 9800 or, I think more accurately, the 9800 to the X800. Yes, it's technically different... but really, it's the same basic architecture underneath.
No it's not. It's an entirely new chip.

9400GT - 16 SPs, 8 Texture units, 8 ROP units
9500GT - 32 SPs, 16 Texture units, 8 ROP units
9600GSO - 48 SPs, 24 Texture units, 16 ROP units

GT 220 - 48 SPs, 16 Texture units, 8 ROP units

9800GT - 112 SPs, 56 Texture units, 16 ROP units

On top of that, the GPU in the GT 220 supports DX10.1, where the older G9x-based GPUs only support DX10.

In other news, there are rumors that video card prices are going to creep up due to continuing 40nm GPU shortages as well as RAM prices going up again.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

I play CoH/V on a laptop with a 9800M GTS video card.

Any idea if this will run Ultra-Mode? And if so, how well it's likely to do? I understand that we don't have hard data yet, but an idea would help.

Thanks.


 

Posted

Quote:
Originally Posted by Father Xmas View Post
No it's not. It's an entirely new chip.

9400GT - 16 SPs, 8 Texture units, 8 ROP units
9500GT - 32 SPs, 16 Texture units, 8 ROP units
9600GSO - 48 SPs, 24 Texture units, 16 ROP units

GT 220 - 48 SPs, 16 Texture units, 8 ROP units

9800GT - 112 SPs, 56 Texture units, 16 ROP units

On top of that, the GPU in the GT 220 supports DX10.1, where the older G9x-based GPUs only support DX10.

In other news, there are rumors that video card prices are going to creep up due to continuing 40nm GPU shortages as well as RAM prices going up again.
Part of me is tilting my head on this... and I'm going to try to explain why. The GT 220 and other chips are built off of the GT200 architecture, the same architecture behind the GTX series. Back when the GTX series launched, several sites, such as Beyond3D and Bit-Tech, looked at the known information and concluded that the base architectures of the G80 and GT200 were pretty much the same.

Ergo, for a chip derived from the GT200 series, I'm not entirely convinced the GT 2xx series is new so much as it is Nvidia doing what they pretty much did before: re-implementing an existing solution in a new die.

Also, as the original G80 was based on programmable shaders, DX 10.1 wasn't exactly that big of a deal: http://www.extremetech.com/article2/...129TX1K0000532

(mental note for another thread: the ExtremeTech link also relates to another thread about fallbacks between OpenGL and DirectX)


Pretty much the biggest deals for DX 10.1 are that 4x AA is mandatory and that it forces 32-bit floating-point precision. If you look at the rest of the details, the G80 architecture was pretty much capable of supporting it from the start: http://www.istartedsomething.com/200...-101-siggraph/

So I'm pretty much willing to stand by my statement that the GT series isn't actually a "new" chip; it's just a re-implementation of the stuff Nvidia was already selling, tweaked to sound more attractive to prospective buyers.


 

Posted

Quote:
Originally Posted by Peregrine_Falcon View Post
I play CoH/V on a laptop with a 9800M GTS video card.

Any idea if this will run Ultra-Mode? And if so, how well it's likely to do? I understand that we don't have hard data yet, but an idea would help.

Thanks.
Nvidia's mobile chips are almost never as powerful as the desktop versions. You might be able to play in Ultra Mode depending on what the resolution scaling is.


 

Posted

Hey, not trying to throw the current conversation off topic or anything...

But I managed to find the Nvidia SLI request form, for anyone who wants Ultra Mode to be SLI-capable. I posted a link before, but here it is again, followed by the SLI Application Compatibility Request link.

http://forums.nvidia.com/index.php?s...&#entry1009215 - Discussion

http://www.slizone.com/object/sliapp_request.html - Sli Application Compatibility Request



50's (Only most important)
Zuraq (53 EM/SR Brute)
Stagmalite (50 Granite/Fire TanK)
(Couple of others I don't care about.)

 

Posted

Quote:
Originally Posted by Peregrine_Falcon View Post
I play CoH/V on a laptop with a 9800M GTS video card.

Any idea if this will run Ultra-Mode? And if so, how well it's likely to do? I understand that we don't have hard data yet, but an idea would help.

Thanks.
Well, according to Notebookcheck's info on laptop GPUs, the 9800M GTS has similar performance to the desktop 8800GS/9600GSO/GT 240, which are all slower than the desktop 9600GT.

And since we don't really know the actual performance impact of Ultra Mode, combined with the current myriad of game settings as well as your preferred game resolution, etc., we really can't say, but it's likely on the no/not-well side.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Posi has already pretty much said that laptops aren't really compatible with Ultra Mode.

I was gonna spend $$$$$ to get one that would play Ultra Mode on medium/high settings, and he said there isn't one on the market that will be able to do that.


 

Posted

I've been putting off my computer upgrade for a long time... Dropped the $1k and bought me a new, well, everything.

For my money, I got:
Processor: i7-920
RAM: 3 x 2GB DDR3-1600
Hard Drive: 1TB, 7200 RPM, 32MB cache
Video Card: ATI Radeon HD 5850
Power Supply: 650W
Case: Yes (bought a new one)

I'm excited to put it together. Woo!


 

Posted

Quote:
Originally Posted by funnyhalo View Post
Hey, not trying to throw the current conversation off topic or anything...

But I managed to find the Nvidia SLI request form, for anyone who wants Ultra Mode to be SLI-capable. I posted a link before, but here it is again, followed by the SLI Application Compatibility Request link.

http://forums.nvidia.com/index.php?s...&#entry1009215 - Discussion

http://www.slizone.com/object/sliapp_request.html - Sli Application Compatibility Request
Adding an SLI profile for this game is something we can do right now, or at least you were able to the last time I looked. That is only one small step in the SLI/Crossfire equation, though. All that link will do is include a multi-GPU profile instead of the single-GPU one as the default. ATI is more of an issue since they didn't (I don't know if this is still true) allow users to set up their own configurations, and the default mode is a simple subdivision of the screen between multiple GPUs as opposed to alternate-frame or tile configurations.

Also, the SLI software developer's guide has a list of things to avoid doing in a 3D renderer because they limit the benefit of a multiple-GPU solution, so there may need to be modifications to the rendering engine itself. We don't know the level of help Paragon/Cryptic (since they wrote the original engine) received from nVidia and ATI when the Ultra Mode features were being added (as well as when the ATI quirks were being fixed), or whether the subject even came up in their discussions.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Quote:
Originally Posted by je_saist View Post
Part of me is tilting my head on this... and I'm going to try to explain why. The GT 220 and other chips are built off of the GT200 architecture, the same architecture behind the GTX series. Back when the GTX series launched, several sites, such as Beyond3D and Bit-Tech, looked at the known information and concluded that the base architectures of the G80 and GT200 were pretty much the same.

Ergo, for a chip derived from the GT200 series, I'm not entirely convinced the GT 2xx series is new so much as it is Nvidia doing what they pretty much did before: re-implementing an existing solution in a new die.

Also, as the original G80 was based on programmable shaders, DX 10.1 wasn't exactly that big of a deal: http://www.extremetech.com/article2/...129TX1K0000532

(mental note for another thread: the ExtremeTech link also relates to another thread about fallbacks between OpenGL and DirectX)


Pretty much the biggest deals for DX 10.1 are that 4x AA is mandatory and that it forces 32-bit floating-point precision. If you look at the rest of the details, the G80 architecture was pretty much capable of supporting it from the start: http://www.istartedsomething.com/200...-101-siggraph/

So I'm pretty much willing to stand by my statement that the GT series isn't actually a "new" chip; it's just a re-implementation of the stuff Nvidia was already selling, tweaked to sound more attractive to prospective buyers.
When you say die shrink, it implies that there was an earlier version of the chip in the GT 220 that was produced and put into cards, the same way that there was a G200 GPU and a G200b GPU, the G200b being the die shrink.

Yes the G8x, G9x, G2xx GPUs are similar in design. The GPUs in nVidia's 6xxx and 7xxx series were also similar to each other. The GPUs in ATI's HD 2xxx, 3xxx, 4xxx, 5xxx are similar to each other.

je_saist, you are painting with a broad brush: "well, they are all similar." So are automobile engines, on the surface. This is a 4-cylinder, that is a 6-cylinder; that is a V8, this one is supercharged. The more you start digging in, the more the differences in design you may consider inconsequential actually do affect performance.

So what if the GPU in the GT 220 is based on the architectural elements of the big G200 GPU from the GTX 2xx series? It's still a new GPU, one that's a bit better than the 9500GT it replaced, which was a bit better than the 8600GT it replaced.

The only "bad" thing about the GT216 and GT215 GPUs is that they are a lower end part. One that should have been available to the masses 12 months ago. Where's the "half of a GT 280" to replace the 9600GT/9800GT?


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Well, it turns out they already have an SLI profile for CoH/V. I don't know if this is recognized by Paragon Studios and PlayNC, and I can already assume, because of Posi's update, that the profile listed does NOT include UM.



50's (Only most important)
Zuraq (53 EM/SR Brute)
Stagmalite (50 Granite/Fire TanK)
(Couple of others I don't care about.)

 

Posted

Well, turns out Crossfiring a 5870 and a 5850 is impossible here... 'cuz they were out of stock. So I bought him another 5870... He is setting them up right now.

We'll see soon how Crossfired 5870s do with Ultra Mode.


 

Posted

Quote:
Originally Posted by Perfect_Pain View Post
Well, turns out Crossfiring a 5870 and a 5850 is impossible here... 'cuz they were out of stock. So I bought him another 5870... He is setting them up right now.

We'll see soon how Crossfired 5870s do with Ultra Mode.
Has there been some revision I haven't noticed? The last official word I saw indicated that there was presently no support in UM for SLI or Crossfire, only a hope for it. Without it, I wouldn't expect to see any performance boost from simply having two GPUs... hell, it may even degrade performance a little bit. >.<


 

Posted

I broke down and got me a GTX 260. Too many games I play besides CoH/V still have issues with ATI cards, so I'm sticking with Nvidia for now.