Originally Posted by Gulver
How can I find that out?
Ultra-mode video card shopping guide
The ATi 5000 series is pretty energy efficient. If your budget is around $200, here's a good deal on an HD 5770.
http://www.peonline.ca/main.php/en/product/36356
Hmmm... well, this is going to be interesting, to see how many people only upgrade their video card and wind up bottlenecked at the processor. I know that will more than likely be my bottleneck, as I am using a 4-year-old Athlon 64 X2 4200+ dual-core machine.
That being said, I am curious to see how different machines take to the Ultra-Graphics Mode overall.
After re-reading the post from Posi, there are some cards, such as the GTS 250, with a price point in the mid-$100s that seem to defy the scale Posi set up (as the 250 is basically a 9800), and the 260s are running just over $200 right now. So the price points are a little off.
Defcon 0 - (D4 lvl 50), DJ Shecky Cape Radio
@Shecky
Twitter: @DJ_Shecky, @siliconshecky, @thecaperadio
When you air your dirty laundry out on a clothesline above the street, everyone is allowed to snicker at the skid marks in your underoos. - Lemur_Lad
Hmmm... well, this is going to be interesting, to see how many people only upgrade their video card and wind up bottlenecked at the processor. I know that will more than likely be my bottleneck, as I am using a 4-year-old Athlon 64 X2 4200+ dual-core machine.
That being said, I am curious to see how different machines take to the Ultra-Graphics Mode overall. After re-reading the post from Posi, there are some cards, such as the GTS 250, with a price point in the mid-$100s that seem to defy the scale Posi set up (as the 250 is basically a 9800), and the 260s are running just over $200 right now. So the price points are a little off.
The best thing anyone looking to upgrade can do is figure out their budget and then do some homework: find whatever gives the best performance at a price they can afford, and buy that. Plenty won't, and will end up disappointed, but really, think before you spend and you'll do fine.
It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.
Hey, for you folks wanting to help push Nvidia for SLI capability for this update, I managed to find a thread on Nvidia's website.
http://forums.nvidia.com/index.php?showtopic=159310
I wasn't able to post due to some odd error, but for you Nvidia SLI people, let's get this issue looked at!
50s (only the most important)
Zuraq (53 EM/SR Brute)
Stagmalite (50 Granite/Fire TanK)
(Couple of others I don't care about.)
Correct me if I'm wrong, but you seem to be saying that there's nothing the devs can do in the engine, beyond what the OS is already handling, to "spread the pain" across all the cores on a processor.
Granted, my understanding of processor engineering is nonexistent, but if that's the case, why did the devs have to enable the renderthread flag so that the game could use the second thread/core in the first place?
CoX seems to have a couple of high-load threads, plus one for OpenGL, plus one for 3D sound, that burn at least measurable amounts of CPU. Everything else burns almost nothing - at least relative to the power of a Nehalem core (except demorecords, which seem to burn a percent or two on a Nehalem core, which is at least measurable).
Why is it locked down to two cores as it is? Or is it? Resource Monitor is showing me that cityofheroes.exe is using 17 threads for 13% CPU utilization, with most of the action appearing to happen on CPUs 0 and 6, and half as much on 2 and 4. All the odd-numbered CPUs are labeled "parked" and show almost no activity at all. Oh. I guess it doesn't matter.
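(As an aside, that thread count is easy to pull yourself. Here's a minimal Win32 sketch - my own toy code, nothing to do with the game's internals - that counts the threads of a process by PID using the Toolhelp API, which is essentially the same data Resource Monitor is reporting:)
[code]
// Toy sketch: count the threads belonging to one process by PID.
// Pass the PID (e.g. cityofheroes.exe's PID from Task Manager) as argv[1].
#include <windows.h>
#include <tlhelp32.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv)
{
    // PID of the process to inspect.
    DWORD pid = (argc > 1) ? std::strtoul(argv[1], nullptr, 10) : 0;

    // Snapshot every thread in the system, then filter by owning process.
    HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPTHREAD, 0);
    if (snap == INVALID_HANDLE_VALUE) return 1;

    THREADENTRY32 te = { sizeof(te) };  // dwSize must be set before use
    int count = 0;
    if (Thread32First(snap, &te)) {
        do {
            if (te.th32OwnerProcessID == pid) {
                std::printf("thread id %lu (base priority %ld)\n",
                            te.th32ThreadID, te.tpBasePri);
                ++count;
            }
        } while (Thread32Next(snap, &te));
    }
    std::printf("%d threads in process %lu\n", count, pid);
    CloseHandle(snap);
    return 0;
}
[/code]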
Think of the four i7 cores as four pickup trucks, each of which can pull a trailer. You have four of them driving from LA to San Francisco and you need to haul a certain amount of cargo. The first thing you're going to do is try to fill the beds of the four trucks before resorting to trailers. If you can, there's no need for the trailers. You could attach the trailers to all four trucks and take some stuff out of the beds and add it to the trailers, but you're now hauling around the trailers for no reason: you aren't doing any more work.
However, if you have more stuff than the four truck beds can hold, it's obviously a lot better to start using the trailers, even if it adds weight and slows the trucks down, than to leave some stuff behind and have to make two trips.
Now, if you have just one really big crate to take, even if you have four trucks you can only put that object in one truck. You have three empty trucks and one overloaded truck that has to drive very slowly. Recognizing this, if you can split that shipment into two crates half the size, you end up with two trucks with a comfortable load (and two empty trucks) that can both drive at comfortable highway speed.
In this case, you have four pickup trucks that can each haul a trailer, and you're being asked to haul a case of beer. Your question boils down to whether or not it makes sense to take the cans out of the plastic rings, place two cans in each truck, add the trailers, and put one can in each trailer. It's possible, but not really beneficial.
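If you want the crate-splitting in code, here's a toy sketch (hypothetical, nothing like the actual game engine): one oversized task pegs a single core, while the same work split in half finishes in roughly half the wall time on two cores.
[code]
// Toy demonstration of splitting one big "crate" of work across two threads.
// heavy_work is a stand-in for any large per-frame job; it is NOT game code.
#include <cstdio>
#include <thread>
#include <vector>

static long long heavy_work(const std::vector<int>& data, size_t lo, size_t hi)
{
    long long sum = 0;
    for (size_t i = lo; i < hi; ++i)
        sum += data[i] * data[i];   // placeholder for real work
    return sum;
}

int main()
{
    const std::vector<int> data(20000000, 3);

    // One truck: a single thread hauls the whole load.
    long long serial = heavy_work(data, 0, data.size());

    // Two trucks: the same load split in half, each half on its own core.
    long long a = 0, b = 0;
    std::thread t1([&] { a = heavy_work(data, 0, data.size() / 2); });
    std::thread t2([&] { b = heavy_work(data, data.size() / 2, data.size()); });
    t1.join();
    t2.join();

    std::printf("serial=%lld split=%lld (same answer, about half the time)\n",
                serial, a + b);
    return 0;
}
[/code]
(Compile with something like g++ -O2 -pthread. And per the beer-run point above: spin this up for a few cans' worth of work and the thread overhead buys you nothing.)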
[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]
In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)
So my GeForce GT220 1GB vid card should be good to go?
Global: @All-American Teen
70 toons across 11 servers.
Top hero -All-American Teen lvl 50 eng/invul Tanker (01:10 EST; 1/24/09)
Top villain - Poisoned Plum lvl 30 robots/devices Mastermind
And for the slag comment, I've heard... "stories" about people buying the most expensive graphics card they could and ending up frying their motherboard and losing everything due to incompatibility or too much power draw.
Not sure if I'm recalling them correctly, or how accurate said stories are.
I didn't really have the cash for a dream system build, but I have a pretty good in-betweener that ought to keep me happy for 3 years or so. I'm kind of glad it happened, really, except for the few things I can't get off my old hard drives for love or money.
Attache @ deviantART
Attache's Anti-401k Art Collection
No. Not even close.
The GT 220 is basically a rebadged GeForce 9400 GT. That's several steps below the starting point of a 9800 GTX / GTS 250.
je_saist is still right that the GT 220 is far below the 9800GT - somewhere around half the performance of the 9800GT.
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
No, it's not a rebadged anything. It's an entirely new chip. Technically it's a mash-up of the 9500GT and the 9600GSO, and the performance of the better ones (with GDDR3 memory) falls right in between theirs.
je_saist is still right that the GT 220 is far below the 9800GT - somewhere around half the performance of the 9800GT.
To me it's sort of like comparing the Radeon 9700 to the 9800, or, I think more accurately, the 9800 to the X800. Yes, it's technically different... but really... it's the same basic architecture underneath.
Okay: reimplemented, then, at a smaller die size.
To me it's sort of like comparing the Radeon 9700 to the 9800, or, I think more accurately, the 9800 to the X800. Yes, it's technically different... but really... it's the same basic architecture underneath.
9400GT - 16 SPs, 8 Texture units, 8 ROP units
9500GT - 32 SPs, 16 Texture units, 8 ROP units
9600GSO - 48 SPs, 24 Texture units, 16 ROP units
GT 220 - 48 SPs, 16 Texture units, 8 ROP units
9800GT - 112 SPs, 56 Texture units, 16 ROP units
On top of that, the GPU in the GT 220 supports DX10.1, where the older G9x-based GPUs only support DX10.
In other news, there are rumors that video card prices are going to creep up due to continuing 40nm GPU shortages, as well as RAM prices going up again.
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
I play CoH/V on a laptop with a 9800M GTS video card.
Any idea if this will run Ultra-Mode? And if so, how well it's likely to do? I understand that we don't have hard data yet, but an idea would help.
Thanks.
No, it's not. It's an entirely new chip.
9400GT - 16 SPs, 8 Texture units, 8 ROP units
9500GT - 32 SPs, 16 Texture units, 8 ROP units
9600GSO - 48 SPs, 24 Texture units, 16 ROP units
GT 220 - 48 SPs, 16 Texture units, 8 ROP units
9800GT - 112 SPs, 56 Texture units, 16 ROP units
On top of that, the GPU in the GT 220 supports DX10.1, where the older G9x-based GPUs only support DX10.
In other news, there are rumors that video card prices are going to creep up due to continuing 40nm GPU shortages, as well as RAM prices going up again.
Ergo, for a chip derived from the GT200 series, I'm not entirely convinced the GT 2xx series is new... so much as it is Nvidia doing what they pretty much did before: re-implementing an existing solution in a new die.
Also, as the original G80 was based on programmable shaders, DX 10.1 wasn't exactly that big of a deal: http://www.extremetech.com/article2/...129TX1K0000532
(mental note for another thread: the ExtremeTech link also relates to another thread about fallbacks between OpenGL and DirectX)
Pretty much the biggest deals for DX 10.1 are that 4x AA is mandatory and that it forces 32-bit floating-point precision. If you look at the rest of the details, the G80 architecture was pretty much capable of supporting it from the start: http://www.istartedsomething.com/200...-101-siggraph/
So I'm pretty much willing to stand by my statement that the GT series isn't actually a "new" chip; it's just a reimplementation of the stuff Nvidia was already selling, tweaked to sound more attractive to prospective buyers.
Hey, not trying to throw the current conversation off topic or anything...
But I managed to find the Nvidia SLI request form, for anyone who wants Ultra Mode SLI-capable. I posted before with a link, but here it is again, followed by the SLI Application Compatibility Request link.
http://forums.nvidia.com/index.php?s...&#entry1009215 - Discussion
http://www.slizone.com/object/sliapp_request.html - SLI Application Compatibility Request
50s (only the most important)
Zuraq (53 EM/SR Brute)
Stagmalite (50 Granite/Fire TanK)
(Couple of others I don't care about.)
I play CoH/V on a laptop with a 9800M GTS video card.
Any idea if this will run Ultra-Mode? And if so, how well it's likely to do? I understand that we don't have hard data yet, but an idea would help. Thanks.
And since we really don't know the actual performance impact of Ultra Mode, combined with the current myriad of game settings as well as your preferred game resolution, etc., we really can't say - but it's likely on the no/not-well side.
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
Posi has already pretty much said that laptops aren't really compatible with Ultra Mode.
I was gonna spend $$$$$ to get one that would play Ultra Mode on Medium/High settings, and he said there is not one on the market that will be able to do that.
I've been putting off my computer upgrade for a long time... Dropped the $1k and bought me a new, well, everything.
For my money, I got:
Processor: i7-920
RAM: 3 x 2GB DDR3-1600
Hard Drive: 1TB, 7200 RPM, 32MB cache
Video Card: ATI Radeon HD 5850
Power Supply: 650W
Case: Yes (bought a new one)
I'm excited to put it together. Woo!
Hey, not trying to throw the current conversation off topic or anything...
But I managed to find the Nvidia SLI request form, for anyone who wants Ultra Mode SLI-capable. I posted before with a link, but here it is again, followed by the SLI Application Compatibility Request link.
http://forums.nvidia.com/index.php?s...&#entry1009215 - Discussion
http://www.slizone.com/object/sliapp_request.html - SLI Application Compatibility Request
Also, there is a list of "avoid doing this in a 3D renderer" tips in the SLI software developer's guide - practices that will limit the benefit of a multiple-GPU solution. So there may need to be modifications to the rendering engine itself. We don't know the level of help Paragon/Cryptic (since they wrote the original engine) received from nVidia and ATI when Ultra Mode features were being added (as well as when ATI quirks were being fixed), or whether the subject even came up in their discussions.
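To give one concrete example of the sort of thing those guides warn about (my own toy illustration, not anything from CoH's renderer): per-frame framebuffer readbacks. A readback forces the CPU to wait on the GPU, so one GPU can never run a frame ahead of the other - which is the whole point of alternate-frame rendering.
[code]
// Toy sketch (hypothetical - not CoH's renderer) of an SLI-hostile pattern:
// reading the framebuffer back every frame creates a CPU/GPU sync point,
// which serializes the GPUs and defeats alternate-frame rendering (AFR).
#include <GL/gl.h>   // header location varies by platform
#include <vector>

void render_frame_with_readback(int width, int height)
{
    // ... draw the scene here ...

    // The SLI-hostile part: glReadPixels blocks until the GPU has finished
    // the frame, then copies it back to system memory.
    std::vector<unsigned char> pixels(static_cast<size_t>(width) * height * 4);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());

    // An AFR-friendly engine skips per-frame readbacks entirely, or uses
    // pixel buffer objects so the copy completes asynchronously later.
}
[/code]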
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
Part of me is tilting my head on this... and I'm going to try to explain why. The GT 220 and other chips are built off of the GT200 architecture, the same architecture behind the GTX series. Back when the GTX series launched, several sites, such as Beyond3D and Bit-Tech, looked at the known information and concluded that the base architectures of the G80 and GT200 were pretty much the same.
Ergo, for a chip derived from the GT200 series, I'm not entirely convinced the GT 2xx series is new... so much as it is Nvidia doing what they pretty much did before: re-implementing an existing solution in a new die.
Also, as the original G80 was based on programmable shaders, DX 10.1 wasn't exactly that big of a deal: http://www.extremetech.com/article2/...129TX1K0000532
(mental note for another thread: the ExtremeTech link also relates to another thread about fallbacks between OpenGL and DirectX)
Pretty much the biggest deals for DX 10.1 are that 4x AA is mandatory and that it forces 32-bit floating-point precision. If you look at the rest of the details, the G80 architecture was pretty much capable of supporting it from the start: http://www.istartedsomething.com/200...-101-siggraph/
So I'm pretty much willing to stand by my statement that the GT series isn't actually a "new" chip; it's just a reimplementation of the stuff Nvidia was already selling, tweaked to sound more attractive to prospective buyers.
Yes, the G8x, G9x, and G2xx GPUs are similar in design. The GPUs in nVidia's 6xxx and 7xxx series were also similar to each other. The GPUs in ATI's HD 2xxx, 3xxx, 4xxx, and 5xxx series are similar to each other.
je_saist, you are painting with a broad brush. Well, yes, they are all similar - so are automobile engines, on the surface. This is a 4-cylinder, that is a 6-cylinder; that one is a V8, this one is supercharged. Start digging in, though, and what you may consider to be inconsequential differences in design actually do affect performance.
So what if the GPU in the GT 220 is based on the architectural elements of the big G200 GPU from the GTX 2xx series? It's still a new GPU, one that's a bit better than the 9500GT it replaced, which was a bit better than the 8600GT it replaced.
The only "bad" thing about the GT216 and GT215 GPUs is that they are lower-end parts - ones that should have been available to the masses 12 months ago. Where's the "half of a GTX 280" to replace the 9600GT/9800GT?
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
Well, it turns out they already have an SLI profile for CoH/V. I don't know if this is recognized by Paragon Studios and PlayNC, and I can already assume, because of Posi's update, that the profile listed does NOT include UM.
50s (only the most important)
Zuraq (53 EM/SR Brute)
Stagmalite (50 Granite/Fire TanK)
(Couple of others I don't care about.)
Well, it turns out that CrossFiring a 5870 and a 5850 is impossible here... 'cuz they were out of stock. So I bought him another 5870... He is setting them up right now.
We'll see soon how CrossFire 5870s do with Ultra Mode.
I broke down and got me a GTX 260. Too many games I play besides CoH/V still have issues with ATI cards, so I'm sticking with Nvidia for now.