Ultra-mode video card shopping guide
Question that will sound stupid, but I was looking at the comparison of graphics cards posted on page one: did the 9800 GT come in both a 512 MB and a 1 GB format? I have the latter and, well, I'm wondering how well it will handle Ultra Mode.
Video memory doesn't actually mean that much.
There are several other limitations in a graphics card's architecture that will affect performance long before video memory size comes into the equation, such as the bit-width of the memory controller and the memory controller's speed. Sure, you can shove a full gigabyte of memory onto a 64-bit or 128-bit memory controller... but really, in practical use, putting more than 256 MB on a 64-bit bus or 512 MB on a 128-bit bus just never shows a benefit.
In the specific case of the 9800 GT, the weaker model, it had a 256-bit memory controller with memory generally clocked at 900 MHz (factory reference) and a memory controller speed of 1.1 GHz. Depending on whose card you bought, those speeds could be higher or lower, and it did come in both 512 MB and 1024 MB memory configurations.
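If you want to sanity-check the bandwidth side of that, the back-of-the-envelope math is just bus width times effective data rate. A quick sketch, assuming the usual GDDR3 double data rate (900 MHz clock = 1800 MT/s effective):

```python
# Rough memory bandwidth math for a reference 9800 GT.
# Assumes GDDR3 double data rate: 900 MHz clock -> 1800 MT/s effective.
bus_width_bits = 256
effective_rate_mts = 1800          # mega-transfers per second
bytes_per_transfer = bus_width_bits // 8
bandwidth_gb_s = bytes_per_transfer * effective_rate_mts / 1000
print(f"{bandwidth_gb_s} GB/s")    # 57.6 GB/s
```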
However... that extra memory space only really turns into an advantage if you are running at really high resolutions, or are applying a bunch of filtering effects. On a 9800 GT, for most games, you'll only see a performance difference by cranking up anisotropic filtering and anti-aliasing, or by cranking the resolution.
Will it matter in CoH?
Well, it doesn't really matter in UT3, Borderlands, BioShock 2, or any other popular game. Okay, you might have to run with only 4x AA and 8x AF at only 1680x1050... but... come on... that's not really a bad thing, is it?
Here's a dirty little secret about most video games out today.
BFG Tech Nvidia GeForce 9800 GT 1 GB GDDR3
GPU: Nvidia GeForce 9800 GT
Bus type: PCI Express 2.0
Memory: 1 GB [1024 MB] GDDR3
Core Clock: 550 MHz
Shader Clock: 1375 MHz
Memory Data Rate: 1800 MHz
Stream Processors: 112
Shader Model: 4.0
Texture Fill Rate: 30.8 Billion/Sec
Memory Interface: 256-bit
Memory Bandwidth: 57.6 GB/Sec
API Support: Microsoft DirectX 10 and lower, OpenGL 2.1 and lower
Display Connectors: 2x Dual-Link DVI-I
RAMDACs: Dual 400 MHz
HDCP Capable: Yes, Dual-Link
HDMI Capable: Yes
NVIDIA SLI Support: Yes, 2-Way
Those are the exact specs from the box!
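(For what it's worth, those box numbers hang together: the texture fill rate is just the core clock times the number of texture units. The G92 chip in the 9800 GT has 56 texture units, a figure that isn't printed on the box, so treat this as a sanity check rather than gospel.)

```python
# Sanity check on the box's texture fill rate.
# Assumes 56 texture units on the G92 GPU (not printed on the box).
core_clock_mhz = 550
texture_units = 56
fill_rate_billion_s = core_clock_mhz * texture_units / 1000
print(fill_rate_billion_s)  # 30.8 billion texels/sec, matching the box
```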
I run City of Heroes at 1680 x 1050 which is my monitor's limit.
The Resistance has boobs too, and better hair!
Basically, a card's memory isn't the be-all and end-all of its performance. There are plenty of other factors. A card could (hypothetically) have 1TB of memory on it, but if all its other specs compare unfavorably to an anemic snail, then that big 'wow' of the memory spec is quite meaningless. A rather extreme example, but you get the idea (I hope).
Anyways, based on what is publicly known, yes, you can run UM. The 9800 GT is listed as the starting point of UM compatibility. Can you run it maxed out? Not unless you'd like to play City of Slideshows. Basically, expect to be able to do 'minimum' UM settings. What is minimum? We don't know yet. More info coming 'soon'.
It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.
Let me toss in my 2 cents.
More video memory doesn't automatically mean faster performance. What degrades performance is not having enough video memory for a game's settings.
Now, with everything else being equal, same type of memory, same video memory bandwidth, same GPU, same clock speeds, just more memory on one card than the other, it would take very aggressive game settings to see a difference. Usually something like huge uncompressed textures at extremely high resolutions. If the card doesn't have enough memory to hold all the textures needed to render the current frame, the game spends more time re-transmitting textures over the bus than sending the actual rendering instructions, and that's when you'll see a performance difference because of video memory size.
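If you want a feel for when that happens, you can rough out a card's memory budget yourself. This is a toy estimate with a made-up texture working set (real drivers juggle far more than this), but it shows why textures, not the frame buffers, are what blow past a 512 MB card:

```python
# Toy VRAM budget at 1680x1050: frame buffers + z-buffer + textures.
# The 400 MB texture working set is invented, purely for illustration.
width, height = 1680, 1050
bytes_per_pixel = 4                                    # 32-bit color
frame_buffers = 2 * width * height * bytes_per_pixel   # double buffered
z_buffer = width * height * 4                          # 32-bit depth
buffers_mb = (frame_buffers + z_buffer) / 2**20
textures_mb = 400
print(f"buffers ~{buffers_mb:.0f} MB, total ~{buffers_mb + textures_mb:.0f} MB")
# buffers ~20 MB, total ~420 MB: textures dominate the budget
```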
I also often see, on lower-end cards, similarly priced versions based on the same GPU where one uses slower memory (DDR2 vs. GDDR3, GDDR3 vs. GDDR5) but has twice as much of it (1024 MB vs. 512 MB, 512 MB vs. 256 MB). Don't be fooled. Faster memory nearly always wins, and when it doesn't it's a Pyrrhic victory, because the game settings (games in general, not just CoH/V) will have been turned up so high that the frame rate would be "unacceptable" with either card.
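To put made-up but typical numbers on "faster nearly always wins", compare two hypothetical budget cards on the same 128-bit bus, one with twice the memory in slow DDR2, one with half the memory in GDDR3:

```python
# Illustrative only: same GPU, same 128-bit bus, different memory type.
def bandwidth_gb_s(bus_bits: int, effective_rate_mts: int) -> float:
    return bus_bits // 8 * effective_rate_mts / 1000

print(bandwidth_gb_s(128, 800))   # 12.8 GB/s: the 1024 MB DDR2 card
print(bandwidth_gb_s(128, 1600))  # 25.6 GB/s: the 512 MB GDDR3 card
# The "smaller" card moves data twice as fast at any sane settings.
```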
Here's a nice article on the subject from Tom's Hardware.
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
And of course, for those running 32-bit systems, having a 2 GB video card means you only have 2 GB of physical RAM that's addressable. You could get worse performance because you won't have the physical RAM available for loading programs.
For me, I won't bother with a card over 896 MB until I switch to 64-bit, because I don't want to eat into my RAM any more than I have to.
Tanker Tuesday #72 Oct 5 @Champion
"I am not sure if my portrayal of being insane is accurate, but damn its fun all the same."
I'm... not entirely sure this is accurate. The 2 GB limit for memory access is for 32-bit processors... but graphics processors have been running at much higher bit widths for years now. I'm fairly sure that the OS's system memory limitations don't exactly apply to the memory limitations of a graphics card.
[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]
In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)
*headtilts*
All that depends on the implementation of the memory map.
If the video card boots and says it wants 2 GB addressable, and the OS gives it up, you will only have 2 GB left addressable by the OS.
I don't have a 2 GB (1896 MB) card to test with, but even on 32-bit XP (with 4 GB of RAM installed), I saw less and less RAM listed for my system as I stepped from a 256 MB card (6600 GT) to 512 MB (8600 GT) to 768 MB (8800 GTX). All were running a single monitor at 1280x1024 with 32-bit color. There is no reason the 8800 would actually use all 768 MB of memory for that configuration, but at boot time it didn't know, and just grabbed all the addressable space it could. Given they were all nVidia cards, I can't imagine the behavior changing much with the new cards.
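The address space arithmetic behind what I was seeing looks like this. The exact reservations vary by board and BIOS, so take the MMIO figure as a guess:

```python
# Rough 32-bit address space math (reservation sizes are illustrative).
address_space_mb = 4096       # 2**32 bytes total addressable
gpu_aperture_mb = 768         # what an 8800 GTX might claim at boot
other_mmio_mb = 256           # chipset, PCI devices, etc. (a guess)
visible_ram_mb = address_space_mb - gpu_aperture_mb - other_mmio_mb
print(visible_ram_mb)         # ~3072 MB visible out of 4096 MB installed
```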
Tanker Tuesday #72 Oct 5 @Champion
"I am not sure if my portrayal of being insane is accurate, but damn its fun all the same."
This is a cardinal rule of performance in general. Not having enough X will generally hurt, but having more than the thing needs generally doesn't help. Above the critical level, having faster is better than having more. Below the critical level, having more is better than having faster. Basically, computer resources obey ED.
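If you want that rule as a toy model (every number here is invented): below the working set, each missing megabyte costs you; above it, extra memory buys nothing.

```python
# Toy model of the "enough vs. more" rule; all numbers are invented.
def frame_time_ms(vram_mb: int, working_set_mb: int, base_ms: float = 16.0) -> float:
    if vram_mb >= working_set_mb:
        return base_ms                 # enough memory: more doesn't help
    shortfall = working_set_mb - vram_mb
    return base_ms + 0.5 * shortfall   # pay to re-stream the shortfall

for vram in (256, 512, 1024, 2048):
    print(vram, frame_time_ms(vram, working_set_mb=512))
# 256 -> 144.0 ms; 512, 1024, 2048 -> 16.0 ms: a wall below, flat above
```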
I'm running an 8800 GTS 320MB. There is an otherwise identical GPU with double the RAM, at 640MB. At the time, with a 19" LCD monitor and the games that were out those years ago, the 320MB was the better buy, even though there was only £30 in it. There was no real-world performance difference between the two outside of a big fat 2560x1600 widescreen monitor.
So. I saved £30, got a GPU with half the memory, and it was just as fast. Gravy.
Skip ahead a year. Now I find my GPU to be straining. But it's an 8800, and it still shoops woops.
A few months later I get myself a 22" widescreen LCD. Now cracks start appearing in the paintwork. Games that are otherwise modest start to take far too much of a toll. I look at a few benchmarks, like the Tom's Hardware charts, and...
...the 640MB card is now twice as fast as my measly 320MB thing in every modern game.
Today, my GPU is a joke. Game developers have generally accepted 512MB as a minimum for that class of card, and mine doesn't meet it. I cannot run anti-aliasing on a new game, even Batman, which has special nVidia wizardry that allows AA to go like the clappers on the Unreal Engine-powered beast. (Aside: deferred rendering doesn't play well with AA, except as seen in Batman: Arkham Asylum.)
So there you have it. If I had spent £30 more at the time, my GPU would be twice as fast. All due to the RAM.
Necrobond - 50 BS/Inv Scrapper made in I1
Rickar - 50 Bots/FF Mastermind
Anti-Muon - 42 Warshade
Ivory Sicarius - 45 Crab Spider
Aber ja, natürlich Hans nass ist, er steht unter einem Wasserfall. (But yes, of course Hans is wet, he is standing under a waterfall.)
Yes, you probably went from 1280x1024 on your 19" to what, 1680x1050 on your 22" widescreen? So first, you are slinging about 35% more pixels per frame. You are also using 35% more video memory for the multiple frame buffers and the z-buffer.
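For the record, that 35% isn't hand-waving; it's straight pixel counting:

```python
# Pixel count going from a 19" 1280x1024 panel to a 22" 1680x1050 panel.
old_pixels = 1280 * 1024   # 1,310,720
new_pixels = 1680 * 1050   # 1,764,000
print(f"{new_pixels / old_pixels:.2f}x")  # ~1.35x, i.e. ~35% more pixels
```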
Also, just about every review site tests video cards with all the quality settings maxed out, to highlight the differences between cards. In the case of the older, original 8800 GTS, where memory size is the only difference, it's the size and number of textures that's the problem. Frequently, simply lowering the texture quality a notch or two can restore an acceptable frame rate, as the game uses textures better suited to the amount of video memory you have. Of course, as your monitor's resolution increases, smaller, lower-resolution textures become more obvious.
Sites like HardOCP review cards by adjusting game settings to attain similar frame rates and then reporting the setting differences. They also include "apples to apples" same-setting comparisons for readers who can't wrap their heads around what the various game settings do, even though HardOCP includes a brief explanation of the visual differences.
Oh, and don't say you didn't know what you were getting into. The reviews at the time showed a significant performance impact when the cards came out in February 2007. Here are two.
The Tech Report
bit-tech.net
So maybe at 1280x1024 the differences weren't so huge and the price savings were attractive, maybe the only way you could afford a DX10-class card at the time. It still outperformed the 7900 GTX with 512MB of memory, so it was probably a good decision at the time, just not so good three years down the line.
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
You were wrong on one count, though; I could have afforded a 640MB version. That £30 difference was well within reach. Yes, I am a numpty.
I do indeed keep my texture detail under control, and I can get playable framerates in most games if I keep the texture detail low; but as you pointed out, it can get very noticeable. I'm actually rather happy that Aliens vs Predator runs on my graphics card at all.
Personally I'm getting revved up for the GeForce 400 launch in two weeks, so I can get myself a 5850 in confidence.
Necrobond - 50 BS/Inv Scrapper made in I1
Rickar - 50 Bots/FF Mastermind
Anti-Muon - 42 Warshade
Ivory Sicarius - 45 Crab Spider
Aber ja, natürlich Hans nass ist, er steht unter einem Wasserfall. (But yes, of course Hans is wet, he is standing under a waterfall.)
I have an AMD 9600 dual-core 2.31 GHz processor with 3.25 GB of RAM and an ATI 4890 graphics card. I really want to run Ultra Mode at near max power without framerate issues. What sort of replacement parts should I be ordering if I need to upgrade?
Globals: @Nurse Midnight, @Red Kabuto, @Zeronos
Twitter: @NurseMidnight
Golden Age Heroes
So, right now, I'd not worry too much about the cards and CPUs out there (hard, I know) and instead check what your power supply is, in case you do have to upgrade.
Well, an ATI HD 4890 is faster than an nVidia GTX 260, so from the graphics side of the equation you should be fine with UM at medium settings.
As for the CPU, the only 2.3 GHz AMD processor with a 9600 designation I can find is a first-generation Phenom quad-core (not dual-core) that had the rare L3 cache problem, one where the BIOS fix utterly killed its performance. The current Athlon II X4 630 is about 25% faster, which makes sense, as the "fix" for the broken Phenom 9x00 series basically neutered the L3 cache, leaving essentially what you have in an Athlon II X4. So from a CPU perspective you have a low-end quad-core CPU. That might hurt you a bit.
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
It pays to do research, and to research things you don't think of, like whether the card you get will actually line up with the holes in the back of your case.
I just received the XFX HD 5850 video card and a Zalman 600 watt power supply. The power supply went in, no dramas. However, when I went to install the video card, the card wouldn't fit. If it went into the sole PCI Express slot on the motherboard, it wouldn't line up with the holes on the back of the case. And lining it up with the holes on the case would put it into the PCI slot... which the card wouldn't, and shouldn't, be fitted into.
ARGH!
So now I'm waiting to hear back on whether I can get a refund, or an exchange for a similar card that is mounted the other way around. But I really, really, Really wanted the XFX card.
So everyone, take warning: do your research on whether your card will fit your motherboard and case.
Sig characters appear in the Château Rouge thread starting from post #100
I Support Nerd Flirting! - Story/Discussion/Sequel
I don't understand what you mean by "fit the holes on the back of the case". The double bracket wouldn't slide into two adjacent slots? I could understand if the card was too long; the standard HD 5850 is an 11-inch or so card.
What case were you installing this in?
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
Any chance you could take a picture and show us what you mean by this? I'm as confused as Father Xmas.
That's right, it is a quad core, my mistake. So basically I would also need to upgrade my processor due to this bug?
Globals: @Nurse Midnight, @Red Kabuto, @Zeronos
Twitter: @NurseMidnight
Golden Age Heroes
You can read about the TLB issue here: http://www.legitreviews.com/article/618/1/ :: http://www.techreport.com/discussions.x/13724
***
You might note that the processor isn't actually coupled with a Socket AM2+ motherboard, though. It's coupled with the older (and slower) Socket AM2 Asus M2R32-MVP. The reason is actually pretty simple. The Socket AM2+ motherboards I've bought have all been bought for SLI setups and used Nvidia chipsets that will not run the Phenom 9600 in dual-channel mode. So far there hasn't been a Socket AM2+ Crossfire motherboard that's matched the price/capabilities of the M2R32-MVP. I got it for $120 back in '07, around six months after its 2006 launch. It's only recently that AM2+ boards like this MSI CrossFire-capable board have been dipping into the ~$100 market with equivalent features. Granted, I wouldn't buy it, because it's an MSI, and every MSI board I've had comes with a BIOS that makes Intel's look good by comparison. Anyways, of the boards currently listed on Newegg, none of the AMD chipset boards explicitly say they'll do x16/x16 in Crossfire, with most stating they'll do x8/x8. That's a step back compared to a board that's over three years old.
Of the boards Newegg returns for searches of x16/x16, on the Crossfire side there's a DFI, which is normally good. Only, the last DFI I bought was the LANPARTY JR X58-T3H6... and it won't run i7s in triple-channel mode.
Then there's the ASUS M4A79T-D. Great motherboard. I put one together for a client. Only... come on. $180? No. Just. No. That's Intel motherboard pricing.
Anyways, my unwillingness to pay Intel-ish prices for AMD motherboards that have the feature set of a 3+ year old board might be a factor in why I haven't seen the TLB issues that were reported in the Agena 9600 B2 revision.
I think Posi's warning about the rest of the system has more to do with people who are still running single core systems built around a Pentium 4, an Athlon XP or an Athlon 64 and are running with less than 2GB of system memory.
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
Hey folks, quick question...
I have an older comp. It's an HP a840n. Its CPU is a single-core AMD 3300+. It has been upgraded to 2GB of memory, and has a 600 watt supply. However, my 7600 GS vid card packed it in and literally caught on fire. I was forced to slap in a PCI 5200 card I had as a backup/testing card, which of course sucks badly. I am thinking of buying a Radeon HD 4670 1GB 128-bit DDR3 AGP 4X/8X HDCP Ready video card I saw on Newegg. First of all, can my comp handle it? Secondly, is this my best option, since I don't have PCIe? Finally, will this combo be able to run the game on better than "minimum" gfx mode?
Thanks
Sadly, the HD 4670 is the most powerful AGP card that can currently be bought on the retail market. Now, while I can't tell you how the card does, or more precisely does not, perform in Ultra Mode without torquing Public Relations off, it is quite powerful enough to run the non-Ultra Mode options at maximum quality, and at relatively high resolutions.
Thanks, je_saist. I figured Ultra Mode was pushing it, but hoped I could spend $70-ish instead of getting a new system in toto, and limp along until refund time next year. After that it'll be time to build my uber ultra CoX-killer system, lol.
Excellent information, je_saist, and thanks! Can't wait to hear what comes out of closed beta....