Is this graphics card good enough for Ultra Mode?
...you might be able to turn on one option, on low. Maybe.
Be prepared for a sobbing, pleading computer if you attempt to turn much of anything up beyond that. Heck, it might even commit harakiri to rid itself of the pain.
Paragon Wiki: http://www.paragonwiki.com
City Info Terminal: http://cit.cohtitan.com
Mids Hero Designer: http://www.cohplanner.com
No.
Be well, people of CoH.
... I'm going with...
not a chance in the world.
Here's the big problem: the GeForce FX series was really a DirectX 8.1 design modeled after the NV2A used in the original Xbox console. The 2002 launch of the Radeon 9700 and 9500 series took everybody by surprise... including Nvidia, whose chip roadmaps had assumed ATi would launch a lukewarm follow-up to the Radeon 8500 series of cards... not a new chip that could run Pixel Shader / Vertex Shader 2.0 code at 60 fps at 1280x1024.
Nvidia delayed the GeForce FX series in a rush to add Pixel Shader / Vertex Shader 2.0 support... but as Valve revealed during their infamous Shader Day presentation, the GeForce FX series was drop-dead slow at running Pixel Shader / Vertex Shader 2.0 code.
Nvidia went to great lengths to disguise just how bad the GeForce FX series' performance was, including using their drivers to manipulate benchmarks like 3DMark03, as well as several games. Nvidia also leaned on several proprietary OpenGL extensions, such as "UltraShadow", to boost the performance of their cards in titles using id Software's Tech 3 engine.
The real history of the GeForce FX series shows when you pick up a modern Windows client game: while Radeon support often goes all the way back to the 9600, GeForce support stops at the GeForce 6000 series... a testament to just how lousy the FX series actually was.
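If you'd rather not guess from marketing names, you can ask the driver directly what the card exposes. Here's a rough sketch of my own (nothing official, just plain C with GLUT): it spins up a GL context and prints the renderer, OpenGL version, and GLSL version strings. Broadly speaking, an FX-era card and driver will report a far older GL/GLSL level than what Ultra Mode is aimed at.

```c
/* Quick capability check - a sketch, assuming a Linux-ish box with freeglut.
 * Build with something like: gcc gl_check.c -o gl_check -lglut -lGL */
#include <stdio.h>
#include <GL/glut.h>

/* Older GL headers may not define this enum; the value is standard. */
#ifndef GL_SHADING_LANGUAGE_VERSION
#define GL_SHADING_LANGUAGE_VERSION 0x8B8C
#endif

int main(int argc, char **argv)
{
    /* glGetString only returns useful data once a context is current,
     * so create a throwaway window first. */
    glutInit(&argc, argv);
    glutCreateWindow("capability check");

    printf("Renderer:       %s\n", (const char *)glGetString(GL_RENDERER));
    printf("OpenGL version: %s\n", (const char *)glGetString(GL_VERSION));
    printf("GLSL version:   %s\n", (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION));
    return 0;
}
```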
NVIDIA® GeForce FX 5600
thanks