Consolidated ATI Settings Guide
[ QUOTE ]
If your card supports Geometry Instancing, enable the instancing in the driver, and enable this setting in game.
[/ QUOTE ]
Is it called that in the ATI Control panel?
Any comments on alternate drivers?
[ QUOTE ]
[ QUOTE ]
If your card supports Geometry Instancing, enable the instancing in the driver, and enable this setting in game.
[/ QUOTE ]
Is it called that in the ATI Control panel?
Any comments on alternate drivers?
[/ QUOTE ]
In the CCC, it's under the setting: API Specific, and it's only for D3D.
In game, it's called Use Geometry Buffers.
I have it set in both locations as enabled.
As for alternate drivers, both Omega and the DH Zero Point are currently 2 revisions behind. My last run with the Omegas showed no improvement in gameplay. At this point, I see no reason for 3rd party drivers unless:
1: You're running a laptop mobility card and can't get updates from your OEM or ATI.
2: You don't want the CCC or the .NET Framework on your system.
Something else that I don't think I've ever stated:
The settings that I have in place within the CCC are left alone for all of my games. I might go in and raise AA and AF for older titles, but beyond that, I'm able to leave them as is.
Be well, people of CoH.

Just wanted to let those who use it know, the Catalyst 6.5 Omega Drivers are now available. (The author skipped Cat 6.4 due to RL issues lol)
Damage Proc Mini-FAQ
Just noticed the Damage Proc Mini-FAQ wasn't working with the new forums; it's been updated.
Hey Bill, thanks for taking the time to post the info. I was wondering if you could try and help me with my situation. I have been playing CoV for a few months now with minimal graphics problems, and then a few days ago my game started crashing. It happens generally after maybe 5-10 minutes in game. Usually, the game freezes and after a few seconds my monitor switches itself into standby mode, requiring a reboot. Occasionally I will get some weird coloring and some artifacting prior to the crash. At the worst times the screen becomes filled with multicoloured squares all of the same size and spaced equally apart. Sorry for the long post, but I am just trying to give as much info as possible to try and get some help.
At first I had all of my settings basically at default, running at 1280x1024 with a refresh of 75 on a 19" CRT. I've tried setting everything as you've described and played around with various settings and whatnot, but nothing seems to make a difference.
The Set up:
P4 2.4 GHz, 1GB RAM, ATI Radeon X800XL
I have just updated to the Catalyst 6.5 but that also made no difference. The part I don't understand is how it would suddenly start crashing without having any changes made to any hardware or settings. I also can't think of any programs that have been installed recently, no matter how unrelated they may seem. I simply haven't changed a thing. Sooo... any idea for me?
Everything you just described points directly to heat issues and failing hardware. You need to unplug everything, crack the case open, yank every single component out, and clean everything with compressed air, especially all slots and fans; make it spotless. Then, after putting it all back together, leave the case side off, power up, and verify that ALL fans are running smoothly.
Here's an application that will show you your temperatures, but there may be better/easier to use ones out there. Either PM me the results or start a new thread in the tech forum where other eyes can join in and offer advice. Hop on this quickly, as the more you allow heat problems to occur, the more likely that your hardware will permanently fail.
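If you'd rather script a quick check than install an app, here's a rough Python sketch of the same idea (this assumes the third-party wmi package, and plenty of boards don't expose the ACPI thermal zone sensor at all, so treat it as a ballpark rather than gospel):
[ CODE ]
# Rough temperature check via Windows WMI -- assumes the third-party
# "wmi" package is installed and the board exposes this ACPI sensor.
import wmi

w = wmi.WMI(namespace="root\\wmi")
for zone in w.MSAcpi_ThermalZoneTemperature():
    # CurrentTemperature is reported in tenths of a degree Kelvin.
    celsius = zone.CurrentTemperature / 10.0 - 273.15
    print("%s: %.1f C" % (zone.InstanceName, celsius))
[/ CODE ]
If that query comes back empty, it's a board/driver limitation, not a sign your sensors are dead; use a dedicated monitoring app instead.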
Be well, people of CoH.

Thanks Bill, I've just downloaded that program. I am going to take apart the system and then clean and check the fans. I'll then run that program and likely PM you what I find. Thanks again for the help.
Just wanted to say thanks for the guide. I was having a few (minor) problems, but this info helped. Not being a COMPLETE tech geek, I was mostly lost, but the directions were nice and easy to follow. ^_^ Now my game looks shiny again! Squee!
Looking at your guide to try and save my CoV experience... solo, my graphics look great. But when I get on an 8-man team with a few MMs, it looks like a cartoon flipbook.
[ QUOTE ]
Under 3D settings:
...
SmartShader: None
[/ QUOTE ]
Where else could this setting possibly be, or did they remove it from the latest version? I followed your steps, removing, cleaning, right down to re-downloading... (Seems 6.5 is now available) but I can't seem to find Smart Shading anywhere!
3D settings
API settings
I have an ATI X1300 Pro (256mb)
running in an MSI mobo, AMD CPU, 1 GB memory
Thanks in advance
You should be able to see the SmartShader section if you are viewing the CCC in the Advanced view. It will be its own category like Display, 3D and SmartGART. But I'm not seeing it where it should be from your screenshots. Under VPU Recover or in a subgroup of the Avivo, perhaps? If you can't find it, don't worry about it, as SmartShader just adds funkadelic layers such as Black and White or Sepia, etc. Not a big deal.
I'm under the belief that your main bottleneck is your processor, but does lowering FSAA and AF to 2X each help at all? If not, then yes, your proc is the big stopping point for you.
Be well, people of CoH.

Actually - I have not tried the game since making the changes (made them from work via remote access)... I just found the post and started working through the steps.
I checked CCC again, and no reference to the Smart Shader... I checked another machine I service which is running the previous version of CCC, and it is there... so I can only assume they took it out, or at least disabled it for the X1300, in this latest version.
Thanks for the info - I will check it out this afternoon when I get home, and see how it works.
[ QUOTE ]
I'm under the belief that your main bottleneck is your processor, but does lowering FSAA and AF to 2X each help at all? If not, then yes, your proc is the big stopping point for you.
[/ QUOTE ]
Your settings out of the box worked pretty well... but I did notice dropping FSAA and AF to 2X each had a big performance boost with little or no hit to quality as far as I can tell. Everything is better than ever in open zone and in solo mis... and much improved in team mis... but still not the best.
Would upping the memory from 1GB to 2GB do much in the way of lag reduction? Or do you still think that its the CPU speed that is holding me back? Would I notice a huge change going to a 2Ghz?
[ QUOTE ]
Would I notice a huge change going to a 2Ghz?
[/ QUOTE ]
Probably not. The jump from 1GB RAM to 2GB allowed me to disable Virtual Memory on all drives so that Windows used RAM, instead, for its Virtual Memory usage.
Be well, people of CoH.

Hi Bill
Your guide has helped me learn a ton, but I miss a description of your specific card and system. I only have 512MB of RAM and a Radeon 9800 Pro, so I've had to lower the settings on many of your recommendations to prevent my worst problem: outdoor (not inside a mission) lag. I know more RAM would help, but until then, it would be great to know your opinion on balancing the specific settings for ATI cards on systems with less RAM. I know I saw your system setup in an old post, but it might help folks to preface your guide with your specs so they can gauge settings against their own setup. Thanks for the great guides!
Cooke,
I agree with you. It is something I should have had listed as well as comments about tweaking further related to system specs.
My current specs are:
X800XT Xtasy Radeon 256MB Video
P4-HT-3GHz Processor w/ HT enabled and functioning within CoH/V
4X512MB (2GB total) System RAM running in full dual channel
40GB 7200 RPM SATA Hard drive
Creative Audigy 2
Cooke,
You would be amazed at the performance boost if you added just another 512MB of RAM to your system. Load times plummet, overall performance increases.. it's almost scary. I didn't notice as much of a boost from 1GB to 2GB, but that jump has allowed me to disable Virtual Memory on my two partitions so that XP uses RAM instead for its virtual memory usage.
With only 512MB of RAM, you're blocking your video card's use of the AGP aperture, which is controlled by your system's BIOS. This further limits your overall performance.
I currently have my AGP Aperture maxed at 256MB in the BIOS, but with only 512MB of system RAM, I wouldn't set the AGP Aperture higher than 64MB. At 1GB, bump to 128MB aperture, higher than that, max it.
As your video card's RAM drops below 256MB, the first items to start decreasing are Textures and Shader Quality.
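For what it's worth, that aperture rule of thumb boils down to this little sketch (nothing official, just my numbers from above written out in Python):
[ CODE ]
# The AGP Aperture rule of thumb from above, written out.
# Input is total system RAM; output is the BIOS aperture size I'd pick.
def suggested_agp_aperture_mb(system_ram_mb):
    if system_ram_mb <= 512:
        return 64
    elif system_ram_mb <= 1024:
        return 128
    return 256  # anything above 1GB of RAM: max it out

for ram in (512, 1024, 2048):
    print("%d MB RAM -> %d MB aperture" % (ram, suggested_agp_aperture_mb(ram)))
[/ CODE ]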
Hope this helps some.
Be well, people of CoH.

Have you had any problems with alt-tabbing between applications running these settings? It is damn near impossible for me to get an alt-tab to work properly. Typically it switches and I can see the task bar fine, but the rest of the screen is a graphics mess of whatever the CoV screen was at the time of the alt-tab. I can't see any of the applications I'm trying to adjust through it, Teamspeak for example. Any ideas why this would happen?
Running a:
Athlon 64 3200
Radeon X850XT
1GB Ram
Windows XP Pro
Think I am running Catalyst 6.4, but I recently updated so it might be the 6.5s.
Yes. My settings are known to cause issues with alt-tabbing and it appears to be from having the driver manage FSAA.
Of course, ATI has known issues with task switching from full screen.
And letting CoH/V manage FSAA at higher resolutions causes the world to go black while leaving the GUI intact.
My advice is not to do it, but I understand this isn't an option for most folks. It appears that you'll get the best task switching performance by running in windowed mode and letting the game manage FSAA and AF.
I can't do this; because of the high res I run, even 2X FSAA in game blacks out the world. And Cryptic's FSAA implementation isn't as smooth as ATI's, nor does it perform as well.
Be well, people of CoH.

I wasn't having alt-tab issues pre-i7, fwiw. Running 1280x1024 at 4x and 4x through the drivers. Now I am having them; I lowered it to 2x and turned AF off, but still get it. Windowed mode doesn't help. Will try out in-game management, but like you said, it's not nearly as nice.
[ QUOTE ]
As I run an LCD, Refresh Rate is meaningless so it's left at 60Hz.
[/ QUOTE ]
This is not true; refresh rate is not meaningless on an LCD, but that is a popular misconception. Think about the name, refresh. It controls how fast the screen image is refreshed, which logically is better the faster you can set it. While the refresh rate will not change any sort of flicker effect like you get at low refreshes on a CRT, it will enhance redraw and reduce ghosting.
CRT screens have an electron gun that zips across the screen painting the image on the phosphors painted on the inside of the vacuum tube. The refresh rate controls how fast this fires, which at low speeds gives a flicker effect because the glow begins to fade before the next redraw. Higher refresh, higher quality, no flicker.
LCD screens operate by each pixel being a cell that opens to allow light to pass through, so there is no decay, so there is no flicker. However, the refresh rate still controls how fast the image redraws, just like on a CRT. By setting it to the highest refresh supported, just like a CRT, you get better visual quality because the data is updating faster, which on LCD screens helps prevent ghosting effects.
Ebonweaver,
That's what I thought, too. As it was then explained to me, the redraw rate on an LCD is locked at the factory. There is no way to adjust it. So even if you lower the resolution away from optimal so that you can up the reFRESH rate, the reDRAW rate stays the same. But even if you're right, in order for me to punch up the refresh rate, I'm forced to drop my resolution from 1920X1200 by 60Hz to 1440X900 by 75Hz.
The massive dip in resolution away from optimal causes a fuzziness to the entire image that I continue to find intolerable.
Shouldn't a 12ms reDRAW equate to around 83Hz refresh? If so, why am I locked at 60 and 75 by the manufacturer?
Here are a couple of quotes I came across (although you can find more quotes arguing the other side):
"Refresh rate does not apply to LCD (liquid crystal display, now commonly referred to as flat panel) monitors, because they do not render images the same way CRT monitors do. Pixels in LCD monitors remain open or closed as needed until the image changes, and the light that passes through from behind stays constant. Therefore, the pixels don't fade. The rate that affects LCD displays is "response time", the time it takes for a pixel to go from fully open (the brightest intensity) to fully closed (black). Response time is a fixed property, not a configurable setting.
In Windows, Display Properties includes settings for refresh rate, but they are only there for compatibility with the video card, which must be able to handle a CRT that can do multiple rates. An LCD monitor has only one optimal refresh rate setting, so it expects a specific value. Changing this setting for an LCD monitor will affect the image (e.g., it may shift it off center), but will not affect the monitor's actual refresh rate."
"Flicker is a result of phosphor decay; that is, after the energy from the electron gun is transferred to the phosphor material, the energy and the resulting light begin to decay very slowly until the electron beam hits the phosphor again.
Since LCD monitors do not employ phosphors, refresh rate is not a concern. Basically, the transistors in the LCD remain open or closed as needed until the image changes. This can be a point of confusion for some consumers, however, since most graphics cards still ask for a refresh rate setting. This is due to the analog nature of existing graphic cards (see Inputs section) and their support for CRT displays. While refresh rates do not apply to LCD monitors, most LCDs are set up to accept any settings from 60Hz and above."
EDIT: It sure would be nice if I could lower my resolution without the blurriness... all the extra horsepower my card has to spend pushing 1920X1200 really slows me down.
Further edit: To be perfectly honest about it all, I don't know, man. Why would enabling vsync make any difference to tearing on an LCD if the above WAS the case? In one of the other threads, I remember a discussion where someone was stating that it did nothing for them other than lower overall performance, even with triple buffering enabled.
As it stands right now, if I disable Vsync and triple buffering, inside a mission map where I was previously stopped at 60FPS, I'm getting spikes into the low 80FPS. This is on the hero side, of course. But the tearing is definitely present, and noticeable when moving left and right quickly. Vertical lines, such as doorframes, split noticeably.
So telling the video card "you shall not pass 60FPS" by enabling vsync on a monitor set to 60Hz refresh definitely stops tearing on an LCD with a 12ms redraw while it's running at 1920X1200 resolution. That, and LCDs blur at lower than optimal resolutions, are the only facts I can share here. Everything else is based off of readings on the internet, many of which are contradictory on this very subject.
Is Vsync actually tying itself to the 12ms redraw over DVI? Doubtful, since when I have it on I'm capped at 60FPS. If it was, I should be capped around 83FPS.
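Just to show where that 83 comes from, it's nothing more than the response time flipped over (back-of-the-envelope only):
[ CODE ]
# Back-of-the-envelope: invert a 12 ms response time.
response_ms = 12.0
print(1000.0 / response_ms)  # ~83.3 redraws per second, hence "around 83FPS"
[/ CODE ]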
I don't know. It would involve a lot of testing at various resolutions, maybe even using a tool to force refresh rates not supported by my LCD, to get a handle on what's really going on here. Anyone else want to roll with this?
Be well, people of CoH.

Having an ATI related issue.
running: 2.8 P4 (HT)
1 gig of 3200 XMS DDR Corsair Ram (getting another gig tonight)
Radeon X1600(XT?) 512mb AGP 8x (updated to newest 6.7 drivers last week)
19" CRT, 60Hz refresh rate
I have an odd issue, and I hope I can explain it adequately enough for someone here to understand.
I just started playing CoV after leaving CoH 2 years ago, and I'm more than comfortable with my machine and any changes I have to make with it... HOWEVER...
This is a very strange issue, for me at least: any fire/water effects around my character or the bottom 1/3 of my screen seem to bring up a watery version of the first "water/fire reflection" I came across for that session of online play.
So I know it's not residual from an image burn on my monitor.
I have a corruptor that I've been playing recently, and if I cast "Warmth" the portion of the circle closest to me will be filled with a texture that shouldn't be there.
The same thing happens if I go for a swim; the bottom 1/4th of my screen is taken up with a texture that looks very watery (the water effects work fine on my machine).
Also, if I look closely at my character when she has Fire Shield cast upon her, I can notice this "ring" around her where everything looks slightly darker and wispy (flame heat effect). BUT if I manipulate the camera so this "ring" around my character touches the bottom 1/4th of my screen, that residual texture appears.
It seems as though I'm retaining these residual texture(s), and they only show up during a fire/water effect.
Also, if I go into my graphics options and do a save/close after changing an option (any option), the texture for that portion of my screen will be there for the next effect.
Anyone have any idea what may be causing this?
I'll get a fraps/screenshots of it tonight if needed.
Phenriz,
Although I haven't had the opportunity to see the issue myself, there have been reports of the 6.7s causing the aura errors that you speak of.
The fix? Roll back to the 6.6s. Be sure to completely uninstall the 6.7s and run Driver Cleaner to make sure first.
Be well, people of CoH.

Bill, my hubby upgraded my computer... ok, he built me a whole new thing and guess what. He used ATI stuffs and you've saved me helluvlag.
I thank you.
I'll be trying these tonight. I run fine except in certain Arachnos Maps and in teams larger than 5.
Thanks for the information and Thanks to Athyna for pointing me here.
Bill,
You are completely the man. I've been going nuts with an X1900XTX and an AMD FX-60 (everything overclocked) trying to comprehend how I could be averaging 18-35fps (all at 1280). After your tweaks I'm getting 45-60fps.
Great job!
Hey again everyone. As the Catalyst 6.5s are about to come out, I decided to rewrite the settings guide based on the testing I did on the beta 6.5s here.
For explanations of the settings within the Catalyst Control Center, please go here.
For the Nvidia users, tweakguides also has full explanations of nvidia driver settings here.
Mousing over the in game settings will give Cryptic's explanations of them as well as some tips on usage.
At the time of this writing, ATI has not corrected their drivers to allow FSAA functionality with Cryptic's HDR effects implementation. Below are the settings I currently use in game and within my Catalyst Control Center. After that, there will be info that I hope will aid you in tweaking your system to get the best balance between eye candy and performance.
My current settings:
(If not listed, they are at default values.)
In Game:
Resolution AND 3D Resolution Scaling: Same As Desktop 1920X1200
Refresh Rate: Same as Desktop
Advanced Graphic Settings: Enabled
Particle Physics: Enabled
World Texture Quality: Very High
Character Texture Quality: Very High
World Detail: 85%
Character Detail: 200%
Max Particle Count: 25050
Vertical Sync: Enabled
FSAA: Off
Shadows: Disabled
Use Geometry Buffers: Enabled
Anisotropic Filtering: Off
Texture Crispness: Smooth
Shader Quality: High
Water Effects: Low
Depth of Field Effects: Disabled
Bloom Effects: None
In Catalyst Control Center:
Under Display Options:
3D Refresh Rate Override: Same as Desktop
Under 3D settings:
Anti-Aliasing: 4X
Anisotropic Filtering: 4X
Catalyst A.I.: Disabled
Mipmap Detail Level: Quality (slider all the way right)
Wait for Vertical Refresh: Always On
SmartShader: None
Adaptive Anti-Aliasing: Disabled
API Specific:
Direct3D:
Enable geometry instancing: yes
Support DXT texture formats: yes
Alternate pixel center: no
OpenGL:
Triple Buffering: yes
Force 24-bit Z-buffer depth: no
VPU Recover: Disabled
One criticism of my prior guides was that I ignored the performance hits lower end hardware could take for people who blindly followed my settings. It was never my intention for anyone to do so; rather, I hoped that readers might use this as a "guide" to help them learn about their own systems and tweak them accordingly.
Let's start from the top:
Resolution, Resolution Scaling and Refresh Rate:
The res that I run at is the optimal resolution for my LCD monitor. Any deviation from that causes fuzziness in the overall image, so I have little choice here. One or two steps lower is tolerable, but tolerable is not something I shoot for. I leave the Res Scaling alone as the reduction in visual quality it produces is unacceptable for me. As I run an LCD, Refresh Rate is meaningless so it's left at 60Hz. I advise setting your in game Res and Refresh Rate to be the same as your desktop settings so that your monitor does not need to bounce itself around if you alt-tab out of full screen. (Which is problematic and a known issue with ATI's drivers anyway.)
Advanced Graphics Settings: This must be enabled so that you can manually change all the settings in game. Using the slider provided when this is disabled WILL cause you problems due to the incompatibilities with ATI's current drivers and Cryptic's choices of what to enable as you move the slider from Performance to Quality.
World and Character Texture Quality: I have these set to Very High because I have 256MB of video RAM. If your card has less, you will need to adjust these settings as described in the mouseover in game based on your card.
World Detail: This setting has a HUGE impact on your frames per second (FPS). On the hero side of the game, I can still run it at 100% without issue. On the villain side, flying around zones, I had often put it at 50%. I currently run it at 85% as a happy medium, since I've been hopping sides a lot recently. Play around with this to find your happy spot.
Character Detail: I leave this on 200% as I've noticed no increase in performance by lowering it. I HAVE noticed, however, that when I lower it, NPCs and Player characters look squished up, especially the shoulders, at a distance in a way I can't tolerate.
Max particle count: This is another "you choose it" setting. I leave mine at 50% on the slider.
Texture Crispness: This setting should be on Smooth unless you have Anisotropic Filtering completely disabled in game AND within your driver. If AF is enabled, you won't see much difference, if any, by changing it, but with AF off, changing this setting to Crisp will mimic what AF does anyway.
Use Geometry Buffers: If your card supports Geometry Instancing, enable the instancing in the driver, and enable this setting in game. You'll receive a nice performance boost if you do. There have been reports of graphical anomalies on some cards if this setting is enabled, a shearing of textures. I've never experienced this, so test it out.
Shader Quality: I leave mine on High. All the time. No questions asked. Lowering it WILL help your performance, but at a HIGH hit to your visual candy.
Don't let your browser resize the images.
Shader Low
Shader Medium
Shader High
Water Effects, Bloom and Depth of Field: I no longer notice a huge performance hit with water on low. So that's where it is now. Placing it on high breaks ATI's FSAA in the same way that Bloom and Depth of Field do. If you want High Water, Bloom and Depth of Field, Disable Full Scene Anti-Aliasing (FSAA) in game and set it for Let Application Decide in the CCC.
FSAA and AF: We now know that the in game management of FSAA is handled in a completely different way than if handled by the driver. We also know that you will get better performance and quality by having the driver manage FSAA. Since I let the driver handle that, I go ahead and let it manage my AF as well. I currently run 4X on both, as 2X FSAA doesn't smooth my lines enough, and 4X AF sharpens up the image enough. Raising AF to 8X or 16X extends the distance in game that the AF sharpening improves.
The CCC Catalyst A.I.: I disable it because I've never noticed any performance increase by having it enabled.
MipMap: I leave this all the way to the right on Quality. Test this one out for yourselves by dropping it all the way down, going into the game, letting your eyes absorb what they see, then drop out of game, crank it back up, and go back in. The difference is major when it comes to eye candy.
Vertical refresh: Enabling this (Vsync) locks the game's frame updates to your monitor's vertical refresh so the two do not get out of sync. When they do get out of sync, you see tearing as you move quickly through zones or turn around quickly. Whether you enable this is up to you. If you run lower resolutions, say 1280X1024 or lower, you probably don't need this. If you DO enable it, you should also enable Triple Buffering, as it improves performance when Vertical Refresh is enabled in OpenGL games.
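To put some made-up but illustrative numbers on why Triple Buffering matters once Vsync is on, here's a small Python sketch. The 60Hz refresh and the 20ms render time are assumptions for the example, not measurements from CoH/V:
[ CODE ]
# Why Triple Buffering helps once "Wait for Vertical Refresh" is on.
# Assumes a 60Hz display and a made-up render time of 20 ms per frame.
import math

REFRESH_HZ = 60.0
RENDER_MS = 20.0
refresh_ms = 1000.0 / REFRESH_HZ  # ~16.7 ms between vsyncs

# Double buffering: the card stalls until the next vsync before it can
# start another frame, so every frame costs a whole number of refreshes.
vsyncs_per_frame = math.ceil(RENDER_MS / refresh_ms)
fps_double = REFRESH_HZ / vsyncs_per_frame

# Triple buffering: the spare back buffer means the card never stalls,
# so it renders at its natural rate (still capped by the refresh rate).
fps_triple = min(1000.0 / RENDER_MS, REFRESH_HZ)

print("double buffering + vsync: ~%.0f fps" % fps_double)  # ~30 fps
print("triple buffering + vsync: ~%.0f fps" % fps_triple)  # ~50 fps
[/ CODE ]
The point is just that without the third buffer, any frame that misses a vsync costs you a whole extra refresh.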
I don't foresee any other changes coming until ATI fixes the FSAA/HDR issue. Some folks are able to run FSAA in game at 2X with the HDR effect enabled and have a certain level of success. I can not. Using the in game management always black screens the world now. I assume this is due to my resolution being so high.
As for the alt-tabbing issue, if you are one that needs to run other applications while gaming, and bounce back and forth between them, it appears the only workaround for you is to allow the game to manage FSAA and to run in windowed mode instead of full screen. Your findings on your system may vary.
For AGP users, leave everything under the SmartGART section at the default settings. I once found a performance increase by disabling AGP Fast Writes, but this is no longer the case, AND disabling Fast Writes exacerbates the Contact/Info screen pause/lockup issue.
Hope this helps.
Be well, people of CoH.