Quote:I'm honestly not seeing where you're getting this at all. It uses 3 of the four cores in my machine, and my 7800GT wouldn't be able to touch Praetoria. Maybe if you're running 1280x768 or something, that might be true. I'm running 1920x1200 and it certainly uses most of the power of my single 470, and 3 of the 4 cores on my CPU; granted, it only uses maybe 4-5GB of my 8GB of RAM, but I wouldn't expect it to. My SO is using SLI'd 470s and notices a massive FPS boost and outperforms me. Other than her PSU and case, we have the exact same rig, except she has a second 470 and I only have one. Oh...she has a different monitor and she only runs 1920x1080 as opposed to 1200, but otherwise our systems are identical, both running Win7 x64.

If you have pretty old cards AND low frame rates, SLI will likely help. It just seems like a waste of money since CoX uses newer hardware poorly. Modern CPUs go into power-save mode from lack of load while playing. My old, old PC with a single 7800GT gets basically identical FPS to my new SLI rig with the same settings. I understand it's an older game, and I would be more forgiving had they not overhauled the graphics without working on the performance and efficiency to support it.
-
Quote:That's about what I get with a single 470. I tend to use HQ, and as I said, SHQ actually gives less performance than UQ does. During events I scale down to HQ or further if I get really low FPS. The 4xx cards don't incur much of a hit running AA and AF, so it's worth using 8x or so. Beyond that I don't think CoH looks any better, and if you actually go to 32x, you'll grind your FPS nearly to a halt. I aim to keep my FPS above 30 at all times, somewhere between 33 and 45. You do have to lower your world detail, especially in Praetoria; that costs a big hit. I run at 100% in Prae, 125% in CoV, and 125-150% in CoH, depending on what's going on; any higher in those areas and the FPS becomes unacceptably low. All in all it's nearly maxed out, and runs very well.

It is running better after using the AO slider and not the advanced AO settings. I was checking out some of the presets while moving around Imperial City. With the HQ setting my frame rate hops around quite a bit, from 33 to 45 or so. The average isn't great, but much better than before. Setting AO to SHQ drops my FPS to a pretty steady 30-33. The Ultra setting gives me 30-40 FPS. Overall it's playable set to HQ, short of an event with a lot of people.
An example of a low fps scene is standing in front of Nightstar and turning to the right. With AO set to HQ, my frame rate is 35-37. Places like this seem good for testing since they are the toughest to render, but the fps difference after changing most settings is hardly noticeable. I only gain 2-3 fps by turning AO, FSAA, and Anisotropic off while looking at the same scene. Basically, move around the zone after making a change.
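That spot-testing workflow (park at a fixed heavy scene, record a handful of FPS readings per setting, compare the averages) can be sketched as a tiny script. The numbers and the `summarize` helper below are hypothetical illustrations, not measurements from the game:

```python
# Hedged sketch: compare FPS samples captured at a fixed test spot
# (e.g. standing in front of Nightstar) before and after a settings change.
# All sample values here are made up for illustration.

def summarize(samples):
    """Return (min, max, average) for a list of FPS readings."""
    return min(samples), max(samples), sum(samples) / len(samples)

ao_hq = [35, 37, 36, 35, 37]    # AO at High Quality (hypothetical readings)
ao_off = [38, 39, 38, 40, 39]   # AO/FSAA/AF off (hypothetical readings)

lo1, hi1, avg1 = summarize(ao_hq)
lo2, hi2, avg2 = summarize(ao_off)
print(f"HQ:  {lo1}-{hi1} fps (avg {avg1:.1f})")
print(f"off: {lo2}-{hi2} fps (avg {avg2:.1f})")
print(f"average gain: {avg2 - avg1:.1f} fps")  # a 2-3 fps gain is barely noticeable
```

The point of averaging several readings is that single-frame FPS numbers in a busy zone bounce around more than the 2-3 fps difference you're trying to detect.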
I just wanted to add that I had world/character detail turned down to 100%. -
Quote:Thanks for the advice. I'll try using the basic AO slider and post back a bit later with my fps. Hopefully it will run at a setting that doesn't have that outline between objects and shading.
I agree that the color bleed effect looks a lot better. Without it I had AO set to slight, trying to get rid of the dingy gray shading.
The "outline" effect is some sort of glitch between AA and AO; it almost looks like CO-style "outlining", but it only happens on certain backgrounds and textures depending on where you're standing, so it's not a consistent thing. Thankfully, my eyes have started to "tune it out". -
Quote:I agree with this statement.

I appreciate the colour bleeding effect seen in the Ultra setting. It gives the occlusion shadows appropriate colours, such as brown rather than gray to match the surface, while minimising the AO effect for surfaces under direct sunlight. This mirrors real life surprisingly well for a post-process effect, and it's an innovation I have not seen in AO shaders in other games.
-
Yeah, using the advanced AO settings is normally asking for bad performance. I'm using "High Quality" on the standard AO slider, but could PROBABLY get away with Ultra; I just don't find the FPS sacrifice worth the visual increase above the HQ mark. Ironically, there's a notch between HQ and Ultra, and that one (just above HQ but before Ultra) has a *worse* FPS hit on my system (by 2-3 fps) than the Ultra notch does.
-
Quote:SLI is likely not working correctly then. Did you update your drivers, and check to make sure CoH is set to "Alternate Frame Render 2" in the recommended mode for CoH? I can tell you that SLI works on my pair of GTS 250s, and my GF has dual 470s and she massively outperforms my single 470, so I know it *does* work on 470s.

Unless I have a setting wrong or something, the performance gain from SLI wasn't worth the wait. With your settings my FPS in Praetoria is about 11, and I'm running dual 470s in SLI. I get 15-20 FPS with Ambient Occlusion turned off. SLI is working, with both cards around 50% load. The FPS difference with SLI on/off is 1-2.
-
Quote:I know for a fact this is untrue. There has been discussion and complaining since beta started, before that site went up, about how "the full slot was in the i18/GR Beta" and therefore the full slot has to be here, when in actuality the only parts that were in and tested were exactly what is in and tested now; the only reason anyone knew about the other enhancements was a leak on Test that let people see the entire list.

Exactly. If they had the tree that's up now, but had {UNKNOWN} in the place where the Rare and Very Rare boosts were, then there wouldn't be as much discussion about what is or isn't coming in i19.
Oh...we'd probably still complain, but we wouldn't be as concerned/upset about it.
It's also possible they may release the latter half between 19 and 20, which would still make it true that "the entire slot is in issue 19".
However, unless something changes from what we've heard (rumored release on the 16th or 30th of this month), it's unlikely there'd be time to put the other half in and test it unless they're literally planning to do it tomorrow. -
There is no indication on the site that the slot will be "FULLY UNLOCKABLE" in i19; all it says is "Alpha Slot--Unlocked in i19", which is a true statement: it IS unlocked in i19. More of an interpretation failure.
-
Quote:I'm not sure why they can't talk about ATI Crossfire setup, I mean they do support the ATI branding logo on their site showing they support ATI. Unless they made a deal with Nvidia to not support ATI's Crossfire, which wouldn't surprise me none.

It either means "can't talk about" in the sense of: 1) something is being discussed or worked on that is private between the two companies.
2) We can't talk about it because we have no idea if/when that might happen. Or 3) something akin to what you're suggesting (if such a thing *was* done, it was likely done back in the early days of CoH, and the agreement applies either to the engine or to the product, and therefore doesn't have an 'expiration date'). In that sense, whatever code and/or optimizations NVIDIA might have done or given to them are still NVIDIA's right within the agreement. I'm sure it would've been done by Cryptic back in the day, but it may well still be binding.
I would tend to think any of these or a combination of these are possible. -
-
-
Thanks PB,
That's really the only reason I made it. In a nutshell, if CoH is your primary game or concern, NVIDIA will serve you better than ATI, that's a fact, supported by the way CoH's code works, as I had it explained to me by someone who used to work at NVIDIA and understood the workings of the "TWIMTBP" project.
Not trying at all to belittle anyone's choice in cards or brands, ATI is perfectly suited for other tasks, and will play 95% of all games just as well as NVIDIA, depending on individual machines and the like, but in the context of CoH, that's a less-true statement than it would be otherwise.
SLI does actually work in CoH, and work well: my pair of GTS 250s performs within 2fps of my GTX 470, allowing a 5fps "margin of error" for differences in machine, RAM and socket (an AM2+ board with a tri-core 720 and 4GB of DDR2 versus an AM3 board with a 965 Black quad core and 8GB of DDR3). I don't think any RAM past 4GB really benefits CoH, but the speed of the RAM and HT bus will.
That was the point of my replies in this thread, in general, besides the occasional joke about ATI and CoH, which I only make because I used to have it happen to me.
Depth of Field used to not work properly, and looked more "murky" than the NV version did. High Quality Water was (and I've heard from some still is, depending on your card) an issue with ATI; AA and AF working together has been a problem before, and I had heard so were certain aspects of AO in certain other combinations. To be fair, there's some of this with NVIDIA cards too; AO is very new in general, especially when backporting it into older tech.
Play what you like, use what you like, but facts do support NVIDIA being superior for CoH. Perhaps with CoH2 it would be designed from the ground up to support ATI? (Ideally though, if it was made I'd prefer to see a total graphics card agnostic situation, and/or support the specialized tech of BOTH manufacturers).
It may be difficult for me when/if I build my next machine in 3-5 years, as I don't think NV is staying in the chipset business, and I'm an AMD CPU user but not an AMD graphics card user...that may make getting good SLI support impossible, and I sure as heck don't want to switch to Intel... -
Quote:So what's the point of slapping a big AMD/ATi badge on the current splash screens? Even the Catalyst driver plugs this game. I really don't understand what the issue is with getting up-to-date support when Ultra Mode was written from the ground up with the support of AMD/ATi. Are their drivers really that poorly written/supported that nVidia can be first with multi-GPU support without needing an official sponsorship?

First off, yeah, ATI drivers are pretty poorly written; I've never had one uninstall correctly, and upgrading drivers was always a mess. As for the "corporate splash": AMD (before or shortly after the merger) was a corporate sponsor of several major CoH events (as Razer is now as well), and I think that's probably why it's prominent on the splash screens and why CoH is "plugged" in the drivers. Ultra Mode itself may have been written "from the ground up" with ATI support--I doubt it was, quite honestly, and even if it was, Ultra Mode alone won't ensure the kind of support you're thinking of--but the game itself was not.
The reality of the situation is that the CoH engine was written, from the ground up back in the ancient days of yore, with NVIDIA support and/or the coding optimizations they give to games and teams that are part of the NVIDIA "TWIMTBP" program, which CoH was. Whatever semi-reverse retooling Ultra Mode may have had (I honestly don't believe it had any; it may have just improved ATI support a little to a moderate amount, but it was always horrendous anyway) doesn't change that. So yeah, because of that, SLI would be a lot easier to support in a game already written with NVIDIA optimizations.
The sad reality of that is, no matter how much ATI/AMD and PS/NC pander to each other and/or do corporate deals and promotions, that optimized code is still there, and even if disabling it were possible, why do that? It wouldn't help ATI work any better, and it would hurt those running NVIDIA products.
When I got really into CoH (around the time CoV released), I realized the card I had then (an X800XT AGP 256MB) wasn't really good enough to drive CoH at the resolution I'd gone up to after borrowing a friend's monitor when mine died (from 1280x720 to a 1680x1050 widescreen Apple Cinema Display). I knew that whatever monitor I ended up buying, I was going to need a better graphics card. For the short term, and to extend the life of that machine, I got a 7800GS, because I'd realized NVIDIA worked much better with CoH. I was ALWAYS an ATI/AMD person, but the reality was the reality. Down the line I saved up and got a 24" Dell flat panel (1920x1200), which I still use, and boy, then it was time to build a new system, and I used the top of the line at the time (8800 GTX 768MB). Fast forward a few years, and that older machine runs a pair of GTS 250 1GB cards in SLI (thanks to BFG replacing my dying cards with better stuff twice--RIP BFG, one of the best in the biz). My new machine runs a GTX 470 1280MB.
Now, I'm not trying to sound like an NVIDIA sales pitch, but it wasn't until I gave up my preconceived notions of Nvidia and actually used one that I realized how much easier their driver installs were. Only a handful of times (maybe 5-10 in the last 5 years) have I had to roll back and/or do a fully clean, fresh install of a new driver because of some issue. The most recent was the upgrade from 260.89 to 260.96: for some reason the install kept 'failing', so I used the "clean driver before install" checkbox now included in the last 2 releases. Short of unplugging my network cable so Win 7 didn't go ballistic trying to correct my errors by installing drivers without asking me and not letting me stop, I didn't have to do anything special except a pair of reboots and rebuilding my game profiles.
I do like ATI cards for a lot of things, especially home theatre; they've always done that well. I have always believed, though, that they rushed out their DX11 product, and that the 400-series NV cards handle DX11 much better.
Long story short: some games are better with one, some are better with the other. CoH just happens to really favor NVIDIA, which is honestly likely more a holdover/corporate decision from Cryptic back in the original design/sponsorship days than any reflection on NC's and/or PS's current crop of people; because of that, SLI was likely easier to implement. I guess it's possible Crossfire might be supported sometime, but I wouldn't hold my breath. SLI is more finicky and problematic to get supported WELL in something (without a lot of NV customization and "help"), but IMO it tends to perform better than Crossfire when it is. On the flip side, Crossfire is generally easier to get working in a partial sense, or for just a quick boost, but tends not to be as stable in those situations.
Now...I'd like to see PhysX supported properly on-card with NVIDIA cards, but I'm not sure that, for the vast majority of users, it would help much, though lower-end machines and/or laptops might get decent benefits as opposed to using the CPU for CoH's (admittedly minimal) physics. -
"I'm Peerless Girl, and I approve this message...remember, friends don't let friends go ATI." *winning campaign smile*
Edit: to clarify, ATI has its positives...but CoH compatibility ain't one of 'em. -
Quote:A lot of us have this issue; I doubt a whole lot of us were invited by accident, they're probably just overwhelmed with something...

/snickers
I think I owe some maple bacon cookies to forumites if I get some idea what Electrical Blast alternate animations look like.
I fear I was mistakenly invited to these forums, as I can't get into Closed Beta officially (I get the try a different server message). -
It's funny, I used to be a major ATI devotee back in the day. The irony being, I ran ATI and AMD, which all of my friends made fun of me for, because common wisdom at the time (98-2002ish) was that AMD did better with Nvidia and Intel did better with ATI (it was thought that pairing the two more "unstable" companies, AMD and ATI, together was generally a bad idea).
The old me would've been happy when ATI and AMD merged, but by that time, largely due to CoH and the absolute horrible mess of problems CoH had with ATI cards, I'd switched over to Nvidia, though I was still with AMD for processors (still am). I guess I made the switch in...maybe '04-'05, to a 7800GS at the time, then later got an 8800GTX for my new rig, then later SLI'd them, and when I built this more recent one I went with a 470 (the "old" machine runs dual GTS 250 1GBs).
I can't honestly imagine using ATI with ANYTHING related to CoH; it has always had problems (some tell me the water still isn't right, and it was messed up on my old X800XT AGP card). ATI has never been able to write drivers for crap; I can only hope AMD has helped them in that category.
The irony being, Nvidia can't seem to produce any new chipsets, and only a quarter or so of their current chipsets are halfway decent, which may end up making AMD+Nvidia SLI a thing of the past, and I may have to *shudder* switch to InHell just to keep SLI an option...I certainly hope not, though, as I despise Intel as much for their business practices as for the overpriced nature of their processors, if not more so...
It's a weird time to be a PC user. -
What cards were you using? I was able to get flawless SLI with a pair of GTS 250 1GB models. I've also seen SLI have some issues on two friends' rigs, though in one case I'm almost certain it's a bad SLI bridge or a bad card, as the problem duplicates in other SLI games and benchmarks. Perhaps certain cards may not currently work correctly, but it could also be other issues with the rigs, as some have had SLI working since those drivers were in beta, before the announcement of SLI coming was made. I can confirm it currently works for me on both Test and Live.
-
That's fairly consistent with my own FPS with just one 470. I have 8xAA and everything else maxed @ 1920x1200, except world detail is at 151% (sometimes 125% during events or in certain areas of Praetoria) and AO is set to "High Quality"; otherwise everything is maxed out, and I get between 35-50 depending on what's going on.
-
Just out of curiosity, what cards are you running? I know someone with a pair of 470s who is having massive tearing and ripping (way beyond normal "vsync" tear; so bad the power tray sometimes "jumps" to the opposite corner of the screen and back). We think it's her SLI bridge, but I'm also wondering if newer cards have support issues until i19 actually releases?
-
I've used Skype while playing, and never had this problem. It could be something in his router settings causing the two programs to conflict with each other. Maybe he somehow managed to set a Skype port-forward range to a port or ports CoH happens to use; that's the only thing I can think of off the top of my head.
-
I can't imagine you'd be able to run UM at max with that rig, especially Ambient Occlusion. What sort of FPS do you get?
-
I can confirm that SLI seems to be working (for me at least on test) already, as of the driver update on my second machine, 2 GTS 250s almost equals my GTX 470 in the main machine now...lacking maybe 5fps, which could easily be the cpu and/or board of the older machine.
-
Only one word to say: SLI (at least on test).
-
Note that it's largely impossible to "max everything out" in CoH Ultra Mode currently (with the possible exception of SLI), but I consider myself to be basically maxed out.
I have everything "maxed" EXCEPT: World Detail is at 124%; AA is at 8x (above that I don't see much improvement, maybe in Praetoria, and it just slows the game down; I don't think CoH textures benefit from 16xAA or higher); Bloom is "Regular" and at 100% (that's just a taste thing, really); and Ambient Occlusion is "High Quality" (2 notches down from maxed--maxing this pretty much takes my game to 10-12fps in most situations). My resolution is 1920x1200 on a 24" Dell monitor.
I get around 40-50fps in most places; Praetoria drops to around 40 even (talking outside; in most missions I get between 30 and 60, depending). I do not have vsync enabled, as that tends to lock the CoH FPS to around 30, especially outside in Praetoria, though it doesn't always do it in missions.
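That ~30fps lock is the classic vsync snap: with plain double buffering on a 60Hz display, a frame that takes even slightly longer than one refresh interval has to wait for the next one, so the rate drops to 60/2 = 30. A minimal sketch of the arithmetic (assuming simple double buffering, no triple buffering):

```python
import math

def vsynced_fps(refresh_hz, frame_ms):
    """Effective FPS under vsync with double buffering: each frame waits
    for the next vertical refresh, so the rate snaps to refresh/1, /2, /3..."""
    interval_ms = 1000.0 / refresh_hz          # one refresh = ~16.7ms at 60Hz
    intervals = max(1, math.ceil(frame_ms / interval_ms))
    return refresh_hz / intervals

# A frame that takes just over one 60Hz interval gets snapped to 30fps:
print(vsynced_fps(60, 17.0))   # -> 30.0
print(vsynced_fps(60, 16.0))   # -> 60.0
```

This is why turning vsync off in a heavy outdoor zone can jump you from a locked 30 to the high 30s/40s, at the cost of tearing.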
I run an AMD Phenom II 965 Black on an MSI NF-750a-G55 motherboard (using Nvidia 750a SLI chipset).
I run 8GB of 1333 Geil Ram (7-7-7-24-2T 1.5v Dual Unganged - stock timings)
Running an Nvidia GeForce GTX 470 Superclocked from EVGA (1280MB RAM), and using the motherboard's onboard graphics for PhysX (which has no real effect on CoH, as PhysX is still handled by the CPU...not that I don't have cycles to spare, but really guys, now that SLI works, let's get on with updating the PhysX support for Nvidia, hmm?).
The rest of the components are less relevant, but I use an Antec 900 case with 5 120mm Antec Tri-Cool fans and one top 200mm fan; the power supply is a BFG EX 1000W PSU (in case I later add a second 470).
I use a Seagate Barracuda 7200.12 500GB drive as my "main", with a 1TB WD GreenPower data drive and an external eSATA dock for backups. Running Windows 7 Pro 64-bit.
Now just imagine how this sucker would run with a second 470. I could probably actually max out Ambient Occlusion...maybe...but I doubt it. At the least I'd run a solid 60fps.
EDIT: For Reference my Windows Experience Index is as follows:
CPU: 7.4
Memory: 7.5
Graphics: 7.6 (was 7.5 until the latest driver update)
Gaming Graphics: 7.6 (was 7.5 until the latest driver update)
Primary Hard Disk: 5.9
Overall Score: 5.9
(Darn that lack of an SSD to break the 5.9 "barrier").
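For what it's worth, the Windows Experience Index base score is just the lowest subscore, which is why the 5.9 disk caps the whole rig at 5.9 despite the 7.x everywhere else. A quick illustration of that rule, using the scores above:

```python
# The WEI base score is the minimum of the subscores, so the slowest
# component (here, the non-SSD disk) determines the overall number.
subscores = {
    "CPU": 7.4,
    "Memory": 7.5,
    "Graphics": 7.6,
    "Gaming Graphics": 7.6,
    "Primary Hard Disk": 5.9,
}

bottleneck = min(subscores, key=subscores.get)
base_score = min(subscores.values())
print(f"Base score: {base_score} (limited by {bottleneck})")
# -> Base score: 5.9 (limited by Primary Hard Disk)
```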