Quote:It actually works fine just now. It's just that nvidia won't put out an SLI profile. Well, they do, but it is set to Single GPU. The gaming community had to come up with settings that work for the game. So my original point is: if the gaming community can come up with a workable SLI profile, why can't nvidia/CoX?
Since SLI was first introduced by nVidia with the 6xxx series, it was certainly designed to handle XP.
The problem is a memory addressing issue that only recently cropped up, when video cards started to have more than 512MB of memory each and systems more than 2GB of RAM. The issue is 32-bit OS vs. 64-bit OS. It's true that 64-bit XP is the redheaded stepchild of OS acceptance, with very little penetration in the gamer space (0.56%). That changed with Vista and Windows 7. The increased use of 64-bit versions of those two OSes tracks with the increases in system RAM and >512MB video cards.
There are plenty of people using SLI on 2GB systems with two 512 or 768MB video cards in 32-bit XP. Is that an ideal setup with today's 1GB video cards? No. But three years ago it was.
It sounds like you're someone who spent a lot of money on a high-end gamer rig, and you simply have to come to grips with the fact that this game's 3D rendering engine isn't written in a way that works well with SLI and Crossfire. Maybe it will one day. -
Quote:Where do you get your #s? If someone is only getting a 10% increase from SLI, then they don't know what they are doing. A proper SLI/Xfire setup should give you anywhere from 50-300% more performance depending on whether 2, 3, or 4 GPU cores are used. If someone spends $600 for a 10% increase, they are an idiot. You can slap together 2 cheap nvidia GTS 250s and see a lot more than 10% increases in most games.
You make it sound like everyone is using SLI and Crossfire. Last time I checked, SLI and Crossfire were just way too expensive for the performance increase they provided.
A $600 increase in budget for a 10% increase in performance (at the very best) is really not worth it for many, even power gamers who are intelligent enough to know when they are throwing money out the window.
I am a power gamer and a computer tech, and I can see only one application where SLI and Crossfire could potentially be of use, and it's not even gaming: video / 3D rendering.
So for an average user (don't forget that more than 90% of gamers aren't power gamers with loads of cash), SLI is useless. Why would a company like NCSoft and Paragon Studios invest time and effort in something that less than 10% of users (and even that seems high to me) will use?
And the only argument you seem to be able to make is that TO YOU XP is dead. However, Vista and Windows 7 aren't even officially supported by NCSoft. The code was written 6 years ago for, guess what, Windows XP and 2000. It was never designed for the redesigned core of Vista and 7, even less for 64-bit. We are lucky enough that it works.
Saying XP is dead because you want it to be doesn't make you very credible. You also fail to prove why they should invest time into SLI and Crossfire besides the fact that YOU want them to.
Back to my ORIGINAL post...
If you want SLI and have nvidia cards, you can use nHancer to override the predefined profile and enter the hex codes I have given. Depending on your rig and your settings in the game, this will result in a 25-200% increase in fps -
Quote:Again, I could care less if it is still supported. It can not run top end, modern hardware. So for me, it's dead. For any serious gamer, it's DEAD.
http://arstechnica.com/microsoft/new...-extension.ars
Yet another reprieve for Windows XP.
There's really not much more to say on this topic. Windows XP is not going away. It will continue to be a target for any developer.
OpenGL is still the only way to target all versions of Windows.
DirectX is still a multi-billion dollar mistake. Period. Stop.
Sure, go ahead and keep using it if you want a 4GB memory cap. Keep using it if you want every MB of video card memory to eat away at your system RAM. Want to run top-of-the-line SLI or xfire video cards? Two of them will knock your usable system RAM down to 2GB, three down to 1GB, and 4 GPUs are not possible unless you use 512MB video cards.. and 512MB for textures isn't enough for most games. They want cards with 768MB-1.5GB of vram.
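As a rough back-of-the-envelope sketch of the arithmetic behind those numbers (ignoring the chipset's actual memory hole layout, which varies by motherboard), it comes down to subtracting mapped VRAM from the 4GB of 32-bit address space:

```python
# Rough sketch of why VRAM eats into system RAM on a 32-bit OS.
# Assumes each card's full VRAM is mapped into the 4GB address space,
# which is a simplification; real chipsets reserve other MMIO ranges too.
ADDRESS_SPACE_GB = 4  # 32-bit addressing limit

def usable_system_ram_gb(num_cards, vram_per_card_gb):
    """Approximate system RAM left addressable after mapping VRAM."""
    remaining = ADDRESS_SPACE_GB - num_cards * vram_per_card_gb
    return max(remaining, 0)

for cards in (2, 3, 4):
    print(cards, "x 1GB cards ->", usable_system_ram_gb(cards, 1), "GB RAM usable")
```

With four 1GB cards nothing is left for system RAM at all, which is the "4 GPUs are not possible" case above.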
Yes, you can game on XP. DX 9, 10, and 11 are and will remain the standard for Windows. 90% of the games released for Windows use DX and not OpenGL.
I'm saying it's dead because it CAN NOT do what a modern 64-bit OS can do (and XP 64-bit is a failure).
So if you want to live in the stone ages... keep XP. -
Quote:Again, try running 4 modern GPUs on XP = FAIL (or maybe 512MB-1GB of ram available if you are lucky).
http://arstechnica.com/microsoft/new...-extension.ars
Yet another reprieve for Windows XP.
There's really not much more to say on this topic. Windows XP is not going away. It will continue to be a target for any developer.
OpenGL is still the only way to target all versions of Windows.
DirectX is still a multi-billion dollar mistake. Period. Stop.
To an average gamer, XP will work, but not for a power gamer. To a power gamer who wants to use top-of-the-line hardware, XP is dead. -
Frankly, I couldn't care less if 80% of the population out there still used XP. It was a great OS, but it simply can not take full advantage of modern hardware (like 4 GPUs with 1GB of vram each). je_saist, please try submitting links to data that's not 12-36 months old. I couldn't care less what was going on in 2008. If you are going to slap out links to stats, at least show them as of 2010.
NobleFoxx: Actually, DX10 and DX11 could EASILY have worked on XP. MS simply denied them to XP as a means to force gamers to dump XP. And someone did hack DX10 onto XP, but DX10 was a failure. 10.1 sort of redeemed it, but DX11 is really what DX10 should have been.
Well, back to the original topic.. at this point, it doesn't matter if XP is dead or not; the issue in this thread is SLI.
... people have been complaining that CoX isn't multi-GPU friendly for 6 YEARS. Games don't have to be specifically coded for SLI; that's the whole point of it. The driver and the cards are supposed to do all the work and render each frame (or split frame) without the software knowing it (in theory). Of course, given the way games are coded, some games work fine with SLI and others have issues even when made with SLI (or xfire) in mind.
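For anyone unfamiliar with how the driver splits work in alternate frame rendering (AFR, the mode these SLI profiles typically use), the core idea is just round-robin frame assignment by the driver, invisible to the game. A toy sketch of the concept, not actual driver code:

```python
# Toy illustration of alternate frame rendering (AFR): the driver hands
# consecutive frames to GPUs round-robin, transparently to the game.
# Conceptual sketch only; real drivers also manage inter-frame dependencies.
def afr_schedule(num_frames, num_gpus):
    """Return which GPU renders each frame under simple AFR."""
    return [frame % num_gpus for frame in range(num_frames)]

# With 4-way SLI, frames 0-7 land on GPUs 0,1,2,3,0,1,2,3:
print(afr_schedule(8, 4))
```

The "some games have issues" part comes from frames that depend on the previous frame's output (render-to-texture effects, etc.), which forces GPUs to wait on each other and breaks the ideal scaling.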
My original point was that people have been on the CoX developers' backs for 6 YEARS. Granted, part of this was nvidia's poor support of OpenGL for a time, but I can not blame them, as DX was the WINDOWS standard, and for years Windows was (and still is) the primary gaming platform for desktop gamers (Mac and Linux have had a very small percentage of the gaming market share). Now that Apple and PCs have identical hardware and can run either OS, I do agree that coding in OpenGL makes porting from one OS to another much easier.
Well, now that CoX has undergone a graphics engine enhancement, SLI is doable.
Like I said: a 4-way AFR setting in nHancer with 5800000 in the OpenGL section works fine (although it does cause some artifacting with water in the game), and it seems to work well for 2-, 3-, or 4-way SLI. (At least it works well for me and the computers I have tested it on: the Vista x86/x64 and Win7 x86/x64 machines that I have access to.) -
I'm not sure if this has been discussed in other places. I looked and couldn't find it, but while calling support I asked them a question:
Can EXISTING good/evil characters go Rogue?
Answer: NO!
Apparently you MUST make a Praetorian, and your actions will decide your alignment. So to go rogue, we all get to start off with all-new level 1s hehehe
-
Quote:First off... I'll be MORE than happy to FRAPS you a video with my SLI load bars with the /showfps 1 setting in the game, showing both with AND without SLI on. So I can confirm and prove this in a heartbeat. With Quad SLI going, I EASILY get a 50% boost in FPS, even more of an fps boost if I kill Ambient Occlusion (AO, be it OGL or DX, is about the biggest FPS killer out there). So yes, if done right SLI can make a WHOPPING difference in CoX. I also have FAR more GPU power than your system does, so I will see a very large increase. My partner has 2 GTS 250s in SLI mode and gets a benefit, but not nearly as much as I do in Quad SLI (but CoX still performs better for him and he can enable features he couldn't before). I've tried some other settings resulting in about a 100% fps boost, but they result in a LOT of artifacts in the water and other places.
97.45?
Don't you mean 197.45? That would be a completely different driver.
Second: I've been testing with the release driver sets using a 2x SLI 9500 GT setup and a Triple SLI GTS 250 setup.
Sorry, I can't confirm what you say.
Not much of a standard, but I'll come back to this with one of your statements down the line. When I say that I create my own profiles, they are technically "shadow" profiles that I manually activate before playing a game.
Own profile using a third party utility: http://forum.nhancer.com/showthread.php?t=264
Um... do I need to really go into this?
Bull Manure.
I can destroy this in two words:
Windows XP
Let me pop your little bubble here. Windows XP is artificially limited to DirectX 9 by Microsoft directly.
The only way you can get DirectX 10 and DirectX 11 visuals on Windows XP is to use OpenGL.
So even if you only intend to release on Windows, there's still no reason to make a DirectX Windows game.
So: download a third party program, and don't expect the vendor to support it out of the box.
Okay, I'm good with this. It's a great suggestion for everybody who plays the game and wants to get the most out of their graphics.
... and we still disagree.
As much as I smash on Nvidia lately, even I will give them props for stellar OpenGL support when nobody else (Intel, ATi, Matrox, Via, S3, SiS) cared. It's only been recently that Nvidia's OpenGL support has tended to fall by the wayside.
There is also nothing wrong with using nHancer. I have used it to modify nvidia's preset profiles to get much better performance out of my games. Nvidia doesn't always do the best job of setting up profiles, and sometimes they need to be tweaked a bit.
Lastly, Windows XP is dead and obsolete, so I don't even consider it. I play CoX on Win7 x64 Ultimate. But yes, MS did prevent anything beyond DX9 on XP, and yes, if you want to get around that you can use OpenGL. IMO there's no point; XP is dead and needs to be buried. So there's no reason (unless you are developing cross-platform) to use OpenGL anyway, as DX11 will perform much better (DX10 was an utter failure IMO). So I can see CoH sticking to OpenGL to support people who refuse to get rid of XP.
Why don't I run XP? It can't see the 16GB of RAM that I use, XP x64 is an utter failure, and any 32-bit MS OS limits an application to using no more than 2GB of RAM. Plus, with the 4GB hard limit XP has, and many graphics cards having 512MB-1.5GB of video RAM on them.. well, do the math... try to quad-SLI on XP with four 1GB cards... it's impossible. Two 1GB cards will leave you with only 2GB of RAM to use. Like I said, XP is dead and obsolete, as is any 32-bit OS. Why MS even released Win 7 in a 32-bit version is beyond me.. I'm just happy that the next OS after Win 7 will be 64-bit only.
My point being: with current video hardware and the memory demands of modern games, game designers shouldn't even consider XP a factor any longer.
And the nHancer thread? That's something I already read a long time ago. I have no idea why you linked it. If I'm missing something, please enlighten me. A 4-way AFR setting in nHancer with 5800000 in the OpenGL section works fine (although it does cause some artifacting with water in the game), and it seems to work well for 2-, 3-, or 4-way SLI.
My whole point is this:
1) I couldn't care less if I don't have vendor or CoX support for using a non-standard profile, and most people who want to use SLI probably don't care either. And if you do have a problem, you kill the shadow profile.. and poof.. you get full support from CoX and nvidia.
2) People WANT to use SLI, so I'm giving them options. Plain and simple. As I use nHancer to tweak my CoH profile, I will post profile codes for people to put into nHancer to try, and each one will have its pros and cons clearly stated. And if a shadow profile causes someone issues, they just delete it and go back to the stock nvidia CoX profile of single GPU.
3) The day CoX and Nvidia actually come out with a REAL, predefined SLI profile, I think monkeys will fly. This is something Cryptic (now NCsoft) and Nvidia have been "working" on for.. what? Over 6+ YEARS now? It's been SIX YEARS that people have been asking for SLI support. -
Yes I meant 197.45
I was just looking in Device Manager, and it always puts the decimals in different places.
I have now upgraded to 257.21 and it has not adversely affected my SLI performance in CoX -
Nvidia's SLI will work JUST fine with all the new Ultra Mode settings. This has been tested on 97.45 and the most recent driver set. I have a Dell 710 with a quad core and two GTX 295s for Quad SLI.
In the past (before Ultra Mode), Nvidia couldn't release a profile because the problem was with CoX itself (well, they DID have a profile, but it was set to single GPU). The gaming community tried every possible setting, and the game would crash or have horrific performance. This was due to nvidia's poor OpenGL support for SLI, and (IMO) CoX's mistake of going OpenGL when the "standard" for Windows-based games is/was DirectX.
Since Ultramode was released, I built my own profile using nhancer 2.5.9, and I get a massive speed boost out of the game.
I just set it to 4-way AFR and enter a hex code of 5800000 into the SLI OpenGL settings. This works fine with 2-, 3-, or 4-way SLI. Granted, this IS a generic profile, but it DOES work. I'm currently working on optimizing it.
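For the curious: these hex codes are per-profile SLI compatibility bit masks that the driver reads. Nvidia doesn't document what each bit means, so the best you can do from outside is see which bit positions a value like 5800000 sets, e.g. to compare it against other community profiles. A small sketch (positions only, no semantics):

```python
# Show which bit positions are set in an SLI compatibility value.
# Bit meanings are undocumented by nvidia; this only decodes positions,
# a sketch for comparing profile values, not an official tool.
def set_bits(hex_value):
    """Return the bit positions set in a hex string like '5800000'."""
    value = int(hex_value, 16)
    return [pos for pos in range(value.bit_length()) if value & (1 << pos)]

print(set_bits("5800000"))  # the 4-way AFR value mentioned above
```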
Had CoX gone with DirectX from the start, we would have seen a true SLI/xfire profile a long, long time ago. Unless a developer plans on many cross-platform ports, there's no reason to make an OpenGL Windows game.
Just use nhancer and play with the profile settings until you get a profile you are happy with.
The only downside is that you MUST enable the profile manually before you start the game.
I've been following this for years. The issue can't be blamed on nvidia or CoX alone; they are BOTH at fault: Nvidia for having piss-poor OpenGL support, and CoX for not properly working with Nvidia on this.