Multi-GPU rendering is largely handled by the graphics DRIVER and not by the game itself. Both Nvidia and AMD support a long list of games that benefit from multi-GPU acceleration even though the developers of those games have done absolutely nothing to the game engine or the final client binary to enable multi-GPU support.
Yes, game developers can make some optimizations in their games to leverage multi-GPU setups, but such optimizations are not, by any means, required for multi-GPU acceleration.
Nvidia
Now, as to why Multi-GPU hasn't been supported in the past by Nvidia... nobody knows. Nvidia was pretty quick to enable Multi-GPU for id Software's OpenGL-based rendering engines. In fact, Nvidia actually worked with id Software to create proprietary OpenGL extensions such as UltraShadow. Nvidia also helped write the underpinnings of CoH's original graphics technology, again utilizing proprietary OpenGL extensions.
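For a concrete sense of what "proprietary OpenGL extensions" means in practice: a renderer just asks the driver whether an extension is present and picks a code path accordingly. The sketch below is plain C and assumes a GL context is already current; it uses GL_EXT_depth_bounds_test as the example because that is, as far as I recall, the extension exposing UltraShadow's depth-bounds hardware, so treat the specific name as an assumption.

Code:
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

static int has_extension(const char *name)
{
    /* Assumes an OpenGL context is already current; glGetString() returns
     * NULL without one, which this check treats as "extension missing". */
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

int main(void)
{
    /* GL_EXT_depth_bounds_test is, as far as I recall, the extension behind
     * UltraShadow's depth-bounds culling; treat the exact name as an assumption. */
    if (has_extension("GL_EXT_depth_bounds_test"))
        printf("Depth-bounds test reported: take the vendor-optimized shadow path.\n");
    else
        printf("Extension not reported: fall back to the generic shadow path.\n");
    return 0;
}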
Currently Nvidia has enabled SLI support for the City of Heroes executable. As already mentioned, there is no gigantic performance boost from running an SLI setup.
AMD
Multi-GPU hasn't been supported in the past by ATi/AMD because the old graphics engine was essentially broken on ATi/AMD hardware. There is little reason to enable multi-GPU support for City of Heroes if everybody with multiple GPUs can't run with high-quality water and suffers from other rendering issues.
Multi-GPU isn't supported currently by ATi/AMD for a couple of reasons. While AMD helped write a new graphics engine that adheres to OpenGL 3.x targets... and while AMD's single-card OpenGL performance is adequate... AMD's multi-card OpenGL performance is downright abysmal.
The reality for ATi/AMD is that most of the OpenGL-based games to hit the market in the past 6, 7, 8 or so years leveraged proprietary OpenGL extensions designed to make games run better on Nvidia hardware (Nvidia's "The way it's meant to be code sabotaged" program). For the most part, ATi/AMD didn't have much of a market incentive to write a fast, single-pass OpenGL driver that leveraged multiple GPUs outside of the Linux gamer market.
Now, OpenGL itself is having a bit of a revival. The API is the only way to get graphics acceleration on mobile platforms such as Apple iOS and Google Android. Two of gaming's biggest juggernauts, Nintendo and Sony, are also using OpenGL as the rendering API in their next-generation handhelds, the Nintendo 3DS and the Sony NGP. One of the largest standouts of the PC publishing / development crowd, Valve Software, is also turning to OpenGL as a rendering API solution.
Whether or not renewed developer interest in OpenGL will be enough of a pitchfork for AMD to get off its collective rump and get a proper performance driver for multi-GPU rendering out the door... nobody really knows yet. -
I've been sitting on this for... well... a while now... not sure where to post it. I'm generally hesitant to describe any server as my "home" server since I actively play across several servers. As Guardian is the "Cookie" server...
Tomorrow I leave for Florida to attend Notter School of Pastry Arts. Assuming I'm able to digest the course material, in 6 months I'll be a diploma-holding pastry chef.
I'll probably also need to file for Ulli insurance... -
I would go for free-transfer tokens as a vet reward.
Something else that might be interesting is bundling a free transfer with booster packs. -
Something else to consider is that changes are afoot at Transgaming.
In the past, the Cider engine used to drive OSX games has been derived from the Cedega engine technology. In Transgaming's own words, Cider and Cedega are the same source code; one is just compiled to targets for the Linux kernel and the other is compiled to targets for BSD_Mach.
However, Transgaming's source code tree for Cedega has been in a state of flux for... well... let's just say Cedega 8 was in the testing stages about a calendar year ago and there still hasn't been a stable release. Various versions of the Maudite testing engines have broken CoH compatibility in new and unique ways, leading me to believe that Transgaming hasn't been able to provide a stable source code base to update the OSX client against.
Then, earlier this month, Transgaming dropped a bomb. The Cedega / Cider tech is being opened up through http://gametreelinux.com/
When the change occurred I popped the developers some questions on that subject. Here's a copy of what I sent.
Quote (pertinent parts bolded): Transgaming has stated in the past that Cider and Cedega share the same source code, and that they are effectively the same set of tools optimized for two different operating system bases. Given the erratic state of support for City of Heroes on Cedega's Maudite test engines, I have been under the impression that the lack of an update for the existing OSX client was largely due to Transgaming not being able to provide a stable build of the Cider tech.
Now I'm curious whether the move to open up the Cedega technology's development under GameTreeLinux might have had an impact on contract negotiations with existing Cider affiliates. Transgaming hasn't yet confirmed whether Cider and GameTreeLinux will continue to share the same source code. For that matter, Transgaming has yet to confirm whether GameTreeLinux will still allow the arbitrary installation of pre-compiled binary files, something that the currently existing GameTreeMac does not appear to allow.
So, basic first questions:
- Are the changes at Transgaming having any effect on development of City of Heroes?
- Assuming that Paragon Studios is happy with Transgaming, and presuming that arbitrary binary files can still be launched through GameTreeLinux, would Paragon Studios / NCsoft consider adding a "compatible with GameTreeLinux" blurb to the online requirements or to the Buy It Now pages?
- Assuming that Paragon Studios is happy with Transgaming, and assuming that Transgaming uses the upcoming GameTreeLinux as a store-front for compatible games, would Paragon Studios / NCsoft consider making the title available through GameTreeLinux?
As of yet I have received no response to these questions... so I really can't tell anybody what Paragon Studios is planning.
I do think it is significant that OSX compatibility was a highlight in recent developer comments, so I somewhat suspect some kind of update on the status of the OSX client will be issued... when, I don't know. -
Quote: to most people saying stuff like i was just angry because they didnt pick me thats not it i was fine that they didnt need -def but then they claim all the sudden that /rad isnt debuff and thats why im angry

Well... /radiation_blast isn't a debuff set.
/radiation_emission IS
Huge difference between the two.
/radiation_emission has powers like [lingering radiation], [radiation infection], [enervating field], [fallout], and [empulse] that cover defense, resistance, movement, recharge, and regeneration debuffs.
/radiation_blast only natively debuffs enemy defense. As already mentioned, defense_debuffs ALONE may or may not be that important to an average team.
Now, that being said, you can do some pretty nasty things with the /radiation_blast set using Invention origin enhancements. You could, for example, slot a bunch of Achilles' Heel Chance for Resistance Debuffs. You could gain a healthy base damage boost with just two Achilles' Heel IO's or two Undermined Defense IO's.
Would you be as powerful or as effective as a /radiation_emission?
No, you wouldn't... but you'd still be just down right nasty in your own way.
Case in point...
Hero Plan by Mids' Hero Designer 1.91
http://www.cohplanner.com/
Click this DataLink to open the build!
Level 50 Mutation Defender
Primary Power Set: Force Field
Secondary Power Set: Radiation Blast
Power Pool: Speed
Power Pool: Leadership
Ancillary Pool: Psychic Mastery
Hero Profile:
Level 1: Personal Force Field -- LkGmblr-Rchg+(A), LkGmblr-Def/EndRdx(3), LkGmblr-Def(3)
Level 1: Neutrino Bolt -- Achilles-ResDeb%(A), Entrpc-Dmg/EndRdx(5), Entrpc-Acc/Dmg(5), Entrpc-Dmg/Rchg(7), Entrpc-Dmg/EndRdx/Rchg(7), Entrpc-Heal%(9)
Level 2: Deflection Shield -- LkGmblr-Rchg+(A), LkGmblr-Def/EndRdx(9), LkGmblr-Def(11)
Level 4: X-Ray Beam -- Achilles-ResDeb%(A), Entrpc-Acc/Dmg(11), Entrpc-Dmg/EndRdx(13), Entrpc-Dmg/Rchg(13), Entrpc-Dmg/EndRdx/Rchg(15), Entrpc-Heal%(15)
Level 6: Force Bolt -- KinCrsh-Dmg/KB(A), KinCrsh-Acc/KB(17), KinCrsh-Rchg/KB(17), KinCrsh-Rechg/EndRdx(19), KinCrsh-Dmg/EndRdx/KB(19), KinCrsh-Acc/Dmg/KB(21)
Level 8: Hasten -- RechRdx-I(A), RechRdx-I(50), RechRdx-I(50)
Level 10: Insulation Shield -- LkGmblr-Rchg+(A), LkGmblr-Def(21), LkGmblr-Def/EndRdx(23)
Level 12: Irradiate -- Oblit-Dmg(A), Oblit-Acc/Rchg(23), Oblit-Dmg/Rchg(25), Oblit-Acc/Dmg/Rchg(25), Oblit-Acc/Dmg/EndRdx/Rchg(27)
Level 14: Super Speed -- Empty(A)
Level 16: Dispersion Bubble -- LkGmblr-Rchg+(A), LkGmblr-Def/EndRdx(27), LkGmblr-Def/EndRdx/Rchg(29)
Level 18: Proton Volley -- Achilles-ResDeb%(A), Mantic-Acc/Dmg(29), Mantic-Dmg/EndRdx(31), Mantic-Acc/ActRdx/Rng(31), Mantic-Dmg/ActRdx/Rchg(31), Mantic-Dmg/EndRdx/Rchg(33)
Level 20: Electron Haze -- KinCrsh-Dmg/KB(A), KinCrsh-Acc/KB(33), KinCrsh-Rchg/KB(33), KinCrsh-Rechg/EndRdx(34), KinCrsh-Dmg/EndRdx/KB(34), KinCrsh-Acc/Dmg/KB(34)
Level 22: Maneuvers -- LkGmblr-Rchg+(A), LkGmblr-Def/EndRdx(36), LkGmblr-Def/EndRdx/Rchg(36)
Level 24: Tactics -- EndRdx-I(A)
Level 26: Repulsion Bomb -- Posi-Acc/Dmg(A), Posi-Dmg/EndRdx(36), Posi-Dmg/Rchg(37), Posi-Dmg/Rng(37), Posi-Acc/Dmg/EndRdx(37)
Level 28: Cosmic Burst -- Achilles-ResDeb%(A), Entrpc-Acc/Dmg(39), Entrpc-Dmg/EndRdx(39), Entrpc-Dmg/Rchg(39), Entrpc-Dmg/EndRdx/Rchg(40), Entrpc-Heal%(40)
Level 30: Repulsion Field -- KinCrsh-Dmg/KB(A), KinCrsh-Acc/KB(40), KinCrsh-Rchg/KB(42), KinCrsh-Rechg/EndRdx(42), KinCrsh-Dmg/EndRdx/KB(42), KinCrsh-Acc/Dmg/KB(43)
Level 32: Force Bubble -- EndRdx-I(A)
Level 35: Neutron Bomb -- Posi-Acc/Dmg(A), Posi-Dmg/EndRdx(43), Posi-Dmg/Rchg(43), Posi-Dmg/Rng(45), Posi-Acc/Dmg/EndRdx(45)
Level 38: Atomic Blast -- Oblit-Dmg(A), Oblit-Acc/Rchg(45), Oblit-Dmg/Rchg(46), Oblit-Acc/Dmg/Rchg(46), Oblit-Acc/Dmg/EndRdx/Rchg(46)
Level 41: Vengeance -- LkGmblr-Rchg+(A)
Level 44: Dominate -- BasGaze-Acc/Hold(A), BasGaze-Acc/Rchg(48), BasGaze-Rchg/Hold(48), BasGaze-EndRdx/Rchg/Hold(48)
Level 47: Mass Hypnosis -- FtnHyp-Plct%(A)
Level 49: Telekinesis -- EndRdx-I(A)
------------
Level 1: Brawl -- Empty(A)
Level 1: Sprint -- Empty(A)
Level 2: Rest -- Empty(A)
Level 1: Vigilance
Level 4: Ninja Run
Level 2: Swift -- Empty(A)
Level 2: Hurdle -- Empty(A)
Level 2: Health -- Empty(A)
Level 2: Stamina -- EndMod-I(A), EndMod-I(50) -
Quote: I personally wish they'd remove the -tohit from RTTC, its not large enough to make a real difference. Not to mention makes PvP horrid with Willpower.

In the current version of the engine PvP powers are separated from the PvE powers. They can have two totally different effects.
You can no longer justify any change to the way a power works in PvE based on how that power works in PvP. If you don't like how a power works in PvP, go to the PvP forums and request the change there where it belongs. -
It really... really... sounds like there's a problem with OpenGL support.
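When OpenGL breaks this way, one quick sanity check is to ask the driver who is actually answering GL calls. Here's a minimal sketch in plain C; it assumes GLUT (or freeglut) is installed purely to get a throwaway context, and the strings it prints depend entirely on whichever driver Windows has bound to OpenGL.

Code:
#include <stdio.h>
#include <GL/glut.h>   /* assumes freeglut or classic GLUT is installed */

int main(int argc, char **argv)
{
    /* Create a throwaway window purely to get a current OpenGL context,
     * then report which vendor/driver is actually servicing GL calls. */
    glutInit(&argc, argv);
    glutCreateWindow("GL probe");

    printf("GL_VENDOR   : %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER : %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION  : %s\n", (const char *)glGetString(GL_VERSION));
    return 0;
}

If the vendor or renderer that comes back isn't the card you expect, the wrong driver is servicing OpenGL.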
Did you have any other graphics cards in this computer, say like a PCI card used for another monitor? -
Quote: What about the memory. 4 gb or 8? I plan on upgrading to win7. I know the 64 bit version will see/use more memory but I only have 2 gigs now so 4 is a nice upgrade for me. Is 8 or more the way to go for gaming though since the board can hold 16? I see ultra mode is only asking for 4

Any chance I can call dibs on you shipping me your older hardware? I'm always looking to expand my pool of hardware to test software against.
Anyways, I kind of have a hard time "telling" anybody they need more than 4gb of ram. The reason why can be found here: http://store.steampowered.com/hwsurvey/
It's the Steam Hardware Survey. As of December 2010:
- 24.64% use XP 32-bit.
- 13.53% use Vista 32-bit.
- 11.03% use Windows 7 32-bit.
Taken together, that's nearly half of the surveyed user base still on a 32-bit OS, where a single game process can address well under 4 GB. This has the direct effect that developers really can't afford to optimize their games for the memory counts possible on x86-64 systems. For the average user, 8 GB of memory isn't going to buy any more real performance than 4 GB, or even 2 GB in dual-channel mode, in most applications, and especially in games on sub-1920x1200 resolution displays.
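If you want to see the per-process ceiling for yourself, here's a minimal sketch in plain C. It just reports the pointer width of whatever build you compile it as; a 32-bit build tops out at 4 GB of address space (and 32-bit Windows typically hands a process only 2 GB of that by default) no matter how much RAM is installed.

Code:
#include <stdio.h>

int main(void)
{
    /* A 32-bit build can address at most 2^32 bytes (4 GB) per process,
     * no matter how much physical RAM is installed; 32-bit Windows
     * normally hands a process only 2 GB of that by default. */
    unsigned bits = (unsigned)(sizeof(void *) * 8);

    printf("This build uses %u-bit pointers.\n", bits);
    if (bits <= 32)
        printf("Per-process address space tops out at %.1f GB.\n",
               (double)(1ULL << bits) / (1024.0 * 1024.0 * 1024.0));
    else
        printf("Per-process address space is far larger than any installed RAM.\n");
    return 0;
}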
Now, this might not be true in two or three years as more x86-64 systems hit the market... but you can pretty much always add more memory. -
Badly or favorably, depending on "what you are doing".
http://www.techarp.com/showarticle.a...tno=337&pgno=3
The Phenom II has the advantage of an additional 6 MB of Level 3 cache compared to the Athlon II. This gives it greater performance in processor-intensive applications such as 3D video games.
The Athlon II has the advantage of an additional core. If you use lots of applications that are tuned for Simultaneous Multi-Threading (SMT) or Symmetric Multi-Processing (SMP), you'll see greater returns from the Athlon II. On Linux-based systems the difference is more pronounced, as the operating system's kernel scheduler* is better equipped to handle SMT / SMP environments than Microsoft's NT6 scheduler.
I personally would go for the Phenom II since most of the Windows software that is available right now isn't tuned for any more than two processors to begin with. In practical use the extra core on the Athlon II X3 wouldn't buy you any extra performance, and the lack of that Level 3 cache would comparatively harm total performance in single tasks.
*small note: that O'Reilly book is from 2000... kind of out of date... but I really don't feel like digging up the stuff generated from the Con-Man Kolivas hissy fit on the kernel mailing list* -
Quote: Okay, I got to the root of the problem when I realized NOTHING OpenGL-based was working, not even the SDK components.
I have another video card in my system, a PCI card intended to drive two side monitors that run various things (stock displays, videos, performance counters, e-mail, etc).
OpenGL was not working using the latest driver (266.58) with that old card also in the system. It didn't recognize it, or support it -- in fact, that card had stopped working -- but OpenGL calls would just fail, with no clear reason why.
However, after Disabling that card in the Hardware Monitor and rebooting, everything worked perfectly fine. I am back up and running again!
Thanks for the suggestions, JS!
-- Vivian

... yeah. that would do it. -
Quote: Pfft, $55 minimum?? The the $40 Scythe Mugen 2 SCMG-2100 is one I can think of offhand.

Never used that one, but it does look fairly solid.
Edit: Overclock PII X4 965 OC cooler test results. -
I would not count on unlocking or overclocking your processor.
Not all processors will unlock extra cores.
Not all processors will overclock.
Counting on a processor to have extra cores that work, or that it will work at higher than advertised clock-rates, is well, a horribly bad idea if you aren't an experienced overclocker and don't have a hefty amount of cash sunk into cooling solutions.
As is, your build here doesn't include an extra heatsink for the processor to support overclocking, and you're looking at an extra $55 minimum for a Nexus or a basic Noctua.
That being said, what you have here is adequate.
Personally I'd spend a little bit more and go with a Radeon HD 6850. It'll give you more life than the 5770. Keep in mind that the 5770 is approaching, or already at, its one-year birthday. -
Uninstall the Nvidia drivers.
Run Driver Sweeper before rebooting: http://www.guru3d.com/category/driversweeper/
Reinstall the Nvidia drivers.
That should fix your issue. -
Quote: Still, isn't the problem with the manufacturers themselves for making the chips cheaply, or at least with inadequate cooling? NVidia themselves only makes the GPU chips but not the entire video card - They hand out reference designs to second parties and let them make the rest.
NVidia's chips have always run hot, but with proper cooling they won't overheat. I think this is a case of HP/Compaq/etc being cheapskates and skimping on proper cooling designs and materials for their laptops.

Master Blade already answered this... I've got a couple of single posts addressing the issues at hand.

First, there is Thermal Design Power, or TDP:
- http://boards.cityofheroes.com/showp...31&postcount=6
- http://boards.cityofheroes.com/showp...14&postcount=2
Then there is what Nvidia did: Nvidia flat-out lied to Original Design Manufacturers (ODMs) and Original Equipment Manufacturers (OEMs) when giving them the specifications for GeForce chips ranging all the way back to the GeForce 7 and up to the GeForce GT 300 series.
Nvidia told vendors that their chips only gave off certain amounts of heat. Nvidia also (allegedly) modified the drivers used by the OEMs and ODMs for validation to detect the stressing applications the OEMs and ODMs ran and configure the hardware for a certain performance profile. Such benchmark hijacking was something Nvidia was intimately familiar with, which is one of the reasons I believe this allegation even though it has not, as of yet, been proven in a court of law.
OEMs and ODMs designed their products around the thermal limitations given to them by Nvidia and around the results given to them by the hijacked benchmarking applications. This initial overheating was then compounded by the disaster known as bumpgate.
One of the problems with reporting on Bumpgate is that Nvidia was able to lean on many established tech news sites to keep from accurately reporting on the problems, or reporting at all. The stories only broke through Rogue Tech News sites such as Semi-Accurate and TheInquirer... noted less for their accuracy and more for rabble rousing... and largely through one single person: Charlie Demerjian
As to bumpgate itself... I guess the simplest way to put the problem is that Nvidia was trying to cut costs on their end to try and compete with AMD/ATi pricing. However, Nvidia was cutting costs in the wrong areas... and chose to cut corners on the actual physical connections of the processor... not really a place you want to save money. That's really a place you want to spend money. Nvidia's cost cutting measures basically resulted in their GPU's causing short circuits, and at running temperature, explosions.
I'll save the forums yet another extended rant on just how much rotten stuff Nvidia has pulled off that they don't want exposed in a Court of Law, but hopefully this gives you some insight into why these settlements are such a big deal. -
Quote: Crap, that would explain a few things. How do I do this then?

This page will get you started on the process of filing a claim: https://roscomps3.securesites.net/ww.../claimform.php
It looks like there are going to be several hoops to jump through... -
Vivian:
Run driver sweeper: http://www.guru3d.com/category/driversweeper/
THEN install the Nvidia drivers.
Chances are Windows is still trying to use the OpenGL driver from the 9500 on the GTX 460. -
Quote: Praetoria is fine. Biggest change I would like to see is allowing all characters to travel freely between Paragon and Rogue Isles, regardless of alignment. Sure villains probably won't get a lot of missions in Paragon, but there could be new pvp options available that way. Heroes and Villains battling in the streets would be interesting. Add a pvp flagging system, so players can't be attacked in their home city unless their flag is on.

Deary. Go play Warhammer.
PvP has never drawn more than single digit percentages in the entire life of the game. Trying to cram Open World PvP into a game that was never designed for it would more than likely see 99% of all existing subscribers go BAI BAI. -
-
http://paragonwiki.com/wiki/Patch_No...8-16#General_2
Quote:Players can now fight Doppelganger versions of themselves. This can be found in the Boss Enemy Group "Doppelganger". Players can mix and match to create different types of doppelgangers. The following choices are available: -
Nope.
Quote (Source: http://www.cityofheroes.com/game_inf...rchetypes.html): The Controller is at the same time the weakest and yet the most powerful of the archetypes.
The Controller has few offensive attacks and possesses the fewest hit points. But the Controller has access to a range of powers that no one else has: the Control power sets. Armed with these powers, a Controller can affect the behavior of villains from freezing them in place to routing them away.
Armed with such abilities, the Controller is the backbone of many groups involved in large-scale battles - but the Controller depends upon his teammates for protection.
Controllers are deliberately weak on straight damage, as their strength is in the control sets. On SO, crafted IO, and light IO set builds, Controllers tend to be largely reliant on teammates.
However, that can quickly get inverted with intensive use of IO sets, where builds can reach 100% or greater global recharge bonuses. Some of the possible combinations can actually achieve near game-breaking performance.
For example, both Sleet and Heat Loss carry a -30% resistance debuff. Stack both of those on a mob, then hit the epic power Ice Storm and watch the health just fade away.
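To put rough numbers on that, here's a minimal sketch in plain C. It uses the simplified model damage_taken = base * (1 - resistance) and the -30% figures quoted above; it deliberately ignores debuff resistance, level scaling, and whatever native resistance the target starts with, so treat it as an illustration rather than game data.

Code:
#include <stdio.h>

int main(void)
{
    /* Simplified model: damage_taken = base_damage * (1.0 - resistance).
     * Ignores debuff resistance, level scaling, and any native resistance
     * the target started with. */
    double base_damage = 100.0;              /* arbitrary attack, for scale */
    double debuffs[]   = { -0.30, -0.30 };   /* Sleet and Heat Loss, per the numbers above */
    double resistance  = 0.0;                /* target assumed to start at 0% */

    for (unsigned i = 0; i < sizeof(debuffs) / sizeof(debuffs[0]); ++i)
        resistance += debuffs[i];

    printf("Target resistance after debuffs: %.0f%%\n", resistance * 100.0);
    printf("A %.0f-point hit now lands for:  %.0f\n",
           base_damage, base_damage * (1.0 - resistance));
    return 0;
}

Real in-game numbers will differ once those ignored factors kick in, but the shape of the effect, roughly 1.6x damage from every attacker, is why stacked -resistance plays like a team-wide damage buff. -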
If you are asking whether Power Boost ups Vengeance...
Yes. Yes it does. -
Quote: You have to be careful with this though. If the Nictus is out of LoS when Rommy dies he won't rez and you end up having the kill the nictus itself (it's possible this has been fixed, it happened to a team I was on once but it was a long time ago). This is possible but unless you can reset the nictus so it isn't healing itself it can take a long time.

I don't believe there is anything to fix. I believe that is "working as intended".
Romi's death throe requires the Nictus to be in line of sight on defeat. If a Nictus is not in line of sight, then you have to fight whichever Nictus remain.
Now, there have been some bugs with the Nictus remaining in the super-buff state they enter when Romi is defeated, but I believe those issues have been ironed out. -
Quote: You just said tough isn't as important as weave, but suggest I should split my slots evenly between them? Wouldn't it be smarter just to go with weave? Because I have all the powers+energy absorption; wouldn't that in itself be enough?

Syntax42 answered this in his post. There's a limit to how much you can buff any one particular power.
You can use Tough as a platform to boost your base defense with the Steadfast Protection Resistance / +Defense enhancement: http://paragonwiki.com/wiki/Steadfas...stance/Defense
As to whether or not that, in and of itself, will be enough?
Well, only you can judge that. -
Quote: So I made my ice/ice tank and he is doing very well. But looking at other forums people say tough and weave are important. I have room to 6 slot one or I can even them both out. But should I? Will I notice a major difference or should I even bother? Any advice would be great, thanks!

What Auroxis said isn't entirely correct...
If you are running on purely SO's / Crafted IO's, and you have Frozen Armor, Glacial Armor, Weave, Maneuvers, and Combat Jumping, and you have each of them maxed with 4 level 50 crafted defense enhancements... you still won't be at soft cap. You'll be .5% off.
With just basic SO slotting, Frozen Armor, Glacial Armor, Weave, and Energy Absorption will put you up and over the 45% soft-capped defense to Smashing, Lethal, Energy, and Negative energy attacks.
So yes, you likely will notice an improvement in survivability with Weave.
Tough's improvements are not as noticeable, as Ice has next to no damage resistance to begin with.
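If you want to sanity-check your own totals the same way, here's a rough sketch of the arithmetic in plain C. The per-power base defense values and the enhancement figure are placeholders I'm filling in from memory purely for illustration; pull the real numbers from Mids' or the in-game detailed info window before trusting the total.

Code:
#include <stdio.h>

int main(void)
{
    /* Adds up Smashing/Lethal defense from a few powers and compares the
     * total to the 45% soft cap.  The base values and the enhancement
     * figure are placeholder assumptions, NOT authoritative game data. */
    struct { const char *power; double base_def; } powers[] = {
        { "Frozen Armor",   0.135 },   /* placeholder guess */
        { "Weave",          0.050 },   /* placeholder guess */
        { "Maneuvers",      0.035 },   /* placeholder guess */
        { "Combat Jumping", 0.025 },   /* placeholder guess */
    };
    const double enhancement = 0.56;   /* assumed post-ED defense enhancement */
    const double soft_cap    = 0.45;

    double total = 0.0;
    for (unsigned i = 0; i < sizeof(powers) / sizeof(powers[0]); ++i)
        total += powers[i].base_def * (1.0 + enhancement);

    printf("Estimated S/L defense: %.1f%% (soft cap: %.1f%%)\n",
           total * 100.0, soft_cap * 100.0);
    printf("%s\n", total >= soft_cap
           ? "At or over the soft cap."
           : "Short of the soft cap; Energy Absorption or set bonuses have to make up the rest.");
    return 0;
}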
Personally I would slot the powers:
Tough:
Steadfast Resistance +Def
Titanium Coating Resist / Endurance
Titanium Coating Resistance
Weave:
LoTG 7.5%
LoTG Defense / Endurance
LoTG Defense -
Quote: Isn't this reversed? I thought the Nicti/Nictuses were immune to pretty much all mezzes except for sleep. Whenever the immobilize method is used, I've always immobilized Rommy while the taunter pulled the Healing Nictus away.

Yes. Yes it is.
Somehow I transposed immobilizing Romi and leading the healer away.