je_saist

  1. Quote:
    Originally Posted by Fedor View Post
    *Climbs up into the rafters and snugs Demon.*

    Hi DemonNekoAKittehSaistRad

    It's not me, it's the Puppeh. And be careful with my ladder.
    *a low hiss is heard from in front of the mirror*

    ...puppeh...

    oh... he will pay for this. the puppeh... will... pay.
  2. Quote:
    Originally Posted by Hyperstrike View Post
    If the problem is the Intel being flaky and unstable, this is not something the CoH devs can realistically fix. It's not the CoH devs' job to rewrite Intel's drivers for them. There are work-arounds available to trick the Intel setup into being better behaved. But, at the end of the day, the problem
    the problem is Intel not having a clue?

    Okay. It's well known that I'm not exactly numero-uno on Intel's favorites list. There's a reason I refer to their Graphics Accelerators as ... Graphics Accelerators. Intel, outside of Larrabee, doesn't actually make a Graphics Processing Unit. In all fairness, Larrabee wasn't a graphics processing unit either, just a Graphics Accelerator on Steroids... but with steroids like that you didn't need a GPU.

    I'm not entirely sure what the issue is. My personal opinion, and it's based on what little information I could wring out of Television, GPU, and Tex, is that the old CoH graphics engine was making calls to several proprietary OpenGL extensions. The Ultra-Mode engine update brought a semi-re-write / clearing out of the older proprietary extensions, instead using a rendering path based on the Khronos reference specifications.

    Just because a call is in the reference specification does not mean that it is in the graphics driver, or that it is implemented correctly in the graphics driver.

    As I understand the situation at hand, Paragon Studios has a choice on support for these older chips. They can re-break the engine(s) and inject the proprietary extension calls back into the graphics layer, thus returning the engine to its old state of "nobody knows what that section of code actually does, so we have to have this section of code here to work around it and contain its memory leak, but we need an extra section of code over here to determine whether or not we actually need that black-box"... and I hope I get the point across.

    The other option is to hope the driver vendor steps up and fixes support for the reference API calls. Now, Intel has gotten better at making sure their OpenGL drivers match the reference spec. Many users, though, don't have the option of upgrading their drivers, as newer Intel drivers won't recognize their hardware.

    The big question on my mind is whether or not City of Heroes alone is enough to make Intel do what AMD did and run their own mobility driver program, releasing a basic reference driver that installs on the most common hardware variants, at the possible expense of some advanced mobile features. Given that Intel is not exactly a partner with Paragon Studios right now, I somewhat doubt they'll take this approach.
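
    For the curious, here's roughly what this looks like from the engine's side of the fence: before using an extension, a defensively written renderer asks the driver whether the extension is even there. A minimal C sketch, not Paragon's actual code; the has_extension helper is my own illustration, and a GL context must already be current:

        #include <GL/gl.h>
        #include <string.h>

        /* Returns 1 if the driver advertises the named extension. Note this
           only proves the driver *claims* support, not that the code behind
           it is correct, which is the whole problem with these chips. */
        static int has_extension(const char *name)
        {
            const char *exts = (const char *)glGetString(GL_EXTENSIONS);
            return exts && strstr(exts, name) != NULL;
        }

        /* usage:
             if (has_extension("GL_ARB_texture_env_combine"))
                 ... take the combiner path ...
             else
                 ... fall back to a plainer render path ...               */

    The catch, per the above, is that an advertised extension or reference call can still be broken behind the curtain, and no amount of checking at startup will catch that.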
  3. *bounces in*

    Hi there... woah. who left their cute rump just standing around?

    *smacks that rump*

    Hey? Where'd my paw go?

    *bounds in front of a mirror*

    Where did I go???

    *looks around the channel*

    FEDOR! YOU GOT SOME 'SPLAINING TO DO!
  4. Quote:
    Originally Posted by Father Xmas View Post
    From what I understand what the PhysX card buys you over the current CPU emulation is more packing peanuts, shell casings and debris visible at one time.

    The only benchmark I remember seeing was from a long time ago in AnandTech around the time CoV and PhysX first came out. Not sure how the performance of a 2.93GHz Core 2 dual core coupled with an nVidia 7950GX2 compares with current CPUs and video cards.
    Not much, as the performance and on-screen display of running a PhysX card with single-core Socket 754 and 939 Athlon64s was pretty much matched by running a dual-core 939 Athlon64. Back then City of Heroes didn't default with a renderthread -1 command, and it didn't leverage a multi-processor system by default. The game today does leverage SMP by default, granted with only two threads.

    Now, it's been a while since I actually had an AGEIA PhysX card in hand. I sold mine off a little bit after I picked up my AM2 A64 6000. In my own testing back then, at least in CoH, the AGEIA card added pretty much nothing. The same was true with a 939 X2 4000: the PhysX add-in card did... nothing. My personal opinion is that this lack of a performance boost was largely down to CoH's engine architecture, as one of the Ghost Recon titles I had on hand with AGEIA PhysX support did enable some extra fluff with the Physics Processor enabled.

    Now, that being said, PhysX cards may still hold some future value, that value being found in OpenCL. We know that Nvidia's not about to release an OpenCL driver for the AGEIA PhysX PPU, as Nvidia has been systematically trying to stomp the PPU into a grave since, well... they bought AGEIA out.

    However, the actual PPU itself is a rather simple chip-design: http://www.hexus.net/content/item.php?item=5492&page=2

    It might be possible to reverse-engineer an OpenCL driver for the AGEIA PPU. For games that leverage OpenCL for physics, this would allow them to expose existing PPUs as viable physics processors, getting around Nvidia's baby-like mine-mine-mine attitude and behavior.

    This scenario probably won't happen for one very good reason: http://www.x.org/docs/AMD/ :: http://intellinuxgraphics.org/

    The hardware data needed to write a driver, be it an OpenGL graphics driver or an OpenCL computational driver, is pretty much available for most AMD GPUs and Intel GPAs on the market. Why go through the work of reverse-engineering a chip that didn't actually sell that well... when lots more gamers are likely to have an AMD / ATi graphics card on hand or in the processor, or an Intel Graphics Accelerator on the mainboard or in the processor?
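
    If such a driver ever did materialize, games wouldn't need anything PPU-specific; the card would simply show up as one more compute device during normal OpenCL discovery. A minimal C sketch of that discovery (to be clear, the PPU driver in question is hypothetical; nothing like it actually ships):

        #include <CL/cl.h>
        #include <stdio.h>

        int main(void)
        {
            cl_platform_id platforms[8];
            cl_uint nplat = 0;
            clGetPlatformIDs(8, platforms, &nplat);

            for (cl_uint p = 0; p < nplat; p++) {
                cl_device_id devs[8];
                cl_uint ndev = 0;
                /* A PPU exposed through OpenCL would most likely show up as
                   CL_DEVICE_TYPE_ACCELERATOR; _ALL catches everything. */
                if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL,
                                   8, devs, &ndev) != CL_SUCCESS)
                    continue;
                for (cl_uint d = 0; d < ndev; d++) {
                    char name[256];
                    clGetDeviceInfo(devs[d], CL_DEVICE_NAME,
                                    sizeof(name), name, NULL);
                    printf("Found compute device: %s\n", name);
                }
            }
            return 0;
        }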
  5. je_saist

    Server Merge?

    Quote:
    Originally Posted by Cien_Fuegos View Post
    don't make me get out my beating a dead horse with a stick emote
    Oh trust me. I'm stockpiling my own collection of pictures.
  6. From here: http://boards.cityofheroes.com/showthread.php?t=219502
    Intel

    There is a known issue with Intel Graphics Chips at this time. The developers have not gotten around to re-kludging the game engine to work with / against Intel chips. Yes, it is known that the graphics are stuck on the lowest possible shader settings.

    The Developer GPU has indicated that patches for Intel chips are due at a future date: http://boards.cityofheroes.com/showt...49#post2819749
    Quote:
    Some users with integrated intel chipsets currently experience a degrade in the shader quality settings allowed. This should be resolved in a future patch.
    From Luminous Martini: here's a fix to shove the game into its lowest shader mode.

    Quote:
    1. Right-click on the CoH shortcut
    2. Click on Properties
    3. Click on the Shortcut tab at the top
    4. Go to the Target field: "C:\Program Files\City of Heroes\CohUpdater.exe"
    5. After the last ", press space once
    6. Enter the following code exactly: -useTexEnvCombine
    7. Click on Apply, then OK
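
    Once that's done, the whole Target field should read like this, with the path matching wherever your copy is actually installed:

        "C:\Program Files\City of Heroes\CohUpdater.exe" -useTexEnvCombine

    For what it's worth, judging by the name, -useTexEnvCombine presumably steers the engine onto the old GL_ARB_texture_env_combine fixed-function path instead of the shader path, which would be why it side-steps the Intel shader problem.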
  • Quote:
    Originally Posted by NekoAli View Post
    Nah, haven't you heard? I'm on a divine mission and protected by the power of Ceiling Cat.
    *makes a note to see if there are any bounties on ceiling cat*
  • Quote:
    Originally Posted by Kheldarn View Post
    The Character Transfer tool is being pissy. For most people, we can only choose to transfer from Freedom or Virtue.
    hmm. Then somebody needs to go in and give the ole server a whack in the binaries.
  • Quote:
    Originally Posted by NekoAli View Post
    but... where did you find a jester's outfit big enough for that lizard?
    Probably from the Siwwy Puppeh Mart.

    Also, shouldn't you be out-cold?

    Quote:
    Originally Posted by Feyfey
    Rustler!?! Thems fighting words!
    *pulls out the wanted-poster*

    It says right here:

    WANTED

    The Notorious Kitten Rustler FeyDeux

    A bandit calling herself FeyDeux has repeatedly interrupted the annual Kitten-Herd, stealing away valuable kittens for purposes such as, but not limited to, Illegal Snuggling, Conspiracy to deny playing with yarn, and Prevention of Access to Milk and Catnip.

    REWARD: 1,000lbs Catnip or ownership of FeyDeux
  • Quote:
    Originally Posted by Desi_Nova View Post
    why is it the only servers that can apparently transfer over to Test are Freedom and Virtue? are those of us who play on the other servers not worthy of transferring to Test?
    um.

    I successfully transferred Characters from Triumph and Victory a couple nights ago, so I have to ask, what exactly are you on about?
  • je_saist

    New Computer

    Quote:
    Originally Posted by Elric View Post
    When it comes to the video card I would recommend the GTX 460 1GB; overall it's a good card, the ATI 5830 doesn't outperform it by that much, and in some games the 460 outperforms it. http://benchmarkreviews.com/index.ph...=558&Itemid=72 You can take a look at the benchmarking for yourself. You can also see the comparison between it and the 5770.
    Fixed that for you: the GTX 460 competes against the 5830. It does not compete against the 5850: http://www.hardocp.com/article/2010/...tx_460_review/
  • not sure what to tell you.

    Your drivers aren't that horribly out of date, but you might want to update them first.
  • Hmm. Worst PUG Experience ever...

    Well, I've had a lot, like a Katie with a Tank who didn't have Taunt and swore they could hold aggro with melee attacks. Not with Hurricane running and no to-hit buffs, deary. I've been on Task Forces where I've actually kicked the tank for being an idiot. I've been on teams where the Fire/Kins didn't have Increase Density and couldn't figure out why on earth they were expected to have it. I've been on teams where I've had to explain to the Storm that their O2 Boost could stop the Malta sappers' drain. I've been on (old) Posi's where we got to the final mission in Perez Park and then the team leader says we "have" to go to Faultline because we have to go to the Dam.

    There's just a lot to choose from.

    I think, though, the worst P.U.G. I've ever been on is one that I actually used in an explanation of why MMOs don't grant leadership skills.

    Quote:
    City of Heroes then, as a more typical MMORPG, also represents empirical data that simply playing an MMO, even for a long time, doesn't grant management skills. Very recently I ran a Cap Au Diable Strike Force with a rather... obnoxious player. One of the basics of City of Heroes is that a team leader is signified by a star over their name. The starred player can invite new players to the team, kick current players from the team, and select new missions without actually having to be at the mission door. However, just because somebody has the star doesn't mean that they are actually the team leader, or the one choosing the missions.

    ...

    Going back to the Cap Au Diable SF, the person who put the SF together had started with the star, and early on was directing the fight... which was a good thing. The player knew how to work the SF. After a couple of disconnects, I wound up with the star, and promptly kept asking the experienced villain-side player what to do and where to go next... only to get no response. As far as that player was concerned, whoever had the star made the choices, no matter what... and then after the SF the player went off on a rant that I hadn't "abdicated" my position, and that if I hadn't wanted to lead the Strike Force I should have just quit, logged out, and given somebody else the star.
    In the original post I called this particular player out directly for this rampant display of abject... psychosis. I think the mods would get mad if I did that again.
  • je_saist

    New Computer

    Quote:
    Originally Posted by Hyperstrike View Post
    Christ, up until about a year or so ago, it was still possible to find Rage Pro cards!

    Tangential data here... AMD actually has somebody on staff whose job is to make sure the X.org ATi driver works with Rage 128 cards on newer Linux kernels. The Rage chipset is still used in some "modern" server motherboards.

    Quote:
    Um, that was just a mistake on my part. So are you recommending that I switch it back to the DDR3-1600? I do apologize, but what do you mean by "the 1600MHz setting is unlocked"?
    Starting with the i7, Intel copied AMD's 2001 HAMMER architecture design and moved the memory controller on-board the processor. The advantages of this are much lower memory latencies, lower power consumption, less complex motherboards, less expensive motherboard manufacturing, and so on.

    The disadvantage is that if a new memory type or memory speed comes along, you need an entirely new processor to take advantage of it. Case in point: the i7-870 that was the default in your second CyberPower link only supports DDR3-1066/1333: http://ark.intel.com/Product.aspx?id=41315

    However, because you are changing memory speed, and not memory type, you can overclock the processor's memory controller to the higher rated speed. Now, I haven't actually had any of the i7-8xx series in here, so I don't know if what Father Xmas says is true. I don't know if an i7-870 will do DDR3-1600 by default.

    I would suspect that if DDR3-1600 is exposed as a memory option, it's the motherboard applying a behind-the-scenes overclock to the processor.
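
    For a rough sense of what's actually at stake, here's the back-of-the-envelope math, assuming the usual 8 bytes per transfer per channel and ideal conditions: DDR3-1333 peaks at about 1333 MT/s x 8 B, or roughly 10.7 GB/s per channel, while DDR3-1600 peaks at 1600 MT/s x 8 B = 12.8 GB/s. That's about a 20% bump in theoretical peak bandwidth; real-world gains are usually a good deal smaller.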
  • *flops down atop a roof and pulls out a sniper rifle, surveying the area*

    Hmm... I see the Ali Kat Kid to the north... Butcher Becky to the west... and the dreaded Rustler Fey Fey to the south...

    *decides to try for the Ali bounty first and takes careful aim, then fires a tranquilizer dart towards the Ali Kat*
  • Quote:
    Originally Posted by Shakhra View Post
    How much of an XP nerf is 1 NPC? What is this upcoming "Fix" going to be?
    anybody who could tell you that would have the mods canceling their account for breaking NDA.
  • *sighs*

    servers finally go back up... and Pocket D 1 is already too full to accept anybody else.
  • *strolls into town wearing a stampede duster*

    Howdy there folks. I'm looking for a couple of kitty girls that been said to be round these parts. Yall done seen the dreaded Ali Cat-Kid and Butcher Becky?
  • http://www.phoronix.com/scan.php?pag...item&px=ODQ2NQ

    http://developer.nvidia.com/object/opengl_driver.html

    I don't have a Fermi card on hand to work this, but I know there are some players out there with Fermi-class cards. If some of the GTX 4xx users could make sure that CoH still works against the upcoming driver changes, that would probably be appreciated by the developers.

    Bugs can be filed through: https://nvdeveloper.nvidia.com/
  • Quote:
    Originally Posted by Swift_Frost View Post
    i get in because i r has "I'm Malkore pass"
    *burns the pass*

    now... where were we... oh yes. Getting you prepared for your date with Ghost Widow. Now, would you like her to just destroy your soul or hold it captive?
  • Quote:
    Originally Posted by ShoNuff View Post
    Didn't Frank Miller write some of the best Daredevil stories during the era he was writing?
    oh, Frank Miller CAN write. e.g. Dark Knight.

    It's more that he has descended into some form of insanity since those days.
  • je_saist

    New Computer

    Quote:
    Originally Posted by Residentx10 View Post
    I'm replying to make you feel welcomed. The system you're spec'ing should have no problems with CoH/GR. The video card is plenty fast enough. Are you getting a 4 or 8 core system? Here's a link with some good information on what to expect with Ultra mode:

    http://boards.cityofheroes.com/showthread.php?t=217610
    ... 8 cores?

    Neither Intel nor AMD has any 8-core processors on the market. AMD does have some 6-core processors on the market.

    If you were referring to the i7, those aren't cores. Intel's i7 is a quad-core processor that leverages Simultaneous Multi-Threading (S.M.T.) to expose each physical core to the operating system as two logical processors. A logical processor is not a full extra core: it shares the physical core's execution units, and it largely only delivers additional performance when the paired threads are performing different types of computations and aren't fighting over the same units.

    Most consumer-oriented workloads barely leverage Symmetric Multi-Processing (S.M.P.) to begin with, much less include the extra code needed to dictate multi-threaded processing loads.
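
    You can actually watch this illusion from the operating system's side. Here's a minimal C sketch for Windows (XP SP3 or later); on an i7-870 it should report 8 logical processors but only 4 physical cores:

        #include <windows.h>
        #include <stdio.h>
        #include <stdlib.h>

        int main(void)
        {
            /* Windows reports *logical* processors here, SMT included. */
            SYSTEM_INFO si;
            GetSystemInfo(&si);
            printf("Logical processors: %lu\n", si.dwNumberOfProcessors);

            /* Counting actual physical cores takes a second query. */
            DWORD len = 0;
            GetLogicalProcessorInformation(NULL, &len);
            SYSTEM_LOGICAL_PROCESSOR_INFORMATION *info = malloc(len);
            DWORD cores = 0;
            if (info && GetLogicalProcessorInformation(info, &len)) {
                for (DWORD i = 0; i < len / sizeof(*info); i++)
                    if (info[i].Relationship == RelationProcessorCore)
                        cores++;
            }
            printf("Physical cores:     %lu\n", cores);
            free(info);
            return 0;
        }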
  • Quote:
    Originally Posted by Hazmatter View Post
    I think Frank Miller's opinion regarding movies adapted from comics lost all weight with his directorial debut on the Spirit movie...
    ... yeah.... this. I actually felt sorry for Lewis over at TGWTG for having to sit through it. Didn't really feel sorry for that Film Brain guy.
  • je_saist

    New Computer

    Quote:
    Originally Posted by ioshie View Post
    Did I say something wrong? I can't believe nobody has any opinions
    oh, I've got opinions. Just do a quick search for Nvidia against my user-name.

    The GTX 460 is the first Nvidia card since the RadeonHD 4850 launched that is probably "worth" the cash to look at. Its performance-per-watt isn't far off that of the RadeonHDs, it's not drastically more expensive than the RadeonHDs, and most GTX 460s overclock. Most 5830s... don't.

    That being said, I'm a little concerned looking at that site. A GeForce GTX 460 with 1GB of memory starts around $230 retail. A 1GB RadeonHD 5830 is significantly cheaper. On the site you list, they are selling the 1GB GTX 460 and the HD 5830 for the same price...

    In addition, that site has a $91 split between the GTX 460 and the RadeonHD 5850. The street price of a 5850 is $290, which is only a $60 split from the 1GB 460.

    So the site you link to is either overcharging you for the hardware in general, or taking a drastic hit on Nvidia cards to move inventory.

    If you're purchasing the cards at retail, the 5850 is worth the $60 over the 1GB GTX 460.

    As to the Motherboard... if you are going to be buying an Intel system, go ahead and spend out for a motherboard that will do both Crossfire and SLI.

    If you do that you'll be set for multi-gpu regardless of who you get your multi-gpu from.