Arcanaville

  1. Quote:
    Originally Posted by StratoNexus View Post
    Is the average or outlier "performance" of defenders and corruptors likely to be so high with increased base damage mods that it trumps fun blasts?
    It wouldn't, if you could give a definition of "fun blasts" that:

    1. Most defenders would agree were fun.
    2. Most blasters would agree didn't trivialize theirs.

    Blasters and Defenders are placed more or less as far away as possible without (deliberately) making Blasters too strong at the high end or making Defenders too weak at the low end. We could debate the numbers, but I think that's the real issue: Defender damage is as high as it needs to be, not as high as it possibly can be. And that's not likely to change without someone making an extremely deft argument that causes a major reevaluation of archetype definitions.

    It's not impossible: a similar seismic shift occurred with Dominators. I think the argument might be a little harder to make for Defenders. But I think people were far more successful arguing that the archetype concept for Dominators was wobbly than they ever were arguing that Dominator numbers "were not fun." Because there aren't really any fun and unfun numbers.
  2. Quote:
    Originally Posted by StratoNexus View Post
    It strikes me as odd that they are concerned about top end, edge case performance when it comes to possible improvements for defenders and corruptors and yet when releasing a new set for an AT that is top end and edge case already, they feel free to dance on that edge with abandon.
    Perhaps it seems odd because the devs don't see it that way, because they don't agree with all of the assumptions in that statement. For example we don't know that - for the average player - Masterminds have consistently superior performance.

    The devs are always concerned about at least two things simultaneously when thinking about changes such as this. First: does the change affect the performance of average or typical players in roughly the way desired, plus or minus some spread? Second: does the change affect maximal or extremal performance too much to be considered practical? The first criterion is a measure of the change's effectiveness in performing the desired task. The second criterion is a guard rail that disqualifies certain changes that have unacceptable side effects.

    When the devs say they don't "balance" things based on extreme performance cases, that's true only in one context: targeting changes. They still think about them when it comes to design limits. Given two possible changes, one of which addresses 50% of the problem, and the other of which addresses 90% of the problem but also causes extreme problems in a few edge cases, the devs are more likely to choose the former over the latter, with some exceptions to that rule. The edge cases don't affect the devs' judgment on balance, but they do factor into decisions about whether to do something at all, balanced or not.

    While I know the basics of the rules here, most of them are subject to judgment. I can't predict with certainty how Castle (or any other dev) will interpret and apply those rules. But just because I can't predict this doesn't mean it's completely inconsistent from the perspective of the devs. It's the application of judgment that makes the execution of these design rules sometimes seem inconsistent from an outside perspective.

    (It's also the case that MMOs are collaborations. Castle might decide one way today, he might get overridden by War Witch or Positron tomorrow, he might allow Synapse to make a call different from his own the next day, and then they might all have a meeting and come to a consensus decision about something that is a blend of - but not entirely representing any one of - all of their sensibilities, all as part of the process of collaboration. You don't get to have one single vision for every single detail of game design in a game this big, and few game designers would likely work for someone who tried to impose one.)
  3. Quote:
    Originally Posted by Another_Fan View Post
    Then how bout some actual insight into where the game is going in terms of balance.
    One day a farmer in Texas finds himself in the middle of a heavy rainstorm. The local river is overflowing its banks and the entire area begins to flood. As the water begins to reach the farmer's front porch, a police officer shows up. The officer says "come with me, I'll take you to safety."

    The farmer crosses his arms and says stubbornly, "I have faith: God will save me."

    The police officer drives away. The water continues to rise and reaches the second floor of his house. A boat comes up to his window and the man in the boat says to the farmer, "Jump aboard, I'll save you."

    The farmer again says, "I have faith: God will save me."

    The boat motors away. The water continues to rise and reaches up to the roof. As the farmer stands on the roof a helicopter flies over and drops a ladder to him. The pilot yells down to the farmer "climb up the ladder and I'll save you."

    The farmer yells back "I have faith: God will save me."

    The helicopter flies away. The water continues to rise and sweeps the farmer off the roof, drowning him.

    The farmer finds himself in heaven. God sees him there and says "what are you doing here?"

    And the farmer says "I put my faith in You and you let me drown."

    And God says, "What do you mean, I let you drown? I sent you a police officer, a boat, and a helicopter, what more did you need?"
  4. Quote:
    Originally Posted by Johnny_Butane View Post
    I think for some things, a lot of it would have to be done by hand. There's a reason those figure prints take a month or more to create.
    I believe what figureprints does is make a database of all the parts, figure out which ones are printable and which ones are not, and automatically replaces the unprintables with their own versions of ones that are. That way, they can do it automatically.

    Although the figures can take a month to turn around, I think they've said it only takes about a week to actually generate them (there is a backlog), and I believe a lot of that is just the process. It can take all day to print one (although they can print more than one at a time) and there is a post process to harden them. Then there is some touchup by hand. But I don't think they literally hand-edit the actual geometry for each individual model. If you think about it, the problem isn't time, it's money. The cost for someone to operate a printing machine is probably low; the cost to pay an actual trained 3D modeler with both aesthetic sense and knowledge of what is printable would likely be so high that if he or she spent any significant time on the model, their time would cost more than the model normally sells for.

    We (CoH) actually have a weird advantage in that area. We have a costume save file format. We could literally look at a simple text file and figure out which parts were printable and which were not. If we decide that hair style 5 is not printable, we could replace it in that text file with a batch script and then feed it back into the game, re-pose it, and then extract for printing. We would need someone to make printable versions of everything that is not printable, which would be time consuming, but each conversion would only have to be done once. You could also do the conversions as people order models that require them, and reuse each one for all future models.
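    A minimal sketch of that batch-substitution idea. The file format, part names, and substitution table here are all invented for illustration - the real CoH costume format is different - but the mechanism is the same: scan the saved costume, and swap any part known to be unprintable for a printable stand-in.

```python
# Hypothetical sketch: replace unprintable costume parts in a saved
# costume file with printable stand-ins. Part names and the "Part X"
# line format are invented for illustration, not the real CoH format.

# Assumed data: map of unprintable parts -> printable replacements.
PRINTABLE_SUBSTITUTES = {
    "Hair_Style_5": "Hair_Style_5_Printable",
    "Glasses_Thin": "Glasses_Thick",
}

def make_printable(costume_lines):
    """Return costume lines with any unprintable parts swapped out."""
    fixed = []
    for line in costume_lines:
        if line.startswith("Part "):
            _, name = line.split(" ", 1)
            if name.strip() in PRINTABLE_SUBSTITUTES:
                line = "Part " + PRINTABLE_SUBSTITUTES[name.strip()]
        fixed.append(line)
    return fixed

costume = ["Part Hair_Style_5", "Part Chest_Tights", "Scale 1.0"]
print(make_printable(costume))
```

    The edited file could then be fed back into the game for re-posing and extraction, which is what makes the whole pipeline automatable.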


    Quote:
    I wouldn't mind one bit hypothetically being able to bring the geometry into Max to see what I could do using it's smoothing and geometry tools. One could hypothetically send a link to an exported .OBJ to my PM box.
    Hypothetically speaking, I'll see what I can do there.


    Quote:
    There's also the question of, artistically, where does one draw the line at smoothing. If you look at a figure like this one: http://cf.figureprints.com/media/photo084.jpg
    They left the shoulder armor and his pony tail relatively unsmoothed and "low poly" looking. The character's who are holding mugs are still holding six-sided cylinders. It's clear they've added geometry to areas that needed it for the printing process, but left the look from the game intact. In the case of your model, I think that would include keeping the point on the skirt, if you see what I'm saying.

    What would players expect? Myself, I'd want the integrity of the character maintained as much as possible, warts and all. That means blocky mitten hands and putting up with the limitations of the animation skeleton causing odd deformation like in the skirt. I think it would have to be a point of discussion where the dev's and players' desired level of quality would be and what exactly they value, accuracy, speed or pure aesthetics or a combination of only two.
    All good questions, and outside my expertise to render an opinion on. Personally, I would be willing to tolerate some artistic license where the technology placed absolute limits (say, hair, capes, skirts) or where the 3D nature of the models required some reasonable tuning (turning texture-only tops into slight relief maps so the clothes actually look like clothes and not dyed skin). And I'm willing to lose certain exotics, at least initially (i.e. combat auras).
  5. Quote:
    Originally Posted by Kheldarn View Post
    I still think Arcanaville is really the mastermind behind CoH/CoV, telling everyone what to do. She is controlling transmission.
    I'm more of the Illusion controller behind CoH/CoV. I tell everyone what to do, and then they ignore me and go off in completely random directions attacking whatever they want to or in at least one case trying very hard to scare everything in sight.
  6. Quote:
    Originally Posted by Castle View Post
    Quote:
    Originally Posted by Another_Fan View Post
    Amusing, not saying I agree with the neg rep, but 11 paragraphs to say what fun is ?

    If Castle had of said it, it would have been official and we would have a guideline of what to expect. As it stands there is now an interpretation of an interpretation that given recent inclusions in the game, and nerfs to things in the game is not shared or well communicated amongst all the developers.

    And if you refer back to that post in any discussion you can expect "Thats what she said" as the response.
    That's what she said...and it was pretty darned accurate both in the original post and the followup. The truth is there isn't one balancing factor but dozens; many of which Arcanaville pointed out.
    At least it's now official and a guideline of what to expect in the future:

    Arcanaville is "pretty darned accurate" and "points to the truth" raves Paragon Studios official spokesperson Castle.


    I'd like that to be my new forum title if it will fit: "Pretty darned accurate truth-pointer."


    (Thank god Castle saw this post before BaB, or I might have been "a fun ride for the whole family" instead.)
  7. Quote:
    Originally Posted by Lemur Lad View Post
    I should think that the length of your answer would be a clear indicator of why the question wasn't answered fully in the QA. It's probably pretty much the answer a dev would give if they had been tasked to do a memo on it, but the fact that your name isn't red means someone gets to ding you, whereas if it had been Castle it would have 2 pages of kudos.
    Actually, that's a short answer. If Castle was tasked with writing a memo on it, it would probably be about fifty pages long, a third of which would then be redacted.

    What I didn't cover are difficulty thresholds, level progression, reward table construction, power design requirements, powerset progression rules, archetype balance, combat soloing thresholds, rank requirements, late-game difficulty shift, difficulty sliders, teaming reward modifiers, power caps and constraints, invention balance requirements, content repeatability, performance and balance margins, special case power restrictions, and critter AI limitations. Each of which would take pages of text just to explain what I know about each one, which is probably only about 75% of the total on average. Castle's version would be longer, and his version wouldn't necessarily be 100% complete either (he wouldn't know all of the specifics of reward table balance unless he discussed it internally, as that is not part of his normal day to day responsibilities).

    And while Castle would have probably gotten 2 pages of kudos, he would have also gotten thirty pages of "what do you mean by that," "please explain this with actual numbers," "oh yeah, well explain this then," and "so why don't you fix this already." Which is why Castle is smart enough not to open that can of worms. Usually.

    (Castle must also surely take note of what happens whenever I discuss balance issues - or any game implementation internals in general - on the forums and repeat things I've already discussed with him, only to have them nit-picked to shreds. He's almost certainly thinking "oh hell no: screw that I'm going back to working on Kinetic Melee." If I had a choice, I'd rather work on Kinetic Melee also.)
  8. Quote:
    Originally Posted by Tenzhi View Post
    Which is kind of silly. Nameless minions are nameless minions precisely because they are meaningless opposition.
    There's a term for meaningless opposition in game design: it's called destructible environment.

    The funny thing here is Champions Online did exactly what I said the devs should do way back in '04. Just rename them. In fact, CO has stronger bottom tier critters than we do (imo) but just doesn't call them minions. If the devs just decided to call them "Super Villains, Super Duper Villains, Mega Villains, Ultra Villains, and Galactically Insane Villains" they could sidestep 99% of these objections, and that change could be implemented in about two minutes.
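    The rename-only change is literally just data: leave every number alone and swap the display labels. A hedged sketch - the mapping structure and function are hypothetical, though the joke names are from the paragraph above:

```python
# Sketch of a rank-relabeling table: no stats change, only the names
# players see. The mapping structure is hypothetical; the right-hand
# labels are the joke names suggested in the post.
RANK_DISPLAY_NAMES = {
    "Minion": "Super Villain",
    "Lieutenant": "Super Duper Villain",
    "Boss": "Mega Villain",
    "Elite Boss": "Ultra Villain",
    "Archvillain": "Galactically Insane Villain",
}

def display_rank(internal_rank):
    # Fall back to the internal name for any rank we didn't relabel.
    return RANK_DISPLAY_NAMES.get(internal_rank, internal_rank)

print(display_rank("Minion"))
```

    Since nothing mechanical depends on the display string, this is the sense in which the change "could be implemented in about two minutes."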


    Might be time to resurrect Extreme Hyperdodging next.
  9. Oh my god when did they nerf Windows XP? Did this happen while I was traveling last week on business? I only heard about that volcano that shut down Europe and the entire government of Poland getting wiped out like in a Tom Clancy novel. Strange stuff always happens when I travel: the US invades Afghanistan or Detroit wins a football game, stuff like that. But I haven't upgraded my laptop to Windows 7 yet. Does Office XP at least still work?

    I guess if I can still type in a browser the nerfs can't be too detrime
  10. Quote:
    Originally Posted by Elric View Post
    The Q&A was useless.

    "One of the original balance design goals was that a character should be roughly equivalent to 3 minions or a lieutenant and one minion. That no longer seems to be the goal. What is the design goal today with regard to balance in both PvE and PvP?"

    That deserves more than a 7 word answer.

    I'm sorry,but i can't help but be upset this whole thing. If you can't answer the questions that you know everyone has at the tip of their tongue, then please don't hold a Q&A.
    This ventures into territory the devs can't discuss in too much detail, so they are usually cautious and try to avoid discussing it too deeply. It's also something that they have historically found difficult to articulate precisely, even in areas they are allowed to discuss.

    I'm not as cautious, and extremely wordy.

    First: we were never meant to be "equal" to 3 even minions. Even back when Statesman was making that statement, I discussed with him the fact that the statement was misleading and contradicted other things he said about the game balance. What he said was that 3 even minions were originally meant to be a challenge to players, because that was the standard spawn in a mission (for one player). Obviously, when you are designing a game, you try to generate opposition that is within the player's means to overcome, but generally not trivial to overcome.

    We were meant to be challenged by three minions, but generally win. By definition, that means we were always supposed to be superior to three even minions. Just not so superior that they were meaningless to us.


    Today, that rule is just a whisper of a guideline. The devs now recognize that because of the way the level progression works, the players get more powerful relative to the critters as they get higher. And the devs now *want* the players to get more powerful relative to the critters, to provide the psychological feedback of progress. So we are "balanced against" increasingly higher levels of threat as we get higher.

    But as I said, that's just a guideline. The game balance is based around a set of boundaries that define a range of acceptably balanced performance, rather than a specific "target." To the best of my knowledge, these are the parameters the game is balanced *within*:

    1. Every powerset combination, where the player makes reasonable build choices, should be able to reasonably solo the core story content of the game.

    2. When averaged across all of the players that play the game, every powerset combination should generate performance similar to the average performance of all players playing all powerset combinations, to within a specific range centered on that average. This should be true under a set of different specific circumstances, such as different combat levels.

    3. A player should theoretically be able to level from level one to level 50 by playing and completing some subset - less than 100% - of the core story content. In addition, the amount of time it takes the average player to complete that content should fall within a certain leveling range. Above-average players can level significantly faster without generating unacceptably high performance, up to some unknown (and possibly non-specific) limit.
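    Parameter #2 describes a tolerance band around a population average rather than a point target, which is worth making concrete. A minimal sketch of that kind of check - the 25% band and the numbers are invented for illustration, since the real margins are not public:

```python
# Minimal sketch of parameter #2: a powerset is "balanced" if its
# average performance falls within a band around the all-powersets
# average, not if it hits one exact target. The 25% tolerance here
# is an invented number; the real margins are not public.

def within_balance_band(set_performance, population_average, tolerance=0.25):
    """True if a powerset's average performance sits inside the band."""
    low = population_average * (1.0 - tolerance)
    high = population_average * (1.0 + tolerance)
    return low <= set_performance <= high

# A set at 110% of the population average passes the check;
# a set performing at double the average does not.
print(within_balance_band(110.0, 100.0))
print(within_balance_band(200.0, 100.0))
```

    Framing balance as a band is what makes "there is no single target" a meaningful statement: many different performance levels are all simultaneously acceptable.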


    There are other minor rules here and there I'm aware of that I doubt you'd be interested in. The problem is that I have only a vague sense of the actual numbers involved in #2, and absolutely no idea (except what basic logic tells me) about the numbers involved in #3, and the devs are explicitly barred from discussing either. To the extent that I know anything about those quantities I'm equally barred from discussing them except in vague terms comparable to what the devs have already said in public.


    This is just unfortunately one of those topics for which the devs could give answers, but for most people they cannot give satisfactory ones and would be forced to terminate the discussion at almost every turn. The fact that there are still people who think we were, are, or were ever considered to be "equal to three minions" attests to the difficulty in communicating about this specific topic. That statement was never, ever true, and even when Statesman was here I worked hard to correct that misunderstanding, with only very limited success. It's just a very difficult topic to cover without falling into a lot of misunderstandings and general arguments.


    To the person who rep-commented "you dont need 7 paragraphs every response": this post actually has eleven, but don't worry, it's no problem at all to do more than I need to. I go the extra mile just so you have a safe, anonymous, and harmless way to expend those negative rep points. You're welcome.
  11. Quote:
    Originally Posted by Umbral View Post
    Of course, when you begin putting things in terms of "well, I can live with that" you start getting into the realm of your own personal opinions concerning the capabilities of a set being set as a design standard.
    Absolutely. I never said otherwise. I believe I can eventually justify them, but I'm not in a position to do so now.


    Quote:
    You've outright said that you believe that MA should be the best ST damage set for Scrappers on multiple occasions so of course you wouldn't find it untoward for the set to get a massive (and I do mean "massive"; when you're dropping the required recharge levels from "top end IO build" to "SO build" that's massive) buff to its performance.
    Ah. While you will probably not find this convincing, I'll tell you what I already intend to tell Castle, when I finally get around to formalizing this suggestion.

    I am not creating a massive single target buff for MA. You did, when you decided to alter Storm Kick as a workaround for MA's set design issues.**


    However, the discounts I'm thinking about aren't actually as drastic as you portray them to be. Looking at it from the perspective of what recharge level is specifically necessary to achieve a very specific build is not the appropriate way to judge the change. That's really not relevant from a game balance perspective, because specific builds have very specific requirements. Rather, the question is: for any given build, how much would the change increase the damage output of that build? And a buff to just one or two attacks gets quickly diluted in real chains.

    Put it another way: suppose there was a way to make a build with 89% resistance at low cost, but it took a billion inf of inventions to get the last 1%. And suppose I was proposing a 1% resistance increase for such things. On the one hand, you could say I was cutting the costs to make a 90% resistance build by a billion inf, which is a huge benefit. On the other hand, it's only about a 10% survival increase. The question is: which is the game-balance-significant perspective? And for me, the answer is the latter, not the former, for any game in which performance is not cost-normalized (and this one definitely is not). The former is simply impossible to balance around in the general case.
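    The arithmetic behind that hypothetical: at resistance r you take (1 - r) of incoming damage, so survivability scales as effective hit points, 1 / (1 - r). A worked version (the 89%/90% figures are from the hypothetical above):

```python
# Worked version of the resistance hypothetical: survivability scales
# as effective hit points, 1 / (1 - resistance). Going from 89% to 90%
# resistance cuts incoming damage from 11% to 10% of base - roughly a
# 10% effective-HP increase, despite the enormous build cost of that
# last percentage point.

def effective_hp_multiplier(resistance):
    return 1.0 / (1.0 - resistance)

gain = effective_hp_multiplier(0.90) / effective_hp_multiplier(0.89)
print(round((gain - 1.0) * 100, 1))  # percent survivability increase
```

    This is also why each point of resistance near the cap costs so much on the market relative to what it returns: the survival payoff per point stays modest even as the inf cost explodes.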


    Quote:
    It's pretty obvious that Storm Kick is allowed to break the rules because the rest of the set would be vastly underperforming without it
    Except that's not why. MA was *not* underperforming prior to the Storm Kick buff as the devs define performance. It was actually broken to start by costing too much endurance, and then rather than correct that bug Castle took the opportunity to counter a bug with an exception, and added the enhanced critical chance. And the reason why Castle did *that* is, as I understand it, a combination of MA's long-standing unaddressed design flaws and the fact that BaB's recent animation clean up buffed the surrounding sets more than MA - Storm Kick being one of those rare exceptions that wasn't enough to compensate.

    The Rosetta Stone to understanding these changes is: why Storm Kick? Why not buff the crit chance on Thunder Kick, say, or CAK? Well, the simple fact is Storm Kick was the only MA attack that didn't have a secondary effect. From a pure damage-balance perspective, it would have probably made more sense to buff TK or CAK. But Storm Kick didn't get the crit buff to balance attack chains. It's actually addressing a secondary effect issue (see below).


    Now the question is: would MA with the buff be too powerful, relative to other scrapper primaries? Or rather, would it even outperform them on a damage output basis? It's saying something very weird if you are saying that the buff to MA is *huge* and yet the net result of that huge buff is not that MA ends up outperforming everything else. Or perhaps not so weird.



    ** Just to clarify, while I should not speak for Castle here, I think it's innocuous to state that in my conversations with Castle, he and I are in general agreement that MA has no serious *performance* problems as the devs define performance, but it does not achieve the design goals for which it is intended. This is something I've said many times in the past. That problem either requires that the design goal of MA be changed, or that MA be changed to meet it. But as it is not a performance issue, it's also a consistently low priority for the devs to revisit. Within that context, for years I have advocated one of a number of things. I used to advocate that MA's design intent be changed to include high single target output. But that is an increasingly difficult metric to achieve given other changes to the game. Alternatively, a better set of secondary effects makes more sense these days, and I actually haven't advocated major damage increases for MA in over two years. I did advocate a minor one when the animation changes BaB put in place increased the damage of practically every other set to the point where MA was not even competitive on single target - even though BaB's changes improved MA's single target damage somewhat, they improved almost everyone else's more.
  12. Quote:
    Originally Posted by Techbot Alpha View Post
    I only object to things that I can look at objectively and see a flaw, however slight with. In my opinion (note, Opinion. I never said it was fact, and if I did 'It was late, I was tired' comes into it) the TO-DO-SO system is a bit dated.
    I don't think it's dated, exactly. I think it was always broken that they expire. I know why they expire: because it's gear, and in an MMO gear has to wear out, expire, break, or otherwise force you to buy better gear. But if you ignore that presumption, there's actually no reason for enhancements to expire in City of Heroes.

    However, if the devs were to wipe them out and replace them with common IOs, I wouldn't complain.
  13. Quote:
    Originally Posted by LostHalo View Post
    An inherent problem I see with your suggestion here is that it seems to over-complicate itself quickly, which doesn't bode well for it as a solid solution. As much as I hate the general dev methodology of blanket solutions to problems, this is a valid place for it. Just a couple of ideas I idly suggest would be some sort of "momentum" endurance discount (more actions taken over time, the larger endurance discount granted) or "breather" end. recovery bonus (more time idle, the faster your recovery). Neither are really viable but suffer less issues with needing special exceptions thrown in all over to yours and the thread's suggestions. Of course, this is all under the assumption there is a problem.
    Suppose we discover, after very careful analysis and testing, that the blanket solution does different and undesirable things to controllers relative to defenders, say, or dominators relative to masterminds. Or even energy manipulation relative to devices. The problem with "blanket" solutions is that it's virtually *never* true that you get lucky enough that the blanket just happens to cover exactly what you want to cover, and nothing you don't. It just never happens.

    You use blanket solutions not to find a solution to a set of problems, but rather to *impose* a rule that resolves a problem by fiat.

    When pervasive criticals were given to scrapper primaries, that didn't precisely affect all of those primaries in exactly the same way. Some got better burst damage than others for example. Some had different issues with AoE than others. But criticals were not specifically intended to produce the exact, precise, absolutely numerically identical effect across all scrapper primaries. Criticals became the new paradigm for scrapper primaries, and under that new paradigm different sets behaved in slightly different ways, but ways that the new paradigm endorsed as intentional.
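    The pervasive-critical paradigm can be stated in one line of math: a crit chance c that doubles damage multiplies *expected* damage by (1 + c) for every set, even while burst profiles differ. A sketch - the double-damage crit mechanic matches how scrapper crits work, but the sample numbers are purely illustrative:

```python
# Expected-damage view of pervasive criticals: a chance `c` of adding
# +100% damage on a hit raises average damage by a factor of (1 + c).
# The multiplier is uniform across sets even though burst behavior is
# not. Sample numbers below are illustrative only.

def expected_damage(base_damage, crit_chance, crit_bonus=1.0):
    """Average damage per hit given a chance of bonus crit damage."""
    return base_damage * (1.0 + crit_chance * crit_bonus)

# A 100-damage hit with a 5% double-damage crit averages 105 damage.
print(expected_damage(100.0, 0.05))
```

    That uniform multiplier is exactly the sense in which criticals are a paradigm rather than a per-set tuning knob: the variation between sets lives in the base numbers and burst timing, not in the rule itself.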

    In a sense, my idea (I wouldn't quite call it a proposal, because it isn't fully formed yet) is also a blanket solution in that it declares a global change within a specific domain: it suggests applying a discount (and not "changing the formula," which is something else entirely: Widows and Claws change the formula) to all of the first three available powers. But the difference between that idea and the idea of applying global endurance and/or recharge discounts across the board is that mine is tweakable between archetypes and between powersets if that turns out to be necessary. Global level-based buffs are not. But separate from that, I believe (but can't prove) that, perhaps with minor adjustments, the low tier power change I mention could be the new paradigm for powerset design. A global low level buff could also be one, but in that case it's a paradigm I don't endorse. Because I don't, from my perspective it does not have fewer issues: it has a lot more - because it creates consequences I don't always agree with. You may disagree, of course.

    You bring up the separate issue of whether there actually is a problem to solve. I'm not certain there is one either. But I'm not thinking about this in terms of whether there is a literal problem to solve. I'm thinking about it more in terms of whether the current game design, after taking a step back and seeing where it is now versus where it was at launch, makes more sense with the discounts I'm proposing than without them.


    There are lots of things in the game for which it's difficult to prove there is a "problem" in the strict sense of the word, but clearly there is a sense that something somewhere is wrong. Take DPE. At one time, prior to ED, the different archetypes had different DPE factors - endurance per point of damage scale - built into the attack power balance formulas. Now they all have the same one: 5.2 endurance per point of damage scale. Was it wrong before, and correct now? Was it correct before, and wrong now? Is it even possible that it doesn't actually matter? That seems unlikely.

    Besides this low-tier idea, there's another one that I've been kicking around for a very long time that I think is related to it, although independent of it. When the devs want something to have "more offense" they increase its damage modifiers. Doing so increases the dps output of that archetype. If they want it to have less, they reduce that modifier. That reduces the dps output of that archetype. But it *also* increases the *cost* of that offense in endurance. A conjecture I believe to be true but cannot prove is that the reduction of DPE that is coupled with the reduction in DPS is unintentional relative to the balance guidelines the game attempts to follow (even if it is otherwise intended by the dev that makes the change).

    Does that mean there is a problem? That's harder to prove. But I believe the game would function better if that issue were untangled, since I believe it's a hidden inconsistency in the game.

    I think the low tier powers contain a separate, but much more subtle design inconsistency. But I'm not fully prepared to make an air-tight case for that yet. I'm just mentioning that I'm thinking about it.
  14. Quote:
    Originally Posted by Johnny_Butane View Post
    The "blockiness" of those models is due to the smoothing groups being lost. That's a minor thing to fix with a 3d program. Not a big issue. Stuff like the angled point the skirt makes in the Grief emote shot, that's trickier but no big deal.

    Looking at the character with the tail, modeling work would have to be done to the hair and glove fins for sure. The tail I think would cause a lot of problems when it came time to print. Anything long and thin like that is going to have to be thickened up, and is still likely to snap off when an artist is painting the thing.

    The character with the skirt, again the hair would need work and most likely the inside of the skirt would have to be filled in and the bottom edge chamfered to give it thickness.
    We might get lucky on some of this, although not all of it. The right set of Catmull-Clark subsurf parameters and iterations would smooth out the mesh and also eliminate some of the finer (and impossible to print) details like in hair:


    (The slightly washed out look on the chest of the model is due to the fact that I had no patience for the light source positioning in Blender and just added a bunch of suns in the sky).

    At the moment, I'm less interested in what could be done by hand, and more interested in what could be done in a relatively automated fashion. Fortunately, I can't do anything by hand, so I'm the perfect human analog to a batch script. That model is almost printable (the skirt is probably still too thin, but the point you mentioned is largely smoothed out now) and it could be generated automatically. It's still a) half-naked, b) sporting details that would almost certainly break off or not print (glasses), and c) colorless, but you could almost imagine it being a lead figurine.

    Hypothetically, of course.


    (Those mitten hands, though, make expressive gesticulation poses just a tad distracting.)
  15. Quote:
    Originally Posted by Techbot Alpha View Post
    Beleive me, Brand, I've taken all possibles into consideration.
    And frankly, before 3 SO slotted Stamina, MA/SR is the endurance wh0re from hell. Yes, it does high damage. But it brains end like no other combo I've played. Give it a shot, and see if I really am doing something stupidly wrong.
    Ah, this never gets old. Is it possible to run MA/SR without stamina? Well, yes and no:

    MA/SR, no stamina vs two Death Mages

    MA/SR, no stamina, deliberately trying to run out of endurance in Portal courtyard

    That MA/SR build has all the toggles, continuously runs combat jumping, and in the second video runs sprint as well, and still has great offense (for its time - it's basically ED soft-capped on damage on all attacks). And it uses Aid Self, because it's running around with only 34% defense (which, in I8, was better than average for SR).


    On the one hand, these are circa-I8 vids, pre-inventions. That means not just no stamina, but no recovery IOs of any kind. And only 30-something defense.

    On the other hand, the build does have more HOs than ... well, I won't go there.

    On the third hand, any invention build, even with common IOs, could probably get close to this, and with cheap set IOs could probably surpass what HOs could do then. This build would cost nothing to make today, and could be exceeded by a wide margin with pocket change.

    On the fourth hand, it is a level 50 build. And it does, eventually, run out of endurance.

    On the prehensile tail, this is higher performance than the game is balanced for, and probably higher performance than the average MA/SR player has today, even with much better options at their disposal *and* stamina.


    Basically, I made these videos a few years back when the whole "everyone needs stamina" thing was a common forum occurrence. The videos are not intended to show that it *is* or *is not* necessary, because "necessary" is a relative term. I made them to show what is *possible* with the powerset combination many think is among the worst endurance-burn offenders. If those videos show "reasonable" performance, I don't think you need stamina (at least past the 20s, when SOs become available and you can slot heavily for endred, or use easy-to-get but better-than-SO invention sets). If you *need* far better performance, then you probably *need* stamina as well.

    The larger point is that whenever people talk about the need for stamina, or endurance management in general, they rarely seem to be speaking within the same context. What is "underpowered and unacceptable" to one player is often "wow, I didn't even know you could do that" to another player. It's rare for anyone who discusses or analyzes the endurance management situation in the game to try to illustrate what they mean by "acceptable" and "unacceptable" performance, except in small corner cases. I was trying to see where people fall in the "this is acceptable/this is unacceptable" range with an objective data point or two (or three).


    You know, I actually miss that build. I made it just for fun to test the theory, and ended up playing it well past I9 even when better slotting options became available (I didn't change the build until I think I11 on live, although I did have lots of better builds on test).
  16. Quote:
    Originally Posted by Samuel_Tow View Post
    On the note of "Windows knows best," no. Just no. Windows does very much NOT know best. Windows, especially Windows 7, is built under the assumption that I'm some kind of knuckle-dragging idiot who's liable to poke his eye on the sharp corners of a drop-down menu, so all of those are hidden and replaced with large, colourful icons to hold my child-like, limited attention span focused. I'm not supposed to know how my computer operates because I'm too dumb to know what to do with that knowledge, so all my system control functions are buried under a mountain of menus. I still remember having to download a service pack "for IT specialists," which so hideously complicated... That it was a self-extracting self-installer I had to run and do nothing else.

    As such, I have precisely ZERO trust for Microsoft's ability to predict what decent settings are or what's good enough for me. In the case of CPU parking, the idea is for the technology to conserve power. Well, fat load of good this does me on a rig with a beefy power unit and constantly hooked up to a UPS unit. No offence, but if I wanted to save the planet, there are plenty of OTHER things I could do. I pay for my power consumption, I'm not on a limited-span battery, so who gives a crap about a marginal gain in power efficiency? If it's not dangerous to the hardware and doesn't increase equipment fatigue, then I very much do not care.

    That said, I'm still going to go listen to you guys and not mess with it. Having a switch I can flip if I so desired is one thing, and I wouldn't have even made a point about it, but when I have to dig into my registry to alter hardware driver flags... That's not something I'm going to do just for no reason, even if it irritates me that something like this is done for my "benefit." I realise it probably won't help speed City of Heroes up, and yes, I realise that's just four physical cores with two logical cores each (hence why I said eight logical cores total), but I'd still disable this if a less invasive option were introduced for it.

    So I guess the consensus is to leave it alone, then?
    On the subject of CPU parking: the purpose of parking isn't really to save power. It's actually a method whereby Windows 7 can detect which of the eight pseudocores the i7 presents to the OS are really physical cores, and which are hyperthreaded cores. Windows 7 then "parks" the hyperthread cores, which tells the Windows 7 scheduler not to use them unless the first four physical cores get more or less maxed out.

    This is almost *always* the right decision. Hyperthreading works differently for P4s and the newer Core-iXs, but there are certain gross similarities. In particular, ultimately there aren't two complete cores there: there is one core in which not all of the resources are being used simultaneously.

    To a first-order approximation, think of a hyperthreaded core as one full core, plus a partial core made up of the left-over bits of the full core. Obviously, as the full core runs, the left-over bits change dynamically over time, so the left-over core's effective power fluctuates.

    You could sort of think of the two cores as one core running at 2.8 GHz (say) and the other running at a variable clock rate that swings wildly between maybe 0.5 GHz and 1.4 GHz.

    So your i7 has four 2.8 GHz cores, and four cores with wildly variable and slower clocks. Which do you use first? Is it *ever* a good idea to use one of the left-over cores if any of the full cores are sitting idle? Generally not.

    And it gets worse. Using a left-over core can, sometimes, temporarily slow down the full core. If the left-over core grabs a resource and doesn't give it back fast enough, the full core can actually slow down waiting for those resources to come back.

    (That's not how it really works, but it's a useful mental model: on Nehalems, to the best of my knowledge, hyperthreads are time-slice multiplexed onto the shared core resources, which means the threads share resources; there might be a priority execution thread, but it's not quite so binary.)

    So if you have four full cores running at, say, 50% utilization each, and four hyperthreaded cores unused, it is probably still better to schedule a new thread that needs 10% of a core onto one of the "full" cores, even though they are 50% loaded and the hyperthread cores are completely empty.

    Vista is *not* hyperthread-aware, and will schedule all eight cores completely randomly, without regard to which pairs of pseudocores are actually sharing a single physical core. So out of eight cores, Vista might use four, and accidentally use both threads from a single core, using only two of the four physical cores and leaving the other two empty. Or it might try to spread the load out among all eight cores, causing four of the cores to actually slow down the other four in the process.

    In this case, not much intelligence is necessary to pick the right thing to do, except in weird corner cases. The correct thing to do is usually to execute one thread per core until you start to run out of CPU, then switch to running two threads per core with hyperthreading. It almost never makes sense to take a relatively low-utilization CPU and try to split its load into hyperthreads.
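    A rough sketch of what hyperthread-aware scheduling buys you, in Python. The threshold, core numbering, and the decision rule itself are all made up for illustration; the real Windows 7 scheduler is far more sophisticated than this:

```python
def pick_core(loads, physical, siblings, threshold=0.8):
    """Pick a core for a new thread, preferring physical cores.

    loads:    dict mapping core id -> utilization (0.0 to 1.0)
    physical: core ids of the full physical cores
    siblings: core ids of their (parked) hyperthread siblings
    """
    # Spill onto a hyperthread sibling only once every physical core
    # is loaded above the threshold.
    free_phys = [c for c in physical if loads[c] < threshold]
    if free_phys:
        return min(free_phys, key=lambda c: loads[c])
    return min(siblings, key=lambda c: loads[c])
```

    With this rule, a thread needing 10% of a core still lands on a half-loaded physical core rather than an empty hyperthread sibling, which matches the "parking" behavior described above.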

    (The practical difference between P4 hyperthreading and Core-iX hyperthreading as I understand it is that because the P4 was sharing a superpipeline, hyperthreading penalties could be high in many cases. That's less likely to happen in the Core-iX architecture, which is not superpipelined and is sharing wide resources, not deep ones.)
  17. Quote:
    Originally Posted by Kheldarn View Post
    Most of those are Urban Legends...
    Two things I've personally seen:

    1. I've never seen anyone ask about the "any key." But I *have* seen people ask where the "space bar" is (think about it).

    2. I know someone who actually had to fix the problem where somehow a screencap of a program ended up being their desktop background, and the user couldn't figure out why the program wasn't working.

    Also, I've never met someone who thought the CD drive was a cup holder, but I did meet someone once who had three CD drives installed in their machine, because, and I'm not making this up, he owned three programs that came on disc. Each program CD was permanently mounted on its own drive.
  18. Quote:
    Originally Posted by Eek a Mouse View Post
    Is that, this?

    "Experience: You can now handpick the powers for custom enemies and still receive experience for them (within game balance limits)."
    It's this:

    Quote:
    We are making adjustments to allow players to handpick the powers for custom enemies and still receive experience for them, but not be able to exploit weaker enemies for unfair experience gain. Now, every power for custom critters has been weighted individually.
    Coulomb2 has an excellent write up of the basics of the system in the second post of that thread.
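    As a purely hypothetical illustration of "every power weighted individually" (the actual MA formula and weights were never published in this form; the function and numbers below are invented):

```python
def xp_multiplier(picked_weights, standard_total=10.0):
    """Hypothetical sketch: XP scales with the summed weights of the
    powers the creator picked, relative to a notional 'standard'
    loadout, capped at 1.0 so stacked powers don't grant bonus XP."""
    return min(1.0, sum(picked_weights) / standard_total)
```

    The idea it illustrates is just that a critter stripped down to weak powers earns proportionally less XP, rather than the old all-or-nothing rule.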
  19. Quote:
    Originally Posted by Umbral View Post
    The assumption that all powers would behave in the same manner is fundamentally flawed because of this.
    Actually, I don't make that assumption. What I said, in fact, is that the change does basically what I want it to do. For example, take one of the more extreme cases: Storm Kick, one of the best tier 2 melee attacks around. In cutting the recharge and endurance of that power, you're assuming that I think that will have no effect at all on an MA past level 20. I'm not assuming that. Rather, I believe that, for every powerset I've examined (and I haven't looked at them all yet, which is what I meant when I said I hadn't fully done all the legwork yet), the change is something whose effects I can live with.

    Storm Kick doesn't actually follow the rules *now*. It costs too much endurance for its base damage, and conversely it does more critical damage than any other tier 2 scrapper attack. It's allowed to break the rules for a reason. Before you conclude that my change would break Storm Kick unacceptably, you should consider why Storm Kick is allowed to break the rules now.

    I'm actually more concerned about Blind than Storm Kick, but as I said, I haven't closed the loop on all of the special cases yet.
  20. Quote:
    Originally Posted by steveb View Post
    CPUs don't run faster than GPUs. GPUs process their information significantly faster than CPUs and frequently have to wait on their instruction sets from the CPU. Creating video graphics essentially comes down to massive number crunching, which GPUs do much more efficiently than CPUs. This is why a mid level GPU whose core clock is at 500MHz can run ten times the number of Folding@Home calculations in a day that a high level CPU with a clock speed of 3GHz can, and why the same mid level GPU can transcode video in far less time than that same high level CPU.
    That's not relevant to what I said. CPUs don't *do* the same things GPUs do: it's entirely possible for a CPU to do the work a game demands of it much faster than the GPU of the same computer can do the work the game demands of it, frame by frame. When that happens, the CPU is outpacing the GPU, and a better video card would speed you up: you're GPU-bound in that case, not CPU-bound.

    Core clock is completely worthless as a comparison between standard CPUs and GPUs, because current GPUs are basically SIMD machines these days.

    But that has nothing to do with whether the CPU or the GPU is the bottleneck in your performance.
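    To put rough numbers on the clock-versus-lanes point (the unit counts below are invented for illustration, not real part specs):

```python
def peak_gflops(clock_ghz, lanes, flops_per_lane_per_cycle):
    """Peak throughput scales with clock times parallel lanes, which is
    why comparing raw clock speeds across architectures is meaningless."""
    return clock_ghz * lanes * flops_per_lane_per_cycle

cpu = peak_gflops(3.0, 4, 2)    # 3 GHz, 4 cores, ~2 FLOPs/cycle each
gpu = peak_gflops(0.5, 240, 2)  # 500 MHz, 240 shader ALUs, fused mul-add
```

    A GPU at one-sixth the clock still posts ten times the raw arithmetic throughput here - and none of that says anything about which chip is the bottleneck for a given game.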


    Quote:
    The laws of physics are really quite simple here. Regardless of how saturated a processing thread is, calculations being run at 3GHz are happening faster than calculations being run at 2GHz, and thus instruction sets get transferred from the CPU to the GPU faster. The more time that the GPU is processing instruction sets from the CPU, the higher a game's frame rate is. The more time that a GPU spends waiting on instruction sets, the lower that the frame rate is.

    GTA 4 is the perfect example here as it is a heavily CPU demanding game: a Core i7 920 and GeForce GTX 285, both at stock clocks, will run the game with maxed out settings at 1920x1200 at an average frame rate of roughly 45-50 frames per second. The game in fact will never saturate any thread more than 20 to 30%, and will only run two on threads. At a clock speed of 3.4GHz, the flood gates break open and the frame rate will jump up to a consistent 60 FPS (never tried it without Vsync, so I'm not sure what the max is). Until that processing speed is reached by the CPU, the GPU spends the equivalent of 10 to 15 frames per second waiting on instructions from the CPU, meaning that the CPU is the bottleneck, despite being nowhere near full load. The exact FPS gained in any given game is dependent upon the CPU and GPU set up, but the principle is consistent: the faster that the GPU receives its instructions from the CPU, the higher the frames per second will climb.

    Again, this comes down to the difference between a workstation environment and a gaming environment.
    Pointers to the benchmarks, please.


    Quote:
    A workstation environment is about maximum productivity. Speed, while far from unimportant, is secondary to productivity. Having as many threads as possible as saturated as possible is a good thing: the more threads a CPU can run at a stable clock speed with healthy temperatures at maximum load, the better the workstation is considered to be performing.

    A gaming computer is counter productive. On a gaming computer the only thing that is important is speed. Multi-tasking is not only not a concern, but counter-productive to being counter-productive. The CPU has to be blazing fast in order to keep up with the GPU, and that is the only thing that matters. If a single frame per second is lost due to the CPU lagging behind the GPU, then the system is under performing.
    I'm not unfamiliar with the performance issues involved. However, if you're telling me any game can be "CPU intensive," only put 20% load on each of two cores, and be the performance bottleneck holding back GPU computations, I'm afraid I will need a lot more situational information and performance numbers before I accept that as anything other than an aberration.

    This is so counter to experience, actually, that I think I'm going to have to borrow a copy of GTA 4 just to analyze its performance. If it's doing what you say it's doing, it's worth ripping apart just to find out why. You should never, ever be CPU-bound at only 20% utilization of a single core, multiplied by however many cores you have. You really shouldn't see major CPU issues until at least one of those cores gets above 75%-80% utilization, unless another bottleneck is reached (memory I/O, for example) or you have incredibly crap code.
  21. Quote:
    Originally Posted by Eek a Mouse View Post
    Custom experience system?!?
    I'm pretty sure he meant custom critter experience system.
  22. Quote:
    Originally Posted by TonyV View Post
    I'm not familiar with the WoW 3D printing company, but the impression I get is that it is based as much on art as it is on technology. Someone who is truly familiar with doing 3D prototyping would be able to take models such as the ones generated above and turn them into true 3D prototypes, plus you'd need a competent painter to paint and finish them.

    This also explains why it is probably not an endeavor that Paragon Studios/NCsoft would want to undertake. There is a HUGE difference between 3D rendering for game design and 3D prototyping, and if they wanted to get into this business, they'd definitely have to invest a lot of money into buying expensive hardware, software, and experienced 3D prototyping gurus. In short, it would almost certainly have to be a completely different business unit, one they probably don't want to get into as a gaming company.

    It also explains why the models are so expensive. Each one is a project unto itself, and it takes many hours of painstaking work to generate the finished product.

    Now having said all of that, if someone has a lot of experience with 3D prototyping, or has a pile of cash and a lot of time and is willing to invest in learning it, it most certainly does seem to me to be a viable business model (no pun intended)--but only if Paragon Studios/NCsoft is willing to turn loose of some of the licensing rights, something I'm working on getting them to do for me as well. But it really would take a smart cookie: part computer whiz, huge chunk of 3D CAD guru, part sculptor, part artist. (Or someone willing to invest that pile of cash into hiring them and who has the leadership to get them coordinated.)
    I was wondering about the technical complexities of converting CoH character models into actual printable ones, and I think most of the complexities are aesthetic, not genuinely technical. There is an interview with Ed Fries, who founded the Figure Prints company that makes the WoW models, and he basically confirmed my suspicions in that interview that:

    1. Base MMO models *can* be printed in theory (he didn't even have special access to them when he started, just access not too different from what I currently have - or I should say, what I hypothetically would have, of course).

    2. They just tend to be very blocky and not as cool-looking if you print them that way.

    3. Tessellation and remeshing are the key to making better models, and the process can theoretically be automated: models aren't literally hand-crafted to look good (at least in a CAD-sense).

    4. Most of the time-consuming labor in such a process seems to be in the fabrication itself, due to the specific powder-print technology they use, not in the modeling or even the initial printing.


    Quote:
    For what it's worth, I consider Arcanaville a really smart cookie, and that ain't so hypothetical. If I made a list of top contenders who I thought could pull it off based solely of what I know about them on these forums, she'd probably be at the top of it.
    I'm flattered, but I'm not a trained 3D modeler or 3D fabricator. Eventually, I think I could figure out how to make *one*. I might be able to figure out how anyone else could make *one* themselves. But there are probably higher resolution, color, and quality standards that a company like NCsoft might want to adhere to if they were to do something like this as a mass-production business, and it's there that my expertise runs out.

    Keep in mind: I've leap-frogged past three years of Ogle attempts at getting CoX to work, but still haven't figured out how to wrap my face around my head correctly. Knowledge gaps like that are going to slow me down a tad. I'm actually going to jump for joy when my hands are the right color. I don't think they get high-fives for that in the art department of Paragon Studios.


    The one thing this hypothetical exercise has convinced me of so far is that if a company like Figure Prints wanted to make CoX figures, City of Heroes is quite capable of giving such a company sufficient information to enable them to do it, in a way they could use effectively. It would only require a relatively small amount of code (code that at one time I believe existed in the actual game itself). The practical and commercialization problems are beyond my pay grade to figure out.
  23. Quote:
    Originally Posted by steveb View Post
    In a workstation environment, such as what you run, you're absolutely right. But on a pure high end gaming machine, assuming that your hard drive is not bottlenecking you first, you want to get the load off your CPU as quickly as possible and onto your GPU(s) (technically, you want to bottleneck the monitor with more frames that it can display). Even if a thread is not fully saturated, moving the load to the GPU as quickly as possible allows the GPU to render frames at it's maximum capacity.
    I'm afraid you lost me here. Once the CPU is outpacing the GPU, going faster can't help by more than a tiny fraction, because the vast majority of workloads, including games, don't require synchronous computing between the CPU and the GPU for most of their work. In fact, I can simulate the reverse and arbitrarily load down a CPU while a game is running, and so long as the game wasn't using much of the CPU to begin with, and the extra load doesn't get very close to maximum CPU load (and doesn't trigger a different bottleneck like disk or network), the frame rate stays basically constant in every case I've ever tested.
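    The pipelining point can be sketched as a toy model (illustrative only; real engines add buffering, sync points, vsync, and driver overhead):

```python
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """With the CPU and GPU working as pipelined stages, steady-state
    frame time is set by the slower stage, not the sum of both."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# GPU-bound case: GPU needs 20 ms/frame, CPU only 10 ms/frame.
# Halving CPU time changes nothing; only a faster GPU raises the FPS.
```

    This is why adding unrelated CPU load doesn't dent the frame rate until the CPU stage itself becomes the slower one.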
  24. Quote:
    Originally Posted by Father Xmas View Post
    It's Turbo Boost and not Hyperthreading that needs to be shut down for maximum OC of an i5/i7.
    Separate from what I've read, I've been told by people who experiment with this that if you intend to saturate all active cores with as much work as possible, disabling hyperthreading can reduce the heat load per core enough to allow you to increase the per-core clock rate. I haven't tested that myself, but the theory seemed logical given the way hyperthreading works on the new Core-iX processors relative to P4-style hyperthreading.

    Although it's possible disabling hyperthreading does absolutely nothing to allow you to clock the individual cores faster, that would almost defy the laws of physics unless you're running into an uncore bottleneck that makes the advantage moot. I've also heard rumblings that disabling hyperthreading doesn't just disable the second thread on the core, it disables other things that might impact the performance of the core itself. But that's unsubstantiated rumor as far as I know.