azile

Informant
  • Posts

    12
  • Joined

  1. The Hoozdat Award Person Who is seen, but unknown to most
    dead

    The Obnoxious Award
    Infinite Zero

    The 'I don't see em' Award Person who you think is gignored the most
    Xyce

    The Bound and Gagged Award Person who is silenced the most in 2.0
    Elmist

    Mod of the Year
    Taryn

    Funniest Triumpher
    TamakiRevolution

    Most Loveable Triumpher
    Mintmiki

    Favorite person to put in a Task Force
    Leo

    Best Player in Triumph
    Vinezapper
  2. azile

    CoP reward buff

    Quote:
    Originally Posted by Shadowe View Post
    Haven't read everything posted in this thread, but just wondering if it's worth me pointing out that the Merit Rewards for every task in the game that awards them has what I have previously referred to as a "jigger factor" to account for the developers' perception of the difficulty of the task?

    So, the current "time metric" for calculating merit rewards isn't a simple "1 merit per 5 minutes mean average to complete task" or whatever. It's actually that "plus or minus developer-set difficulty bonus/penalty merits".

    Now, the calls for a time-independent method of determining merit rewards gets two big thumbs up from me, and here's my thoughts on factors that could be included:

    Number of players required to start
    Number of players who start
    Number of players who complete
    Mob-type (we all know that KoA and Malta are a pain in the proverbial, and a hypothetical TF that uses them would be a lot harder than one that uses Council)
    Map size
    Number of Objectives required per mission
    Number of AV's (possibly with a special factor that includes a modifier for the particular AV's in question - after all, Reichsmann is tougher to beat than Dr Vahzilok)

    Those are all good, solid, calculable things (for mob type, their XP modifier is probably the most suitable number to use) that can be datamined or already exist in the game. The sums of the factors involved then would be used to calculate an overall modifier that would be used as a multiplier for merit rewards.

    Anyway, just thinking aloud (well, thinking through my fingers).
    I disagree that time should be completely taken out of the equation, but the current system is highly subject to the Ecological Fallacy. The more factors used to determine the rewards, the more robust the system as a whole. Complex problems often require complex solutions. Using simple solutions means that the problem has been simplified by making assumptions that may or may not be valid.

    In addition to what's mentioned elsewhere in this thread, I'd love to see the reward formula also take into account:

    The number of task forces run on a monthly basis.
    The total number of merits intended to be injected into the economy on a large timescale (measured in months).
    Other global economic metrics.

    I doubt this approach would be feasible without using relatively complex machine learning. If I were to tackle the reward problem, I'd train a Bayesian network to estimate rewards based on the data given by all these factors (and more), then update the reward table with each new issue release. However, programmer time would probably be better spent doing other things.

    Even under the current reward scheme, reward values need to be looked at again. There have been some systems which dramatically changed the amount of time it takes to complete a task since the last time the task reward table was updated. Specifically, super side kicking made a huge difference. I can imagine the inherent fitness changes having a similar effect - especially for low level activities.
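To make the "time plus difficulty multiplier" idea concrete, here's a minimal sketch of what such a formula could look like. Everything here is hypothetical: the 1-merit-per-5-minutes base rate is just the example figure quoted earlier, and the factor names and weights are made up for illustration.

```python
def merit_reward(median_minutes, difficulty_factors, merits_per_minute=1 / 5):
    """Hypothetical merit formula: a time-based base reward, scaled by a
    multiplier built from per-task difficulty factors (mob XP modifier,
    AV count, completion rate, and so on). All weights are illustrative."""
    multiplier = 1.0
    for factor in difficulty_factors.values():
        multiplier *= factor
    return round(median_minutes * merits_per_minute * multiplier)

# Example: a 50-minute task with tough mobs and a low completion rate
# earns more than the flat time-based reward alone would give.
factors = {"mob_xp_modifier": 1.25, "inverse_completion_rate": 1.2}
```

The point of the sketch is only that time stays in the equation while the other factors adjust it, rather than replacing it outright.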
  3. azile

    CoP reward buff

    Quote:
    Originally Posted by White Hot Flash View Post
    Because there's all kind of reasons why a TF isn't completed that have nothing to do with its difficulty. If you consider how many people finish in your metric for reward, your numbers will be skewed in a different and undesired direction. Completion rate can also be skewed by looking at the same character finishing multiple times. That doesn't mean the task is hard or easy. It could simply mean that character either has a trait/powerset that just trumps what the TF throws at it, or the player has learned the trick of the TF and prefers to only play that character through it. Neither one fairly looks at whether the reward correlates to the task.
    By completion rate, I mean specifically: number of task force x completed / number of task force x started. There's precedent for this as well, the devs have used this metric in the past to balance difficulty. Off the top of my head, they talked about the completion rate of the hero respec trial, and ended up toning down the difficulty shortly after it was introduced. Similar comments were made when the Heroes on the final mission of the Lord Recluse Strike Force were lowered to level 53. You'd also want to throw out outliers in the time-to-complete, and time-to-quit-without-completing when figuring out the success/failure rate. But again, this is data mining the devs have done in the past.

    I'd expect the completion rate to be very close to 1 for most tasks (with outliers thrown out), and slightly lower for the "hard" stuff like CoP, RSF, STF, and so forth. I don't think this should be the only metric, I just think it is one that should be considered in the mechanism for determining rewards. There's others, as well: Organization time for events like CoP and Hamidon. A suggestion here would be to factor in number of people required to complete.
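As a sketch of the metric described above (assumed data shape: one (duration, completed) pair per run; the trim fraction is an arbitrary illustrative choice, not anything the devs have stated):

```python
def completion_rate(runs, trim_frac=0.1):
    """Completed / started, after throwing out duration outliers.

    `runs` is a list of (duration_minutes, completed) pairs; the runs
    with the shortest and longest durations (trim_frac of each end)
    are discarded before the rate is computed."""
    ordered = sorted(runs)                 # sort by duration
    k = int(len(ordered) * trim_frac)      # runs to drop from each end
    kept = ordered[k:len(ordered) - k] if k else ordered
    return sum(1 for _, completed in kept if completed) / len(kept)

# A quick quit and a disastrously long failed run get trimmed away,
# so neither drags the rate for the "typical" attempt:
runs = [(5, False)] + [(60, True)] * 8 + [(600, False)]
```

With trimming, the example data gives a rate of 1.0; without it, 0.8, showing how a handful of abandoned or marathon runs would otherwise skew the number.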

    It's worth noting that *any* reward metric will be skewed by power gamers. Thus, it's important to choose as robust a metric as possible.
  4. azile

    CoP reward buff

    One thing I've been wondering - why doesn't the reward merit formula take into account the completion rate? Being based purely on completion time makes "hard" tasks like the CoP trial, Recluse SF, etc., seem undervalued. If rewards were scaled on average completion rate, as well as time to complete, it might help alleviate this issue.
  5. Quote:
    Originally Posted by UberGuy View Post
    Doing this at all creates power creep unless it comes pre-loaded with a nerf. Making this part of the base rates would be a nerf to everyone who has set bonuses, uniques or procs in Health and Stamina, a wash to anyone who has nothing but SOs or commons in them, and a buff to everyone with +regen or +recovery powers.

    The question is which way creates the least power creep, or the most acceptable form of creep, hopefully while nerfing the fewest people. I think the option that does that best is probably the one they have chosen.
    My initial post in this thread was to Castle, asking for the reasoning behind this change (instead of what I saw as another option, or doing nothing). With fitness becoming inherent, there's no change to offset the global buff every toon is getting (either immediately, or on a respec). We're getting power creep, and I am wondering why.

    As I pointed out earlier, current power numbers would have to be re-examined if the base rates were changed, and that is a lot of work. And you're right, it would be a nerf to high end builds unless set bonuses were looked at as well. It's probably not a viable alternative at all, really.

    The assumption in the majority of this thread seems to be that low level endurance recovery is really the issue being addressed here. But then there's an even simpler solution: just give Beginner's Luck a scaling endurance recovery buff, similar to its accuracy buff.
  6. Quote:
    Originally Posted by reiella View Post
    Ah, I was responding to a different logic then. I was assuming you meant still removing the power in question, simply making it's effects baseline. More so for that, to maintain the same economy. More so the second item I listed initially. They don't want to just buff Recovery 0.83%/s to everyone. Because frankly, [most] everyone would feel the need to take Recovery. Consider that before I6, the population still typically 6 slotted Stamina.

    This method lets them maintain the same endurance economy at the top end [and the 'general' end], at the expense of increasing it at the low end.
    Years ago (around i1 or i2) I had this same discussion with Statesman, when I felt that stamina should have become an inherent. It feels weird to be using his argument against it. But again, this approach causes power creep (even if it's simpler to implement).
  7. Quote:
    Originally Posted by reiella View Post
    Exactly, by making it inherent, you no longer have the option to slot enhancements in it.

    [ edit for clarity ]
    By making the modifications to the values themselves, you remove the options of slotting it, and placing more slots in the power.
    And if you wanted the option to slot it, you could take the power, which would still be in a pool. What would be a much simpler solution is a global movement/regeneration/recovery rate buff, along with a nerf to fitness.

    Don't get me wrong, I'm not against more inherent movement/regen/recovery. My question is why was this approach chosen.

    EDIT:
    Not just a change to fitness, but all other regeneration/recovery/movement powers to keep their power level the same.

    So, it requires fewer spreadsheet changes to just make fitness inherent. This makes sense. However, it's causing power creep, and power creep leads to nerfs.
  8. Quote:
    Originally Posted by reiella View Post
    Versatility in build forms, the other mentioned IO-slot abilities.

    And to keep from changing the economy of it with potential double-dipping [which admittedly can be handled another way].
    We've been told over and over again that the game isn't balanced around IOs.

    Also, build form versatility which is "added" by *removing* a power pool instead of reducing its necessity and/or effectiveness, making it less desirable to take? That's akin to saying "I gain more options by removing one."
  9. Quote:
    Originally Posted by Castle View Post
    Here's the only catch:

    You'll have to respec to take advantage of it on existing characters.

    Characters as they exist now, will only get the Inherent Fitness powers if they do not have any of them in their build. Conversely, if you don't, you get them as Inherents and never get the option to select them as pool powers.
    Castle, any chance you can shed some light on why this choice was made as opposed to simply raising the base values for recovery/regeneration/movement?
  10. azile

    PVP IO Exchange

    I have a glad armor 3% blueside, PM or global message with bids.

    EDIT: Sold
    Pentad/septad only requires two people, so long as they're both of the same AT. Also, Swiss draw tournaments are currently bugged, and impossible to start.
  12. I don't post on the boards often, but this seemed to warrant some commentary. Also, I'm a lot more familiar with OpenGL than Direct3D, although I do have experience with both.

    Quote:
    Originally Posted by je_saist View Post
    The theory behind OpenGL support is that each successive API includes fallbacks for the older API's. So if you write an application that uses the Tessellation from OpenGL 3.2, theory states that if the OpenGL driver finds your hardware does not support Tessellation, it will render the scene without tessellation. You should still have the same basic polygon / structure build though.

    There's several threads, like this one, over on OpenGL about the fallback rendering paths.
    I go into a bit more detail below, but this isn't accurate. What's happening in this particular example is that the programmer has to explicitly check for two pieces of hardware functionality: programmable shaders (GLSL) and multitexturing. If they don't exist, they then turn off a particular portion of the render path. The driver does not handle this task automatically. Attempting to call the functions where the hardware does not support them causes an error. Some of us have seen this in CoH: if we try to run it on a machine with default drivers or a lousy video card, a message pops up to the effect of "Unsupported Extension: GL_ARB_MULTITEXTURE".
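The check described above boils down to looking for the relevant names in the GL_EXTENSIONS string before enabling a render path. A minimal sketch (in Python rather than the C a client would actually use; the extension names are real, but the path names and the gating logic are illustrative, not CoH's actual code):

```python
def has_extension(extensions_string, name):
    """GL_EXTENSIONS is one space-separated string, so split on
    whitespace rather than substring-matching (a bare substring check
    could match a prefix of a longer extension name)."""
    return name in extensions_string.split()

def choose_render_path(extensions_string):
    # Hypothetical gate mirroring the check described above: the shader
    # path needs both GLSL (GL_ARB_shading_language_100) and
    # multitexturing; otherwise fall back.
    if (has_extension(extensions_string, "GL_ARB_multitexture")
            and has_extension(extensions_string, "GL_ARB_shading_language_100")):
        return "shader path"
    return "fixed-function fallback"
```

The key point is that the application, not the driver, makes this decision, which is why an unguarded call on unsupported hardware errors out instead of silently degrading.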

    Quote:
    Originally Posted by je_saist View Post
    It wasn't till 2007 that things began to change for OpenGL, with the Khronos group basically doing what 3DFX had done years before, and making a list of gaming specific commands from the full OpenGL API: http://www.opengl.org/pipeline/article/vol004_2/

    It's expected that 3DFX's legacy of hand selecting gaming specific calls will continue with the expected OpenGL 3.0 ES or 3.2 ES specifications. Rumor has it that Activision Blizzard, EA, and Transgaming had quite a bit of input on what's expected to be the next short list of OpenGL 3.x gaming specific calls.
    OpenGL ES (Embedded Systems) is not a set of gaming-specific calls. It is a subset of OpenGL that is used for low-powered devices: cell phones, mp3 players, portable gaming devices, refrigerators, and other embedded systems. It is used for games on cell phones / portable consoles, but the major use is actually for user interfaces. It's not intended for PC games, or high powered consoles like the PS3 / Xbox 360, where a more full-featured 3D API would be appropriate.

    Quote:
    Originally Posted by je_saist View Post
    Other than Microsoft just loves to be in control of everything?
    My opinion is that Microsoft does utitilize DirectX as leverage on game developers. As far as I'm aware, and anybody whose actually more familiar with the DirectX API can answer this... previous API specifications aren't always implemented in the Current implementation.

    With OpenGL, the fallback rendering path is supposed to be part of the OpenGL driver. The idea is that if you make API calls that the hardware does not support, OpenGL just does not run those calls, but still builds the scene anyways. There are some API calls that are deprecated: http://www.gremedy.com/tutorial/_add...sarydeprecated :: Although this is how I understand the fallback process, that doesn't mean that I'm right here, or that this is how it actually winds up working. Somebody who actually has experience writing to the OpenGL API's is better qualified to speak on how the fallbacks actually work.

    With DirectX... the memory that sticks in my head comes from Half Life 2. At the time Half Life 2 was launched Valve software said something about having to maintain a separate rendering path for DirectX 9 support, DirectX 8.1 support, and DirectX 7 support. They couldn't just write one coding path, and let the driver / underlying system figure out what to display / what not to display.
    Since DirectX puts an additional burden on graphics developers, there is a financial limit on just how much work can go into a project that will give a return. As OpenGL shows, the graphics API is not as integrated as Microsoft would like to have everybody believe. The Graphics API of DirectX 10 was developed as an update for WindowsXP to begin with, something Microsoft doesn't really like to talk about.

    The... implication... is that Microsoft is using DirectX to force publishers into a hard spot. Either the publisher okays funding for coders to work the hours needed to maintain and support separate rendering paths... or... they don't. Microsoft's pressure on the publisher is what winds up putting pressure on the consumer.
    The Direct3D API does change from version to version, but the changes from D3D 9 to 10 were not that different from the changes between OpenGL 2.1 and 3, depending on how you define your OpenGL rendering context. Some function calls are added, some are changed, and others are dropped. If you are going to support D3D 9 and 10, you'll need separate render paths for each, as they have incompatible rendering contexts. If you generate a D3D 9 context, you need to use D3D 9 calls, and the same is true of D3D 10. Performing a D3D 9 call on a D3D 10 context will fail.

    OpenGL 3 has a similar mechanism: if you define an OpenGL context to be "forward-looking," you set the minimum version of compatibility - say 3.1 - and you're strictly forced to comply with OpenGL 3.1+ function calls. For instance, because the fixed function pipeline was deprecated in 3.0 and removed in 3.1, certain basic calls to it like glTranslate() will throw an error. There is also a backwards compatible context, where deprecated calls aren't removed. However, the interaction between some deprecated OpenGL calls and the new OpenGL 3 calls is undefined. Thus, if you opt to use both the deprecated and new features, the action taken is whatever the driver feels like doing, which can be the right thing, but is not guaranteed to be. Because of this, it can be preferable to have a forward-looking OpenGL 3 context, especially with new code, or heavily optimized code. So you wind up with two separate render paths with OpenGL (one for 2.1, another for 3, if you are planning on supporting both), much like you would with different versions of Direct3D. Personally, I'm curious what route the devs choose for Ultra Mode.
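The context rules above can be modeled roughly like this. This is a sketch only: real context creation goes through WGL/GLX/AGL and the version string comes back from glGetString(GL_VERSION); the helper names here are invented for illustration.

```python
def parse_gl_version(version_string):
    """GL_VERSION starts with "major.minor", optionally followed by a
    release number and vendor text, e.g. "3.1.0 NVIDIA 195.36.24"."""
    major, minor = version_string.split()[0].split(".")[:2]
    return int(major), int(minor)

def fixed_function_allowed(version_string, forward_looking):
    """Rough model of the rule described above: in a forward-looking
    GL 3.x context, deprecated fixed-function calls like glTranslate()
    error out; in a backwards compatible context they still work."""
    if parse_gl_version(version_string) < (3, 0):
        return True   # pre-3.0: the fixed function pipeline is always there
    return not forward_looking
```

So the same glTranslate() call is legal or illegal depending on how the context was requested, which is exactly why a 2.1 path and a 3.x path end up being separate.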

    The deprecation in OpenGL 3 is long overdue. The fixed function pipeline has been around since the late 80s (OpenGL's been around since '92), and it's not indicative of how things are rendered anymore. Initially, the new features of OpenGL 3 were not going to have explicit backwards compatibility with 2.1 (Google "OpenGL Longs Peak" if interested in the history), but ultimately the Architecture Review Board decided to maintain backwards compatibility, at the cost of performance. The majority of fixed function pipeline calls carry a significant performance hit when rendering scenes with a large number of vertices.

    Ultimately, OpenGL and Direct3D have different design philosophies. Like all APIs, they are slaves to the hardware. Which is "better" is really a matter of what the task at hand is.

    Quote:
    Originally Posted by je_saist View Post
    There's no technical reason that I'm aware of that Microsoft cannot implement DX10 and DX11 atop Windows Xp. That OpenGL can render the exact same scene with the exact same image quality pretty much torpedoes arguments that the graphics API won't work at all.
    I managed to ask a Microsoft developer at Siggraph about this a couple years ago right after Vista was released. His claim was that the reasoning behind not porting DX10 (really Direct3D 10) to XP had to do with a design decision made on integrating it with the Windows Vista Display Driver Model, which would also need to be ported, and doing that was supposedly deemed to be too much of an ABI change for a production OS. I've not actually been able to verify this anywhere else, so the guy could have been making this up. However, it did seem to be a reasonable explanation.

    Quote:
    Originally Posted by je_saist View Post
    I'm afraid you got that backwards. Nvidia's the one whose been having driver issues. Please get out of 2003.
    From my perspective, Nvidia's drivers remain more stable than ATI's at least for OpenGL development. Perhaps the better term would be more "forgiving," as they don't crash as often when I am debugging (and usually sending bad data in some form or another to the card). As always, YMMV. ATI certainly has the hardware lead for now.

    Anyhow, hopefully this helps clear things up a little.