Agonist_NA

Cohort
  • Posts: 88
  • Joined

  1. Agonist_NA

    CoP reward buff

    Quote:
    Originally Posted by Gaderath View Post
    This would skew it in the opposite direction from how I am reading it that you want it to be. Must remember the average speed TFer/"powergamer" runs the TF about a kajillion times more than the "average player".
    Well, if more than 50% of the people running a given TF were speedsters, then that would be appropriate (if time-to-complete is a valid metric, which I am not sure of myself). If it really were a problem of a small number of accounts running it a huge number of very short times, one could calculate a median for each account over, say, its last 10 runs, and then have each account count only once toward the overall median (a little sketch of what I mean is at the end of this post). I don't really think you would have to do that, but then I don't have the data either.

    The median is less affected by extremes in the data set, so even just that would be an improvement in the presence of a large number of speed runs...
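
    Here is a minimal sketch of the per-account median idea, in Python, with completely made-up run times just to show the effect (none of this is real data):

    ```python
    from statistics import median

    # Hypothetical run times (in minutes) for a TF, keyed by account.
    # A couple of speedsters with many short runs vs. casuals with a few long ones.
    runs_by_account = {
        "speedster1": [12, 11, 13, 12, 11, 12, 13, 11, 12, 12],
        "speedster2": [14, 13, 15, 14, 13, 14, 15, 13, 14, 14],
        "casual1":    [55],
        "casual2":    [62, 58],
        "casual3":    [70],
    }

    # Naive approach: one big pool, every run counts.
    all_runs = [t for times in runs_by_account.values() for t in times]
    pooled_median = median(all_runs)

    # Per-account approach: each account contributes one number (the median
    # of its last 10 runs), so nobody can outvote everyone else just by
    # running the TF a kajillion times.
    per_account = [median(times[-10:]) for times in runs_by_account.values()]
    account_median = median(per_account)

    print(pooled_median)   # dominated by the speedsters' runs
    print(account_median)  # closer to what a typical account experiences
    ```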
  2. Agonist_NA

    CoP reward buff

    As far as I am concerned, reward merits should be based on the median, not the average, time to complete (CoP being an exception). Maybe even the mode (most frequent time). This would account for powergamers who skew the average down below what most people experience.
  3. Quote:
    Combat Auras

    * All in-game Auras can now be set to only activate while in combat.
    Whoohoooo! Thanks FX guys (and David for asking them) for implementing my suggestion!! This is totally awesome!

    *Off to make my blaster's eyes glow only in combat!*
  4. Agonist_NA

    Monitor woes

    Finduilas, here is something else to try before you buy a new monitor...a little background first.

    I have an HP widescreen LCD monitor (HP w2408). The monitor comes with the usual VGA plug and a DVI-D plug. Sending the video signal over DVI results in a marked improvement in the image quality, so that is what I use.

    Whenever I upgrade my video card, though, I uninstall the old video drivers and install the new ones. Once the new drivers are installed, I can select the HP monitor and use DVI. HOWEVER, when the computer reboots with the new driver BEFORE I can select the monitor, the card doesn't send a recognizable signal to the monitor on either VGA or DVI; the monitor stays blank, the LED flashes just like yours, and I can't complete the installation. (My hypothesis is that by default the new card sends out a VGA signal, but my fancy schmancy monitor doesn't understand VGA unless it is "translated" by the driver.)

    So how do I complete the installation? I go get my old CRT monitor and plug it into the VGA port, and it shows Windows just fine (just like what is happening to you). I finish installing with the CRT, hook the HP monitor up on DVI, and reboot. Now I get a basic signal to the monitor, and I can go into the video driver settings, select the HP monitor, and get access to the full resolution and features. (I believe I have also tried a "safe" boot, but to no avail, IIRC...)

    So you might have a fried monitor, but try this procedure first; you may find that the monitor is fine, and that the card simply isn't sending a signal it can make sense of until the driver is set up for it.
  5. Warshade has the Dimension Shift problem with shields as well. I /bugged this in game.
    I just saw that it does stop after the power it is in deactivates, so I can confirm that, Draggyyn. As I recall, it is supposed to be global, e.g. http://wiki.cohtitan.com/wiki/Aegis:...tus_Resistance

    So I am going to call "bug!" on at least that part. That 20% going to 16.667% is definitely a change, but I don't know if it is now WAI or if that is a bug.
  7. Quote:
    Originally Posted by Aramar View Post
    Working as intended, although misleading.

    The combat attributes displays a different mechanic. Aegis's status resistance of 20% is computed similar to recharge reduction:

    New Time = Oldtime/(1+reduction) = oldtime/(1+0.20) = oldtime*0.8333

    So a 20% reduction results in an actual 16.67% reduction of actual time.
    Thanks Aramar...

    Did the display in Real Numbers just recently change, then? I have always seen it tick up to 20% (I keep it displayed, since I mostly put Aegis into click powers and want to make sure I remembered to click it!). So, in the absence of other modifiers, I might see Sleep at 80% in Real Numbers, whereas now it shows higher than that.
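
    Just to spell out the arithmetic Aramar describes (the 10-second Sleep duration is a made-up number purely for illustration):

    ```python
    # Worked example of the mechanic Aramar describes: a 20% status resistance
    # is applied like recharge reduction, so the actual duration cut is smaller.
    old_time = 10.0       # seconds of, say, an unresisted Sleep (made-up number)
    resistance = 0.20     # Aegis status resistance as displayed

    new_time = old_time / (1 + resistance)    # 10 / 1.20 = 8.333...
    actual_reduction = 1 - new_time / old_time

    print(round(new_time, 3))          # 8.333
    print(round(actual_reduction, 4))  # 0.1667 -> the 16.67% figure
    ```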
  8. James Joyce,

    There are (at least) two other factors besides inflation bearing on market prices.

    One is that there is more "spread" between casual gamers and power inf gamers (akin to the recent trend toward an ever-greater divide between rich and poor in the US). That greatly inflates the prices of certain items: those with billions to spend will pay them, but more casual gamers never will. The safety valve for this *should* be merits (leaving purples to rocket up), since even a casual gamer could get a super-shiny randomly or save up for one. Thing is, I don't think these recipes are making it to the market frequently enough to lower the price. Either people are using them, or they are selling them at the inflated price.

    Second, the market is not rational. Sometimes someone with a lot to spend will pay a hundred thousand for something they want "NAO" - and without sufficient downward price pressure that becomes the new normal price, even if it doesn't match supply and demand. On the other hand, sometimes people say to themselves, "There is no way I am going to pay that much for that piece of common salvage!" and the price can be artificially low compared to the demand.

    So it is a very chaotic system, made more so by very limited market information and very small market sizes.
  9. Hi MB,

    I can confirm it was not listed in the combat attributes, BUT I totally forgot I was ex'ed doing a flashback when I slotted it, and that was probably it. I'll verify, but until then, consider this closed.

    Thanks!
  10. I just noticed that the three Touch of the Nictus slotted in my Black Dwarf's Drain are not giving me the +9% accuracy bonus. Bug?
    On multiple toons, the Aegis status resistance only goes to 16.67%, rather than 20%.

    Bug?
  12. Very good news.

    To help counter market manipulation, how about a little graph showing the trading prices for the past month? Showing only the five most recent buys just allows people to monkey with the market. And try clicking through the different levels of one of the procs and see the crazy different prices, all for something that does the same thing regardless of level...

    I'd love to see a UI something like this: I want to buy recipe X. I can set the level range to view the prices for multiple levels at once, see a graph of each price over the last few days, and also see the prices for the crafted enhancements (those never seem to relate back to their recipe counterparts).

    The amount of information available now to help with trading is almost as bad as a US hedge fund's....
  13. Posted for completeness - David was kind enough to promise to look into this in a PM...but I am interested to see how people respond.

    How about if ALL auras could optionally become combat auras? I think it would be a lot of fun to select fiery eyes, or tendril hands, or a full-body smolder, or whatever as combat-activated.

    "I BURST INTO FLAME when attacked!"

    Knowing nothing about how it is programmed, it seems like it would be an easy change: add a flag to auras so they show up in the list and are activated as combat auras, since the tech is already there (a totally made-up sketch of what I mean is at the end of this post). Of course, I could be dead wrong!

    Still, I think it would be fun to see your whole team "light up" in their various ways as you enter combat!!!
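
    Purely to illustrate the idea, here is a toy sketch in Python; every name in it is invented, and I have no idea what the real aura data looks like:

    ```python
    # Hypothetical sketch of the proposed flag; none of these names come from
    # the actual game data.
    class Aura:
        def __init__(self, name, combat_only=False):
            self.name = name
            self.combat_only = combat_only   # the proposed new flag

        def is_visible(self, in_combat):
            # A combat-only aura lights up only while you are fighting.
            return in_combat or not self.combat_only

    fiery_eyes = Aura("Fiery Eyes", combat_only=True)
    print(fiery_eyes.is_visible(in_combat=False))  # False - eyes look normal
    print(fiery_eyes.is_visible(in_combat=True))   # True - I BURST INTO FLAME!
    ```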
  14. Hi Fallenz,

    My CPU is a dual core - the one I talk about is the fastest dual-core CPU that will fit in a Socket 939, but 939 is a six-ish-year-old standard, so it is now obsolete and would drive the need to upgrade the mobo. Of course, if Pendix or others have a single-core CPU, then yes, they would get a big improvement without changing out the mobo, provided it accepts dual-core or higher CPUs. I was just hoping to point out that upgrading only the CPU may not be the way to go.
    Pendix got me thinking about this - so when I had the 8800GTX I was at 50% CPU with the settings described above. With the GTX285 I have bumped the character and world detail up to 200% and maintain about a 20 fps framerate, but now the CPU is pegged at nearly 100%. At the same settings, moving the bottleneck from the GPU to the CPU doubled the framerate (15 fps to ~30 fps).

    Now, looking at upgrades: with the Socket 939 board I have, the fastest CPU I could put in gives something like a 15% improvement over my current CPU. I know it is not quite that simple, but even if we assume that the entire 15% improvement goes to framerate, for the bargain-basement price of $200 I would get a pretty small framerate increase (rough math at the end of this post).

    So my real upgrade path would be motherboard and CPU as I am not likely to get much more from the current mobo. Sigh - that ain't gonna happen anytime soon I think.

    So if you are considering a CPU upgrade, be sure to think about what you will actually get for your buck before you spend it.
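
    Here is the back-of-the-envelope math, with the (generous) assumption that a CPU-bound framerate scales one-to-one with CPU speed:

    ```python
    # Back-of-the-envelope math for the Socket 939 CPU upgrade, assuming
    # (generously) that a CPU-bound framerate scales 1:1 with CPU speed.
    current_fps = 20.0     # roughly what I see now, CPU pegged near 100%
    cpu_speedup = 0.15     # the fastest 939 CPU is ~15% faster than mine
    upgrade_cost = 200.0   # dollars

    best_case_fps = current_fps * (1 + cpu_speedup)              # 23 fps at most
    fps_per_dollar = (best_case_fps - current_fps) / upgrade_cost

    print(best_case_fps)   # 23.0
    print(fps_per_dollar)  # 0.015 fps per dollar - not a great deal
    ```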
  16. Quote:
    Originally Posted by Pendix View Post
    Can I get a confirmation on this?

    I'm having pretty much the same problem: Upgraded to a GTX 260+, and everything pretty much runs fine (other games look brilliant as well) unless I turn on Shadow Maps, which absolutely hammers my FPS (down to below 30). Don't want to go out and try to upgrade again without knowing whether I should upgrade the graphics card or the processor.

    Hey Pendix,

    Well, there are probably better ways, but if you are running Windows, you could leave Task Manager up and see what the CPU load does (or use a little logging script like the one at the end of this post). If it is up near 100%, then you are at least bumping up against a limit there. RAM might be an issue too, but keep in mind that 32-bit operating systems can't access more than 4 GB of RAM. So if you have a 32-bit OS, don't bother buying more than that.

    There are only so many bottlenecks of that magnitude in a computer... I would guess that since it is so closely tied to turning on shadow maps, that is what bogs down the CPU. I don't know if graphics cards do that type of calculation - projecting the scene as a 2D image from a fixed point seems like a CPU sort of job, and then the graphics card calculates what that image looks like from your POV.

    One of the experts can probably offer a better diagnostic.
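
    If staring at Task Manager gets old, here is a quick-and-dirty logger; it uses the third-party psutil package (pip install psutil), so that is an extra install, not anything built into Windows or the game:

    ```python
    # Quick-and-dirty CPU logger to leave running in the background while playing;
    # requires the third-party psutil package (pip install psutil).
    import psutil

    for _ in range(30):                          # sample for about a minute
        usage = psutil.cpu_percent(interval=2)   # % CPU over the last 2 seconds
        print(f"CPU: {usage:5.1f}%")
        if usage > 95:
            print("  -> pegged; you are probably CPU-limited here")
    ```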
  17. Quote:
    Originally Posted by VoxPanda View Post
    Very curious now...
    I just upgraded from a gtx275 to a gtx480 on a shiny new rig.

    I went from the 275 to the 480 because no matter what I tweaked, I could not break the 30 FPS barrier.
    I've tried in game and out of game settings, driver updates... the works.

    The very bad news.... even with the 480, I can't get above 30 FPS, and
    the culprit is Shadow Maps.
    If I turn them on, FPS tanks. Setting them to off/Stencil Shadows... and I get back to 60.

    So, what trick am I missing?
    Vox, I'm no big expert, but if you are locked at 30fps, that sounds like a CPU limitation. It might be that shadow maps use more CPU than graphics card processing, so you hit a ceiling and your graphics card is going as fast as it can, but is waiting around for something to do.

    Not that I think most people would notice a difference between 30 fps and a million fps...
  18. Quote:
    Originally Posted by Agonist_NA View Post
    Cascadian,

    My experience with the 8800GTX (pretty much equivalent to the 9800GTX) is that I can run it almost maxed out. Compare the 8800GTX, the GTX 260 and the GTX 285

    So I would say the GTX285 would be able to run UM full on just fine (given sufficient supporting components of course). I have a bid on one right now as a matter of fact...
    Update on this: just got the GTX285 up and running. The frame rate went from 15-20 fps to 20-30fps with the exact same hardware, drivers, etc.

    With my not-quite-cutting-edge rig (WinXP, 2 GB RAM, AMD 4400+ dual core, 1920x1200 monitor), the GTX285 lets me run Ultra on everything except 16x antialiasing (I don't see a difference in frame rate or visual quality between 2x, 4x, and 8x) and ultra-quality shadows (I actually prefer the more subtle high-performance setting anyway). Of course, going to stencil shadows jumps the framerate up to 60+.

    Which brings up an interesting observation: with the new card I can run Crysis at "enthusiast" level, with amazing lighting effects, projected shadows, etc., at high framerates. But CoH runs much slower than Crysis (!!!). Is this an artifact of bolting Ultra Mode onto the existing engine, or a symptom of inefficient coding?
  19. Quote:
    Originally Posted by Cascadian View Post
    In the very first post of this thread, Positron give the GeForce GTX 260 and GTX 285 as good video cards for Ultra Mode. Is that information still accurate? Five months have passed since he posted that.
    Cascadian,

    My experience with the 8800GTX (pretty much equivalent to the 9800GTX) is that I can run it almost maxed out. Compare the 8800GTX, the GTX 260 and the GTX 285

    So I would say the GTX285 would be able to run UM full on just fine (given sufficient supporting components of course). I have a bid on one right now as a matter of fact...
  20. Quote:
    Originally Posted by Techbot Alpha View Post
    Hmm...ok, I'll bite; what does it do? And is there much point using it on a 9800 GT? I can get fairly decent UM levels out of it, but with a slight dip in speed from what Live has at the moment.
    Would this doohicky help? And is it potentially harmful, or an actual Nvidia driver?
    It is a preview nVidia driver, Techbot, so there should be no downside. I did see a noticeable improvement with my 8800GTX (above) of around 5 fps outside (a stable 15 up to a variable 15-25). It seems that the driver adds more support for OpenGL. It is conceivable to me that not all cards benefit equally. The 8800 GTX is equivalent to, or slightly better than, the 9800GT, but being a generation behind, maybe its driver was missing some optimization. So you might or might not see a benefit. No harm in trying, though, I don't think.
  21. archaeum, I can confirm some boost on my system with that new OpenGL driver.

    With an 8800GTX, Ultra on most things except character and world detail, and low shadow saturation, it was a solid 15 fps outside, dropping to 10 occasionally and to 2 rarely. Playable for me, if with a bit of noticeable jerkiness. But hey, I did claymation in junior high school, so that was OK with me. Inside I had a solid 20 fps, whether alone or on a team.

    After upgrading to the new OpenGL driver (installed over the top of the previous one), I now see around 20 fps looking over long vistas, slowing to a minimum of 15 fps when jumping around and looking at shadows. The "claymation" effect is almost unnoticeable. Inside I have a solid 30 fps.

    So I rate it a "win" from my side! Thanks for the tip!
  22. ...and now that OB is here, I can finally say...

    My system is probably considered borderline, but I can run Ultra Mode maxed out at 4-5 fps, nearly maxed out at 15-20 fps, and with the right options get up to 50+ fps and still have it look great! I am impressed that it is not chewing up more GPU than this...

    System, all stock speeds:

    OS: WinXP Home 32 bit SP3
    CPU: AMD Athlon 64 X2 4400+ dual core, 2.21 GHz (around 50% CPU utilization)
    RAM: 2 GB
    VidCard: nVidia GeForce 8800 GTX, current drivers
    Resolution: 1920x1200

    FSAA doesn't seem to impact framerate much at all, nor does anisotropic filtering. Going above 100% on world or character detail DOES impact framerate. The biggest factor seems to be ambient occlusion, and in tricky areas it will slow things down quite a bit.

    Still, overall I am very impressed at the frugality of the new Ultra Mode!
  23. Thanks again Father Xmas!

    Quote:
    Originally Posted by Father Xmas View Post
    First, SLi isn't an automatic 2x improvement in performance. A lot of it has to do with how the game's code is structured so the game code running on the CPU is waiting on the video card to finish before the next frame can start. The longer the wait, the more of a speed boost having a 2nd card in the system.

    Second, each card may get a full 16 lanes of PCIe V1 but it's not like either can "borrow" the additional lanes from each other as needed. The SLi bridge is so one card can access the frame buffer on the other to read for output to the monitor.
    Yeah, that is what I meant by SLI inefficiencies, but good to point out explicitly...

    Quote:
    Originally Posted by Father Xmas View Post
    Third the price disparity is because the 8800 GTX isn't made anymore and if someone is looking for one it's because they already have one and is looking to do SLi. It doesn't occur to them, the gamer looking for a 2nd card, that a single newer card may be considerably faster and cheaper. You probably remember since you have one, the 8800 GTX was a $600 card when it debuted and I can understand why someone wouldn't want to pull it, bag it and stick it on a shelf of old parts when they think they can still use it if only they could find another for SLi.
    Ain't people funny? Personally, I got mine for a killer price ($150 on eBay a few months after it came out; I put the bid in on a lark). This actually further supports my position about Crossfire/SLI: it is sometimes positioned as an upgrade path when, for the two setups I have had, it never made money sense to do so. Both times it has been cheaper and better to buy a new card rather than a second card. So really, dual-card only makes sense for leveraging the very latest technology at the price of beaucoup bucks. Because of the rate of technology progress, I doubt it will make money sense anytime soon to buy a second card to "keep up." The cheapest 8800GTX I found was ~$150. Even at 50% of the price, it is a wash in performance per dollar (at best).

    Quote:
    Originally Posted by Father Xmas View Post
    Lastly, those two reviews you cite use different hardware (CPU, OS, memory) so you really can't compare the two. But you can compare them if you know where they used the same setup to test every card. They are using a highly overclocked CPU and rather high quality settings, which helps SLi/Crossfire to shine.
    Oooh, wish I had known about that feature on the site - that is slick, and I would have wasted a lot less time comparing Red Delicious apples to Granny Smith apples (not quite apples to oranges, you see...though not quite the same apples to apples either).

    So is that comparison saying that a SLI'ed dual 8800GTX is in fact almost comparable to a GTX285 in certain situations?
  24. Quote:
    Originally Posted by Father Xmas View Post
    The difference between PCIe V1 and V2 is V2 is double the bandwidth. Another way of looking at it is a PCIe x 16 V1 interface has the same bandwidth as a PCIe x 8 V2 interface.
    Thanks Father X - that is interesting. The ASUS mobo has 2 PCIe x16 slots, but I only have one vid card. So, a thought experiment (I am probably not really going to do it, and UM apparently doesn't like dual cards yet):

    What would the generic performance comparison be for two SLI-linked 8800GTXs (which can use 16 x 2 = a combined x32 of bandwidth) vs. one GTX285, which is a faster card but can only use a single PCIe 1.0 x16 pipeline?

    Looking at nVidia's site and using the 9800GTX+ as a stand-in for the 8800 GTX (since they were about equal in performance - slight edge sometimes to the 8800 for the extra memory)

    8800GTX x 2 = theoretical 30x 3DMark Vantage Performance Preset (somewhat less due to SLI inefficiencies)

    GTX285 x 1 on a single PCIe 1.0a x16 slot = something less than 28x 3DMark Vantage Performance Preset - perhaps 4% less

    Going back and looking at 3DMark tests, 2x8800GTX scores 5467 on the SM3.0 test
    http://www.legitreviews.com/article/421/7/

    ...on a PCIe x16 board, SLI giving it a x32 effective I guess. WinXP OS

    and the GTX 285 scores 7540-7731 on the same test here
    http://www.legitreviews.com/article/915/5/

    ...which is a PCIe 2 x16 board, so x32 effective I guess. Vista SP1 OS

    So would it be correct to conclude that buying another 8800GTX would give about 75% of the performance of buying a GTX285, for a given system? (Rough math below.)

    Interestingly, according to Google, a new 8800GTX is more expensive ($483, the lowest it found) than a new GTX 285 ($352). Of course a used 8800GTX might be found, but it is still a weird disparity.
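
    For anyone checking my math, here is how I got to "about 75%", using the SM3.0 scores from the two linked reviews (with the caveat that the two reviews used different test rigs, so this is rough at best):

    ```python
    # Rough comparison of the two linked 3DMark SM3.0 scores; caveat that the
    # two reviews used different hardware (CPU, OS, memory).
    sli_8800gtx_score = 5467        # 2x 8800GTX in SLI (first linked review)
    gtx285_scores = (7540, 7731)    # single GTX 285 (second linked review)

    for score in gtx285_scores:
        ratio = sli_8800gtx_score / score
        print(f"2x 8800GTX is about {ratio:.0%} of a GTX 285 scoring {score}")
    # Prints roughly 73% and 71%, so "about 75%" is in the right ballpark,
    # if anything a touch generous.
    ```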
  25. Question for you system experts...

    What effect would putting a PCI-E 2 vid card into a system board that is PCI-E 1 have? I know PCI-E 2 is backward compatible, but I figure it has to slow a top-end card down somewhat anyway.

    For comparison, here is my system info:

    Board: ASUS A8N32-SLI Deluxe
    Card: 8800 GTX
    Memory: 2 GB
    OS: WinXP
    CPU: Athlon 64 X2 4400+
    PSU: 650W BFG

    If I were to put in, say, a GTX285, would I get a performance boost, do you think?

    Thanks!