je_saist

  1. *cracks knuckles*

    you know what that means!!!

    The Council will be even HARDER to find in open areas because... *ish suddenly knocked out cold by BaBs and Castle and dragged off*
  2. Quote:
    Originally Posted by Call Me Awesome View Post
    Actually I'm not sure that Hami isn't autohit... in all the raids I tanked he never missed me once that I can recall. It was just BAM 935 damage.... BAM 935 damage.... BAM 935 damage regular as clockwork every 4 seconds.

    I did try as an experiment once using up a half tray of purples without effect.
    A quick check shows that purple inspirations do not buff base defense in the current version of the game.



    **

    Checked out my bubbly while I was at it. Strangely, Dispersion Bubble, which was regarded as boosting Base Defense back during the I7 / I8 Hamidon raids I took part in, isn't shown as boosting Base Defense, either on the bubbly itself or on a friend's avatar.





    I'm starting to wonder, then, which buffs in the game actually boost base defense. Since Hamidon damage is reported to be untyped, with neither positional nor elemental components, the power might as well be auto-hit when two of the things that should boost base defense (purple inspirations and Dispersion Bubble)... don't.
  3. Original Source here: http://www.engadget.com/2010/01/07/l...s-press-event/

    One of the recurring topics, not only in the Ultra-Mode GPU thread but in several other threads about which GPU to buy, is Nvidia's financial woes. There have been serious questions, from my perspective as well as from other players and analysts, about whether Nvidia can stay solvent with the huge losses it's taking on G9x and GT2x chips, not to mention pulling the GT2x chips from the market.

    Well, the good news is, if you like Nvidia, the company isn't going to die anytime soon. German auto-maker Volkswagen has inked a deal to put the Nvidia Tegra platform in all cars sold under the 2011 Audi marque, and is already placing Tegra in several 2010 models as well. That presumably also includes Lamborghini. Presumably Bugatti and Bentley will also be receiving Tegra systems, and if the project pans out, you can probably expect Tegra in all Volkswagen models in 2012 / 2013.

    Don't get your hopes too high about playing CoH in your VW, though. While it's possible that the larger development team on CoH means a Linux ARM port is in the works, and not just a translation using a WINE base, and while the existing graphics engine certainly could run within Tegra's performance envelope, I don't see car manufacturers... wait. I am talking about VW here. They might actually do it.

    Anyway, even if Fermi is a complete disaster, with too high a price point, not enough performance, and a triple-screen 3D setup that can already be done on ATi cards that were shipping last year (cough: Eyefinity can do stereoscopic 3D too), Nvidia won't be shutting its doors anytime soon.
  4. je_saist

    CPU Problems

    Quote:
    Originally Posted by MondoCool View Post
    This might not be the best forum to ask this, but I've had a problem running CoX on my computer. Basically, the game never seems to use more than half my CPU power. I'm using the following system:

    Intel D945GNT motherboard
    Pentium 4 3.0 ghz Hyperthreaded
    1.8 GB RAM
    Geforce 8500 GT, 512 MB
    180 GB PATA hard drive

    If I run CoX, it uses 50% of my CPU at max, and game performance suffers. If I disable hyperthreading, the CPU usage increases to 100%, and performance on the game improves. However, I still only get about 10-20 FPS outdoors and 20-30 indoors. Any framerate above 5 FPS is completely manageable for me, but I've noticed that changing my graphics settings or screen resolution does absolutely nothing to change my framerate or my loading times (which are surprisingly fast, around 10-20 seconds).

    Anyone know what the problem is?
    you're not going to like my answer.

    It's your processor.

    Here's the thing with Hyperthreading: it's not actually a magical performance increaser. It can, in many cases, actually hurt overall system performance.

    The thing with the old Pentium 4 Northwood and Prescott processors is that they had exceedingly long pipelines, somewhere in the neighborhood of 20 stages in the Northwood design and 31 stages in the Prescott design. The idea is that the more stages you have in a processor pipeline, the higher you can crank the clock speed, because each stage does less work.

    Unfortunately, these deep pipelines meant that if the processor "guessed" wrong on a branch prediction, or a batch of in-flight work was errored out, canceled, or otherwise invalidated, the entire pipeline had to be flushed. That left the processor effectively stalled, doing no useful work while the pipeline refilled.

    This was a huge issue on the first sets of Pentium 4 processors, which were routinely trounced in real-world performance by the existing Pentium III and AMD Athlon chips.

    As a solution, Intel started working on SMT, or Simultaneous Multi-Threading. Most x86 processors have multiple execution units behind their pipeline. SMT lets a single physical core run two execution threads at once, interleaving them across those units, provided the two threads need different resources. Basically, you can hand an SMT-enabled processor two instructions at the same time, as long as those instructions are different enough not to fight over the same unit.
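
    To make the "two instructions at once" idea a little more concrete, here's a toy sketch in Python. It's purely illustrative and nowhere near how real scheduling hardware works (the instruction categories are made up for the example): a pretend core issues thread A's next instruction every cycle, and co-issues thread B's only when it needs a different kind of execution unit that cycle.

        # Toy model of SMT co-issue: one pipeline, two hardware threads.
        # Each cycle the core issues thread A's next instruction, and also
        # thread B's if it needs a *different* kind of execution unit.
        thread_a = ["int", "int", "load", "int", "fp"]
        thread_b = ["fp", "int", "fp", "load", "int"]

        cycles = 0
        a = b = 0
        while a < len(thread_a) or b < len(thread_b):
            cycles += 1
            issued = set()
            if a < len(thread_a):
                issued.add(thread_a[a])
                a += 1
            if b < len(thread_b) and thread_b[b] not in issued:
                issued.add(thread_b[b])
                b += 1
        print(cycles)  # prints 6: fewer than the 10 cycles back-to-back threads would take

    Run it and the two five-instruction threads finish in six cycles instead of ten; that gap is the entire selling point of SMT.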

    Intel branded this technology for its Pentium 4 processors as Hyper-Threading. It was, all in all, an attempt to make a bad processor design more efficient. The approach was enough of a dead end that Intel eventually junked the entire Pentium 4 architecture, went back to the Pentium III's short pipeline for Banias, and then evolved that design into Conroe.

    ***

    So... what does this mean to you?

    Well, the reason your processor is giving you such funky readings is that, with Hyper-Threading on, it presents itself to your computer as two processors. It isn't two processors; it's a single-core processor. That core is actually maxed out whether or not you have Hyper-Threading turned on, but because utilization is averaged across the two logical processors the operating system sees, one fully loaded core only shows up as 50%.

    Turning hyperthreading off gives you an accurate picture of how your computer is performing. It's slammed.
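
    If it helps, here's a minimal sketch of the arithmetic behind that 50% reading (Python, purely illustrative): overall utilization is averaged over however many logical processors the OS can see, so one saturated thread on a Hyper-Threaded single core tops out at half.

        # Reported CPU utilization is averaged over all logical processors.
        def reported_utilization(busy_threads, logical_processors):
            # each fully busy thread pins one logical processor at 100%
            return 100.0 * min(busy_threads, logical_processors) / logical_processors

        print(reported_utilization(busy_threads=1, logical_processors=2))  # HT on:  50.0
        print(reported_utilization(busy_threads=1, logical_processors=1))  # HT off: 100.0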

    Now, really, there are probably some other issues here, because while the Pentium 4 was a bad architecture, and while the 3 GHz processor wasn't actually that fast... I could still get 30 FPS out of a Radeon X1800 XT on a P4 at 3 GHz on a 1024*768 screen.

    So there probably is another issue cropping up. My suspicion is a BIOS or driver problem that's keeping the system from performing the way it should.

    ****

    Ergo, my first suggestion for troubleshooting would be to see if there is a BIOS update available for your motherboard.
  5. Quote:
    Originally Posted by Scyntech View Post
    Nvidia does NOT have DirectX11 yet. They will not until the 300 series hits the shelves. ATI is currently the only video card developer that does starting with the 5xxx series.
    ... ... you do... realize... that you are like... preaching to the choir here.

    I guess from this response that you just caught this little snippet of the post:

    Quote:
    Now, we're coming up to a similar situation. AMD has an entire line-up of cards coming out, from the low-end OEM market to the high-end gaming market, that are all DirectX 11 / OpenGL 3.2 compatible.

    Nvidia has a DX11 / OpenGL 3.2 compatible card for the high-end market... but not for the low-end or mid-range markets. Rather, from what we know of unofficial information, Nvidia intends to keep using the G9x architecture and its GT2x respins in the low-end market. These aren't DX11 / OpenGL 3.2 parts...

    Okay, in all fairness, we, as gamers, know that there's not actually that much visual difference between a game coded for DX9 / DX10 / DX11 / OpenGL 2.x / OpenGL 3.x. Keeping in mind that the PlayStation 3 and Xbox 360 carry GPUs that are roughly DirectX 9 / OpenGL 2.0 class, we know that all of these APIs can produce some amazing visuals. The question really becomes which can produce the best visuals at the best frame rate. The GTX line-up of cards, for the foreseeable future, is going to be able to deliver excellent frame rates with excellent image quality. Most game developers are not going to be leveraging the very latest graphics API exclusively. So far most engines, such as CryEngine, Unreal Engine, id Tech 4, id Tech 5, Source, the Torque Shader Engine, and Unigine, have offered different fallback levels of rendering.
    Again, maybe I wasn't clear enough, but I thought it was pretty obvious that I was talking about Nvidia's FUTURE LINEUP OF FERMI GRAPHICS CARDS and how that line-up is only for the high-end market, not the mass-market low-end and mid-range where the companies actually make their money.

    ****

    Quote:
    So my question is this, In order to even play Going Rogue or CoH/V: Do I sell the Optiplex and try to save for a better computer, or do I upgrade it? If I upgrade it, what gets priority? I know the power supply will needs attention if I get a card.
    Human Being pretty much already answered this. If you have a PCIe x16 slot, you should be able to add in a graphics card.
  6. just wondering... but what happens when you try an ATi card?
  7. ... really. this isn't a bad idea.

    There's just one giant problem standing in the way.

    It's known as the RIAA. Paragon Studios would have their collective rear ends sued off within minutes of such a patch going live.
  8. Quote:
    Which is why I linked the way I did. The tone of the original message I responded to was somewhat...skewed. Now maybe I was imagining it, but that sort of brand bashing (for a product that isn't even available yet) has always been like nails on a chalkboard to me, and I'm too stupid to just let it pass by unanswered.
    Oh, it was skewed on purpose, but again, apparently I wasn't clear enough on what I was skewing, or why.

    I have a distaste for vendors launching accessories for a product that's not even on the market yet, with the expectation of using that product as a springboard for sales or to further the hype surrounding it. I also have a distaste for corporations that deliberately misspend customers' or clients' money.

    I used the example of the Bugatti Veyron because it's an example of a product that turned out to be very good; it is the fastest road car in a straight line. However, it was preceded and accompanied by a rash of products that simply latched onto the Veyron name, such as an aftershave and a custom watch... accessory products that were junk.

    Now, in the specific case of Thermaltake, Nvidia, and Fermi, there are still several questions to be asked. Nobody knows yet whether Fermi is actually going to be any good for gaming. We can infer from Nvidia's reluctance to talk about clock speeds, and the continual pushing back of the launch date, that Fermi isn't exactly all it's cracked up to be. We also now know that everything shown at Nvidia's late-2009 conference in regard to Fermi was faked, and that hurts Nvidia's credibility by a large amount.

    What we don't know is whether Thermaltake licensed (paid Nvidia for) the right to slap a Fermi-Certified sticker on one of its cases, jack the price up, and make that money back from users buying the case because it's certified for Fermi.

    What we do know is that Thermaltake probably hasn't actually had any hands-on time with Fermi silicon, so we can be pretty sure the thermal requirements the case is designed to meet are based on the figures Nvidia has quoted. This poses an interesting scenario: what if Nvidia is paying Thermaltake to put the Fermi name on an existing product with a few mods, hoping that the finalized Fermi product meets those design limits, and that Nvidia can make its money back on royalties from products sold bearing the Fermi-Certified moniker?

    In this scenario we have Nvidia spending money it actually does have, but in a way it should not. Nvidia is already in enough trouble with the likes of Asus, Acer, AOpen, Foxconn, and Clevo for having outright lied about the thermal properties of the G7x, G8x, G9x, and, by initial reports, the GT2x series of chips in both desktop and mobile formats. Nvidia's lack of trustworthiness in detailing the characteristics of its chips is commonly referred to as "Bumpgate."

    Now, in all fairness, given the thermal properties of the recent high-end cards from both ATi and Nvidia, nobody in their right mind is going to try to shove a high-end Fermi card into a chassis from Dell, Hewlett-Packard, or Acer's Gateway and eMachines divisions. Most gamers interested in Fermi are probably going to have properly designed cases with proper airflow. Not really a big deal.

    However, the situation with Thermaltake does raise some other questions, such as the one I brought up in my first post on this particular subject. Does Nvidia seriously intend to make money either by licensing Fermi-Ready certifications to vendors or by collecting royalties on products sold bearing the Fermi-Ready moniker? How many more products are we going to see wearing Fermi-Ready or Fermi-Certified badges over the next three months as Nvidia and TSMC presumably ramp up production of Fermi-based cards?

    There's also another huge problem facing Nvidia right now. Fermi is a megachip, with some 3 billion-odd transistors. As far as we know, Nvidia hasn't actually designed a low-end or mid-range product off of the Fermi architecture. As is, Nvidia has only "now" released low-end and mid-range parts based on its GT2x series of chips... which is really just a re-implemented version of the architecture used in the G8x chips.

    Now, this might not mean much to the average user until I give the names of two graphics card lines from the past.

    GeforceFX
    Radeon x800

    Nvidia had originally planned to launch the GeforceFX series of cards against ATi's Radeon 9500-9700 range of graphics cards. However, ATi, with the help of ArtX, had blown Nvidia out of the water. The Radeon 9500 and 9700 graphics cards were perfectly capable of accelerating DirectX 9 games at 30 FPS, even at the then-dizzying resolutions of 1280*1024 and 1440*900. Nvidia basically had to take the GeforceFX back into the lab and add in DirectX 9 support. Unfortunately, by the time Nvidia actually got the GeforceFX out the door in the second quarter of 2003, it wasn't dealing with the original Radeon 9x00 line-up anymore... it was dealing with the Radeon 9600 and 9800 series, which were clock-bumped and more efficient. The result was a disaster for Nvidia. Its only success in the GeforceFX line-up was the GeforceFX 5200, which was a popular OEM card.

    Things changed with the next line-up of cards, though. Nvidia launched the Geforce 6x00 series in the second quarter of 2004. These cards featured DirectX 9.0c / OpenGL 2.0 support. ATi, on the other hand, was fielding the x800 and x850 line-ups... which were DirectX 9.0b.

    Okay, in all fairness, most games never actually used the additional features in either the 9.0b or 9.0c versions of DirectX... and OpenGL 2.0 pretty much only matched base DirectX 9. It wouldn't match or pass DX9.0c until the 2.1 revision, which came much later.

    Still, the marketing damage was done. Nvidia was able to market the 6x00 range's full DX9.0c compatibility and win back most, if not all, of the market share it had lost in the previous round.

    Now, we're coming up to a similar situation. AMD has an entire line-up of cards coming out, from the low-end OEM market to the high-end gaming market, that are all DirectX 11 / OpenGL 3.2 compatible.

    Nvidia has a DX11 / OpenGL 3.2 compatible card for the high-end market... but not for the low-end or mid-range markets. Rather, from what we know of unofficial information, Nvidia intends to keep using the G9x architecture and its GT2x respins in the low-end market. These aren't DX11 / OpenGL 3.2 parts...

    Okay, in all fairness, we, as gamers, know that there's not actually that much visual difference between a game coded for DX9 / DX10 / DX11 / OpenGL 2.x / OpenGL 3.x. Keeping in mind that the PlayStation 3 and Xbox 360 carry GPUs that are roughly DirectX 9 / OpenGL 2.0 class, we know that all of these APIs can produce some amazing visuals. The question really becomes which can produce the best visuals at the best frame rate. The GTX line-up of cards, for the foreseeable future, is going to be able to deliver excellent frame rates with excellent image quality. Most game developers are not going to be leveraging the very latest graphics API exclusively. So far most engines, such as CryEngine, Unreal Engine, id Tech 4, id Tech 5, Source, the Torque Shader Engine, and Unigine, have offered different fallback levels of rendering.

    Case in point: the user on these forums with the workstation versions of the 7900 GT cards. Yes, they run most of today's hit games very fast... but they do so because they aren't running those games' best image-quality rendering path. The user pretty much never noticed.

    ***

    So, why is this such a big deal if it doesn't really matter what product you buy?

    The simple answer is: it is a big deal to people who don't understand what's going on in the graphics card, or what goes on inside a game engine.

    As both Nvidia and ATi found out the hard way, offering a product at the low end that doesn't have the same feature set as the product at the high end can be very painful in a financial sense. It doesn't matter if the RadeonHD 3200 can't actually run Crysis at 1024*768 using the DirectX 10 rendering path. As long as ATi can put on the sales box that the RadeonHD 3200 can accelerate DX10 graphics, that's all users see.

    That's all the mass market cares about.

    That's also where all the money is. Nvidia and ATi make the majority of their money off of low-end parts sold under contract to OEMs and ODMs.

    ***

    Which is why the Thermaltake event irks me so much.

    If I were an executive within Nvidia, I wouldn't be spending a dime on any promotions, marketing, or rhetoric until I had working silicon in hand that I could take to Kyle Bennett over in Texas to benchmark, or send to the people who run sites like Phoronix or Mepisguides to benchmark and examine. I'd be throwing money left and right to get mid-range and low-end parts of the new architecture ready for a day-one launch alongside the high-end part.

    I, however, don't work for Nvidia, so I can only speak on what I see from outside the company. What I see is Nvidia wasting money on potential marketing stunts, rather than tending to their core business.

    And that's what just torques me off.
  9. Quote:
    Originally Posted by Emberly View Post
    Oh that sucks, so my big radeon idea might not be that great until these fixes you mentioned are in the game for real
    eh. The game works fine with Radeon cards as is.

    You don't even really need to do that much tweaking. Just leave water and Depth of Field turned off, and you'll probably never see a glitch.
  10. Quote:
    Originally Posted by Emberly View Post
    Yeah that post doesn't say what the issue is or whether turning off AA fixes it but thanks!
    ah.

    No. The graphical glitches with Water and Depth of Field are part of the underlying problem, but additional glitches occur if you turn on Depth of Field and Water regardless of whether or not you try to run AA.
  11. Quote:
    Originally Posted by Emberly View Post
    So this is quite the thread and I am not a technical kind of dude. I too am interested in upgrading my video card. I am looking at a Radeon 5770, but a friend tells me that ATI has problems in CoX because of openGL or something. I don't know what any of that means, so I am just gonna ask: are the Radeon cards a safe bet for CoX? I assume they will be OK with GR too, if they are OK with the current game.

    edit: ok so there is an antialiasing problem? If I don't use AA is everything cool?
    ... see post #15 in this thread : http://boards.cityofheroes.com/showp...8&postcount=15
  12. Quote:
    Originally Posted by Hyperstrike View Post
    I call FUD.

    If you actually read the article and the attached image, it states that the case is optimized for systems sporting Triple and Quad SLI setups (which are already notably warm-running). This means they're ridiculously overkill for single-card solutions.

    And the use of 200mm and 230mm fans in cases isn't exactly new technology here. And please note the second link especially. It shows that the case you linked to is little more than a branded rework of a pre-existing chassis.

    With this in mind, the way you've portrayed it is inappropriate.

    *coughs*

    Quote:
    Okay, I really shouldn't mock or make too much fun of this.
    ... Apparently I wasn't clear enough that the initial commentary on the case design was supposed to be a mocking joke.
  13. Quote:
    Originally Posted by Wuigly Squigly View Post
    If they take out Ultra-Mode from the (im guessing 50 buck expansion) and just give it as part of i17, or even delay it for 6 months, I hope the rest of GR is pretty good
    I'm having to go back and check that massive thread on this, but I don't think Ultra-Mode is actually part of the (retail) Going Rogue expansion.

    As I understand the situation, Ultra-Mode will be arriving with, and was paid for with, the Going Rogue development money. However, the art assets and the mode will be available even if you don't actually buy the expansion, as both Paragon City and the Rogue Isles will be receiving graphical enhancements.
  14. Quote:
    Originally Posted by Bohmfalk View Post
    I just finished (for the most part) upgrading from Vista to Windows 7. I started the game, got the brightness issue, updated my drivers, and now the game looks almost the same as before. The only issue is that now, on characters with costume pieces that used to be shiny, they just look flat now (as in many Vanguard costume pieces). Does anyone know how I can fix this?
    two questions:

    did you crossgrade to 32-bit or 64-bit?

    What's your video card?
  15. In the news today is this little gem from Thermaltake: http://www.thermaltake.com/news_deta...pid=P_00000145

    They've designed a case specifically for Fermi-based graphics cards.

    I just have to ask. How hot are Nvidia's test samples running in order for Nvidia to feel the need to work up a special chassis that can handle the heat output? Who here is going to spend $170+ (the current price of the Element V) on a new chassis that is certified to work with Fermi?

    Okay, I really shouldn't mock or make too much fun of this. The last time I saw companies making accessories for a grand, hyped product that did eventually launch, the Bugatti Veyron was involved... and hey, it goes pretty quickly in a straight line... though it's surprisingly not that good in corners.

    I am left wondering how many more of these accessory-style launches we'll see in the next three months. Will the licensing fees for using Nvidia's name be enough to offset an architecture that isn't actually the most advanced in GPU computing, is launching late, and has had three months of nothing to compete with what a competitor is already offering?
  16. Quote:
    Originally Posted by Human_Being View Post
    I'm not sure here if you meant the entire quad-core computer plus new graphics card was $90 or just the graphics card. However, if it was $90 for the card are you sure you didn't mean a GT 240? That's the Nvidia offering at that price point currently.
    Probably not. $90 is around what Best Buy is selling the GT 220 at.

    Their cheapest GT 220 is $80... which is still overpriced.

    They do list a GT 240 at $100.
  17. Quote:
    Originally Posted by Wuigly Squigly View Post
    Did they ever fix the issue ATI had with FSAA and CoH yet by any chance?
    yes and no.

    Yes, the issues have been fixed on the back end of the current graphics engine.

    No, those updates have yet to be pushed out to the game client.

    Reportedly these fixes will arrive with the Ultra-Mode updates. Unfortunately, we don't know when Ultra-Mode will arrive. I recall somebody commenting in the gigantic Post Going Rogue Information Here thread that Ultra-Mode could arrive separately from Going Rogue.

    There is also the possibility that, if Ultra-Mode is held back to launch in the free issue accompanying Going Rogue, the current engine fixes will be published beforehand.

    Either way, when Going Rogue arrives, you should get the exact same visual image regardless of whether you are running Intel, S3 Chrome, Nvidia, ATi, or presumably even XGI graphics. (Though really, is ANYBODY using a Volari?)
  18. Quote:
    Originally Posted by bzald View Post
    Excuse me? i don't understand why you went to the trouble of typing that?
    I can come up with a reason. This sort of activity reeks of griefing.

    Other players before have used logging out + activating powers in order to grief various zones, such as dominators / controllers logging out or disconnecting while in Pocket D with their pets spawned, causing the pets to go active and attack opposing players.

    Even if that's not your intent, there might be one or two applicable exploits to be found in combining certain power usages with logging out / disconnecting at the right time.

    Speaking for myself, I am a bit puzzled as to why somebody would want to log out on a self-destruct and log back in to find themselves dead. If I wanted to do that, I'd go trigger the log out after jumping into Lusca's arms.
  19. Quote:
    Originally Posted by Oxmyx View Post
    When it comes to technical computer stuff I know just enough to be dangerous. I recently purchased a new computer, a quad-core Asus 5270 and had installed in it a BFG Nvidia GT220 graphics card. I know it's after the fact but is that a decent card? I got the whole thing at Best Buy on sale for $90.
    um... not.... really.

    Xbitlabs looked at the GT 220 and came away with a not-so-great conclusion here: http://www.xbitlabs.com/articles/vid...210-gt220.html

    I know somebody who got a GT 220. I got roped into installing it after the fact. I mentioned it... here: http://boards.cityofheroes.com/showp...87&postcount=8

    Quote:
    I wouldn't... pick a GT 220 for a couple of reasons... An associate of mine picked up a GT 220 to replace a Geforce 7300. The 7300 was running on a Dell 300 watt power supply. The GT 220 wouldn't. Same card on the same motherboard ran fine with a 400watt power supply. While I'm not saying Nvidia's, you know, lying about the GT 220 running on 300 watt power supplies, there are other factors to keep in mind, such as the processor, northbridge / memory controller, hard-drive, system fans, and so on. Also, in fairness, the associate had picked up the Low Profile ECS GT 220 rather than an Asus, so there might be a brand quality factor involved as well.

    The second reason I wouldn't pick up a GT 220 is that it was hot. Really. Really hot. The associate was putting it in one of Dell's... smaller cases... and... well. Okay, I define obnoxious as a GeforceFX 5800. The ECS GT 220 wasn't exactly obnoxious... but as soon as the chassis cover went on you could tell the GPU fan had kicked into high gear. Again, this might be another brand quality issue, but as far as I'm aware from pictures, the Asus and ECS cards use the same heatsink designs.
    From a raw performance standpoint, the GT 220 has a theoretical fill rate of 5,000 MPixels/s and 10,000 MTexels/s. This puts it at around the same real-world performance as the older Geforce 9500 and 8600 GTS chips, which the GT 220 is basically yet another respin of.

    From a price standpoint, the GT 220 sells on Newegg for around $60~$70.

    Unfortunately, you can also get RadeonHD 4670s with 1 GB of VRAM in this market segment... and with more pixel throughput (6,000 MPixels/s) and more than double the texel throughput (24,000 MTexels/s), the HD 4670 had no problem trashing the GT 220 in Xbit's tests.
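
    If you're curious where those theoretical numbers come from, here's a minimal sketch (Python, illustrative only; the ROP/TMU counts and core clocks below are the commonly published specs for these two cards, not something I've re-measured). Fill rate is just the number of units multiplied by the core clock.

        # Theoretical fill rate = number of units * core clock (MHz).
        # Specs below are the commonly published figures for these cards.
        def fill_rates(rops, tmus, core_mhz):
            pixel_rate = rops * core_mhz   # MPixels/s
            texel_rate = tmus * core_mhz   # MTexels/s
            return pixel_rate, texel_rate

        print("GT 220 :", fill_rates(rops=8, tmus=16, core_mhz=625))   # (5000, 10000)
        print("HD 4670:", fill_rates(rops=8, tmus=32, core_mhz=750))   # (6000, 24000)

    The texel rate is where the 4670's extra texture units show up, and that goes a long way toward explaining the gap in tests like Xbit's.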

    Basically, if you paid $90 for it, Best Buy ripped you off.
  20. je_saist

    fastest flight

    Quote:
    Originally Posted by mintmiki View Post
    I'm thinking about making a shard build on my fire blaster and am trying to work out the means to get her to fly very fast. I hate TP travel but I also want to go quickly, even at the cost of performance. Therefore, I was wondering how you guys think I can reach the flight speed cap?
    put two SOs in Fly and one SO in Swift.

    my Ill / Storm is just 2 mph shy of the cap with two crafted Fly IOs in Swift and one crafted Fly IO in Fly.

    Since Fly can be boosted more than Swift, swapping the two around, with more slotted in Fly, would easily cap flight speed.
  21. Quote:
    Originally Posted by Detra View Post
    *clicking sounds from the stilettos can be heard as Detra apporaches the kitchen area, dressed in her black latex french maid outfit*

    Morning Neko & Rad.

    Dont you loook soooo cute Neko *smooches*

    *gives everyone big squishy boobie hugz™!!!*
    *ish squished in Detra's boobs!*

    mhy, hemmo mere, mice mo meet mou.

    *frees self*

    You should see the cute outfit I made for Fedor. A puffy shouldered blouse in dark blue with yellow polka-dots and matching ruffled petticoats... but he won't wear it willingly... *pouts*
  22. Quote:
    Originally Posted by Human_Being View Post
    Actually that's the official price point from ATI now. They bumped it up to $300 when multiple review sites reported the same thing: that they could repeatably and reliably overclock a 5850 to 5870 performance levels. With the 5870 at $400, there was no reason to buy it over the 5850. So they raised the price.
    Interesting. I hadn't caught that bit.
  23. je_saist

    CoX take 2

    Quote:
    Originally Posted by Zmoosh View Post
    The first time I played this game I came from SWG after that game was ruined. So when the ED hit I quit without a second thought, even though I was playing with some awesome peeps in my SG.
    *head-tilts*

    Um... ED hit before NGE. ED was in Issue 6 in October 2005: http://paragonwiki.com/wiki/Issue_6

    NGE was November 2005: http://www.joystiq.com/2005/11/24/st...ayer-backlash/

    I guess I'm a bit thrown on the dates, since this line implies time travel.

    Quote:
    Long story short I heard about the IO's and now I'm back on trial, and I LOVE this game. I loved it then and I love it now. What I want to know is do IO's bring back the pre ED power/fun?

    P.S. I've been playing a fire/dark corr and it feels like everything a defender should have been.
    Eh, I disagree with the opinion that the corruptor is what a defender should have been. A corruptor places more emphasis on its attacks, since attacks are its primary set.

    Defenders are supposed to place their emphasis on defending and protecting teammates, debuffing opponents, and so on. Both pretty much fill their intended roles.

    As to whether or not IOs make up for ED?

    Well, ED was pretty much good game design. Its problem was the timing. CoH was already established, with five free expansions and, counting from its 04/27/04 launch date, over a calendar year of play behind it. When I first played CoH I was struck by the enhancement design, and I outright told an NCSoft rep that being able to six-stack enhancements was just asking for players to skip over the vast array of other enhancements available... a point of view that turned out to be correct. Players wound up focusing on singular effects of powers rather than stretching out and performing different roles in the game.

    True, a large percentage of the player-base didn't like having to actually be what they were, but as sales later showed, ED didn't have the sales impact that genuine design blunders like NGE had.

    Do IOs allow players to be more powerful than pre-ED avatars?

    Well, yes and no.

    Yes, IOs can allow players to be really good at what their archetypes do. Controllers can be really good at controlling. Tanks can be really good at taking damage. Scrappers and Blasters can be more damaging, or a little bit tougher, and so on.

    IOs can also allow players to be really bad at what they do. Case in point is the recent trend I see on the forums of focusing on soft-capping defenses on an avatar. Most of the soft-capped builds I see sacrifice class-defining powers, making the avatar weaker at what it is supposed to do in an attempt to let it do something it isn't supposed to do.

    So it's a double-edged sword. IOs can be a very useful tool, but they can also turn a decent power combination into something that can't, or won't, fulfill its design role in the game.

    Yes, they can allow players to blaze past some of the power restrictions of the game. Stone tanks can remove the recharge or damage penalty of Granite Armor. Many avatars can make Hasten permanent. Ice / Rads can actually get enough endurance back to do something more than choose between controlling and debuffing.

    Is it a better system than ED? Well, the IO system builds on top of the ED restrictions, adding a complexity to the game that simply wasn't there back in 2005.
  24. Quote:
    Originally Posted by Rylas View Post
    What about the GTX 275? According to the chart, it would have quite a bit of performance over the 5770. I've had unpleasant experience with ATI in the past, so I'm dubious of going that route. Sure, I'd need to upgrade my PSU, but the power requirement isn't that much more than what I use now.
    The 275 is a bit of an odd one out, since there's no direct answer on the AMD side right now due to market demand.

    The RadeonHD 5850 outruns the GTX 285, and if you could get one at its MSRP of $260, it'd be well worth the extra cash over the GTX 275's current $230 price tag.

    However, nobody is selling the HD 5850 at its $260 price point. Prices start at $310. Okay, that's cheaper than the $350 starting price of the GTX 285 it outruns, but you are still getting price-gouged. Granted, you'd be getting price-gouged even more if you went with a GTX 285.

    So if you are looking to spend between $200~$250, the GTX 275 is your only option.

    Unless Nvidia can get a competitor to the RadeonHD 5x00 series onto the market, AMD's vendors aren't going to have much incentive to drop prices. Right now the HD 5x00 series is a cash cow with gamers willingly paying way over the suggested price for what's a better product.

    AMD's plans for the $200~$250 market are still uncertain. Depending on who you ask, AMD has two different plans for this price segment, hinging on how TSMC production holds up. Reportedly, when the 5670/5570 (Redwood) and HD 5450 (Cedar) GPUs hit, AMD could also introduce a 5830, much as it has done before. Such a card would be a downclocked version of the GPU found in the 5850, possibly with the memory controller from the 5770.

    The other option is a clock-bumped 5770 running at 1 GHz by default.

    ***

    Now, as to whether or not you should buy AMD or Nvidia?

    Well, I'm sorry, but unless you are stuck in that $200~$260 price bracket, Nvidia's not really an option. You are basically paying for a product that is overpriced and underperforming, with the industry's worst 64-bit drivers (seriously, even VIA/S3 has better 64-bit drivers for its Chrome products), from a vendor whose continued existence in the consumer market is still very much in question.

    It's your money though. Buy what you want.
  25. Quote:
    Originally Posted by NekoAli View Post
    I might be willing, but I'm pretty sure my landlord won't go for it.
    *talks over with CatFancy* ... um... what would your landlord like?