Father Xmas

Forum Cartel
  • Posts: 6048
  1. True, both the HD 5850 and 5870 are drifting upwards from their $299 and $399 MSRPs. Yes, the HD 5850 was supposed to be $269 MSRP, but that lasted about 10 seconds when nVidia didn't slash the price of the GTX 285 while the GTX 275 simply vanished.

    As long as nVidia isn't willing to play, ATI (card manufacturers, actually) can keep ratcheting up the price until the market cries uncle. And if the current rumored prices for the GTX 4xx cards are true, there still won't be a reason to start lowering them anytime soon. I'm afraid the golden era of late 2008/early 2009, when memory and video card prices were cheap, is now history.
  2. Quote:
    Originally Posted by ironsmiter View Post
    ... and a gt210.


    Well, it's faster than an IGP. It may even be as fast as your old 6800XT. And it's a G 210 or simply a GeForce 210. But hey, beggars can't be choosers.

    I'm guessing an E5200 CPU (please not the E3300). And the better CPU should also affect overall performance, not just the video card.
  3. Personally I don't think the HD 5870 is worth the extra bucks (a 10-15% improvement for 33% more money), but the price gap between the HD 5850 and HD 5870 is narrowing (due to lack of nVidia competition), so it's not as bad. But still.

    And note that the HD 5830 is priced a bit high for a modest (10-15%) improvement over the HD 5770, considering it costs over 40% more.
  4. Father Xmas

    PhysX Issue

    Quote:
    Originally Posted by Stardrive View Post
    WHY am I sadly NOT surprised?

    The game does a LOUSY job of supporting ATI cards. I got tired of black water and other ugly effects on my 4650. I solve that by going out and buying the chipset/GPU THEY RECOMMEND, only to find out the advanced feature they hype for the game is NOT supported by the GPU that THEY RECOMMEND!

    There are no words to describe how asinine this is.
    The game supported PhysX before nVidia bought PhysX. It's not the game's fault that nVidia hypes the heck out of PhysX.

    The nVidia recommended video card had to do with hardware support for OpenGL 2.0 (essentially a Dx9c card), which was added at the same time as the pre-nVidia PhysX, way back in Issue 6/City of Villains, the last major graphics revamp of the rendering engine, back in 2006.
  5. Didn't see them in Target but did see a Game Card in my "local" Best Buy.
  6. 1+4) Characters and boosters are server-based and tied to your account. You can install the game on as many systems as you have access to. You can still only be on one at a time per account, but having it installed on your laptop, your desktop, the computer at your in-laws', it doesn't matter.

    Things that are stored locally: saved window layout, chat layout, binds (complex ones that trigger loads of other binds), saved costumes from the editor, AE stories under development, custom enemy groups, and probably a few things I missed.
  7. Well, the first way is to use TweakCoH. Note that this site is being redesigned, so the TweakCoH page may move at any time. This utility allows you to change various game settings, such as resolution. It does this by changing the game's settings in the Windows registry.

    Another possible way (I haven't tried this myself since I'm traveling and can't test it) is to create a new shortcut with -screen x y added to the end of the target line of the shortcut. So, in theory, -screen 1024 768 should start the game in 1024x768 resolution. Like all command switches, there is a space before the -. I do know there is a way to start the game in a window; I just don't remember how, but I've seen other posters list the method, so maybe one of them will chime in here.

    Sorry I can't be more helpful.
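    As a concrete sketch of that second option, the Target line of the shortcut would look something like this (the install path and executable name below are just placeholders; use whatever your own shortcut already points at):

```
"C:\Program Files\City of Heroes\CityOfHeroes.exe" -screen 1024 768
```

    Note the space between the closing quote and the -screen switch.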
  8. Father Xmas

    PhysX Issue

    When PhysX support was added to this game way back when Issue 6/City of Villains came out, it was so new that what we have integrated into the game could be considered a Beta version, and it was only designed to work with the old Ageia-based PhysX card. The game wouldn't even recognize later drivers that Ageia put out that improved CPU emulation, likely because of the early nature of the PhysX API the game used. Sometimes being the first on the block with the latest gadget isn't a good thing.

    So when nVidia bought up Ageia and wrote a CUDA PhysX emulator for their Dx10 class cards, we were out of luck. The game simply doesn't know how to talk to those drivers, just like it didn't know how to talk to the last ones from Ageia.

    Simply bump the Particle Physics Quality up to the max if you have a good multicore CPU. The game will still use the PhysX CPU emulation to fling around packing peanuts, shell casings and exploding boxes.
  9. Well, as the date comes closer, the rumors are a-changing. The latest is this.

    Quote:
    GeForce GTX 480 : 480 SP, 700/1401/1848MHz core/shader/mem, 384-bit, 1536MB, 295W TDP, US$499
    GeForce GTX 470 : 448 SP, 607/1215/1674MHz core/shader/mem, 320-bit, 1280MB, 225W TDP, US$349
    All the original articles on the GF100 GPU chip listed it as

    512SP, 725/1450/2100MHz core/shader/mem

    so the GTX 480 is about 90% of what it should have been. I'm betting it's either a yield problem (they can't get enough GPUs with all 512 SPs working) or a power/heat problem. At least nVidia seems to have come to their senses: since the rumored benchmarks put it in the range of ATI's HD 58xx cards in performance, they're also pricing them similarly.

    The good news is that may mean the upward drift in HD 5850 prices will stop and reverse, if the GTX 470 actually competes with it in performance. GTX 285 prices may drop as the remaining stock sells out. I don't think we are going to see any other price shifts, however.

    Well, the NDA will lift in a week and we will see what the benchmarks say about performance. That, coupled with the MSRPs of the two cards, will determine if we see much of a shift in prices.
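    The "about 90%" figure is easy to check: raw shader throughput scales with the number of SPs times the shader clock. A quick sketch in Python, using only the rumored GTX 480 numbers and the originally reported full GF100 spec quoted above:

```python
# Shader throughput scales with (SP count) x (shader clock).
full_sp, full_shader_mhz = 512, 1450      # full GF100, per the original articles
gtx480_sp, gtx480_shader_mhz = 480, 1401  # rumored GTX 480 spec

ratio = (gtx480_sp * gtx480_shader_mhz) / (full_sp * full_shader_mhz)
print(f"GTX 480 vs. full GF100 shader throughput: {ratio:.1%}")  # about 90.6%
```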
  10. Father Xmas

    new laptop

    Well, the HD 3200 is also an integrated graphics processor. It's a bit better than the Intel version when it comes to games, but neither is what I would consider a real GPU. Recommended settings may be a bit high for either of these at their native resolution of 1366x768.

    Of the two I would pick the one with the HD 3200 (Aspire AS5517-1127) because, first, its CPU is dual core. Second, the Intel IGP drivers are somewhat glitchy when it comes to this game; ATI has its own quirks, but not as bad as Intel's. Third, it has more system memory. And lastly, it also has 64-bit Windows 7 and not Vista.
  11. Well, IGPs have progressed more toward Aero desktop support and video decoding than gaming. Don't forget that 3D gaming is a minority of what PCs are used for, even in the home.
  12. Well, the 512MB 8800GTS is essentially a lower clocked GTS 250 (650MHz vs. 738MHz GPU, 1940MHz vs. 2200MHz memory). Actually, a number of "cheap" GTS 250s are hitting the marketplace with GPU clocks set in the range of your 8800GTS or the slightly faster 9800GTX. In any case, the performance gain from your card to a true GTS 250 isn't really all that great (5-10%). That's the only nVidia card available in your price range, as the GTX 260 is in the $205-220 range. A GTX 260 (216 streaming processor version) would be about 30-35% faster than your current card.

    The HD 4890, also in the just over $200 mark is about 15% faster than the GTX 260 or about 50-55% faster than your current card.

    Between the GTS 250 and the GTX 260 you will find the HD 5750, HD 5770 and the HD 4870. The HD 5770 and HD 4870 are within a percent or two of the GTX 260, with the HD 5770 being a hair slower and the HD 4870 a hair faster. The HD 5750 is the slowest of these three; it'll be around 15% faster than your current card.

    There is also the HD 5830, but most everyone thinks it's still priced too high for its performance gain relative to the HD 5770. The HD 4890 beats it.
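    Since relative performance multiplies when you chain comparisons, the 50-55% figure above follows directly from the other two. A quick sketch in Python (the inputs are the ballpark estimates from this post, not benchmark data):

```python
# GTX 260 is 30-35% faster than the 8800GTS; the HD 4890 is ~15% faster again.
gtx260_low, gtx260_high = 1.30, 1.35  # GTX 260 relative to the 8800GTS
hd4890 = 1.15                         # HD 4890 relative to the GTX 260

low = gtx260_low * hd4890
high = gtx260_high * hd4890
# Comes out to roughly 50-55% faster than the 8800GTS, matching the post.
print(f"HD 4890 vs. 8800GTS: {(low - 1) * 100:.1f}% to {(high - 1) * 100:.1f}% faster")
```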
  13. Looks to me like you have to pick a card that's truly single wide: not just a single-wide bracket, but no "large" heatsink either. That's going to limit your choices a lot, since most manufacturers simply assume you can fit a double bracket or a large heatsink if you are choosing higher-end video cards.
  14. Old 8800GTS with 320 or 640MB of memory or the "new" 8800GTS with 512MB of memory?
  15. Father Xmas

    New Computer

    ATI cards had that "problem" with reflections.
  16. Quote:
    Originally Posted by Nurse Midnight View Post
    That's right, it is a quad core, my mistake. So basically I would also need to upgrade my processor due to this bug?
    Nah ... maybe ... I don't really know. If UM is as GPU intensive as they imply with their suggested video card requirements, it may be a lot more GPU limited than CPU limited.

    I think Posi's warning about the rest of the system has more to do with people who are still running single core systems built around a Pentium 4, an Athlon XP or an Athlon 64 and are running with less than 2GB of system memory.
  17. I don't understand what you mean by "fit the hole on the back of the case". The double bracket wouldn't slide into two adjacent slots? I could understand if the card were too long; the standard HD 5850 is an 11" or so card.

    What case were you installing this in?
  18. Father Xmas

    Ssd?

    SSDs are expensive little beasties, especially if you get one of the better ones. They still run $2-3 per GB, and there's currently a battle of ever-improving SSD controller chips going on. Personally I would wait another six months to see the fallout of that battle and find out who is winning. Maybe the competition will start to drive prices down as well.

    What SSDs buy you is the virtual elimination of the seek time of a conventional hard drive. A 9-14 ms average seek time doesn't seem like a lot, but when you include that time in disk throughput calculations, even Raptor 10K drives are 20x slower on random read/write tests than the better SSDs out there. Now, if you are running a database server, I can see the advantage an SSD brings, but for gaming? I'm not sure shaving 10 seconds off of 30-second level load times is worth the price.

    AnandTech has been benchmarking a number of the newer units as they come out. You can find their current numbers here.
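    The seek-time arithmetic is worth making explicit: a drive that needs one seek per operation can complete at most 1/seek_time operations per second. A quick sketch (the ~0.1 ms SSD access time is an assumed ballpark for illustration, not a measured spec):

```python
# One seek per operation caps random I/O at 1/seek_time ops per second.
for label, seek_s in [("HDD, 9 ms seek", 0.009),
                      ("HDD, 14 ms seek", 0.014),
                      ("SSD, ~0.1 ms access (assumed)", 0.0001)]:
    print(f"{label}: at most {1 / seek_s:.0f} random ops/sec")
```

    At roughly 70-110 ops/sec for the hard drive against thousands for an SSD, it's easy to see where that 20x random read/write gap comes from.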
  19. Quote:
    Originally Posted by Nurse Midnight View Post
    I have an AMD 9600 Dual Core 2.31 GHz Processor with 3.25GB of RAM and an ATI 4890 Graphics card. I really want to run Ultra Mode at near max power without having framerate issues. What sort of replacement parts should I be getting if I need to be ordering replacements?
    Well, an ATI HD 4890 is faster than an nVidia GTX 260, so from the graphics side of the equation you should be fine with UM at medium settings.

    As for the CPU, the only 2.3GHz AMD processor with a 9600 designation I can find is a first generation Phenom quad core (not dual core) that had the rare L3 cache problem, and the BIOS fix for it utterly killed its performance. The current Athlon II X4 630 is about 25% faster, which makes sense, as the "fix" for the broken Phenom 9x00 series basically neutered the L3 cache, leaving essentially what you have in an Athlon II X4. So from a CPU perspective you have a low-end quad-core CPU. That might hurt you a bit.
  20. (drive by post)

    Well, if I'm reading the specs for the Dell XPS 630i correctly, you have a 750 watt PSU with two 6-pin PCIe power connectors. So you have enough power for either of those cards.

    Oh, and here is the service manual. Read the sections on removing and replacing the cover and the section on cards. Also the section labeled Technical Overview includes reference drawings of the motherboard and the power supply connectors.

    The video card fits in the first PCIe x16 (long) slot, closest to the CPU. Remember to plug the external PCIe power connectors into the card, if it uses them. They are 2x3-pin rectangular connectors and may be labeled P2 and P3.
  21. I have American Princess, Red, White and Blue in the Freedom costume set, except for rocket boots. She's an AR/Dev blaster.
  22. Father Xmas

    Weird fps drop

    With an integrated video "solution" like the 9100, you really shouldn't have any of the more advanced shader effects on, such as DoF, Bloom or Water Effects. You may also want to turn your Maximum Particles number down to 5-10K.
  23. At this point they are all pretty much the same. Not happy about the price creeping up since nVidia hasn't/can't/won't lower the price of the GTX 285. They started at $299.

    And your link is busted but I think you meant this one.
  24. Considering you have a 2.0GHz Athlon 64 and an oldish ATI 9600, I would say it's not because of an underpowered PSU.

    If it's dying, that's another matter entirely.