Father Xmas

  1. Yes, you probably went from 1280x1024 on your 19" to what, 1680x1050 on your 22" widescreen? So first you are slinging 35% more pixels per frame. You are also using 35% more video memory for multiple frame buffers and the z buffer.

    Also just about every review site tests video cards with all the quality settings set to the max to highlight the differences between cards. In the case of the older, original 8800 GTS where only memory size is the difference, it's the size and amount of textures that's the problem. Frequently a simple lowering of the texture quality by a notch or two can restore an acceptable frame rate as the game uses textures more suited to the amount of video memory you have. Of course as your monitor's resolution increases, smaller, lower resolution textures become more obvious.

    Sites like HardOCP review cards by adjusting game settings to attain similar frame rates and then reporting the setting differences. They also include "Apples to Apples" same-setting comparisons for those readers who can't wrap their heads around what the various game settings do, even though HardOCP includes a brief explanation of the visual differences.

    Oh, and don't say you didn't know what you were getting into. The reviews at the time, when the cards came out in Feb 2007, showed a significant impact on performance. Here are two.

    The Tech Report
    bit-tech.net

    So maybe at 1280x1024 the differences weren't so huge and the price savings were attractive; maybe it was the only way you could afford a Dx10 class card at the time. It still outperformed the 7900GTX with 512MB of memory, so it was probably a good decision at the time, just not so good three years down the line.
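    The pixel-count jump described in the first point is easy to check; here's a quick sketch (Python, just for the arithmetic):

```python
# Pixel-count jump from a 19" 1280x1024 screen to a 22" 1680x1050 widescreen.
res_old = 1280 * 1024   # 1,310,720 pixels per frame
res_new = 1680 * 1050   # 1,764,000 pixels per frame

increase = res_new / res_old - 1
print(f"{increase:.0%} more pixels per frame")  # ~35%
```

    Every one of those extra pixels has to be shaded and stored in the frame and z buffers, which is why the hit shows up in both GPU load and video memory use.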
  2. Let me toss in my 2 cents.

    More video memory doesn't automatically mean faster performance. It's not having enough video memory for a game's settings that can degrade performance.

    Now with everything else being equal, same type of memory, same video memory bandwidth, same powerful GPU, same clock speeds, just more memory on one versus the other, it would take very aggressive game settings to see the difference. Usually something like huge uncompressed textures at extremely high resolutions. If the game spends more time re-transmitting the textures needed to render the current frame, because the card doesn't have enough memory to hold all the ones required, than it spends transmitting the actual instructions for rendering the frame, then you will see a difference in performance because of video memory size.

    I also often see, on lower end video cards, similarly priced versions of a card based on the same GPU, where one uses slower memory (DDR2 vs GDDR3, GDDR3 vs GDDR5) but has twice as much of it as the one with faster but less memory (1024MB vs 512MB, 512MB vs 256MB). Don't be fooled. Faster memory nearly always wins, and when it doesn't it'll be a Pyrrhic victory, because the game settings (games in general, not just CoH/V) will have been turned up so high that the frame rate would be "unacceptable" with either card.

    Here's a nice article on the subject from Tom's Hardware.
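    To illustrate why the faster-memory card usually wins, here's a rough bandwidth sketch. The clocks and bus width below are assumptions for illustration only, not the specs of any particular card:

```python
# Peak memory bandwidth = transfer rate x bus width.
# The example transfer rates and 128-bit bus are assumed, not from a real card.
def bandwidth_gbs(mega_transfers_per_sec, bus_width_bits):
    """Peak memory bandwidth in GB/s."""
    return mega_transfers_per_sec * 1e6 * (bus_width_bits / 8) / 1e9

ddr2_slow  = bandwidth_gbs(800, 128)    # e.g. DDR2-800 on a 128-bit bus
gddr3_fast = bandwidth_gbs(1800, 128)   # e.g. 1800 MT/s GDDR3, same bus
print(ddr2_slow, gddr3_fast)            # 12.8 28.8
```

    Doubling the amount of the slow memory doesn't change that bandwidth gap at all; it only helps once a game's settings actually overflow the smaller card.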
  3. Ambient occlusion is how an object appears dimmer when it's in the shadow of another object.

    This describes it better and provides some pictures for comparison purposes.
  4. Well, ATI defined the price/performance points with the HD 4xxx and 5xxx series. But because nVidia couldn't/wouldn't follow, their prices are creeping up slightly.

    On the other hand, nVidia card makers are quietly lowering the clock speeds on some of their existing products, making long-established benchmarks a bit moot. The 9800GT was released with a 600MHz GPU clock; now I'm seeing more and more running at 550MHz. These were meant to be the "green" version of the 9800GT that doesn't require external power, but some of these cards still do.

    Same goes for the GTS 250. The original spec was a 738MHz clock, and now I'm seeing 675-700MHz ones which are a lot cheaper than the ones running 738MHz or higher. I'm not sure if this is just the manufacturers' fault or not, but be on the lookout. There wasn't an official "green" version of this card. 675MHz was the speed of the original 9800GTX. Makes me wonder if someone found a shipping container full of older GPUs, not rated for GTS 250 speeds, and is sticking them on cards.

    The new GT 220 and GT 240 cards are targeted to replace the 9500GT, 9600GSO and 9600GT cards.

    On the ATI side, the HD 57xx series fall between the GTS 250 and GTX 260 cards. The HD 5670 is less powerful than the 9800 GT but more powerful than the 9600 GT or GT 240.
  5. Custom pinouts haven't been the case for a while, Luminara. The major problem recently (as in several-year-old Dells) is the lack of a punched-out rear opening for power supplies to stick through. This is where PC Power and Cooling and others are still making money on Dell replacements. Newer Dells have the rear panel punched out, which bypasses that issue.

    As for slim line cases, I have absolutely no idea what to look for from a size and mounting PoV, so I can't really assist.
  6. Not related to your problem, but who the heck sold you a system with 6GB of RAM and a 1GB video card but a 32-bit OS? That's quite a bit of memory you aren't seeing (only 2.5GB of the six).

    Looks like the 9.12 ATI drivers, not sure with or without the hotfix.
  7. Quote:
    Originally Posted by EmperorSteele View Post
    I already have a 650w Antec power supply i got last year, so I'm presuming that'll be adequate for any power supply needs i may have.

    But the 240 is LOW? Dang, from what I've read it blows my current card out of the water and runs modern games great.. I would hope that it could run an old game with some modern bells and whistles good, too!

    Hmm, if i get some cheaper memory, i can probably slide with getting a slightly more expensive GPU (a 250)... won't give me EVERYTHING, but i won't be on the low end fro ultramode anymore, so that'll be good =)
    The GT 240, even the one with GDDR5 memory, is at best the same performance level as the 9600GT. Don't take my word for it.

    AnandTech
    Tom's Hardware
    X-Bit Labs
    TechPowerUp
  8. spit take

    Ah, no.
    No.
    Hell no.

    Ultra Mode is for players who have beast machines and gamers who may be attracted to this game purely due to the "pretty" factor. It's entirely optional and if you don't have the hardware to run it, no big deal, the game will simply look as it does today.

    First, you are going to need a Dx10 class of card. Not that the game is Dx10, but you need the GPU hardware a Dx10 card brings: a bucket load of configurable streaming processors for all the complex shaders Ultra Mode will be using. And it's not just any Dx10 card, but one that's mainstream or higher relative to today's standards. So HD 47xx, HD 48xx, HD 57xx, HD 58xx, 9800GT or GTX 2xx.

    Of course to feed such beasts you will need a reasonable CPU, a true dual core should do. An old P4 even with HT isn't really an option.

    And all that Windows 7 will bring, assuming it's 64-bit, is access to all 4GB of RAM.
  9. It's a mindset thing. Think "I'm playing/creating a super hero/villain" and not "I'm playing an MMO how do I min/max for this game", at least at character creation time. Once you have an idea of what your hero/villain is going to be like, then worry about min/maxing them.
  10. Well the guy's on a trial during double XP weekend. He'll be hitting 14 extremely quickly so why not take the opportunity to roll up a variety of AT/power set combos to get an overall better idea of game play and the differences in each AT?

    That was the point I was so ineloquently trying to make.
  11. Quote:
    Originally Posted by EmperorSteele View Post
    So there's no harm going 4Gs, then? Hmm, I'll have to consider it then, thanks! *gives prezents!*

    Oh, but the computer should still run Ultra Mode fine, right? Or is that still unknown at this point?
    Well if Posi's recommendations are correct, the GT 240, even with GDDR5 memory, is a bit on the low side. Problem is anything more powerful will require external power and that brings up the whole issue about having a decent PSU.

    But since we don't know the actual impact of Ultra Mode on performance, even on its minimum settings, relative to current performance, as well as what you may consider to be an acceptable resolution and frame rate, there may be a combination of settings that you would feel OK with.
  12. You can put 4 GB in, you (well, the OS) just won't see it all until you go with 64-bit Windows 7 (64-bit anything actually, but why anyone would want to install Vista or 64-bit XP is beyond me).

    You are spending $90 on RAM anyway, so why not spend $20 more and get 4GB (2x2GB) of DDR3? There are several nice sets for around $105 at NewEgg, less with rebates. The 4GB of CAS 8, 1.65V, DDR3-1600 in my $1200 build is only $115, which is still less than the $30 a GB you were paying.

    And note, at NewEgg, due to how manufacturers round, there are three DDR3 "identifiers" for DDR3-1333: PC3-10600, PC3-10660 and PC3-10666 (1333 x 8). Use Power Search to select them all when you go looking. Also, if you keep the search in the 1.5V (DDR3 default) to 1.65V (Core iX "max" voltage) range, you can reuse these sticks on an Intel Socket 1156/1366 motherboard if you ever find yourself upgrading to Intel.
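    The three PC3 names all come from rounding the same number; a quick sketch of the arithmetic:

```python
# DDR3-1333 actually runs at 1333.33 MT/s (4000/3); each transfer moves
# 8 bytes on a 64-bit module, so the peak rate is ~10666.67 MB/s.
# Manufacturers round that to 10600, 10660 or 10666 for the PC3 label.
rate_mts = 4000 / 3        # true DDR3-1333 transfer rate in MT/s
peak_mbs = rate_mts * 8    # 8 bytes per transfer

print(round(peak_mbs))     # 10667
```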
  13. Try another AT and power set combo. This game is about alts and playing with different power sets.
  14. Quote:
    Originally Posted by NovaThunder View Post
    I got this computer here the link let me know what other video Card it can take?

    http://www.thesource.ca/estore/produ...ine&tab=2#more
    Not sure. It will depend on the PSU.

    Acer normally ships a 300 watt unit in their Aspire line. Not sure how many watts at 12 volts. You wouldn't be able to use any of the GPUs Posi recommends with that sized PSU in any case.
  15. Quote:
    Originally Posted by EmperorSteele View Post
    Okay, Ultramode has me salivating, but unfortunately, my current box (2.1g proc, 2 gigs ram, 6600GT vid card) can barely handle CoX at good settings (solo i get about 30 fps at 1024 x 768 w/o AA or AF... forget team content!).

    But then Techreport (.com) did their system guide for 2010 and the econobox looks very nice. I made some modifications to fit my own desires, and here's what i came up with:

    http://secure.newegg.com/WishList/Pu...umber=17853108

    Basically, it's a AMD Athlon II Quad core at 2.8GHz, 3 gigs of DDR3 ram, and a Geforce GT 240 (512 mb GDDR5). The vid card has impressive numbers and good reviews, and I was wondering/hoping that it would be good enough for all the Ultramode goodies. Confirm/deny?
    Ah, three 1GB sticks are going to force the memory into single channel mode, halving memory throughput. Are you sure you want to go that way?


  16. And they say color blindness isn't a handicap.

    If I am a winner, I permit NC Interactive, Inc. and NCsoft Europe Limited to use my name, likeness, photograph, hometown, and any comments that I may make about myself or this contest that I provide for advertising and promotional activities. I also certify that I am at least 13 years of age and am eligible to participate in this contest.
  17. The problem with 2XP weekends right when an Issue drops is the unexpected wide-scale bugs that frequently crop up upon Issue releases.

    If you're going to have an open house, do you do it after you've had time to clean up, or do you have it first thing after a wild party?
  18. Quote:
    Originally Posted by El_Toronado View Post
    The GTS 250 is indeed a power hog! I just looked at the power specs and just about choked! 450W Minimum from the power supply, and 24 amps (!!!) on the +12V rail!
    You do realize that's for your whole system and not just the GTS 250. The GTS 250 tops out at around 150 watts when pushed to the max which is about 13 amps at 12 volts. That leaves 11 amps for the CPU and the various drive motors and case fans.

    But since there are a lot of very cheap PSUs that are rated at 450 watts but only have 18-22 amps at 12 volts, video card companies have to be very conservative with their minimum PSU requirements to keep the RMAs down.
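    The amps arithmetic above is just watts = volts x amps, rearranged:

```python
# Current drawn at 12 V for a given power draw: amps = watts / volts.
def amps_at_12v(watts):
    return watts / 12.0

card_draw = amps_at_12v(150)   # GTS 250 pushed to the max: 12.5 A
left_over = 24 - card_draw     # headroom on a 24 A, 12 V rail
print(card_draw, left_over)    # 12.5 11.5
```

    That leftover roughly 11 amps is what has to cover the CPU, drive motors and case fans, which is why the card makers quote a whole-system number.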
  19. I'm going to say you have the ATI driver that strobes OpenGL games installed. Grab the latest ATI drivers (10.2 I think) and you should be fine.
  20. Quote:
    Originally Posted by Mr. DJ View Post
    well I was looking at a Antec TruePower 650w, talked to a couple friends and they believe it should fit in my mini tower hp a6040n
    One moment as I check ...

    As the service manual is downloading (dial-up) let me tell you my concern. Most of the larger power supplies are exactly that, larger. In the case of the TP-650, it's 1 cm longer. Doesn't seem like much but if HP used some friction latch to help hold the PSU in place you will run into complications.

    Done downloading and unlike Dell or Gateway, it doesn't include how to replace the PSU or have a clear picture or drawing that I can reference. Grrr.

    Well, it appears that the case uses the standard mounting holes for screwing the PSU to the rear of the case. It looks like the PSU fan will be facing down when mounted, so that's good. I would crack open the case, take a ruler or tape measure, measure 6"/15 cm from the rear of the case, and see if there are any obstructions inside the top or on the side near the current PSU. If not, you are good to go with the new one.
  21. Game card

    And if you do get a box edition, DO NOT install it; just apply the code to your trial account. You already have the whole game. Installing from the DVD or CDs (depending on how old a box edition you get) will simply overwrite your current install, and then 90+% of the game will need to be patched back up to the current version, which will take some time.
  22. 5770 or 5870

    Is the HD 5870 worth the $100 over the cost of a HD 5850? I don't think so.

    X-Bit Labs has a review comparing the HD 5850 to the HD 5870 (and the GTX 285). The summary page has four charts at the bottom showing the relative performance of the HD 5850 compared to the HD 5870, at two resolutions, with and without AA/AF, across 15 games, at stock clock speeds, OC'd to the same speed as the HD 5870, and at maximum OC. For a card that costs 25% less than the HD 5870, they are seeing roughly 15% less performance at stock clocks.
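    Put those two percentages together and the value argument is clear; a quick sketch using the ~25%/~15% figures above:

```python
# HD 5850 vs HD 5870 at stock clocks: ~25% cheaper, ~15% slower.
perf_ratio  = 1 - 0.15     # relative performance of the HD 5850
price_ratio = 1 - 0.25     # relative price of the HD 5850

value = perf_ratio / price_ratio - 1
print(f"{value:.0%} more performance per dollar")  # ~13%
```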
  23. 5770 or 5870

    Quote:
    Originally Posted by Flameshot View Post
    It is? Hmm. My bad then. I was going by this. Guess I read it wrong. Thanks for correcting me FatherXmas
    Just because they're in the same "bin" doesn't mean they have exactly the same performance.

    Of the five single GPU cards listed there, they are ordered roughly

    HD 5770 = HD 4870 < GTX 260 (216SP version) < HD 4890 < GTX 275

    The difference between the GTX 275 and the HD 5770 is around 25%.
  24. I consider DoT to be standard MMO lingo, much like AoE, Mez, and Mob. Of course that doesn't help the first-time MMO player much.
  25. new laptop specs

    You can't really upgrade a video card in a laptop. But try to find an updated video driver from either Samsung or Intel.

    As for game settings, turn on the advanced graphics settings, turn off/down bloom and depth of field, and drop the water effects down a notch or two. Also drop the particle count to 10,000. Then there's always turning off shadows.

    Use /showfps 1 to see the current frame rate while tweaking those settings, to understand just what affects performance.