Father Xmas

Forum Cartel
  • Posts: 6048

  1. [ QUOTE ]
    I believe Father Xmas has several builds at differing $ amounts in his sig.

    [/ QUOTE ]
    Bamf!

    You rang?
  2. It's 8:20 am Pacific time, they aren't even in to work yet.
  3. Well, the G92 was introduced in Oct 2007 with the 8800GT, which had only 112 of its 128 streaming processors (SPs) enabled, and in Dec 2007 the 512MB 8800GTS arrived. Since the G92 was their 9th generation GPU, those cards should have been called the 9800GT and 9800GTX from the start, but the problem was that the G92 didn't really exceed the performance of their top end 8th generation GPU, the 8800GTX/Ultra. Actually, at very high resolutions the 512MB 8800GTS was a bit worse due to lower memory bandwidth and less video memory. So nVidia waited until they had their $600 video card, the 9800GX2, in March 2008 before they hung the 9 on the name of the original G92 cards.

    Now the 9800GX2 is sort of cheating to have as your high end card. People were looking for a boost in single-GPU performance like when the 8800GTX replaced the 512MB 7900GTX. But that wasn't coming until the G200 GPU arrived in June 2008, only 7 1/2 months after the G92 came out and only 3 months after they formally introduced the 9xxx series. And while the G200 did seriously outperform the 8800GTX/Ultra and the G92, including the 9800GX2 twin GPU card, it wasn't anything revolutionary. They simply increased the number of SPs per cluster to 24 from 16, increased the number of clusters to 10 from 8, and doubled the width of the memory bus to 512 bits so all those SPs could be fed (the arithmetic is sketched at the end of this post).

    Now nVidia may have had midrange G200 based parts on the drawing board, built around 4 to 6 of the 24 SP clusters with a 128 or 256-bit memory bus, but ATI dropped the HD 4850 and HD 4870 a few days after nVidia's G200 based cards were introduced. What shocked the market was that ATI's previous top end card, the HD 3870, was barely competition for the 8800GT/9800GT, yet suddenly the HD 4870 was outperforming the 192 SP version of the GTX260 at a price point of $300, $100 less than the GTX260, and the HD 4850 easily beat the 9800GTX while priced at only $200, also $100 less than the 9800GTX. So from a consumer standpoint, in the same span between the introduction of the G92 and the G200, ATI not only nearly caught up but also redefined the price/performance point for video cards.

    nVidia was forced to lower their suggested prices for the GTX260 and GTX280 right out of the gate. So nVidia needed to punt. A 4 to 6 cluster (96 to 144 SPs) G200 based part wouldn't provide much improvement over the mature G92/G94 parts (the G94 is found in the 9600GT). It took a year for nVidia to refine the G80 into the more profitable G92, so why bother developing a "new" part that would perform just like the "old", cheaper to produce part?

    As it was, nVidia had to tweak the GTX260 by enabling a 9th 24 SP cluster, for 216 SPs, just to stay ahead of the HD 4870 (which caught up some by going to 1GB of memory). nVidia couldn't waste time developing a refinement of the G200 beyond a straightforward die shrink, so they are pushing ahead to the G300, rumored to have a whopping 384 SPs with GDDR5 memory like the HD 4770, 4870 and 4890 have.

    Currently nVidia is fighting ATI at the high end with the GTX295, a twin GPU card; the GTX285, an overclocked GTX280; and the GTX275, which is half of a GTX295, or a GTX280 with a GTX260 memory bus to choke performance. ATI introduced the HD 4870 X2 twin GPU card, which easily outperforms the GTX280 and GTX285, and the HD 4890, which competes nicely with the GTX275.

    So do I blame nVidia's marketing department for renaming the same set of video cards twice? No. I've long assumed, and direct experience has proven, that marketing does not relate to our reality of facts and specifications; they simply look forward to the next artificial "season". Do I blame their engineering department? No. They had every indication that they were at least a year ahead of ATI, and I'm sure they were planning a refinement of the G200 with midrange versions of the design. They were simply caught the same way AMD was when the Core 2 arrived: they didn't expect their competition to leapfrog ahead the way they did.
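
    A quick sketch of the SP arithmetic mentioned above (the per-cluster and cluster counts are the ones from this post; nothing else is assumed):

    ```python
    # Total streaming processors (SPs) = SPs per cluster x number of clusters.
    g92_sps = 16 * 8     # G92: 16 SPs per cluster, 8 clusters   -> 128 SPs
    g200_sps = 24 * 10   # G200: 24 SPs per cluster, 10 clusters -> 240 SPs
    print(f"G92: {g92_sps} SPs, G200: {g200_sps} SPs "
          f"({g200_sps / g92_sps:.2f}x), fed by a 512-bit vs 256-bit memory bus")
    ```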
  4. Nah, it's just that ATI finally showed up with a design that competes well with nVidia; market forces did the rest. Amazing what real competition can do to a stagnant market. Same is true with Intel and AMD now that the Phenom II is out.

    ATI has nVidia over a barrel; the G200 GPU is too big and costs too much to make compared to the R7xx series of GPUs. nVidia is relying on the G92, which is nothing to sneeze at, to provide a midrange alternative (128 SP vs 240 SP, 256-bit wide memory vs 512-bit) rather than issue a similarly reduced GPU based on the G200.
  5. Well, speak of the cigar-chomping devil.

    Honestly, I haven't yet upgraded from nVidia's series 7, Dx9 cards (a 7900GS to be specific). I don't play many other 3D action-oriented games and my 80 lb monitor is still limited to 1280x960. But I read reviews of the latest and greatest CPUs and video cards nearly every day.

    I'm waiting on the next big full rebuild, which will be targeting Vista SP3... I mean Windows 7. I'm looking at the Core i7-920 since I do coding as well as play a few games.

    It's frightening how much faster and cheaper video cards have gotten in 3 years.
  6. Sadly, however, it appears that nVidia and Intel are negotiating, negotiating with lawyers that is, as Intel wants to prevent nVidia from marketing the chip to Atom customers, as well as over letting nVidia create a chipset for the Core i7 and nVidia licensing their SLI for use in Intel chipsets.

    Whether that stops Taiwan from putting the two together anyway, who knows. Acer showed a Mac mini style nettop at CES a month ago, and many of the rumor sites indicate ION netbooks will reach the market from Asus, Acer and others later this quarter. If not Atom, then VIA has a similar CPU called the Nano that may end up coupled to a 9400M in what is nicknamed the ION 2.
  7. nVidia released the 512MB 8800GTS back in Dec 2007.

    Then, three and a half months later, bumped its clocks up a few percent and renamed it the 9800GTX in Q1 2008.

    Then another three and a half months later they bumped up the clocks by another 10% and called it the 9800GTX+ in July 2008.

    Finally, after seven and a half months, they renamed the card the GTS250. The GPU is now a smaller, lower-power version of the same GPU they have had since Dec 2007, the G92.

    Compared to the original 512MB 8800GTS, the GTS250's clocks are 13% faster, it is available with twice the memory, it uses less power (only one 6-pin power connector is now needed) and it is 60% cheaper.

    The price drop, the clock speed boost, the doubling of memory are all the result of ATI's HD 4xxx series arriving on the market.
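
    As a rough check on how those two bumps compound (treating "a few percent" as about 3%, which is my assumption; the 10% figure is from above):

    ```python
    # Compound the two clock bumps: ~3% (assumed) then the stated 10%.
    base = 1.00          # original 512MB 8800GTS clocks, normalized
    gtx = base * 1.03    # 9800GTX: "a few percent" bump, assumed ~3%
    plus = gtx * 1.10    # 9800GTX+: another 10% on top of that
    print(f"total clock increase: {(plus - base) / base * 100:.1f}%")  # ~13.3%
    ```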
  8. There is a checkbox on the updater labeled "Safe Mode". This temporarily sets the graphics options and resolution to their lowest. You can then, once you enter a server, set your graphics options.
  9. Power Supply

    I would say that with a 250 watt total, 180 watts at 12 volts PSU, a baby quad core and an 8600GT, you are pretty much at the PSU's limit.

    Even eVGA's "low power" 9600GT, which is underclocked by about 10% but doesn't require a PCIe power connector, recommends 300 watts and 18 amps at +12 volts.

    So if you really want to upgrade, I think you are going to need to add $50-75 to your budget and include a larger power supply, assuming your Acer box uses a standard-size ATX12V PSU.
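
    To put rough numbers on that (a quick sketch using only the wattages quoted above):

    ```python
    # Compare the existing PSU against the 9600GT's recommended minimums.
    psu_total_w = 250      # existing PSU: total watts (from above)
    psu_12v_a = 180 / 12   # 180 W at 12 V works out to 15 A on the +12 V rail
    rec_total_w = 300      # eVGA's recommendation: total watts...
    rec_12v_a = 18         # ...and amps at +12 V
    print(f"+12 V rail: {psu_12v_a:.0f} A available vs {rec_12v_a} A recommended")
    print(f"total: {psu_total_w} W available vs {rec_total_w} W recommended")
    ```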
  10. Well, you've answered your own question. There isn't an nVidia alternative for that level of general performance at that price point.

    BillZBubba's sticky thread pretty much sums up the problems with ATI and CoH/V. As of his last post in that thread in Feb 2009, it appears nothing has changed in regards to AA and the advanced shader effects like Water, DoF and Bloom.
  11. Power Supply

    Well, on paper the 9600GT has double of everything the 8600GT has. Here is a review of the 9600GT comparing it to a number of low-end cards, including the 8600GT. The 9500GT is the equivalent 9 series model to the 8600GT.

    OP, you need to crack open your PC and look for the specification sticker on the side of your power supply. The two things to look for are the total watts of the power supply and the number of amps available at +12 volts. There should be a little table that lists each voltage with the number of amps for each under it.
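
    As a rough illustration of what that sticker table means (the amp figures below are invented for the example; read the real ones off your own label):

    ```python
    # Watts available on a rail = volts x amps, per the sticker's table.
    # These amp ratings are made-up examples, not any particular PSU's.
    rails = [(3.3, 20), (5.0, 22), (12.0, 15)]
    for volts, amps in rails:
        print(f"+{volts:g} V rail: {amps} A -> {volts * amps:.0f} W max")
    # The per-rail maximums usually add up to more than the PSU's total
    # rating, so the total watts figure still matters on its own.
    ```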
  12. The E7xxx series can't run that feature. Doesn't have Intel's VT enabled.
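
    If you want to check whether a particular CPU has VT, one minimal way (this sketch is Linux-only; a Windows utility such as CPU-Z will report the same thing):

    ```python
    # Look for the "vmx" flag in /proc/cpuinfo (AMD's equivalent flag is "svm").
    with open("/proc/cpuinfo") as f:
        has_vt = any("vmx" in line.split()
                     for line in f if line.startswith("flags"))
    print("VT-x present" if has_vt else "VT-x not present")
    ```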
  13. /loc in the chat bar. The coordinate order is a little non-intuitive to those who don't program 3D graphics for a living.

    ParagonWiki has a handy diagram explaining the order and direction.
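
    The quirk, roughly, is that 3D engines usually treat Y as the vertical axis rather than a map axis. A small sketch of reading a /loc triple under that assumption (the values are invented; see ParagonWiki for the actual axis directions):

    ```python
    # Assumes the common 3D graphics convention: Y is altitude, not north/south.
    x, y, z = -1504.0, 16.0, 220.5   # made-up example /loc output
    print(f"east/west: {x}, altitude: {y}, north/south: {z}")
    ```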
  14. Once more from the top.

    You can try changing what port the game updater is using to connect to the update servers. The default port happens to fall into the default range that many BitTorrent clients use, and who knows if some switch between you and the update servers is shaping traffic in that port range.

    This article from the PlayNC support area shows you how to change the port. You can stop the updater at any time; it is smart enough to pick up where it left off.
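
    If you want to sanity check whether a given port even gets through before touching the updater, here's a minimal sketch (the host name is a placeholder; substitute the actual update server):

    ```python
    import socket

    # Returns True if a TCP connection to host:port succeeds within the timeout.
    def port_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(port_reachable("updates.example.com", 80))  # placeholder host
    ```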
  15. Worst comes to worst, use TweakCoH to change the resolution.
  16. [ QUOTE ]
    Just to reiterate, YES we can.

    [/ QUOTE ]
    Ah, not quite. That's changing the port the updater uses for patches. You can't change the ports the actual client uses.
  17. No, it is automatic now.

    And welcome back.
  18. Reactivation week. Everyone and their mother-in-law is updating the game to play for free. Toss in the new accounts purchased through Steam, the new box version in stores and trials due to all the press about the Mission Architect, and the update servers are a little overworked right now. Take a deep breath and relax.
  19. The mission is available at Level 20 in CoH through the City Rep standing inside City Hall in Atlas. In CoV it's available from whoever your first contact was.

    At level 30 you can unlock Auras the same way.
    Personally I'll wait until the netbooks that use the nVidia 9400M chipset arrive, as the Intel chipset that all netbooks currently use is sad for gaming. This configuration, an Intel Atom with a 9400M, nVidia has dubbed the ION platform.
  21. Dang, we sure seem to burn through community reps around here since Cuppa.

    Good luck Ex on all your future endeavors.

    Stay frosty.
  22. For graphics, this game uses OpenGL (like Doom, Quake, Prey, etc.). To my knowledge the only major difference between Dx9 and Dx10 centers around the 3D graphics portion of Dx10.

    Microsoft with Vista tried to make the OS more stable by forcing the device drivers, especially graphics, to run in a mode where if they fail, they can be restarted and not take down the OS. And while Microsoft also tried to convince the PC gamer ecosystem (developers and players) that Dx10 is the cat's meow, other parts of Vista weren't all that friendly to the way developers coded games in the past or to the use of multiple video cards. Vista is a lot more stable now than when it first came out, and graphics drivers are a lot more stable now than when Vista first came out. But don't think for one minute that all the backward compatibility issues have been addressed.
  23. Guide to getting yourself a forum avatar

    I use Photobucket myself for storage and IrfanView to edit and scale it, but as some point out, MSPaint can be used just as well.
  24. [ QUOTE ]
    Nemesis staff and Blackwand root whether or not the power actually goes off.

    [/ QUOTE ]
    Actually Nemesis Staff, Blackwand and Lost Curing Wand will root if you are standing still and your target is either out of range or not within line of sight. However, if you are moving, there is no rooting even if the target is out of range or not within line of sight.

    Also, drawing weapons with Dual Blades seems to alternate between drawing a single weapon and drawing both.

    Note: I assume all of you posting here who have also encountered any of these "new" problems not in Ex's first post are /bug(ging) them.
  25. Damn, that's the second time in two days I posted on the wrong thread.