Father Xmas

  1. Father Xmas

    New Rig Question

    Let me point out a few things here.

    The motherboard is NOT nVidia SLi compliant. If you were ever planning to go that route, that motherboard isn't the way to go. It isn't a good way to go for ATI CrossfireX either, because the second video card slot is actually only wired for x4 V1.0 PCIe while the main video card slot is x16 V2.0 PCIe (twice as fast per lane as V1.0). So there is an 8x difference in how fast the CPU can talk to each card.
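
    To put rough numbers on that 8x figure (a back-of-the-envelope sketch, assuming the usual ~250 MB/s per lane for PCIe 1.0 and ~500 MB/s per lane for PCIe 2.0):

    ```python
    # Approximate per-lane PCIe bandwidth, MB/s
    V1_PER_LANE = 250   # PCIe 1.0
    V2_PER_LANE = 500   # PCIe 2.0

    main_slot = 16 * V2_PER_LANE     # x16 V2.0 -> ~8000 MB/s
    second_slot = 4 * V1_PER_LANE    # x4  V1.0 -> ~1000 MB/s

    print(main_slot / second_slot)   # 8.0 -> the 8x gap mentioned above
    ```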

    Looks like you spec'd this at CyberPowerPC/BuyXG (same place, same fax, different names). Nothing obviously bad about your selections, just some oddities. You have a full size tower case, great if you have or plan to have a ton of hard drives. However you are using a uATX motherboard. It sort of sticks out like a Hummer driven by a family of "little people". Just seems odd to me, that's all.

    Honestly now, your choice of sound card is just to get the "SoundBlaster" name, isn't it? That card isn't going to affect gaming performance, and unless you are an audiophile I doubt you'd notice any difference in audio quality. Save your money; you can always buy a sound card later at NewEgg and plug it in yourself. And speaking of audio, unless you have no speakers right now, that $15 2.1 speaker system (the price difference between it and none) tells me you aren't an audiophile.

    Not a fan of the 700 watt power supply from an "unknown" manufacturer. It's probably good enough for now, but a PSU is as important to a system as the motherboard. Would you buy a "generic" motherboard?

    And do you need 8GB of memory today? Yes, the more the merrier, but this is an easy thing to upgrade later.

    I also think the noise reduction mats aren't going to be too helpful on a case with top, bottom, side, front and back fans/fan mounts. I'm all for anti-vibration mounts for fans and PSUs since they can help get rid of rattle.

    So there are my comments and criticisms on your build. You are caught in the trap of a limited budget and big desires. You want the top shelf CPU, memory and video card and are willing to sacrifice everything else just to get within budget, rather than compromise on the major components to bring the general quality up.

    If you "downgrade" to a GTX 275 and 4GB of main memory and get rid of the sound card you will free up over $250 that you can use to get a better motherboard with actual SLi support (Gigabyte UD4P), a better quality power supply, a better set of speakers or maybe even a legitimate Windows 7 license.
  2. Father Xmas

    Engine Question

    Just because you upgraded the heating system in your house from oil to gas doesn't mean every room is now wired for cable or Gb ethernet.

    Power customization, for instance, had very little to do with the graphics engine and a lot more to do with storing an alternate animation selection or color tint for every power in every costume slot on a character. Then, when compositing the scene, those selections have to be looked up for every character, applied, and the result fed to the graphics engine to render.
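
    Purely as an illustration of the kind of per-costume-slot lookup described above (a minimal sketch; the names and structure are made up and are not the game's actual code):

    ```python
    # Hypothetical sketch: one customization entry per (costume slot, power).
    power_customization = {
        # (costume_slot, power_name) -> alternate animation and color tint
        (0, "Fire Blast"): {"animation": "alt_fx_1", "tint": (255, 128, 0)},
        (1, "Fire Blast"): {"animation": "default", "tint": (128, 0, 255)},
    }

    def look_up_customization(costume_slot, power_name, default=None):
        """Fetch a character's choice for one power before handing it to the renderer."""
        return power_customization.get((costume_slot, power_name), default)

    print(look_up_customization(0, "Fire Blast"))
    ```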
  3. Because vacuum cleaners in general are huge generators of static electricity. Yes, there are the little ones used to suck the chuff out of a keyboard, and they are safe outside of the case.

    When I clean out my or my parents' computer I bring it to the basement and blast out the dust down there. Yes, immobilize any fan before you blast it with air. After I blast the dust bunnies out and remove the computer, I then vacuum the work area.
  4. Clicking on mid's in the tag list reveals around 40-50 threads tagged with mid's. Not so many that it'll take forever but enough that it'll take 5-10 minutes to do.
  5. Father Xmas

    video cards

    Quote:
    Originally Posted by GibberingLunatic View Post
    Going Rogue Might be moving CoX to the OpenCL platform instead of the proprietary PhysX system.

    Having a Card thats able to support it right off the line might be a good idea.
    I'd like to know where you heard that. Wait, I found your previous posts on the subject.

    The problem is the game's physics engine would need to be replaced unless nVidia chose to release an OpenCL version of PhysX. The problem with that is it would cause nVidia to lose a unique marketing point that they have been beating the trades over the head with, since they had very little else to talk about for months. Considering that nVidia's current crop of drivers disables PhysX support if an ATI card is also found in the system (for those who wished to use ATI for graphics but nVidia for PhysX), it doesn't seem they are ready to share PhysX support yet.

    Of course this is moot since the version of the PhysX API the game uses, to the best of my knowledge, doesn't work with CUDA based PhysX either. Now if the devs state that they are swapping out PhysX for an OpenCL based physics engine (Bullet perhaps?), great. Just don't jump to the conclusion that OpenCL is coming here just because the GR demo was done on an ATI card, or because of reports that an NCSoft studio (it's a big company, lots of development studios) is looking at OpenCL for a different game.
  6. Father Xmas

    video cards

    Intel doesn't want anyone to make chipsets for the Socket 1366 or 1156 series of CPUs other than Intel. I think this is also true about the laptop versions of those CPUs. This locks nVidia out of the "but my integrated graphics stomps your integrated graphics into last week" laptop and desktop chipset market but they can still sell discrete laptop GPUs.

    This annoys nVidia big time, and they are also now forced to license the magic word to motherboard manufacturers that allows those motherboards to enable the SLi code in nVidia's drivers. nVidia makes considerably less money selling magic words than chips. But without SLi support, who is going to buy multiple high end nVidia cards?

    The biggest "surprise, didn't see that coming" market for high end video cards is Wall Street who are using them to build super computer trading systems that buy stock on news milliseconds before the competition and then sells to that competition for an immediate gain seconds later for instant profit.

    Rumor is nVidia may have a video card with their G300 GPU to show before the end of this year. Nothing about when you will be able to buy one, but it sounds like the first batch of G300s is allocated for Wall Street and not die hard gamers. The new price point for high end video cards, as set by ATI's HD 5870, is discouraging nVidia's manufacturers from building the top end cards because the fat profit margin there is quite thin now. So it's been getting tougher to find GTX 275/285/295 cards from big name players other than eVGA; for some reason, I'm guessing price, eVGA really hasn't cut theirs.

    ATI has hit a snag with their 5xxx series cards, two actually. First, they didn't expect them to be so popular, so they didn't order enough of the 40nm wafers the chips are made on, and second, the fabs producing them are having manufacturing problems at 40nm, so yields are horrible, making the shortage even worse. So it's tough if not impossible to find the HD 58xx series cards. The GPUs for the HD 57xx series don't seem to be in short supply, but they are literally half of an HD 58xx GPU in size and complexity.

    Basically it sucks now if you are looking at top shelf video cards.

    It's also tough to find a mid to high end video card that isn't dual slot, either explicitly (double wide bracket) or practically (single width bracket but a huge bolted-on heatsink interfering with the next slot). The problem is simple. The more powerful the GPU, the more power it uses and the more heat it produces. For an air cooled solution, the more cooling fin area you have the better. More area also generally means a less noisy/slower fan can provide the cooling. Add this all together and poof, double wide video cards starting at $100-125.

    Multiple GPU support is not automatic in games. Games aren't coded to know if multiple GPUs are in the system, but they do need to be coded so the CPU has time to kill between the last command it issued to the GPU and the GPU finishing rendering the frame. Why? Because if you have a second, third, etc. GPU in the system, the CPU can use that time to start issuing commands for the next frame to the next GPU in sequence. It also helps if you are pushing the GPU with crazy sick shaders, a large resolution (lots of pixels), as well as multisampled anti-aliasing and anisotropic filtering to maximize the time it takes the GPU to finish the frame. The more the CPU waits on a single GPU system, the more a multi-GPU system will help framerate.

    This is also why two lower level video cards show such a performance boost in multi-GPU setups. Or when the CPU is much, much faster than the GPU. Again, the longer the CPU needs to wait for the GPU to finish, the more multiple GPUs help.
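
    A toy model of the alternate-frame idea described above, with invented numbers (just a sketch of the timing argument, not anything measured from the game):

    ```python
    # Invented frame timings: CPU command time vs GPU render time per frame.
    cpu_ms = 5.0    # ms the CPU spends issuing commands for one frame
    gpu_ms = 20.0   # ms the GPU spends rendering one frame

    # One GPU: frame rate is limited by whichever takes longer.
    one_gpu_frame = max(cpu_ms, gpu_ms)        # 20 ms -> ~50 fps

    # Two GPUs alternating frames: each GPU only has to deliver every other
    # frame, so throughput doubles as long as the CPU can keep feeding them.
    two_gpu_frame = max(cpu_ms, gpu_ms / 2)    # 10 ms -> ~100 fps

    print(1000 / one_gpu_frame, 1000 / two_gpu_frame)

    # If the CPU were the bottleneck instead (say cpu_ms = 20), the second GPU
    # wouldn't help at all -- which is why it's not an automatic win.
    ```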

    This is why it's not an automatic "home run". It depends on the game. For our game I'm not sure if the GPU is simply not pushed hard, or if the engine is coded to command the GPU a bit at a time, leaving little time between the last command and the frame finishing.

    However, I'm going out on a limb here and will say that full volumetric shadows are an expensive post-process effect, along with more realistic reflections in buildings and water. I think it is safe to say that those two effects are going to load a ton of work onto the GPU at the end of each frame. Because of this I'm guessing we will see a more obvious improvement in framerates on multi-GPU rigs in the Going Rogue expansion than we do today. How much? I can't say.
  7. I was more curious about the overall lack of European women playing MMOs in general than in this game in particular. Is it a lack of need for the escapism and reinvention that a virtual avatar can provide, or are they just more "grounded" in real life?
  8. Father Xmas

    Fps Issues

    Quote:
    Originally Posted by Thug_Two View Post
    This is a real silly question, but:

    With the FPS on, there is the frame rate number, then another set of numbers in parenthesis, something like (17m/m). What do the parenthetical numbers mean?
    It's the time per frame in milliseconds. Occasionally you will see a large number with it; that's the amount of texture data loaded onto the card since the last time it was displayed.
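
    For a quick sanity check of the conversion (just the reciprocal relationship, nothing game specific):

    ```python
    # Frames per second and milliseconds per frame are reciprocals.
    def ms_per_frame(fps):
        return 1000.0 / fps

    print(ms_per_frame(60))   # ~16.7 ms
    print(ms_per_frame(58))   # ~17.2 ms -- roughly the "(17m/m)" figure quoted above
    ```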
  9. Quote:
    Originally Posted by Coolio View Post
    Also Windows 7 Ultimate has both virtual pc and virtual xp pc options.

    So you could run stuff in a native XP mode if required, which gets around most of the compatability issues with some programs on anything other than xp.
    Except Virtual XP mode doesn't support 3D hardware acceleration.
  10. Quote:
    Originally Posted by SentryVII View Post
    Hey everyone. I was wondering if it was a good idea, or even possible, to run CoH on a netbook. The specific one I am interested in is the Acer Aspire One. All the stats can be found here: http://en.wikipedia.org/wiki/Acer_Aspire_One

    I'm not looking for highly impressive graphics if I were to play on this netbook. I simply want it to be able to play the game without lagging. Has anyone had any experience playing on a netbook? Please post your opinions and the make/model of the netbook you used.
    Short answer, poorly. The integrated Intel graphics found in most inexpensive netbooks are only able to run the game at minimal settings and not very fast even at those settings.

    As for bumping the memory to 2GB, it appears that Microsoft tweaked the XP license and possibly the code so it doesn't recognize more than 1GB on a netbook. I believe this is not the case with a Vista or Windows 7 netbook.

    However, a netbook like the current version of the Lenovo Ideapad S12, which uses an nVidia ION chipset for graphics, should be able to run the game better. At that cost, though, you may be better off looking at a real laptop. Also, you aren't going to find these in warehouse and big box stores.
  11. My guess: if you don't respec frequently, it's because you plan ahead, read guides, guess well, or don't care about micro-optimizing, cashing out enhancements, etc., so the free respecs we've been getting every issue and the vet respecs were enough for the casual tweak here and there. I've respec'd my main about five times in five years, most around the time of the GDN and ED. I also have something like 60 costume tokens; wish I could sell some of them.

    Honestly, I have a hard time finding players who even want to do the CoH respec trial and I'm in it for the badges. It's not merit worthy enough for merit ... ah ... nymphos to easily round up a minimum size group for the time it takes to run it.
  12. I posted this over on the Hero/Villain Culture board but I'm hoping to get a better response here.

    Saw this chart over at Blue's News today.

    My main questions over here are: where are all the MMO women from the UK, Germany and France? Comparing MMO players to PC players, in the UK and Germany only 1 in 5 PC-playing women also play MMOs, and in France about 1 in 4. In the US, as with male PC players, it's 1 in 2. Is this a cultural thing? Are hardware requirements too high relative to disposable income in those European countries?
  13. "I'm a model you know what I mean
    And I do my little turn on the catwalk
    Yeah on the catwalk on the catwalk yeah
    I do my little turn on the catwalk"

    Yeah, I'm waiting for the costume contests to include a runway style walk off.
  14. I like how this thread went from the OP dissing a player-created utility for not being able to run on an OS that this game didn't have native support for until 10 months ago, to an all-out OS/platform holy war.
  15. Quote:
    Originally Posted by DarkEther View Post
    With the I-7, wouldn't it be ebtter to get RAM in multiples of 3? So maybe get 6 instead of 4?

    Heck, with that decent budget, I'd look at adding a SSD to run maybe the OS and this game.
    Socket 1156 i7's take memory in pairs, unlike Socket 1366 i7's, which take memory in triplets.

    I would give SSDs another year or so before I feel they've got the kinks worked out. At $2-3 a gigabyte they are a very pricey alternative to the 10 cents a gigabyte of a very good terabyte drive, considering the modest improvement in general performance or even game level load times (green bars are SSDs, yellow bars are 7200RPM 3.5" hard drives, red bars are 10,000 RPM drives, blue bars are 2.5" laptop drives). 20-30 times the price per gigabyte for a 10-30% improvement in actual application tests; I don't think that's a fair trade unless you have a lot of money to burn.
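
    The back-of-the-envelope version of that price gap, using the figures above:

    ```python
    # Price per gigabyte quoted above: $2-3/GB for SSDs vs ~$0.10/GB for a
    # good terabyte hard drive.
    ssd_low, ssd_high = 2.0, 3.0   # $/GB
    hdd = 0.10                     # $/GB

    print(ssd_low / hdd, ssd_high / hdd)   # 20.0 30.0 -> the "20-30 times" figure
    ```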

    Anyway, I'm interested in seeing how SATA III (6.0 Gb/s) and an iteration or two of file system tweaking in Windows 7 improve their usefulness and performance.
  16. Quote:
    Originally Posted by Simian_Stalker View Post
    I have decided to take the plunge and build my own PC after years of just buying a new one every couple of years. This will be my first build and I will be using this PC for gaming. I thought I would ask this group if they had suggestions for the components.

    I want to use Windows 7, but pretty much everything else is up for discussion. I do not want to spend more than $3k and I would like to get the most performance for my dollar, so any help is appreciated.
    That's a lot of money; I'm not comfortable putting that much money toward a system. So here is a list of system builds from a number of sites.

    Tech Report's Windows 7 System Guide

    Ars System Guide: October 2009 Edition - This page links to their three builds.

    bit-tech.net's What Hardware Should I Buy? - Notice this is published monthly and November's is due within a week.

    Tom's Hardware Build Your Own Articles - They periodically build systems at various price points and specifications. The last set specified AMD CPUs.

    These are the ones I can think of off the top of my head. I hope they help.
  17. Another vendor is Puget Systems. It is a touch more expensive than the other two, but it actually lists the manufacturer of every component, from the power supply to the case fans. Personally I like to know the origin of my components.
  18. Quote:
    Originally Posted by Laevateinn View Post
    I'm Chinese and if someone else said that in front of me I wouldn't let them get away with it; it is neither cute nor adorable.
    As I said, that's her not-so-polite retort, usually mumbled between her teeth within earshot of friends, about comments made by those with expensive suits but narrow minds.
  19. And a friend of mine who is Chinese really hates it when someone says that she comes from the Far East. Her standard not-so-polite reply to that is "East of what, your backward continent that didn't get printing or gunpowder until 1000 years after we invented them, roundeye?" She can get away with that because she's tiny, cute and adorable.
  20. No, but you can plug a USB 1.1 device into a USB 2.0 port and have it work, and vice versa (a USB 2.0 device in a USB 1.1 port, but it will only run at the slower speed).

    When USB first arrived (1996) it was slow and was meant for devices that didn't need a lot of speed: mice, keyboards, printers, etc. What it wasn't good for was transferring a lot of data quickly, like off of digital camcorders, video capture devices or external optical and hard drives (even thumb drives).

    In 2001 USB 2.0 came around and was 40 times faster than USB 1.1 and 320 times faster than USB 1.0. When USB 3.0 hits, around now, it will increase the maximum speed by another 10x.
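
    The speed ratios work out like this, using the nominal signaling rates and taking USB 1.0 at its 1.5 Mb/s low speed rate as the post's figures imply:

    ```python
    # Nominal maximum signaling rates in Mb/s
    usb_1_0 = 1.5     # low speed
    usb_1_1 = 12      # full speed
    usb_2_0 = 480     # hi-speed
    usb_3_0 = 5000    # SuperSpeed

    print(usb_2_0 / usb_1_1)   # 40.0  -> "40 times faster than USB 1.1"
    print(usb_2_0 / usb_1_0)   # 320.0 -> "320 times faster than USB 1.0"
    print(usb_3_0 / usb_2_0)   # ~10.4 -> "another 10x"
    ```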
  21. Father Xmas

    Game down again?

    And yesterday they pushed a patch to the live servers.
  22. Quote:
    Originally Posted by Psyte View Post
    My current machine (bought in 06): Core2Duo 6300 (1.8ghz), 4g RAM, Radeon 4670 w/1g ram, Windows 7 64-bit

    I'm looking at an $800 PC: Intel i5 @ 2.66ghz, 4g RAM, GeForce 210 w/512m ram.
    Well, you have a nice kick in CPU power there, twice the cores with roughly twice the performance per core. Sadly the same can't be said for that video card, which is about 1/3 as powerful as what you currently have, so swapping the cards over is a good idea for now.

    Also it is highly likely that the power supply in that $800 system will be mediocre at best (one reason for the weak video card) so any video card upgrade should include an upgraded power supply.

    Now it seems that the updates to the graphics engine also resolve the quirks this game has with ATI graphics, at least according to Hero-con reports, which means ATI should be able to be recommended with a clean conscience. However, the video card market is always shifting; let's see what 6 to 9 months brings us in both performance and price.
  23. I know, Hyper, just pointing it out again after Flame's comment.

    Well, is USB 2.0 slower than a native internal interface? Yes.

    Is it slow enough to be annoying? Not for your audio/video/picture files. Not for data. Not even for infrequently run software. 25-35 MB/s (megabytes per second) is about 1/3 to 1/4 of the speed of a brand new, 500GB per platter, 7200RPM internal SATA drive, but compared to what came with a Dimension 8250 from 6-7 years ago, it really isn't all that slow.
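
    A rough check on that 1/3 to 1/4 figure, assuming something like 100-120 MB/s sustained for a then-new 7200RPM drive (an assumed number, not a measurement):

    ```python
    # Approximate throughput comparison
    usb2_low, usb2_high = 25.0, 35.0   # MB/s, typical real-world USB 2.0 throughput
    hdd_low, hdd_high = 100.0, 120.0   # MB/s, assumed sustained rate for a new 7200RPM drive

    print(usb2_low / hdd_high)   # ~0.21 -> roughly 1/4
    print(usb2_high / hdd_low)   # 0.35  -> roughly 1/3
    ```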