Ultra-mode video card shopping guide
Also, a video card is not the ONLY thing you should take into consideration. Your PC having a decent processor and memory will also enhance your performance. As we continue to test and as more information becomes available, we will update you. I hope this helps, and happy holidays!
|
While I've no doubt about i5s and i7s, I was more curious about AMD's offerings these days. I'm debating upgrading my poor old C2D 6300 @ 1.8GHz, and the Athlon X4 620 is looking good.
Ya, that's the board I currently have. I was just wondering what the PCIe 2.0 meant over it just being an x16 PCIe slot.
I would hate to buy an expensive card and not use it to its full potential.
I have one of the D975XBX motherboards: http://www.intel.com/products/motherboard/D975XBX/
As far as I can tell, Intel doesn't actually specify what's different between the D975XBX and D975XBX2. Never mind, found it. The 2 model added several processors that really should have been added in a BIOS update to the first motherboard... D975XBX 1 Processor List / D975XBX 2 Processor List. The only reason you'd buy one is if you wanted to run a really early Core 2 Duo with Crossfired Radeons, since this was one of, if not the, first Intel board to carry Crossfire support. It does not, however, support SLI as later and more modern Intel motherboards do. I'd also have a hard time suggesting this board on its own merits. The BIOS for the original model is rubbish, and since it looks like Intel just re-released the board with an updated BIOS for new processors instead of... you know... actually releasing the BIOS so that owners of the original board could use the new processors... I'm somewhat doubtful that the 2 model is any better. Getting a SATA drive to boot from the system was difficult. And it chewed through more power than it really should have for its feature set. *** If I misread this and you actually have one rather than just looking at buying one: yes, its PCI-Express slots will support Nvidia cards. Just not in SLI. |
Ya, that's the board I currently have. I was just wondering what the PCIe 2.0 meant over it just being an x16 PCIe slot.
I would hate to buy an expensive card and not use it to its full potential. |
Ya know, I have to ask: "What constitutes a decent processor these days?" While I've no doubt about i5s and i7s, I was more curious about AMD's offerings these days. I'm debating upgrading my poor old C2D 6300 @ 1.8GHz, and the Athlon X4 620 is looking good. |
For the purposes of gaming at resolutions of 1280*720 (720p) or 1440*900, two common resolutions for low-end LCDs, even Socket 754 or late Pentium 4 processors can deliver enough punch for most games.
For the purposes of gaming at resolutions like 1680*1050, 1920*1080 (1080p), or 1920*1200, processors from the Intel Core 2 Duo lineup and the AMD Socket 939 and AM2 dual-core CPUs can deliver enough back-end power.
Right now, with most shipping games running on Socket AM3 Phenoms or Intel i5/i7 processors, you'll run into the limit of your graphics card long before you run into the limit of your processor.
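To put some rough numbers behind that resolution guidance: higher resolutions multiply the pixels the GPU has to shade each frame, while the CPU-side work per frame stays roughly the same. A quick bit of illustrative Python over the resolutions mentioned above (not a benchmark):

# Pixel counts for the resolutions above, relative to 720p.
base_w, base_h = 1280, 720
for w, h in [(1280, 720), (1440, 900), (1680, 1050), (1920, 1080), (1920, 1200)]:
    pixels = w * h
    print(f"{w}x{h}: {pixels:>9,} pixels ({pixels / (base_w * base_h):.2f}x the 720p pixel load)")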
From a price-for-performance standpoint I have a hard time suggesting people buy Intel processors and motherboards. Earlier in the thread I was talking about the Intel D975XBX. That board was a contemporary of the Asus M2R32-MVP, as both were launched in 2006. The M2R32-MVP cost... well... quite a lot less than the Intel board. I think I paid around $130 for the Asus board, and the Intel board was somewhere in the neighborhood of $250 if memory serves correctly. The Asus board had more PCIe bandwidth for each PCIe slot (both slots were 16x; the slots on the Intel were 4x, 8x, and 16x, and if you ran in Crossfire it was 8x/8x). The Asus board used less power. And so on and so on.
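As an aside, this also speaks to the earlier question about what "PCIe 2.0" means compared to a plain x16 slot: the generation sets the per-lane rate and the slot width multiplies it. A rough sketch using the usual ballpark figures (~250 MB/s per lane per direction for PCIe 1.x, ~500 MB/s for PCIe 2.0, before protocol overhead):

# Approximate per-direction PCIe bandwidth by generation and slot width.
per_lane_mb = {"PCIe 1.x": 250, "PCIe 2.0": 500}
for gen, mb in per_lane_mb.items():
    for lanes in (4, 8, 16):
        print(f"{gen} x{lanes:<2}: ~{mb * lanes / 1000:.1f} GB/s each way")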
What really got me, though, is that as AMD introduced new processors, Asus pretty much kept updating the BIOS to the literal physical limits of what the board could handle. That M2R32-MVP of mine is currently running an overclocked Phenom 9600.
The Intel board, as noted earlier, never had a BIOS update for new processors. Rather, as I found out when researching the board name for this thread, if I wanted to use newer Intel processors, I would have to buy a completely new $200+ motherboard... with the exact same physical hardware. I'm sure some accountant somewhere is sitting there, nodding his head, saying "Yeah, that's how we make money." I'm sitting here going that's how you *CENSORED* your customers!
***
Now, we do know that AMD intends to keep the ongoing Socket Compatibility that they have right now for future processors. The upcoming Bulldozer will have a new socket, but reportedly will also work on currently shipping Socket AM3 motherboards.
So if you bought an AM3 motherboard today, there's some assurance that just like Socket AM3 processors will work on most Socket AM2+ motherboards, consumer oriented Bulldozer designs will work on most Socket AM3 motherboards.
You don't get that kind of... assurance with Intel. Rather, you get the opposite. Case in point is the recent release of LGA 1156. Rather than re-using the existing LGA 1366 and maintaining socket compatibility across processors, Intel simply released a new socket design. Which is rather rotten if you want to stop and think about it for a minute.
***
So if you were buying now, I'd push you towards a Socket AM3 motherboard with a Socket AM3 processor.
I'm just not sure which motherboard I'd push you to.
DFI burnt me pretty badly with a couple of their Intel i7 motherboard offerings that refused to run in triple-channel memory mode, so I'm a little hesitant to point out the nice-looking DFI LANParty DK 790FXB-M3H5. Sure, all of the DFI AMD boards I've had have been awesome, so maybe it was just an Intel thing or something. Asus makes a wonderful quad-Crossfire board... but it's almost $200, which is Intel board territory. MSI has the 790FX-GD70... but it's an MSI... and every single MSI board I've had with an AMD chipset has been complete and utter *CENSORED*.
I'd actually be tempted to live with the ASUS M4A79XTD EVO. Yeah, it's only got dual 8x PCIe slots, so you do lose a bit of bandwidth if you run in Crossfire mode. It's also available for around $110, so it's a bit of a deal for its feature set.
I rebuilt most of my puter over the summer and I have to say the difference between my old GeForce 7600 and the new GTX 260 is startling. I also upgraded my motherboard from an Asus M2V to an MSI K9N SLI V2 (for future upgrades), went from 1GB of RAM to 2GB, and swapped my power supply from a 200-watt weakling to a 750-watt octopus on steroids. >8) I also added a 2TB external HD. I should be good to go.
My next upgrades will be Windows 7 Ultimate 64-bit, 4GB of RAM, a quad-core processor, and a Blu-ray burner.
Give it a few months. Prices for things like this drop very rapidly. By the time GR is released, these cards will probably cost significantly less.
|
The price of graphics cards is usually driven perpetually-downward by the back and forth between Nvidia and ATI. Right now though, that situation doesn't apply for either company.
Nvidia is in a very nasty developmental position right now. Their new G300 processor is delayed until at least late spring/early summer of next year; it was supposed to have debuted before the ATI 5xxx cards a few months ago. Early sample reports about their performance are not encouraging either.
Meanwhile, the company is bleeding money on the GTX 2xx series cards introduced around this time last year. The G200 processor is fairly huge, larger than the contemporary ATI processor in the 4xxxs, and is relatively expensive to produce because of that. But the 4xxxs not only equaled or outperformed the GTX 2xxs, they undercut them on total price as well. Nvidia had to reduce the prices on the GTX 2xxs to the point they were basically unprofitable in order to maintain their leading market share. And Nvidia has a sales guarantee with its card producers wherein Nvidia will pay the difference if the producer ends up having to sell their cards for less than Nvidia's reference MSRP. I've seen it described as Nvidia having to wrap each chip in a $20 bill for someone to be willing to take them out of the warehouse.
Now the 5xxxs have hit and Nvidia still doesn't have an equivalent offering. To prevent having to lose even *more* money from a price decrease in response to the flatly superior 5xxxs, Nvidia told its card producers that they were "having production difficulties" and started to taper off distribution of GTX 2xx chips. The GTX 2xxs have been made "artificially scarce" to keep their price up. Then Nvidia announced a few weeks ago that they were End Of Lifing the GTX 260, 275, and 285; what's in the production chain *right now* is all there will ever be of these cards.
So Nvidia has to keep prices of their current cards up while they struggle for a new product, have constructed a situation to make that happen, and are now stopping production of the cards entirely to staunch the bleeding.
So no, the price of Nvidia cards isn't likely to drop between now and next spring.
For ATI's part, they are currently dominating the market with superior products, and Nvidia has no equivalent competing product on the immediate horizon. However, there's a catch: they have a superior product... but there aren't any around to sell. The same company (Taiwan Semiconductor Manufacturing Co.) that is fabricating Nvidia's G300 chips is also making ATI's 5xxxs on the same manufacturing process... and it's suffering similar production problems. There are mid-range 57xx boards available, but there are virtually no 58xxs. Indeed, the price of the 5850 has gone UP $50 since it was released in September. There's a rumor that supply will get better around Dec 15th, but we'll have to wait and see.
ATI has a free run at the market, with Nvidia sitting on its hands, and they can't take full advantage of it because of supply problems! Add in that ATI is carrying a large amount of structured debt they have to service, and they have no reason to reduce prices, every reason to keep them up, and real logistical problems forcing them to keep it that way.
So no, ATI won't be dropping prices on their cards any time soon either. (They will however, be dropping low-range versions of their new cards on the market in the first quarter of next year.)
If you want a cheaper card, I recommend trying to catch a holiday sale of some sort.
As a related note, I'd like to mention that Nvidia has just announced release of a G310 card. Do not buy this thing thinking it is part of a new high-performance line and will be better than a 2xx card! The G310 is a re-named G210 (which was a piece of junk) that Nvidia hopes will sound attractive because "300" is higher than "200". It's the same thing they did with the GTS 250 actually being a renamed 9800 GTX+; still using an older G92 chip rather than one of the newer G200s despite the name. The more I learn about Nvidia's business practices, the less I like the company.
If you're looking to build a true low-end budget PC (that will still play games OK), this is a fine suggestion. But given that you can find Core i7 920s for $200 (as of four months ago), and they run circles around any AMD processor, if you're not looking at a total budget system, I'd get the Core i7 920.
|
However, the i5 750 gives most of the performance of the i7 920, currently costs $200, and will require much less money for a motherboard and RAM than an i7 920 would. That's also a possibility if you're working inside a strict budget.
Ya know, I have to ask: "What constitutes a decent processor these days?"
While I've no doubt about i5s and i7s, I was more curious about AMD's offerings these days. I'm debating upgrading my poor old C2D 6300 @ 1.8GHz, and the Athlon X4 620 is looking good. |
I hate to get involved in religious wars, but here is some objective information for CPU comparison.
For general performance:
http://www.tomshardware.com/charts/2...tage,1394.html
And here we go for gaming-related:
http://www.tomshardware.com/charts/2....0.2,1396.html
Don't count on it!
As a related note, I'd like to mention that Nvidia has just announced release of a G310 card. Do not buy this thing thinking it is part of a new high-performance line and will be better than a 2xx card! The G310 is a re-named G210 (which was a piece of junk) that Nvidia hopes will sound attractive because "300" is higher than "200". It's the same thing they did with the GTS 250 actually being a renamed 9800 GTX+; still using an older G92 chip rather than one of the newer G200s despite the name. The more I learn about Nvidia's business practices, the less I like the company. |
Ok, I've been thinking about it, and have decided to post the main equipment I plan to purchase for my new system (CPU, GFX, ETC) to get a feel for the performance it will have from someone more experienced in this area. I'm a diagnostician, not a power PC builder.
So with that in mind:
CPU: AMD Phenom II x4
http://www.newegg.com/Product/Produc...82E16819103471
GFX Card: EVGA 02G-P3-1185-AR GeForce GTX 285
http://www.newegg.com/Product/Produc...82E16814130486
GIGABYTE GA-MA790X-UD4P AM3/AM2+/AM2 AMD 790X ATX AMD Motherboard
http://www.newegg.com/Product/Produc...82E16813128387
Any thoughts? Will this system stack up to GR's Ultra Mode and handle the load? I know that, according to Posi, the card will handle Ultra Mode at full tilt, but I want to make sure before making any purchases that the rest can handle it.
And, of course, I have a compatible case, PSU, etc. picked out, so if you need more info, just ask!
"Iron defenses and a crappy attitude do not, a tanker, make."
Proud Leader and founder of The Gangbusters Super Group and The Madhouse Villain Group: Ask me about becoming a member!
I hate to get involved in religious wars but here is some objective information for cpu comparison
for general performance http://www.tomshardware.com/charts/2...tage,1394.html here we go for gaming related http://www.tomshardware.com/charts/2....0.2,1396.html |
Ok, I've been thinking about it, and have decided to post the main equipment I plan to purchase for my new system (CPU, GFX, ETC) to get a feel for the performance it will have from someone more experienced in this area. I'm a diagnostician, not a power PC builder.
So with that in mind: CPU: AMD Phenom II x4 http://www.newegg.com/Product/Produc...82E16819103471 GIGABYTE GA-MA790X-UD4P AM3/AM2+/AM2 AMD 790X ATX AMD Motherboard http://www.newegg.com/Product/Produc...82E16813128387 Any thoughts? Will this system stack up to GR's Ultra Mode and handle the load? I know that, according to Posi, the card will handle Ultra Mode at full tilt, but I want to make sure before making any purchases that the rest can handle it. |
GFX Card: EVGA 02G-P3-1185-AR GeForce GTX 285 http://www.newegg.com/Product/Produc...82E16814130486 |
If the price point is no object, then you could also grab an ATI 5870 for the same money ($400-$420) and get something that flatly outstrips the GTX 285 head to head. Since the 5850 and 5870 use the same chip, the available dies are all going to the 5870 and it's a little easier to actually find a 5870 for sale.
Unless there's something bizarre about the implementation, I would expect the ATI cards to run at their maximum strength in CoX; as Bill Z pointed out to me, the ATI sticker has replaced the Nvidia one at the bottom of the CoH homepage. We don't have empirical data that the 58xxs will perform that well with CoX, but as they are using 48xxs as a baseline, the 58xxs are a scale-up of similar architecture, and ATI is now the official graphics partner for CoX, it's quite likely to be so. Supposedly, new 58xxs are hurtling towards market as we speak and will appear magically on Dec 15th, just in time to be sold for Christmas. If they don't appear then, they'll certainly show up in January.
So if you want the card now, it would be easier to get a GTX 285. If you wait a bit, you could probably get the same performance for cheaper or better performance for the same price.
With all that said, EVGA is an excellent company. They're kind of the Cadillac of Nvidia board makers. If you decided to go with the Nvidia card, that's likely to be a quality one.
Thank you very much for the info, HB. I can't actually make the purchases right now, so I'll be keeping an eye out for the cards you suggested. That, and I would like to see if Posi or another dev will eventually come forward with ATI information.
However, I do have a few questions.
Firstly, if I do end up going for the ATI cards, it will be my first ATI card. Is there anything I would need to know about the differences between nVidia and ATI?
Second, what company should I shoot for with the card's board? I'm leaning towards the ASUS board, since I do have some experience with their products, but any information you could give there would be helpful.
And lastly, the ATI card has 1GB of memory while the nVidia card I selected has 2GB. Will this difference be noticeable in the performance of the card?
"Iron defenses and a crappy attitude do not, a tanker, make."
Proud Leader and founder of The Gangbusters Super Group and The Madhouse Villain Group: Ask me about becoming a member!
I hate to get involved in religious wars but here is some objective information for cpu comparison
for general performance http://www.tomshardware.com/charts/2...tage,1394.html here we go for gaming related http://www.tomshardware.com/charts/2....0.2,1396.html |
The problem I'm having with ATI cards is the fact that half the games I play besides CoH have a page-long list of issues with ATI cards. Yeah, the new cards run rings around nVidia cards, but at least the nVidia cards run a lot more stable than the ATI ones. And the sticky for playing CoH with ATI cards still scares me away from buying a new one.
I hate to get involved in religious wars but here is some objective information for cpu comparison
for general performance http://www.tomshardware.com/charts/2...tage,1394.html here we go for gaming related http://www.tomshardware.com/charts/2....0.2,1396.html |
The problem I'm having with ATI cards is the fact that half the games I play besides CoH have a page-long list of issues with ATI cards. Yeah, the new cards run rings around nVidia cards, but at least the nVidia cards run a lot more stable than the ATI ones. And the sticky for playing CoH with ATI cards still scares me away from buying a new one. |
Firstly, If I do end up going for the ATI cards, it will be my first ATI Card. Is there anything I would need to know about differences between nVidia and ATI? |
Second, what company should I shoot for with the card's board? I'm leaning towards the ASUS board, since I do have some experience with their products, but any information you could give there would be helpful. |
And, Lastly, the ATI card has 1GB memory while the nVidia card I selected has 2GB. Will this difference be noticeable in the performance of the card? |
Does anyone know how well CoH does with Crossfire using AFR these days? I couldn't find anything, but I don't know if that's my search-fu being weak or there just not being much out there.
Firstly, if I do end up going for the ATI cards, it will be my first ATI card. Is there anything I would need to know about the differences between nVidia and ATI? |
Firstly: in the current implementation of the game, ATI cards have a number of "graphic glitches" in CoX. (See Bill Z's aforementioned sticky thread in the Technical Issues and Bugs section.) We've had no official word on whether these are going to be remedied in Going Rogue. It is likely that Ultra Mode will not see these problems with ATI, since they were demoing it on ATI boards and ATI is the official graphics partner. However, we can't know for certain if these oddities will go away entirely when Going Rogue is live, or if they will only disappear in Ultra Mode but remain in the standard version of the game.
Heck, for all we know, Nvidia cards might end up sporting "graphical oddities" in Ultra Mode =P.
For now though, be aware that there are quirks with using ATI in the current game.
Secondly, ATI does not support hardware-accelerated PhysX.
Wait, what does that mean? For CoX, pretty much nothing.
Ageia was purchased by Nvidia a while ago, and a proprietary physics engine got even more proprietary. All Nvidia cards of 8800 or newer vintage (except the "brand new" G210/G310 =P) will run the PhysX calculations on their GPU. If you *don't* have an Nvidia card, the PhysX gets done entirely by the CPU.
In CoX, that's not a huge burden. I've never heard of ATI users complaining of massive framerate drops when flying debris appears on their screen in-game. The PhysX calculations are light enough that the CPU can handle them as implemented, which makes sense considering the minimum hardware specifications Paragon wants to maintain. Given the processor you've suggested, you shouldn't even notice the "lack of hardware PhysX".
Where it might become important though, is in other games. There aren't very many that use PhysX, and even fewer that use them intensively. The only one I can think of, though it's definitely significant, is Batman: Arkham Asylum. The PhysX implementation in the game is very intense when set to "High". You can be using an ATI card in B:AA with PhysX effects turned on, zoom along at 60 frames per second, and as soon as you encounter volumetric smoke or flying sheets of folding paper, the PhysX will crush the frame rate until it passes. Even a high end Nvidia card will see a dip in frame rate under those circumstances.
Some people have taken to using a second Nvidia card in another motherboard slot as a dedicated PhysX processor; there's an option in the Nvidia driver control panel to enable that when there are two graphics cards present. They don't have to be the same card either. People have been taking low-end 9600 GTs or 9800 GTs, pairing them with GTX 260s, and seeing a boost in overall performance since neither the CPU nor the main graphics processor has to deal with the PhysX stuff. The weaker, older cards are plenty to handle the specialized task.
Here's the kicker though: you can do the same thing with an ATI card as main GPU! In fact, I've seen some reviews that say an ATI card outperforms the equivalent Nvidia one in B:AA while using a 9x00 GT as a dedicated PhysX unit. You don't need an SLI or Crossfire motherboard (the two companies' proprietary dual-card standards). You do have to be using either Win XP or Win 7 though; both have the required "Driver Test" mode wherein you can run two graphics drivers at the same time. Vista doesn't have this capability.
A little while ago, Nvidia caught on to this and disabled dedicated PhysX operation in their drivers unless there was an Nvidia card in the other motherboard slot. Amongst the antics of an increasingly erratic company, I find this one particularly asinine. Nvidia has essentially abandoned the high-end graphics card market, and all they have to compete on are their lower-class boards. Those same boards are going to become even less competitive in Jan/Feb when the new ATI bargain cards arrive. If people are choosing ATI performance cards for quality but could still pick up a cheap Nvidia card to run alongside them for specialized purposes, Nvidia might still make some money. But I'm not in marketing, so what do I know =P.
You can still do the different-cards trick if you use older (pre-190.xx, I think) drivers. If you are using Win 7, there is also a homemade patch (predictably) floating around that will let you use the newest Nvidia drivers while telling them "No no, that Radeon chip in the other PCIe slot is really a GeForce! Really!"
Off the top of my head, those are the two things you need to know: as currently implemented, CoX has graphics quirks with ATI cards, and a small handful of games (but not CoX) can see drastic performance fluctuations with PhysX enabled in the absence of an Nvidia card.
Second, what company should I shoot for with the card's board? I'm leaning towards the ASUS board, since I do have some experience with their products, but any information you could give there would be helpful. |
I do know that Sapphire is well respected for ATI cards. Sapphire is basically to ATI what EVGA is to Nvidia. Sapphire also has a good line of custom, non-reference-standard versions of the cards they sell, like the Vapor-X line with better cooling/quieter fan/slight overclocking.
Another good company is XFX. XFX is the never-quite-overtaking-them closest quality competitor to EVGA in the Nvidia line. About a year ago though, they looked at Nvidia, went "these guys are nuts!", and started setting up to build ATI cards as well. From looking at Newegg after following your links, I also noticed that XFX had (past tense) both ATI 5850s and 5870s available for a $60 premium today; apparently no other company had them stocked with Newegg for Black Friday. Definitely no flies on these guys.
And, Lastly, the ATI card has 1GB memory while the nVidia card I selected has 2GB. Will this difference be noticeable in the performance of the card? |
Just out of interest, any opinions on the Geforce 295? I'm looking to build a new hi-spec system (yay tax deductions!) and have seen that going as part of certain deals.
Always remember, we were Heroes.
Just out of interest, any opinions on the Geforce 295? I'm looking to build a new hi-spec system (yay tax deductions!) and have seen that going as part of certain deals.
|
These cards are the "best discrete cards in the world". They are also huge, run hot, and tend to make noise like a Basset Hound being sucked into a vacuum cleaner. (Of course, if you've got the money to buy one or more double-chip cards, you probably have the money to drop on a hefty water-cooling system.)
I would note that I think there have been three revisions of the GTX 295, one of them being two distinct circuit boards with one chip each inside the cowling instead of two chips on one board. I'm not sure what the dissatisfaction with them was, but apparently not all GTX 295s are built the same.
real quick:
Here's the kicker though: you can do the same thing with an ATI card as main GPU! |
(still reading the rest of the thread since my last post)
... and I see you actually mentioned this as I read further.
Does anyone know how well CoH does with Crossfire using AFR these days? I couldn't find anything, but I don't know if that's my search-fu being weak or there just not being much out there. |
This will likely change as of Going Rogue, as the new engine will supposedly take advantage of multi-GPU rendering.
Another good company is XFX. |
Sooo... quick note on Sapphire. Sapphire has traditionally been a bit closer to ATi than other vendors. Back when ATi was selling their own branded cards, Sapphire was the actual manufacturer. One of the things to keep in mind is that, for a long time, ATi didn't sell chips to third-party vendors.
However, I did pay for that cheapness with the RadeonHD 4850s I picked up. They have the most annoying fans since the 6600 GTs I had, and remind me quite a lot of an FX 5800 Ultra I borrowed. At full tilt it's like a banshee howling fest.
Just out of interest, any opinions on the Geforce 295? I'm looking to build a new hi-spec system (yay tax deductions!) and have seen that going as part of certain deals. |
A GTX 275 starts around $240.
A good one with 1.7GB of memory is around $320.
A GTX 295 starts around $530... well, the only one Newegg has is $530.
So you're basically paying a $50 premium to use up two slots in your computer rather than four... since I'm not aware of any single-slot GTX 275s.
Now, if you've got the cash, yeah, the GTX 295 is a freaking monster. And the one on Newegg is actually cheaper than a 4870 X2 I found on Dell.
But I can get a stock 4870 from XFX for around $154. Two of those are only $308...
I can also find Radeon HD 4890s for around $200... and two of those in Crossfire are going to set you back around $400... and you'd be running away from GTX 285s just on the single cards.
So, I wouldn't be buying a GTX 295. I can save a lot more money by being willing to sacrifice space in the case.
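For anyone skimming, here's that price math laid out in one place (all figures are the late-2009 Newegg/Dell prices quoted above, so treat them as a snapshot rather than current pricing):

# The dual-GPU options discussed above, cheapest first.
options = {
    "GTX 295 (dual GPU, one card)": 530,
    "2 x GTX 275 in SLI": 2 * 240,
    "2 x Radeon HD 4870 in Crossfire": 2 * 154,
    "2 x Radeon HD 4890 in Crossfire": 2 * 200,
}
for name, price in sorted(options.items(), key=lambda kv: kv[1]):
    print(f"${price:>4}  {name}")
premium = options["GTX 295 (dual GPU, one card)"] - options["2 x GTX 275 in SLI"]
print(f"GTX 295 premium over two base GTX 275s: ${premium}")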
Thanks for that info, that's a load off my mind.
All I have to do now is see if the brand-new power supply I bought to replace my burned-out oldie can power the awesome nVidia card I picked out, as well as the rest of the system. |
I found that I needed 42 amps and I only had 35 amps on the 12V rail. So I had to take back the PSU and find another with enough amps and watts.
Just something you should think about before buying, like I didn't.
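If it helps anyone else doing this check, here's a minimal back-of-the-envelope sketch, assuming those figures are combined +12V rail amperage; the 42A and 35A numbers come from the post above, and your own card's requirement comes from the GPU vendor's spec sheet:

RAIL_VOLTS = 12.0

def rail_watts(amps):
    # Power the +12V rail can deliver: P = V * I
    return RAIL_VOLTS * amps

required_amps = 42.0  # what the card maker asks for on +12V (figure from the post)
psu_amps = 35.0       # what the first PSU's label offered (figure from the post)

print(f"Card wants   ~{rail_watts(required_amps):.0f} W on +12V ({required_amps:.0f} A)")
print(f"PSU delivers ~{rail_watts(psu_amps):.0f} W on +12V ({psu_amps:.0f} A)")
print("OK" if psu_amps >= required_amps else "Not enough +12V amperage -- swap the PSU")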
So far, the multi-threaded and multi-core overheads appear to be trivial.
Just a reminder: it's not a simple case of CoX using two threads and therefore two cores. CoX is actually using 15 threads, most with very little CPU usage relative to the primary game client thread. But starting a demorecord, say, kicked a previously idle thread into just under 1% utilization, as did turning on 3D sound. I did have six threads with measurable CPU usage running simultaneously under some test conditions. Win7, as I mentioned previously, seemed to be doing a good job of isolating those six threads to the four even-numbered CPUs (which are the non-hyperthreaded ones) without pegging any of them (the busiest, CPU zero, only reached about 70% utilization at maximum, and I did have other things running at the time).
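For anyone who wants to eyeball this on their own machine, here's a rough sketch using the psutil Python library; it is not the tooling behind the numbers above. The process name "cityofheroes.exe" is an assumption, so substitute whatever your task manager shows, and note it lists accumulated per-thread CPU time rather than instantaneous utilization:

import psutil

TARGET = "cityofheroes.exe"  # assumed client process name

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == TARGET:
        print(f"Logical CPUs: {psutil.cpu_count()}, physical cores: {psutil.cpu_count(logical=False)}")
        # Each entry is (thread id, accumulated user time, accumulated system time).
        for t in sorted(proc.threads(), key=lambda t: t.user_time, reverse=True):
            print(f"thread {t.id:>6}: user {t.user_time:8.2f}s  system {t.system_time:8.2f}s")
        break
else:
    print(f"No process named {TARGET} found.")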