Ultra-mode video card shopping guide


5th_Player

 

Posted

Quote:
Originally Posted by Chyll View Post
OK, I am not sure what planets aligned... but the spousal unit has tentatively approved a new rig.

I am examining some Alienware setups in my comparison shopping.

Dual 1GB GDDR5 ATI Radeon™ HD 5670 CrossfireX™ Enabled
or
Single 1.8GB NVIDIA® GeForce® GTX 295
or
Single 1GB GDDR5 ATI Radeon™ HD 5870

(also an option for Dual 1GB GDDR5 ATI Radeon™ HD 5870, but don't think I can go that far)
The 5870. Hands down. Dual, at this point, won't benefit you a great deal for gaming (may not be long off though).


 

Posted

Quote:
Originally Posted by Cade Lawson View Post
The 5870. Hands down. Dual, at this point, won't benefit you a great deal for gaming (may not be long off though).
^^^ Correct sir!


 

Posted

Quote:
Originally Posted by Chyll View Post
OK, I am not sure what planets aligned... but the spousal unit has tentatively approved a new rig.

I am examining some Alienware setups in my comparison shopping.

Dual 1GB GDDR5 ATI Radeon™ HD 5670 CrossfireX™ Enabled
or
Single 1.8GB NVIDIA® GeForce® GTX 295
or
Single 1GB GDDR5 ATI Radeon™ HD 5870

(also an option for Dual 1GB GDDR5 ATI Radeon™ HD 5870, but don't think I can go that far)

Alienware, the King of overpriced Hardware!

You could build the exact same spec system for roughly 2/3 what they charge.


 

Posted

Quote:
Originally Posted by Cade Lawson View Post
The 5870. Hands down. Dual, at this point, won't benefit you a great deal for gaming (may not be long off though).
TY

Quote:
Originally Posted by F_M_J View Post
Alienware, the King of overpriced Hardware!

You could build the exact same spec system for roughly 2/3 what they charge.
Time is worth something as well, and frankly the last thing I want is to deal with all of that, especially if I screw something up. But your point is taken - after all, I said I was shopping around, and I am comparing many avenues.


City of Heroes was my first MMO, & my favorite computer game.

R.I.P.
Chyll - Bydand - Violynce - Enyrgos - Rylle - Nephryte - Solyd - Fettyr - Hyposhock - Styrling - Beryllos - Rosyc
Horryd - Myriam - Dysquiet - Ghyr
Vanysh - Eldrytch
Inflyct - Mysron - Orphyn - Dysmay - Reapyr - - Wyldeman - Hydeous

 

Posted

I would go with the HD 5870. The GTX 295 is really an SLI setup on one card, and it's still unclear whether SLI or Crossfire will make a difference.

Plus two HD 5670s (2x 400 SPs at 775MHz) offer less shader power than a single HD 5770 (800 SPs at 850MHz). The HD 5870 is 1600 SPs at 850MHz.
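For the curious, the comparison above can be sanity-checked with a quick back-of-the-envelope calculation (stream processors times core clock, using the figures quoted above). This ignores memory bandwidth, drivers and CrossFire scaling, so treat it as a rough sketch only:

```python
# Rough shader-throughput comparison: stream processors x core clock (MHz).
# Figures are the published specs quoted in the post above.
cards = {
    "2x HD 5670 (CrossFire)": 2 * 400 * 775,  # two cards, 400 SPs each at 775 MHz
    "1x HD 5770": 800 * 850,
    "1x HD 5870": 1600 * 850,
}

baseline = cards["1x HD 5770"]
for name, throughput in sorted(cards.items(), key=lambda kv: kv[1]):
    print(f"{name}: {throughput / baseline:.2f}x a single HD 5770")
```

Which bears out the point: the CrossFired 5670 pair comes in below a single 5770 on raw shader throughput, and the 5870 doubles the 5770.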


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Yes it's me again...

I've been trying to get the dimensions of the HIS HD 5750 iCooler IV 1GB (128bit) GDDR5 PCIe (DirectX 11/ Eyefinity), particularly where the cooler assembly is located relative to the ends of the card. However, trying to get answers out of the manufacturers is like getting blood out of a stone.

Anyone able to give me some pointers or able to supply the dimensions? I just need to know if it'll fit my case before I tell the shop to ship it.



"Just as I knew all of life's answers they changed all the questions!" - Unknown (seen on a poster)
Sig characters appear in the Château Rouge thread starting from post #100
I Support Nerd Flirting! - Story/Discussion/Sequel

 

Posted

Quote:
Originally Posted by Dark Shade View Post
Yes it's me again...

I've been trying to get the dimensions of the HIS HD 5750 iCooler IV 1GB (128bit) GDDR5 PCIe (DirectX 11/ Eyefinity), particularly where the cooler assembly is located relative to the ends of the card. However, trying to get answers out of the manufacturers is like getting blood out of a stone.

Anyone able to give me some pointers or able to supply the dimensions? I just need to know if it'll fit my case before I tell the shop to ship it.
Well, just by eyeballing it, it appears that it extends about the same distance on either side of the PCIe x16 slot. You can measure from the edge of the slot to the back of the case and then add the same amount (plus the length of a 6-pin plug on your power supply) on the opposite side of the slot. Regardless it should be less than the 9.6 inches of a standard ATX/micro-ATX motherboard; probably around 7.5.
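If it helps, the fit check described above boils down to a little arithmetic. Here's a minimal sketch in Python; all the numbers are hypothetical examples (measure your own case), and the one-inch power-plug allowance is just a guess:

```python
# Will the card fit? Based on the heuristic above: the card extends roughly the
# same distance on either side of the PCIe x16 slot. All values are in inches,
# and all of them are hypothetical examples - measure your own case.

def card_fits(slot_to_case_back, card_length, plug_clearance=1.0):
    """True if card length plus power-plug clearance fits the available run."""
    # Heuristic from the post: total run ~= twice the slot-to-case-back distance.
    available = 2 * slot_to_case_back
    return card_length + plug_clearance <= available

# Example: 4.5" from slot edge to case back, 7.5" card, 6-pin plug needs ~1".
print(card_fits(slot_to_case_back=4.5, card_length=7.5))  # True (8.5 <= 9.0)
```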

If you're looking at that card, I should point out that Newegg has the (higher quality) Sapphire Vapor-X 5750 listed at the same price as the HIS card ($145), but with free shipping. It also has the standard version from Sapphire on sale for $135; again with free shipping and with a limited-time checkout promo-code for an additional $15 off.


 

Posted

The card's length is 19.7cm, width of 4cm. Now if you click on the picture of the card that shows the card from the top edge (next to the last one on the right), it will blow up and display a picture that's almost as wide as your browser window. Capture that image and print it out (landscape mode). If you don't have a printer then you will be forced to measure off the screen.

Now break out your calculator and ruler. Measure the length of the card (board part, exclude the bracket). Remember that it should be 19.7cm. So calculate the scale factor of actual to measured (19.7/measured). Now go ahead and take measurements of what you are interested in. Then multiply it by the calculated scale factor, round up and you get a pretty good idea of the actual measurements.

Edit: What I get is the cooler stops at around 17.5cm (measured from the bracket); at that point it's 2.5cm wide. It widens to 3.0cm at 15.0cm, where the fan cage starts, which widens to 4cm until you're 6cm from the bracket and then drops back to 3cm. You should still do your own calculations.
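The scale-factor method above is easy to script if you'd rather not do the arithmetic by hand. A small sketch: the 19.7cm board length comes from the post, while the printout readings are hypothetical stand-ins for your own ruler measurements:

```python
# Convert measurements taken off a printed photo to real-world sizes using
# the known board length as a reference (the method described above).
KNOWN_BOARD_LENGTH_CM = 19.7  # manufacturer-stated length, from the post

def to_actual(measured_cm, measured_board_cm):
    """Scale an on-photo measurement by (known length / measured board length)."""
    scale = KNOWN_BOARD_LENGTH_CM / measured_board_cm
    return measured_cm * scale

# Hypothetical printout readings: board measures 24.6cm, cooler 21.9cm.
print(f"Cooler length: {to_actual(21.9, 24.6):.1f} cm")  # -> Cooler length: 17.5 cm
```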


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Quote:
Originally Posted by Human_Being View Post
Well, just by eyeballing it, it appears that it extends about the same distance on either side of the PCIe x16 slot. You can measure from the edge of the slot to the back of the case and then add the same amount (plus the length of a 6-pin plug on your power supply) on the opposite side of the slot. Regardless it should be less than the 9.6 inches of a standard ATX/micro-ATX motherboard; probably around 7.5.

If you're looking at that card, I should point out that Newegg has the (higher quality) Sapphire Vapor-X 5750 listed at the same price as the HIS card ($145), but with free shipping. It also has the standard version from Sapphire on sale for $135; again with free shipping and with a limited-time checkout promo-code for an additional $15 off.
Thank you very much for that find, Human_Being; maybe someone else can use it. There are a few issues with it for me: 1) I'm in Australia, and have to use a freight forwarder to get stuff from Newegg; 2) I foolishly (as I've come to realize) got a Dell, and thus have a BTX case; 3) which basically leaves me a single slot available. Also, 4) I have a store credit at Aus PC Market good for about $180, so I'm planning to shop there.

Quote:
Originally Posted by Father Xmas View Post
The card's length is 19.7cm, width of 4cm. Now if you click on the picture of the card that shows the card from the top edge (next to the last one on the right), it will blow up and display a picture that's almost as wide as your browser window. Capture that image and print it out (landscape mode). If you don't have a printer then you will be forced to measure off the screen.

Now break out your calculator and ruler. Measure the length of the card (board part, exclude the bracket). Remember that it should be 19.7cm. So calculate the scale factor of actual to measured (19.7/measured). Now go ahead and take measurements of what you are interested in. Then multiply it by the calculated scale factor, round up and you get a pretty good idea of the actual measurements.

Edit: What I get is the cooler stops at around 17.5cm (measured from the bracket); at that point it's 2.5cm wide. It widens to 3.0cm at 15.0cm, where the fan cage starts, which widens to 4cm until you're 6cm from the bracket and then drops back to 3cm. You should still do your own calculations.
Thank you for your help! That will work and should fit. I'll have to pull my case out yet again and remeasure, but going off the various pics... it looks promising.

Shader.



"Just as I knew all of life's answers they changed all the questions!" - Unknown (seen on a poster)
Sig characters appear in the Château Rouge thread starting from post #100
I Support Nerd Flirting! - Story/Discussion/Sequel

 

Posted

Been thinking about getting a card for the wife so she can go nuts with Ultra Mode. Hoping this would do it for Low-Mid: http://www.newegg.com/Product/Produc...82E16814143186 BFG Tech GeForce 9800 GT 512MB.

If you can recommend a mid- to high-end card for $100-200 that isn't HUGE (I need a card that only takes up one slot), that would be appreciated.

Caveats: I tend to shop Newegg for their customer service (been burned on PriceWatch too often), and I shop Nvidia.


 

Posted

Quote:
Originally Posted by Positron View Post
You bring up a good point: some top-end video cards require their own power cable in your case, and some even require two. Make sure your PC's power supply can handle what you are planning on putting in. In addition, some of these cards are physically huge. I know the card I ended up putting in my home PC -barely- fit and I had to remove several other things just to get it into the slot. If you find yourself in this position and are not comfortable with the inside of your PC, have a friend do the install for you, or have it professionally done.
It about killed me when I bought my 5850 and couldn't get it to fit. Then I realized I could bump my hard drive up a slot (I have six connections, and the first two weren't designed with large video cards in mind).

It takes a little while longer to boot up from the BIOS; not sure why. However, once I'm in Windows I'm gravy.

Jumped from an 8800 GTX to an HD 5850 and my framerate doubled (using Sapphire's overclocked model though, which is more like a 5870).


 

Posted

Quote:
Originally Posted by Kry8ter View Post
Been thinking about getting a card for the wife so she can go nuts with Ultra Mode. Hoping this would do it for Low-Mid: http://www.newegg.com/Product/Produc...82E16814143186 BFG Tech GeForce 9800 GT 512MB.

If you can recommend a mid- to high-end card for $100-200 that isn't HUGE (I need a card that only takes up one slot), that would be appreciated.

Caveats: I tend to shop Newegg for their customer service (been burned on PriceWatch too often), and I shop Nvidia.
What does "one slot" mean?

The 9800 GT takes up one PCIe slot, but it can be as physically tall as two slots.

So be careful to get specs on the physical size. It really depends on the manufacturer.

The 9800 GTX is guaranteed to be as big as two slots.


 

Posted

That's why you look at the pictures such as the one that's the bracket/ports view. Wider coolers will stick out and be all obvious like.

Kry8ter, sorry, can't think of one. The more powerful the card, the more power it uses, the more heat it generates, the bigger the cooler it needs. The 9800GT is pretty much the limit when it comes to low profile, single slot coolers.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Quote:
Originally Posted by Father Xmas View Post
That's why you look at the pictures such as the one that's the bracket/ports view. Wider coolers will stick out and be all obvious like.

Kry8ter, sorry, can't think of one. The more powerful the card, the more power it uses, the more heat it generates, the bigger the cooler it needs. The 9800GT is pretty much the limit when it comes to low profile, single slot coolers.
Yup. Is it the cooler that's the problem, or simply an inability to put in a double-wide card? I think I've seen some better cards that are still quite wide in the cooler but don't actually require the second rear slot. If it's the slot that's the issue but you have room for a tall cooler, something like that might work. Let us know what you're working with and we can advise better.


It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.

 

Posted

As stated by some others, if you're building a new rig solely for CoH/V, then, even with Ultra mode coming down the pipe, a dual GPU system probably won't be much help to you, as SLI/CrossfireX tends to yield less impressive results in MMORPGs than in most other games.

Now I understand the idea of wanting a dual-card setup, I have one myself, but SLI or CrossfireX really needs to be treated one of two ways: either as the starting point for a very high-end gaming system, or as an upgrade path on a mid-range system. I see people looking at building something new with dual cards at the mid-range, which doesn't really make a lot of sense. For example, the Radeon 5670s mentioned earlier in an Alienware build would be outperformed by a 5770 or 5830 for less money, with lower power consumption and heat generation. But if you already have a card like a 5670 and you're just looking to add some extra power to your graphics processing, then adding a second 5670 makes a lot of sense, as it's probably still cheaper than selling off the used mid-range card and then buying a new high-end card.

However, I feel this thread represents a pretty large failure on the devs' part: by February, they should have had a pretty exact specification needed to run Ultra Mode, instead of all this guesswork that we players have been making. And while I understand the need to stay flexible while this is under development, considering that many of their customers are considering a fairly expensive upgrade to coincide with this graphics update, I really think it was their responsibility to nail down and post an exact minimum spec.

While everyone is sort of panicking with the whole "can my system run Ultra Mode?", I'm pretty sure that Ultra Mode's requirements are going to be far less hefty than most fear: a dual-core CPU at about 2.6GHz and a GeForce 8600 or Radeon 3850 are probably all that will be required for minimum spec (most game developers shoot for specs much lower than this, and we're talking about very popular games like Mass Effect or Call of Duty; it's stupid to eliminate 70% of potential customers with obscenely high requirements). Cards in the current mid-range like the GTS 250 or Radeon 5750 will probably be all that's needed to run the game at relatively high settings with a good frame rate.

The bulk of worried posts in this thread could have been largely avoided by simply making a statement along the lines of "You WILL need at least CPU XX, X amount of RAM and card XXX from Nvidia or XXXX from ATI in order to run Ultra Mode: we are still shooting for a slightly lower minimum spec, but this is the guaranteed minimum spec required for Ultra Mode." This hasn't shown up yet, and so now we have people running around trying to shop for an unknown future, quite possibly spending money they could have put to better use somewhere else, simply because the devs have not nailed down a guaranteed minimum spec for Ultra Mode, and haven't posted that spec up in lights with arrows pointing to it for everyone to see.


 

Posted

So what is the absolute best card out there to run CoX with all the Ultra mode options on the highest settings?

If it matters, I have a 2.85GHz quad-core system with 8GB of RAM, 64-bit Windows Vista Ultimate, and a 1000W power supply, not to mention a fairly huge case, so the size of the card or the power it takes isn't a great concern.


My Mission Architect arcs:

Attack of the Toymenator - Arc # 207874

Attack of the Monsters of Legend - Arc # 82060

Visit Cerulean Shadow's Myspace page!

 

Posted

Quote:
Originally Posted by steveb View Post
As stated by some others, if you're building a new rig solely for CoH/V, then, even with Ultra mode coming down the pipe, a dual GPU system probably won't be much help to you, as SLI/CrossfireX tends to yield less impressive results in MMORPGs than in most other games.

Now I understand the idea of wanting a dual-card setup, I have one myself, but SLI or CrossfireX really needs to be treated one of two ways: either as the starting point for a very high-end gaming system, or as an upgrade path on a mid-range system. I see people looking at building something new with dual cards at the mid-range, which doesn't really make a lot of sense. For example, the Radeon 5670s mentioned earlier in an Alienware build would be outperformed by a 5770 or 5830 for less money, with lower power consumption and heat generation. But if you already have a card like a 5670 and you're just looking to add some extra power to your graphics processing, then adding a second 5670 makes a lot of sense, as it's probably still cheaper than selling off the used mid-range card and then buying a new high-end card.

However, I feel this thread represents a pretty large failure on the devs' part: by February, they should have had a pretty exact specification needed to run Ultra Mode, instead of all this guesswork that we players have been making. And while I understand the need to stay flexible while this is under development, considering that many of their customers are considering a fairly expensive upgrade to coincide with this graphics update, I really think it was their responsibility to nail down and post an exact minimum spec. While everyone is sort of panicking with the whole "can my system run Ultra Mode?", I'm pretty sure that Ultra Mode's requirements are going to be far less hefty than most fear: a dual-core CPU at about 2.6GHz and a GeForce 8600 or Radeon 3850 are probably all that will be required for minimum spec (most game developers shoot for specs much lower than this, and we're talking about very popular games like Mass Effect or Call of Duty; it's stupid to eliminate 70% of potential customers with obscenely high requirements). Cards in the current mid-range like the GTS 250 or Radeon 5750 will probably be all that's needed to run the game at relatively high settings with a good frame rate. The bulk of worried posts in this thread could have been largely avoided by simply making a statement along the lines of "You WILL need at least CPU XX, X amount of RAM and card XXX from Nvidia or XXXX from ATI in order to run Ultra Mode: we are still shooting for a slightly lower minimum spec, but this is the guaranteed minimum spec required for Ultra Mode." This hasn't shown up yet, and so now we have people running around trying to shop for an unknown future, quite possibly spending money they could have put to better use somewhere else, simply because the devs have not nailed down a guaranteed minimum spec for Ultra Mode, and haven't posted that spec up in lights with arrows pointing to it for everyone to see.
Well, first off, for the moment SLI and Crossfire are stated to not work with UM, so until they do, a dual-GPU setup is less than optimal for that reason alone. Presumably they will get it working at some point, but we don't know when. Until then, single-GPU upgrades are the way to go.

Beyond that, while they haven't specified a CPU/memory requirement for UM, Posi has listed a range of what cards they expect to do what level of UM. And thus it can be taken that the 9800GT and equivalents are the current minimum spec for UM. Also keep in mind that no one is being run off: if you can play now, you'll still be able to play under I17. UM is an *optional* feature. Anyways, until I17 at least leaves Closed Beta, I think you're expecting too much.


It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.

 

Posted

What I can't figure out is why I can run Crysis, Fallout 3, and Mass Effect 2 with all the bells and whistles on my GTX 260, but CoH is a lag fest if I turn on all the effects. And I have good net access.


 

Posted

Quote:
Originally Posted by Cerulean_Shadow View Post
So what is the absolute best card out there to run CoX with all the Ultra mode options on the highest settings?

If it matters, I have a 2.85GHz quad-core system with 8GB of RAM, 64-bit Windows Vista Ultimate, and a 1000W power supply, not to mention a fairly huge case, so the size of the card or the power it takes isn't a great concern.
Well, if money is no object, the best card out there right now, period, is the ATI Radeon HD 5970. But as it is two GPUs on one card using an internal Crossfire setup, and Crossfire doesn't support or isn't supported by (not sure which is the correct phrase) UM, it won't do as well as it possibly could. Though we can probably expect Crossfire support eventually. Below that would be the Radeon HD 5870. (The 5970 is essentially two 5870s on one board.) Comparable to the 5870 is the Nvidia GTX 480, which will be out soon. Those would be the best cards available (or nearly so). The GTX 2xx line contains Nvidia's current top cards, but as they don't support DX11 or the new OpenGL standards, I'd avoid them. The GTX 4xx line will support those, and all Radeon 5xxx cards already do.


It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.

 

Posted

I was thinking of getting the GeForce GTX 285 because Positron mentioned that one in his initial post in this thread, so thanks for the heads up re: DirectX 11. I guess money is an object to a point - I don't want to spend $700+ for the Radeon 5970 - but the Radeon 5870 or Nvidia GTX 480 look like possibilities.


My Mission Architect arcs:

Attack of the Toymenator - Arc # 207874

Attack of the Monsters of Legend - Arc # 82060

Visit Cerulean Shadow's Myspace page!

 

Posted

Quote:
Originally Posted by Back_Blast View Post
Well, first off, for the moment SLI and Crossfire are stated to not work with UM, so until they do, a dual-GPU setup is less than optimal for that reason alone. Presumably they will get it working at some point, but we don't know when. Until then, single-GPU upgrades are the way to go.

Beyond that, while they haven't specified a CPU/memory requirement for UM, Posi has listed a range of what cards they expect to do what level of UM. And thus it can be taken that the 9800GT and equivalents are the current minimum spec for UM. Also keep in mind that no one is being run off: if you can play now, you'll still be able to play under I17. UM is an *optional* feature. Anyways, until I17 at least leaves Closed Beta, I think you're expecting too much.
Actually, I haven't read a post saying absolutely that SLI/CrossfireX will not work with Ultra Mode when it's launched; although I don't expect it, as I haven't seen anything in Nvidia's or ATI's driver updates that would indicate they've re-optimized their drivers to support CoH/V in a multi-GPU setup. The most recent information I've seen around here on that dates back to February, so that's a long time for changes to have occurred. There's also an equally good chance I've missed an update, as Posi's posts on the subject are buried pretty deeply in the Dev Tracker at this point. I will readily admit to being too lazy right now to go digging past the first three or four pages for more recent news.

I think you might have misread my statement about running off 70% of customers due to high system requirements: I was actually referring to why even the most recent cutting edge games tend to have relatively low minimum system requirements. And as I stated, I expect the minimum demands of UM to be along the lines of what it would take to run a DirectX 10 only game, despite the fact that CoH/V actually uses OpenGL, which would be comprised of what was considered mid-range hardware circa 2007.

But my point is that even as an optional graphical upgrade, a guaranteed minimum spec should have been posted at least a month ago; all we've seen are "well, you can expect this sort of result with this..." statements, which are not exactly concrete information: I can throw a Radeon 5970 into an eight-year-old system with a Pentium 4 CPU and will not see anywhere near the same result as someone with the same card running on a Phenom II quad-core CPU. It's fine for the exact specification to still be in downward flux at this stage of development, but we should have been told a CPU, RAM and graphics spec that was guaranteed to meet minimum requirements for UM; it gives them room to work on a less demanding spec, but it also would give their customers a firm idea of what they should be looking for if they are considering an upgrade to allow for UM. Honestly, I'm not concerned about my rig's ability to run UM at maximum settings (or any currently existing game), but not everyone has the money available to go out and get themselves a Core i7 system with a Radeon 5870 or GTX 480, and that's why a guaranteed minimum spec should have been posted long before now.


 

Posted

OpenGL has evolved as well as DirectX, both taking advantage of the hardware available in Dx10/10.1/11 cards. Actually OpenGL had features found in Dx10 before there was a Dx10.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Quote:
Originally Posted by fallenz View Post
What I can't figure out is why I can run Crysis, Fallout 3, and Mass Effect 2 with all the bells and whistles on my GTX 260, but CoH is a lag fest if I turn on all the effects. And I have good net access.
I can currently run CoH maxed out on just an Nvidia 8800GT with zero lag (I plan to upgrade for Ultra Mode!)


Quote:
Originally Posted by eltonio View Post
This is over the top mental slavery.

 

Posted

Apple announced their new i7 MacBook Pro computers today with the following video card info:

Quote:
The graphics chip has been bumped to NVIDIA GeForce GT 330M, with a choice of graphics memory options. This replaces the NVIDIA 9400M and 9600M GT cards of the previous generation.
Any ideas/guesses as to how good the NVIDIA GeForce GT 330M would be at handling Ultra Mode?

Thanks,

Buxley



"Don't cry because it's over, smile because it happened." -- Dr. Seuss

 

Posted

Quote:
Originally Posted by Buxley1 View Post
Apple announced their new i7 MacBook Pro computers today with the following video card info:



Any ideas/guesses as to how good the NVIDIA GeForce GT 330M would be at handling Ultra Mode?

Thanks,

Buxley
The GT 330M is a tough one to nail down, as it and the 325M are the parts Nvidia has aimed at mainstream buyers, so there are a few variants available that will affect performance. It'll probably handle Ultra Mode fairly well at mid-range settings, depending on screen resolution. Don't expect to max out the settings, as the 330M is a mid-level card, but it does depend on the version being used: the 330M comes in both GDDR2 and GDDR3 versions, and the GDDR2 version can be significantly slower. Also, given that this card is in a MacBook, and Apple heavily emphasizes low power consumption, it could be the low-power version, which has a lower clock speed and thus loses some performance. I believe the card should work quite well, but you'll probably get better results playing at a resolution lower than the monitor's native resolution.

However, I'd honestly stay away from the current crop of mobile Core i7s. The clock speeds on them are very low. Yes, they have a great Turbo Boost feature, but that only boosts the speed of the first two cores when multi-threaded applications are not coming heavily into play: once you get into a program optimized for multiple threads, or into multi-tasking, any advantage from Turbo Boost disappears and you're left with a very slow quad-core processor (it's not difficult for a gamer to kill Turbo Boost's advantage: run the game, a web browser and a music program at the same time). I personally feel the mobile Core i5s are a much better value: higher default clock speeds on a hyper-threaded dual-core CPU give you the power of a high-speed quad core for a much lower price; plus, they still have Turbo Boost for when you're not running anything demanding.