Ultra-mode video card shopping guide


5th_Player

 

Posted

Quote:
Originally Posted by je_saist View Post
I suspect, however, that you meant to type:
I did mean that, yeah, sorry - I originally purchased it with XP; this was soon after Vista was released, as I remember having to specifically request that they NOT install Vista.

I upgraded to Win7 as soon as the RTM was released on MSDN.

So yeah - It was top of the range about 4 years ago, I have no idea if that means it would have an i5 or not :S


[CENTER]Euro side: [B]@Orion Star[/B] & [B]@Orions Star[/B][/CENTER]

 

Posted

And let me point out something. The original GTS 250 spec puts the core clock at 738 MHz and the memory at 2200 MHz. I notice that several of the "cheaper" GTS 250s are clocked at much lower speeds.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Quote:
Originally Posted by Father Xmas View Post
Reviewed here.
Nice writeup. Thanks. So for CoH, is NVidia still better because of the PhysX or is ATI now the thing to get? Pretty much I figure my choices based on budget will be that GTX 275 Co-OP, a GTX 280 or a 5850. They all seem to be about the same in price and (broadly) performance. Since CoH is the only game of note I play, I want whatever will do the best job in CoH. It sounds to me like the 275 co-op might be that card but I want to be sure.


It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.

 

Posted

Quote:
Originally Posted by Back_Blast View Post
Nice writeup. Thanks. So for CoH, is NVidia still better because of the PhysX
PhysX only works if you have a physical AGEIA add-in card from BFG or from AGEIA.

Some players are pressing the developers to support Nvidia's graphics-card based PhysX. I think this is a bad idea, and am one of the players pressing the developers to move to OpenCL for hardware accelerated physics.

The big problem with PhysX is that it locks you to Nvidia cards, and Nvidia has started playing nasty with their support. If you use up-to-date drivers, Nvidia forces you to run PhysX on Nvidia hardware alone, and they've dropped support for the original add-in AGEIA PhysX card.

OpenCL, however, can provide hardware acceleration on pretty much all platforms. Here's what I said in this post: http://boards.cityofheroes.com/showp...03&postcount=5

Quote:
And a lot more people will have ATi cards, Intel GPU's, or Nvidia cards in systems with Intel Integrated graphics, or ATi cards in systems with Intel integrated graphics, or Nvidia cards in systems with ATi integrated graphics, or ATi cards in systems with ATi integrated graphics, or Nvidia cards with integrated Nvidia graphics, or ATi cards in systems with integrated Nvidia graphics.

No matter how you cut it, continued PhysX support is just a bad idea for the CoH developers, or any game developers. Just drop it, move to OpenCL. Everybody's happy regardless of what hardware platform they have now, or what hardware platform they use in the future.
***

Quote:
or is ATI now the thing to get?
I'm pretty much the wrong person to ask this question to. I will be the first to admit that I'm biased against Nvidia. I have no respect for the company. I have no respect for their driver development. I have no respect for their marketing. I have no respect for their Linux stance.

Right now the Nvidia product line-up is a clear financial rip-off. For every product they currently offer, you can get more performance for your dollar by buying an Add-In AMD card, with a small caveat.

Nvidia has been caught actively sabotaging graphics code in games developed under their "The Way It's Meant to be Played" banner. Games that ship with Nvidia marketing will undoubtedly work better on Nvidia hardware, but it's not because the Nvidia hardware is better. It's because the graphics code is deliberately structured to drop to a lower rendering mode, or to render corrupted output, when running on anything but an Nvidia card. City of Heroes users have gotten a taste of this. Nvidia was a graphics partner a few years back, and as I understand it, they helped write some of the shader code for the game. Not a whole lot of surprise, then, that some shader functions that should work on graphics chips from vendors other than Nvidia don't work.

Just based on the corporate behavior that we know about... I don't think I could tell anybody with a straight face to buy Nvidia.

Unfortunately, in the current add-in card market, that means the only alternative to buy from is AMD. While I do like a lot of what AMD does... http://www.x.org/docs/AMD/ comes to mind... I've never been real happy with ATi's focus on DirectX. I've never been convinced that the ATi marketers understand that promoting OpenGL to game developers and publishers is a better long term financial option than promoting DirectX. ATi has gotten better about OpenGL since AMD bought them up... but there are still some legacy problems that are both perception based, and reality based.

Quote:
Pretty much I figure my choices based on budget will be that GTX 275 Co-OP, a GTX 280 or a 5850. They all seem to be about the same in price and (broadly) performance. Since CoH is the only game of note I play, I want whatever will do the best job in CoH. It sounds to me like the 275 co-op might be that card but I want to be sure.
Short answer: we don't have official numbers on how the 57xx or 58xx series stacks up. We do know that the performance of the 57xx cards in ultra-mode was described as excellent.

As I said earlier, I'd have a hard time suggesting you buy an Nvidia card.

Speaking for right now, you're going to pay out over $200 for a card that does not support DirectX 11 / OpenGL 3.2. I'm going to presume that you intend to use this card for 3 or 4 years before upgrading. So, you are going to spend 3 or 4 years with a card that doesn't support the graphics technologies that game developers are starting to bring online. Titles with DirectX 11 support are already arriving on the market. Those titles are playable with DirectX 11 features on RadeonHD 57xx series and RadeonHD 58xx series cards. Developers targeting these new features are using RadeonHD 57xx and RadeonHD 58xx cards as the reference platforms.

Barring anything else, buying one of Nvidia's $200+ cards in this circumstance just seems like a bone-headed move.

***

Now, if you are not in that price bracket... if you are in the sub-$100 to $130 range... yeah. The Geforce 250 GTS is tempting. It and the RadeonHD 4850 trade blows over which is faster in different games. The RadeonHD 5x00 card at this price point, the 56xx series, isn't quite fast enough to run current DX11 titles at the resolutions you might be using (1440x900, 1680x1050). Here in this price bracket, you might spend a bit more for the Nvidia brand name... but you aren't getting royally... well. You know.

****

If you are in the $140 to sub-$200 range, things are complicated again. This is the price range where you'll find the RadeonHD 57xx cards... as well as the RadeonHD 4870 and its brother, the 4890. In this price bracket, Nvidia is once again just a bad option. Okay, you can make the argument that future DirectX 11 / OpenGL 3.2 titles may not run that fast on the RadeonHD 57xx series, so you'll just stick with DX 10 / OpenGL 3.0.

However, I can buy a RadeonHD 4870 for around $170. If I can find a Geforce GTX 260, it's going to cost me $30 more new.

So, if I want to risk bare-minimum DX 11 / OpenGL 3.2 support, I can have it in this price bracket. If I don't want to risk it and just stick with DX10 / OpenGL 3.0... Nvidia's just a plain rip-off for the parts that are available.
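One way to sanity-check "rip-off" claims like this is to work the numbers out as frames per dollar using whatever benchmark you trust. A minimal sketch; the FPS figures below are placeholders I made up for illustration, not real benchmark results:

```python
# Rough price/performance comparison. The FPS numbers are hypothetical
# placeholders -- substitute results from a benchmark site you trust.
cards = {
    "RadeonHD 4870": {"price": 170, "fps": 60},    # hypothetical FPS
    "GeForce GTX 260": {"price": 200, "fps": 60},  # hypothetical FPS
}

def frames_per_dollar(card):
    """Higher is better: benchmark FPS divided by street price."""
    return card["fps"] / card["price"]

# Rank the candidates, best value first.
for name, card in sorted(cards.items(),
                         key=lambda kv: -frames_per_dollar(kv[1])):
    print(f"{name}: {frames_per_dollar(card):.3f} FPS per dollar")
```

With equal performance at a $30 price gap, the cheaper card wins on this metric automatically; the interesting cases are where one card is both faster and pricier.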


 

Posted

So basically, the 275 card is a waste since the PhysX it supports is not the same as CoH's PhysX. Good to know. Come to think of it, I have a friend who has a PhysX card and isn't using it any more, as far as I know. Perhaps I can get it and add it to the mix...

Well that pretty much leaves me at the 280 or the 5850. And as I generally understand it, the 5850 would be the better card of the two since it supports the newer DirectX and OpenGL tech as well as having better benchmarks from what I see on the web. Or I could do a pair of 5770's in Crossfire for about the same price and performance. Either of those should do well in UM if I'm reading things right. I know UM is still in flux but I'd expect we'll hear more soon and that the numbers will only get better as they optimize and tweak it up. Or am I just dreaming there? So that sounds like my best options at this point unless someone has a better suggestion?


It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.

 

Posted

Quote:
Originally Posted by je_saist View Post
Short answer: we don't have official numbers on how the 57xx or 58xx series stacks up. We do know that the performance of the 57xx cards in ultra-mode was described as excellent.

As I said earlier, I'd have a hard time suggesting you buy an Nvidia card.

[...snip...]
Developers targeting these new features are using RadeonHD 57xx and RadeonHD 58xx cards as the reference platforms.
Nice writeup, je_saist. The last couple of times I purchased graphics cards I went with nVidia because of the problematic ATI support in CoH. But given that the devs are developing Ultra Mode on ATI cards and demo'd it on ATI cards, given that the 57xx and 58xx lines support DirectX 11 but nVidia doesn't, and given that in terms of performance per dollar ATI seems to have the edge right now, I'm planning on ATI for my next purchase.


Freedom: Blazing Larb, Fiery Fulcrum, Sardan Reborn, Arctic-Frenzy, Wasabi Sam, Mr Smashtastic.

 

Posted

Quote:
Originally Posted by Back_Blast View Post
So basically, the 275 card is a waste since the PhysX it supports is not the same as CoH PhysX. Good to know. Come to think, I have a friend who has a PhysX card and isn't using it any more that I know of. Perhaps I can get it and add it to the mix...
I do kind of need to halt you here. Nvidia has dropped support for this card. You'll have to use an older driver set in order for the AGEIA add-in card to work properly.

Quote:
Well that pretty much leaves me at the 280 or the 5850. And as I generally understand it, the 5850 would be the better card of the two since it supports the newer DirectX and OpenGL tech as well as having better benchmarks from what I see on the web. Or I could do a pair of 5770's in Crossfire for about the same price and performance. Either of those should do well in UM if I'm reading things right. I know UM is still in flux but I'd expect we'll hear more soon and that the numbers will only get better as they optimize and tweak it up. Or am I just dreaming there? So that sounds like my best options at this point unless someone has a better suggestion?
Xbitlabs looked at these cards in Crossfire: http://www.xbitlabs.com/articles/vid...rossfirex.html

The good news is, AMD has launched a new Crossfire profile system which is fixing some of the software compatibility problems with Crossfire. The bad news is, the developers' last word was that multi-GPU setups weren't giving the expected performance boost in Ultra Mode. Per Posi's edit to this thread, we need to start bugging AMD and Nvidia to make sure multi-GPU works for this game.

***

For now, the safe bet is a single-GPU setup. Really, the 5850 is at least $50 less than the GTX 280... outruns the GTX 280 in everything but "The Way it was Meant to be Sabotaged" games, produces less heat and uses less power... and... it's not even a contest, really.

Unless you are buying brand name for the sake of buying brand name, the GTX line-up from Nvidia is just a bad buy right now.


 

Posted

Quote:
Originally Posted by je_saist View Post
I do kind of need to halt you here. Nvidia has dropped support for this card. You'll have to use an older driver set in order for the AGEIA add-in card to work properly.
Yeah, I had heard that but figured I might still be able to make it work. Whether I'll actually bother to try remains to be seen.

Quote:
Originally Posted by je_saist View Post
Xbitlabs looked at this cards in Crossfire: http://www.xbitlabs.com/articles/vid...rossfirex.html

The good news is, AMD has launched a new Crossfire profile system which is fixing some of the software compatibility problems with Crossfire. The bad news is, the developers' last word was that multi-GPU setups weren't giving the expected performance boost in Ultra Mode. Per Posi's edit to this thread, we need to start bugging AMD and Nvidia to make sure multi-GPU works for this game.

***

For now, the safe bet is a single-GPU setup. Really, the 5850 is at least $50 less than the GTX 280... outruns the GTX 280 in everything but "The Way it was Meant to be Sabotaged" games, produces less heat and uses less power... and... it's not even a contest, really.

Unless you are buying brand name for the sake of buying brand name, the GTX line-up from Nvidia is just a bad buy right now.
I'd say you're likely right. And if I just do a single 5850, there's always the potential to get another later and Crossfire *that* instead of starting out with the paired 5770s. Plus while we can assume they'll get SLI/Crossfire working with UM, it's anyone's guess *when* that will occur. Might be next week, might be Issue 27.


It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.

 

Posted

Quote:
Originally Posted by je_saist View Post
In Posi's edit for this thread, we need to start bugging AMD and Nvidia to make sure multi-gpu works for this game.
I'm glad you mentioned this. How exactly do we make ourselves heard on this issue? Do we just use the generic "contact us" forms from both vendors? Or is there a better way? If Posi had given a specific link for each vendor, he'd likely have gotten thousands of people helping to raise the issue. As it stands, I'm guessing hardly anyone took action.


Freedom: Blazing Larb, Fiery Fulcrum, Sardan Reborn, Arctic-Frenzy, Wasabi Sam, Mr Smashtastic.

 

Posted

Quote:
Originally Posted by Sardan View Post
I'm glad you mentioned this. How exactly do we make ourselves heard on this issue? Do we just use the generic "contact us" forms from both vendors? Or is there a better way? If Posi had given a specific link for each vendor, he'd likely have gotten thousands of people helping to raise the issue. As it stands, I'm guessing hardly anyone took action.
Well, AMD can be contacted in a variety of ways.

You can file bug reports here: http://ati.cchtml.com/

You can file a post on the AMD community forums here: http://forums.amd.com/game/

You can email AMD using this form here: http://emailcustomercare.amd.com/

There's also the Catalyst Crew Feedback form: http://www.amdsurveys.com/se.ashx?s=5A1E27D27E29B0E3

ATi engineers also browse through the forums at places like Phoronix and Rage3D.

***

Now, Nvidia's a tougher nut to crack. The only way that I'm aware of to contact the company about driver support is through here: http://www.nvidia.com/object/driverq...assurance.html

***

Another approach, and one advocated by Terry Makoden and John Bridgman of AMD, is to bug your graphics card vendor. Let them know you bought a card for a particular feature. An OEM like Sapphire, Asus, XFX, Acer, Tul / PowerColor, or the like has a greater amount of pull with driver developers. Michael Larabel (of Phoronix) wrote about this method in the context of getting better Linux drivers back in 2007: http://www.michaellarabel.com/index.php?k=blog&i=121


This is probably the best way to get Nvidia's attention these days as well. I do need to note that some companies don't have contact emails; you'll need to send a snail-mail letter.


 

Posted

I went to my local Fry's after work and picked up a Radeon HD 5770 for a cool $120. They're normally like $189 in the store, or as low as $160 for the 1GB version. I was looking at the 5850, but the price point sold me on this 5770 (marked as a return item, but it was new in box and still had the protective film on the fan cover).


 

Posted

Like many previous posters in this thread, I'm considering an upgrade to my video card, and could use some guidance....

My motherboard supports SLI (and doesn't support Crossfire, right?), with a QX9650 running at 3.0 GHz and 4 GB of memory; I suspect my video card (GeForce 8800 GTS 512) is my current limiter.

Looking at the "Graphics Card Hierarchy" on Tom's Hardware, there are five tiers above my current card. Excluding the $800 HD 5970 and the unavailable GTX 295, I'm considering the GTX 285 (~$375), HD 5870 (~$400), a second 8800 GTS 512 for SLI (~$100), or doing nothing.

These performance comparisons on THW all suggest that both the 285 and the 5870 would be serious upgrades from my current card -- generally 2.5x the FPS on various tests at 1920x1200.

If Ultra Mode ultimately supports SLI, would 2x8800 GTS 512s seem like a good option? On the THW tests, this configuration seems to perform near the level of the GTX 285, and would cost significantly less.

If UM does not support SLI, I'm leaning towards the HD 5870. Any thoughts or suggestions?


Hazel Black - Mind/Psi D
Stephanie Winters - Nightwidow
Jacqui Frost - Cold/Ice D
Jacqui Embers - Fire/Kin C
Simone Templar - Fire/MM B
Mallory Woods - Kin/Rad D
Sanguine Melody - Grav/Sonic C
Fumina Hara - Plant/Storm C
Nutmeg - Warshade
Lauren Wu
- SS/WP B

 

Posted

Quote:
Originally Posted by Mezzosoprano View Post
Like many previous posters in this thread, I'm considering an upgrade to my video card, and could use some guidance....

My motherboard supports SLI (and doesn't support crossfire, right?), with a QX9650 running at 3.0 GHz and 4 Gb of memory; I suspect my video card (GeForce 8800 GTS 512) is my current limiter.
Okay, a Socket 775 chipset. I can be pretty sure in telling you that no, your motherboard probably won't support Crossfire. Nvidia did not open up SLI licensing until Intel launched the i7 and Nvidia decided to get out of the x86 chipset market.

Ergo, you probably have an Nvidia Nforce chipset, and Nvidia doesn't allow Crossfire setups on their chipsets.


Quote:
Looking at the "Graphics Card Hierarchy" on Tom's Hardware, there are five tiers above my current card. Excluding the $800 HD 5970 and the unavailable GTX 295, I'm considering the GTX 285 (~$375), HD 5870 (~$400), a second 8800 GTS 512 for SLI (~$100), or doing nothing.

These performance comparisons on THW all suggest that both the 285 and the 5870 would be serious upgrades from my current card -- generally 2.5x the FPS on various tests at 1920 X 1200.

If Ultra Mode ultimately supports SLI, would 2x8800 GTS 512s seem like a good option? On the THW tests, this configuration seems to perform near the level of the GTX 285, and would cost significantly less.

If UM does not support SLI, I'm leaning towards the HD 5870. Any thoughts or suggestions?
With Ultra Mode launching in April, and the engineers saying that multiple GPU setups aren't delivering the performance gains you would expect, I would suspect that Multi-GPU probably isn't going to help that much on Ultra-Mode's launch.

I expect that with time, SLI and Crossfire setups will offer performance gains, but if you are buying with an eye towards the game...

I'd actually recommend buying with an eye further toward the future. I've gone over, multiple times now in this thread, why I think buying Nvidia right now is a rip-off outside of the ~$100 GTS 250 card.

***

Now, if you can wait a couple more months, to when Ultra Mode actually arrives, Nvidia should be pushing Fermi cards, the GF 100, into retail. What we don't know right now is what Fermi's price point is going to be, nor what its gaming performance will be.

We can infer from Fermi's die size, at 3 billion transistors, that it's about 50% more expensive for TSMC to make than the RadeonHD 5870 GPU. We can also infer from Fermi's die size that if it has the same clock speeds as the 5870, it would run anywhere from 30% to 100% hotter than the 5870 GPU, depending on what those transistors are doing.
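That cost inference is basically geometry: per-chip cost scales roughly with die area, since a bigger die means fewer candidates per wafer. A back-of-the-envelope sketch, using the approximate die sizes reported at the time (~334 mm² for the 5870's Cypress, ~530 mm² for GF 100); both figures are assumptions on my part, not official numbers, and the model ignores defect yield, which in practice penalizes the larger die even more:

```python
import math

WAFER_DIAMETER_MM = 300  # standard 300 mm wafer

def dies_per_wafer(die_area_mm2):
    """Crude candidates-per-wafer estimate: usable wafer area divided
    by die area, minus an edge-loss correction. Ignores defect yield."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    edge_loss = math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

cypress = dies_per_wafer(334)  # RadeonHD 5870 GPU, ~334 mm2 (assumed)
gf100 = dies_per_wafer(530)    # Fermi / GF 100, ~530 mm2 (assumed)

# Per-die cost is roughly inversely proportional to dies per wafer.
print(f"Cypress candidates per wafer: {cypress}")
print(f"GF 100 candidates per wafer:  {gf100}")
print(f"GF 100 cost premium: ~{cypress / gf100 - 1:.0%}")
```

On this crude model the premium lands somewhere north of 50%; pick slightly different die-size assumptions and it moves, but the direction never changes.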

We also have the inference from the recent unveiling of operational GF 100 cards that the clock-speeds aren't actually that fast, and gaming performance isn't that good. Every single system Nvidia had running was running in SLI mode. While this might seem like good PR on the surface, the possibility was raised, and never countered by Nvidia, that the cards had to be running in SLI to manage passable frame-rates.

Barring anything else, we know then that Fermi is going to be more expensive to manufacture than the RadeonHD 5870, which in turn means that GF 100 cards are going to be more expensive than RadeonHD 5870 cards. If Nvidia goes after the same price points, they'll be repeating the problem they have now: the GTX 2xx series is physically more expensive to make than both the existing RadeonHD 4x00 and 5x00 series cards, so Nvidia can't offer competitive price / performance ratios.

With a price point that has to be higher than the RadeonHD 5870, and a good chance of performance lower than a 5870... Fermi might resemble something else that sailed and sunk.

No I don't mean PvP in CoH, I mean the Titanic. (oh come on, who didn't see that line coming?)

****

Anyways, with Fermi pretty much a non-factor at this point, if you've got the money, the RadeonHD 5870 is going to be the best long term solution if you are buying right now.


 

Posted

Quote:
Originally Posted by je_saist View Post
Nvidia did not open up SLI licensing until Intel launched the I7, and Nvidia decided to get out of the x86 chipset market.
They didn't decide to get out of the chipset market; they were pushed by Intel, who didn't want any competition in the chipset market for their latest line of processors (Socket 1156 and Socket 1366). They aren't too happy over nVidia's ION chipset for the Atom either, but that's an FSB license and that horse is out of the barn. That's one of the reasons behind the FTC probe of Intel.

All nVidia could do was to license their secret code to motherboard manufacturers to put into their BIOS so SLi would work. Otherwise their whole "buy multiple video cards" business plan goes POOF!


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Quote:
Originally Posted by Back_Blast View Post
So basically, the 275 card is a waste since the PhysX it supports is not the same as CoH PhysX. Good to know. Come to think, I have a friend who has a PhysX card and isn't using it any more that I know of. Perhaps I can get it and add it to the mix...

Well that pretty much leaves me at the 280 or the 5850. And as I generally understand it, the 5850 would be the better card of the two since it supports the newer DirectX and OpenGL tech as well as having better benchmarks from what I see on the web. Or I could do a pair of 5770's in Crossfire for about the same price and performance. Either of those should do well in UM if I'm reading things right. I know UM is still in flux but I'd expect we'll hear more soon and that the numbers will only get better as they optimize and tweak it up. Or am I just dreaming there? So that sounds like my best options at this point unless someone has a better suggestion?
Last I read, UM wasn't -presently- supporting Crossfire. That may have changed, but if so it's only changed very recently.


 

Posted

Okay, I've read a large number of the posts in here (including all of Posi's) and I have to admit to still being a bit confused. It's one of those times I could use a guide to video cards, or just the game website telling me the minimum and recommended requirements.

I'm sitting on an i5-750 in a Gigabyte GA-P55-UDR3 board with 2 gig of DDR3 RAM, with my video card (nVidia 9600 GT) apparently being my only weak point. Now, in Australian dollars, what's a reasonable price for an American buyer hurts an Aussie a bit. Add to that, I haven't quite understood what the 'reduced quality' represents in the tiers of Ultra Mode. Fewer reflections, less draw distance? It'd be nice to know, as an ordinary PC user with just enough video card knowledge to be dangerous, whether I should really just be holding off on any decision regarding video cards until Ultra Mode hits something close to a final form.

I've always bought nVidia, and until I started reading this thought 'hey, I'll just buy more of these guys' stuff, it works for me'. I think Ultra Mode for Dummies (and this definitely includes me) would be a real help about now.


S.


Part of Sister Flame's Clickey-Clack Posse

 

Posted

Thanks for the info.

From all the discussion here, it certainly sounds like buying a higher-end ATI card would be a better eye-on-the-future purchase, other than the fact that I can't go crossfire further down the road.


Hazel Black - Mind/Psi D
Stephanie Winters - Nightwidow
Jacqui Frost - Cold/Ice D
Jacqui Embers - Fire/Kin C
Simone Templar - Fire/MM B
Mallory Woods - Kin/Rad D
Sanguine Melody - Grav/Sonic C
Fumina Hara - Plant/Storm C
Nutmeg - Warshade
Lauren Wu
- SS/WP B

 

Posted

Quote:
Originally Posted by Father Xmas View Post
They didn't decide to get out of the chipset market, they were pushed by Intel who didn't want any competition in the chipset market for their latest line of processors (Socket 1156 and Socket 1366). They aren't too happy over nVidia's ION chipset for the Atom either but that's a FSB license and that horse is out of the barn. That's one of the reasons behind the FTC probe of Intel.

All nVidia could do was to license their secret code to motherboard manufacturers to put into their BIOS so SLi would work. Otherwise their whole "buy multiple video cards" business plan goes POOF!
I hate to disagree with you, but Nvidia was already abandoning the AMD chipset market when Intel was bringing the i7 architecture to market. Nvidia was looking to get out of the chipset business, and simply used Intel's position on i7 licensing as an excuse to stop producing chipsets. Licensing SLI technology would be more profitable for Nvidia than continuing to make their own chips.

There was just one slight problem.

There's no secret code. According to Intel engineers, they didn't have to make any changes to X58 to support SLI. Nvidia just had to allow the setup in the official drivers. Various users have been using leaked, beta, or hacked drivers since Crossfire motherboards started hitting the market to run two Nvidia cards atop an ATi chipset, or Intel chipsets that support Crossfire.

***

Now, I will admit there is probably merit to the idea that Intel wanted to be the only game in town with chipset support on the I7. Intel never played nice with Via, S3, ULI, or anybody else in the chipset market. However, looking at Nvidia's behavior and their choice to stop developing for AMD's platform, I stand by my statement Nvidia was trying to get out of chipsets.

The writing has actually been on the wall for chipsets since 2003, when AMD started selling Athlon 64s with integrated memory controllers. Prior to the Athlon 64 on AMD, and up to the i7 on Intel, processors had external memory controllers. Several years ago Nvidia's memory controllers were the best on the market, and helped Athlon XPs trounce the Pentium 4 processor lineup.

With memory controllers moving onto the processor, chipsets were largely relegated to IO support functions. This meant that the cost of the physical chip went down, as well as the profit on the chip. From appearances, Nvidia tried to keep prices up on their chipsets with SLI certification and licensing... which was really nothing more than allowing a particular chip to be used at the software driver level.

With Intel finally going to an integrated memory controller, the profit margins for chipsets were headed for the toilet.

Then there was the other problem: Fusion and Project Larrabee. Both Intel and AMD are launching processors, this year, with integrated GPUs. These hybrid CPU / GPU units will pretty much take over the low-end market. OEM motherboard prices will go down as engineers no longer have to plan for graphics support routed from a northbridge, through the processor, to a memory controller, and back.

This means that the current crop of low-end computers with integrated chipsets will change. An OEM like Dell, HP, or Gateway isn't going to make a computer with an extra motherboard GPU with the associated circuitry when they can save $5 or more per system on the hybrid CPU/GPU setups.

Since Nvidia isn't in the x86 CPU market, their cash cow of Nforce is pretty much gutted.

Ergo: from my perspective, Nvidia saw that their chipset market was going to evaporate, regardless of whether or not they had licenses to the system bus components.


 

Posted

Quote:
Originally Posted by Back_Blast View Post
So basically, the 275 card is a waste since the PhysX it supports is not the same as CoH PhysX. Good to know. Come to think, I have a friend who has a PhysX card and isn't using it any more that I know of. Perhaps I can get it and add it to the mix...
I strongly recommend against it. There are two reasons AGEIA stopped making and selling their PhysX-100 card. First, because CPUs were getting so fast that the card wasn't making much of a difference. Second, because the card would make a lot of machines unstable, with an increased incidence of crashes and lock-ups. I should know; I had one. It only gave me about 5-8% more FPS in the PhysX demos versus the CPU alone, and I can't tell you how many times I locked up and crashed right in the middle of missions in CoH.

Not to mention there's not a whole lot of physics content in CoH anyway! Blowing up cars? Rare. Falling leaves / branches / ATM money / letters / garbage? The CPU does just as good a job, with no lock-ups or problems. I eventually got rid of the PhysX-100 card and never missed it once.

-- Vivian


 

Posted

The 275 card is NOT a waste.


 

Posted

No P_P, the GTX 275 card with the built-in GTS 250 for PhysX was a waste. It costs as much as the GTX 285, and nVidia GPU-based PhysX doesn't work with this game.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

It's better than a 9500 any day of the week.


 

Posted

Quote:
Originally Posted by je_saist View Post
I hate to disagree with you, but Nvidia was already abandoning the AMD chipset market when Intel was bringing the I7 architecture to market. Nvidia was looking to get out of the chipset market, and simply used Intel's positioning on I7 licensing as an excuse to stop producing chipsets. Licensing SLI technology would be more profitable to Nvidia than to continue to make their own chips.

There was just one slight problem.

There's no secret code. According to Intel engineers, they didn't have to make any changes to X58 to support SLI; Nvidia just had to allow the setup in the official drivers. Ever since Crossfire motherboards started hitting the market, various users have been using leaked, beta, or hacked drivers to run two Nvidia cards atop an ATi chipset, or on Intel chipsets that support Crossfire.
When I said secret code, I meant that literally. Well, not a code per se, but a decryption key. The part of the nVidia driver that handles SLi is encrypted. At boot time the key is fetched through a function call, and if the call succeeds, the SLi portion of the driver is decrypted. In the chipset era, the key was found in the chipset itself.

It ticks me off that this lockdown of nVidia's SLi technology is equivalent to an inkjet printer's ink cartridge being chipped so no third-party cartridge will work. Especially since the impression they gave when SLi first came out was that they were doing something clever with the PCIe controller in their northbridge.

But nVidia is still waving Intel's refusal to license the QPI interface on Socket 1366 and the DMI interface on Socket 1156 in front of the FTC as examples of Intel competing unfairly.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Quote:
Originally Posted by SuperOz View Post
Okay, I've read a large number of the posts in here (including all of Posi's) and I have to admit to still being a bit confused. It's one of those times I could use a guide to video cards, or just the game website telling me the minimum and recommended requirements.

I'm sitting on an i5-750 CPU in a Gigabyte GA-P55-UDR3 board with 2 gigs of DDR3 RAM onboard, with my video card (an nVidia 9600 GT) apparently being my only weak point. Now, in Australian dollars, what's a reasonable price for an American buyer hurts an Aussie a bit. Add to that, I haven't quite understood what the 'reduced quality' represents in the tiers of Ultra Mode. Fewer reflections, less draw distance? It'd be nice to know, as an ordinary PC user with just enough video card knowledge to be dangerous, whether I should really just be holding off on any decision regarding video cards until Ultra Mode hits something close to a final form.

I've always bought nVidia, and until I started reading this thought 'hey, I'll just buy more of these guys' stuff, it works for me'. I think Ultra Mode for Dummies (and this definitely includes me) would be a real help about now.


S.
Well, the quick answer is we don't know exactly what the quality levels represent. But one could conjecture that they would likely affect the distance at which you can see reflections and such, and possibly the quality and clarity of those reflections. So if you want maxed UM eye candy, you'll most likely need a max or near-max card. On Nvidia vs. ATI I take no side, other than to say that, right now, ATI has the edge. So that's what I went with when I did my ordering. I've a 5850 headed my way, and I expect it will do quite nicely in UM.

Quote:
Originally Posted by Doctor Vivian View Post
I strongly recommend against it. There were two reasons AGEIA stopped making and selling their PhysX-100 card. First, CPUs were getting fast enough that the card wasn't making much of a difference. Second, the card made a lot of machines unstable, with an increased incidence of crashes and lock-ups. I should know; I had one. It only gave me about 5-8% more FPS in the PhysX demos versus the CPU alone, and I can't tell you how many times I locked up and crashed right in the middle of missions in CoH.

Not to mention there's not a whole lot of physics content in CoH anyway! Blowing up cars? Rare. Falling leaves / branches / ATM money / letters / garbage? The CPU does just as good a job, with no lock-ups or problems. I eventually got rid of the PhysX-100 card and never missed it once.

-- Vivian
It was more a stray idea than serious intent. I'd have to do some hacking and fiddling to make it work anyway, since I'll be running 64-bit Win7 when it's all said and done. Probably not worth it regardless.


It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.

 

Posted

Quote:
Originally Posted by Mezzosoprano View Post
Thanks for the info.

From all the discussion here, it certainly sounds like buying a higher-end ATI card would be a better eye-on-the-future purchase, other than the fact that I can't go crossfire further down the road.
Well, if you get an upper-end ATI card, then by the time you're seriously wishing you could Crossfire, it may well be time for a new system anyway. So I wouldn't sweat it too much.


It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.