Ultra-mode video card shopping guide


5th_Player

 

Posted

Quote:
Originally Posted by Oxmyx View Post
When it comes to technical computer stuff I know just enough to be dangerous. I recently purchased a new computer, a quad-core Asus 5270 and had installed in it a BFG Nvidia GT220 graphics card. I know it's after the fact but is that a decent card? I got the whole thing at Best Buy on sale for $90.
For $90 that's a steal!


COH has just been murdered by NCSoft. http://www.change.org/petitions/ncso...city-of-heroes

 

Posted

Quote:
Originally Posted by Oxmyx View Post
When it comes to technical computer stuff I know just enough to be dangerous. I recently purchased a new computer, a quad-core Asus 5270 and had installed in it a BFG Nvidia GT220 graphics card. I know it's after the fact but is that a decent card? I got the whole thing at Best Buy on sale for $90.
Um... not... really.

Xbit Labs reviewed the GT 220 and came away with a not very favorable conclusion here: http://www.xbitlabs.com/articles/vid...210-gt220.html

I know somebody who got a GT 220. I got roped into installing it after the fact. I mentioned it... here: http://boards.cityofheroes.com/showp...87&postcount=8

Quote:
I wouldn't... pick a GT 220 for a couple of reasons... An associate of mine picked up a GT 220 to replace a Geforce 7300. The 7300 was running on a Dell 300 watt power supply. The GT 220 wouldn't. Same card on the same motherboard ran fine with a 400watt power supply. While I'm not saying Nvidia's, you know, lying about the GT 220 running on 300 watt power supplies, there are other factors to keep in mind, such as the processor, northbridge / memory controller, hard-drive, system fans, and so on. Also, in fairness, the associate had picked up the Low Profile ECS GT 220 rather than an Asus, so there might be a brand quality factor involved as well.

The second reason I wouldn't pick up a GT 220 is that it was hot. Really. Really hot. The associate was putting it in one of Dell's... smaller cases... and... well. Okay, I define obnoxious as a GeforceFX 5800. The ECS GT 220 wasn't exactly obnoxious... but as soon as the chassis cover went on you could tell the GPU fan had kicked into high gear. Again, this might be another brand quality issue, but as far as I'm aware from pictures, the Asus and ECS cards use the same heatsink designs.
From a raw performance standpoint, the GT 220 has a theoretical fill rate of 5,000 MegaPixels/s and 10,000 MegaTexels/s. This puts it around the same real-world performance as the older Geforce 9500 and 8600 GTS chips, of which the GT 220 is basically yet another respin.

From a price standpoint, the GT 220 sells on Newegg for around $60~$70.

Unfortunately, you can also get Radeon HD 4670s with 1 GB of VRAM in this market segment... and with more pixel performance (6,000 MegaPixels/s) and more than double the texel performance (24,000 MegaTexels/s), the HD 4670 had no problems trashing the GT 220 in Xbit's tests.
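
For anyone wondering where those fill-rate figures come from, they're just core clock multiplied by ROP and TMU counts. Here's a quick Python sketch of that arithmetic; the clocks and unit counts are the commonly published reference specs for these chips, so treat them as assumptions rather than measured numbers.

Code:
def fill_rates(core_clock_mhz, rops, tmus):
    # Theoretical fill rates in MegaPixels/s and MegaTexels/s
    return core_clock_mhz * rops, core_clock_mhz * tmus

cards = {
    "GeForce GT 220": (625, 8, 16),   # ~625 MHz core, 8 ROPs, 16 TMUs (assumed reference specs)
    "Radeon HD 4670": (750, 8, 32),   # ~750 MHz core, 8 ROPs, 32 TMUs (assumed reference specs)
}

for name, specs in cards.items():
    mpix, mtex = fill_rates(*specs)
    print(f"{name}: {mpix:,} MPixels/s, {mtex:,} MTexels/s")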

Basically, if you paid $90 for it, Best Buy ripped you off.


 

Posted

Quote:
Originally Posted by Oxmyx View Post
When it comes to technical computer stuff I know just enough to be dangerous. I recently purchased a new computer, a quad-core Asus 5270 and had installed in it a BFG Nvidia GT220 graphics card. I know it's after the fact but is that a decent card? I got the whole thing at Best Buy on sale for $90.
I'm not sure here if you meant the entire quad-core computer plus new graphics card was $90 or just the graphics card. However, if it was $90 for the card are you sure you didn't mean a GT 240? That's the Nvidia offering at that price point currently.


 

Posted

Quote:
Originally Posted by Human_Being View Post
I'm not sure here if you meant the entire quad-core computer plus new graphics card was $90 or just the graphics card. However, if it was $90 for the card are you sure you didn't mean a GT 240? That's the Nvidia offering at that price point currently.
Probably not. $90 is around what Best Buy is selling the GT 220 at.

Their cheapest GT 220 is $80... which is still overpriced.

They do list a GT 240 at $100.


 

Posted

In the news today is this little gem from Thermaltake: http://www.thermaltake.com/news_deta...pid=P_00000145

They've designed a case specifically for Fermi-based graphics cards.

I just have to ask. How hot are Nvidia's test samples running in order for Nvidia to feel the need to work up a special chassis that can handle the heat output? Who here is going to spend $170+ (the current price of the Element V) on a new chassis that is certified to work with Fermi?

Okay, I really shouldn't mock or make too much fun of this. The last time I saw companies making accessories for a grandly hyped product that did eventually launch, the Bugatti Veyron was involved... and hey, that goes pretty quickly in a straight line... surprisingly not that good in corners, though.

I am left wondering how many more of these accessory-style launches we'll see in the next 3 months. Will the licensing fees for using Nvidia's name be enough to offset an architecture that is not actually the most advanced GPU computing architecture, is launching late, and leaves Nvidia with 3 months of nothing to compete with what a competitor is offering?


 

Posted

Quote:
Originally Posted by je_saist View Post
In the news today is this little gem from Thermaltake: http://www.thermaltake.com/news_deta...pid=P_00000145

They've designed a case specifically for Fermi-based graphics cards.

I just have to ask. How hot are Nvidia's test samples running in order for Nvidia to feel the need to work up a special chassis that can handle the heat output? Who here is going to spend $170+ (the current price of the Element V) on a new chassis that is certified to work with Fermi?
I call FUD.

If you actually read the article and the attached image, it states that the case is optimized for systems sporting Triple and Quad SLI setups (which are already notably warm-running). This means they're ridiculously overkill for single-card solutions.

And the use of 200mm and 230mm fans in cases isn't exactly new technology here. And please note the second link especially. It shows that the case you linked to is little more than a branded rework of a pre-existing chassis.

With this in mind, the way you've portrayed it is inappropriate.




 

Posted

Quote:
Originally Posted by je_saist View Post
Unfortunately, you can also get Radeon HD 4670s with 1 GB of VRAM in this market segment... and with more pixel performance (6,000 MegaPixels/s) and more than double the texel performance (24,000 MegaTexels/s), the HD 4670 had no problems trashing the GT 220 in Xbit's tests.

Basically, if you paid $90 for it, Best Buy ripped you off.
Just to add on to this: I'd like to say, as a Radeon 4670 owner, I'm pretty darn happy with it. I bought it because it didn't require an additional power plug (the slot offers enough juice, so I didn't need to upgrade my power supply) and it runs pretty cool-ish (important considering I had my original GeForce die of heatstroke here in Hawai'i, as the house has no air conditioning!).
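
The "slot offers enough juice" point comes down to a PCI Express x16 slot being specified to deliver up to 75 W on its own. A rough sketch of that check is below; the card TDPs are approximate published figures, so treat them as assumptions rather than measurements.

Code:
# A card can skip the auxiliary power plug only if its board power fits under
# the ~75 W a PCIe x16 slot is specified to supply on its own.
PCIE_SLOT_LIMIT_W = 75

card_tdp_w = {
    "Radeon HD 4670": 59,    # approximate published TDP
    "GeForce GTS 250": 150,  # approximate published TDP
}

for name, tdp in card_tdp_w.items():
    verdict = "slot power alone is enough" if tdp <= PCIE_SLOT_LIMIT_W else "needs an auxiliary power connector"
    print(f"{name}: ~{tdp} W -> {verdict}")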


 

Posted

Quote:
Originally Posted by Hyperstrike View Post
I call FUD.

If you actually read the article and the attached image, it states that the case is optimized for systems sporting Triple and Quad SLI setups (which are already notably warm-running). This means they're ridiculously overkill for single-card solutions.

And the use of 200mm and 230mm fans in cases isn't exactly new technology here. And please note the second link especially. It shows that the case you linked to is little more than a branded rework of a pre-existing chassis.

With this in mind, the way you've portrayed it is inappropriate.
Actually, word on the grapevine is that Fermi does run overly hot; to the point where Nvidia has had to reduce the clock speed from what they intended, reducing performance, to allow it to run at all. Furthermore, there have been and are now tri and quad graphics card setups, including with the new ATI 5xxxs, and they don't require a specialized case. Finally, the Element V cases are Thermaltake's second-from-the-top cooling solution, with only the $850 Level 10 above it. And while the 230 mm and 200 mm fans in the Element V may not be "new technology", requiring them over regular 120 mms is "rather unusual". If you read the bottom of that flier, they state that the case is still being certified for this application...meaning all of that plus their "new graphics card ducting" just for Fermi might not be enough.

Fermi is increasingly looking like a boondoggle, but we should know more next weekend during CES.


 

Posted

Quote:
Originally Posted by Human_Being View Post
And while the 230 mm and 200 mm fans in the Element V may not be "new technology", requiring them over regular 120 mms is "rather unusual".
I fail to see where the word "required" was used.

Quote:
If you read the bottom of that flier, they state that the case is still being certified for this application...meaning all of that plus their "new graphics card ducting" just for Fermi might not be enough.
No. Meaning it hasn't been certified yet. That is all. It could be for any number of reasons, yours is just one of them.




 

Posted

Quote:
Originally Posted by Hyperstrike View Post
I call FUD.

If you actually read the article and the attached image, it states that the case is optimized for systems sporting Triple and Quad SLI setups (which are already notably warm-running). This means they're ridiculously overkill for single-card solutions.

And the use of 200mm and 230mm fans in cases isn't exactly new technology here. And please note the second link especially. It shows that the case you linked to is little more than a branded rework of a pre-existing chassis.

With this in mind, the way you've portrayed it is inappropriate.
Which means this is no different than - say - Coolermaster 690 nVidia edition. Or the Silverstone TJ10B nVidia edition. Which was "Pay $30 more for green coloring and the nVidia name." (I think there was also a Cosmos 1000 like that.)

What does "Certified" really mean? It could mean nothing more than "Yes, the cards will mechanically fit, and we'll put our name on it and split the profits." In fact, I'd put money on it, if I were inclined to pay more money for the exact same thing.

Yes, the one je_saist linked mentioned Fermi. Why? Because it's the newest, upcoming card from nVidia. (Eventually.) And it's an nVidia branded... er, "certified" case. Read and compare the standard Element V's setup - it's the exact same, including the 200 and 230mm fans. Here, direct quote from the article:
Quote:
In addition to the oversized cooling fans, one 230mm Colorshift side intake fan, one 200mm Colorshift top exhaust fan, two 120mm front intake fans and one 120mm rear exhaust fan,
(which mentions a "duct" after that - given the case already has a vent and spots for two 50mm vga cooling fans, I'd say "They made sure the plastic fits,")

And the Newegg listing:
Quote:

Cooling System:
80mm Fans - None
120mm Fans:
1 x 120mm Colorshift front fan (intake)
1 x 120mm Turbo front fan (intake)
1 x 120mm Turbo rear fan (exhaust)
200mm Fans:
1 x 200mm silent Colorshift top fan (exhaust)
230mm Fans (Plug & Play):
1 x 230mm Colorshift side fan (intake)
There's just nothing actually screaming "This is made for Fermi's outrageous cooling needs" as opposed to "Branded to make more money."


 

Posted

Quote:
Originally Posted by Hyperstrike View Post
I call FUD.

If you actually read the article and the attached image, it states that the case is optimized for systems sporting Triple and Quad SLI setups (which are already notably warm-running). This means they're ridiculously overkill for single-card solutions.

And the use of 200mm and 230mm fans in cases isn't exactly new technology here. And please note the second link especially. It shows that the case you linked to is little more than a branded rework of a pre-existing chassis.

With this in mind, the way you've portrayed it is inappropriate.

*coughs*

Quote:
Okay, I really shouldn't mock or make too much fun of this.
... Apparently I wasn't clear enough that the initial commentary on the case design was supposed to be a mocking joke.


 

Posted

Quote:
Originally Posted by Memphis_Bill View Post
Which means this is no different than - say - Coolermaster 690 nVidia edition. Or the Silverstone TJ10B nVidia edition. Which was "Pay $30 more for green coloring and the nVidia name." (I think there was also a Cosmos 1000 like that.)
Pretty much. It's a "fanboy tax" thing.

Quote:
What does "Certified" really mean? It could mean nothing more than "Yes, the cards will mechanically fit, and we'll put our name on it and split the profits." In fact, I'd put money on it, if I were inclined to pay more money for the exact same thing.
As I know the people responsible for the certification, I can tell you.

Essentially it means the cards will fit, mechanically, in the case and the fit isn't so tight that the cards cook themselves due to bad case convection.

It does NOT mean that it's the only case that'll do so. Merely that someone paid nVidia money to make sure theirs would so they could slap a badge on it.

Quote:
Yes, the one je_saist linked mentioned Fermi. Why? Because it's the newest, upcoming card from nVidia. (Eventually.) And it's an nVidia branded... er, "certified" case. Read and compare the standard Element V's setup - it's the exact same, including the 200 and 230mm fans. Here, direct quote from the article:
(which mentions a "duct" after that - given the case already has a vent and spots for two 50mm vga cooling fans, I'd say "They made sure the plastic fits,")
Which is why I linked the way I did. The tone of the original message I responded to was somewhat...skewed. Now maybe I was imagining it, but that sort of brand bashing (for a product that isn't even available yet) has always been like nails on a chalkboard to me, and I'm too stupid to just let it pass by unanswered.




 

Posted

Quote:
Which is why I linked the way I did. The tone of the original message I responded to was somewhat...skewed. Now maybe I was imagining it, but that sort of brand bashing (for a product that isn't even available yet) has always been like nails on a chalkboard to me, and I'm too stupid to just let it pass by unanswered.
Oh, it was skewed on purpose, but again, apparently I wasn't clear enough on what I was skewing, or why.

I have a distaste for vendors launching accessories for a product that's not even on the market with the expectation of using that product as a springboard for sales, or in order to further the hype surrounding that product. I also have a distaste for corporations that deliberately mis-spend customers' or clients' money.

I used the example of the Bugatti Veyron because it's an example of a product that turned out to be very good. It is the fastest road car in a straight line. However, it was preceded, and accompanied, by a rash of products that simply latched onto the Veyron name, such as an aftershave and a custom watch... accessory products that were junk.

Now, in the specific case of Thermaltake, Nvidia, and Fermi, there are still several questions to be asked. Nobody knows yet whether or not Fermi is actually going to be any good for gaming. We can infer from Nvidia's reluctance to talk about clock-speeds, and the continual pushing back of the launch date, that Fermi isn't exactly all it's cracked up to be. We also now know that everything at Nvidia's late 2009 conference was faked in regards to Fermi, and that hurts Nvidia's credibility by a large amount.

What we don't know is whether Thermaltake paid Nvidia to license the right to slap a Fermi-Certified sticker on one of their cases, jack the price up, and make that money back on users buying a case because it's certified for Fermi.

What we do know is that Thermaltake probably hasn't actually had any time with Fermi silicon, so we can be pretty sure that the thermal requirements the case is designed to meet are based off of the thermal requirements quoted by Nvidia. This poses an interesting scenario. What if Nvidia is paying Thermaltake to put the Fermi name on an existing product with a few mods, hoping that the finalized Fermi product meets the design limitations, and that Nvidia can make money back on royalties from products sold bearing the Fermi-Certified moniker?

In this scenario we have Nvidia spending money they actually do have, but in a way they should not. Nvidia's already in enough trouble with the likes of Asus, Acer, Aopen, FoxConn, and Clevo for having outright lied about the thermal properties of the G7x, G8x, G9x, and, by initial reports, the GT2x series of chips in both desktop and mobile formats. Nvidia's lack of trustworthiness in detailing the aspects of their chips is commonly referred to as "Bumpgate."

Now, in all fairness, given the thermal properties of the recent high-end cards from both ATi and Nvidia, nobody in their right mind is going to try to shove a high-end Fermi card into a chassis from Dell, Hewlett-Packard, or Acer's Gateway and eMachines divisions. Most gamers interested in Fermi are probably going to have properly designed cases with proper airflow. Not really a big deal.

However, the situation with Thermaltake does raise some other questions, such as the one I brought up in the first post on this particular subject. Does Nvidia seriously intend to make money off of either licensing Fermi-Ready Certifications to vendors, or by receiving royalties back on sold products bearing the Fermi-Ready moniker? How many more products are we going to see bearing the Fermi-Ready or Fermi-certified badges over the next 3 months as Nvidia and TSMC presumably ramp up production of Fermi based cards?

There's also another huge problem facing Nvidia right now. Fermi is a megachip, with some 3 billion-odd transistors. As far as we know, Nvidia hasn't actually designed a low-end or mid-range product off of the Fermi architecture. As is, Nvidia has only "now" released low-end and mid-range parts based on its GT2x series of chips... which is really just a re-implemented version of the architecture used in the G8x chips.

Now, this might not mean much to the common user until I give the names of two graphics card parts from the past.

GeforceFX
Radeon x800

Nvidia had originally planned to launch the GeforceFX series of cards against ATi's Radeon 9500-9700 range of graphics cards. However, ATi, with the help of ArtX, had blown Nvidia out of the water. The Radeon 9500 and 9700 graphics cards were perfectly capable of accelerating DirectX 9 games at 30fps, even at the dizzying resolutions of 1280*1024 and 1440*900. Nvidia basically had to take GeforceFX back into the lab and add in DirectX 9 extensions. Unfortunately, when Nvidia actually got GeforceFX out the door in the second quarter of 2003, they weren't dealing with the original Radeon 9x00 line-up... they had to deal with the Radeon 9600 and 9800 series, which were clock-bumped and more efficient. The result was a disaster for Nvidia. Their only success of the GeforceFX line-up was the GeforceFX 5200, which was a popular OEM card.

Things changed on the next line-up of cards, though. Nvidia launched the Geforce 6x00 series of cards in the second quarter of 2004. These cards featured DirectX 9.0c / OpenGL 2.0 support. ATi, on the other hand, was fielding the x800 and x850 line-ups... which were DirectX 9.0b.

Okay, in all fairness, most games never actually used the additional features in either the 9.0b or 9.0c versions of DirectX... and OpenGL 2.0 was pretty much only matching base DirectX 9. It wouldn't match / pass DX9.0c until the 2.1 revision, which came much later.

Still, the marketing damage was done. Nvidia was able to market the 6x00 range's full DX9.0c compatibility and win back most, if not all, of the market share they had lost in the previous round.

Now, we're coming up to a similar situation. AMD has an entire line-up of cards coming out from the low-end OEM market to the high end gaming market that are all DirectX 11 / OpenGL 3.2 compatible.

Nvidia has a DX11 / OpenGL 3.2 compatible card for the high-end market... but not for the low-end or mid-range markets. Rather, from what we know from unofficial information, Nvidia intends to keep using the G9x architecture and its GT2x respins in the low-end market. These aren't DX11 / OpenGL 3.2 parts...

Okay, in all fairness, we, as gamers, know that there's not actually that much visual difference between a game that's coded in DX9 / DX10 / DX11 / OpenGL 2.x / OpenGL 3.x. Keeping in mind that the Playstation 3 and Xbox 360 have roughly OpenGL 2.0-class GPUs, which is roughly equivalent to DX9, we know that all of the APIs can produce some amazing visuals. The question really becomes which can produce the best visuals with the best frame rate. The GTX line-up of cards, for the foreseeable future, is going to be able to deliver excellent frame rates with excellent image quality. Most game developers are not going to be leveraging the very latest graphics API exclusively. So far most engines, such as CryEngine, Unreal Engine, id Tech 4, id Tech 5, Source, Torque Shader Engine, and Unigine, have offered different fallback levels of rendering.
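
To illustrate what a "fallback level of rendering" means in practice, here's a minimal Python sketch. The path names and version thresholds are purely illustrative; they aren't any particular engine's actual API.

Code:
# Pick the best rendering path the reported OpenGL version can drive,
# falling back to progressively older paths on older hardware.
RENDER_PATHS = [
    ("dx11_class_gl32", (3, 2)),   # hypothetical path names for illustration
    ("dx10_class_gl30", (3, 0)),
    ("dx9_class_gl20",  (2, 0)),
    ("fixed_function",  (1, 1)),
]

def pick_render_path(gl_version):
    for name, minimum in RENDER_PATHS:
        if gl_version >= minimum:
            return name
    return "unsupported"

print(pick_render_path((3, 2)))  # a DX11-class card gets the top path
print(pick_render_path((2, 1)))  # an older DX9-class card quietly falls back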

Case in point being the user on the forums with the workstation versions of the 7900 GT cards. Yes, they run most of today's hit games very fast... but they do so because they aren't running the hit games' best image-quality rendering path. The user pretty much never noticed.

***

So, why is this such a big deal if it doesn't really matter what product you buy?

The simple answer is: it is a big deal to people who don't understand what's going on in the graphics card, or what goes on in a game engine.

As both Nvidia and ATi found out the hard way, offering a product in the low end that doesn't have the same feature set as the product in the high end can be very painful in a financial sense. It doesn't matter if the Radeon HD 3200 can't actually run Crysis at 1024*768 using the DirectX 10 rendering path. As long as ATi can put on the sales box that the Radeon HD 3200 can accelerate DX10 graphics, that's all users see.

That's all the mass market cares about.

That's also where all the money is. Nvidia and ATi make the majority of their money off of low-end parts sold under contract to OEMs and ODMs.

***

Which is why the Thermaltake event irks me so much.

If I were an executive within Nvidia, I wouldn't be spending a dime on any promotions, on any marketing or rhetoric, until I had working silicon in my hand that I could take to Kyle Bennett over in Texas and allow him to benchmark, or send to the people who run sites like Phoronix or Mepisguides to benchmark and look at. I'd be throwing money left and right to get mid-range and low-end parts of the new architecture ready for a day-zero launch alongside the high-end part.

I, however, don't work for Nvidia, so I can only speak on what I see from outside the company. What I see is Nvidia wasting money on potential marketing stunts, rather than tending to their core business.

And that's what just torques me off.


 

Posted

Quick question: my monitor will be an HP Pavilion 2159m LCD wide-screen monitor with a recommended resolution (H x V) of: 1920 x 1080 @ 60Hz

So, in Tom's Hardware's list, it has a GeForce GTS 250 512MB listed as giving good 1920x1200 performance in most games.

So, given my screen size, would that be sufficient?


Arc: 378122 "Tales of the Terran Space Marines -The Apocalypse Initiative" 5stars!
http://www.youtube.com/watch?v=6Rgl4...687B0FC89F142C
Arc: 481545 "Twilight of the Gods - The Praetorian conflict"8000+ hits!
http://www.youtube.com/watch?v=XxJ6S...848B21E2350DCC

 

Posted

Quote:
Originally Posted by je_saist View Post
Oh, it was skewed on purpose, but again, apparently I wasn't clear enough on what I was skewing, or why.

I have a distaste for vendors launching accessories for a product that's not even on the market with the expectation of using that product as a springboard for sales, or in order to further the hype surrounding that product. I also have a distaste for corporations that deliberately mis-spend customers' or clients' money.

I used the example of the Bugatti Veyron because it's an example of a product that turned out to be very good. It is the fastest road car in a straight line. However, it was preceded, and accompanied, by a rash of products that simply latched onto the Veyron name, such as an aftershave and a custom watch... accessory products that were junk.

Now, in the specific case of Thermaltake, Nvidia, and Fermi, there are still several questions to be asked. Nobody knows yet whether or not Fermi is actually going to be any good for gaming. We can infer from Nvidia's reluctance to talk about clock-speeds, and the continual pushing back of the launch date, that Fermi isn't exactly all it's cracked up to be. We also now know that everything at Nvidia's late 2009 conference was faked in regards to Fermi, and that hurts Nvidia's credibility by a large amount.

What we don't know is whether Thermaltake paid Nvidia to license the right to slap a Fermi-Certified sticker on one of their cases, jack the price up, and make that money back on users buying a case because it's certified for Fermi.

What we do know is that Thermaltake probably hasn't actually had any time with Fermi silicon, so we can be pretty sure that the thermal requirements the case is designed to meet are based off of the thermal requirements quoted by Nvidia. This poses an interesting scenario. What if Nvidia is paying Thermaltake to put the Fermi name on an existing product with a few mods, hoping that the finalized Fermi product meets the design limitations, and that Nvidia can make money back on royalties from products sold bearing the Fermi-Certified moniker?

In this scenario we have Nvidia spending money they actually do have, but in a way they should not. Nvidia's already in enough trouble with the likes of Asus, Acer, Aopen, FoxConn, and Clevo for having outright lied about the thermal properties of the G7x, G8x, G9x, and, by initial reports, the GT2x series of chips in both desktop and mobile formats. Nvidia's lack of trustworthiness in detailing the aspects of their chips is commonly referred to as "Bumpgate."

Now, in all fairness, given the thermal properties of the recent high-end cards from both ATi and Nvidia, nobody in their right mind is going to try to shove a high-end Fermi card into a chassis from Dell, Hewlett-Packard, or Acer's Gateway and eMachines divisions. Most gamers interested in Fermi are probably going to have properly designed cases with proper airflow. Not really a big deal.

However, the situation with Thermaltake does raise some other questions, such as the one I brought up in the first post on this particular subject. Does Nvidia seriously intend to make money off of either licensing Fermi-Ready Certifications to vendors, or by receiving royalties back on sold products bearing the Fermi-Ready moniker? How many more products are we going to see bearing the Fermi-Ready or Fermi-certified badges over the next 3 months as Nvidia and TSMC presumably ramp up production of Fermi based cards?

There's also another huge problem facing Nvidia right now. Fermi is a megachip, with some 3 billion-odd transistors. As far as we know, Nvidia hasn't actually designed a low-end or mid-range product off of the Fermi architecture. As is, Nvidia has only "now" released low-end and mid-range parts based on its GT2x series of chips... which is really just a re-implemented version of the architecture used in the G8x chips.

Now, this might not mean much to the common user until I give the names of two graphics card parts from the past.

GeforceFX
Radeon x800

Nvidia had originally planned to launch the GeforceFX series of cards against ATi's Radeon 9500-9700 range of graphics cards. However, ATi, with the help of ArtX, had blown Nvidia out of the water. The Radeon 9500 and 9700 graphics cards were perfectly capable of accelerating DirectX 9 games at 30fps, even at the dizzying resolutions of 1280*1024 and 1440*900. Nvidia basically had to take GeforceFX back into the lab and add in DirectX 9 extensions. Unfortunately, when Nvidia actually got GeforceFX out the door in the second quarter of 2003, they weren't dealing with the original Radeon 9x00 line-up... they had to deal with the Radeon 9600 and 9800 series, which were clock-bumped and more efficient. The result was a disaster for Nvidia. Their only success of the GeforceFX line-up was the GeforceFX 5200, which was a popular OEM card.

Things changed on the next line-up of cards, though. Nvidia launched the Geforce 6x00 series of cards in the second quarter of 2004. These cards featured DirectX 9.0c / OpenGL 2.0 support. ATi, on the other hand, was fielding the x800 and x850 line-ups... which were DirectX 9.0b.

Okay, in all fairness, most games never actually used the additional features in either the 9.0b or 9.0c versions of DirectX... and OpenGL 2.0 was pretty much only matching base DirectX 9. It wouldn't match / pass DX9.0c until the 2.1 revision, which came much later.

Still, the marketing damage was done. Nvidia was able to market the 6x00 range's full DX9.0c compatibility and win back most, if not all, of the market share they had lost in the previous round.

Now, we're coming up to a similar situation. AMD has an entire line-up of cards coming out from the low-end OEM market to the high end gaming market that are all DirectX 11 / OpenGL 3.2 compatible.

Nvidia has a DX11 / OpenGL 3.2 compatible card for the high-end market... but not for the low-end or mid-range markets. Rather, from what we know from unofficial information, Nvidia intends to keep using the G9x architecture and its GT2x respins in the low-end market. These aren't DX11 / OpenGL 3.2 parts...

Okay, in all fairness, we, as gamers, know that there's not actually that much visual difference between a game that's coded in DX9 / DX10 / DX11 / OpenGL 2.x / OpenGL 3.x. Keeping in mind that the Playstation 3 and Xbox 360 have roughly OpenGL 2.0-class GPUs, which is roughly equivalent to DX9, we know that all of the APIs can produce some amazing visuals. The question really becomes which can produce the best visuals with the best frame rate. The GTX line-up of cards, for the foreseeable future, is going to be able to deliver excellent frame rates with excellent image quality. Most game developers are not going to be leveraging the very latest graphics API exclusively. So far most engines, such as CryEngine, Unreal Engine, id Tech 4, id Tech 5, Source, Torque Shader Engine, and Unigine, have offered different fallback levels of rendering.

Case in point being the user on the forums with the workstation versions of the 7900 GT cards. Yes, they run most of today's hit games very fast... but they do so because they aren't running the hit games' best image-quality rendering path. The user pretty much never noticed.

***

So, why is this such a big deal if it doesn't really matter what product you buy?

The simple answer is: it is a big deal to people who don't understand what's going on in the graphics card, or what goes on in a game engine.

As both Nvidia and ATi found out the hard way, offering a product in the low end that doesn't have the same feature set as the product in the high end can be very painful in a financial sense. It doesn't matter if the Radeon HD 3200 can't actually run Crysis at 1024*768 using the DirectX 10 rendering path. As long as ATi can put on the sales box that the Radeon HD 3200 can accelerate DX10 graphics, that's all users see.

That's all the mass market cares about.

That's also where all the money is. Nvidia and ATi make the majority of their money off of low-end parts sold under contract to OEMs and ODMs.

***

Which is why the Thermaltake event irks me so much.

If I were an executive within Nvidia, I wouldn't be spending a dime on any promotions, on any marketing or rhetoric, until I had working silicon in my hand that I could take to Kyle Bennett over in Texas and allow him to benchmark, or send to the people who run sites like Phoronix or Mepisguides to benchmark and look at. I'd be throwing money left and right to get mid-range and low-end parts of the new architecture ready for a day-zero launch alongside the high-end part.

I, however, don't work for Nvidia, so I can only speak on what I see from outside the company. What I see is Nvidia wasting money on potential marketing stunts, rather than tending to their core business.

And that's what just torques me off.

Nvidia does NOT have DirectX 11 yet. They will not until the 300 series hits the shelves. ATI is currently the only video card maker that does, starting with the 5xxx series.


 

Posted

Having waded through the umpteen pages here, I'm more confused than ever.

Here's my situation:

My old computer:
Windows XP SP3
AMD 64 Processor 3500+ running at 2.21GHz (Single Core)
2 GB Ram
nVidia GeForce 8500 GT

However, the power supply in the computer is in the process of dying.

Hence I recently got the opportunity to get a Dell Optiplex 360 while over in the USA and bring it back to Australia. These sell for about $1300 over here, and I got it much cheaper.

It was going to be my gaming computer, but came in a sealed box and I couldn't check the contents. When I got home, I found it contained:
Win 7 Professional
Intel Core 2 Duo E7500 with VT (2.93GHz Dual Core)
2 GB Ram
Integrated Video Intel GMA3100
255W Power Supply

So my question is this: in order to even play Going Rogue or CoH/V, do I sell the Optiplex and try to save for a better computer, or do I upgrade it? If I upgrade it, what gets priority? I know the power supply will need attention if I get a card.



"Just as I knew all of life's answers they changed all the questions!" - Unknown (seen on a poster)
Sig characters appear in the Château Rouge thread starting from post #100
I Support Nerd Flirting! - Story/Discussion/Sequel

 

Posted

Can you put in another video card, by chance? If so (and I think you can), then I'd look at the power supply. Can you upgrade that? I think you look pretty good otherwise. I play COH with a much lesser CPU and a budget video card (Radeon 4670) - only place I've got you beat is RAM, and that's it.


 

Posted

Quote:
Originally Posted by Dark Shade View Post
Having waded through the umpteen pages here, I'm more confused than ever.

Here's my situation:

My old computer:
Windows XP SP3
AMD 64 Processor 3500+ running at 2.21GHz (Single Core)
2 GB Ram
nVidia GeForce 8500 GT

However, the power supply in the computer is in the process of dying.

Hence I recently got the opportunity to get a Dell Optiplex 360 while over in the USA and bring it back to Australia. These sell for about $1300 over here, and I got it much cheaper.

It was going to be my gaming computer, but came in a sealed box and I couldn't check the contents. When I got home, I found it contained:
Win 7 Professional
Intel Core 2 Duo E7500 with VT (2.93GHz Dual Core)
2 GB Ram
Integrated Video Intel GMA3100
255W Power Supply

So my question is this: in order to even play Going Rogue or CoH/V, do I sell the Optiplex and try to save for a better computer, or do I upgrade it? If I upgrade it, what gets priority? I know the power supply will need attention if I get a card.
The processor isn't spectacular, but it should suffice. Given that and assuming there is a PCIe x16 slot on the board (I would be very surprised if there's not) you would need:

A new power supply at US $80-$180 (depending on how fancy you want to get). Also a graphics card with price depending on your budget, but likely at least US $100.

Another 2 Gig of RAM (US $50) wouldn't hurt, but isn't strictly essential.

You need to figure out what graphics card you want/can-afford before you can know how much power supply you require.
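
If it helps, here's the kind of back-of-the-envelope arithmetic behind that "figure out the card first" advice. All the wattages below are rough placeholder estimates on my part; check the card maker's recommended PSU rating before buying anything.

Code:
# Rough power budget for sizing a PSU around a card upgrade.
LOADS_WATTS = {
    "Core 2 Duo E7500": 65,        # CPU TDP
    "motherboard + RAM": 40,       # rough estimate
    "hard drive + optical": 25,    # rough estimate
    "fans / USB / misc": 20,       # rough estimate
    "graphics card": 110,          # placeholder for a mid-range card
}

def recommended_psu(loads, headroom=0.4):
    # Add headroom so the PSU isn't running near its limit under load.
    total = sum(loads.values())
    return total, total * (1 + headroom)

total, suggested = recommended_psu(LOADS_WATTS)
print(f"Estimated draw: ~{total} W; suggested PSU: ~{suggested:.0f} W or more")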


 

Posted

Quote:
Originally Posted by Scyntech View Post
Nvidia does NOT have DirectX 11 yet. They will not until the 300 series hits the shelves. ATI is currently the only video card maker that does, starting with the 5xxx series.
... ... you do... realize... that you are like... preaching to the choir here.

I guess from this response that you just caught this little snippet of the post:

Quote:
Now, we're coming up to a similar situation. AMD has an entire line-up of cards coming out from the low-end OEM market to the high end gaming market that are all DirectX 11 / OpenGL 3.2 compatible.

Nvidia has a DX11 / OpenGL 3.2 compatible card for the high-end market... but not for the low-end or mid-range markets. Rather, from what we know from unofficial information, Nvidia intends to keep using the G9x architecture and its GT2x respins in the low-end market. These aren't DX11 / OpenGL 3.2 parts...

Okay, in all fairness, we, as gamers, know that there's not actually that much visual difference between a game that's coded in DX9 / DX10 / DX11 / OpenGL 2.x / OpenGL 3.x. Keeping in mind that the Playstation 3 and Xbox 360 have roughly OpenGL 2.0-class GPUs, which is roughly equivalent to DX9, we know that all of the APIs can produce some amazing visuals. The question really becomes which can produce the best visuals with the best frame rate. The GTX line-up of cards, for the foreseeable future, is going to be able to deliver excellent frame rates with excellent image quality. Most game developers are not going to be leveraging the very latest graphics API exclusively. So far most engines, such as CryEngine, Unreal Engine, id Tech 4, id Tech 5, Source, Torque Shader Engine, and Unigine, have offered different fallback levels of rendering.
Again, maybe I wasn't clear enough, but I thought it was pretty obvious that I was talking about Nvidia's FUTURE LINEUP OF FERMI GRAPHICS CARDS and how that line-up is only for the high-end market, not the mass market low-end and medium-range where the companies actually make money.

****

Quote:
So my question is this: in order to even play Going Rogue or CoH/V, do I sell the Optiplex and try to save for a better computer, or do I upgrade it? If I upgrade it, what gets priority? I know the power supply will need attention if I get a card.
Human Being pretty much already answered this. If you have a PCIe x16 slot, you should be able to add in a graphics card.


 

Posted

Quote:
Originally Posted by Human_Being View Post
The processor isn't spectacular, but it should suffice. Given that and assuming there is a PCIe x16 slot on the board (I would be very surprised if there's not) you would need:

A new power supply at US $80-$180 (depending on how fancy you want to get). Also a graphics card with price depending on your budget, but likely at least US $100.

Another 2 Gig of RAM (US $50) wouldn't hurt, but isn't strictly essential.

You need to figure out what graphics card you want/can-afford before you can know how much power supply you require.
One thing that constantly bugs me is that Newegg doesn't deliver internationally, and that I was just in the USA and could have picked all this stuff up for much less than it costs over here and included it in my luggage somewhere.

The motherboard has one PCIe/16 slot and 2 PCI slots.

2 Gig Ram - From Dell... that'll be A$100 plus delivery.
Video card: ATI HD5750 PCI-E 2.0 1GB from one different supplier - A$200
and then the power supply... 650W power supply ... A$180 plus delivery

... so I've got some saving to do and some bargain hunting to do.

(650W is to allow for the burner I'll eventually add and better regulation of regional power)



"Just as I knew all of life's answers they changed all the questions!" - Unknown (seen on a poster)
Sig characters appear in the Château Rouge thread starting from post #100
I Support Nerd Flirting! - Story/Discussion/Sequel

 

Posted

Quote:
Originally Posted by Dark Shade View Post
One thing that constantly bugs me is that Newegg doesn't deliver internationally, and that I was just in the USA and could have picked all this stuff up for much less than it costs over here and included it in my luggage somewhere.

The motherboard has one PCIe/16 slot and 2 PCI slots.

2 Gig Ram - From Dell... that'll be A$100 plus delivery.
Video card: ATI HD5750 PCI-E 2.0 1GB from one different supplier - A$200
and then the power supply... 650W power supply ... A$180 plus delivery

... so I've got some saving to do and some bargain hunting to do.

(650W is to allow for the burner I'll eventually add and better regulation of regional power)
Must be one heck of a burner if it takes an extra 200 W o.0, but that does give you some room to grow. I can't speak to anything but 120 V AC regulation - I usually don't pay attention to the other graph traces in reviews. Thermaltake is a decent brand in any case. If you want to do some comparison shopping, I'd recommend looking into an Enermax Modu82+ of some wattage. I actually *do* know that it has good 240 V regulation and excellent build quality.

Again, if it comes down to tight money, the extra RAM would be nice but isn't essential. You might consider whether that money would be better put towards another step up on the graphics card ladder. (An ATI 5770 would fit with a minimum 500 W power supply.)

EDIT: If you do get the RAM, I wouldn't buy it from Dell; you'll get over-priced and under-quality-ed. What you would be looking for is a 2 x 1 GB kit of 240 pin DDR2-800 or DDR2-1066. I checked Crucial's site just now and they have both of those certified-compatible for an Optiplex 360. You wouldn't need any fancy RAM with a heat-spreader either.


 

Posted

Quote:
Originally Posted by DeathSentry View Post
Quick question: my monitor will be an HP Pavilion 2159m LCD wide-screen monitor with a recommended resolution (H x V) of: 1920 x 1080 @ 60Hz

So, in Tom's Hardware's list, it has a GeForce GTS 250 512MB listed as giving good 1920x1200 performance in most games.

So, given my screen size, would that be sufficient?

Help? Anyone?


Arc: 378122 "Tales of the Terran Space Marines -The Apocalypse Initiative" 5stars!
http://www.youtube.com/watch?v=6Rgl4...687B0FC89F142C
Arc: 481545 "Twilight of the Gods - The Praetorian conflict"8000+ hits!
http://www.youtube.com/watch?v=XxJ6S...848B21E2350DCC

 

Posted

Quote:
Originally Posted by DeathSentry View Post
Help? Anyone?
Difficult to say. Based on my understanding, the 250 should run Ultra Mode in some fashion, but my guess is that it'll be somewhere between low-end Ultra Mode and medium-range Ultra Mode, depending on whether Ultra Mode ends up more texture bound or pixel fill bound.

Positron says the GTX 260 is probably a "middle of the road" performer in Ultra Mode. The 250 seems to have comparable texture performance, but a much lower pixel fill rate. That's forming the basis for my guestimate that the performance will lie somewhere between the low end of Ultra and the middle of Ultra. It should at least run Ultra at some level because it appears to have support for the latest OpenGL rev (version 3.2) and Ultra Mode is probably targeted a bit lower than that (somewhere between 2 and 3 would be most people's guess I think).
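
To put rough numbers behind the pixel-fill comparison: using the commonly published reference clocks and ROP counts (assumptions on my part, not measurements), you can estimate how many times per frame each card could theoretically repaint a 1920x1080 screen at 60 fps.

Code:
SCREEN_PIXELS = 1920 * 1080
FPS = 60

cards = {
    "GeForce GTS 250": {"clock_mhz": 738, "rops": 16},  # assumed reference specs
    "GeForce GTX 260": {"clock_mhz": 576, "rops": 28},  # assumed reference specs
}

for name, c in cards.items():
    pixel_fill = c["clock_mhz"] * 1e6 * c["rops"]    # theoretical pixels per second
    overdraw = pixel_fill / (SCREEN_PIXELS * FPS)    # screenfuls per frame at 1080p/60
    print(f"{name}: ~{pixel_fill / 1e9:.1f} GPixels/s, ~{overdraw:.0f}x screen coverage per frame")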


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)