Ultra-mode video card shopping guide


5th_Player

 

Posted

Quote:
Originally Posted by Human_Being View Post
Wait, I just reread that... Okay, I expect you meant tax refunds, not that you were able to write off a gaming PC as a tax deduction
I live in the UK, and am employed as a computer science lecturer and self-employed as a web developer, so it looks like I actually can write off a certain % of the price of a computer as a work expense (still looking into the exact details of how much). I could also try buying it through the university I work for, but although they supposedly get a good discount, you have to buy from their approved suppliers (who generally overcharge for not very good components).

But at the minute I'm still looking around to find the best deal and the best way to buy things (pre-built vs. ordering separate components).


Always remember, we were Heroes.

 

Posted

Oh yeah, something else while my mind is on that direction.

Human Being brought up the 300 watt limit that the GTX 295 tries to stay under by using two GTX 275 chips. AMD's 5970 is a similar two-chip / single card monster, and is also limited to 300 watts. However, AMD is reportedly cherry-picking the HD 5870 processors that go into the 5970, picking the ones best suited to overclocking. The stock heatsink is also designed to accommodate 400 watts of heat output. If you don't mind smashing through the 300 watt limit, the 5970 will run at stock 5870 speeds for each chip, and can generally reach even higher clocks.
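
For anyone wondering where that 300 watt figure comes from, it's just the PCIe power budget added up. Here's a back-of-envelope sketch using the standard spec allotments, and assuming the 5970's reference layout of one 6-pin plus one 8-pin plug:

Code:
# Rough PCIe power-budget arithmetic (standard spec allotments)
SLOT_W = 75        # a PCIe x16 slot can feed the card up to 75 W
SIX_PIN_W = 75     # each 6-pin auxiliary plug adds up to 75 W
EIGHT_PIN_W = 150  # each 8-pin auxiliary plug adds up to 150 W

# HD 5970 reference board: slot + one 6-pin + one 8-pin
print(SLOT_W + SIX_PIN_W + EIGHT_PIN_W)  # 300 W -- right at the spec ceiling

Smashing through the limit just means drawing more through those same plugs than the spec promises, which is presumably why AMD built the cooler for 400 watts.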

One concern about AMD cherry-picking processors for the 5970 is that retail 5870 cards could end up with less clock-speed headroom available. If you aren't in the market to overclock... and seriously... why would you need to overclock a 5870 to begin with... this really isn't a big deal.


 

Posted

Quote:
Originally Posted by je_saist View Post
I'm sort of tossed on XFX. My first couple of cards with them were the AGP Geforce 6600 GT's I had... which had bloody awful heatsink designs. However, it seems that Nvidia's 6600 AGP design was something vendors couldn't deviate from, so I really couldn't blame XFX for the cards.
Oh, those things were wretched. >_< It wasn't just XFX. I had a PCIe 1.0 6600 and it resulted in my very very first-est after-market component modification when I bought a Zalman VF700 cooler to make the hurting stop!

I've never trusted those little single-height fan-sink designs on any other card since then. Case In Point.


 

Posted

Quote:
Originally Posted by je_saist View Post
I've been running CoH under Xp / Vista with Crossfired RadeonHD 3870's and 4850's, and under Vista on a GTS 250 Triple SLI setup. Honestly, at the resolutions I run (1920*1200 on the Radeons, 1680*1050 on the Nvidia), there's not really a heap of difference in speed between the cards in single mode and multi-GPU mode.

This will likely change as of Going Rogue as the new engine supposedly will take advantage of multi-gpu rendering.
Can you tell us if there is *any* difference between single mode and multi-GPU mode right now? Everyone has always stated that CoX will not take advantage of multi-GPU setups, but I haven't seen any real benchmarking on whether there is any effect on CoX (+ or -).

Quote:
Originally Posted by je_saist View Post
If you aren't in the market to overclock... and seriously... why would you need to overclock a 5870 to begin with... this really isn't a big deal.
For most people a 5870 is overkill, but there are still situations where a 5870 or 5970 isn't enough performance. Once Ultra debuts I'm planning on getting a 5970, or whatever is the performance king, and seeing how it will perform under Eyefinity at 2560x1600 x3 or 1600x2560 x3.

Currently for CoX, the GTX 285 is the fastest compatible GPU, since there isn't any benefit to going SLI. The 5870 is likely faster but can't take advantage of all the eye candy until Ultra debuts.


 

Posted

Quote:
Can you tell us if there is *any* difference between single mode and multi-GPU mode right now? Everyone has always stated that CoX will not take advantage of multi-GPU setups, but I haven't seen any real benchmarking on whether there is any effect on CoX (+ or -).
Can do. It'll probably take me a bit to get something set up that's... benchmarkable... across all 3 systems. I've gotta work LaserTron support for Adventure-Crossing tomorrow, so I'll likely not have something till Sunday or Monday.


 

Posted

I know a few people are probably wanting to ask this, but I'm itching to ask now.

I'm useless at graphics cards and what their equivalents are, so how does anyone here think a 2GB NVIDIA GeForce 9500 GT will handle this, based on Posi's recommendations? I'll be looking to upgrade if I need to, but am so damned curious as is.


 

Posted

Quote:
Originally Posted by StormSurvivor View Post
I know a few people are probably wanting to ask this, but I'm itching to ask now.

I'm useless at graphics cards and what their equivalents are, so how does anyone here think a 2GB NVIDIA GeForce 9500 GT will handle this, based on Posi's recommendations? I'll be looking to upgrade if I need to, but am so damned curious as is.
Yes and No.

Yes, you'll be able to run all of the Going Rogue features. There's nothing in the feature sets of the cards listed by Mr. Miller that is not found in their lower counterparts.

However, the question is performance. You may have to accept a low resolution, like 1024*768, in order to obtain acceptable frames-per-second performance.
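
A little pixel arithmetic shows why that helps; the card has to shade every pixel of every frame (a rough sketch, since frame rate rarely scales perfectly with pixel count):

Code:
# Pixels shaded per frame at two common resolutions
print(1680 * 1050)                   # 1,764,000 pixels
print(1024 * 768)                    #   786,432 pixels
print((1680 * 1050) / (1024 * 768))  # ~2.24x less per-frame work at 1024*768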


 

Posted

Quote:
Originally Posted by je_saist View Post
Yes and No.

Yes, you'll be able to run all of the Going Rogue features. There's nothing in the feature sets of the cards listed by Mr. Miller that is not found in their lower counterparts.

However, the question is performance. You may have to accept a low resolution, like 1024*768, in order to obtain acceptable frames-per-second performance.
Ahh, wicked. Was all I wanted to know. Will prolly upgrade once I find out first-hand. Play this game enough for the upgrade to be worth it. Thanks, Saist. :]


 

Posted

Ok, here's my question. I have a 9800 GTX right now on a quad core CPU (2.8GHz) with 8 gigs of RAM.
Would just getting a 2nd 9800 GTX for SLI be a better choice than trying to spend a lot on a 275?

After reading Tom's Hardware's guide on where video cards stand, I'm reluctant to purchase any of the 2xx series at this time.
I'm also not interested in switching to an ATI setup, even if they are currently faster/better.


 

Posted

Quote:
Originally Posted by Hypatia View Post
Can somebody speak to what this means for Mac users?
The simple answer is not YET.

More specifically:

MacOS handles graphics processing differently from Windows. It will try its best to support the highest available OpenGL implementation, passing off whatever features the video card supports directly to the card, and trying to make up the difference using the CPU. This means Ultra will probably be *technically* supported on all Mac systems except the ones with integrated Intel video (those are specifically limited at the driver level).

But so is FSAA. That particular feature was disabled in Cider for performance reasons, and there's always a chance Ultra will perform so badly in Cider that it too will be artificially disabled. I'm hoping it's not, because I'd like to see the CoH Mac client be the best-looking one of any Windows-based game. We won't know for sure until after initial testing begins.

Keep your eyes open and your fingers crossed.


Manga @ Triumph
"Meanwhile In The Halls Of Titan"...Titan Network Working To Save City Of Heroes
Save Paragon City! Efforts Coordination

 

Posted

Quote:
Originally Posted by Geobaldi View Post
Ok, here's my question. I have a 9800 GTX right now on a quad core CPU (2.8GHz) with 8 gigs of RAM.
Would just getting a 2nd 9800 GTX for SLI be a better choice than trying to spend a lot on a 275?
Yes / no / not really

Yes, two 9800 GTX's in SLI are pretty powerful, but you are dependent on software support for multi-GPU rendering. It'd be a cheaper way to gain more performance while you wait for Nvidia to go bankrupt or get bought out. (And no, that's actually not a joke; I think Human Being addressed the financial problem Nvidia has trying to move GTX chips right now... or somebody did in the thread.)

No, I'm pretty sure that a single GTX 275 would outrun two 9800's. Granted, I can't actually test this. I don't have a GTX 275 on hand, although I do have GTS 250's... which are pretty much just die-shrunk rebadged 9800's.

I'll get to the Not Really at the end.

Quote:
After reading Tom's Hardware's guide on where video cards stand, I'm reluctant to purchase any of the 2xx series at this time.
I mentioned this earlier in the thread, but I'd keep a barrel of salt on hand if you're reading Tom's. The site has been busted multiple times for accepting money to slant reviews one way or another. Now, while I can come up with dozens of reasons of my own not to buy a GTX 2xx card from Nvidia, most of my concerns simply won't matter to the average gamer.

********************************************

Okay, here's the Not Really part for those not interested in a soft analysis.

Not Really: it's a toss-up. If you're an Nvidia fan, well... to broach the subject again, it's really questionable whether or not Nvidia is actually going to matter in a few months. Fermi's going to be launching against Larrabee and shortly before the product refreshes on the RadeonHD 5x00 series. Intel and AMD are also going to be pushing CPUs with integrated GPUs in 6 to 8 months... and that's going to once again create a three-horse race in the low to mid-range graphics processor segment. I'm not entirely sure Nvidia can survive as a vendor of gaming graphics cards if its traditional bread-and-butter market, the low-end market where the likes of the Geforce2 MX, Geforce4 MX, Geforce FX 5200, and Geforce 6600 dominated, goes away.

Nvidia's going to have to launch Fermi with parts at all segments, from the low end to the high end, and deliver large quantities. Since they are dependent on TSMC, that's... not really something I'm sure Nvidia can pull off. AMD is having enough problems getting its own completed 40nm parts out of TSMC.

In order for Nvidia to survive, they are going to need a megabucks deal. One of the hot rumors going around now is that Nvidia has won the contract for the next Nintendo handheld. Personally, I doubt it, given Nintendo's decades-long relationship with ATi / AMD. Remember, the N64 was created with help from SGI, and that SGI team left to become ArtX. ArtX did the design for the Gamecube's LSI, but was bought up by ATi in 2000. ArtX turned ATi around, helping to launch the Radeon 9700 series of cards and revamp ATi's... broken... driver program. ATi then did the graphics design for the Wii, something AMD was rather grateful for after the merger, as Nintendo's profits helped shore up crashing processor revenue streams. At one time ATi had a contract to do the GPU for the so-called GameBoy 2, a console that was halted, then dropped, after the DS went from side-toy to main-show.

With this kind of history, Nvidia would have to offer an astoundingly good deal to beat what has been a financially successful string of consoles for Nintendo. So I don't think this kind of guaranteed-income deal is in store for Nvidia.

We also know that Nvidia's president has very kind words for Apple, and has been developing mobile platforms targeted towards the markets serviced by PowerVR / Imagination... a company that Apple has invested in. Some have suggested Nvidia is trying to maneuver themselves into a position to be bought out by Apple.

Now, with this sort of background, and multiple questions surrounding what's going to happen to / with Nvidia, the sensible thing to do is wait it out. Wait for Fermi parts to be delivered.

TL;DR version: Honestly, I'd save your money for next year rather than rushing to upgrade now.


 

Posted

I was actually about to pawn a few things to get ready for this, so thanks for the heads up. I think I'll get a few things upgraded in between, though!

-C.A.


 

Posted

Quote:
Originally Posted by je_saist View Post
Just commenting on this, but I wouldn't really cite Tom's site as a reference on how processors perform. THG has been busted multiple times in the past accepting vendor money to slant reviews.

Really? Do you have anything linking to this? I ask because I use the website a lot and have always found them to be accurate. Also, I couldn't find anything on the web about this, but that may just be my search abilities.


 

Posted

Quote:
Originally Posted by Another_Fan View Post
Really? Do you have anything linking to this? I ask because I use the website a lot and have always found them to be accurate. Also, I couldn't find anything on the web about this, but that may just be my search abilities.
Yes and no.

Yes, there is evidence still on the web, like this piece by Charlie from The Inquirer back in 2006: http://www.theinquirer.net/inquirer/...-hardware-rant

Other evidence can be found on some blogs, like this 2007 post: http://scientiasblog.blogspot.com/20...-its-soul.html

And the Inquirer brought it up again in 2007 as well: http://www.theinquirer.net/inquirer/...-hardware-sold

The allegations first came around back in 2001, and the first proof reached the web, and I think Slashdot, in 2002. The topic was discussed again when THG was up for sale, since one of the hopes was that THG would regain its credibility.

***

And no. The old Slashdot discussions and forum postings I had saved now go to defunct links, such as ThatForum, StarEmu, and the old Gamenikki.com forums. Theoretically the content may still be somewhere in http://www.archive.org/index.php ... which, as far as I know, search engines such as Google, Yahoo, and Bing / LiveSearch don't index.

***

edit: Yes, I know, citing TheInquirer is a bit like citing a politician. You just know it's going to come back and bite you on the rump. TheInq, and Charlie in particular, also have reputations for stretching the truth. Not as bad as TheReg's Andrew Orlowski, whose posts could often be used as a foundation for a drinking game. The problems with THG's credibility were actually one of the few cases where Charlie, a couple of his hardware reviewers at the time, and I agreed on something. I'm a bit sad I didn't save the information as it happened... but I don't think it really occurred to anybody back in 2001 or 2002 that the information put on the web would one day be unavailable or dead-linked.


 

Posted

Quote:
Originally Posted by Red_Gren View Post
I noticed a few people mentioning that they're not tech savvy & want to know how their card will do/compare. So... I thought I'd mention that every so often a site called Tom's Hardware publishes a guide on graphics cards, be it best for the money or best for gaming or whatever. Near the end of the articles, they publish a Graphics Card Hierarchy Chart which allows people to see more or less how their cards compare to the cards mentioned in the article. I figure access to such a chart might be helpful, so... Here's a link to the chart in the latest article I could find.

Hope this helps both here and when it comes time to make upgrade decisions.
Thanks for that link, Red. I just had to buy a new desktop to replace my laptop (I'll never buy another HP laptop again), and it looks like the Radeon HD 4350 that came with it is right in the middle of that list. I'll likely have to upgrade in the future, but it does the job for now.

Oh, and I found another website that provides some pretty extensive hardware reviews at Motherboards.org


 

Posted

I am running on an NVIDIA GeForce GTS 240; how well will it perform with the upcoming 'Ultra Mode' expansion?


 

Posted

Quote:
Originally Posted by Geobaldi View Post
Ok, here's my question. I have a 9800 GTX right now on a quad core CPU (2.8GHz) with 8 gigs of RAM.
Would just getting a 2nd 9800 GTX for SLI be a better choice than trying to spend a lot on a 275?
"Maybe." It depends on a lot of things we don't have information on. Under ideal circumstances, an SLI-ed pair of 9800 GTXs should actually outstrip a single GTX 275. The operative phrase there is "ideal circumstances". Not every game supports multi-GPU configurations (CoX currently doesn't). Those that do support multiple GPUs don't necessarily scale 1:1; you might get 1.3x the performance of a single card rather than 2x. Those that do support multiple GPUs also don't necessarily react to Nvidia SLI and ATI Crossfire format the same way.

Assuming Ultra Mode responds to SLI with 1:1 scaling, there are other factors to consider. I presume that you have an open PCIe 2.0 x16 slot on your motherboard to put the second card in, but does the board actually support SLI? The last generation of motherboards tended to license either Nvidia SLI or ATI Crossfire, but not both. Also, do you have the requisite additional 6-pin power connectors on your power supply to hook it up? If you've got the connectors, is your power supply rated high enough to support both cards, your quad core, and whatever else is in your system?

If the answer to all of that is "yes", then there still remains the problem of getting a "matching" card. I'm not sure how precisely you are naming the board, but you can't find "9800 GTX"s anymore. You can find 9800 GTX+s and GTS 250s. The 9800 GTX, 9800 GTX+, and GTS 250 are all using the same G92 chip, which originally debuted in late-model 8800s, with a progressively more impressive sounding label. However, each one is clocked a >little< bit higher than its predecessor. In order to match the new card to the older one, you might have to underclock it a bit (which you can do with EVGA's Precision program).

Potentially, this could be a cheap(er) upgrade. It's not necessarily trivial though.


 

Posted

Quote:
Originally Posted by Tesla_Nova View Post
I am running on an NVIDIA GeForce GTS 240; how well will it perform with the upcoming 'Ultra Mode' expansion?
A GTS 240, depending on manufacturer's modifications, might work around the 9800 GT "entry" mark. However, the GTS 240 (and 220 and 210/310) does not support SLI. Unlike the 9x00s, there is no way to combine two of them for potentially greater performance.


 

Posted

My video card melted a month ago and I replaced it with a GTX360 (for Aion and HD TV), but I am glad it will be able to support Going Rogue.

Of course, they undersized the power supply, so I had to replace it today with a 700 W supply.

Crystal Saint
70 Months (including beta) and still going strong.


 

Posted

After scouring the internet for days, I was finally able to find a store with a Sapphire ATI Radeon 5870 in stock (well, at their distribution hub, but it's still stock)! So that's ordered, and being shipped to me next-day once they get it from the hub. Mmmmm ... DirectX 11 goodness!


Positron: "There are no bugs [in City of Heroes], just varying degrees of features."

 

Posted

Quote:
Originally Posted by je_saist View Post
Yes, two 9800 GTX's in SLI are pretty powerful, but you are dependent on software support for multi-GPU rendering. It'd be a cheaper way to gain more performance while you wait for Nvidia to go bankrupt or get bought out. (And no, that's actually not a joke; I think Human Being addressed the financial problem Nvidia has trying to move GTX chips right now... or somebody did in the thread.)
Yes, that was me. I'm not quite ready to order a funerary floral arrangement for Nvidia, but they are in a very nasty position.


Financially, Nvidia actually isn't in too bad a shape. According to the most recent numbers I saw, the company had a few hundred million in debt and a billion or two cash-on-hand. So Nvidia isn't in danger of immediate monetary collapse. Its problem is that the outlook for any new sources of revenue is increasingly bleak.

Ironically (and I love ironies) ATI and Nvidia essentially have mirror-image predicaments. ATI is carrying about $5 billion (with a "B") of debt. They are very deeply in the hole. However, they have a winning product on the graphics card market right now, and AMD recently settled all their lawsuits with Intel for a $1.25 billion payout (and a promise to "play nice" from now on). The same agreement also allows AMD to sell its 30% stake in the Global Foundries microchip manufacturing business. Liabilities from Global Foundries caused AMD to take a loss rather than turning a profit last year. So they should lose the most draining portion of their business while gaining several billion dollars in cash. That cash on hand is important not for getting rid of their debt, but because they will have their own funding (without searching for more credit) to research and develop new microchips. The R&D + manufacturing workup for a new chip can easily cost several billion dollars.

And that multi-billion dollar price tag is what may ruin Nvidia.

As I said earlier, Nvidia has basically had two core products over the last few years: the G92 and the G200. Their other recent chips have derived from the architectures and research of these two.

The G92 was quite successful when it debuted, and Nvidia decided to elaborate the design into something larger during development of G200. The thinking was that ATI would fill its "traditional role" and produce something that was decent-and-low-priced-but-never-high-performance. If Nvidia built something impressively large, they would certainly dominate the high-end market.


Unfortunately, ATI surprised and outflanked them badly with their 4xxx series of cards late last year. To begin with, the 4xxxs were much more powerful than Nvidia had expected. Compounding that strength was the fact that ATI had jumped to a 55 nm manufacturing process for the chips, rather than Nvidia's 65 nm process; the chips were smaller and cheaper per batch of silicon. Furthermore, ATI opted for "more expensive" and twice-as-fast GDDR5 memory instead of Nvidia's GDDR3... but with half the memory controllers Nvidia was using. The price of memory controllers turned out to be the greater factor, and that made ATI's boards cheaper to construct. With half the memory components to power and a smaller chip, the boards needed fewer power-regulation elements as well. The final result was products that could equal or outpace the equivalent Nvidia board at a 20-50% lower price.
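
If you want the memory math, here's a rough sketch using approximate reference-board figures (illustrative numbers, not exact specs):

Code:
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# Clocks below are approximate reference-board values, for illustration only.
def bandwidth_gbs(bus_width_bits, gigatransfers_per_sec):
    return (bus_width_bits / 8) * gigatransfers_per_sec

print(bandwidth_gbs(256, 3.6))  # ~115 GB/s: 4870-style narrow bus + fast GDDR5
print(bandwidth_gbs(512, 2.2))  # ~141 GB/s: GTX 280-style wide bus + slow GDDR3

Roughly a wash on bandwidth, but the 512-bit bus needs twice the memory controllers and board traces to get there, and that's where the cost gap came from.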

Nvidia scrambled to transfer to a 55 nm process (GTX 280 -> GTX 285, etc.), but they were still stuck with a larger chip than ATI was working with. Physical production costs per unit were simply higher for Nvidia, so their potential profit per board at each performance price-point was lower. ATI took full advantage of this and dragged prices down to the point where Nvidia essentially had to "pay people" to take their cards, selling them at a loss.

Buying market share that way is okay for a while, so long as you can transfer to a new product. That new product was supposed to be the G300 chip (aka "Fermi"). But Fermi was supposed to arrive this fall. It is now delayed until at least late spring / early summer. A year and a half of selling your products for less than they cost to make is *not* okay. So rather than continue to throw good money after bad with the commercially failed G200 line, they are shutting it down and leaving the high-end graphics market to ATI for now.


So what else does Nvidia have to survive on? je_saist already went over the problems with Nvidia's future as an integrated/mobile graphics chip provider. I'd add that laptop manufacturers may actually be *looking* for an opportunity to wash their hands of Nvidia due to a series of faulty chips Nvidia had been selling them up to August of last year.

The "new" thing that Nvidia is trying to do is make money from the academic "Compute" market. If you have some sort of massive computational task that requires a supercomputer, it's more efficient to have a number of small, specialized processors rather than a few generalized ones. Well, that's exactly what a modern GPU is. Following the logic of this, Nvidia recently started shopping their graphics cards around to universities and research institutes. Nvidia sees this as a huge and untapped market of applications. However, the idea is just beginning to move around and these are not the kinds of customers who make snap buying decisions or decide to be "early-adopters". Compute application sales were a tiny fraction of Nvidia's revenues last year.

Nvidia does make the graphics core of the Sony PS3, but ATI turns out to supply the graphics in the Xbox 360.

Nvidia basically has no product that it can rely on for a solid revenue stream in the coming year. The G92 and its derivatives have been a tremendous workhorse, but they're really showing their age now. The G200 essentially resulted in Nvidia flushing their billion-dollar research costs down the toilet; it's a dead end. Nvidia needs a new flagship product to maintain their business. And that brings us back to the G300/Fermi.


In response to the shockingly effective ATI 4xxxs, Nvidia decided to bring as much firepower as they could to bear on the next generational competition with their rival. Furthermore, they couldn't get caught with an older manufacturing process than ATI either. They also decided that they needed to incorporate robust double-precision calculation and memory error correction to accommodate academic Compute customers.

The G200 was massive. The G300 is monstrous. Fermi consists of 3 billion (yes, with a "B") transistors in a die that is larger than 500 mm^2. The ATI "Cypress" chip in the 5xxxs consists of only 2 billion transistors in a 334 mm^2 die. Remember what I said about the ATI 4xxxs costing less to produce than the G200s? Nvidia is once again in the exact... same... situation. Nvidia needs to unveil a G300 chip that turns out to eat Global Warming and poop unicorns. The only way Fermi will be able to compete with Cypress and recoup its development costs is if it completely blows the ATI chip out of the water in performance.


And that's where Taiwan Semiconductor Manufacturing Company's bungling becomes important... The new 40 nm manufacturing process that both Nvidia and ATI are using from TSMC has turned out to be more problematic than advertised. ATI was getting very low yields of functional chips per batch of silicon. Fermi, being half again as complex as Cypress, is even more vulnerable to random fatal defects. Reports were that the original test production runs resulted in 1.7% functional chips. Those that functioned turned out to do so at much lower clock rates than expected. If you have a chip with massive capabilities that works at a relatively low frequency, it may not outperform a less elaborate chip that can run faster...
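
To see why die size punishes yields so brutally, here's a toy version of the classic Poisson defect model (the defect density is an invented number for illustration, and the Fermi die size is an estimate):

Code:
import math

# Toy Poisson yield model: yield ~= exp(-defect_density * die_area).
# The 0.5 defects/cm^2 figure is invented purely for illustration.
def poisson_yield(die_area_mm2, defects_per_cm2):
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

print(poisson_yield(334, 0.5))  # Cypress-sized die: ~19% good chips
print(poisson_yield(530, 0.5))  # Fermi-sized die (~530 mm^2 assumed): ~7%

Same silicon and the same defect rate, and the bigger die still loses more than half its usable chips on area alone.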


Nvidia is in better shape than ATI to take a(nother) financial hit. The question is what, if anything, the company will be able to do *afterwards*?

So yeah, Nvidia? Deep trouble.


 

Posted

You're sounding a little Carl Sagan-like with all those "billions"! Thanks for the info, though - especially the manufacturing issues. I keep wondering where the Hades the Fermi announcements are.


 

Posted

Quote:
Originally Posted by Positron View Post
We don’t have numbers from the Radeon 58xx series yet. This would be able to run Ultra-mode with the quality maxed out (except we have no data on Anti-aliasing, so caveat emptor when it comes to that specific feature).
I have an ATI 5870 and it rocks; however, anti-aliasing seems to crash games, so I turn it off (I seem to recall having this problem with other ATI cards). Incidentally, I replaced my 9800GT with the ATI card and it was worth it, even if I can't use the anti-aliasing.


 

Posted

Quote:
Originally Posted by Human_Being View Post
Nvidia does make the graphics core of the Sony PS3, but ATI turns out to supply the graphics in the Xbox 360.
ATI also makes the chips in the Wii.


 

Posted

I'm running two BFG GTX 260 OC cards in SLI. The cards before this were two BFG 8800 GTX cards in SLI. I've had great luck with the company. This might not matter to some, but

http://www.bfgtech.com/Product.aspx?Category=Graphics Cards

is a US-based company, as is

http://www.evga.com/products/prodlis...+Series+Family

For those looking for deals on cards, check:

http://www.newegg.com/Store/SubCateg...cs-Video-Cards

http://www.compusa.com/applications/...me=Video-Cards

http://www.buy.com/cat/video-cards/61929.html

http://www.amazon.com/Graphics-Cards...rd_i=193870011

Don't rely on PriceGrabber-like programs, as you can find cheaper by just checking certain sites. Many a time I've found a good deal, then seen they charge an arm and a leg on shipping but were cheapest in the price grabber. And several times I've found something cheaper on Newegg that doesn't even show in a price list.

And if you use Amazon Marketplace, odds are it won't show up for weeks, but you can save a bit in the long run that way. Personally I prefer Newegg or CompUSA/TigerDirect (it's the same thing); returns are easier. Newegg especially has probably the best customer service for computer parts.