Ultra-mode video card shopping guide


5th_Player

 

Posted

Quote:
Originally Posted by Father Xmas View Post
I don't understand what you mean by "fit the hole on the back of the case"? The double bracket wouldn't slide into two adjacent slots? I could understand if the card was too long, the standard HD 5850 is an 11" or so card.

What case were you installing this in?
As I've mentioned previously (or at least I think I've mentioned previously), I've got a Dell Optiplex 360. It's a tower case.

Here are some pictures of the insides as requested (click to get to full size).



Yes, I can remove the guards at the back of the case; that's easy... but notice that the PCI Express slot is on the far left of the slots, and if the card's pins line up with it, the card will conflict with the CPU fan casing as well as the rear of the case. If only the slot in the middle were PCI Express, then I could put it there and I'd be laughing. Hope that makes sense.

Edited to add:

I've done some quick looking about... and the next tier down from the HD5850 is the HD4850.

Unfortunately the XFX HD 4850 is a dual slot card... while HD4850 cards from other manufacturers are single slot with largish heatsinks ... like say from Gigabyte or Sapphire.

Would the 4850 run Ultra mode? I know XFX have a good name for quality and gaming grade goods... but what about the other brands? Is there any difference?

And what is the difference between the 4850 and the 5850? I'll probably find out next time I'm on... but bed's calling.



"Just as I knew all of life's answers they changed all the questions!" - Unknown (seen on a poster)
Sig characters appear in the Château Rouge thread starting from post #100
I Support Nerd Flirting! - Story/Discussion/Sequel

 

Posted

Quote:
Originally Posted by Dark Shade View Post
As I've mentioned previously (or at least I think I've mentioned previously), I've got a Dell Optiplex 360. It's a tower case.

Here are some pictures of the insides as requested (click to get to full size).



Yes, I can remove the guards at the back of the case; that's easy... but notice that the PCI Express slot is on the far left of the slots, and if the card's pins line up with it, the card will conflict with the CPU fan casing as well as the rear of the case. If only the slot in the middle were PCI Express, then I could put it there and I'd be laughing. Hope that makes sense.

Edited to add:

I've done some quick looking about... and the next tier down from the HD5850 is the HD4850.

Unfortunately the XFX HD 4850 is a dual slot card... while HD4850 cards from other manufacturers are single slot with largish heatsinks ... like say from Gigabyte or Sapphire.

Would the 4850 run Ultra mode? I know XFX have a good name for quality and gaming grade goods... but what about the other brands? Is there any difference?

And what is the difference between the 4850 and the 5850? I'll probably find out next time I'm on... but bed's calling.
Ah, well, you do have a problem then, unless you're up for some serious case modding. Yikes. Poor design (or intentional) on the manufacturer's part. Onwards...

Judging by publicly known info, I'd think a 4850 could do UM but at minimum specs, while the 5850 ought to be able to hit the max or near it. So there's that difference. Also, the 4850 is older tech, so it doesn't support the newer DirectX and OpenGL specs like the 5850 does: it's DX 10.1/OGL 2.1, while the 5850 is DX 11/OGL 3.2.

As an alternative, you might have a look at this: http://www.newegg.com/Product/Produc...82E16814161317 It's a single-slot 5770 I found in a bit of searching. Not as good as the 5850 but better than the 4850 and it does the DX11/OGL 3.2 specs. I have no experience with HIS brand stuff but this card does have a good rating on NewEgg at least. You'll want to do your homework to see if it will fit but it might be an option for you.


It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.

 

Posted

Quote:
Originally Posted by Dark Shade View Post
As I've mentioned previously (or at least I think I've mentioned previously), I've got a Dell Optiplex 360. It's a tower case.

Here are some pictures of the insides as requested (click to get to full size).


Doh. Now it makes sense. BTX chassis. Yeah, that was Intel's attempt to "build a better mousetrap." Personally, I think Intel should just stay out of chassis development.

Quote:
I've done some quick looking about... and the next tier down from the HD5850 is the HD4850.
Em, no. The next tier down from the HD 5850 is the HD 5830, then the HD 57xx series of cards, the 5750 and 5770.

The HD 4850 is a generation back.

Quote:
Unfortunately the XFX HD 4850 is a dual slot card... while HD4850 cards from other manufacturers are single slot with largish heatsinks ... like say from Gigabyte or Sapphire.

Would the 4850 run Ultra mode? I know XFX have a good name for quality and gaming grade goods... but what about the other brands? Is there any difference?

And what is the difference between the 4850 and the 5850? I'll probably find out next time I'm on... but beds calling.
Okay. The difference between the 4850 and the 5850 is a big jump in performance, plus differences in features. The Radeon HD 5000 series supports DirectX 11 and OpenGL 3.2/3.3.

The 4850's comparable-performance parts in the current AMD line-up are the HD 5750 and HD 5770... but each of these generally costs a bit more than the old 4850.

Now, I can tell you that you don't want the "single slot" Gigabyte 4850s. I put some together for a client, and his computer is even louder at full tilt than my own dual-slot Sapphire 4850 CrossFire system.

Now, as to whether or not the 4850 will run Ultra Mode? Well, it is the "starting point" for Ultra Mode. Key words: starting point.

Until the NDA is lifted, or I figure out how to give performance figures without giving performance figures, that's pretty much all I can say on how the 4850 performs in Ultra Mode.


 

Posted

Looks to me like you have to pick a card that's truly single width: not just a single-width bracket, but no "large" heatsink either. That's going to limit your choices a lot, since most manufacturers simply assume you can fit a double bracket or a large heatsink if you're choosing higher-end video cards.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Hmm, I just bought the Nvidia GeForce 9800 GT graphics card (1024 MB GDDR3 version). Spent $139.00. Hope it's good enough?


 

Posted

Quote:
Originally Posted by CaptainPower View Post
Hmm, I just bought the Nvidia GeForce 9800 GT graphics card (1024 MB GDDR3 version). Spent $139.00. Hope it's good enough?
If you bought that in the US, you got screwed. 9800 GTs are selling for around $85-$95: http://www.pricewatch.com/search?q=9800+GT

If you bought it outside of the US, depending on what country you are in, it might have been a good buy.


 

Posted

Or you bought it at a store instead of online. Surprisingly, people still prefer holding a physical item before parting with their money over parting with their money and waiting 3-5 business days for delivery. Don't begrudge them, je_saist; the Interweb is a scary place full of Banks of Nicolai.

And actually checking NewEgg, a 1 GB 9800 GT is selling between $101 and $140 depending on the manufacturer, clock speed and accessories, not counting shipping, so not really a bad deal.

As for UM, Posi says it'll support UM features at their minimal settings. The rest of the game settings can remain turned all the way up, just the new UM settings are one notch above OFF.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Quote:
Originally Posted by Father Xmas View Post
Or you bought it at a store instead of online. Surprisingly people still prefer holding a physical item before parting with their money than parting with their money and waiting 3-5 business days for delivery.
Some of us like to shop locally, too.


 

Posted

Quote:
Originally Posted by Psyte View Post
Some of us like to shop locally, too.
lo-cal-ly? Like...LAN?


 

Posted

Quote:
Originally Posted by Cade Lawson View Post
lo-cal-ly? Like...LAN?
Support your local mom-and-pop computer shop! 'Cause where else are you going to get a hard drive when you crash yours and the latest and greatest game comes out tomorrow!


Doom/Batman in 2012

The Resistance has boobs too, and better hair!

 

Posted

Quote:
Originally Posted by Daimyo_Shi View Post
Support your local mom-and-pop computer shop! 'Cause where else are you going to get a hard drive when you crash yours and the latest and greatest game comes out tomorrow!
Yep. Thing for me is I do live on an island (Maui, Hawai'i) that's mostly reliant on tourism (come and visit us!). So spending money locally versus online is a bit more important to me. Even buying from the local pawn sho... er... Gamestop at least is helping pay somebody's check around here.

The fact that I work for a newspaper company adds to this for me - I support local businesses and the local community because they're the ones that pay my check, too!

There are some things I can't get around here, but overall, even with exceptions? I try to buy local.

I will say this - the local computer shops are good about saying, "Yeah, we can get it for you, but honestly, just go to Newegg" if the pricing will be too out of whack. But at least I'm giving them the first shot.


 

Posted

Quote:
Originally Posted by Psyte View Post
There are some things I can't get around here
As long as this is on your island:



we'll call it even.


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

Quote:
Originally Posted by Arcanaville View Post
As long as this is on your island:



we'll call it even.
Maui Mall, same side as Longs and Subway. You are a person of magnificent tastes. For those of you wondering, "What the heck is that?", guri-guri is a sherbet-like treat. Very nice, and I've personally only ever had it on Maui.

They don't do the beans* anymore, but then, every time I went they were always out anyways, so... Man... I haven't been there in a while. I should go sometime soon...

*I'm assuming azuki beans. I'm sure some of you were like, "wait, what? Bush's baked beans?" No, no, no...


 

Posted

Question for you system experts...

What effect would putting a PCIe 2.0 video card into a system board that is PCIe 1.0 have? I know 2.0 is backward compatible, but I figure it has to slow down a top-end card somewhat anyway.

For comparison, here is my system info:

Board: ASUS A8N32-SLI Deluxe
Card: 8800 GTX
Memory: 2 GB
OS: WinXP
CPU: Athlon 64 X2 4400+
PSU: 650W BFG

If I were to put in, say, a GTX285 would I get a performance boost do you think?

Thanks!


 

Posted

The difference between PCIe v1 and v2 is that v2 has double the bandwidth. Another way of looking at it: a PCIe x16 v1 interface has the same bandwidth as a PCIe x8 v2 interface.

If you think about it that way then you can look at the tests done over at Tom's Hardware on PCIe scaling. The results (using an i7-870 and an HD 5870) showed only around a 4% decrease in performance in the games they tested. Slower CPUs and video cards would make the impact even less.

Since the GTX 285 is considerably more powerful than your 8800 GTX, games whose performance is throttled by the GPU will improve. It won't be as fast as it could be in a PCIe v2 slot with a hefty CPU feeding it, but it will certainly be faster than the original 8800 GTX.
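To make the lane arithmetic above concrete, here's a quick back-of-the-envelope sketch. The per-lane figures are the commonly quoted PCIe spec numbers (250 MB/s for 1.x, 500 MB/s for 2.0, per direction), not anything measured on this hardware:

```python
# Per-direction bandwidth per lane, in GB/s (spec figures, not measured).
LANE_BW_GBPS = {"1.x": 0.25, "2.0": 0.5}

def slot_bandwidth(gen, lanes=16):
    """Per-direction bandwidth of a PCIe slot in GB/s."""
    return LANE_BW_GBPS[gen] * lanes

v1_x16 = slot_bandwidth("1.x", 16)   # 4.0 GB/s
v2_x8 = slot_bandwidth("2.0", 8)     # 4.0 GB/s
print(v1_x16, v2_x8, v1_x16 == v2_x8)  # a v1 x16 slot matches a v2 x8 slot
```

Which is exactly the equivalence mentioned above: a card dropped into a v1 x16 slot effectively runs with half the bus headroom it would have in a v2 x16 slot.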


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Quote:
Originally Posted by Psyte View Post
Yep. Thing for me is I do live on an island (Maui, Hawai'i) that's mostly reliant on tourism (come and visit us!). <snip>
So, got a spare room I can crash in while I visit? I might be able to scrounge plane fare if I had a place to stay...

"Psyte's Paragon Hotel: The perfect tropical place for the wayward gamer. Ask about our beach LAN parties!"






And now, back to your regularly scheduled thread....


It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.

 

Posted

Any news in areas that could affect video card pricing? I'm sort of not expecting Fermi to do much at all (sadly), so I'm hoping something else might be happening to help bring prices down. ATI/AMD might do a refresh of the lineup later on, right? Do those, traditionally, help much?

I suppose, if nothing else, I should just look at getting a new PSU soon anyhow... just to be ready.


 

Posted

Quote:
Originally Posted by Father Xmas View Post
The difference between PCIe V1 and V2 is V2 is double the bandwidth. Another way of looking at it is a PCIe x 16 V1 interface has the same bandwidth as a PCIe x 8 V2 interface.
Thanks, Father X - that is interesting. The ASUS mobo has 2 PCIe x16 slots, but I only have one vid card. So, a thought experiment (probably not really going to do it, and UM doesn't like dual boards yet, apparently):

What would the general performance comparison be for two SLI-linked 8800 GTXs (which can use 16 x 2 = a combined x32 of bandwidth) vs. one GTX 285, which is a faster card but can only use one PCIe 1.0 x16 link?

Looking at nVidia's site and using the 9800GTX+ as a stand-in for the 8800 GTX (since they were about equal in performance - slight edge sometimes to the 8800 for the extra memory)

8800GTX x 2 = theoretical 30x 3DMark®Vantage Performance Preset (somewhat less due to SLI inefficiencies)


GTX285 x 1 using PCIe 1.0a single card = something less than 28x 3DMark®Vantage Performance Preset - perhaps 4% less

Going back and looking at 3DMark tests, 2x8800GTX scores 5467 on the SM3.0 test
http://www.legitreviews.com/article/421/7/

...on a PCIe x16 board, SLI giving it a x32 effective I guess. WinXP OS

and the GTX 285 scores 7540-7731 on the same test here
http://www.legitreviews.com/article/915/5/

...which is a PCIe 2 x16 board, so x32 effective I guess. Vista SP1 OS

So would it be correct to conclude that buying another 8800 GTX would give about 75% of the performance of buying a GTX 285, for a given system?

Interestingly, according to Google, a new 8800 GTX is more expensive ($483 is the lowest it found) than a new GTX 285 ($352). Of course a used 8800 GTX might be found, but it's still a weird disparity.
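For what it's worth, the rough ratio between those two SM3.0 scores is easy to check directly. Treat it as a ballpark only, since the two reviews used different test rigs:

```python
# SM3.0 scores quoted above; the rigs differ (CPU, OS, memory), so this is
# only a rough sanity check on the "about 75%" estimate, not a benchmark.
sli_8800gtx = 5467      # two 8800 GTXs in SLI
gtx285_low = 7540       # single GTX 285, low end of the quoted range

ratio = sli_8800gtx / gtx285_low
print(f"SLI pair is roughly {ratio:.0%} of a GTX 285")  # roughly 73%
```

So the quoted numbers do land in the neighborhood of three-quarters, with the usual caveat that the comparison crosses two different test setups.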


 

Posted

http://www.newegg.com/Product/Produc...82E16814131330

$139 after rebate sound like a good deal for a 5770? It would be coupled with a Q6600, replacing a Radeon 4670.

Guess I need to find a suitable PSU to go with it...

erg... Open Beta can't hit soon enough.


 

Posted

I would suggest that people try to log into the Test Server. Although you won't be able to actually log in, it seems that CoX auto-detects your graphics card and notifies you (at the login screen) whether it can use Ultra Mode and at what setting.

It also seems that this notification will only appear after applying the current version/patch; so I'm assuming that any further changes would have to be done from within the game.

Hopefully, this is WAI.

Finally, this 'detection' may not be the end-all. Just because it says you can doesn't mean that there won't be levels of 'can' and without actually being able to get in and test...


Apparently, I play "City of Shakespeare"
*Arc #95278-Gathering the Four Winds -3 step arc; challenging - 5 Ratings/3 Stars (still working out the kinks)
*Arc #177826-Lights, Camera, Scream! - 3 step arc, camp horror; try out in 1st person POV - 35 Ratings/4 Stars

 

Posted

Quote:
Originally Posted by Yogi_Bare View Post
I would suggest that people try to log into the Test Server. Although you won't be able to actually log in, it seems that CoX auto-detects your graphics card and notifies you (at the login screen) whether it can use Ultra Mode and at what setting.
Really? That's interesting!


 

Posted

Quote:
Originally Posted by Agonist_NA View Post
Thanks, Father X - that is interesting. The ASUS mobo has 2 PCIe x16 slots, but I only have one vid card. So, a thought experiment (probably not really going to do it, and UM doesn't like dual boards yet, apparently):

What would the general performance comparison be for two SLI-linked 8800 GTXs (which can use 16 x 2 = a combined x32 of bandwidth) vs. one GTX 285, which is a faster card but can only use one PCIe 1.0 x16 link?

Looking at nVidia's site and using the 9800GTX+ as a stand-in for the 8800 GTX (since they were about equal in performance - slight edge sometimes to the 8800 for the extra memory)

8800GTX x 2 = theoretical 30x 3DMark®Vantage Performance Preset (somewhat less due to SLI inefficiencies)


GTX285 x 1 using PCIe 1.0a single card = something less than 28x 3DMark®Vantage Performance Preset - perhaps 4% less

Going back and looking at 3DMark tests, 2x8800GTX scores 5467 on the SM3.0 test
http://www.legitreviews.com/article/421/7/

...on a PCIe x16 board, SLI giving it a x32 effective I guess. WinXP OS

and the GTX 285 scores 7540-7731 on the same test here
http://www.legitreviews.com/article/915/5/

...which is a PCIe 2 x16 board, so x32 effective I guess. Vista SP1 OS

So would it be correct to conclude that buying another 8800 GTX would give about 75% of the performance of buying a GTX 285, for a given system?

Interestingly, according to Google, a new 8800 GTX is more expensive ($483 is the lowest it found) than a new GTX 285 ($352). Of course a used 8800 GTX might be found, but it's still a weird disparity.
First, SLI isn't an automatic 2x improvement in performance. A lot of it has to do with how the game's code is structured; specifically, whether the game code running on the CPU is waiting on the video card to finish before the next frame can start. The longer the wait, the more of a speed boost a 2nd card gives.

Second, each card may get a full 16 lanes of PCIe v1, but it's not like either can "borrow" the additional lanes from the other as needed. The SLI bridge exists so one card can read the frame buffer on the other for output to the monitor.

Third, the price disparity exists because the 8800 GTX isn't made anymore, and if someone is looking for one it's because they already have one and are looking to do SLI. It doesn't occur to them, the gamer looking for a 2nd card, that a single newer card may be considerably faster and cheaper. You probably remember, since you have one, that the 8800 GTX was a $600 card when it debuted, and I can understand why someone wouldn't want to pull it, bag it and stick it on a shelf of old parts when they think they can still use it if only they could find another for SLI.

Lastly, those two reviews you cite use different hardware (CPU, OS, memory), so you really can't compare the two. You could compare them if you found a review that used the same setup to test every card. They are also using a highly overclocked CPU and rather high quality settings, which helps SLI/CrossFire shine.


Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components

Tempus unum hominem manet

 

Posted

Quote:
Originally Posted by Yogi_Bare View Post
I would suggest that people try to log into the Test Server. Although you won't be able to actually log in, it seems that CoX auto-detects your graphics card and notifies you (at the login screen) whether it can use Ultra Mode and at what setting.

It also seems that this notification will only appear after applying the current version/patch; so I'm assuming that any further changes would have to be done from within the game.

Hopefully, this is WAI.

Finally, this 'detection' may not be the end-all. Just because it says you can doesn't mean that there won't be levels of 'can' and without actually being able to get in and test...
Yeah, I'd definitely not take this as an absolute endorsement (or lack thereof) concerning whether you can run UM. The CB is still ongoing and the OB is yet to come, so changes are entirely possible if not likely. Those with good cards can almost certainly rest easy about doing UM at some level. Those on the edge (whatever that is exactly) may find themselves getting a 'no' today and a 'yes' sometime later as things shake out (or vice versa). So take whatever you may see with a big grain of salt.


It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.

 

Posted

Thanks again Father Xmas!

Quote:
Originally Posted by Father Xmas View Post
First, SLI isn't an automatic 2x improvement in performance. A lot of it has to do with how the game's code is structured; specifically, whether the game code running on the CPU is waiting on the video card to finish before the next frame can start. The longer the wait, the more of a speed boost a 2nd card gives.

Second, each card may get a full 16 lanes of PCIe v1, but it's not like either can "borrow" the additional lanes from the other as needed. The SLI bridge exists so one card can read the frame buffer on the other for output to the monitor.
Yeah, that is what I meant by SLI inefficiencies, but good to point out explicitly...

Quote:
Originally Posted by Father Xmas View Post
Third, the price disparity exists because the 8800 GTX isn't made anymore, and if someone is looking for one it's because they already have one and are looking to do SLI. It doesn't occur to them, the gamer looking for a 2nd card, that a single newer card may be considerably faster and cheaper. You probably remember, since you have one, that the 8800 GTX was a $600 card when it debuted, and I can understand why someone wouldn't want to pull it, bag it and stick it on a shelf of old parts when they think they can still use it if only they could find another for SLI.
Ain't people funny? Personally, I got mine for a killer price ($150 on eBay a few months after it came out; I put the bid in on a lark). This actually further supports my position about CrossFire/SLI: it is sometimes positioned as an upgrade path when, for the two systems I have had, it never made money sense. Both times it was cheaper and better to buy a new card rather than a second card. So really, a dual-card setup only makes sense for leveraging the current top-end technology, at the price of beaucoup bucks. Given the rate of technology progress, I doubt it will make money sense anytime soon to buy a second card to "keep up." The cheapest 8800 GTX I found was ~$150; even at 50% of the price, it's a wash in performance per dollar (at best).

Quote:
Originally Posted by Father Xmas View Post
Lastly, those two reviews you cite use different hardware (CPU, OS, memory), so you really can't compare the two. You could compare them if you found a review that used the same setup to test every card. They are also using a highly overclocked CPU and rather high quality settings, which helps SLI/CrossFire shine.
Oooh, I wish I had known about that feature on the site - that is slick, and I would have wasted a lot less time comparing Red Delicious apples to Granny Smith apples (not quite apples to oranges, you see... though not quite apples to apples either).

So is that comparison saying that an SLI'd pair of 8800 GTXs is in fact almost comparable to a GTX 285 in certain situations?


 

Posted

So I have a 9600 GSO. It's a little bit different than most of the 9600s in that it has a bit more memory, and it was a steal for the price I got it at. I'm wondering, though, if it is up to par with the 9800 models so that it could run Ultra Mode? What do you guys think?