Ultra-mode video card shopping guide
Judging by publicly known info, I'd think a 4850 could do UM, but at minimum specs, while the 5850 ought to be able to hit the max or near to it. So there's that difference. Also, the 4850 is older tech, so it doesn't support the newer DirectX and OpenGL specs like the 5850 does. The 4850 is DX 10.1/OGL 2.1, while the 5850 is DX 11/OGL 3.2.
As an alternative, you might have a look at this: http://www.newegg.com/Product/Produc...82E16814161317 It's a single-slot 5770 I found in a bit of searching. Not as good as the 5850 but better than the 4850 and it does the DX11/OGL 3.2 specs. I have no experience with HIS brand stuff but this card does have a good rating on NewEgg at least. You'll want to do your homework to see if it will fit but it might be an option for you.
It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.
I've done some quick looking about... and the next tier down from the HD5850 is the HD4850. |
The HD 4850 is a generation back.
Unfortunately the XFX HD 4850 is a dual slot card... while HD4850 cards from other manufacturers are single slot with largish heatsinks... like say from Gigabyte or Sapphire.
Would the 4850 run Ultra mode? I know XFX have a good name for quality and gaming grade goods... but what about the other brands? Is there any difference?
And what is the difference between the 4850 and the 5850? I'll probably find out next time I'm on... but bed's calling. |
The 4850's comparable parts, performance-wise, in the current AMD line-up are the HD 5750 and HD 5770... but each of these generally costs a bit more than the old 4850.
Now, I can tell you that you don't want the "single slot" Gigabyte 4850's. I put some together for a client and his computer is even louder at full tilt than my own Dual-slot Sapphire 4850 crossfire system.
Now, as to whether or not the 4850's will run ultra mode? Well. It is the "starting point" for Ultra mode. Key-words: starting point
Until the NDA is lifted, or I figure out how to give performance figures without giving performance figures, that's pretty much all I can say on how the 4850 performs in Ultra Mode.
Looks to me like you have to pick a card that's truly single-width: not just a single-wide bracket, but no "large" heatsink either. That's going to limit your choices a lot, since most manufacturers simply assume you can fit a double bracket or a large heatsink if you're choosing higher-end video cards.
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
Hmm, I just bought the Nvidia GeForce 9800GT graphics card (1024MB GDDR3 version). Spent $139.00. Hope it's good enough?
If you bought it outside of the US, depending on what country you are in, it might have been a good buy.
Or you bought it at a store instead of online. Surprisingly, people still prefer holding a physical item before parting with their money to parting with their money and waiting 3-5 business days for delivery. Don't begrudge them, je_saist; the Interweb is a scary place, full of Banks of Nicolai.
And actually checking NewEgg, a 1GB 9800GT is selling between $101 and $140 depending on the manufacturer, clock speed and accessories (not counting shipping), so it's not really a bad deal.
As for UM, Posi says it'll support UM features at their minimal settings. The rest of the game settings can remain turned all the way up, just the new UM settings are one notch above OFF.
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
Support your local Mom and Pop computer shop! 'Cause where else are you going to get a hard drive when you crash yours and the latest and greatest game comes out tomorrow!
The fact that I work for a newspaper company adds to this for me - I support local businesses and the local community because they're the ones that pay my check, too!
There are some things I can't get around here, but overall, even with exceptions? Try to get local.
I will say this - the local computer shops are good about saying, "yeah, we can get it for you, but honestly, just go to Newegg" if the pricing will be too out of whack. But at least I'm giving them the first shot, so we'll call it even.
[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]
In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)
They don't do the beans* anymore, but then, every time I went they were always out anyways, so... Man... I haven't been there in a while. I should go sometime soon...
*I'm assuming azuki beans. I'm sure some of you were like, "wait, what? Bush's baked beans?" No, no, no...
Question for you system experts...
What effect would putting a PCI-E 2 vid card into a system board that is PCI-E 1 have? I know 2 is backward compatible, but I figure it has to slow down a top-end card somewhat anyway.
For comparison, here is my system info:
Board: ASUS A8N32-SLI Deluxe
Card: 8800 GTX
Memory: 2 GB
OS: WinXP
CPU: Athlon 64 X2 4400+
PSU: 650W BFG
If I were to put in, say, a GTX285 would I get a performance boost do you think?
Thanks!
The difference between PCIe V1 and V2 is that V2 has double the bandwidth per lane. Another way of looking at it: a PCIe x16 V1 interface has the same bandwidth as a PCIe x8 V2 interface.
If you think about it that way, then you can look at the tests done over at Tom's Hardware on PCIe scaling. The results (using an i7-870 and an HD 5870) showed only around a 4% decrease in performance in the games they tested. Slower CPUs and video cards would make the impact even smaller.
Since the GTX 285 is considerably more powerful than your 8800 GTX, any game that's being throttled by GPU performance will improve. It won't be as fast as it could be in a PCIe V2 slot with a hefty CPU feeding it, but it will certainly be faster than with the original 8800 GTX.
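To put rough numbers on that, here's a quick back-of-the-envelope in Python. The per-lane figures are my assumption (the commonly quoted ~250 MB/s per lane per direction for PCIe 1.x and ~500 MB/s for PCIe 2.0), not something from this thread:

```python
# Rough PCIe bandwidth math. Per-lane figures are assumed:
# ~250 MB/s for PCIe 1.x, ~500 MB/s for PCIe 2.0 (per direction).
V1_PER_LANE_MBS = 250
V2_PER_LANE_MBS = 500

v1_x16 = V1_PER_LANE_MBS * 16   # PCIe V1 x16 slot
v2_x8 = V2_PER_LANE_MBS * 8     # PCIe V2 x8 slot
v2_x16 = V2_PER_LANE_MBS * 16   # PCIe V2 x16 slot

print(v1_x16, v2_x8)        # 4000 4000 -> same total bandwidth
print(v2_x16 / v1_x16)      # 2.0 -> V2 doubles V1 at equal width
```

Which is just the "x16 V1 == x8 V2" point in numbers.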
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
Yep. Thing for me is I do live on an island (Maui, Hawai'i) that's mostly reliant on tourism (come and visit us!). <snip>
"Psyte's Paragon Hotel: The perfect tropical place for the wayward gamer. Ask about our beach LAN parties!"
And now, back to your regularly scheduled thread....
It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.
Any news in areas that could affect video card pricing? I'm sort of not expecting Fermi to do much at all (sadly), so I'm hoping something else might be happening to help bring prices down. ATI/AMD might do a refresh of the lineup later on, right? Do those, traditionally, help much?
I suppose, if nothing else, I should just look at getting a new PSU soon anyhow... just to be ready.
The difference between PCIe V1 and V2 is that V2 has double the bandwidth per lane. Another way of looking at it: a PCIe x16 V1 interface has the same bandwidth as a PCIe x8 V2 interface.
What would the generic performance comparison be for two SLI-linked 8800GTXs (which can use 16x2 = a combined x32 of bandwidth) vs. one GTX285, which is a faster card but can only use a single PCIe 1 x16 link?
Looking at nVidia's site and using the 9800GTX+ as a stand-in for the 8800 GTX (since they were about equal in performance - slight edge sometimes to the 8800 for the extra memory):
8800GTX x 2 = theoretical 30x 3DMark®Vantage Performance Preset (somewhat less due to SLI inefficiencies)
GTX285 x 1 using PCIe 1.0a single card = something less than 28x 3DMark®Vantage Performance Preset - perhaps 4% less
Going back and looking at 3DMark tests, 2x8800GTX scores 5467 on the SM3.0 test
http://www.legitreviews.com/article/421/7/
...on a PCIe x16 board, SLI giving it a x32 effective I guess. WinXP OS
and the GTX 285 scores 7540-7731 on the same test here
http://www.legitreviews.com/article/915/5/
...which is a PCIe 2 x16 board, so x32 effective I guess. Vista SP1 OS
So would it be correct to conclude that buying another 8800GTX would get about 75% of the performance of buying a GTX285 for a given system?
Interestingly, according to Google, the 8800GTX new is more expensive ($483 lowest it found) than a new GTX 285 ($352). Of course a used 8800GTX might be found, but still a weird disparity.
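For what it's worth, plugging the quoted scores into a quick ratio (keeping in mind the two reviews used different rigs, so this is ballpark at best):

```python
# Ballpark ratio from the scores quoted above. The two reviews
# used different hardware, so treat this as a rough estimate only.
sli_8800gtx = 5467               # 2x 8800 GTX, SM3.0 test
gtx285_range = (7540, 7731)      # single GTX 285, same test

for score in gtx285_range:
    print(f"{sli_8800gtx / score:.0%}")   # 73% then 71%
```

So "about 75%" is in the right neighborhood; the quoted numbers work out closer to 71-73%.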
http://www.newegg.com/Product/Produc...82E16814131330
Does $139 after rebate sound like a good deal for a 5770? It would be coupled with a Q6600, replacing a Radeon 4670.
Guess I need to find a suitable PSU to go with it...
erg... Open Beta can't hit soon enough.
I'd suggest that people try to log into the Test Server. Although you won't be able to actually get in, it seems that CoX auto-detects your graphics card and notifies you if it can use Ultra-Mode and at what setting (at the login screen).
It also seems that this notification will only appear after applying the current version/patch; so I'm assuming that any further changes would have to be done from within the game.
Hopefully, this is WAI.
Finally, this 'detection' may not be the end-all. Just because it says you can doesn't mean that there won't be levels of 'can' and without actually being able to get in and test...
Apparently, I play "City of Shakespeare"
*Arc #95278-Gathering the Four Winds -3 step arc; challenging - 5 Ratings/3 Stars (still working out the kinks)
*Arc #177826-Lights, Camera, Scream! - 3 step arc, camp horror; try out in 1st person POV - 35 Ratings/4 Stars
Thanks Father X - that is interesting. The ASUS mobo has 2 PCIe x16 slots, but I only have one vid card. So thought experiment (probably not really going to do it and UM doesn't like dual boards yet apparently):
<snip> |
First, SLi isn't an automatic 2x improvement in performance. A lot of it has to do with how the game's code is structured and whether the game code running on the CPU is waiting on the video card to finish before the next frame can start. The longer the wait, the bigger the speed boost from having a 2nd card in the system.
Second, each card may get a full 16 lanes of PCIe V1, but it's not like either can "borrow" the additional lanes from the other as needed. The SLi bridge is just so one card can access the frame buffer on the other to read for output to the monitor.
Third, the price disparity is because the 8800 GTX isn't made anymore, and if someone is looking for one it's because they already have one and are looking to do SLi. It doesn't occur to them, the gamer looking for a 2nd card, that a single newer card may be considerably faster and cheaper. You probably remember, since you have one, that the 8800 GTX was a $600 card when it debuted, and I can understand why someone wouldn't want to pull it, bag it, and stick it on a shelf of old parts when they think they can still use it if only they could find another for SLi.
Lastly, those two reviews you cite use different hardware (CPU, OS, memory), so you really can't compare the two. You can only compare results within a review where they used the same setup to test every card. They were also using a highly overclocked CPU and rather high quality settings, which helps SLi/Crossfire shine.
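The point about the CPU waiting on the video card can be sketched with a toy frame-time model. All the millisecond numbers and the 90% scaling factor here are invented for illustration, not measured:

```python
# Toy model: per frame, CPU work doesn't shrink when you add a
# 2nd GPU, so SLi only (roughly) halves the GPU portion of the
# frame time. All numbers are invented for illustration.
def sli_speedup(cpu_ms, gpu_ms, scaling=0.9):
    """Overall speedup when a 2nd card cuts GPU time, imperfectly."""
    single = cpu_ms + gpu_ms
    dual = cpu_ms + gpu_ms / (2 * scaling)
    return single / dual

print(round(sli_speedup(cpu_ms=5, gpu_ms=25), 2))   # 1.59: GPU-bound frame
print(round(sli_speedup(cpu_ms=20, gpu_ms=10), 2))  # 1.17: CPU-bound frame
```

Same pair of cards, very different payoff, depending on how long the CPU spends waiting on the GPU each frame.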
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet
Would suggest that people try to log into the Test Server. Although you won't be able to actually log in, it seems that CoX auto-detects your graphics card and notifies you if it can use Ultra-Mode and at what setting (at the login screen).
<snip> |
It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.
Thanks again Father Xmas!
First, SLi isn't an automatic 2x improvement in performance. A lot of it has to do with how the game's code is structured and whether the game code running on the CPU is waiting on the video card to finish before the next frame can start. The longer the wait, the bigger the speed boost from having a 2nd card in the system.
<snip> |
So is that comparison saying that an SLI'ed dual-8800GTX setup is in fact almost comparable to a GTX285 in certain situations?
So I have a 9600 GSO. It's a little bit different than most of the 9600s in that it has a bit more memory, and it was a steal for the price I got it at. I'm wondering, though, if it's up to par with the 9800 models so that it could run Ultra mode? What do you guys think?
What case were you installing this in?
Here are some pictures of the insides as requested (click to get to full size).
Yes, I can remove the guards at the back of the case, that's easy... but notice that the PCI-Express slot is on the far left of the slots, and if the pins of the card line up with it, it will conflict with the CPU fan casing as well as the rear of the case. If only the slot in the middle were a PCI-Express slot, then I could put it there and I'd be laughing. Hope that makes sense.
Edited to add:
I've done some quick looking about... and the next tier down from the HD5850 is the HD4850.
Unfortunately the XFX HD 4850 is a dual slot card... while HD4850 cards from other manufacturers are single slot with largish heatsinks ... like say from Gigabyte or Sapphire.
Would the 4850 run Ultra mode? I know XFX have a good name for quality and gaming grade goods... but what about the other brands? Is there any difference?
And what is the difference between the 4850 and the 5850? I'll probably find out next time I'm on... but bed's calling.
Sig characters appear in the Château Rouge thread starting from post #100
I Support Nerd Flirting! - Story/Discussion/Sequel