Quote: I'd grab what I just picked up for my rig (which has the same specs the OP listed, aside from a 650 W PSU): the GeForce GTX 260 with 896 MB of RAM. It's shown on fatherXmas's chart he linked to, and it sits slightly above all the cards listed thus far.

Several problems with this:
A: GTX 260 prices are going up as stock decreases.
B: The GTX 260 isn't DX11 / OpenGL 3.2 compatible.
C: The cheapest 5770 starts at $155: http://www.newegg.com/Product/Produc...82E16814131326
The cheapest GTX 260, which doesn't really outrun it, starts at $179: http://www.newegg.com/Product/Produc...82E16814143189
D: The performance difference between the 5770 and the GTX 260 isn't that great.
Actually, the Catalyst drivers are known to be holding several games back, with some reports placing the beta Catalyst 10.1 drivers as matching the GTX 260 in non-Nvidia TWIMTBP titles. Whether that beta performance will turn into real-world performance on "your rig" remains to be seen. This early in a video card's life, though, it's not uncommon to see another 5%, 10%, or 15% of overall performance wrung out as the drivers are refined and optimized for the new architecture. With the GTX series being (forcefully) end-of-lifed, there probably won't be any future driver performance enhancements.
The other side of the performance coin is that DX11 / OpenGL 3.2 is a lot like DX10 / OpenGL 3.0 over DX9 / OpenGL 2.0-2.1. There's not actually a lot in the APIs that cannot be implemented or accomplished on the older feature set. However, the architecture built to support the newer APIs allows some operations to be accomplished quicker. A shader-intensive scene that might require 8 passes on the DX9 / OpenGL 2.0-2.1 API might only require 5 or 6 passes on the DX10 / OpenGL 3.0 API.
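Put another way (a purely illustrative sketch, not any real engine's pipeline), the win is in how many full-screen passes have to be chained together to build the same final image:

```cpp
#include <string>
#include <vector>

// Hypothetical pass lists for one shader-heavy effect; the names and the
// 8-vs-5 split are purely illustrative, mirroring the example above. Older
// shader models cap instruction counts and render-target use, so partial
// results have to be written out and re-read across extra full-screen passes.
std::vector<std::string> dx9Passes() {
    return {"depth", "ambient", "lights-1", "lights-2",
            "shadows", "reflection", "bloom", "composite"};   // 8 passes
}

std::vector<std::string> dx10Passes() {
    return {"depth", "ambient+lights", "shadows",
            "reflection", "bloom+composite"};                 // 5 passes
}
```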
This means, in theory, that graphics engines built against newer versions of the various APIs should run faster on the same hardware. That hasn't actually been the case in the real world with engines based on Unreal or CryEngine, where the DX10 path never offered noticeable performance gains over the DX9 path.
There is a counterpoint to this argument: those two engines were really DX9 / OpenGL 2.0 engines that were extended, rather than built from the ground up. Other engines, such as id's Tech 5, Valve's Source, and the new Unigine, have been noticeably faster when running similar scenes against different APIs on the same hardware at the same quality levels.
Since DX11 / OpenGL 3.2 are evolutions, rather than revolutions, of the existing APIs, there is a question of whether engines built against previous APIs and merely extended to use new API features will be any faster.
Is a $24 price difference really worth it for a card that runs hotter, uses more power, isn't actually any faster, and doesn't have the same API feature support?
Personally, no, it's not.
The 5750 and 5770 are quite capable of powering a DX11 game at 1680x1050, even with anti-aliasing: http://hardocp.com/article/2009/12/2...mage_quality/1
So there is some assurance that as DX11 / OpenGL 3.2 titles come down the pipeline, these mid-range cards will be able to run them.
The GTX 260? Well. It won't. At all. -
-
Quote: I'm trying to see if there's a good upgrade option worth the investment from my nVidia GeForce 8600 GT. My main reason for wanting to upgrade is to be ready for Going Rogue when it comes out. I know system requirements for the new graphics haven't been listed yet, but I'm sure the higher-end cards out at the moment would suffice.
My current specs are:
Dual Core E6750 @ 2.66GHz
3GB RAM DDR3 1066
PSU - 500W ATX12V V2.01
I know I may need to upgrade the PSU, but currently what I'm looking for is a good GFX card suggestion. I was thinking of the GTS 250, but I'm not sure how much better my FPS would be with that. Any ideas?

There's a whole thread on this subject here: http://boards.cityofheroes.com/showthread.php?t=200245
Short version: the GTS 250 is a 9800 GTX+ that's been die-shrunk and rebadged. Yes, it's a fair bit above an 8600...
Nvidia is pulling the GTX chips from the market, causing prices to go up as stocks dwindle.
Nvidia's Fermi won't be here until March at the earliest.
If you are buying right now, buy AMD. Buy any of the RadeonHD 5x00 series cards shipping. They are the only DirectX 11 / OpenGL 3.2 cards on the market.
The RadeonHD 5750 and 5770 cards are generally in stock and under $200.
The RadeonHD 5850, 5870, and 5970 cards are now being stocked as shipments pick up, and most will start in the $300+ bracket.
The best bang for your buck in new cards is the RadeonHD 5750 in Crossfire mode: http://www.xbitlabs.com/articles/vid...rossfirex.html
Don't look for a price drop on the RadeonHD 5x00 series anytime soon. They have no competition.
The best bang for your buck in older cards is the RadeonHD 4850 in Crossfire. They start under $130: http://www.newegg.com/Product/Produc...82E16814150351 -
-
Quote: Would Ultra Mode work on a Radeon X1600/1650 series graphics card? 'Cause that's what I've got.

Probably... not.
The x1x00 series was the last generation of AMD's DirectX 9 / OpenGL 2.0 line-up of cards. If Going Rogue is utilizing OpenGL 2.0 ES, then yes, the x1x00 series should technically be capable of implementing all of the graphical effects called for by the graphics engine.
However, given how an x1650 I have right now performs in today's games based on the Unreal 3 and CryTek engines, I seriously doubt it can compute Going Rogue's graphics fast enough to deliver more than 2 or 3 frames a minute.
If the Going Rogue graphics engine is built against OpenGL 3.0, then no, you will not be able to use the new engine at all.
The good news for you is that the OpenGL shaders (presumably written by Nvidia years ago) that currently break certain graphics on ATi cards, like water effects and anti-aliasing, have been re-written as part of the Going Rogue improvements. When the Ultra Mode updates hit, you should find that the existing graphical errors with ATi cards are fixed in the current graphics engine. -
-
Quote: Thanks Human, and you are right, I was checking prices again on the cards; they just keep going up. I read somewhere that NVidia is supposed to come out with new cards in Feb/March; hopefully that will put some pressure on ATI to reduce theirs a bit. But then again, with so much function coming out with them, they can keep the supply low knowing the games demand them, so we will pay :-( Once my box comes in I'm going to go for the power supply and then, maybe after my tax return comes in... dig deeper and get the card :-( ... and I'll keep up with Tom's Hardware lol

March. Unfortunately, Digitimes has already placed the original report behind the members-only block, but other sites reported on the recent semi-official news: http://www.digitimes.com/news/a20091228PD207.html
http://firingsquad.com/news/newsarti...searchid=22425
TSMC has reportedly fixed their 40nm problems with the RadeonHD cards. However, the RadeonHD chips are significantly less complex than the Fermi chips Nvidia is trying to make (~2.15 billion versus 3 billion-plus transistors), so although the process is now working for AMD's orders, it might not be working entirely for Nvidia's. -
-
-
*makes a final attempt at abducting Ali for kitty huggles*
-
-
I find myself agreeing with the original post. A recent thread celebrated Triumph running an all-defender ITF... and I badly wanted to respond in that thread that no, Triumph wasn't dead as the thread title suggested, but fragmented.
Part of the fragmentation comes from having multiple badge channels. Triumph Watch went through a split a long time ago where a bunch of players moved to TW 2.0, some formed TW 2.1, and several remained in the original TW channel because the devs were supposed to shortly fix the problems with inactive accounts clogging up global channel membership lists (still hasn't happened). Then there were personality breaks, with some players being so atrociously bad that groups of players formed separate private channels. Then there were the supergroup problems, such as the complete fall-out of Feline Fellowship earlier in the year.
If you're able to dance around the fragmentation, Triumph is an active server... but if you're part of the cliques that have pulled away, some may never know you are on the server at all.
I do think the fragmentation of veteran players hurts new players entering the game. I used to enjoy entire supergroups dedicated to providing taxi rides, or watching over the lower-level zones with healers. Several of the events that attracted me to CoH long ago... I don't see carried out unless I do them myself. And one stormy / therm rad / rad / bubbly / PB hanging around Atlas Park or the Hollows... really isn't much. I also see a focus on supergroups only recruiting if they already have a full base, yet I see so few players banding together to create a new SG... but then again, that might be because I rarely hang around Atlas now on any server.
Is it actually a problem? Possibly... SG's and Global Channels need new players. Is there an easy solution other than players just opening up and talking to those who are new to the game?
Probably not... -
Quote: The Pilgrim's mission sending you back to the tutorial zone exemplars you to 1 (leaving you with your level 6 powers). You can also exemplar to an actual level 1 player (again, leaving you with your level 6 powers).

Okay, missed that one. Probably because it's a one-shot, and once you've got the Isolator badge there's not much point in doing it again. That, and the enemies don't give any reward outside of the Isolator badge, so there's no inf or prestige bonus to be had.
Quote: It used to be 20. Did it get changed recently?

Remember, I16 introduced +5 powers when exemplared. Thus if you are on a level-15-cap task force, you keep powers out to level 20, which is why Illusionists can have Phantom Army on Positron, and most players can have Stamina on Posi. With a level-20 TF cap, powers extend out to 25, thus making for the situation I outlined on the villain side.
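A minimal sketch of that rule as described here (a hypothetical helper, not actual game code):

```cpp
// I16 exemplar rule as described above: while exemplared, you keep any
// power earned up to five levels above the content's level cap.
int highestUsablePowerLevel(int contentLevelCap) {
    return contentLevelCap + 5;
}
// highestUsablePowerLevel(15) == 20 -> Stamina (level 20) works on Positron
// highestUsablePowerLevel(20) == 25 -> powers through 25 on a 20-cap TF
```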
Quote: That is good to know, I will remember to one-star and never team with you with your high ignorance.

I'd be a little more concerned if I hadn't chucked you on global ignore for being a junk player years ago. Thanks for letting me know you're on the forums, so I can ignore you here as well.
Yet it can take a lifetime for people to learn an ego check.
Quote: So my plant/ff with softcapped lethal/smashing and ranged/aoe wouldn't make the cut? Oh well, skipping the ally shields probably would have meant I didn't make the cut anyway...

Yep. Here's why:
You have no defense debuff resistance. All it takes is one minus-defense debuff to land, and then you've gone into cascading defense failure.
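To see why one landed debuff snowballs (a toy expected-value model with made-up numbers, not the game's actual to-hit formula or debuff values):

```cpp
#include <algorithm>
#include <cstdio>

// Toy model of cascading defense failure: every -def debuff that lands makes
// the next attack more likely to hit, which lands more debuffs, and so on.
// With zero debuff resistance, "soft-capped" defense erodes within a few attacks.
int main() {
    double defense   = 0.45;  // soft-capped defense
    double baseToHit = 0.50;  // attacker's base chance to hit
    double debuff    = 0.10;  // -def applied by each hit that lands

    for (int attack = 1; attack <= 8; ++attack) {
        double hitChance = std::clamp(baseToHit - defense, 0.05, 0.95);
        std::printf("attack %d: defense %4.1f%%, hit chance %4.1f%%\n",
                    attack, defense * 100, hitChance * 100);
        defense -= hitChance * debuff;  // expected defense lost this attack
    }
}
```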
I'd also have concerns about the odd slotting choices and power sets you'd have to take in order to fit enough defense in, and I'd question whether you were actually capable of decent levels of control or of buffing your teammates. -
Quote: Do the devs read Game Informer? If so, they should check out issue 201 and read up on the WoW article... esp the PvP bit... just someone's opinion I personally like. Happy holidays and everything to everyone.

You want... our devs... to read up... on World of Warcraft.
I'm sorry, but what have you been drinking and is it legal anywhere in the US?
I want our devs staying as far away from World of Warcraft as is programmatically possible.
I'd also rather they just convert all of the PvP zones to co-op, call PvP the failure that it is, and cease wasting any more time, money, or resources on an aspect of the game that is never going to turn a profit. -
-
Quote: You can't set a new mission if someone's still in the old one. I had to boot someone last night 'cause they were AFK.

Depends on what you are doing.
If you are in a Task Force and the mission you are in ends with instructions to return to the contact, then you can call / visit the contact to kick the AFK member from the team.
If you are in a Task Force and the mission you are in ends with another mission immediately starting, then you cannot force an AFK player to leave.
If you are running standard missions, pretty much anything but TFs, then I believe you are unable to kick by calling the contact. I have not actually tested this scenario. -
Quote: Oh. They'll probably work. Your performance will just be less than the hardware-equivalent GeForce card.

*points at the API limit*
If we can get a developer to comment on which OpenGL API Going Rogue is built against, that should settle whether they will work. If it's OpenGL 3.0, the cards won't work.
***
Quote: Saist: Thanks for the very quick reply. Can't beat personal experience! Glad to hear the more-or-less good news.

Oh, you should have been around when I was running some HD 4650s in Crossfire on that board. Games actually slowed down trying to work through the bridgeless cards. I had some less-than-pleasant words for the vendor that had said the cards supported CrossFireX. -
Quote: Thanks for the info...
One question though... Why do they seem to have no problem handling any sort of modern game right now (Crysis, UT III, etc., etc.) if they're that bad for gaming? I thought that if they could handle something like Crysis, which is notorious for system bog-down, then I could handle a "higher graphics mode" for CoH...
Guess I was wrong.
"Alien"

I expected this question. I've been searching, but nobody I normally read (HardOCP, Xbit, Anand, HotHardware, Beyond3D, FiringSquad) has recently done anything pitting workstation cards and their drivers against consumer cards and their drivers.
The most recent entries I can find on Google point to Tom's Hardware Forums: http://www.tomshardware.com/forum/26...er-video-cards
or TechPowerUp: http://forums.techpowerup.com/showthread.php?t=67824
There is also the fact that you have to look at the problem in terms of rendering modes.
The 7900 GTX was a monster at DirectX 9 rendering, and even if you knock 30% off its performance, it's still bloody fast. Because OpenGL 2.1 / DirectX 9 is the card's rendering limit, that's what games like Crysis are going to default to. You are not going to be running the same code path with the same quality settings as somebody with a Geforce 8800 or a RadeonHD 2900.
The reason they seem good for modern gaming is that... they aren't running modern games. They are using the old(er) API rendering paths.
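Hypothetically, an engine's start-up path selection looks something like this (names invented for illustration; not any real engine's code):

```cpp
#include <cstdio>

// Invented capability flags: the engine probes the hardware and picks the
// newest rendering path it supports, so a DX9-class card like the 7900 GTX
// silently gets the DX9 / OpenGL 2.1 path with the cut-down feature set.
struct GpuCaps {
    bool dx10;  // Geforce 8800 / RadeonHD 2900 class and newer
    bool dx9;   // Geforce 7900 / Radeon X1900 class
};

const char* chooseRenderPath(const GpuCaps& caps) {
    if (caps.dx10) return "dx10_path";          // full effects available
    if (caps.dx9)  return "dx9_path";           // older, cheaper path
    return "fixed_function_fallback";
}

int main() {
    GpuCaps geforce7900{/*dx10=*/false, /*dx9=*/true};
    std::printf("7900 GTX runs: %s\n", chooseRenderPath(geforce7900));
}
```
-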
Quote: Quadro FX 3500

I... wouldn't be using these, for a couple of reasons.
I'll point you over to xbitlabs for a couple reasons why:
http://www.xbitlabs.com/articles/vid...x-firepro.html
http://www.xbitlabs.com/articles/vid...vs-firegl.html
http://www.xbitlabs.com/articles/vid...uadrofx_5.html
The basic problem is that Workstation card drivers are not optimized for first-pass fast rendering. Basically you'll be hamstringing your performance compared to a stock 7900 GTX in most common games.
The other aspect is that the drivers certified for the Quadro cards... aren't the same drivers that consumer Geforce users would be using: http://www.nvidia.com/object/Quadro_...1.78_whql.html
Now, as to whether or not they'll work in Ultra Mode at all? I doubt it.
The presumption right now is that Going Rogue leverages OpenGL 3.0... and the Geforce 7900 is not an OpenGL 3.0 part, which means your Quadro variants are not OpenGL 3.0 parts either.
If Going Rogue is leveraging OpenGL 2.0, 2.0 ES, or 2.1, then your cards will support the rendering API.
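If you want to check what your drivers actually report, here's a minimal sketch (it assumes an OpenGL context has already been created; not anything from the game's code):

```cpp
#include <GL/gl.h>
#include <cstdio>

// Parse the major version out of the GL_VERSION string. A Geforce 7900-class
// part (and its Quadro variants) reports 2.1 at best, so an engine requiring
// OpenGL 3.0 would refuse to enable its new rendering path on these cards.
bool supportsOpenGL3() {
    const char* version =
        reinterpret_cast<const char*>(glGetString(GL_VERSION));
    if (!version) return false;  // no current context
    int major = 0, minor = 0;
    std::sscanf(version, "%d.%d", &major, &minor);
    return major >= 3;
}
```
-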
Chad: for the most part, you'll only crunch the bandwidth on a PCIE 16x 1.0 slot if you are trying to use Crossfire or SLI without a bridge connector, e.g. something like RadeonHD 4650s in Crossfire, or a Hybrid Crossfire setup. In these situations the bandwidth constraints will really hurt multi-GPU processing.
If you are using a multi-GPU setup with an external bridge, or a single card, there's more than enough bandwidth. I've also got one of the older D975XBX boards, and I can tell you from personal testing that most modern cards don't give a flip about whether they are in a PCIE 16x 1.0 or PCIE 16x 2.0 slot. -
Quote: ...but wasn't sure if anyone had really seen/read it.

Oh, lots of people saw it, and it was such an atrocious idea it didn't merit any further discussion.
**
Okay, as much as I'd like to leave this at the one-liner, let's go over the basic facts again. PvP in City of Heroes is dead. Less than 1% of the player population participates. If something like this were implemented, you'd piss off 99% of the paying base and render a zone unusable. I can also point you to posts on the forums from players upset with the way the Hero and Villain events shut down zones. A PvP zone take-over would be even more annoying to a greater number of players.
Really, this idea needs to be taken to the landfill and left to rot. -
Pointless, completely missing the point of the first couple of levels, and showing a general lack of understanding of how the leveling system works. That's what I think.
The fact is, the developers' current power-selection rules force players to take at least one attack, one defensive power if they are a defense type, one control power if they are a control type, and so on.
This means that even the worst players have the basic powers for their archetype. Allowing players to select power pools from level 0, even on a respec, would let some players completely bypass a primary or secondary power set choice. As it is now, if I ever see anybody with more pool powers than primary or secondary powers, I know automatically to one-star them and never work with that player.
***
Quote: I think by the time an earned respec can be had, they should know enough to avoid that pitfall.

You've... never played on Freedom, have you?
Trust me, I've had to explain to players with 48-month vet badges what basic powers like O2 Boost and Increase Density do. Every server has players who, even after multiple years in the game, do not understand gameplay basics.
***
Quote: I think there are certain builds where the forced selection of beginning powers is detrimental, or results in powers that will essentially be lightly used or never used.

There's a big problem with this statement too. Players who have already earned a respec are level 24 or above. The lowest effective level they can possibly go on an exemplar now is 14, since one of the badged Ouroboros missions, "Break up the Clockwork and the Skulls," caps the player at level 9 (and the +5 rule puts powers at 14). For the most part, players who exemplar down hero-side are capped at levels 19 and 20: level 19 for Ouroboros missions, and level 20 for the Positron Task Force.
For players on the villain side, again, they can drop down a bit further, since both Weird Science and Snake Uprising cap at level 8 in Ouroboros, placing the player at an effective level of 13. The lowest they can go on a strike force is 24, which is the Cap au Diable Strike Force. Theoretically, a player could do the first Tree respec at 24, not level, then go on a Cap SF and still be 24 under the exemplar rules.
Now, since you currently have to choose 3 powers on a respec before the power pools open up, I'm left wondering just how bad a player's build is going to be if they want to select power pool sets earlier than that. Power pool attacks are worthless, as they don't do particularly good damage per endurance, and there's already a vet reward that bypasses the travel restrictions and allows taking a travel power at an earlier level.
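As a sketch of that constraint (invented names; not how the game actually implements it):

```cpp
#include <cstdio>

// Toy model of the respec rule described above: pool powers stay locked
// until three primary/secondary picks have been made, so no build can
// bypass its archetype's core powers entirely.
struct BuildState {
    int corePicks = 0;  // powers chosen so far from primary or secondary sets
};

bool canPickPoolPower(const BuildState& b) {
    return b.corePicks >= 3;  // pools open only after the third core pick
}

int main() {
    BuildState b;
    for (int pick = 1; pick <= 4; ++pick) {
        std::printf("pick %d: pools %s\n", pick,
                    canPickPoolPower(b) ? "available" : "locked");
        ++b.corePicks;  // assume a core power is taken each time
    }
}
```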
***
So I'm left struggling to see how opening up power pools at an earlier level is a good idea. It breaks a safety feature of the game, the guarantee that a player has the minimum set of powers for their archetype, and unless you're trying to set world records for fastest Ouroboros mission completion times, it's completely pointless for players looking at task forces or general teaming.
So, bad idea. Throw it in the dumpster where it belongs. -
We both know and don't know.
Since the HeroCon Going Rogue demonstration was run on Crossfired RadeonHD 4870 X2s, we can presume that yes, multi-GPU setups will accelerate the game.
However, nobody on the developers' side has actually come out and said that multi-GPU setups will give a performance boost. -
Quote: QR,
Creative cards suck. This crackling/popping issue has been around for years. YEARS! I had gotten an X-Fi Extreme Gamer, hit the crackling and popping issue, and found out that it's a pretty major issue.

I can attest to that. Here's what I wrote about sound cards, covering my issues with SoundBlaster, over on HardOCP: http://www.hardforum.com/showpost.ph...6&postcount=16
Quote: Okay. First of all, I'm the kind of person who spends way too much money on speakers and home audio equipment. I've got speakers from Pioneer, Aiwa, Altec-Lansing, Sicuro, and Creative; receivers from Kenwood and Sony; and even oddballs like the Zalman surround-sound headphones.
Audio quality is very important to me, and I'm not just talking about sound effects and channel separation. I'm talking about the actual generation of the audio. Yeah, I am one of those "freaks" who can tell you when you've played back a song with 128k MP3 encoding, or 128k Vorbis encoding.
While Intel's HDA audio was a large step up in terms of audio quality... the reality is, even ancient chips like the AS9200 (hint: that one came from AOpen) or the CT5880-DCQ (hint: that one is a Creative Labs part) will provide better analog audio quality than many onboard HDA solutions. I've got a Clevo D900-T notebook, and when SteelSound sent me one of their headsets for testing, which included a USB sound chip, one of the first things I immediately noticed was that the Clevo's Intel HDA audio wasn't even trying to produce some of the lower-range sounds. That was pretty much the point at which I stopped using cheap Logitech headsets, as I could literally hear what I was missing.
Where the situation gets a bit... fuzzy... is in digital audio, where you pass the sound processing off to a receiver to handle channel separation. However, even in these cases... and this is coming from somebody who bought Abit NF7-S 2.0s not because they were stellar Socket A overclockers, but because they had onboard optical out; then followed that up with a DFI LANParty UT nF3 250Gb because it had onboard optical out; then followed that up with a Hercules GameTheater XP because it had a break-out box with all the audio I/O you could ever need; then followed that up with buying VIA Envys with optical out because their breakout boxes were SMALLER... Actually, that's not quite true. I sold off an STA Media 7.1 a while back, but its breakout box was as large as the Hercules box.
Anyways, there are a couple of Creative cards thrown into that purchasing history. I count a couple of Audigys and SoundBlasters sitting in a box; two of them I bought with the front-panel boxes so I could get optical out... and those were junk. I had a long back-and-forth with Creative over problems with the front-panel boxes, in which the audio CONSTANTLY skipped... and Creative kept claiming it was my receiver, when NO OTHER OPTICAL / COAXIAL OUTPUT I HAD ON HAND SKIPPED...
Anyways, getting back to the point: just because you have digital signal generation on the motherboard, or on the sound card... doesn't mean you automatically get good-quality audio playback. Again, I'm not just limiting myself to audio effects like EAX. I mean actual signal-to-noise ratio and audio processing.
While you might never notice the audio difference on $20 Logitechs, or the speakers that came with your computer, most people can start hearing the difference with even low-end headsets like Logitech's $50 Digital Precision headset. Even with a decent mid-range speaker set (I used to like to point to Sicuro's, though they no longer seem to be in business, at least in the US), most people will be able to hear a difference in their audio with even the most basic of add-in cards.
Now, a lot of audio is subjective. I can hear the differences. You might not be able to. And that's fine. You might be perfectly happy with the audio quality Intel's HDA system delivers. I'm not. -
Quote: whipcream and jello...mmm yummy
*tosses catnade at sai* this will keep u buzzing puddy

*ish hit by the catnip grenade and eyes go wide, followed by scampering over to Cien, wrapping paws around the puppeh and hugging*
awww, i just wuv you, your mah bestest puppeeeh friiiind in ze WHIRLD!!!