Quote: UPDATE 2-10-2010: Spoke with our engineer today, and he said we were seeing excellent performance from the Radeon 5770 card, which is a mid-range card. We have not tested the 5870, but expect the performance on that card to be one of the fastest for Ultra Mode. Also noteworthy is that Ultra Mode does not currently support SLI configured cards. We're working with NVidia and ATI about this, but you all can speed up the process by contacting NVidia and ATI about updating things on their end to get it to work.

Well... filed a bug on http://ati.cchtml.com/, sent AMD an email, and left a post on their forums. Short of bugging Alex D to hurry up on multi-GPU support under Gallium3D, I think that's all I can do at this point. -
-
Quote: The Prestige Power Slide came in the CoH Collector's Edition, and as a paid-for extra it will never be given away free.

I dunno. They did do the Pre-Order costume stuff for CoV, and the special sprint powers. CoV was also "given" to all CoH players that had never purchased the Expandalone, and CoH was "given" to all CoV players that had never purchased CoH. So the precedent is set for paid stuff from the past to be given away for free.

That being said, I am against issuing the Prestige Power Slide for free. I'm actually against re-issuing it at all. Much like a badge, it's a mark of what you've done "for the game." Sometimes things should remain exclusive.

Maybe an option, though, would be to make a new Power Slide animation, with different colors or a different animation styling. Something that satisfies the urges of those who want to glide around, but still preserves the original Prestige Power Slide as a special treat for those who bought the Collector's Edition. -
Quote: While true, a whole portion of the game code is going to waste. Do you know anyone with an Ageia PhysX card? Run a survey to see who has what video card. A lot of people will have an Nvidia card capable of PhysX.

And a lot more people will have ATi cards, Intel GPUs, or Nvidia cards in systems with Intel integrated graphics, or ATi cards in systems with Intel integrated graphics, or Nvidia cards in systems with ATi integrated graphics, or ATi cards in systems with ATi integrated graphics, or Nvidia cards in systems with integrated Nvidia graphics, or ATi cards in systems with integrated Nvidia graphics.

No matter how you cut it, continued PhysX support is just a bad idea for the CoH developers, or for any game developer. Just drop it and move to OpenCL. Everybody's happy regardless of what hardware platform they have now, or what hardware platform they use in the future. -
Quote: With UM on the horizon, can we not have our Nvidia cards used to their fullest capacity? The game currently uses the old Ageia PhysX. Can this get updated to the "now"?

No.

The thing with Nvidia PhysX is that it locks you to Nvidia cards. The developers would be in a much better support position moving to OpenCL for physics calculations. OpenCL will give you the same performance advantages as PhysX, but it will work on Intel and AMD systems as well. This will be a critical factor as Intel and AMD roll out combined CPU+GPU systems over the next six months: gamers will be able to use the "weaker" integrated graphics chip for physics or other calculations while a better add-in card handles the graphics display.

The added advantage is that the developers only have to write the code once, and it should automatically work across all operating system platforms.
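To make "write once, run everywhere" concrete, here's a minimal sketch (in C, purely illustrative, and not anything from the CoH codebase) of how an OpenCL program discovers compute devices. It assumes an OpenCL SDK providing CL/cl.h is installed. The same enumeration logic turns up an Nvidia GPU, an ATi GPU, an integrated chip, or a plain CPU:

/* List every OpenCL platform and device on the system.
   Build with something like: gcc list_cl.c -lOpenCL */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint platform_count = 0;
    clGetPlatformIDs(8, platforms, &platform_count);

    for (cl_uint p = 0; p < platform_count; ++p) {
        char platform_name[256];
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                          sizeof(platform_name), platform_name, NULL);

        /* CL_DEVICE_TYPE_ALL matches CPUs, discrete GPUs, and
           integrated GPUs alike -- the heart of the portability claim. */
        cl_device_id devices[8];
        cl_uint device_count = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL,
                       8, devices, &device_count);

        for (cl_uint d = 0; d < device_count; ++d) {
            char device_name[256];
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof(device_name), device_name, NULL);
            printf("%s: %s\n", platform_name, device_name);
        }
    }
    return 0;
}

A physics kernel built with clBuildProgram then runs on whichever of those devices you hand it, with no vendor lock-in. -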
Quote: CoX uses OpenGL, and NVidia consistently writes better OpenGL drivers than ATI. ATI supports DirectX pretty well, but they've historically been sloppy with their implementation of OpenGL. Hardware performance is important without a doubt, but don't overlook the quality of the software controlling it all. As an example, I was playing around with OpenGL framebuffer objects, doing some post-processing effects (glow, depth of field). No issues on NVidia in XP or Linux (the proprietary binary drivers); completely broken with the ATI card in my Lenovo. It seems ATI only supports a selection of the possible pixel modes (# bits/color/alpha) and integer/float modes for framebuffers, and has some odd caveats about copying buffers and the final display buffer. Even with my novice mucking around with OpenGL, it's pretty obvious ATI cards eat up more developer time, and probably end up adding duplicated effort to build the shaders and post-processing filters.

This used to be true. However, ATi rewrote their OpenGL engine back in 2006 through 2007: http://www.phoronix.com/scan.php?pag...item=914&num=1
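As an aside, for anyone who hasn't poked at this: the framebuffer experiment described in the quote boils down to something like the sketch below, written against the old EXT_framebuffer_object entry points (the size and the GL_RGBA8 format are illustrative). The completeness check at the end is exactly where a driver that only supports certain pixel modes falls over.

#include <stdio.h>
#include <GL/glew.h>  /* any loader exposing EXT_framebuffer_object works */

/* Create a render-to-texture target. Assumes a current GL context
   and an initialized extension loader. */
static GLuint make_fbo(GLsizei w, GLsizei h)
{
    GLuint color_tex, fbo;

    glGenTextures(1, &color_tex);
    glBindTexture(GL_TEXTURE_2D, color_tex);
    /* No mipmaps here, so pick a non-mipmapped filter or the
       attachment is incomplete before the driver even weighs in. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, color_tex, 0);

    /* A driver that dislikes this format combination reports
       GL_FRAMEBUFFER_UNSUPPORTED_EXT here instead of failing loudly. */
    GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
    if (status != GL_FRAMEBUFFER_COMPLETE_EXT) {
        fprintf(stderr, "FBO incomplete: 0x%x\n", status);
        return 0;
    }
    return fbo;
}

Swap GL_RGBA8 for a float format like GL_RGBA16F_ARB and you find out very quickly which vendors support which combinations.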
AMD also shipped OpenGL 3.0 support in the Catalyst 8.9 driver set, which was released on 9/17/2008.
Nvidia didn't get around to it until December 16th: http://news.developer.nvidia.com/200...0-drivers.html
Nvidia's 64bit OpenGL drivers have also been broken for years: http://www.mepisguides.com/Mepis-7/h...ati-64bit.html
As users of Windows 7 64bit are finding out today, things haven't really improved on the Nvidia side.
So do us a favor, get out of 2006 and please rejoin 2010.
***
Okay, some background data here. The quality of ATi and Nvidia support for OpenGL can pretty much be tied to the history of how OpenGL itself was handled.
Just a few years ago the OpenGL specification was directionless. One of the developer advantages of OpenGL is that the specification can accept various proprietary vendor extensions, but nobody on the OpenGL Architecture Review Board was really setting a path for which extensions should be standardized and which should be discarded. Nvidia made great use of OpenGL's proprietary extensions, with technologies like UltraShadow in Doom 3, to make their hardware produce better images than competitors' while achieving better frame-rates. That's all well and good, but as CoH users have found out the hard way, when you let Nvidia write your shaders and background graphics library, you wind up with a graphics system that resembles Sluggy's Torg after going through Riff's paper-cut bomb (too lazy to go find that comic).
Starting in 2004, 3Dlabs began pressing for OpenGL 2.0 in response to the stagnation and aimless direction of OpenGL's development. Through 2006 and 2007 the Khronos Group took over from the Architecture Review Board and pushed out OpenGL ES 2.0, a stripped-down version of OpenGL that, much like 3dfx's old Glide API, provided little more than the features necessary for fast-path texturing and shading... a.k.a. the gaming-related functions of OpenGL.
ATi, and other vendors like S3, Trident, and Matrox, weren't known for creating proprietary OpenGL extensions. Rather, they were known for relying on specifications. ATi's reputation for being closely tied to DirectX came from the launches of their Radeon cards, which were originally named after DirectX levels, and the launch of the 9700, in which ATi had worked closely with Microsoft to determine the specifications of DirectX 9, as well as producing a reference card to accelerate DirectX 9.
As OpenGL has become more standardized and regained a direction, along with an overseeing body that isn't likely to sit on the API for years, it has become easier for vendors that build cards and drivers against published specifications to get involved with OpenGL. With these background changes, AMD has interacted more closely with the Khronos Group, and its OpenGL support has dramatically improved.
AMD, however, isn't in the luxury position of doing what they want, when they want. They have to follow the money, and right now the big money in PC gaming is on DirectX. Personally, I think DirectX is a multi-billion dollar mistake that game developers can't afford to keep making... and the point is made clear across consoles like the PS3 and Xbox 360, as well as mobile platforms like Android and iPhone. Porting Xbox 360 games built on DirectX to the PS3's OpenGL ES 2.0 often results in poorer games. However, building a game against OpenGL ES 2.0 on the PS3 first, then running essentially the same code on the Xbox 360, generally results in good ports. In the same way, developers focused on multiple markets are finding that DirectX is a pitiful way to reach the largest number of players. OpenGL... well. An app written against OpenGL will run on Android, iPhone, Windows, Linux, Apple OS X, PS3, Wii, Xbox 360... you name it.
So, as portable devices become more popular, we'll hopefully see the shift from DirectX to OpenGL that needs to happen, and we'll see AMD taking a larger and more active role in pushing OpenGL support. -
I'm half surprised, as often as I go off about the game's design, that I haven't been extended an offer. On the other hand... would they want somebody who carries a flamethrower?
-
Quote: If he is upgrading he should get an Nvidia 260 because it is far better than what he was talking about.

My biggest problem with this is that a GTX 260 starts around $200. Yet the ATi card with the same feature set and comparable power, the RadeonHD 4870, is $30 cheaper.
At the same price point as the GTX 260 is the RadeonHD 4890, which competes against the GTX 275. For reference, GTX 275s are still topping the $250 mark.
Then there is the huge problem that is the RadeonHD 5x00 series. The RadeonHD 5770, which can pretty much keep pace with the RadeonHD 4870, can be had for around $160.
So... if he's looking to spend $200, he can get a killer deal on a DirectX 10 card from AMD, one that will outrun the Nvidia offerings at the same price points...
or he could get a DirectX 11 card.
Why on earth would anybody want to buy an Nvidia card right now? It's a plain rip-off. You will be getting a raw deal, end of story. -
-
Don't forget that precedent was also set, with EverQuest 2, for an MMO to intentionally ship models, textures, and polygon settings that exceed current hardware capabilities.
So there is a chance that some of the eye candy that could be made to work within the game may not run at appreciable frame-rates even on the latest cards.
However, we can sort of discount the idea that Ultra Mode will be too performance-intensive. One factor to keep in mind is that Paragon Studios is maintaining the current graphics engine, which means all of the underlying geometry still has to be built within the current engine's limitations. The eye-candy effects can only go so far. If you watch the video of the HeroCon opening keynote, you'll see that the Ultra Mode effects only touch surface appearance: reflections, shadows, water, and textures. So there is a real question of just how far Ultra Mode can actually push the game's graphics.
We can also reliably guess, from the cards Positron listed, that the Ultra Mode graphics engine is leveraging OpenGL 3.0 as the rendering API. We can then infer from Posi's post that Ultra Mode is being performance-limited by the graphics card, not feature-limited or performance-limited by the processor. We also know from Ghost Falcon at HeroCon that the starting point for their coders was the RadeonHD 4850.
So, we know what the target platform is / was for a smooth framerate.
We don't know what the resolution scaling is though.
My guess, given how the RadeonHD 4850 performs in other games, is that the resolution scaling is based around 1080p, or 1920*1080.
If the scaling target was instead 720p, which is 1280*720 (or the similar and more common LCD panel resolution of 1440*900), you'll want something more powerful than the 4850 to drive anything higher.
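To put rough numbers on the difference between those targets: 1280*720 is about 0.92 million pixels per frame, 1440*900 about 1.3 million, and 1920*1080 about 2.07 million. Moving from 720p to 1080p is roughly 2.25 times the pixel work per frame, which is why the assumed scaling target matters so much when judging whether a 4850-class card is "enough." -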
Quote: 6200? You really should think about upgrading. I don't think you can play this game on "low" settings with that card.

CoH will work on Intel integrated chips, which even the 6200 will easily outrun. The current graphics engine will still scale down to cards like the Radeon 8500 series. The developers were pretty clear at HeroCon that the current graphics engine will continue to be maintained after Ultra Mode arrives, so as not to alienate players without modern graphics hardware or prevent them from running the game.
His problem sounds like a driver issue, but I already have a really low opinion of Nvidia's drivers. Honestly, they make XGI's drivers look good, and that should be a hard feat to accomplish.
Now, if I understand the original post, you simply transferred Windows XP over without doing a complete re-install / reformat of the hard drive? Microsoft operating systems don't handle hardware changes very well, between product activation and just plain bad coding.
If you are not in a position to fully reinstall Windows from the ground up, I would go get Driver Sweeper : http://www.guru3d.com/category/driversweeper/
Use it to clear out and remove all installed Nvidia software.
Then try to install the 175 drivers: http://www.nvidia.com/object/winxp_175.19_whql.html
Don't forget to install the appropriate drivers for your motherboard as well. Since you are using a Geforce 6x00 board, this will be an Nforce4 motherboard, and you'll need the 15.26 drivers: http://www.nvidia.com/object/nforce_winxp_15.26.html -
Remember, the devs have access to Ouroboros.
-
What driver set are you using?
If you are using the latest drivers from Nvidia, try dropping back to an older set. -
Quote: Would it be possible to list the types (like the 9800 series, 200, etc.) in order of which is better? I don't need exact specs, just where they would rank compared to each other, with my 7600GT at the bottom. Is that a reasonable thing to ask? Obviously, omit the ones that are $200+. Say, a limit of maybe $150ish ("ish" giving some wiggle room there).

I generally use TechArp as a reference:
Theoretical ATi performance: http://www.techarp.com/showarticle.aspx?artno=88&pgno=3
Theoretical Nvidia performance: http://www.techarp.com/showarticle.aspx?artno=88&pgno=7
***
As far as how they rank?
Well, starting with the GeforceFX series Nvidia's naming scheme has worked like this:
x2xx - x4xx: Low End cards
- 5200
- 6200
- 7300
- 8400
x5xx - x7xx: Medium Range cards
- 5500
- 6600
- 7600
- 8600
- 9600
x8xx - x9xx: High End cards
- 5800
- 6800
- 7800
- 8800
- 9800
So, with this model number arrangement, a card that has a higher first number, or the same first number with a higher second number, is more powerful.
The actual line-ups are more complicated as Nvidia uses a variety of suffixes like GT, GSO, GTX, and GTS to identify different types of cards. So a 9800 GT is less powerful than a 9800 GTX.
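If it helps, that old scheme is mechanical enough to put in code. A toy decoder (hypothetical, and deliberately ignoring the GT/GSO/GTX/GTS suffix complication just mentioned) looks like this:

/* Rank an old-scheme Geforce model number, e.g. 5200..9800.
   First digit = generation, second digit = tier within it. */
int geforce_rank(int model)
{
    int generation = model / 1000;   /* 5, 6, 7, 8, 9 */
    int tier = (model / 100) % 10;   /* 2 = low end ... 8-9 = high end */
    return generation * 10 + tier;   /* higher = more powerful, suffixes aside */
}

So geforce_rank(7600) = 76 beats geforce_rank(6800) = 68: under this naming logic, the newer mid-range part outranks the older high-end part, though real-world benchmarks are the final word.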
Recently, Nvidia changed their marketing strategy, since they were running into the same model numbers ATi had used several years ago.
Nvidia's current line up is:
GT - low end cards
- GT 220
- GT 230
- GT 240
GTS x40 - low-med range
- GTS 240
- GTS 340
GTS x50 - medium range
- GTS 250
- GTS 350
GTX - high end
- GTX 260
- GTX 275
- GTX 280
Nvidia's going to change their marketing scheme again with Fermi, as they'll start branding the Fermi cards with GF 100.
***
The ATi side is a little bit easier to follow.
Back when the Radeon launched, ATi decided to name their cards after the versions of DirectX they supported.
So, the Radeon 7xxx series supported DirectX 7.
- Radeon
- Radeon 7000
- Radeon 7500
The Radeon 8xxx series supported DirectX 8.
- Radeon 8500
- Radeon 9000*
- Radeon 9100*
- Radeon 9200*
The Radeon 9xxx series supported DirectX 9.
- Radeon 9500
- Radeon 9600
- Radeon 9700
- Radeon 9800
Since DirectX 9 hung around for a while, ATi had a problem: their low-end DX9 cards were simply too powerful to be low-end. So ATi started rebadging the 8500 series as lower-model 9x00 parts; those are the asterisked cards above.
The next set went:
- x600
- x700
- x800
Then the x1x00 series:
- x1300
- x1600
- x1800
- x1900
Again, very easy to figure out "where" your card sat. If you had an x850, it was more powerful than an x800 or an x700.
Starting with DirectX 10, ATi's strategy changed again, with the RadeonHD lineup.
RadeonHD 2x00
- RadeonHD 2300
- RadeonHD 2600
- RadeonHD 2900
RadeonHD 3x00
- RadeonHD 3300
- RadeonHD 3400
- RadeonHD 3600
- RadeonHD 3800
RadeonHD 4x00
- RadeonHD 4200
- RadeonHD 4600
- RadeonHD 4700
- RadeonHD 4800
RadeonHD 5x00
- RadeonHD 5400
- RadeonHD 5600
- RadeonHD 5700
- RadeonHD 5800
It's pretty clear where each card sits from the model numbers: a card from the 4700 series is going to be more powerful than a card from the 3600 series.
Things can get fuzzy at the top end, where a RadeonHD 48xx card can keep up with something like a RadeonHD 57xx card, but the pattern is generally consistent across the Radeon range.
As far as direct analogs go, the series match up sorta like this:
Radeon 8x00 series = GeforceFX series
Radeon 9x00 series = unmatched
Radeon x600-x800 series = outmatched
Radeon x1x00 series = Geforce 6x00 Series / Geforce 7x00 series
RadeonHD 2x00 series = outmatched
RadeonHD 3x00 series = Geforce 8x00 series
RadeonHD 4x00 series = Geforce 9x00 / GTS 250 / GTX series
RadeonHD 5x00 series = unmatched
To explain this: the Radeon 9500-9800 series cards didn't have any Nvidia equivalents.
The Geforce 6x00 series then took the upper hand, and ATi wasn't able to match the Nvidia offering until the x1x00 series of cards, which competed against the 7x00 series as well.
With the 8800 Series, Nvidia again outmatched ATi.
ATi eventually gave a performance answer to the 8x00 with the RadeonHD 3x00 series.
The RadeonHD 4x00 series took on both the 9800 series and the GTX series.
With the RadeonHD 5x00 series, ATi is currently unmatched by anything on the Nvidia side. Nvidia is expected to match the high end RadeonHD 5x00 series with Fermi.
***
I hope this answered your question. -
Quote: See... this is why I don't like shopping for components. A 4850 is better than a 9800. *head hurts*

It gets worse when you realize that "9800" could also stand for the Radeon 9800, which was released back in 2003.
Basically, Nvidia, AMD/ATi, and Intel are never going to agree on naming their products in a way that lets you compare them by product name alone. So while it may hurt your head, it's just marketing as usual. -
Quote: I'm debating finally getting a rig primarily for gaming/media, and I'm dithering over the video card. As I understand it, if I plan to go the multi-vid-card route I have to get a mobo that supports it from the get-go. Question is, is it really worth it? Two cards means a beefier power supply (not much of an issue, just an annoyance), better thermal regulation (definitely an issue), and possibly other things that I don't even know about (biggest issue). So, is the gain in graphics shininess worth juggling the above factors? Or should I just get a decent single card that's near cutting edge?

It really depends on who you ask, and what you play.
Some games don't benefit from multi-gpu setups. Some games do. Some games only show benefits with Nvidia SLI. Some games shine best with Crossfire.
The advantage of multi-GPU setups is that you can often get equivalent rendering power for less cash. For example, a RadeonHD 5870 will cost you around $400 to purchase. Two RadeonHD 5770s, which cost around $160-$170 each, will offer rendering that gets you close to, if not past, a single 5870: http://www.xbitlabs.com/articles/vid..._14.html#sect0
Two RadeonHD 5750s, which run around $130-$140 each, can keep up with the $300 5850.
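Running the totals makes the value argument plain: two 5770s come to $320-$340 against $400 for a single 5870 (a $60-$80 saving), and two 5750s come to $260-$280 against $300 for a single 5850 (a $20-$40 saving), in both cases for comparable rendering power.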
So you can spend a bit of cash now on a single card, pick up an identical card a couple of months down the line, and end up with even more rendering power.
Then there's the added advantage of processing technologies like Nvidia PhysX or OpenCL that can utilize the "extra" processors. The scope of OpenCL is wider than that of PhysX, and game developers are already planning how to write code that uses additional GPUs to deliver better physics, AI, or even better positional audio. With both Intel and AMD delivering CPU/GPU combinations over the next 6 to 8 months, developers will have a growing platform base on which OpenCL acceleration can be useful.
Personally, I don't build a computer now without making plans for a multi-gpu setup... but that's just me. -
I wouldn't mind an option to set the various music tracks already within the game to a map or to an event.
Since we know that the AE file-size will be doubling shortly, it could probably be accomplished under the new text-file description limits.
***
On the original topic, playing our own music in the game has been shot down several times, almost on a weekly basis. Maybe one of the mods needs to put up a sticky saying that player-uploaded music simply isn't going to happen as long as entities like the RIAA are around.
That being said, maybe, just maybe, NCSoft and Paragon Studios could cut a deal with some of the guys from OCRemix.org to produce AE specific music.
***
Oh, and Television can take a long walk off a short pier and short circuit in the ocean. Aeon Forever! -
In that price range, on the Nvidia side, you have the 9800 GT series.
On the ATi side you have the RadeonHD 4850 series.
If I was buying though, I'd pay out a little more and go for the RadeonHD 57xx series of cards.
The 5750's start around $130 and the 5770's start around the $160 mark.
Thing is, the 57xx series is really the base starting point for cards that can run DirectX 11 features at an appreciable frame-rate. While the RadeonHD 56xx series can be had for under $100, in Dirt 2 at least they can't run DX11 features like tessellation fast enough to generate playable frame-rates at "normal" wide-screen LCD resolutions like 1440*900 or 1680*1050.
If I were spending past the $100 mark, I wouldn't buy an Nvidia card, though: they don't support DirectX 11, and there won't be a mid-range or low-end DX11 card from Nvidia until August at the earliest. -
Quote: Can you get one of these in that game?

... I think that link qualifies as a whiskey tango foxtrot... then I read some of the links in the paragraph about that item... and those are actually worse / better. I can't decide. -
Quote: If you really believe that Jack Emmert has nothing to do with CO, and hasn't been involved since Bill Roper was hired, I've got a large amount of ocean-front property in Montana to sell you.

Except Jack Emmert hasn't had much to do with Champions Online since November 2008, when he handed the reins off to Bill Roper, for better or worse.
... Uh... yeah. He handed Champions Online over to Bill Roper a year and a half ago so he could concentrate on Star Trek Online. -
Check this thread: http://boards.cityofheroes.com/showthread.php?t=207468
-
I find myself entering this conversation late... most of the pertinent flaws with an 18+ server have already been addressed throughout the thread. One that I don't think has been addressed in full is the... ahem... need of the game to maintain a certain rating across various countries. The short version is that NCSoft, as a publisher, submits the developer-created content to various review boards, and uses that certification as the basis for how, and to whom, they can sell the game.
While some players like to cite Halo or Call of Duty as examples of mature games that sell to, and are played by, a younger crowd, the fact is that games like these tend to get creamed by titles like New Super Mario Brothers. The Nintendo Wii console dusted competitors that focused on the hard-core gamer, a market that didn't grow or expand the way various accountants expected.
The mature market on the Wii still continues to elude publishers like Capcom and Sega, with titles such as MadWorld flopping due to their focus on continuous vulgarity and profanity. Capcom couldn't follow up Resident Evil 4 with another adventure title, and gamers turned their noses up, and their wallets down, at the extended light-gun shooters. It's not that the market does not exist; it's that publishers can't match what gamers want to play with what gamers want to experience. Too many publishers and developers equate "mature" with violent, sexual, and immature behavior.
The concept holds across movies. G and PG movies routinely outsell R-rated movies. Yes, every once in a while an R rated film will rake in some cash, but for the most part, movie companies out to make profits have to focus on fitting within the PG-13 envelope. This doesn't mean that they can't still do epic adventures or tell great stories. It does mean that they (the content creators) have to pay attention to how the content is presented. Case in point: The rebooted Batman franchise.
The fact for NCSoft is that the Teen / 16+ rating gives them the flexibility to market their game brands to a wider audience, while still leaving room for naughtier concepts such as Dominatrix. While no current game retailer refuses to sell M-rated games, for mass-market retailers like WalMart it's much easier, and cheaper, to lock in store promotions and positioning for a game that can sell to a wider market.
Adding an 18+ server, beyond anything else, would basically require NCSoft to re-certify the developer-created content against the new rating. This in turn could hurt future marketing opportunities. Stores like Walmart, which happily promoted the Good Vs. Evil edition of the game, might not do the same thing again so happily, or so cheaply.
****
While I'm at it, one common misconception about the game is that the profanity / vulgarity filter lets players say whatever they want, and that players who don't like it can simply enable the filter. It's been made quite clear by the forum moderators, by the in-game moderators, and by the developers that this isn't true. The various chat filters do not grant carte blanche for players to type or emote whatever they want. Players who willingly and continuously trigger the filter system stand to have their accounts suspended or banned.
Another common misconception is that players themselves are responsible for every single genericing report. Some players don't seem to realize that the GMs play the game too, and GMs can use the /search function just as well as any other player. So just because you (a player) got genericed doesn't mean that another player went to the trouble of using /petition.
Something else to consider is that GMs, like players, can dump public broadcast chat logs to a file and save them for later. So again, a player doesn't have to be the one to report somebody for excessively vulgar or profane language.
Just something to keep in mind for those thinking, oh yes, I can get away with being a jerk. Chances are... you can't. -
Quote: So now we're able to change the colors of our primary and secondary powers to pretty much any color of the rainbow. So why can't we change epic power pool colors as well?

Short version: Epic, Pool, and Patron powers were not part of the initial target for color customization, and they require additional tech and UI work to expose the functionality to the player.

Long version: some of these power sets may never be customizable. In the case of Patron powers, the developers have taken the position that you, the player, are borrowing somebody else's power, and in that respect they aren't sure the storyline should allow you to change the colors of the animation. These and other soft factors contributed to the developers focusing only on primary and secondary power sets. There are still primary and secondary sets, like Stone Armor, with certain powers that aren't customizable; those will take priority in future development. -
Quote: I wasn't at that event, but please feel free to construct whatever mental image suits you. And no, I don't plan on divulging my sources, other than to say they were shadowy. It's a RUMOR. Remember though... today's rumors are next week's news.

You really have no idea just how bloody annoying that tagline is, do you? Do you have any idea how often players on these forums have used rumors to justify their behavior or expectations? Till a rumor is proven, leave it alone.

In other words:

Today's rumors ARE NEVER NEWS. -
A temporary fix is to run CoH in windowed mode, but at your desktop resolution.
Annoying, really.