Quote: Shadows are as processor intensive as they are GPU intensive.
I don't know why I was talking like I had no experience with Ultra Mode... brain fart!
Anyway. Yeah, with Ultra Mode, CoH-side I get like 15fps. CoV-side I can only get 10 at best. And that's when nothing is going on. Really, the only setting that causes any problems is shadows... which happens to be the part I enjoy the most.
Basically, I just want to be able to max it out and have nice 25+ frame rates.
I've never really been able to get good FPS out of CoH.
But if I can't get any real performance boost without paying lots of money, I guess I'll just have to put it off for now. By the way, my budget was £150-£200. But as I said, unless there's a real, noticeable difference, it won't be worth it.
I can tell you that, against a powerful enough processor, the RadeonHD 5770 will give you mostly what you want in CoH, and it comes in under your £150-£200 target range.
The problem is, I don't know what your processor is, and I don't know what your target resolution is. I know from personal experience that a Core 2 Duo at 2.23GHz won't drive 1920*1200 with a 5770, but a Phenom 9600 (the launch one with the TLB issue)... will. A RadeonHD 5770 coupled with an Athlon64 6000 AM2 can do ~25fps at 1920*1200.
If you do have a Core 2 Duo, a Pentium 4, a Socket 754 Athlon, or a Socket 939 Athlon, higher resolutions like 1920*1200 may be out of your reach for newer games no matter what GPU you use. -
That sounds like Nvidia's OpenGL driver.
look here: http://boards.cityofheroes.com/showthread.php?t=219502 -
Quote: Jaso: I've had hands on with a GTX 470. Stock, it can't outrun a 5850. Basically put, I can't duplicate or replicate the results I see other websites getting, or say the 470 gets.
Actually, a GTX 470 can outrun a 5870 and really shines in SLI (not applicable to CoH, apparently, though).
http://www.tweaktown.com/articles/32...nt/index6.html
And as you can see, tweaking it will get you to 480 levels...
but...
The ATI offerings are so much more efficient in power usage and run much cooler. As stated, it all depends on how much you want to spend... if you can afford it, I would go 5870 just for sheer bang for the buck. If you game at higher than 1920x1080, then you start to get into SLI / Crossfire territory...
The last time this happened was back with the GeforceFX... and I think most people agree on how that ultimately turned out. To get a lot of its "performance," the FX was using a metric ton of driver shortcuts and "driver cheats," for lack of a better word.
This is why I link to HardOCP when it comes to reviews because Kyle, regardless of whether or not I agree with him, Gets It. Performance of a graphics card is more than just how the card performs in a synthetic benchmark. It's how the card performs in actual games. And in actual games, the 470 does not outrun the 5850... and in actual games where SLI and Crossfire are "equally" supported, it doesn't run away there either. -
If your starting point is the GTS 250, there's not a whole lot of room to move without breaking out your wallet. First, a GTS 250 will do Ultra Mode; it will just do Ultra Mode with low settings.
Now, I'm not entirely familiar with the UK pricing strategy, so while in the US market all Nvidia cards other than the GTS 250 are bad buys, that might not be entirely true in the UK market.
In terms of image quality / detail settings with CoH's Ultra Mode, against a good processor, the RadeonHD 5770 can deliver ~25fps+ with all Ultra Mode settings maxed (sans Ambient Occlusion) at 1920*1200.
Your preferred site lists a few of these cards: http://www.overclockers.co.uk/produc...ortby=priceAsc
Now, the cheapest one there is £124.98, which at the current exchange rate is around $190 (US), so you are paying a premium. The problem is, depending on what games you ultimately play, the 5770 may or may not give you a clearly visible improvement over your GTS 250... which is pretty much the problem with all of the sub-£150 cards I see on the site.
The first card that will make you go "Wow, this is much faster!" would be the RadeonHD 5830. You can get a Gigabyte for around £190 http://www.overclockers.co.uk/produc...=56&subid=1711
From there on up is the RadeonHD 5850 which you can get for around £225 http://www.overclockers.co.uk/produc...ortby=priceAsc
Then the RadeonHD 5870 which costs around £320: http://www.overclockers.co.uk/produc...ortby=priceAsc
As in the US, the Nvidia Geforce GTX 465 and 470 really are not options at the prices I see on your preferred site. The GTX 465 has a starting price around £240. That is more than the RadeonHD 5850... and a significant problem when you consider that the Nvidia Geforce GTX 470, itself selling for near £300, can't outrun a RadeonHD 5850.
So basically, with the GTX 465, at least from that site, you'd be paying a £15 premium for less performance, and on the GTX 470, a £75 premium for no more performance.
If you have over £400 to spend, the GTX 480 does offer ultimate performance, ignoring things like power draw or heat output: http://www.overclockers.co.uk/produc...ortby=priceAsc
There's no real consistency in pricing on the site you prefer, just a steady increase in cost from around £430. I wouldn't buy that £390 version, mostly because it's an off-brand. Given the quality that on-brand vendors such as Zotac, Galaxy, and Sparkle have had to drop to in order to be profitable selling Nvidia kit, I'd be really hesitant about something that was built with even less quality control.
As with the threads on US prices, I consider this to be a rip-off. Yes, the GTX 480 is more powerful than the RadeonHD 5870. It, however, is not, by any means, worth £80 more, much less the near £110 premium the UK vendors seem to be asking for. -
Quote: Yes, I have. I've even pressed the hidden button behind the pinhole on the back of the cable modem to reset it. I guess I'll call my ISP. Any information I should gather together beforehand?
Depends on what kind of modem you have, and on whether or not I'm understanding the situation correctly here.
So, just to clarify:
- You are running through a Belkin router.
- You have these really long loading times on all computers connected through the Belkin router.
- You have bypassed the Belkin router and the long loading time remains on the single connected computer.
So we know there's a network issue, and the issue seems to be coming from outside the internal network. Do I have this straight so far?
From here: when you are directly connected to the modem, you should be able to connect to the IP address 192.168.100.1.
On most cable / DSL modems, this will take you to the modem's control panel.
My suspicion is that your signal levels are out of range. If it's a cable modem, you should have a TX, an RX, and an SNR level:
- The TX level should be between 45 and 58.
- The RX level should be close to 0.
- The SNR should be around 30.
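If you want to sanity-check whatever numbers the status page shows, here's a rough little Python sketch built on nothing but the ranges above. The units (dBmV for TX/RX, dB for SNR), the +/-10 window for "close to 0," and treating "around 30" as a floor are my own assumptions, not anything the modem vendor promises:

# Sanity check for cable modem signal levels (ranges taken from the post above).
# Assumptions: TX/RX are in dBmV, SNR is in dB, "close to 0" means within +/-10,
# and "around 30" is treated as a floor.
def check_levels(tx, rx, snr):
    problems = []
    if not 45 <= tx <= 58:
        problems.append("TX level %s is outside 45-58" % tx)
    if abs(rx) > 10:
        problems.append("RX level %s is not close to 0" % rx)
    if snr < 30:
        problems.append("SNR %s is below ~30" % snr)
    return problems

# Example: plug in whatever the modem's status page reports.
print(check_levels(tx=52, rx=2, snr=33) or "Levels look to be in range")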
My suspicion here is that your cable Signal to Noise ratio is either really high, or your Transmit level is really low. -
Quote: thanks Shark.
No, but it does give the devs an idea of where to start them at. I'd almost wager that they look at them after GR.
If you've ever read anything of what I've written about the beta testing process before, you wouldn't be asking this question. It's well known that I often go off about running into people in betas who do not understand what a beta is for. As it is, I've already climbed the cases of several players in the past, both on the forums and in the Test client, for basic incomprehension of beta testing.
Quote: And you presume that the selection of players running TFs in beta testing is representative of the normal player base?
I don't pretend to suggest, or indicate, that the normal players involved in a beta test represent any significant majority of the player base.
In the case of the Positron Task Force, they don't have to. There's a difference here between data reading and data mining. The developers already have loads of data from the original Positron Task Force on how different groups respond to, and handle, the various enemies found within the Task Force.
From the developers' perspective, the only combat data they lack is how groups handle the shadow dopplegangers from the CoT, and how groups would handle an Archvillain.
For practical purposes of reward assignment, the developers already have the expected numbers for the majority of the battle content within the "new" Positron Task Force, and only need the combat data from the doppleganger encounter and the combat data from the fight at the Dam.
All a beta has to provide is confirmation on whether or not those numbers are within the range of what the power-gamer achieves, and what the oh shinies player achieves. -
Quote: so if I go back to non-Ultra Mode I should get SLI then... or Crossfire.
Short version: no.
SLI / Crossfire support was never triggered for CoH in the past.
I really don't know why Nvidia didn't support it, given that the game was one of their "babies" at launch... Given that games like Doom 3, Prey, and Quake 4 could leverage SLI, in some cases with some amazing performance results... I know it wasn't an issue with CoH's usage of OpenGL as a rendering API.
So, I couldn't tell you why Nvidia never did support SLI, in any form, with the old graphics engine.
Now, given that Nvidia is supposed to be the champion of multi-gpu support... I mean.. seriously... you think Multi-GPU, you think SLI... I would have thought they would have been chomping at the bit to have CoH's Ultra-Mode running in SLI on launch.
I'm guessing that Nvidia still has engineers working on the program, and indications from Television suggest SLI support could be arriving later this year. It may, for reasons I'm not sure I understand, ultimately require a hard switch like the test server did.
* * *
As far as AMD/ATi goes... back then, ATi couldn't have cared less about OpenGL.
ATi's OpenGL support was actually so bad in its pre-2007 state that AMD pretty much ordered the old engine junked, and they've been through at least one other complete engine re-write since then for OpenGL support. I think I'm also right in saying that the OpenGL 4.0 driver is using yet another completely new engine. I could be wrong there... Terry Makoden runs screaming from AMD's headquarters any time I send him an email. (No, I don't know if he actually does run, but it has been hinted in the past that one of my emails did generate a hole in a wall.)
Anyways, the rumor is that AMD does have a Crossfire solution in the works for CoH. However, this solution will likely only be for RadeonHD 2xxx series and newer. If you are running Crossfire on an x800, x1800, or x1900 series card, it probably won't work. On the other hand, I know what the crossfire editions of these cards sold like... and I'm pretty sure that AMD/ATi isn't exactly concerned with Crossfire support on these cards.
For that matter, I know what the HD 2000 series sold like too... and honestly... I'm not sure AMD/ATi should bother.
* * *
Anyways, my theory is that SLI / Crossfire don't work with the original graphics engine because the code was fundamentally bent out of shape to deliver some effects on various cards, such as Intel integrated graphics. The graphics engine has been described to me as a load of rusty plumbing with band-aids every two inches.
The Ultra-Mode update reportedly cleared out a lot of the old custom patchwork code, which promptly generated problems for many users with older graphics cards / Intel graphics accelerators.
Theoretically, the new engine should be easier to accelerate in a multi-GPU environment.
Again, this is largely dependent on what the driver sees and is programmed for. -
Quote: They said OpenGL, not OpenGL 3.x. Also, according to that press release, SLI just magically works and the Devs don't have to do anything, so this thread shouldn't exist.
Heh.
Bingo.
Let me spell this out B_Witched: Nvidia. Lies.
They've disabled PhysX support in their drivers if you don't use an Nvidia card to render a scene... Not really a "big" deal since Intel killed off the Larrabee add-in cards and that means you'd have to be one of those users with an ATi primary card and an Nvidia secondary card.
They've been caught with their pants down sabotaging competitors' graphics cards in games like Need for Speed, Borderlands, and Batman: Arkham Asylum. Again, maybe not the biggest deal since, again, the only competitor is AMD/ATi.
Oh... and they've got numerous class-action lawsuits over the multiple product recalls from literally exploding laptops... where Nvidia outright lied to vendors over just what the actual thermal outputs of their chipsets were. And in keeping with that theme, yet another vendor issued another recall for yet ANOTHER line-up of EXPLODING laptops... THIS WEEK.
I'm sorry, it's not that I'm pro-AMD/ATi. I just hate companies that outright play nasty every chance they get. As of late, Nvidia's been paying game developers to sabotage game code so that it breaks on competitors' equipment; refusing to tell OEMs and ODMs just how hot their chips actually are, which lets sub-standard cooling solutions for their chips hit the market; and essentially telling add-in board vendors that if they want to make a profit, they'll have to use cheaper board materials (Sparkle, Zotac, Galaxy).
Okay. Fine. I'm biased. Nvidia won't support what actually matters to me. They won't release the specifications on their graphics cards. They won't recognize the Nouveau driver. They have no open-source / Linux strategy beyond a binary driver.
Hell, even Intel has partially won me over with the I7 processor and with what Larrabee could do. I really think Intel made a bone-headed mistake pulling Larrabee from the market. I really think it could have been the graphics tech that would have lit a fire under both AMD and Nvidia. The I7's a really good processor, and you no longer get reamed up the rump on the infrastructure costs of building an Intel rig. That, and Intel's actually pretty damn good on the open-source front. Sure, their normal IGPs resemble a hybrid mix of a Rage 128 and a Hoover on Windows, and the Linux driver is even worse... but at least they decided to match what AMD does.
I'm getting away from the point here.
Just because Nvidia says SLI support is automagic across DirectX and OpenGL, and should work no matter what... DOES NOT MEAN IT ACTUALLY WORKS THAT WAY.
First: not every game is going to benefit from AFR.
Second: support still needs to be entered in at the driver level. Nvidia hasn't had real games to optimize OpenGL 3.x for.
Okay, there is a fair point in that, yes, OpenGL 3.0 has been out for a couple of years now, and quite frankly both AMD and Nvidia could have used their own internal demos to prepare for potential rendering methods... and there are some really talented coders on Rage3D and Nzone who could have provided a quick and dirty OpenGL 3.x rendering demo to work against... so... yeah, I think both companies have a bit to answer for on why the support is so poor.
On the other hand, the gaming industry has been chasing after DirectX like it was some kind of liquid crack... not only because OpenGL stagnated like a Georgia swamp when SGI decided to take a nap, but the industry is also finding out that when you piss off several million gamers still using Windows XP... it does bad things to earnings reports.
Now, I could go into the full-on spiel here about why DirectX, and any proprietary platform API, is a bad idea... but I don't think anybody wants to read lecture #42.
I'm getting away from the point I wanted to make. Nvidia, as a company, doesn't have a history of telling the truth to its buyers, end-user or corporate, and is known to use more flowery terms than me.
Quote: Hence the resurgence now in visible commercial products turning back to OpenGL. Khronos, as an organization, has gotten far more involved in driving OpenGL adoption, and when you present the only selection of programming APIs that will allow a programmer to hit any platform, regardless of OS?
Here's the thing:
Anytime you read something from AMD, Intel, or Nvidia, have a barrel of salt on hand.
Then talk to the guys who actually write the code.
Granted, I will admit that hanging around the X.org dev channels on IRC is a bit... scary. -
Quote: SLI technology can be enabled for every gaming application, including both OpenGL and Direct3D gaming applications. SLI technology provides either 3D performance scaling using alternate frame rendering (AFR) or split-frame rendering (SFR) or increased visual quality using the SLI Antialiasing mode. In order to provide the optimal 'out-of-box' experience for its customers, NVIDIA has created an extensive set of optimized game profiles which enable SLI scaling automatically. The full list of these SLI-optimized games can be found here.
NVIDIA® SLI technology can be enabled for every gaming application. In addition, to provide the optimal 'out-of-box' experience, NVIDIA has created an extensive set of over 500 custom application profiles which enable SLI technology automatically and optimize scaling performance. These optimized applications, shown below, are enabled automatically with no control panel changes required.
Okay, so you can copy from Nvidia's press-site.
That still does not change what I said. -
Short version: Multi-GPU is normally accelerated through the driver itself. This is not a case of CoH not supporting SLI or Crossfire. It is a case of the driver vendors not having added multi-GPU support.
City of Heroes' Ultra Mode uses the OpenGL 3.x API for its graphics and, at the time of launch, was the only commercially launched game to utilize the OpenGL 3.x API.
As such, neither Nvidia nor AMD has had their hands on fast-paced OpenGL 3.x rendering samples to optimize their drivers for. They haven't had "time" to optimize their drivers for multi-GPU support atop OpenGL 3.x.
There are other titles out there now leveraging the OpenGL 3.x and 4.x APIs, such as the Unigine Heaven benchmark and Valve's Source engine. Id's Rage will also reportedly be driving an OpenGL 3.x rendering path for the PC release, although it will be using an OpenGL 2.x rendering path for the Xbox 360 / PlayStation 3 releases.
* * *
Now, it is technically possible for a developer to force a game to sabotage or make better use of various Multi-GPU modes. For example, when Borderlands launched it was deliberately sabotaged at the code level against AMD Crossfire configurations.
During the I17 beta, CoH also had a special command-line-accessed rendering mode enabled on the test server, taking advantage of 2x SLI setups. As of the current test server revision, this command-line mode seems to have been pulled. -
Okay. Thanks to my browser being stupid, I lost what I was originally posting.
Short run down:
Multi-GPU support: not really needed. Unless you want to push high levels of detail and massive image filtering, most multi-GPU rigs are overkill. A RadeonHD 5850, for example, can be beaten by two 5770s working together; the 5770s in Crossfire can actually best a single 5870 in some games; and with the 5850's original launch supply issues and retailer markups, the $320-$340 you'd spend on two 5770s was probably going to be worth it.
But the 5850 can now be had, reliably, for under the cost of two 5770s, and unlike two 5770s, it's not dependent on Crossfire support, so you'll get that high performance... all the time. Not just in situational events.
The 5770, on its own, can push DX11 / OpenGL 4.0 shaders at 1920*1200, so if you are buying to future-proof, it's sort of a safe mid-range bet, provided you don't mind not being able to apply high levels of anti-aliasing.
* * *
There is a catch, though, when buying a motherboard with an eye to multi-GPU support. Nvidia decided a couple of years ago to abandon the AMD market, and basically won't allow AMD to support Nvidia SLI on AMD chipsets. This is among some of Nvidia's other nasty moves, such as disabling Nvidia PhysX when a non-Nvidia GPU is rendering the graphics.
The result is that while Newegg.com currently lists 139 different Socket AM3 motherboards with AMD chipsets, it only lists 15 Socket AM3 motherboards with Nvidia chipsets. Of those, only 5 motherboards have two PCIE 16x 2.0 slots. Two of those are open-box units... meaning only 3 Socket AM3 motherboards will do SLI:
- MSI NF750-G55
- ASUS M4N75TD
- ASUS M4N98TD
Only one motherboard will do Triple-SLI:
- MSI NF980-G65
These boards will also only do SLI. They won't do Crossfire. So if you are looking at Multi-GPU, you'll basically have to buy an AMD motherboard with either an Nvidia chipset or an AMD chipset... and then only use multi-gpu setups from whichever vendor the chipset came from.
Intel on the other hand... well... if you buy an Intel Motherboard with the P55 chipset, many of them will do both Crossfire AND SLI.
Such as this BIOSTAR T5 XE CFX-SLI.
Basically, Intel motherboards are not the screw-over jobs they used to be. $134 for a BioStar motherboard is actually a pretty decent price. You basically don't have to bend over and accept a hot-red spike up the... well... you know... just because you bought an Intel motherboard.
In fact, the majority of the AMD boards are not much off that price for the same level of feature support.
So, really, the motherboard itself isn't so much a deal-breaker like it has been in the past. Granted, that is largely in part because I7 and I5 motherboards don't need to have the memory controller wired into the board, and vendors no longer have to pass the price of that memory controller onto the end-user.
Problem is, you'll still make up that memory-controller surcharge on the processor.
Basically, if you were looking to spend for ultimate performance, I'd say get an I7:
You won't get reamed on the motherboard.
You won't get reamed on the memory.
You won't get reamed on multi-GPU support if you buy a motherboard with a P55 chipset.
You'll still get reamed on the processor price itself, since most I7s you can buy start at $300.
If you aren't looking to spend $300 on a processor alone though... Intel still hasn't quite caught up in the mid-range and low-end.
Down here where the "normal" folk buy hardware, AMD's still got the best bang per buck, although you do lose out on multi-GPU rig choices. -
Quote: Hey folks,
Hope all is well. I'm in the process of buying an ASUS motherboard, but I'm not sure which is really best for my needs/playstyle:
1. Games - currently played is just CoX. Future, maybe an FPS like CoD or an MMO such as DCU Online or BioWare's Star Wars.
2. Bang for the buck - looking towards the future. I think Intel's i5 is marginally better than both AMD's Phenom II X4/X6. I know AMD is generally better "bang for the buck". On this note, why not just build a high-end dual core instead of the 3 mentioned?
Also, do I really need a motherboard for two video cards? I just wanna pay for practical value/usage, not eye candy.
Thanks in advance.
Actually... Intel's I5 is the processor series that makes woof-woof noises. There's a reason there's a flood of laptops and mid-range desktops this year dropping the I5 for AMD offerings.
Now, just for pricing, the AMD X4 955 is $160, and the X4 965 is $180.
By comparison, the cheapest Intel I5 is $180, and that's the 650 Clarkdale, which is a dual-core.
The cheapest Intel I5 quadcore, and the only one Newegg offers, is the $195 750 Lynnfield.
So, as far as bang per buck goes, the I5 isn't really bang for buck at all. Its straight-out IPC isn't actually that much better than the Phenom's, due in part to the cache cut down from the I7 it was built from, and you basically pay more money for fewer cores. Sure, a dual-core Clarkdale might keep up in a single application, but when you turn to multi-threading and multi-processing... it's a case of bai bai bai.
It's also not likely that the quad-core can keep up either. Case in point: here, I've got a Phenom II 965 and an Intel I7 920 engineering sample. The 920, if you could buy one, costs around $300. In every single game and benchmark thrown its way, the Phenom II 965 absolutely destroys it. Granted, the Phenom II also came out long after the I7, and has nearly a 733MHz raw speed advantage (2.66GHz versus 3.4GHz).
Now, on paper, the I7 does have the advantage of triple-channel memory support and Hyper-Threading. The motherboard I bought won't do triple-channel, and with 4 native cores at a 2.66GHz clock speed, Hyper-Threading really doesn't have much to add to the typical gameplay session. So the performance of my I7 is probably very close to where the performance of the Lynnfield I5 is going to be, ignoring other internal architectural changes, as the cache and clock speed are the same between the two.
And... it's not a contest. The Phenom II is way more powerful.
* * *
Now, that being said, building a high-end dual core from AMD could be cheaper. I've got an Athlon64 6000 Black Edition sitting on a Socket AM2+ board. On paper, it has the same amount of cache as an Athlon II X2 250, but a slower HyperTransport bus (2GHz versus 4GHz on Socket AM3).
So the performance would probably be close... and in some games... well, I'll be honest, you can't tell a difference between the dual-core A64 X2 and the Phenom / I7 systems.
Would I buy an Athlon II X2 250? Oh no. The dual-core Phenom II 550 is only $18 more... and in the long run, its extra cache is going to be more future-game-proof.
I'd also be looking at the Athlon II X3 445. It's around $85, and while it lacks the cache... that extra core is kind of hard to pass up.
If I could get over the $100 mark, to $120, I'd be looking at the Phenom II X4 940. Okay, it's a Socket AM2+ processor, not a Socket AM3 processor, so you do have to use the slower DDR2 memory. The Socket AM3 version is near $30 more, and the memory price difference between DDR2 and DDR3 really almost pays for itself there. At the $150+ mark, though...
* * *
Okay: motherboard.
Actually, I'm going to post this to get it up, THEN do the motherboard bit. -
Quote: I've tried several times to use the graphics and sound option, but any time I select it, it crashes, claiming an issue with ig4dev32.dll. Is this something that can be fixed?
It's an ASUSTeK computer, to be more specific an ASUS CM5571. It's running a Pentium Dual-Core CPU E5400 for a processor, with 6.00 GB of RAM and the standard unnamed graphics card that came with the system.
It's not an un-named graphics card. It's no graphics card. The unit shipped with an Intel Graphics Accelerator.
First: Go to www.intel.com and look for updated graphics drivers.
Second: think about buying a real Graphics Processing Unit
Third: check these posts out:
http://boards.cityofheroes.com/showthread.php?t=220618
Quote: 1. Right-click on the CoH shortcut.
2. Click on Properties.
3. Click on the Shortcut tab at the top.
4. Go to the Target window: "C:\Program Files\City of Heroes\CohUpdater.exe"
5. After the last ", press space once.
6. Enter the following code exactly: -useTexEnvCombine
7. Click on Apply, then OK.
8. Right-click on CoH, then click Run as admin.
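If you've followed those steps, the Target line should end up reading something like this (assuming the default install path shown in step 4):
"C:\Program Files\City of Heroes\CohUpdater.exe" -useTexEnvCombine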
http://boards.cityofheroes.com/showp...9&postcount=26
http://boards.cityofheroes.com/showp...73&postcount=6 -
Quote: Reminds me of my experience with a lame Kin who wasn't using his heal (Transfusion), our only heal on the team.
(seconds from Team Wipe...)
Me: [lame Kin], spam Transfusion please!!!
Blaster: Thanks but I'll pass, I've eaten the stuff, that's as close as I wanna get to injecting it into my body.
I laughed for about 20 minutes picturing a spam transfusion.
I normally say something like: "Hold down your left Control button please. Now left-click on your Transfusion power. Got it? There. Now you'll heal melee and take away their regeneration."
Anyways, having now said the sensible thing, now for that mental image!
I see a farmyard of NOTHING but Kinetic Pigs, threatening anybody who gets close with Spam Transfusions! Their motto?
"You can have the Spam, but Never the Ham!" -
Quote: Yeah, I saw that, but all that meant to me was that it was the only option you didn't have cranked all the way up... not that it was completely off. For all I knew you might have had it on low, and it gave me hope that the conflict was already fixed... well, I hope it comes soon. weeee! lol
As far as I'm aware, no "floated" driver (meaning one that has been passed to OEMs / ODMs or sent out to the Catalyst beta testers) has AA+AO working on the 5000 series. This doesn't mean there isn't a driver build that enables the two together, but my guess is support is probably 2-3 months off. It took about that time for the AA+AO fix on the OpenGL 3.3 driver to filter from the floated driver to the stable release.
I'm under the impression that both Unigine and Source use AA+AO in the same manner that CoH uses the two, so there is some pressure on AMD/ATi to get the support fixed. -
Quote: the devs don't care what your (I mean you) average time is. Obviously they feel that they give the appropriate merits at this time.
Yeah... this is pretty much it.
The developers have access to the timings on every Positron Run made on every server... and that's what they base the merits off of.