The question I have is what resolution are you looking to run the game(s) at? Needless to say, the higher the resolution, the more GPU power you will need. Also, do you insist on playing games with AA on? Again, it matters, as AA versus no AA affects performance.
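To give a rough feel for how much resolution matters, here's a quick pixel-count comparison of the two common test resolutions, 1680x1050 and 1920x1200 (just a back-of-the-envelope Python snippet of my own, not anything from a review):
# More pixels per frame = more GPU work per frame.
low = 1680 * 1050        # 1,764,000 pixels
high = 1920 * 1200       # 2,304,000 pixels
print(high / low)        # ~1.31, so roughly 30% more pixels to render at 1920x1200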
And this page from the HD 5830 review summarizes the results for a number of games at either 1680x1050 or 1920x1200, with no AA or 4xAA, compared to an HD 5770 and an HD 5850. Actual numbers are found earlier in the article. -
Yea, it appears that while the HD 5830 is about 10-15% faster than the HD 5770, it's about 25% slower than the HD 5850. On top of that it uses more power than the HD 5850, due to its higher clock speed.
List prices: HD 5770 ~ $180; HD 5830 ~ $250; HD 5850 ~ $300; so based on performance the HD 5830 should be priced around $220. The problem with that is it still uses the same large GPU that's in the HD 5850 as well as the same board. Sure, it's a neutered version of that chip, with half of the ROP circuitry disabled along with 320 of the SPs, but the manufacturing costs for everything else must be similar.
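For what it's worth, here's the rough figuring behind that ~$220 number (a little Python sketch using my own approximations, treating price as proportional to performance between the 5770 and 5850):
# Linear price-per-performance interpolation between the HD 5770 and HD 5850.
p5770, p5850 = 180.0, 300.0      # approximate list prices (USD)
perf_5770 = 1.00                 # baseline
perf_5830 = 1.125                # ~10-15% faster than the HD 5770
perf_5850 = perf_5830 / 0.75     # the HD 5830 is ~25% slower than the HD 5850
fair = p5770 + (perf_5830 - perf_5770) / (perf_5850 - perf_5770) * (p5850 - p5770)
print(round(fair))               # ~210, i.e. in the same ballpark as $220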
I bet in a month we'll see its price drop to near $220. It outperforms the GTX 260, and the GTX 275 is nowhere to be found. And with nVidia saying that there isn't going to be a Fermi-based mid-level card any time soon, ATI will be happy to fill the niche between the GTS 250 and the GTX 285 with three cards at various price points to nVidia's one. -
Quote:In that case maybe they should have Supergroup Groupie as a Day Job.
On a related note, I only recently discovered that a character with no supergroup membership and no means to access a base can still earn the day job badge simply by logging out next to a base portal. I had previously assumed you needed to actually enter a base before logging out to get credit, but not so.
-
Quote:It's not that the game is locked down to two cores. It's just that only two threads really want CPU time, while the others only run infrequently, don't run for very long, or are synchronized to one of the two "main" threads, meaning the main thread pauses until this thread is done.
AI vet pets would be cool.
Correct me if I'm wrong, but you seem to be stating that there's nothing the devs can do to the engine that the OS isn't already handling in order to "spread the pain" across all the cores on a proc.
Granted, my understanding of processor engineering is nonexistent, but if that's the case, why did the devs have to enable the renderthread flag so that it could use the second thread/core in the first place?
Why is it locked down to two cores as it is? Or is it? Resource Monitor is showing me that cityofheroes.exe is using 17 threads for 13% CPU utilization, with most of the action appearing to happen on CPUs 0 and 6, with half as much working on 2 and 4. All the odd-numbered CPUs are labeled "parked" and show almost no activity at all.
Oh. I guess it doesn't matter.
Right now I'm looking at 400 threads running under XP. Of those, only a handful are seeing any noticeable CPU time. The rest wait patiently for whatever event they are keyed on to occur. Same is likely true with the game.
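If you want to see the effect for yourself, here's a toy Python program (my own illustration, nothing to do with the game's actual code) that spawns a pile of threads where only one ever does real work. Task Manager will show lots of threads but only about one core's worth of CPU time:
import threading, time
stop = threading.Event()
def busy():                        # the one "main" thread that actually wants CPU time
    while not stop.is_set():
        sum(range(100000))
def idle_worker(wake):             # the rest: parked until some event fires
    while not stop.is_set():
        wake.wait(timeout=1.0)     # nothing ever sets it, so they mostly sleep
events = [threading.Event() for _ in range(16)]
threads = [threading.Thread(target=idle_worker, args=(e,)) for e in events]
threads.append(threading.Thread(target=busy))
for t in threads:
    t.start()
time.sleep(10)                     # now compare the thread count with the CPU use
stop.set()
for t in threads:
    t.join()
-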
You got this desktop system, right? And then added a video card.
The GT 220 is better than the 9500GT, depending on the type of memory it has. There is a huge difference in performance between a GT 220 with 512MB of DDR2 memory and one with 1024MB of GDDR3 memory. The chart on Tom's Hardware assumes a GT 220 with GDDR3 memory. But we are still talking about somewhere between half and two-thirds the performance of a 9800GT.
Now the problem with a 9800GT is that I'm going to guess your system doesn't have the power supply to support a 9800GT comfortably or reliably. Sure, it may work for a while, but like red-lining an underpowered vehicle, it's not good for the longevity of the engine, or in this case the power supply.
Objectively, the E5400 isn't a really good CPU for gaming. A slow 800MHz FSB with only 2MB of L2 cache hurts applications that access lots of memory all the time, like 3D games. I'm not saying it's horrible or crippled, but it's what you will find in a low-cost system. If I were building a system, there are other Intel CPUs within $10 of the price of the E5400 which are better for gaming. As for recommended upgrades, I simply don't know enough about what CPUs that system's BIOS is ready to accept to offer one. Perhaps there is info in the system manual about what CPUs would be compatible.
You have plenty of memory and are running 64-bit Win 7, so no problem there.
Over the weekend I was visiting a friend and played on his system, which is a dual-core 2.4GHz Athlon X2 with a 7950GT for graphics. I was seeing 45+ FPS in CoH at nearly the same settings as you are running. I think your system is just as powerful, if not more so, than my friend's. So overall it's not a bad system. Just not one that is going to run the game at HDTV resolution well. -
Well the MacBook Pro used the 9600M GT. The 310M has maybe 70% of the performance of a 9600M GT.
-
Before PCIe there was AGP. Now, some business-targeted or ultra-inexpensive systems only had PCI. If you have PCI right now, your performance isn't going to be all that great with the current 3D engine.
Ultra Mode is just an option, something beyond the maximum settings of today for people with systems that can handle it. You aren't going to be excluded from this game if you don't have the hardware to handle Ultra Mode; you will simply have the same graphics as you have today. -
The nVidia 310M is not all that powerful. It'll run, but you will probably need to play with the advanced settings to get a framerate you feel comfortable with.
-
Check the first post in the thread. Posi updated it about 2 weeks ago.
-
Quote:My SWAG says between minimum and medium Ultra Mode settings with that rig.
My Settings now are
ATI HD 4870 1 Gig GDDR5
AMD 3.2 GHz Dual Core Processor
4 Gigs of DDR2 Memory.
Will I be able to run Ultra Mode? I just got this card not too long ago and had to spend money already to get that. I was hoping Ultra Mode wouldn't be insanely taxing on the system, given I can play CoH right now at full power and not even touch 30% of my PC's strength, but now I'm reading I'll need top-of-the-line stuff? Seriously? -
I'm not surprised by any of this.
The price: it's a big chip, a really big chip in comparison to ATI's RV870. It's not going to be cheap to make, nor are its initial yields going to be high. Then you have nVidia's traditional "performs nearly as well as our previous-gen twin-GPU card" positioning, and the pricing to match. In this case that's the GTX 295, which is currently over $500.
As for performance, I believe the problem comes down to one word: power. I'm not sure nVidia wanted to introduce a card that requires three PCIe power connectors. Easiest fix: run the chip slower and maybe at a lower voltage. Don't be surprised to see this card with two 8-pin PCIe connectors. And of course power also means heat. Anyone care for a three-slot video card?
As for the drivers: unlike the 8xxx, 9xxx and GT200 chips, Fermi isn't just throwing more SPs on a die and calling it a day. There are significant differences in the underlying organization. So I would imagine that the drivers may need to be built from the ground up to use this chip at full capacity.
It is also possible that nVidia is planning on releasing a revamped driver with significantly improved performance after seeing what ATI will do to counter Fermi. Brand the new drivers with a catchy, powerful-sounding name (Xviscerator), then kick back and watch the buzz as the new drivers kick the performance up by 30% in whatever games are being used at the time for benchmarking. God, I'm starting to think like a marketing person.
I also believe that nVidia actually doesn't care about the high-end video card market right now. The Fermi cards are really targeted at Wall Street/university/national lab HPC (High Performance Computing) applications. There isn't pricing pressure in that market right now, at least nowhere near the same as in the consumer video card market. As gamers balk at the pricing, the majority of Fermi chips manufactured will be for the HPC market. -
I did a quick filter at NewEgg and got this list. I simply filtered for systems less than $800 that have a video card capable of some level of Ultra Mode.
-
And if you want to see if a Forum Name is already in use, go to the Advanced Search page (click Search at the top of the page and then Advanced Search in the popup that appears). Now, in the field labeled "User Name", type your choice. If that name is already in use, it will appear as a drop-down under that field (it starts to display after 3 characters).
-
Cool, so a 2.8GHz Athlon II X2 240 for the CPU, a 785G uATX motherboard, 4GB of DDR2-800 memory and a 512MB 9800GTX+ for the graphics. Also a new PSU to power it, and a DVD drive since the current one didn't survive the move.
So, it's still the original case and hard drive(s), right? Got to do something about that. -
Quote:When I said secret code, I meant that literally. Well, not a code per se but a decryption key. The part of nVidia's driver that handles SLi is encrypted. At boot time the key is fetched through a function call, and if the call succeeds the SLi portion of the driver is then decrypted. In the era of the nVidia chipset, the key was found there.
I hate to disagree with you, but Nvidia was already abandoning the AMD chipset market when Intel was bringing the i7 architecture to market. Nvidia was looking to get out of the chipset market, and simply used Intel's positioning on i7 licensing as an excuse to stop producing chipsets. Licensing SLI technology would be more profitable to Nvidia than continuing to make their own chips.
There was just one slight problem.
There's no secret code. According to Intel engineers, they didn't have to make any changes to X58 to support SLI. Nvidia just had to allow the setup in the official drivers. Various users have been using leaked, beta, or hacked drivers since Crossfire motherboards started hitting the market to run two Nvidia cards atop an ATi chipset, or Intel chipsets that support Crossfire.
It ticks me off that this locking down of nVidia's SLi technology is equivalent to an inkjet printer's ink cartridges being chipped so no third-party cartridge will work. Especially since the impression they gave when it first came out was that they were doing something clever with the PCIe controller in their Northbridge.
But nVidia is still waving the lockout from QPI (Socket 1366) and DMI (Socket 1156) interface licenses at the FTC as examples of Intel competing unfairly. -
There is always a game every few years that brings the current-generation elite gaming system to its knees. Before Crysis it was Oblivion. Before Oblivion it was FEAR with soft shadows on. Before FEAR it was Far Cry, and the circle is now complete. Of course that was back in the day before LCD panels were locked at 60Hz.
In other news, nVidia says not to expect new mainstream GPUs based on the GF100 technology (Fermi) any time soon because their current crop of mainstream GPUs is "fabulous".
If it weren't for their success with their Tegra system-on-a-chip that's in the latest rev of the Zune, that news would make me want to sell nVidia stock, if I had any, which I don't, just to be clear, Mr. SEC.
The G210, GT220 and GT240 are Dx10.1, but the GT240 tops out at $120 (with 1GB of GDDR5 memory) while its performance is less than the 9600GT's. So between those and Fermi, all nVidia has left are the G9x-based Dx10 video cards, assuming the GT200-based ones are truly being phased out. I have no idea what nVidia is thinking. -
No P_P, the GTX 275 card with the built-in GTS 250 for PhysX was a waste. It costs as much as the GTX 285, and nVidia GPU-based PhysX doesn't work with this game.
-
An MMO that's using the CryEngine 2, ouch.
Hopefully our Ultra Mode is nothing like CryEngine 2 in terms of performance. At enthusiast settings (knobs turned to 11) I don't think I've seen any single-GPU setup that could run Crysis above 30 FPS and at a reasonably high resolution at the same time. That could just be a problem with Crysis and not the engine in general, but looking at Calypso's recommended requirements, that may not be the case either. -
Be aware that the HD 5870 is a long card at 11". That may be too long for some mid-size cases.
-
Quote:Maybe he bought it in a store instead of ordering it online. Best Buy has them at $70-80.
... you paid $100 for a 9400 GT?
... please tell me you aren't in the US.
The starting price for a 9400 GT is $40. If you paid $100 for it, you paid $60 over the retail value of the card. You quite literally PAID MORE THAN DOUBLE WHAT THE CARD IS WORTH!!!
If you're paying $100, you'd be looking at GTS 250's and Radeon HD 4850's, NOT 9400 GT's.
So first thing, if you can, send that video card back and get something worth your money.
Second thing, yes, it's probably the VGA cable.
And Nick, start with the cable unless you have another monitor to try. -
Quote:They didn't decide to get out of the chipset market, they were pushed by Intel, who didn't want any competition in the chipset market for their latest line of processors (Socket 1156 and Socket 1366). They aren't too happy over nVidia's ION chipset for the Atom either, but that's an FSB license and that horse is out of the barn. That's one of the reasons behind the FTC probe of Intel.
Nvidia did not open up SLI licensing until Intel launched the i7, and Nvidia decided to get out of the x86 chipset market.
All nVidia could do was to license their secret code to motherboard manufacturers to put into their BIOS so SLi would work. Otherwise their whole "buy multiple video cards" business plan goes POOF! -
Well, the interesting thing is that it's smaller and lighter than some. This one is 50% larger in volume than the Hyper and 150 grams heavier.
-
So any problem with the Hyper 212 Plus heatsink?
The mounting instructions make it appear to be a little more complicated than some other third-party heatsinks. Of course that's probably due to its multi-platform compatibility, as well as the picture-oriented, 63-languages-on-one-page instruction sheet. -
OK, I'm cutting and pasting from one of the recent times this was asked.
Quote:First, there aren't any direct downloads.
Second, the updater will remember where it left off, so don't be afraid of stopping the updater. You could stop the updater and restart it; it may end up connecting to a different download server.
Third, you could try modifying the shortcut to the updater (cohupdater) to force it to use a different port. Edit the Target line and add -port 13094 to the end (space before the -) as described here. The default port is 6994, which happens to be in the usual default range of ports for BitTorrent clients, so there is a chance that a router located between you and the download servers may have throttled the download.
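For example, if the Target field currently ends with cohupdater.exe, the edited line would end up looking something like this (the install path here is just a guess, yours may differ; the -port switch goes after the closing quote):
"C:\...\City of Heroes\cohupdater.exe" -port 13094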