I'm betting it'll be a few more weeks before I17 goes open beta. Even though UM has been in development for a while (they showed it at the last Hero-Con), the closed Beta allows them to test a great many more hardware configurations and video cards. Who knows what weirdness might spring up. And that's just UM.
Then toss in the new UI for WW/BM, the MA changes, and the e-mail transfer system. If the e-mail transfer system hiccups, I'll be able to hear the screams from here in East Nowheresville. Heck, if the market hiccups or the MA gets another major XP loophole, the howling won't just be a movie. Look at the cries over the BotZ change.
Ideally, Open Beta is for adding the spit and polish, not major debugging. If GR is starting Closed Beta next week, I'm betting it'll be on a separate Training Room server. We'll also see the pre-Issue 17 patch being downloaded. -
From what I heard on one of the vids from the presentation, the endgame system is only available if you get Going Rogue. No Going Rogue, no Incarnate levels/powers for you. GR will have the first level and powers; I19 the other 9 levels.
I'm guessing that powers/slots unlocked for Incarnate levels are the only ones you can pick at those levels. You can't add slots to that power you picked at 49 or pick powers from your current primary/secondary/aux/patron power sets. This may extend to enhancements that can only be slotted into those powers.
Kinetic Melee looked interesting, but if you were one of the people complaining about powers with long attack animations, or about the DBZ character invasion during the early years, you aren't going to be a happy camper. If the level 9 power sounds like a Spirit Bomb, I will be disappointed. Ryu's Street Fighter attacks also apply. -
These were labeled as Aion time cards, $15, no container, just on a hanger, activated at the register. Sadly, my "local" Best Buy (20 miles away) didn't have any NCSoft time cards (in a DVD case) or any CoH/V box editions. This was as of a couple of weeks ago.
-
Not sure about that; the Aion time card I saw at Best Buy had only Aion branding, no CoH/V or Lineage branding on it. It was similar to the CoH/V game card in packaging, meaning none.
-
I thought the 9800GT was listed as the minimum, Aggelakis (unless Posi changed it recently). Still, that's splitting hairs, because the 8400GS has less than a tenth of the shader power of a 9800GT.
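A back-of-the-envelope check in Python, using the commonly quoted shader counts and clocks (treat the exact numbers as assumptions; the 8400GS shipped in several variants):

# Rough shader throughput. Specs are commonly quoted numbers and the
# 8400GS shipped in several variants, so treat these as approximations.
def gflops(shaders, shader_clock_mhz):
    return shaders * shader_clock_mhz * 2 / 1000.0  # MADD = 2 FLOPs/clock

print(gflops(112, 1500))  # 9800GT: ~336 GFLOPS
print(gflops(8, 1400))    # 8400GS (G98 variant): ~22 GFLOPS, about 1/15th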
-
I keep my backups on a USB thumb drive and a 2nd hard drive. I'm on dial-up, so a drive crash would be very, very bad with a game this large.
-
If you bring up Task Manager, add Base Priority to the displayed columns, and check "Always On Top" in the options, you'll see that when you go back to the game, the priority goes back to Normal. Click back onto Task Manager and you'll see it drop again.
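If you'd rather watch it without juggling windows, here's a minimal Python sketch using psutil; the executable name is an assumption on my part, so substitute whatever the game's process is actually called on your system:

import time
import psutil

TARGET = "cityofheroes.exe"  # assumption: use the game's real exe name

game = next((p for p in psutil.process_iter(["name"])
             if (p.info["name"] or "").lower() == TARGET), None)

if game:
    while True:
        # On Windows, nice() reports the priority class, e.g.
        # psutil.NORMAL_PRIORITY_CLASS or psutil.BELOW_NORMAL_PRIORITY_CLASS.
        print(game.nice())
        time.sleep(1)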
It's entirely normal. -
-
Nothing should prevent you except available power and cooling.
-
The drive motor must have seized and/or shorted if the PSU won't even turn on.
-
That and nVidia looking to sell something other than performance since ATI started kicking their butts with the HD 58xx line. Just like when all nVidia could talk about was CUDA and PhysX when the HD 48xx series came out.
-
The video card is seriously underpowered. Seriously.
The CPU and memory are fine.
I would also hazard a guess that the included power supply is seriously underpowered as well, hence the low-end video card. -
Agreed, the CPU is fine, and more system memory is always a good thing, but my reading of Posi's Ultra Mode buyer's guide suggestions (it's on the Dev Board) is that the 9800GT (an 8800GT by another name) is the minimum when it comes to UM on minimal settings. So if you want more than minimal UM settings, you're gonna need a bigger boat ... ah, video card.
-
Memory helps when a game has lots of large, highly detailed, uncompressed textures. The more memory a card has, the more textures it can store on the card, which reduces the need to transfer textures from main memory.
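For a sense of scale, here's the arithmetic (the texture sizes are illustrative, not this game's actual budget):

# Illustrative only: how fast uncompressed textures eat video memory.
def texture_mb(width, height, bytes_per_texel=4, mipmapped=True):
    size = width * height * bytes_per_texel
    if mipmapped:
        size = size * 4 // 3  # a full mip chain adds roughly 1/3 on top
    return size / (1024 * 1024)

print(texture_mb(2048, 2048))  # one 2048x2048 RGBA8 texture: ~21 MB
print(texture_mb(512, 512))    # a typical 512x512 texture: ~1.3 MB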
In the case of Ultra Mode with this game, more memory doesn't help; it's shader power that's needed. The only talk about improved textures is "marking" some as reflective. -
Quote: First, SLi isn't an automatic 2x improvement in performance. A lot of it has to do with how the game's code is structured, so the game code running on the CPU is waiting on the video card to finish before the next frame can start. The longer the wait, the more of a speed boost a 2nd card in the system gives.

Thanks Father X - that is interesting. The ASUS mobo has 2 PCIe x16 slots, but I only have one vid card. So, thought experiment (probably not really going to do it, and UM doesn't like dual boards yet apparently):
What would the generic performance comparison be for two SLI-linked 8800GTXs (which can use 16x2 = a combined x32 of bandwidth) vs. one GTX285, which is a faster card but can only use one PCIe 1.0 x16 pipeline?
Looking at nVidia's site and using the 9800GTX+ as a stand-in for the 8800 GTX (since they were about equal in performance - slight edge sometimes to the 8800 for the extra memory)
8800GTX x 2 = theoretical 30x 3DMark®Vantage Performance Preset (somewhat less due to SLI inefficiencies)
GTX285 x 1 using PCIe 1.0a single card = something less than 28x 3DMark®Vantage Performance Preset - perhaps 4% less
Going back and looking at 3DMark tests, 2x8800GTX scores 5467 on the SM3.0 test
http://www.legitreviews.com/article/421/7/
...on a PCIe x16 board, SLI giving it a x32 effective I guess. WinXP OS
and the GTX 285 scores 7540-7731 on the same test here
http://www.legitreviews.com/article/915/5/
...which is a PCIe 2 x16 board, so x32 effective I guess. Vista SP1 OS
So would it be correct to conclude that buying another 8800GTX would get about 75% of the performance of buying a GTX285, for a given system?
Interestingly, according to Google, the 8800GTX new is more expensive ($483 lowest it found) than a new GTX 285 ($352). Of course a used 8800GTX might be found, but still a weird disparity.
Second, each card may get a full 16 lanes of PCIe V1, but it's not like either can "borrow" the additional lanes from the other as needed. The SLi bridge is there so one card can read the frame buffer of the other for output to the monitor.
Third, the price disparity is because the 8800GTX isn't made anymore, and if someone is looking for one, it's usually because they already have one and are looking to do SLi. It doesn't occur to them, the gamer looking for a 2nd card, that a single newer card may be considerably faster and cheaper. You probably remember, since you have one, that the 8800GTX was a $600 card when it debuted, and I can understand why someone wouldn't want to pull it, bag it, and stick it on a shelf of old parts when they think they can still use it if only they could find another for SLi.
Lastly, those two reviews you cite use different hardware (CPU, OS, memory), so you really can't compare the two. You could compare them if they had used the same setup to test every card. They are also using a highly overclocked CPU and rather high quality settings, which helps SLi/Crossfire shine.
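That said, if you take the two scores at face value anyway, purely as a sanity check of the ballpark:

# Mixing the two reviews' numbers despite the different test beds,
# just to sanity-check the ~75% figure from the post above.
sli_8800gtx = 5467
gtx285_low, gtx285_high = 7540, 7731

print(sli_8800gtx / gtx285_high)  # ~0.71
print(sli_8800gtx / gtx285_low)   # ~0.73

So roughly 71-73%, close enough to your 75% guess. -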
There is a contingent of players who want animated hair, especially the long hair styles. They would love it if it moved based on player movement, similar to capes (i.e. move up while falling, sway as you twirl, fly back as you run). But since it doesn't move, it must be held in place by hairspray. Lots of hairspray. Or other product.
-
The difference between PCIe V1 and V2 is that V2 has double the bandwidth per lane. Another way of looking at it: a PCIe x16 V1 interface has the same bandwidth as a PCIe x8 V2 interface.
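The math behind that rule of thumb (the per-lane rates are the spec numbers after 8b/10b encoding overhead):

# Per-lane, per-direction bandwidth after 8b/10b encoding overhead.
MB_PER_LANE = {"1.x": 250, "2.0": 500}

def link_gb_per_sec(gen, lanes):
    return MB_PER_LANE[gen] * lanes / 1000.0

print(link_gb_per_sec("1.x", 16))  # 4.0 GB/s
print(link_gb_per_sec("2.0", 8))   # 4.0 GB/s -- same as x16 V1
print(link_gb_per_sec("2.0", 16))  # 8.0 GB/s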
If you think about it that way then you can look at the tests done over at Tom's Hardware on PCIe scaling. The results (using an i7-870 and an HD 5870) showed only around a 4% decrease in performance in the games they tested. Slower CPUs and video cards would make the impact even less.
Since the GTX 285 is considerably more powerful than your 8800GTX, games whose performance is throttled by the GPU will improve. It won't be as fast as it could be in a PCIe V2 slot with a hefty CPU feeding it, but it will certainly be faster than the original 8800GTX. -
Honestly now, it's good that Novawulfe is getting what he feels is acceptable performance. I would love to see a CoHHelper report to see what the settings are, but he's running a dual core with 3GB of system memory (yes, some will be stolen by the HD 3200) at a relatively low resolution, all of which helps overcome some of the limitations of an underpowered GPU.
It's my understanding, from a programming point of view, that features like ambient occlusion are very computationally intensive. Since they run on the graphics hardware, and Positron lists the 9800GT as a minimum starting point, it's only logical that the limited shader hardware in the HD 3200 isn't going to cut it.
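Some rough arithmetic on why (the sample count is an assumption on my part, not Issue 17's actual value):

# Illustrative only: why screen-space ambient occlusion is expensive.
width, height = 1280, 1024
samples_per_pixel = 16  # assumption, not Issue 17's actual number
target_fps = 30

samples_per_sec = width * height * samples_per_pixel * target_fps
print(samples_per_sec / 1e9)  # ~0.63 billion depth-buffer reads per
                              # second, before drawing a single polygon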
But we really won't know until someone gives it a try ... and can talk about it, which means once Issue 17 goes into open Beta. -
You can also save two different layouts of the various game windows and load them manually, so you don't have to shuffle windows around as you alternate between resolutions.
Use /wdw_save_file filename to save a layout and /wdw_load_file filename to load it.
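If you switch back and forth a lot, you can also bind each layout to a key, e.g. /bind F10 "wdw_load_file layout_windowed" and /bind F11 "wdw_load_file layout_full" (the file names here are just examples, use whatever you saved). -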
Quote: Glad to hear you get reasonable performance from that laptop. However, from my understanding of Ultra Mode, even at the lowest settings, these new graphic effects can floor the frame rate on lower-end video cards down to the single digits. It makes the frame loss from DoF, the current #1 frame rate killer, seem minor in comparison.

Just got the game downloaded and updated, and I can run the max settings for graphics and sound with no flashing, slow refreshing, or processor slowdown. And I have all the graphic options except for one of them; it's a totally different experience.
Not in beta, I'm talkin about live. -
It's not so much that it nails the sweet spot as that the HD 5830 and HD 5870 aren't priced linearly relative to the performance of the HD 5850. In the case of the HD 5870 that's OK, as the top-end part is always sold at a premium.
The problem with the HD 5830 is that, measured by performance, it's about 1/3rd of the way between the HD 5770 and the HD 5850, but it's priced about 2/3rds of the way between them, at least at the time of its introduction. Now that the price of the HD 5850 is drifting ever higher every week, it's closer to halfway in between.
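Roughly, using the street prices I remember from the HD 5830's launch window (treat them as assumptions): HD 5770 ~$160, HD 5830 ~$240, HD 5850 ~$300.

# Where the HD 5830 sits between its siblings. Prices are from memory
# and drifted weekly, so treat them as assumptions.
def fraction_between(low, value, high):
    return (value - low) / (high - low)

print(fraction_between(160, 240, 300))  # price: ~0.57 of the way up
# Performance sits only about a third of the way up, so you're paying
# well over half the price gap for a third of the performance gap.

-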
Quote: Well, other than being a $600 foot-long (the card is 12.2" long) and actually using two fully enabled RV870 GPUs, the GPUs on the HD 5970 are clocked 15% slower than the ones on the HD 5870, with memory clocked 9% slower. These happen to be the clock speeds of the HD 5850.

To summarize: the HD 5970 is two HD 5870s running at HD 5850 clock speeds.
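The core-clock arithmetic, using the commonly quoted reference clocks (my numbers, so an assumption):

# Reference clocks from memory -- treat as assumptions.
hd5870_core = 850  # MHz
hd5850_core = 725  # MHz

print(hd5870_core * (1 - 0.15))  # 722.5 MHz, right at the HD 5850's 725

-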
Or you bought it at a store instead of online. Surprisingly, people still prefer holding a physical item before parting with their money over parting with their money and waiting 3-5 business days for delivery. Don't begrudge them, je_saist; the Interweb is a scary place, full of Banks of Nicolai.
And actually, checking NewEgg, a 1GB 9800GT is selling for between $101 and $140 depending on the manufacturer, clock speed, and accessories, not counting shipping, so not really a bad deal.
As for UM, Posi says it'll support UM features at their minimal settings. The rest of the game's settings can remain turned all the way up; just the new UM settings are one notch above OFF. -
Well, all that matters is that what you have now performs better than what you had before. If you can now run the game with better settings and/or at a higher resolution while maintaining the same or better frame rate, then even better.
The E3300 may be stunted by its smaller L2 cache compared to other Core 2 CPUs, but it's still faster than any Pentium D ever sold. The G210 may be a low-end introductory video card, but it outperforms the $450 video cards from 6-7 years ago.
It's easy to laugh off low-end CPUs and video cards. First, hardly anyone reviews them, and when they do, they test them under conditions that could only make them look worse. For video cards this means testing games at max quality settings at a high resolution. That's like taking your Honda Fit and racing it against a Top Fuel Funny Car. Gee, I wonder how well it will do? If all you have is a 1280x1024 monitor, do you really need a $300 video card to get good performance? Is turning the quality knob down a notch or two really noticeable if you aren't looking for the differences?
As for CPUs, the difference in performance from best to worst, per core, is around 2 to 1. That's why in today's CPU benchmarks the number of cores matters more. I'm sure you would appreciate a 6-core, $1100 CPU like the i7-980X if you Photoshop, render 3D scenes, or encode video all day for a living; in that case you'd see a 4-6x performance improvement over the lowly E3300. But is it worth over $1000 more for only a 60-70% improvement in gaming? Of course not.
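To put rough numbers on that trade-off (the E3300's street price is from memory, so treat it as an assumption):

# Rough value math. The E3300 street price is from memory (assumption).
e3300_price, i980x_price = 60, 1100
gaming_gain = 1.65       # ~60-70% faster in games (midpoint)
threaded_gain = 5.0      # ~4-6x in rendering/encoding (midpoint)

price_ratio = i980x_price / e3300_price
print(price_ratio)                   # ~18x the money
print(threaded_gain / price_ratio)   # ~0.27 -- only worth it if time is money
print(gaming_gain / price_ratio)     # ~0.09 -- terrible value for gaming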
Sorry, it's easy to forget how much performance you can buy today for a pittance relative to only a few years ago.