Quote: Wings take up the same space on your costume as a cape (or a trench coat), so you can't have both, but any wings you've got, no matter where you got them, are available as soon as you get access. So a 15-month vet, for example, can make a brand-new level 1 character with angelic wings, but can't put a cape on that character, even when they reach level 20, unless they remove those wings in order to open up the "back item" slot on their costume.
Added upon.
-
Another thing.
In CoV you will need to see your initial contact way back on Mercy, to get the cape and later the aura mission.
When you unlock a costume slot, the costume in the first slot is copied over. So if you don't want to pay twice for a cape, do the cape mission first, modify your first costume slot, and then do the mission to unlock the next slot. It depends on whether you want a caped and a cape-less look, or both looks with capes. -
Ah, no. The HD 5770 is a little less powerful than the GTX 260, NOT the 275.
The HD 5850 ranges from about the same performance as the GTX 285 to slightly faster. The HD 5870 is faster still.
Then there's the power issue. The HD 5xxx series does use less power than nVidia's high-end cards. The HD 5770 maxes out around 115 watts of 12-volt power, the HD 5870 around 212 watts. Considering the HD 5770 GPU is essentially half of an HD 5870 GPU (800 SPs vs 1600 SPs), this sort of makes sense.
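As a rough back-of-envelope (a sketch only, using the wattage and shader counts above; actual results depend on clocks, memory bandwidth and the game itself):

```python
# Crude comparison of the HD 5770 and HD 5870 using only the figures above.
cards = {
    # name: (stream processors, approx. max 12V board power in watts)
    "HD 5770": (800, 115),
    "HD 5870": (1600, 212),
}

sp_small, w_small = cards["HD 5770"]
sp_big, w_big = cards["HD 5870"]

print(f"Shader ratio (5870/5770): {sp_big / sp_small:.1f}x")  # 2.0x the SPs
print(f"Power ratio  (5870/5770): {w_big / w_small:.2f}x")    # ~1.84x the power
# Twice the shader units for less than twice the power; measured game
# performance (the 60-70% figure below) shows the scaling isn't linear.
```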
In general game performance, we're looking at the HD 5870 being 60-70% faster than the HD 5770. -
I would double check the motherboard manufacturer's web site to make sure your motherboard can handle the current crop of Core 2 CPUs. Usually it's just a BIOS update.
I assume you want to run Ultra Mode, which you don't need to do for Issue 17. It's not a requirement.
As long as you don't run other memory hungry programs while playing the game, 2GB is enough for now. But if you ever do bump that up to 4GB, along with a 1GB video card, you should really consider going to 64-bit Windows 7.
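For what it's worth, here's the arithmetic behind the 64-bit recommendation (a sketch; the 1GB aperture and MMIO figures are illustrative and vary by board and BIOS):

```python
GiB = 1024 ** 3

address_space = 4 * GiB      # everything a 32-bit OS can physically address
video_aperture = 1 * GiB     # a 1GB video card mapped below 4GiB (illustrative)
other_mmio = 0.25 * GiB      # chipset, PCI devices, BIOS, etc. (illustrative)

usable_ram = address_space - video_aperture - other_mmio
print(f"RAM visible to 32-bit Windows: ~{usable_ram / GiB:.2f} GiB of 4 GiB installed")
# Roughly 2.75 GiB here -- the rest of a 4GB kit is unaddressable, which is
# why 4GB of RAM plus a 1GB card pushes you toward 64-bit Windows 7.
```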
I concur with Flame, video card first, CPU second, memory third. -
This is what happens when they copy it from the previous box (which was copied from the box before that, etc.) as opposed to what's on the support site.
Smack. Bad web monkey. No Latte for you. -
I would suggest you PM The Ocho, Niviene or Avatea and point them back to this thread and tell them that a lot of you EU players are still having problems.
-
The HD 3200 isn't a graphics card but integrated graphics built into the motherboard chipset. It'll work, but don't expect OK performance at anything but minimum graphics settings. Well, I guess that all depends on what you think OK is.
-
Only see one, and that one is out of stock. Your next problem is powering it.
-
Quick drive by post.
Welcome to the game.
Note that most missions are instanced. Between that and travel powers, base teleporters and the various other means of getting to and from those instanced missions, you will not see a lot of players roaming the streets fighting crime. It can give the appearance that there aren't a lot of players in game.
The two most populated servers are Virtue and Freedom in the US. -
I really hate it when I can't sleep.
Psyte, the large picture listing power requirements, etc. is a Photoshop job, and a poor one at that. Compare the 6 in 600 to the 6 in 6-pin: different font. It looks to have been a GTX 280 picture originally.
Edit: Nope, my bad. It's a pic from a box mock-up at CeBIT. Still, it's a bad Photoshop job by the guy whose job was to do the mock-up. -
I damn well hope so.
See the first post in the thread. Posi says GTX 260 will allow UM at medium settings. With a fast, modern multicore CPU and 4GB of memory I think the rest of the rig is more than capable. -
Quote: (The MSI board in this Father Xmas build did come with 3 header blocks, which helped immensely...)
So..... I have everything all assembled, except for the video card and RAM, which are still on their way. I got an OEM copy of Windows 7 Home Premium, and on the sleeve it mentions it must be installed using an "OEM Preinstallation Kit".
Is this an actual piece of software I need, a set of instructions... or?
I have installed Win XP and Win 7 before and never had to do anything special besides insert the CD/DVD and follow the prompts.
Is this something I need to worry about?
Thank you for any help!
I'm talking about the power, reset and LED headers, not the USB/audio/SATA cables from the case. If you look closely at each cable header, they should be marked with a + side and/or a - side. Polarity counts with LEDs.
If MSI did take a page out of Asus's book by providing a block to connect the case headers to first, then great. I didn't see anything in the documentation or reviews mentioning them.
That software allows an OEM to set up Win 7 so that when the customer first boots it, they can complete the registration in their own name. So it's not a problem.
Make sure the CPU fan is plugged in. The CPU should shut itself down with no damage if it overheats anyway, but better safe than sorry.
Hopefully the video card is on its way and not simply backordered. -
There is also an alternative explanation to the reported performance numbers. The card is so powerful that games are now CPU bound.
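A toy illustration of what "CPU bound" means for benchmarks (the millisecond numbers are made up, just to show the shape of it):

```python
def fps(cpu_ms, gpu_ms):
    # Per-frame time is roughly the longer of the CPU's and the GPU's work.
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 12.0                          # game logic, draw-call submission, etc.
print(fps(cpu_ms, gpu_ms=10.0))        # ~83 fps, the GPU already has headroom
print(fps(cpu_ms, gpu_ms=5.0))         # still ~83 fps with a GPU twice as fast
# Once the CPU is the long pole, the benchmark numbers stop separating cards.
```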
We'll see what the mass tech media thinks of the card once it comes off NDA and is benchmarked on a dozen different tech sites: whether the card and GPU are a bomb or the bomb. -
Quote: Part of me is tilting my head on this... and I'm going to try to explain why. The GT 220 and other chips are built off of the GT200 architecture, the same architecture behind the GTX series. Back when the GTX series launched, several sites, such as Beyond3D and BitTech, looked at the known information and concluded that the base architectures of the G80 and GT200 were pretty much the same.
Ergo, for a chip derived off of the GT200 series, I'm not entirely convinced the GT 2xx series is new... so much as it is Nvidia doing what they pretty much did before: re-implement an existing solution in a new die.
Also, as the original G80 was based on programmable shaders, DX 10.1 wasn't exactly that big of a deal: http://www.extremetech.com/article2/...129TX1K0000532
(mental note for another thread: the extremetech link also relates to another thread about fallbacks between OpenGL and DirectX)
Pretty much the biggest deal(s) for DX 10.1 is that 4x AA is mandatory, and it forces 32-bit floating point precision. If you look at the rest of the details, the G80 architecture was pretty much capable of the support from the start: http://www.istartedsomething.com/200...-101-siggraph/
So I'm pretty much willing to stand by my statement that the GT series isn't actually a "new" chip, that it's just a reimplementation of the stuff Nvidia was already selling, just tweaked to sound more attractive to prospective buyers.
When you say die shrink, it implies that there was a version of the chip in the GT 220 that was produced and in cards, the same way that there was a G200 GPU and a G200b GPU, the G200b being the die shrink.
Yes the G8x, G9x, G2xx GPUs are similar in design. The GPUs in nVidia's 6xxx and 7xxx series were also similar to each other. The GPUs in ATI's HD 2xxx, 3xxx, 4xxx, 5xxx are similar to each other.
je_saist, you are painting with a broad brush by saying they are all similar. So are automobile engines on the surface. This is a 4 cylinder, that is a 6 cylinder. That one is a V8, this one is supercharged. The more you start digging, the more you find that what you may consider to be inconsequential differences in design actually do affect performance.
So what if the GPU in the GT 220 is based on the architectural elements of the big G200 GPU from the GTX 2xx series? It's still a new GPU, one that's a bit better than the 9500GT it replaced which was a bit better than the 8600GT it replaced.
The only "bad" thing about the GT216 and GT215 GPUs is that they are a lower end part. One that should have been available to the masses 12 months ago. Where's the "half of a GT 280" to replace the 9600GT/9800GT? -
Quote: Hey, not trying to throw the current conversation off topic or anything...
But, I managed to find the Nvidia SLI request form, for anyone who wants Ultra Mode SLI capable. I posted before with a link but here it is again, followed by the SLI Application Compatibility Request link.
http://forums.nvidia.com/index.php?s...&#entry1009215 - Discussion
http://www.slizone.com/object/sliapp_request.html - SLI Application Compatibility Request
Adding an SLI profile for this game is something we can do right now, or at least you were able to the last time I looked. That is only one small step in the SLI/Crossfire equation. All that link will do is include a multi-GPU profile instead of the single-GPU one as the default. ATI is more of an issue since they didn't (I don't know if this is still true) allow users to set up their own configurations, and their default mode is a simple subdivision of the screen between multiple GPUs as opposed to alternate frame or tile configurations.
Also, there is a list of "things to avoid doing in a 3D renderer" tips in the SLI software developer's guide, things that would otherwise limit the benefit of a multiple-GPU solution, so there may need to be modifications to the rendering engine itself. We don't know the level of help Paragon/Cryptic (since they wrote the original engine) received from nVidia and ATI when Ultra Mode features were being added (as well as fixing ATI quirks), or whether the subject even came up in their discussions.
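A deliberately crude model of one of those "avoid" items, reading back the previous frame's results, and why it hurts alternate frame rendering (the numbers are illustrative only):

```python
def afr_fps(frame_ms, gpus, depends_on_previous_frame):
    if depends_on_previous_frame:
        # Each frame has to wait for the previous one to finish, so the
        # extra GPU mostly sits idle.
        return 1000.0 / frame_ms
    # Independent frames pipeline across the GPUs.
    return gpus * (1000.0 / frame_ms)

print(afr_fps(20.0, gpus=2, depends_on_previous_frame=False))  # ~100 fps
print(afr_fps(20.0, gpus=2, depends_on_previous_frame=True))   # ~50 fps again
```
-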
Quote: I play CoH/V on a laptop with a 9800M GTS video card.
Any idea if this will run Ultra Mode? And if so, how well it's likely to do? I understand that we don't have hard data yet, but an idea would help.
Thanks.
Well, according to Notebookcheck's info on laptop GPUs, the 9800M GTS has similar performance to the desktop 8800GS/9600GSO/GT 240, which are all slower than the desktop 9600GT.
And since we really don't know the actual performance impact Ultra Mode has, combined with the current myriad of game settings as well as your preferred game resolution, etc., we really can't say, but it's likely on the no/not-well side. -
Quote: Okay, reimplemented then at a lower die size.
To me it's sort of like comparing the Radeon 9700 to the 9800, or I think more accurately, the 9800 to the X800. Yes, it's technically different... but really... it's the same basic architecture underneath.
No, it's not. It's an entirely new chip.
9400GT - 16 SPs, 8 Texture units, 8 ROP units
9500GT - 32 SPs, 16 Texture units, 8 ROP units
9600GSO - 48 SPs, 24 Texture units, 16 ROP units
GT 220 - 48 SPs, 16 Texture units, 8 ROP units
9800GT - 112 SPs, 56 Texture units, 16 ROP units
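A quick sanity check on those unit counts (this ignores clock speeds and memory bandwidth entirely, so treat it as a rough sketch, not a benchmark):

```python
gpus = {
    # name: (stream processors, texture units, ROPs)
    "9500GT":  (32, 16, 8),
    "9600GSO": (48, 24, 16),
    "GT 220":  (48, 16, 8),
    "9800GT":  (112, 56, 16),
}

sp, tex, rop = gpus["GT 220"]
sp9, tex9, rop9 = gpus["9800GT"]
print(f"GT 220 vs 9800GT shader units:  {sp / sp9:.0%}")    # ~43%
print(f"GT 220 vs 9800GT texture units: {tex / tex9:.0%}")  # ~29%
print(f"GT 220 vs 9800GT ROPs:          {rop / rop9:.0%}")  # 50%
# By unit counts alone the GT 220 sits somewhere between a third and a half
# of a 9800GT, which lines up with the "around half the performance"
# estimate elsewhere in the thread.
```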
On top of that, the GPU in the GT 220 does support Dx10.1, whereas the older G9x-based GPUs only support Dx10.
In other news, there are rumors that video card prices are going to creep up due to continuing 40nm GPU shortages, as well as RAM prices going up again. -
Quote: No. Not even close.
The GT220 is basically a rebadged GeForce 9400 GT. That's several steps below the starting point of a 9800 GTX / GTS 250.
No, it's not a rebadged anything. It's an entirely new chip. Technically it's a mash-up of the 9500GT and the 9600GSO, and the performance of the better ones (with GDDR3 memory) falls right in between those two.
je_saist is still right that the GT 220 is far below the 9800GT, somewhere around half the performance of the 9800GT. -
Again, the game comes with the original PhysX driver and library and is coded to use it. If you don't have a PhysX card, it is emulated in software. Exploding mailboxes, shell casings, swinging signs, leaves blown around by flybys, etc. are available to everyone, regardless of whether you have a PhysX card or an nVidia card that could run PhysX.
As long as the "AEGIA(TM) PhysX(TM) support" option is grayed out, you are running the CPU PhysX emulation and not hardware. The game's included driver does not recognize nVidia's GPU PhysX as a valid PhysX card. The game's included driver was an early 1.0 beta of PhysX. Ageia, before they were bought by nVidia, redid their drivers, and those were no longer compatible with this game. Even if we loaded their newer drivers, this game would still use the ones that came with it back when CoV first came out, the same ones that are still in the game's folder. The next slider, "Particle Physics Quality", adjusts how much of the PhysX effects we see.
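The behaviour described above boils down to a capability check with a CPU fallback. Here's a hypothetical sketch of that pattern; none of these names come from the actual Ageia/PhysX runtime:

```python
class BundledPhysicsRuntime:
    """Stand-in for the game's 1.0-era PhysX runtime (hypothetical names)."""

    def __init__(self, ageia_ppu_present: bool):
        # The bundled runtime only recognizes an Ageia PPU; an nVidia GPU
        # capable of PhysX simply isn't on its list, so the option stays
        # grayed out and the CPU path is used.
        self.use_hardware = ageia_ppu_present

    def simulate(self, particle_quality: int) -> str:
        backend = "Ageia PPU" if self.use_hardware else "CPU emulation"
        # "Particle Physics Quality" only scales how much debris is simulated,
        # whichever backend ends up doing the work (multiplier is illustrative).
        return f"{backend}: {particle_quality * 250} debris particles"

print(BundledPhysicsRuntime(ageia_ppu_present=False).simulate(particle_quality=3))
```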
Now, if the game had supported the last generation of Ageia's drivers, then when nVidia bought Ageia and incorporated a CUDA version of PhysX into their video drivers, the game would have been able to use them. But that didn't happen. This game was too early an adopter of PhysX and, like the first guy on the block who bought an HDTV, we find ourselves with a piece of technology that isn't quite the standard and is lacking in compatibility. -
If I am a winner, I permit NC Interactive, Inc. and NCsoft Europe Limited to use my name, likeness, photograph, hometown, and any comments that I may make about myself or this contest that I provide for advertising and promotional activities. I also certify that I am at least 13 years of age and am eligible to participate in this contest.
These copies of "Metahumans Online" I found behind the Games R Us burn surprisingly well. -
Waiting for the inevitable cries of "but I didn't have time/enough free transfers to move my characters back to their home server". Just you wait and see.
-
Yes, their "new" lower end GPUs support Dx10.1. This includes the G 210, GT 220, GT 240 and their renamed GT 3xx equivalent cards (yes nVidia is pulling another renumbering job). I don't follow the mobile GPU market that closely so I don't know which laptop GPUs now support Dx10.1.
-
-
There is still the original CPU emulator for PhysX in the game. What the game won't recognize is nVidia's GPU-powered PhysX.
So the question is: what's better, stealing CPU cycles for physics or stealing graphics performance (assuming one video card) for physics? Which will impact framerate the least?
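One way to frame that trade-off is a simple frame-budget comparison (all the millisecond costs here are invented for illustration):

```python
def frame_ms(cpu_ms, gpu_ms):
    # Per-frame time is roughly whichever side finishes last.
    return max(cpu_ms, gpu_ms)

base_cpu, base_gpu = 8.0, 14.0                         # a GPU-limited scene
physics_on_cpu = frame_ms(base_cpu + 4.0, base_gpu)    # 14 ms -> ~71 fps
physics_on_gpu = frame_ms(base_cpu, base_gpu + 4.0)    # 18 ms -> ~56 fps

print(f"CPU physics: {1000 / physics_on_cpu:.0f} fps, "
      f"GPU physics: {1000 / physics_on_gpu:.0f} fps")
# Here the CPU had spare cycles, so doing physics there is effectively free;
# flip the base numbers (a CPU-limited scene) and the answer flips with it.
```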