I never asked for money (for Fortune Teller pre-Ouro) but I did have teammates offer after the mission was done.
-
Well, be aware that the G310 nVidia video card is now out in some OEM systems. It is NOT based on the G300 GPU or anything even derivative of it. It's a 64-bit DDR2 based card with 16 SPs and DX10.1 support, and in all other respects the same as the G210, possibly yet another renaming.
-
Quote:Or magical curse (Ranma 1/2)Was it Hobo or H8mail? Maybe someone else entirely... anyway, one person used this for his "character" - an item which possessed its "users". So the different genders and whatnot make sense.
Genetic family anomaly (Futaba-Kun Change!)
Disguise to attract girls while searching for one in particular (Sailor Moon Stars)
Your little sister reprogrammed your pocket universe generator (Moldiver)
Then there's always the tomboy disguise. -
There is also the case where the enemy's regen tick kicks in just before your damage is applied. Combine that with how often the enemy's HP is updated, plus the rounding factor.
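A toy sketch of that ordering problem; the numbers, function name, and the regen-before-damage resolution order here are my own assumptions for illustration, not the game's actual mechanics:

```python
# Hypothetical illustration: if a regen tick resolves just before the
# damage packet, the target can end the round still standing. All
# numbers are made up.

def apply_round(hp, max_hp, regen_per_tick, damage, regen_first):
    """Resolve one update; the order of regen vs. damage decides survival."""
    if regen_first:
        hp = min(max_hp, hp + regen_per_tick)
        hp -= damage
    else:
        hp -= damage
        if hp > 0:
            hp = min(max_hp, hp + regen_per_tick)
    return hp

# Damage lands first: target drops.
print(apply_round(hp=10, max_hp=100, regen_per_tick=15, damage=12, regen_first=False))  # -2
# Regen ticks first: target is still standing.
print(apply_round(hp=10, max_hp=100, regen_per_tick=15, damage=12, regen_first=True))   # 13
```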
This is when I end up using my origin power.
Me: SLICE
Them: Ha, still standing.
Me: Taser Dart
Them: THUD -
Quote:I would like to remind everyone that's the power of the entire system, running FurMark (an OpenGL graphics demo/GPU burn in software) and not just the video card. It's also the AC wattage as measured at the wall.I was PM'd this article: http://www.hardocp.com/article/2009/..._card_review/1
Performance is near identical. Temperature testing says that 5770 also goes over 80°C under full load, so no difference there. Power consumption is a huge difference, 270W under full load. So I'm leaning towards the 5770.
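To put the 270 W figure in perspective: it's AC wattage at the wall for the whole system, not the card's DC draw. A rough back-of-envelope conversion, assuming a typical (made-up) 82% PSU efficiency:

```python
# Wall (AC) wattage vs. actual DC load. The 82% efficiency figure is an
# assumption for a typical 80 PLUS unit, not a measured value.

def dc_load_from_wall(ac_watts, efficiency=0.82):
    """Estimate the DC load a PSU delivers for a given wall draw."""
    return ac_watts * efficiency

system_dc = dc_load_from_wall(270)  # whole-system DC load, ~221 W
card_dc = 80                        # approximate 5770 DC draw under FurMark
rest_of_system = system_dc - card_dc
print(round(system_dc, 1), round(rest_of_system, 1))
```

So even under FurMark, the video card itself is only a fraction of that 270 W wall reading.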
Thanks everyone for all the tips. I'm going to get the card tomorrow at 2pm, so still time to show me some more stuff to change my mind or cement my choice...
The actual DC power draw of a 5770 under FurMark is around 80 watts. -
You can see the effect if you set the task manager so it's "Always On Top", under Options.
-
Hey can't a guy take a night off?
Problem with DDR memory and AMD is that the memory controller is part of the CPU and not the motherboard.
Socket 754 and 939 CPUs have a DDR memory controller (single channel for 754, single/dual for 939)
Socket AM2/AM2+ CPUs have a DDR2 memory controller.
Socket AM3 CPUs have both a DDR2 and DDR3 memory controller which is why they are backward compatible with AM2+ sockets (although I believe the HT bus is slower in AM2+).
Over on the Intel side, during the Socket 775 era the memory controller was found on the Northbridge chip on the motherboard. While Intel and nVidia focused on DDR2 and DDR3 support over the last few years, there are a few motherboards that supported DDR or DDR2 using a Via Northbridge.
Of course, with Intel following in AMD's footsteps and now integrating the memory controller on the CPU as well with their Socket 1156 and 1366 CPUs (DDR3 only), the idea of preserving memory from one CPU era to the next is starting to fade away. -
TV advertising costs money, Blizzard has more money than God.
I hear Dubai called them up looking for a loan.
If NCSoft is willing to loosen the purse strings for advertising, I would expect it to be around the time of Going Rogue and not the holidays. Remember, sexy graphics sell. -
XP came out Oct 2001 and Microsoft says support is over April 2014 so that's roughly 12 1/2 years.
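That "roughly 12 1/2 years" checks out with a quick date difference (using XP's retail launch date of October 25, 2001 and the announced end of extended support on April 8, 2014):

```python
from datetime import date

# Windows XP retail launch and announced end of extended support.
launch = date(2001, 10, 25)
end_of_support = date(2014, 4, 8)

years = (end_of_support - launch).days / 365.25
print(round(years, 1))  # 12.5
```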
-
Yes, I'm thinking AFR which is what I believe the nVidia SLi profile is for this game. No idea what if any profile ATI uses.
-
Generally, SLi and Crossfire will show an advantage when the CPU is waiting on the GPU to finish the current frame before it can start another. The longer the wait, the greater the impact. If a single GPU is fast enough so the CPU wait is short enough, there should be little if any difference in performance.
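That argument can be sketched as a simple throughput model for alternate-frame rendering; this is a toy model with made-up frame times, not measured behavior:

```python
# Toy AFR scaling model: with alternate-frame rendering, n GPUs work on
# n frames in flight, so throughput is limited by the slower of the
# CPU's per-frame time and the GPU's per-frame time divided by n.

def fps(cpu_ms, gpu_ms, n_gpus=1):
    frame_ms = max(cpu_ms, gpu_ms / n_gpus)
    return 1000.0 / frame_ms

# GPU-bound scene: a second GPU nearly doubles the framerate.
print(round(fps(10, 40, 1)))  # 25
print(round(fps(10, 40, 2)))  # 50
# CPU-bound scene (single GPU already fast enough): no difference.
print(round(fps(10, 8, 1)))   # 100
print(round(fps(10, 8, 2)))   # 100
```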
There wasn't a lot of difference in the XP/Phenom 9600/HD4850 and a bit of difference in the Vista/Phenom 9600/HD4850 (around sample 13 through 18). An improvement in that range also showed up in the Win 7/i7-920/HD3870 graphs and the Vista/A64-6000/GTS250 graphs.
Whatever happens during that part of your test really strains the GPU. I don't see anything in your YouTube clip that looked out of the ordinary; perhaps it's just a lot of objects in the view coupled with a worst-case arrangement of those objects during that segment. -
Quote:Just wanted to say thanks and goodbye to all my forum friends. It's been wonderful to get to know all of you. You've all taught me so much. Unfortunately, I'm just not gaming as much and I can't see maintaining my account any longer.
So to all of you, again, Thanks!
-Wolf Shadow
Bye Wolf, you'll be missed. -
Quote:At some point the heat still needs to be transferred out of the liquid and into the air. Except now in these self contained liquid coolers the radiator mounted over the rear case fan has fins that resemble corrugated cardboard on edge that will get clogged. So you are just moving the problem from one spot to another.Considering the heatsinks that I've had to perform 'surgery' on to get the dust out of them... that would be more than enough reason for me to try liquid cooling.
-
Mine is simply my first true main character that I leveled up. Rolled him up within a week or two of starting the game. I remember getting schooled in Kings Row and Steel when Issue 2: Attack of the Roaming Eyeballs started.
-
First, there aren't any direct downloads.
Second, the updater will remember where it left off, so don't be afraid to stop it. You can stop the updater and restart it; it may end up connecting to a different download server.
Third, you could try modifying the shortcut to the updater (cohupdater) to force it to use a different port. Edit the Target line and add -port 13094 to the end (with a space before the -), as described here. The default port is 6994, which happens to fall in the usual default port range for BitTorrent clients, so there is a chance that a router located between you and the download servers has throttled the download. -
The $1200 build.
OK, quite a number of changes in this rig. The primary reason is that prices have gone up on some components, sometimes by a lot (DDR2 memory prices more than doubled), over $100 altogether. Plus a lot has changed in the last 10 months.
First I'm switching from a Socket 775 Core2Quad 9550 to a Socket 1156 Core i5-750. This in turn required me to change the motherboard, the memory and the heat sink. As strange as it sounds this configuration is $46 cheaper yet the i5-750 performs better.
I also dumped the sound card and put some of that money into a faster video card (GTX 275 in place of a GTX 260). The reason is that the motherboard uses a very nice integrated audio chip, and I believe the better framerate from a faster video card trumps any framerate gained by using a Creative gamer sound card. Also, to provide the additional power for this card, I decided to up the power supply from the Corsair 650 watt model to the 750 watt model. This also saves $35 compared to the previous design.
The case, hard drive, optical drive and thermal compound remain the same.
There is one other thing different about this build compared to the previous one. The motherboard supports SLi and Crossfire, where the previous one supported neither. It's not that I changed my personal opinion about multiple video cards; it's just that this build has always been designed with an enthusiast's system in mind. That is the reason for the low-latency RAM and the big third-party CPU heatsink: if someone wanted to, they could overclock without needing to upgrade anything. It is also another reason I upgraded the power supply; this one has four 6+2 PCIe power connectors to support two video cards. -
Didn't think you cared about multiple video cards, but I was confused since you listed the premium gamer motherboard from MSI.
My link problem: trying to bold them corrupted the URLs.
Everything looks good to me. -
1) Yes it should run some level of Ultra Mode (see Ironblade's comment).
2) Liquid cooling in a gamer PC is like having a spoiler on the sports trim package of a car: it's expected, but that doesn't mean it does anything out of the ordinary. A lot of the boutique gamer companies are adopting these no-muss, no-fuss self-contained CPU liquid cooling systems. However, I've yet to see a review that shows they outperform traditional tower heatsink coolers.
3) To keep the price at $1300 while keeping their profit reasonable, they offer only 3GB of memory (i7-9xx CPUs like memory modules in groups of 3); at least the upgrade to 6GB is surprisingly reasonable ($50, likely adding 3 additional 1GB modules rather than swapping the 1GB modules for 2GB ones). Doubling the standard memory on the video card is like the liquid cooling: it sounds "right" to your average magazine-taught PC gamer. In reality it will only make a difference in extremely graphics-intensive games at high resolutions and settings. However, it's questionable whether the framerate would be high enough to meet player expectations at those settings even with the extra memory. -
There was a nice and proper line in front of the Target I went to at 4:20AM, in the rain. At the local Walmart there was still a line outside at 6:30AM since they were only letting people in as others left.
Local news media reported a near riot at the nearby Toys R Us, in which the nicely formed line spontaneously disintegrated at midnight when the doors were supposed to open. Police were called, the line reformed, and the store opened an hour late. No reports of injuries that I heard of. A quick Google search suggests this wasn't the only Toys R Us where this happened. -
Yeah, most of those are P4s and the rest are the usual suspects for 65nm Core2s.
The reason Pricewatch is so expensive is that others like you are looking to do the same thing. -
First, none of your links will work because they are private links attached to your NewEgg account. That's OK; the Wish List ID can be cut and pasted into a public wishlist link. So if others want to see, here:
Foundation - http://secure.newegg.com/WishList/PublicWishDetail.aspx?WishListNumber=12267192
AMD - http://secure.newegg.com/WishList/PublicWishDetail.aspx?WishListNumber=11258865
Intel 1156 - http://secure.newegg.com/WishList/PublicWishDetail.aspx?WishListNumber=11258885
Let's start with the common parts. My only comment: the video card is the 192 SP version of the GTX 260, not the newer 216 SP version. As long as you are aware of that fact, then fine.
AMD vs Intel - Well, offhand, the i7-860 will eat the Phenom II X4 955 for lunch. The only reason the prices of these lists are remotely close is that you chose more expensive memory and a more expensive motherboard for the AMD build than for the Intel one.
First, let's take a look at the memory. I'm going to guess you prefer Corsair. The problem is that, compared to these, they are slower (CAS 9 vs CAS 7) and more expensive. Yes, the Crucial Tracer memory has the silly (IMO) das blinkenlights and ground-effect LEDs, but that doesn't change the fact that they have lower latency than the Corsairs and are still less expensive.
Are you aware that the motherboard in the AMD build can't do nVidia SLi, only ATI Crossfire? That's OK; neither can the motherboard from the Intel build, and its Crossfire support is poor (the 2nd video slot has 1/8th the bandwidth of the primary). I'm pointing this out because you are paying a premium on the MSI motherboard for its two x16 or four x8 Crossfire support. Using an nVidia card kind of makes all that superfluous. If you were interested just in overclocking the Phenom II and not extreme Crossfire configurations, then a motherboard based on a 790GX/SB750 configuration is a lot less expensive and still has Advanced Clock Calibration for improved overclocking.
Well those are my comments and thoughts. -
Well, since you have a Dell, it's doubtful that they have a BIOS update that would let you take advantage of the current crop of 1066MHz FSB CPUs from Intel (like the Pentium Dual Core 6xxx series or the Core2 7xxx series).
This leaves a decent PSU and a graphics card. Now, you don't need a lot of power (overall wattage), just the right kind of power (12-volt wattage). Two examples I can think of are the Antec EarthWatts EA500, with a maximum of 408 watts at 12 volts, and the OCZ ModXStream Pro OCZ500MXSP, with a maximum of 432 watts at 12 volts. Both of these have the two PCIe power connectors needed for either the GTX 260 or the HD 4870 (each uses two).
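Those 12-volt maximums come straight from the combined 12 V rail amperage; a quick sketch, with the amp figures back-solved from the wattages quoted above, so treat them as illustrative rather than taken from the spec sheets:

```python
# A PSU's usable gaming capacity is mostly its combined 12 V output:
# watts = combined 12 V amps * 12 volts. Amp figures are back-solved
# from the wattages quoted above, not copied from a label.

def twelve_volt_watts(amps):
    return amps * 12

print(twelve_volt_watts(34))  # Antec EA500:    34 A -> 408 W
print(twelve_volt_watts(36))  # OCZ500MXSP:     36 A -> 432 W
```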
As for which video card, well, it depends on whether you mind playing this game with the current caveats around AA and high-end shading effects; see BillZBubba's ATI settings sticky to see what I'm talking about (though since you already have an HD 4670, you're already aware). They perform about the same, but the ATI card tends to be less expensive. Either would be a significant improvement (for games in general) over what you currently have. -
For your perusal.
Issue 12, new zone set far into the past.
Issue 13, Day Jobs, patrol XP (bonuses for characters who have been offline)
Issue 14, Player created missions.
Issue 15, Minor changes
Issue 16, Power customization, automatic sidekicking/exemplaring, new difficulty system -
Bring up the Task Manager (Ctrl-Alt-Del or click on the task bar). Select the Performance tab and look at the number next to Total in the box labeled Physical Memory (K); it's in kilobytes. 2GB is 2097152.
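The 2097152 figure is just the GB-to-KB conversion Task Manager uses:

```python
# Task Manager reports physical memory in kilobytes:
# 2 GB = 2 * 1024 MB = 2 * 1024 * 1024 KB.

def gb_to_kb(gb):
    return gb * 1024 * 1024

print(gb_to_kb(2))  # 2097152
```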
-
Hey, I remember when Miss Kitty originally posted this. I thought it was a clever and funny use of the CCG generator at the time. At least enough to save a copy of it.
Well out to do battle at Black Friday.