Since it's a laptop, there isn't much in the way of a graphics card he can get, Zom.
But you are right about the drivers; they're extremely old.
You are also right, Zom, about Du Hast needing to provide us with a HiJackThis report. -
Quote: There's a difference between supports and runs, which is something Kyle looked at with Dirt 2. The 5670 blows past the 240 when running the DX10 shader path, but wasn't exactly blazing fast when asked to do the tessellation from DX11.
Well of course; the card is doing hardware tessellation on top of all the same work done by the DX10 path. You don't get tessellation for free.
Quote: also, I just pointed to the link for it to be read.
Because there's another factor to consider: the Radeon HD 4850 is in the $100 bracket, and that's capable of cruising past the top-of-the-line 9800 GTX, which still sells at the $120 mark or higher.
That Sapphire model looks sweet at $100.
And BTW, you forgot to toss in the GTS 250 with the 9800 GTX+ cards; they are essentially the same. -
Quote: I was asked countless times "why should I buy a Core2Duo at 2GHz? My Pentium 4 is 3GHz!"
Quote: Yeah, trying to explain the vagaries of instruction pipeline efficiency to the terminally clueless is almost as much fun as banging your head off a wall while sticking your *BLEEP!* in a press compactor set to "agonizingly slow".
You can even use the age-old "it's simply more efficient," but I understand your pain trying to un-teach the megahertz myth to the P4 generation of consumers. -
There is quite a difference in power requirements between an HD 4350 and a GTS 250. Maybe you were running into an available-power problem.
-
Quote: Are we writing to a hardware engineer, or to a non-techie who wants to know what the speed difference will be? :P I know, you are aware you're nitpicking, but think of your audience here...
Yes, but that doesn't mean it can't be a teaching moment. All I can imagine is some young gamer bragging to his friends that his memory is clocked at over 3GHz simply because he's running a pair of DDR3-1600 clocked at 800MHz.
If you had simply said that dual-channel memory is twice as fast as single-channel, I wouldn't have batted an eye. However, once you started to use clock frequency as a substitute for total bandwidth, I had to say something. -
Quote: Example, in my system: DRAM Frequency 480MHz (due to a little overclocking), x2 because it's DDR2 memory = 960MHz; x2 because it's Dual Channel = 1920MHz final speed.
I'm going to nitpick here.
First, Leandro is right: DDR/DDR2/DDR3 memory actually uses a clock frequency that is half of the advertised speed. So DDR2-800 uses a 400MHz clock, DDR3-1333 uses a 666MHz clock, etc. This is because, when DDR memory first came out, it was decided to list its effective speed compared to the older SDR memory. You get the same theoretical maximum bandwidth out of SDR-400 as you do out of DDR-400, even though the DDR-400 memory is running at half the clock speed.
However, extending that doubling to multiple memory channels is simply wrong. While multi-channel memory can double (or triple, in the case of the Core i7-9xx CPUs) the memory bandwidth, no hardware engineer would remotely consider describing it as another doubling or tripling of the memory clock. That's because each channel has its own memory controller, operating at the actual clock speed of the memory. On the AMD Phenom series, the memory controllers can even be unganged, meaning one can write to memory while the other reads from it, for a minor performance gain.
The other way memory is described is by its maximum bandwidth: DDR-400 memory is also listed as PC-3200 (each stick is 64 bits, or 8 bytes, wide, and 8 bytes x 400MHz effective = 3200 MB/s), DDR2-800 is PC2-6400, DDR3-1333 is PC3-10600 (or PC3-10660 or PC3-10666, it depends), etc.
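To make the arithmetic concrete, here's a quick Python sketch of the math above (the function name is mine, purely for illustration): transfers per second times bus width gives per-channel bandwidth, and extra channels multiply bandwidth, never clock.
Code:
# Theoretical DDR bandwidth from the true clock, per the explanation above.
def ddr_bandwidth_mb_s(clock_mhz, channels=1):
    transfers_per_sec = clock_mhz * 2      # DDR: two transfers per clock
    per_channel = transfers_per_sec * 8    # each channel is 64 bits (8 bytes) wide
    return per_channel * channels          # channels multiply bandwidth, not clock

print(ddr_bandwidth_mb_s(200))              # DDR-400   -> 3200 MB/s (PC-3200)
print(ddr_bandwidth_mb_s(400))              # DDR2-800  -> 6400 MB/s (PC2-6400)
print(ddr_bandwidth_mb_s(666))              # DDR3-1333 -> 10656 MB/s (PC3-10600-ish)
print(ddr_bandwidth_mb_s(400, channels=2))  # dual-channel DDR2-800 -> 12800 MB/s, clock still 400MHz
Note the dual-channel call: the result doubles, but the clock argument never changes, which is the whole nitpick.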
-
I upgraded my parents' system (2.4GHz P4) from 512MB to 1.5GB. I kept the smaller pair where it was and installed the larger pair in the empty sockets.
As long as both pairs of memory can be configured to run at the same speed and timings, there really isn't a problem.
Now, I have seen motherboards that advertise extreme memory speeds, but only for one pair (or triplet) of memory; adding a second pair prevents such aggressive speeds.
There is also Command Rate, one of the more obscure timing values. I usually see this discussed more with AMD processors and motherboards, where one pair can be set for a Command Rate of 1 but two pairs can't. Here is an old article (DDR days) that benchmarks the differences between Command Rate and timings. Needless to say, in games with their settings cranked up, the video card is your limiting factor, not memory timings.
The thing is, more real memory is always better if the system ever needs to start hitting the swap file hard. It doesn't matter how much performance you may lose from single channel vs. dual, more conservative timings, or slower memory speeds; needing the swap file is a million times worse.
My parents' machine, for instance, needed to hit the swap file while booting up and logging in, due to the anti-virus, anti-malware, HP printer/scanner drivers, GPS utility, etc. loading. Once everything loaded, a good chunk of memory was freed up, but during the load process over 600MB was committed on a 512MB system. With 1.5GB now there's no slowdown on boot, plus more memory usable for drive caching.
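If you're curious whether a given box is in that situation, a rough sketch along these lines will show it (assuming the third-party psutil module is installed; the 90% threshold is an arbitrary number of mine):
Code:
# Rough check of memory pressure, in the spirit of the 600MB-committed-
# on-a-512MB-box example above. Requires the third-party psutil package.
import psutil

ram = psutil.virtual_memory()
swap = psutil.swap_memory()
print(f"RAM:  {ram.used / 2**20:.0f}MB used of {ram.total / 2**20:.0f}MB")
print(f"Swap: {swap.used / 2**20:.0f}MB used of {swap.total / 2**20:.0f}MB")

# If RAM is nearly full and swap is being leaned on, more real memory
# will do far more than faster timings ever could.
if ram.percent > 90 and swap.used > 0:
    print("Hitting the swap file; add RAM before worrying about timings.")
-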
I'm wondering how many people will end up needing to buy transfers because their character got stuck on a server they were only planning to visit.
-
If I remember correctly, the debt cap is about 40% of what it once was. If you are always in debt, then I would suggest that you reevaluate either your tactics or the difficulty setting for that character.
I can't play my sonic/electric defender the same way, and at the same difficulty, as my katana/regen scrapper (played with reckless abandon, aka scrapper lock). -
I pay $10 a month for dial-up, $8 if I pay a year in advance. Even then, the towns that host the phone banks are 15-20 miles away; at least they are in my local calling area (years ago, in the pre-Internet CompuServe days, they weren't). If I were in range to get DSL, the cheapest service is $30 a month for 768Kbps/128Kbps max. If the cable infrastructure were built out, the cheapest cable is $50 a month for 3Mbps/256Kbps. A hill is in the way (as well as the condo association) for satellite.
But since some people can get DSL or satellite, my zip code gets a big check mark as broadband-enabled in those government coverage figures, even though it's not available for everyone in town. This means that 10% figure is a best case; the actual number is a bit higher.
Japan, Korea, and most of Europe have much higher population densities, making broadband penetration easier and less expensive to install, and allowing much higher bit rates than here in the US. It doesn't help that DSL competition is set up to be non-existent and cable franchises are monopolies. -
I beg your pardon.
While, yes, the 2.8GHz E6300 does suffer from a lower-performing memory subsystem due to its 2MB of L2 cache and a "slow" 1066MHz FSB, it's not bad for an $80 CPU. A future $600 rig may include an AM3-based CPU, but the price of DDR3 is a problem.
And the GTS 250 isn't a slouch either. Yes, it's essentially been around since Dec 2007 as the 512MB 8800 GTS, but with a 14% OC. At only $130, though, it's tough to beat. The HD 5750 is a little better, but it generally costs a little more. -
To put the GT 220 performance in context.
GT220 ~ 9500GT ~ 8600GT, with the GT220 being slightly faster than the 9500GT, which is slightly faster than the 8600GT.
Just putting this info out there so we don't get people poo-pooing the GT220.
The two obvious settings that impact performance on a low-end card like this are screen resolution and AA mode. AF is relatively free (low impact on framerate) on modern GPUs, but you may want to experiment with 8x or even 4x AF settings. If you are spending a lot of time in CoV, or like to hang out in CoH zones with lots of water, then you could drop the water effects a notch or two from Flameshot's settings.
You have the CPU power for the settings that are CPU-intensive, physics and world detail, so you shouldn't need to play with those, and you have the video memory for the highest-quality textures.
Still, in the end it's all about experimenting with the settings, using /showfps 1 to see how much performance is gained or lost, and settling on what you consider a fair tradeoff between framerate and pretty. -
The $1200 build.
You know I really hate it when I'm pushed into changing something, not because of better technology or lowered prices. So please excuse the ranting nature of this post.
So in a bit under six weeks I find myself reworking the $1200 rig again.
First the positive changes.
I've ditched the old Cooler Master CM 690 case for the new Cooler Master CM 690 II Advance for $15 more. It has better cable management, a nifty SATA drive dock on the top and they moved the side fan to the top and made it a 140mm. Still has as many optional fan mounts as the old case. It's sort of Cooler Master's answer to the Antec Nine Hundred Two case but without the window.
Next I swapped the CPU cooler from the XigmaTek HDT-S1283 with the additional Socket 1156 adapter to the Cooler Master Hyper 212 Plus. Near identical performance but $18 and change less. So that pays for the case.
Now for the changes I was forced into.
It's widely known that nVidia is in a spot of trouble right now. Their next-gen chip is late, and with ATI's introduction of the HD 5xxx series, nVidia's GTX line simply doesn't have the performance to compete at ATI's price points. Whether or not that has anything to do with the GTX 275 being perpetually out of stock, I don't know, but finding one from the "big name" manufacturers is next to impossible. Maybe they are just really popular. Anyways, I hate listing something you can't get.
So I needed a new video card, and I didn't like the idea of going back to the GTX 260/216 and sacrificing the performance the last build had. This means I'm going with the ATI HD 5850 for $40 more. It's faster and cheaper than the GTX 285, and with the rumored changes to the CoH/V graphics engine for Going Rogue also resolving the outstanding "quirks" that the game has with ATI cards in general, I think it's now a safe alternative to take.
Lastly, I need to change the motherboard, as Gigabyte seems to have dropped the GA-P55-UD4P for the GA-P55A-UD4P. I love Gigabyte. They produce high-quality motherboards at reasonable price points. They are recommended on loads of tech sites and are universally praised. However, even a company with an excellent reputation occasionally takes a misstep, and Gigabyte has with their current line of P55A variants of their P55 motherboards.
Their P55A motherboards add 6Gb/s SATA and USB 3.0 support via an additional chip on the motherboard. 6Gb/s SATA should help performance with future SSDs; it doesn't really help conventional hard drives, which are just now barely beyond 1.5Gb/s SATA speeds. USB 3.0 will definitely help with external HDs (but you would need a USB 3.0 external HD; it does nothing for current ones). The problem is how Gigabyte added this chip to the motherboard. The chip needs a single PCIe v2.0 lane to work, and the only source of PCIe v2.0 lanes is the 16 on the CPU that are normally connected to the video card(s). So if you enable either 6Gb/s SATA or USB 3.0, you lose not only SLi and Crossfire compatibility, but the primary video card slot drops from x16 to x8. On top of that, Gigabyte charges $20-$30 more, depending on the model, for the privilege.
<RANT>
Now, that wouldn't be a problem, but the older, higher-end, non-P55A motherboards seem to have been unofficially discontinued, since nobody is carrying them anymore. So we have a bad engineering "hack" on an otherwise good motherboard, one that no right-minded gamer will turn on but everyone is charged for, and then they drop the original motherboard line without this feature. It's a gamer stupidity tax.
</RANT>
So until Gigabyte comes up with a better solution, like Asus did with their P7P55D-E series of motherboards, I am, with a heavy heart, staying away from Gigabyte's entire P55A line of motherboards. So does this mean I'm going Asus? No; since I got rid of the dedicated sound card last November, I want a motherboard with better audio quality than what's found on the Asus. So I'm going with the MSI P55-GD65. It performs well, gets good reviews, uses the same sound chip, and, if memory serves, is actually $10 cheaper than the GA-P55-UD4P.
Of course as soon as I post this a container ship full of GA-P55-UD4P and GTX 275s will dock in Long Beach and the parts that I was forced to change will once again be back in stock. -
I take it that you are looking for a card for that new Dell you are getting welshtom.
Things to remember.
1) Is your model a full-width mini-tower case, or is it one of the slimline versions of the Optiplex line? If it's a thin case, it will severely limit your options.
2) Power supplies that come with business-class computers are not really that hefty; all the manufacturer expects a business to do is add more memory and hard drives. This is important because video cards that excel at 3D use considerably more power than cards that do not, which limits your options as well. On top of that, the smaller, slimmer variants of the Optiplex have even smaller (in wattage) power supplies than the mini tower, and considerably fewer options if you consider upgrading them. -
Recent is a relative term. The X1300 came out 4th quarter 2005 as a low end card. The Dual likely refers to dual monitor support.
Here are its basic specs. Memory is DDR2. Motherboard is either Socket AM2 or AM2+. -
Oh, and to clear up a few things.
1) There is no such thing as a P55A chipset. P55A is a designation that Gigabyte uses for their line of stupidly hacked motherboards that support 6Gb/s SATA and USB 3.0 by sacrificing not only SLi and Crossfire support but also turning the primary video slot into a x8 slot. Sure, the board will act just like their P55-designated motherboards if you disable the 6Gb/s SATA and USB 3.0, but then why pay $20-30 more? Maybe someday they will come out with a "P55B" motherboard that isn't crippled when 6Gb/s SATA and USB 3.0 are enabled. Also, maybe late this year or early next, Intel will introduce a Socket 1156 chipset that doesn't need an add-on chip for 6Gb/s SATA and USB 3.0.
2) "The Windows 7 RC will stop working on June 1, 2010. After that date, your PC will stop working and it may be difficult to recover your files. If you're running the RC version of Windows 7, please be prepared to reinstall a prior version of Windows or the final version of Windows 7 before June 1, 2010."
3) A good inexpensive Socket 1156 heatsink is the Cooler Master Hyper 212 Plus for $30. Review -
No complaints here. The i5-750 is a better quad core than the older Core 2 Q9550.
You're already 50% over your original budget, and 2x1GB of DDR3-1600 CAS 8 is another $60-70. If you really want more than 4GB, you might as well spring for the additional $30-40 for another 4GB instead of just 2GB. -
Quote: I repeat, SSD is by far the best bang for the machine. 64GB is more than enough; install CoH on it and any other disk-intensive game. You will not notice much while playing; monsters will appear quicker. The big difference will be at login and zoning, where the machine is pulling large numbers of different textures and really seeking hard. That is the SSD's spot. You will notice that you will always be the first into a zone on a team, changing zones will be faster, and so will places where you approach large numbers of different textures (like Atlas Park during a costume contest).
Honestly, an SSD? Give it another year or two before they are finally ready for the mainstream (and the prices drop from their extreme 30x premium per GB).
SSDs are simply the current status symbol among hardcore PC gamers: an expensive knick-knack whose owners delude themselves into believing it's worth the expense, while quantifiable benchmarking shows no such "extreme" improvement.
I'm not saying that they are useless, just that the $300 is better spent on other items. -
Quote: The standard "rule of thumb" is that you should look for a laptop with 1) a dedicated graphics card, and 2) a card that has a 6 or higher as the second digit in the model number.
The "6 or higher" rule isn't as pertinent anymore after a couple of rounds of nVidia's renaming game.
Over at NotebookCheck they classify mobile GPU performance. I wouldn't go with anything less than what they bin as Class 1 or 2. -
See, here are the problems.
1) Intel isn't making any "new" and more "powerful" CPUs for Socket 775. Instead, they are introducing low-end Core 2 CPUs with smaller caches and, in the case of the dual cores, slower FSB speeds.
2) The Socket 1156 and Socket 1366 CPUs are DDR3 only. This means you will need to replace RAM as well. (BTW, is that one stick of 4GB and one of 2GB or a pair of 2GB and a pair of 1GB?).
3) Next week Intel will be introducing their "replacement" for office and home CPUs, the Core i5-6xx and Core i3-5xx. They are dual core but allow two threads per core, so the OS sees them as quad cores. The i5-6xx has Intel's Turbo Boost, which will automatically bump the clock speed up; the i3-5xx doesn't. They use yet another new motherboard chipset (H55/H57) and reportedly do NOT officially allow a multiple-GPU setup, unlike the Core i5-7xx or Core i7-8xx (thank you, Intel, for a clear, straightforward numbering scheme /sarcasm).
So right now there isn't really a $300 upgrade you can do to your rig. Twice that could give you a Socket 1156 MB that supports both SLi and Crossfire, 4GB of reasonably fast DDR3 memory and a Core i5-750 true quad core.
Sure, you could go with something like this motherboard coupled with this CPU for $290 before sales and rebates. I'm not sure how much of an improvement that would be in terms of gaming. Most of the benchmarks here thrive on quads, so the slower quad wins hands down. The last four are game benchmarks, and they show that a game that can use more than two cores, like Far Cry 2, really shows an advantage. I should note that they aren't pushing the games at their highest quality settings, so they can highlight CPU differences; cranked to the max, there might not be any noticeable difference at all. -
Damn, he was bad enough for us dial-up people at the BM when he was on the other side of the docks.
-
If you are talking about the Virtual XP Mode found in the Pro and higher versions of Win 7, be aware that there are hardware requirements in both the CPU and the motherboard (BIOS).
For Intel processors, the majority of inexpensive Core 2-based CPUs don't have the necessary hardware support. This includes the E2xxx, E4xxx, E5xxx, and E7xxx dual cores, plus the Q8x00 and most Q9x00 quad cores.
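If you aren't sure about a particular chip, one quick way to check is the third-party py-cpuinfo module; a rough sketch, with the caveat that flag reporting can vary by platform ("vmx" is Intel VT-x, "svm" is AMD-V):
Code:
# Look for hardware virtualization support in the CPU's feature flags.
# Requires the third-party py-cpuinfo package (pip install py-cpuinfo).
import cpuinfo

flags = cpuinfo.get_cpu_info().get('flags', [])
if 'vmx' in flags or 'svm' in flags:
    print("CPU reports hardware virtualization support (VT-x/AMD-V).")
else:
    print("No VT-x/AMD-V flag; XP Mode's CPU requirement likely isn't met.")

# Even with the flag present, the BIOS can still have it disabled,
# which is why the motherboard (BIOS) requirement matters too.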
It doesn't matter much anyway, because Virtual XP Mode doesn't have 3D video support, so no 3D gaming.