Quote: And you think it is worth a power slot that is unslottable? The side effects are barely noticeable and only your secondary damage is affected, with no change to its base value... that's a huge cost for the benefit.
... wow.
Side effects are barely noticeable?
Yeah. Um. Would you mind spending more than 15 seconds with your avatar in game before running to the forums and making a post? -
It's one of the more pointless powers of the Dark Armor set, yes.
As a stealth on its own, it's not actually that stealthy. It has about the same stealth properties as Super Speed. You'd actually need to either take Super Speed (bad idea) or a pool power stealth in order to get it to a point where it's useful for walking around unnoticed. Super Speed is a bad idea since Dark Armor lacks knockback protection... and if you can afford IOs, Combat Jumping is a better holder for knockback protection procs... and if you can't afford IOs, there's Acrobatics in the Super Jump set.
As a defensive power, it's lousy too. Yes, it makes a good base for a Luck of the Gambler proc... but you are only gaining 5% base defense, and there's no defense debuff resistance in the Dark Armor set. So even if you take Weave and work the IOs to softcap, all it's going to take is one defense debuff attack to get through, and poof, no more defense. If you don't take Weave, or you don't work the IOs, your defense will be at 0% most of the time anyway if you run CoD.
Now, I can't actually prove beyond a shadow of a doubt that running Cloak of Darkness causes you to lose aggro. On my own Dark/EM and Dark/Fire tanks, running CoD does seem to make enemies scatter or lose interest in me... but there's a difference between perception and what actually happens.
The subject of Dark Armor was brought up to WarWitch in the Ask a Dev thread, but as far as I know, no response has been put forward to the issues raised with the set: http://zerias.blogspot.com/2009/09/c...-and-fire.html -
Quote: If the community hadn't decided they were more interested in treating it like forum PvP instead of coming up with a system so that good ideas got better, then it may have been. However, when we tried, people just really wanted a place to B.M.W., and that's what they got. The devs do look at it though, never fear. However, until people start making a widespread effort to promote good ideas more than argue over bad ones, the section is gonna stay where it is.
I hate to say this, but most games are like this when it comes to suggestions... My first real encounter with the... mindset... of the ludicrous was really Planetside. It wasn't uncommon to see junk ideas like making tanks that mimicked (the then-popular game) MechAssault topping the thread list. Sadly, the development staff SOE assigned to Planetside had a bad habit of implementing the bad ideas.
I seriously doubt that the Suggestions forum has a chance of moving anywhere in the near future, though. I still see horrible ideas like trying to shove martial arts into a Blaster secondary, a blast/defense archetype, equalizing Mastermind pets with the Mastermind, shoving Trick Arrow into Dual Pistols, allowing Sprint to buff Fly... and... and...
And I'm literally just reading off a list of what the top threads are right now in the Suggestions forum. I'm not making these bad ideas up. These are ideas and suggestions that are more likely to give the developers behind Posi (Mr. Miller), Castle (Mr. Grubb), and Brawler (Bruce!) choking fits rather than a pause accompanied by a "we should look at this more closely."
I'm also not sure that the idea of putting it into For Fun works as well as Ex Libris and Lighthouse intended. There still are quite a few players who argue about every suggestion as if any of the development staff were seriously going to weigh in on the matter. Then there are the players who are convinced that the more they ask about a particular feature, the more likely it is the developers are going to implement it: the "squeaky wheel gets the oil" mantra.
I honestly feel sorry for Ocho and Niv... keeping tabs on the suggestions forum has got to drive both of them bonkers. -
Everybody on the Test Server just got kicked! Cross your paws and wiggle your tails, I17 is closer!
-
Quote: What do u want me to do with the Toy Shop demo? Is it just a game?
Just run the Toy Shop executable file. It should start looping over and over. If it's a problem with your graphics card overheating, this should cause your computer to lock up the same way CoH is locking up.
It's also a heavy-duty stress test... so if the Toy Shop demo is able to loop endlessly, then there likely isn't a problem with your graphics driver or your hardware.
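If the Toy Shop download gives you trouble, the same sort of test can be cobbled together from a few lines of C. This is only a rough sketch, assuming the freeglut library is installed; it isn't the Toy Shop demo, it just applies the same idea: redraw as fast as possible and see whether the machine eventually locks up.

```c
/* A rough stand-in for the Toy Shop loop, assuming freeglut is
 * installed -- not the real demo, just the same idea: keep the GPU
 * busy in a tight redraw loop and watch for a lock-up. */
#include <GL/glut.h>

static void display(void)
{
    int i;
    glClear(GL_COLOR_BUFFER_BIT);
    /* a few thousand overlapping triangles for some fill-rate load */
    glBegin(GL_TRIANGLES);
    for (i = 0; i < 5000; i++) {
        glColor3f((i % 256) / 255.0f, 0.5f, 0.2f);
        glVertex2f(-1.0f, -1.0f);
        glVertex2f( 1.0f, -1.0f);
        glVertex2f( 0.0f,  1.0f);
    }
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(800, 600);
    glutCreateWindow("stress loop");
    glutDisplayFunc(display);
    glutIdleFunc(glutPostRedisplay);   /* redraw as fast as possible */
    glutMainLoop();
    return 0;
}
```
-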
Quote: After all this, I'm still freezing!!!!!!!
Any last ideas to get this fixed without getting a whole new laptop???
Maybe I did something wrong? I don't know...
I've got another question for you: is this the only game you play on this laptop?
A thought just bubbled up after Red Valkyrja's post. While Nvidia is the one known for the exploding laptops... it's possible that a fan has died in your laptop, and that it's overheating and locking up...
Try something out like QuakeLive or looping something like this demo: http://developer.amd.com/Downloads/a...yshop-v1.2.exe
(you may need to login and register on AMD's site to download the Toy Shop demo) -
Quote: je_saist, could you give Legion_of_One a hand down one thread?
Oh, I sent him a PM about it. The crashing described is pretty much consistent with a particular vendor's 64-bit driver support. However, since I've been rather... harsh... about that vendor, and even more so in recent posts, I was trying to stay out of that thread.
-
Quote: Okay after doing all of this, one more slight problem. "The setup has detected that version 2.08.1110 of CCC is already installed. This setup installs an earlier version of CCC (2.008.1028.2133). You will have to uninstall the previous version before installing this version".
How do I do this?
You should be able to uninstall the Catalyst Control Center through the Programs and Features section of the Windows Control Panel.
What's puzzling me here though is how you managed to get an updated Catalyst Control Center installed... -
-
Quote: After downloading it, it tells me I have ATI Mobility Radeon X1300
GPU: M52
Technology: 90nm
Release Date: 2006
Transistors: 105m
Device ID: 1002 - 7149

Okay. Open up the Driver Sweeper program I had you get earlier. You should see a check box for each driver vendor. Put a check in the ATi and Nvidia boxes, then click on Clean.

***
From this point, to install your "new" driver, you'll need to follow the instructions listed here: http://www.hardwareheaven.com/modtool.php -
Quote: The first driver didn't work.. I'll visit the site but I don't think I'm going to know what to do. By the way, it said it couldn't find a driver that is compatible with my hardware. Any help?
Which driver am I supposed to download? Intel 945G Express?
And was I supposed to do something with the Driver Fetch after I downloaded it? I just scanned and then did nothing else.
.. okay. Let's see if we can find out what you HAVE first.
Download and run this utility: http://www.techpowerup.com/downloads...-Z_v0.3.9.html
It should tell us what kind of graphics card you have in this laptop.
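For the curious, the vendor and renderer strings GPU-Z digs up can also be pulled straight from the OpenGL driver. This is just a quick sketch in C, assuming freeglut is available purely to get a context up; it's an illustration of the idea, not a replacement for GPU-Z.

```c
/* Ask the OpenGL driver to identify the hardware it is driving.
 * Assumes freeglut purely to create a context; glGetString() only
 * works once a context exists. */
#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("gpu id");
    printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("Driver:   %s\n", (const char *)glGetString(GL_VERSION));
    return 0;
}
```
-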
According to this information: http://www.notebookreview.com/default.asp?newsID=2767 :: http://www-307.ibm.com/pc/support/si...IGR-62487.html
The T60 never had an Nvidia graphics card. IBM says it only shipped with an Intel integrated graphics chip or ATi chips.
Quote: * Intel Graphics Media Accelerator 950
* 64MB ATI Mobility RADEON X1300
* 128MB ATI Mobility RADEON X1400
* 256MB ATI Mobility FireGL V5200
* 256MB ATI Mobility FireGL V5250
You'll need 3 tools to install a "new" driver.
First, you'll need Driver Sweeper: http://www.guru3d.com/category/driversweeper/
This will clean out all of the driver files that are in the computer. Since you have tried to install Nvidia drivers, you'll need to clear both the Nvidia drivers and the ATi drivers, and I'd clear the Intel graphics drivers too.
Second, you'll need mobility modder: http://www.techbeta.org/hardware/ati-mobility-modder/
This will mod the Catalyst driver for installation.
Third, you'll need a driver. I'm going to presume you have the X1300, X1400, or Intel GMA 950, since those were the most common configurations in consumer T60s.
ATi's last (supporting) driver release was back in March 2009, the 9-3 driver: http://support.amd.com/us/gpudownloa...?&lang=English
***
Now, if you have the Intel GPU, you'll need to visit Intel's site: http://www.intel.com/products/chipsets/gma950/index.htm -
Quote: That would be me, for one. (Their "support" on my Radeon 7000 series made me decide never to buy them again at the time.)
However, they've since had several changes - including being purchased by AMD. Enough changes that, yes, I've decided to give them another shot in my new rig (as opposed to putting off finishing it until something decent from nVidia is actually in stock) and so far, I've been happy.
*actually chokes*
If you call what they did back in the Rage days and the original Radeon days support... you'd be perfect for helping to sell call-center services situated in India to US corporations.
Really, ATi's turnaround from heel to hero can be traced to two events:
ATi's purchase of ArtX, which directly led to the Radeon 9700 release and the resulting Catalyst overhaul.
AMD's purchase of ATi, which led to a complete OpenGL overhaul, www.x.org/docs/AMD, www.radeonhd.org, another overhaul of the Catalyst driver set (the 2D/3D driver source is now shared between all driver sets), as well as hiring this guy.
Anyways, I hope you stay happy MB
-
Nvidia's mobile chips are almost never as powerful as the desktop versions. You might be able to play in Ultra Mode depending on what the resolution scaling is.
-
Quote: No it's not. It's an entirely new chip.
9400GT - 16 SPs, 8 Texture units, 8 ROP units
9500GT - 32 SPs, 16 Texture units, 8 ROP units
9600GSO - 48 SPs, 24 Texture units, 16 ROP units
GT 220 - 48 SPs, 16 Texture units, 8 ROP units
9800GT - 112 SPs, 56 Texture units, 16 ROP units
On top of that, the GPU in the GT 220 does support DX10.1, where the older G9x-based GPUs only support DX10.
In other news, there are rumors that video card prices are going to creep up due to continuing 40nm GPU shortages as well as RAM prices going up again.
Part of me is tilting my head on this... and I'm going to try to explain why. The GT 220 and other chips are built off of the GT200 architecture, the same architecture behind the GTX series. Back when the GTX series launched, several sites, such as Beyond3D and Bit-Tech, looked at the known information and concluded that the base architectures of the G80 and GT200 were pretty much the same.
Ergo, for a chip derived off of the GT200 series, I'm not entirely convinced the GT 2xx series is new... so much as it's Nvidia doing what they've pretty much done before: re-implementing an existing solution in a new die.
Also, as the original G80 was based on programmable shaders, DX 10.1 wasn't exactly that big of a deal: http://www.extremetech.com/article2/...129TX1K0000532
(mental note for another thread: the ExtremeTech link also relates to another thread about fallbacks between OpenGL and DirectX)
Pretty much the biggest deals for DX 10.1 are that 4x AA is mandatory and that it forces 32-bit floating point precision. If you look at the rest of the details, the G80 architecture was pretty much capable of supporting that from the start: http://www.istartedsomething.com/200...-101-siggraph/
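As an aside on what "mandatory 4x AA" looks like from a program's point of view, here's the rough OpenGL-side equivalent in C: ask the driver for a 4-sample render target and check what it actually granted. This assumes GLEW has been initialized on an existing context, and the function name is just for illustration; it's not anything CoH or the drivers actually do internally.

```c
/* Request a 4-sample renderbuffer and verify the sample count the
 * driver actually granted. Assumes glewInit() has run on a live
 * GL 3.x-class context. */
#include <GL/glew.h>

GLuint make_4x_msaa_target(int width, int height)
{
    GLuint rbo = 0;
    GLint  samples = 0;

    glGenRenderbuffers(1, &rbo);
    glBindRenderbuffer(GL_RENDERBUFFER, rbo);
    glRenderbufferStorageMultisample(GL_RENDERBUFFER, 4, GL_RGBA8, width, height);

    /* drivers may silently clamp the sample count, so query it back */
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_SAMPLES, &samples);
    return (samples >= 4) ? rbo : 0;
}
```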
So I'm pretty much willing to stand by my statement that the GT 2xx series isn't actually a "new" chip; it's just a reimplementation of the stuff Nvidia was already selling, tweaked to sound more attractive to prospective buyers. -
oog. http://hardocp.com/news/2010/02/26/f...icing_reported :: http://www.digitimes.com/news/a20100225PD202.html
Quote: The sources expect Nvidia will give supply priority to first-tier makers or makers that only produce Nvidia cards. Listings offering cards from XFX and PNY with GTX 480 models priced at around US$679.99 and GTX 470 at about US$499.99 were briefly available at some online retailers, but have since been taken down. PNY contacted Digitimes to stress that it is not offering pre-orders for Fermi cards at this time, which implies that the listings were the result of miscommunication in the company's channel.
It looks like the pricing leaks were/are correct. Nvidia seriously wants you (the consumer) to pay a full $100 more for a card that's slower than the competitor's $400 offering... and a whopping $280 more for a card that on average is only 5%~8% faster in real games.
Is anybody else but me thinking Nvidia execs are slipping Superdine into their kool-aid? -
Quote: No it's not a rebadged anything. Entirely new chip. Technically it's a mash-up of the 9500GT and the 9600GSO, and the better ones (with GDDR3 memory) fall right in between their performance.
je_saist is still right that the GT 220 is far below the 9800GT. Somewhere around 1/2 the performance of the 9800GT.
Okay: reimplemented, then, at a lower die size.
To me it's sort of like comparing the Radeon 9700 to the 9800, or, I think more accurately, the 9800 to the X800. Yes, it's technically different... but really... it's the same basic architecture underneath. -
Quote: I do think, however, that DirectX was good for the PC gaming scene to begin with. There were so many different engines and so many different performance differences from one game to the next that something had to be done. Some games didn't like this version of a driver, other games demanded you had this one, if you had this driver combined with this sound card driver the game would crash, etc. etc. It was also a good thing as it gave game developers something to aim for. It still wasn't ideal; the graphics available on an Xbox 360 show what is possible when you have a fixed spec and you can program and develop to its advantages, rather than having to make your graphics scalable and the engine run on hundreds or even thousands of different specifications that could be classed as a "PC".
Oh, I'm not going to argue that. For a while, Microsoft was actually the good guy for gaming. The DirectX API integration was something that the *nix world had never thought of, and even today developers get confused over what APIs and development tool kits are available and what can be used to reach the most users. (If you haven't seen the video at this link, take the time to sit through it.)
Granted, I am of the opinion that publishers can take the API confusion problem into their own hands. It's also my expectation that at least two game publishers involved with Khronos will probably force that issue.
Quote: Microsoft decided as a business step not to launch DirectX 10 onto Windows XP purely to push sales of it; the same reason Halo 2 PC was DX10-only, when it was proved it could be run on Windows XP, and indeed it was cracked to do just that. There are still very few pure DX10 games, as far too many people still run Windows XP! IMO there will never be a pure DX11 game either, and most probably no one will ever make a pure DX-version-only game ever again, MS game studios excepted, purely to drive the next OS!
Ai. I'm reminded of that line from Star Wars... the one about systems slipping through tightening fingers.
***
Quote: DirectX 9 is listed as a minimum requirement for CoH/V for the GUI and sound, not the 3D rendering.
Part of me wonders if the developers are looking at reworking the sound engine to use OpenSL or OpenAL... That would ("help") solve some of the 3D sound issues that occur on Cider / Cedega, and probably address some of the looping audio problems that still crop up on Windows every now and then.
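Purely as an illustration of what "use OpenAL" would mean at the code level, here's a minimal sketch in C, assuming the openal-soft headers. It only opens the default device and positions one source in 3D space; no game audio is involved, and nothing here reflects how the CoH sound engine is actually put together.

```c
/* Open the default OpenAL device, create a context, and place a
 * single source in 3D space. Assumes OpenAL (e.g. openal-soft). */
#include <stdio.h>
#include <AL/al.h>
#include <AL/alc.h>

int main(void)
{
    ALCdevice  *dev = alcOpenDevice(NULL);   /* default output device */
    ALCcontext *ctx;
    ALuint      source;

    if (!dev) { fprintf(stderr, "no OpenAL device found\n"); return 1; }

    ctx = alcCreateContext(dev, NULL);
    alcMakeContextCurrent(ctx);

    alGenSources(1, &source);
    /* 3D positioning is the point of the exercise */
    alSource3f(source, AL_POSITION, 1.0f, 0.0f, -2.0f);

    alDeleteSources(1, &source);
    alcMakeContextCurrent(NULL);
    alcDestroyContext(ctx);
    alcCloseDevice(dev);
    return 0;
}
```
-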
Anybody else find it interesting that Golden Girl objects to being described as Tiny, but not to being pictured as a psychopathic weapon of mass geometry deformation and single targeted tackling?
-
Quote: Any idea why Microsoft keep pushing to restrict DirectX for XP? Or is it just to push people to buy Windows 7?
Other than that Microsoft just loves to be in control of everything?
My opinion is that Microsoft does utilize DirectX as leverage on game developers. As far as I'm aware, and anybody who's actually more familiar with the DirectX API can answer this... previous API specifications aren't always implemented in the current implementation.
With OpenGL, the fallback rendering path is supposed to be part of the OpenGL driver. The idea is that if you make API calls that the hardware does not support, OpenGL just does not run those calls, but still builds the scene anyway. There are some API calls that are deprecated: http://www.gremedy.com/tutorial/_add...sarydeprecated :: Although this is how I understand the fallback process, that doesn't mean I'm right here, or that this is how it actually winds up working. Somebody who actually has experience writing to the OpenGL APIs is better qualified to speak on how the fallbacks actually work.
With DirectX... the memory that sticks in my head comes from Half-Life 2. At the time Half-Life 2 launched, Valve Software said something about having to maintain separate rendering paths for DirectX 9, DirectX 8.1, and DirectX 7 support. They couldn't just write one code path and let the driver / underlying system figure out what to display and what not to display.
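To make that a bit more concrete, here's a rough sketch of what maintaining those paths implies, in C against the Direct3D 9 headers: the application has to query the device caps itself and branch, because the runtime won't quietly degrade a DX9-class effect onto DX8-class hardware. The function name and the path numbers are just labels I made up for illustration.

```c
/* Decide which rendering path an engine would use by inspecting the
 * Direct3D 9 device caps. Assumes the DirectX 9 SDK headers; this is
 * an illustration of the pattern, not Valve's actual code. */
#include <windows.h>
#include <d3d9.h>

int pick_dx_path(IDirect3D9 *d3d)
{
    D3DCAPS9 caps;

    if (FAILED(IDirect3D9_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                                        D3DDEVTYPE_HAL, &caps)))
        return 7;                                  /* assume the DX7-class path */

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return 9;                                  /* DX9-class path: SM 2.0 shaders */
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return 8;                                  /* DX8.1-class path */
    return 7;                                      /* fixed-function fallback */
}
```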
***
Since DirectX puts an additional burden on graphics developers, there is a financial limit on just how much work can go into a project and still give a return. As OpenGL shows, the graphics API is not as integrated into the operating system as Microsoft would like everybody to believe. The graphics API of DirectX 10 was developed as an update for Windows XP to begin with, something Microsoft doesn't really like to talk about.
The... implication... is that Microsoft is using DirectX to force publishers into a hard spot. Either the publisher okays funding for coders to work the hours needed to maintain and support separate rendering paths... or... they don't. Microsoft's pressure on the publisher is what winds up putting pressure on the consumer.
There's no technical reason that I'm aware of that Microsoft cannot implement DX10 and DX11 atop Windows XP. The fact that OpenGL can render the exact same scene with the exact same image quality pretty much torpedoes any argument that the graphics API simply won't work at all.
***
The other aspect to think about here is Microsoft's business model. Microsoft's primary business model is built upon people buying new computers and replacing their operating system with new, paid-for versions. It's often referred to in the Linux communities as the Microsoft Tax. Steve Ballmer and Bill Gates are actually on record as stating that they believe all computers should ship with a copy of Windows. Microsoft is also on record, both directly and through the Business Software Alliance, as taking computer manufacturers who sell computers that don't include Windows... into court on charges of piracy. In that worldview, it's not possible that somebody makes a living running a shop that sells computers with no OS or with Linux... if you sell a computer without Windows installed and without paying Microsoft... well...
Really, Microsoft's business practices resemble something along the lines of the Capone outfit or the Winter Hill Gang. Which is probably also why Microsoft has been convicted of federal / state level crimes on five continents. (I'm not aware of any convictions in either Australia or Antarctica.)
Given that this is their business model, things start to fall apart when people don't buy completely new computers with Windows pre-installed, or don't buy Windows at all. So, Microsoft keeps trying to come up with ways to keep people buying. Remember, the original idea with Windows XP was that it would be licensed software, with consumers paying a yearly fee whether or not Microsoft updated the OS. Microsoft actually did launch a subscription-type service in 2008: http://arstechnica.com/microsoft/new...are-bundle.ars
***
From my point of view, more publishers and developers are coming to the realization that DirectX probably wasn't the way to go. I suspect Khronos is probably talking with some of the other major publishers about what they can do to get around the intentional restrictions put in place on Windows operating systems. Linux aside, OpenGL allows developers and publishers to do what Microsoft does not want them to do:
Reach the largest number of potential payers for the least cost. -
I'm afraid you got that backwards. Nvidia's the one who's been having driver issues. Please get out of 2003.
-
Quote: That's minimum support right now, though. There's nothing saying maximal settings won't require a later version of OpenGL to run Ultra Mode. That said, as far as I'm aware most recent cards and operating systems support all recent versions of OpenGL, so I don't see a problem with this.
The theory behind OpenGL support is that each successive API includes fallbacks for the older APIs. So if you write an application that uses, say, the tessellation extensions layered on top of OpenGL 3.x, the theory states that if the OpenGL driver finds your hardware does not support tessellation, it will render the scene without it. You should still get the same basic polygon / scene structure, though.
There are several threads, like this one, over on OpenGL.org about the fallback rendering paths.
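As a rough sketch of how an application usually meets that fallback halfway, assuming GLEW and an already-created context: query what the driver advertises and pick a pipeline accordingly. The two-path split, the enum, and the function name here are made up for illustration; this isn't how Ultra Mode is actually wired up.

```c
/* Pick a render path based on what the context reports. Assumes GLEW
 * has been initialized on a live context; on pre-3.0 drivers the
 * integer version query fails and leaves major at 0, which safely
 * lands us on the plain path. */
#include <GL/glew.h>

typedef enum { PATH_TESSELLATED, PATH_PLAIN } render_path;

render_path select_path(void)
{
    GLint major = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);

    /* tessellation shaders need GL 4.0, or the ARB extension on older contexts */
    if (major >= 4 || GLEW_ARB_tessellation_shader)
        return PATH_TESSELLATED;

    /* same meshes, same scene -- just no tessellation stages attached */
    return PATH_PLAIN;
}
```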
Quote: Quick question: Whose design was OpenGL? I remember the "OpenGL vs 3DFX" debates back in the day between nVidia and Voodoo, if I recall correctly, and it seems that 3DFX died in that face-off, but was OpenGL an nVidia design, or did they just use that at the time?
OpenGL was originally developed and implemented by the professional graphics company SGI: http://www.sgi.com/products/software/opengl/
3DFX used a re-implementation of a subset of the full OpenGL command list in their Glide API. 3DFX's hardware was built to process this limited command set entirely in hardware, which was also why it was so fast compared to its competitors: http://www.linuxselfhelp.com/HOWTO/3Dfx-HOWTO-7.html
Nvidia's support of OpenGL was a little bit more robust. Rather than simply reimplementing the specification, they just supported OpenGL directly. However, Nvidia achieved fame for their proprietary extensions to OpenGL, one of the advantages of the API: vendors could make up their own extensions, which they didn't necessarily have to tell other OpenGL developers about. The most famous of these proprietary extensions was UltraShadow, which was used extensively in the Doom 3 engine to make Nvidia's GeForce FX look good compared to the Radeons of the time.
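For a feel of how those vendor extensions get used in code, here's a sketch in C, assuming GLEW; if memory serves, UltraShadow surfaced to developers as the EXT_depth_bounds_test extension. The function name is made up. The point is the pattern: enable the extension only when the driver advertises it, and render normally, just less efficiently, when it doesn't.

```c
/* Use an EXT/vendor extension only when the driver advertises it.
 * Assumes GLEW and a live context; zmin/zmax bound the light's depth
 * range so shadow-volume fragments outside it are discarded early. */
#include <GL/glew.h>

void limit_shadow_volume_work(double zmin, double zmax)
{
    if (GLEW_EXT_depth_bounds_test) {
        glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
        glDepthBoundsEXT(zmin, zmax);
    }
    /* without the extension nothing breaks -- the shadow pass simply
       shades fragments it could otherwise have skipped */
}
```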
***
OpenGL began to fade out in 2002 and 2003 for two very good reasons. Microsoft's DirectX API had leapfrogged OpenGL by a good measure with DirectX 9, and ATi's Radeon 9500 and 9700 series were perfectly capable of processing DirectX 9 commands fast enough to be playable at high resolutions like 1024x768 and 1280x1024.
The professional graphics developer 3Dlabs pushed the development of OpenGL 2.0 forward while the Architecture Review Board was basically twiddling its fan blades.
Starting with the Geforce 6000 series, we also saw Nvidia take a step back from OpenGL and work more on DirectX support... largely because although 3Dlabs was pushing for an updated OpenGL specification, the Architecture Review Board was still pretty much playing Phil Keaggy's Doin' Nothin'.
It wasn't until 2007 that things began to change for OpenGL, with the Khronos Group basically doing what 3DFX had done years before and making a list of gaming-specific commands from the full OpenGL API: http://www.opengl.org/pipeline/article/vol004_2/
It's expected that 3DFX's legacy of hand-selecting gaming-specific calls will continue with the expected OpenGL 3.0 ES or 3.2 ES specifications. Rumor has it that Activision Blizzard, EA, and Transgaming had quite a bit of input on what's expected to be the next short list of OpenGL 3.x gaming-specific calls.
***
I hope that answered the question :P -
No. The game is purportedly built against OpenGL 3.0, based on the starting point for support being the 9800 GTX / GTS 250 and the Radeon HD 4850.
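If anybody wants to see what their own machine reports against that OpenGL 3.0 starting point, here's a quick sketch in C. It parses the version string the driver returns, which works on every OpenGL version, and assumes freeglut just to get a context; it's a guess at the kind of check involved, not the game's actual code.

```c
/* Report whether the current context meets an OpenGL 3.0 minimum by
 * parsing the GL_VERSION string. Assumes freeglut only to create the
 * context; the 3.0 cutoff is the one claimed above. */
#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    int major = 0, minor = 0;
    const char *ver;

    glutInit(&argc, argv);
    glutCreateWindow("version check");

    ver = (const char *)glGetString(GL_VERSION);
    if (!ver || sscanf(ver, "%d.%d", &major, &minor) != 2) {
        fprintf(stderr, "could not read the OpenGL version string\n");
        return 1;
    }
    printf("Driver reports OpenGL %d.%d -- %s\n", major, minor,
           (major >= 3) ? "meets the 3.0 starting point"
                        : "below the 3.0 starting point");
    return 0;
}
```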
