Scyntech

Informant
  • Posts: 11
  • Joined

  1. 10 whole days?...wowwy that's pretty lame
  2. The costume changes are nice, but pretty much everything else in this pack reads "Fantasy Pack" and not "Mutant Pack".

    The "mutant" idea really seemed to be an afterthought.

    The lack of costume parts is disappointing as well. The face for Bioluminescent is just horrible. The "Joker" mouth seems really out of place, along with the cat eyes. I was expecting something that screamed mutant. This screams... sort of... maybe... we thought a little about it being mutant. What's with the heels on the boots?

    The secondary mutant powers are a nice feature, but other than that and the costume changes, I wouldn't have bought the pack. Luckily it was free. I guess they're allowed one bad apple in the bunch.
  3. I want to point out that I changed my settings back to what I originally had prior to the latest patch.


    I am currently using Nvidia driver 195.62 (which I was told was out of date), and I am back at 25-40 FPS.

    This includes a fresh reinstall of the game. Nvidia drivers 197.45 and 257.15 Beta did nothing to correct the lag issue. That does not mean drivers 197.45 and 257.15 Beta will not work properly.

    I really do not know what caused the issue in the first place, unless some of us have corrupted files caused by the recent patch, or at the very least files not rewriting as they should.
  4. Tech support is telling me it's a driver issue... they obviously did not read my reply... looking for the overly big "rolleyes" smiley.

    I did reinstall today and used Nvidia drivers 195.62, 197.45, and 257.15 Beta... nothing.

    Something in the latest patch messed things up. Multiple people having the same issue suggests that.
  5. Ok, I just finished 2 hours of settings changes. I would set the f/x to default, restart, then adjust.

    The game literally showed different settings each time I opened "advanced" from default. Hmmm...that should not happen.

    Anyway, I have to get back to work. I'll spend another 2 hours another day tweaking my settings.
  6. I have noticed "Shader Quality" will not change when clicked on. It stays at "High".


    UPDATE: I tried resetting as suggested and noticed no difference. Still having severe lag. I don't know; I guess I'll wait and see.


    2nd UPDATE: Logged into my friend's account; same issues.
  7. Same here. I went from a consistent 40+ FPS down to 5-15 FPS! I tried new drivers and different settings. Nothing is working.

    It is not my system; it is the newest patch. Crysis runs at 40-ish FPS, COD4 is at 50-60 FPS, and Aion is at 60-110 FPS at max settings.

    Something is torqued in this new patch.

    Now, my friend who runs a near-identical system but has Windows 7 is running fine. I guess we wait for them to fix this, or transfer the game to a better engine... which should have been done in the first place.
  8. Quote:
    Originally Posted by je_saist View Post
    Oh, it was skewed on purpose, but again, apparently I wasn't clear enough on what I was skewing, or why.

    I have a distaste for vendors launching accessories for a product that's not even on the market with the expectation of using that product as a springboard for sales, or in order to further the hype surrounding that product. I also have a distaste for corporations that deliberately misspend customers' or clients' money.

    I used the example of the Bugatti Veyron because it's an example of a product that turned out to be very good. It is the fastest road car in a straight line. However, it was preceded and accompanied by a rash of products that simply latched onto the Veyron name, such as an aftershave and a custom watch... accessory products that were junk.

    Now, in the specific case of Thermaltake, Nvidia, and Fermi, there are still several questions to be asked. Nobody knows yet whether or not Fermi is actually going to be any good for gaming. We can infer from Nvidia's reluctance to talk about clock-speeds, and the continual pushing back of the launch date, that Fermi isn't exactly all it's cracked up to be. We also now know that everything at Nvidia's late 2009 conference was faked in regards to Fermi, and that hurts Nvidia's credibility by a large amount.

    What we don't know is whether Thermaltake licensed (paid money to Nvidia for) the right to slap a Fermi-Certified sticker on one of their cases, jack the price up, and make that money back on users buying a case because it's certified for Fermi.

    What we do know is that Thermaltake probably hasn't had any time with actual Fermi silicon, so we can be pretty sure that the thermal requirements the case is designed to meet are based on the thermal requirements quoted by Nvidia. This poses an interesting scenario. What if Nvidia is paying Thermaltake to put the Fermi name on an existing product with a few mods, hoping that the finalized Fermi product meets the design limitations, and that Nvidia can make money back on royalties from products sold bearing the Fermi-Certified moniker?

    In this scenario we have Nvidia spending money they actually do have, but in a way they should not. Nvidia is already in enough trouble with the likes of Asus, Acer, Aopen, FoxConn, and Clevo for having outright lied about the thermal properties of the G7x, G8x, G9x, and, on initial reports, the GT2x series of chips in both desktop and mobile formats. Nvidia's lack of trustworthiness in detailing the aspects of their chips is commonly referred to as "Bumpgate".

    Now, in all fairness, given the thermal properties of the recent high-end cards from both ATi and Nvidia, nobody in their right mind is going to try to shove a high-end Fermi card into a chassis from Dell, Hewlett-Packard, or Acer's Gateway and eMachines divisions. Most gamers interested in Fermi are probably going to have properly designed cases with proper airflow. Not really a big deal.

    However, the situation with Thermaltake does raise some other questions, such as the one I brought up in the first post on this particular subject. Does Nvidia seriously intend to make money off of either licensing Fermi-Ready Certifications to vendors, or by receiving royalties back on sold products bearing the Fermi-Ready moniker? How many more products are we going to see bearing the Fermi-Ready or Fermi-certified badges over the next 3 months as Nvidia and TSMC presumably ramp up production of Fermi based cards?

    There's also another huge problem facing Nvidia right now. Fermi is a megachip, with some 3 billion-odd transistors. As far as we know, Nvidia hasn't designed a low-end or mid-range product off of the Fermi architecture. As is, Nvidia has only "now" released low-end and mid-range parts based on its GT2x series of chips... which was really just a re-implemented version of the architecture used in the G8x chips.

    Now, this might not mean much to the common user until I give the names of two graphics card parts from the past.

    GeforceFX
    Radeon x800

    Nvidia had originally planned to launch the GeforceFX series of cards against ATi's Radeon 9500-9700 range of graphics cards. However, ATi, with the help of ArtX, had blown Nvidia out of the water. The Radeon 9500 and 9700 graphics cards were perfectly capable of accelerating DirectX 9 games at 30 fps, even at the dizzying resolutions of 1280*1024 and 1440*900. Nvidia basically had to take the GeforceFX back into the lab and add in DirectX 9 extensions. Unfortunately, by the time Nvidia actually got the GeforceFX out the door in the second quarter of 2003, they were no longer dealing with the original Radeon 9x00 lineup... they had to deal with the Radeon 9600 and 9800 series, which were clock-bumped and more efficient. The result was a disaster for Nvidia. Their only success in the GeforceFX lineup was the GeforceFX 5200, which was a popular OEM card.

    Things changed with the next lineup of cards, though. Nvidia launched the Geforce 6x00 series of cards in the second quarter of 2004. These cards featured DirectX 9.0c / OpenGL 2.0 support. ATi, on the other hand, was fielding the x800 and x850 lineups... which were DirectX 9.0b.

    Okay, in all fairness, most games never actually used the additional features in either the 9.0b or 9.0c versions of DirectX... and OpenGL 2.0 pretty much only matched the base DirectX 9. It wouldn't match or pass DX9.0c until the 2.1 revision, which came much later.

    Still, the marketing damage was done. Nvidia was able to market the 6x00 range's full DX9.0c compatibility and win back most, if not all, of the market share they had lost in the previous round.

    Now, we're coming up to a similar situation. AMD has an entire line-up of cards coming out from the low-end OEM market to the high end gaming market that are all DirectX 11 / OpenGL 3.2 compatible.

    Nvidia has a DX11 / OpenGL 3.2 compatible card for the high-end market... but not for the low-end or mid-range markets. Rather, from what we know of unofficial information, Nvidia intends to keep using the G9x architecture and its GT2x respins in the low-end market. These aren't DX11 / OpenGL 3.2 parts...

    Okay, in all fairness, we, as gamers, know that there's not actually that much visual difference between a game that's coded in DX9 / DX10 / DX11 / OpenGL 2.x / OpenGL 3.x. Keeping in mind that the Playstation 3 and Xbox 360 carry roughly OpenGL 2.0-class GPUs, which is roughly equivalent to DX9, we know that all of the APIs can produce some amazing visuals. The question really becomes which can produce the best visuals at the best frame rate. The GTX lineup of cards, for the foreseeable future, is going to be able to deliver excellent frame rates with excellent image quality. Most game developers are not going to leverage the very latest graphics API exclusively. So far most engines, such as CryEngine, Unreal Engine, id Tech 4, id Tech 5, Source, the Torque Shader Engine, and Unigine, have offered different fallback levels of rendering.

    Case in point: the user on the forums with the workstation versions of the 7900 GT cards. Yes, they run most of today's hit games very fast... but they do so because they aren't running the games' best-image-quality API rendering path. The user pretty much never noticed.
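
    As a rough illustration of those "fallback levels of rendering", here is a minimal C++ sketch. Everything in it is hypothetical and invented for this example (the enum, the capability probe, and the feature-level numbers); it does not quote any real engine's code. The idea is simply that the engine asks what the hardware supports and steps down from the newest API path to an older one.

        #include <initializer_list>
        #include <iostream>

        // Minimal sketch of render-path fallback. All names and numbers
        // here are hypothetical, invented for illustration only.
        enum class RenderPath { DX11, DX10, DX9, None };

        // Stand-in for a real adapter/driver capability probe.
        bool supports(RenderPath path, int featureLevel) {
            switch (path) {
                case RenderPath::DX11: return featureLevel >= 11;
                case RenderPath::DX10: return featureLevel >= 10;
                case RenderPath::DX9:  return featureLevel >= 9;
                default:               return false;
            }
        }

        // Try the richest path first, then fall back one level at a time.
        RenderPath pickRenderPath(int featureLevel) {
            for (RenderPath p : { RenderPath::DX11, RenderPath::DX10, RenderPath::DX9 })
                if (supports(p, featureLevel))
                    return p;
            return RenderPath::None;
        }

        int main() {
            // A DX9-class card (like the workstation 7900 GTs above) still
            // gets a working path; it is just the lower-quality one.
            switch (pickRenderPath(9)) {
                case RenderPath::DX11: std::cout << "DX11 path\n"; break;
                case RenderPath::DX10: std::cout << "DX10 path\n"; break;
                case RenderPath::DX9:  std::cout << "DX9 path\n";  break;
                default:               std::cout << "no supported path\n"; break;
            }
        }

    That is why the 7900 GT user never noticed: the game picked a path his cards could run, and the output still looked fine.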

    ***

    So, why is this such a big deal if it doesn't really matter what product you buy?

    The simple answer is: it is a big deal to people who don't understand what's going on in the graphics card, or what goes on in a game engine.

    As both Nvidia and ATi found out the hard way, offering a product at the low end that doesn't have the same feature set as the product at the high end can be very painful in a financial sense. It doesn't matter if the RadeonHD 3200 can't actually run Crysis at 1024*768 using the DirectX 10 API rendering path. As long as ATi can put on the sales box that the RadeonHD 3200 can accelerate DX10 graphics, that's all users see.

    That's all the mass market cares about.

    That's also where all the money is. Nvidia and ATi make the majority of their money off of low-end parts sold under contract to OEMs and ODMs.

    ***

    Which is why the Thermaltake event irks me so much.

    If I were an executive within Nvidia, I wouldn't be spending a dime on any promotions, marketing, or rhetoric until I had working silicon in my hand that I could take to Kyle Bennett over in Texas to benchmark, or send to the people who run sites like Phoronix or Mepisguides to benchmark and examine. I'd be throwing money left and right to get mid-range and low-end parts of the new architecture ready for a day-one launch alongside the high-end part.

    I, however, don't work for Nvidia, so I can only speak on what I see from outside the company. What I see is Nvidia wasting money on potential marketing stunts, rather than tending to their core business.

    And that's what just torques me off.

    Nvidia does NOT have DirectX 11 yet, and will not until the 300 series hits the shelves. ATI is currently the only video card developer that does, starting with the 5xxx series.
  9. Hey wow, Steam! Great! Awesome! Now how about letting us buy some stuff? Fix them dang servers!!!!!
  10. Edit: I got it. Great, TT! =)

    Scyntech - VanGuard Unlimited