I don't post on the boards often, but this seemed to warrant some commentary. Also, I'm a lot more familiar with OpenGL than Direct3D, although I do have experience with both.
Quote:
Originally Posted by je_saist
The theory behind OpenGL support is that each successive API includes fallbacks for the older API's. So if you write an application that uses the Tessellation from OpenGL 3.2, theory states that if the OpenGL driver finds your hardware does not support Tessellation, it will render the scene without tessellation. You should still have the same basic polygon / structure build though.
There's several threads, like this one, over on OpenGL about the fallback rendering paths.
|
I go into a bit more detail below, but this isn't accurate. What's happening in this particular example is that the programmer has to explicitly check for two pieces of hardware functionality: Programmable Shaders (GLSL) and Multitexturing. If either doesn't exist, he then turns off that particular portion of the render path. The driver does not handle this task automatically; attempting to call functions the hardware does not support causes an error. Some of us have seen this in COH: if we try to run it on a machine with default drivers or a lousy video card, a message pops up to the effect of "Unsupported Extension: GL_ARB_MULTITEXTURE."
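To make that concrete, the check has to live in the application itself. Here's a minimal sketch in C++ of that kind of capability test - this is not COH's actual code, and the fallback decisions are just placeholders:

```cpp
// Sketch only: explicit capability checking before OpenGL 3.
// The application queries the extension string and searches it itself.
#include <GL/gl.h>
#include <cstring>
#include <cstdio>

static bool has_extension(const char *name)
{
    // Note: a real check would match whole tokens; strstr() can false-positive
    // on extensions whose names are prefixes of others.
    const char *ext = reinterpret_cast<const char *>(glGetString(GL_EXTENSIONS));
    return ext != nullptr && std::strstr(ext, name) != nullptr;
}

void choose_render_path()
{
    // Hypothetical flags the renderer would consult later.
    bool use_glsl         = has_extension("GL_ARB_shading_language_100");
    bool use_multitexture = has_extension("GL_ARB_multitexture");

    if (!use_multitexture)
    {
        // The driver will NOT quietly fall back for us; calling
        // glActiveTextureARB() on hardware without the extension is an error,
        // so the application has to disable that path itself.
        std::fprintf(stderr, "Unsupported Extension: GL_ARB_MULTITEXTURE\n");
    }
    if (!use_glsl)
    {
        // Fall back to the fixed function pipeline instead of shaders.
    }
}
```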
Quote:
Originally Posted by je_saist
It wasn't till 2007 that things began to change for OpenGL, with the Khronos group basically doing what 3DFX had done years before, and making a list of gaming specific commands from the full OpenGL API: http://www.opengl.org/pipeline/article/vol004_2/
It's expected that 3DFX's legacy of hand selecting gaming specific calls will continue with the expected OpenGL 3.0 ES or 3.2 ES specifications. Rumor has it that Activision Blizzard, EA, and Transgaming had quite a bit of input on what's expected to be the next short list of OpenGL 3.x gaming specific calls.
|
OpenGL ES (Embedded Systems) is not a set of gaming-specific calls. It is a subset of OpenGL used for low-powered devices: cell phones, MP3 players, portable gaming devices, refrigerators, and other embedded systems. It is used for games on cell phones and portable consoles, but the major use is actually for user interfaces. It's not intended for PC games, or for high-powered consoles like the PS3 / Xbox 360, where a more full-featured 3D API is appropriate.
Quote:
Originally Posted by je_saist
Other than Microsoft just loves to be in control of everything?
My opinion is that Microsoft does utitilize DirectX as leverage on game developers. As far as I'm aware, and anybody whose actually more familiar with the DirectX API can answer this... previous API specifications aren't always implemented in the Current implementation.
With OpenGL, the fallback rendering path is supposed to be part of the OpenGL driver. The idea is that if you make API calls that the hardware does not support, OpenGL just does not run those calls, but still builds the scene anyways. There are some API calls that are deprecated: http://www.gremedy.com/tutorial/_add...sarydeprecated :: Although this is how I understand the fallback process, that doesn't mean that I'm right here, or that this is how it actually winds up working. Somebody who actually has experience writing to the OpenGL API's is better qualified to speak on how the fallbacks actually work.
With DirectX... the memory that sticks in my head comes from Half Life 2. At the time Half Life 2 was launched Valve software said something about having to maintain a separate rendering path for DirectX 9 support, DirectX 8.1 support, and DirectX 7 support. They couldn't just write one coding path, and let the driver / underlying system figure out what to display / what not to display.
Since DirectX puts an additional burden on graphics developers, there is a financial limit on just how much work can go into a project that will give a return. As OpenGL shows, the graphics API is not as integrated as Microsoft would like to have everybody believe. The Graphics API of DirectX 10 was developed as an update for WindowsXP to begin with, something Microsoft doesn't really like to talk about.
The... implication... is that Microsoft is using DirectX to force publishers into a hard spot. Either the publisher okays funding for coders to work the hours needed to maintain and support separate rendering paths... or... they don't. Microsoft's pressure on the publisher is what winds up putting pressure on the consumer.
|
The Direct3D API does change from version to version, but the changes from D3D 9 to 10 were not that different from the changes between OpenGL 2.1 and 3, depending on how you define your OpenGL rendering context. Some function calls are added, some are changed, and others are dropped. If you are going to support D3D 9 and 10, you'll need separate render paths for each, as they have incompatible rendering contexts. If you create a D3D 9 context, you need to use D3D 9 calls, and the same is true of D3D 10. Performing a D3D 9 call on a D3D 10 context will fail.
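As a rough illustration (a sketch, not how Valve or anyone else actually structures it), the version split usually starts at device creation, and everything downstream branches on which path you got:

```cpp
// Sketch only: pick a Direct3D render path at startup.
// Assumes a Windows build with the DirectX SDK headers available.
#include <d3d10.h>
#include <d3d9.h>

enum RenderPath { PATH_NONE, PATH_D3D9, PATH_D3D10 };

RenderPath create_device(ID3D10Device **d3d10, IDirect3D9 **d3d9)
{
    // Try the D3D 10 path first (needs a D3D 10 capable card and Vista+).
    if (SUCCEEDED(D3D10CreateDevice(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL,
                                    0, D3D10_SDK_VERSION, d3d10)))
        return PATH_D3D10;

    // Otherwise fall back to a D3D 9 path. From here on, every draw call,
    // shader, and resource goes through whichever API we ended up with;
    // a D3D 9 call against a D3D 10 device simply isn't possible, as they
    // are different interfaces.
    *d3d9 = Direct3DCreate9(D3D_SDK_VERSION);
    return (*d3d9 != NULL) ? PATH_D3D9 : PATH_NONE;
}
```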
OpenGL 3 has a similar mechanism: if you define an OpenGL context to be "forward-compatible," you set the minimum version of compatibility - say 3.1 - and you're strictly held to OpenGL 3.1+ function calls. For instance, because the fixed function pipeline was deprecated in 3.0 and removed in 3.1, basic calls to it like glTranslatef() will throw an error. There is also a backwards-compatible context, where deprecated calls aren't removed.
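On Windows, requesting such a context looks roughly like this. It's a sketch using the WGL_ARB_create_context extension, and it assumes wglCreateContextAttribsARB has already been fetched through wglGetProcAddress() from a temporary legacy context:

```cpp
// Sketch: requesting a forward-compatible OpenGL 3.1 context on Windows.
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>   // WGL_ARB_create_context tokens and typedefs

HGLRC create_gl3_context(HDC dc,
                         PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB)
{
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 1,
        // Forward-compatible: deprecated entry points (glTranslatef(),
        // glBegin(), the matrix stack, ...) are removed outright.
        WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        0
    };
    return wglCreateContextAttribsARB(dc, NULL, attribs);
}
```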
However, the interaction between some deprecated OpenGL calls and the new OpenGL 3 calls is undefined. Thus, if you opt to use both the deprecated and new features, the action taken is whatever the driver feels like doing, which can be the right thing, but is not guaranteed to be. Because of this, it can be preferable to have a forward-compatible OpenGL 3 context, especially with new code or heavily optimized code. So you wind up with two separate render paths with OpenGL (one for 2.1, another for 3, if you are planning on supporting both), much like you would with different versions of Direct3D. Personally, I'm curious what route the devs choose for Ultra Mode.
The deprecation in OpenGL 3 is long overdue. The fixed function pipeline has been around since the late 80s (OpenGL itself since '92), and it's not indicative of how things are rendered anymore. Initially, the new features of OpenGL 3 were not going to have explicit backwards compatibility with 2.1 (Google "OpenGL Longs Peak" if interested in the history), but ultimately the Architecture Review Board decided to maintain backwards compatibility, at the cost of performance. The majority of fixed function pipeline calls carry a significant performance hit when rendering scenes with a large number of vertices.
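A quick sketch of why: the deprecated immediate-mode path pushes every vertex through a function call each frame, while a vertex buffer object keeps the data resident on the GPU and issues one draw call (buffer setup omitted; assumes an extension loader such as GLEW for the buffer entry points):

```cpp
// Sketch: deprecated immediate mode vs. a vertex buffer object.
#include <GL/glew.h>

// Immediate mode: one function call per vertex, re-sent every frame.
void draw_immediate(const float *verts, int vertex_count)
{
    glBegin(GL_TRIANGLES);
    for (int i = 0; i < vertex_count; ++i)
        glVertex3fv(&verts[i * 3]);
    glEnd();
}

// Buffer object: data already lives on the GPU, one draw call per frame.
// (Still GL 2.1-style client state here; a forward-compatible GL 3 context
// would use glVertexAttribPointer and a shader instead.)
void draw_buffered(GLuint vbo, int vertex_count)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (const void *)0);
    glDrawArrays(GL_TRIANGLES, 0, vertex_count);
    glDisableClientState(GL_VERTEX_ARRAY);
}
```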
Ultimately, OpenGL and Direct3D have different design philosophies. Like all APIs, they are slaves to the hardware. Which is "better" is really a matter of what the task at hand is.
Quote:
Originally Posted by je_saist
There's no technical reason that I'm aware of that Microsoft cannot implement DX10 and DX11 atop Windows Xp. That OpenGL can render the exact same scene with the exact same image quality pretty much torpedoes arguments that the graphics API won't work at all.
|
I managed to ask a Microsoft developer at SIGGRAPH about this a couple years ago, right after Vista was released. His claim was that the reasoning behind not porting DX10 (really Direct3D 10) to XP had to do with a design decision to integrate it with the Windows Vista Display Driver Model, which would also need to be ported, and doing that was supposedly deemed too much of an ABI change for a production OS. I've not actually been able to verify this anywhere else, so the guy could have been making this up. However, it did seem to be a reasonable explanation.
Quote:
Originally Posted by je_saist
I'm afraid you got that backwards. Nvidia's the one whose been having driver issues. Please get out of 2003.
|
From my perspective, Nvidia's drivers remain more stable than ATI's, at least for OpenGL development. Perhaps the better term would be more "forgiving," as they don't crash as often when I'm debugging (and usually sending bad data in some form or another to the card). As always, YMMV. ATI certainly has the hardware lead for now.
Anyhow, hopefully this helps clear things up a little.