This isn’t supposed to be a coding blog, but users do ask about DirectX vs. OpenGL, or sometimes start fights in the forums about which is better (and yes, my dad can beat up your dad!). In past posts I have tried to explain the relationship between OpenGL and DirectX and the effect of OpenGL versions on X-Plane.

OpenGL 4.0 was announced at the Game Developers Conference 2010, and it looks to me like they released the OpenGL 3.3 specs at almost exactly the same time. So…is there anything interesting here?

A Quick Response

In understanding OpenGL 4.0, let’s keep in mind how OpenGL works: OpenGL gains new capabilities through extensions. This is like a new item appearing on the menu at your favorite restaurant. Today we have two new specials: pickles in cream sauce and fried potatoes. Fortunately, you don’t have to order everything on the menu.

So what is OpenGL 4.0? It’s a collection of extensions: if an implementation has all of them, it can call itself 4.0. An application might not care. If we only want two of the 4.0 extensions, we’re just going to look for those two extensions, not sweat what “version number” we have.
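To make that concrete, here is a minimal sketch (not X-Plane’s actual code) of testing for one specific extension instead of a version number. It assumes a GL 3.0+ context, where the extension list is queried one string at a time; the has_extension helper is hypothetical:

    #include <stdbool.h>
    #include <string.h>
    #include <GL/glcorearb.h>  /* GL 3.0+ declarations; loading entry points is platform-specific */

    /* Hypothetical helper: returns true if the driver advertises the named
     * extension. GL 3.0+ exposes the list one string at a time; on older
     * contexts you would search glGetString(GL_EXTENSIONS) instead. */
    static bool has_extension(const char *name)
    {
        GLint count = 0;
        glGetIntegerv(GL_NUM_EXTENSIONS, &count);
        for (GLint i = 0; i < count; ++i)
        {
            const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
            if (ext && strcmp(ext, name) == 0)
                return true;
        }
        return false;
    }

    /* Usage: pick a code path per extension, not per version number. */
    /* if (has_extension("GL_ARB_instanced_arrays")) { ...instanced draw path... } */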

Now go back to OpenGL 3.0 and DirectX 10. When DX10 and the GeForce 8800 came out, NVidia published a series of OpenGL extensions that let OpenGL applications use “cool DirectX 10 tricks”. The problem: the extensions were all NVidia-specific. After a fairly long time, OpenGL’s Architecture Review Board (ARB) picked up the specs, and eventually most of them made it into OpenGL 3.0 and 3.1. The process was very slow and drawn out, with some of these “cool DirectX 10 tricks” only making it into “official” OpenGL now.

If there were OpenGL extensions for DirectX 10, who cares that the ARB was so slow to adopt the standards NVidia proposed? Well, I do. If NVidia proposes one extension and ATI proposes a different one, and the ARB doesn’t come up with a unified official extension, then applications like X-Plane have to carry different code for different video cards. Our workload doubles, and we can only put in half as many cool new features. Applications like X-Plane depend on unity among the vendors, via the ARB making “official” extensions.

So the most interesting thing about OpenGL 4.0 is how quickly they* made official ARB extensions for OpenGL that match DirectX 11’s capabilities. (NVidia hasn’t even managed to ship a DirectX 11 card yet, ATI’s HD5000 series has only been out for a few months, and OpenGL already has a spec.) OpenGL 4.0 exposes pretty much everything that is interesting in DirectX 11. By having official ARB extensions, developers like Laminar Research now know how we will take advantage of these new cards as we plan new features.

Things I Like

So are any of the new OpenGL 3.3 and 4.0 capabilities interesting? Well, there are three I like:

  1. Dual-source blending. Explaining what this is or why anyone would care is way beyond this blog, and it won’t show up as a new OBJ ATTRibute or anything. But this extension does make it possible to optimize some bottlenecks in the internal rendering engine. (There’s a sketch of the setup after this list.)

  2. Instancing. Instancing is the ability to draw a mesh more than once (with slight variations in each copy) using only one instruction to the graphics card. Since many games (X-Plane included) are limited by the CPU time spent talking to the graphics card (we are “CPU bound” when rendering), the ability to ask for more work with fewer requests is a huge win.

    There are a number of different ways to program “instancing” with OpenGL, but this particular extension is the one we prefer. It is not available on NVidia cards right now. So it’s nice to see it make it into the core spec – this is a signal that this particular draw path is considered important and will get attention. (A sketch of the call sequence follows the list.)

  3. The biggest feature in OpenGL 4.0 (and DirectX 11) is tessellation: the ability of the graphics card to turn a crude mesh with a few triangles into a detailed mesh with lots of triangles. You can see ATI demoing this capability here. (A sketch follows the list.)

There are a lot of other extensions that make up OpenGL 3.3 and 4.0, but those are the big three for us.
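For the curious, here’s roughly what dual-source blending looks like at the API level, assuming the extension in question is ARB_blend_func_extended (the one that went core in 3.3). This is a sketch, not X-Plane code; my_program and the output names are placeholders. The fragment shader exports two colors, and the second one replaces a fixed blend factor:

    /* Dual-source blending sketch (ARB_blend_func_extended, core in GL 3.3).
     * The fragment shader writes two outputs; output index 1 drives the
     * blend factors instead of a constant. All names are placeholders. */
    glBindFragDataLocationIndexed(my_program, 0, 0, "out_color");  /* source 0: the color  */
    glBindFragDataLocationIndexed(my_program, 0, 1, "out_blend");  /* source 1: the factor */
    glLinkProgram(my_program);  /* the bindings take effect at link time */

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC1_COLOR, GL_ONE_MINUS_SRC1_COLOR);  /* blend by the second shader output */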
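Instancing, assuming the preferred extension mentioned above is ARB_instanced_arrays (also core in 3.3), looks roughly like this; the buffer and attribute names are made up. One vertex attribute is marked as advancing once per instance instead of once per vertex, and a single call draws every copy:

    /* Instanced-arrays sketch: per-instance data lives in its own buffer,
     * and the divisor tells GL to step it once per instance, not per vertex. */
    glBindBuffer(GL_ARRAY_BUFFER, per_instance_offset_vbo);            /* placeholder VBO   */
    glEnableVertexAttribArray(offset_attrib);                          /* placeholder index */
    glVertexAttribPointer(offset_attrib, 3, GL_FLOAT, GL_FALSE, 0, 0);
    glVertexAttribDivisor(offset_attrib, 1);                           /* advance once per instance */

    /* One request to the card draws the same mesh 10,000 times: */
    glDrawArraysInstanced(GL_TRIANGLES, 0, mesh_vertex_count, 10000);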
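And tessellation (ARB_tessellation_shader, core in 4.0) in sketch form: the application submits coarse patches, and the new control and evaluation shader stages (attached to the placeholder tess_program) generate the dense mesh on the GPU:

    /* Tessellation sketch: draw GL_PATCHES and let the tessellation
     * control/evaluation shaders subdivide each patch on the card. */
    glUseProgram(tess_program);               /* program with TCS + TES stages attached  */
    glPatchParameteri(GL_PATCH_VERTICES, 3);  /* 3 control points per patch (a triangle) */
    glDrawArrays(GL_PATCHES, 0, coarse_vertex_count);  /* crude mesh in, detailed mesh out */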

* Who is “they” for OpenGL? Well, it’s the Architecture Review Board (ARB) and the Khronos Group, but in practice these groups are made up of employees from NVidia, ATI, Apple, Intel, and other companies, so it’s really a collective of the people involved in OpenGL. There’s a lot of input from the hardware vendors, but if you read the OpenGL extensions you’ll sometimes see game development studios get involved; Transgaming and Blizzard show up every now and then.

About Ben Supnik

Ben is a software engineer who works on X-Plane; he spends most of his days drinking coffee and swearing at the computer -- sometimes at the same time.

3 comments on “New Toys”

  1. Yay! I was starting to become a little concerned that we would not have a path toward tessellation or instancing. That's very exciting news.
    I'd like to cast my vote toward some procedurally based low-level terrain. Having instanced grass, rocks, etc., along with tessellated terrain will finally provide the crucial low-level detail for more precise and enjoyable helicopter flying.

    Great posts, I always enjoy them. Keep up the awesome and dedicated work.

  2. One question that I have after reading that exciting post, Ben, is this: How current does one's video card have to be in order to take advantage of OGL 3.3 and 4.0? Is it merely a cool driver update, or is it time to start saving again (after having dropped $400 on a brandy new GTX card a few months ago!! 😉)?

  3. Roughly speaking, DX10-class hardware should do the 3.3 stuff (dual-source blending, instancing) and DX11-class hardware should do tessellation. That would mean:

    3.3: NV GeForce 8, 100, 200, and 300 series; ATI Radeon HD 2000-4000

    4.0: NV 400 series (coming soon?) and ATI Radeon HD 5000

    Two bits of fine print:
    – Some Radeon HDs already have tessellation hardware; ATI's way ahead on that. But I don't know if that tessellation hardware is good enough to meet the DX11/GL4 spec, so we'll see whether the tessellation extension shows up on the HD 4000 series, for example. (This is a case where OpenGL is nice: if hardware has a one-off interesting feature, it can advertise that one extension even if it isn't full OGL 4 hardware.)
    – It is my understanding that the GF8800 does _not_ have the right hardware to do instanced arrays truly optimally, so the driver "fakes" this extension. This is moderately surprising to me, given that this hardware can do some other extensions that strike me as equivalent in difficulty. TBD how this shakes out.
