TL;DR: 10.36 works around a bug in the latest NVidia driver – let X-Plane auto-update and everything will just work the way it should.

X-Plane 10.36 is out now – it’s a quick patch of X-Plane 10.35 that works around what I believe to be a driver bug* in the new NVidia GeForce 352.86 drivers.

10.36 has been posted using our regular update process and has been pushed to Steam, so if you’re running 10.35, you’ll be prompted to update. The update is very small – about 10-12 MB of download.

With this patch, you can now run the latest NVidia drivers. I have no idea if those drivers are good (I have anecdotal reports that they’re both better and worse than the last drivers, but these kinds of reports often have a large ‘noise’ factor**).

We patched X-Plane because we can cut an X-Plane patch faster than NVidia can re-issue the driver, and the driver issue was causing X-Plane to not start at all for affected users, which was turning into a customer support mess. Past NVidia-specific patches have been to fix bugs in X-Plane, but in this case, we’re simply avoiding a pot-hole. I hope that NVidia will get their driver fixed relatively soon so that people installing from DVDs with older versions of the sim won’t be stuck.

Update: NVidia fixed this bug in the new 353.06 drivers!

[OpenGL, Windows 8.1 -x86/x64]: GLSL shader compile error. [1647324]

* The bug is that #defines defined within a function body don’t macro-substitute, but #defines outside a function body do. The work-around is to move some #defines out of function bodies. If anyone can find a reason why #defines can’t appear in function bodies, please shout at me – as far as I can tell it’s legal; it’s a pre-processor.

** We’ve had reports of huge fps improvements and losses on beta updates where we’ve made only cosmetic changes to the sim.

About Ben Supnik

Ben is a software engineer who works on X-Plane; he spends most of his days drinking coffee and swearing at the computer -- sometimes at the same time.

29 comments on “NVidia: 4 Ben: 1”

  1. Interestingly, at least to me, I see significant improvements on my laptop with dual 970Ms but zero improvement to possibly a little degradation in fps on my desktop with a Titan X. Although I see performance improvements across the board in other games the Titan X with X-Plane has been a disappointment to me.

    1. I was at risk of being shut out. 🙂

      I’m mostly joking with the score-card, but there is a point to it: for every real driver bug in the modern OpenGL stacks (and there -are- quite a few!) there are a lot more mistakes made by application developers that, due to their symptoms varying by card, get blamed on the drivers.

      The fault is with the system – if you read what a lot of the driver guys said about Vulkan at SIGGRAPH this year, they’re all really excited to have a much -smaller- API to implement. Fewer API calls means less complexity, means it’s a lot easier to test -every- case and find all of the bugs. OpenGL has just become too big and sprawling for anyone to really know what it is supposed to do.

      1. As far as I understand, Vulkan proposes an approach where much of the memory and thread management to generate command buffers, previously done by the graphics driver, would be moved into the application itself. I don’t know if you plan to migrate to Vulkan at some point in the future, but I imagine it would be one heck of an overhaul to the X-Plane source code…

        1. I have been following Vulkan as closely as possible, and I read the entire Mantle programming guide (which is now freely available) to get some sense of the IP they were starting with. You are correct: Vulkan moves resource management, memory management, and threading responsibility to the app. This is almost entirely a good thing; applications typically know a lot more context about these things than the driver does.

          For example, X-Plane’s threading model is built such that a lock is -never- needed between a resource being loaded and a resource being drawn. OpenGL, however, is required to apply locking _just in case_ the app is thrashing between drawing and uploading the same resource.
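To illustrate the kind of hand-off being described, here is a minimal C11 sketch (not X-Plane’s actual code – the types and names are invented) of publishing a fully-built resource with a single release-store, so the render thread can pick it up without any mutex:

```c
#include <stdatomic.h>
#include <stddef.h>

/* Hypothetical lock-free hand-off: the loader thread fully builds a
   resource, then publishes it with one release-store.  The render
   thread acquire-loads the pointer; if it sees non-NULL, every field
   written before the publish is guaranteed visible.  No mutex is
   needed because the two threads never mutate the same resource at
   the same time - exactly the guarantee a general-purpose driver
   cannot assume and therefore has to lock against. */
typedef struct { int width, height; const void *pixels; } Texture;

static _Atomic(Texture *) g_published = NULL;

void loader_publish(Texture *t) {
    /* all writes to *t happen-before this store */
    atomic_store_explicit(&g_published, t, memory_order_release);
}

Texture *renderer_try_get(void) {
    /* pairs with the release-store above */
    return atomic_load_explicit(&g_published, memory_order_acquire);
}
```

The design point is that the *application* knows its own publish/consume discipline; the driver, serving arbitrary apps, must assume the worst and lock.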

          So one thing we could do would be to create an emulation of the basic OpenGL calls we use (e.g. create-a-texture, draw a VBO) in Vulkan. When Vulkan is available, those implementations could be much faster than OpenGL because we could solve the simpler, more direct problem.

          For example, X-Plane -never- mutates a texture’s size. OpenGL has to check for this and handle it, but we don’t use it. So our Vulkan layer’s texture builder would be much simpler, and could skip a lot of sanity checks and special cases.

          The challenge for us (and anyone who develops a game/rendering engine) is the set of capabilities Vulkan has that OpenGL just can’t match. Because Vulkan’s threading model is “don’t hurt yourself”, it is possible to create a rendering engine that spreads the CPU work of building the frame over several cores and then assembles the results later.

          There really is no equivalent of this in OpenGL, so an application that wants to do “multi-core render” would have to support two very different code paths in the rendering engine – the “multi-core” draw path and the “single-core” draw path.

          That kind of -fundamental- split of capability is what drives up the cost of development – it’s literally building two engines, and each one is likely to have very different bugs, require separate testing, etc. That’s a lot harder than having Vulkan provide a “fast GL” by taking advantage of what we know about the sim.

          In terms of time: I believe we could probably put together Vulkan-based rendering that replaces OpenGL in weeks. The reason I think this is that our iPhone app used to be GLES 1.1 and we had to (at the last minute) change it to GLES 2.0 for performance, which meant replacing a chunk of GLES 1.1 (texture combine, matrix transform, lighting) with our own code. The entire change took about a day of coding to get something that worked just about as well. And what made the ‘port’ so quick was that we only had to implement the parts of tex-combine, matrix transform and lighting that we -actually used-. GLES 1.1 has spot lights, but we only use directional lights – time saved. We never use a modelview matrix that’s not orthonormal – time saved.

          I hope someone will build a complete GL 1.x on top of Vulkan – the result would almost certainly be more reliable than ‘real’ OpenGL, because the OpenGL API is too complicated to be properly tested by the IHVs. But a full GL 1.x is a huge project — tons of texture options, line stipple, lighting options, etc.

          What we would need to do would be a lot simpler: shaders, VBOs, UBOs, textures, and framebuffers…that’s pretty much it. It’s doable.
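The thin-layer idea above can be sketched as a small backend-neutral interface covering only what the sim actually uses, with either an OpenGL or a Vulkan backend filling in the function pointers. This is purely illustrative – the type and function names are invented, and the stub backend stands in for a real implementation:

```c
/* Hypothetical "fast GL" abstraction: the app calls only these entry
   points (a real version would also cover shaders, UBOs, and
   framebuffers), and each backend implements them its own way. */
typedef struct {
    unsigned (*create_texture)(int w, int h, const void *rgba);
    unsigned (*create_vbo)(const void *data, int bytes);
    void     (*draw)(unsigned vbo, int first, int count);
} RenderBackend;

/* Stub backend standing in for either real implementation; a real
   backend would return a GL object name or a Vulkan handle index. */
static unsigned stub_create_texture(int w, int h, const void *rgba) {
    (void)w; (void)h; (void)rgba;
    return 1;
}
static unsigned stub_create_vbo(const void *data, int bytes) {
    (void)data; (void)bytes;
    return 2;
}
static void stub_draw(unsigned vbo, int first, int count) {
    (void)vbo; (void)first; (void)count;
}

static const RenderBackend stub_backend = {
    stub_create_texture, stub_create_vbo, stub_draw
};
```

Because the interface is only as wide as the sim’s actual usage, each backend can skip the sanity checks and special cases a full OpenGL implementation must carry.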

  2. Good day,
    yup, I sent a PM to NVidia and gave them the numbers. On this driver I get a stutter when I use the glider (the Antares 20E); I increased the “flight model” number to 4 as the instructions advise. The number for the total texture at current settings is only 350.69 megs, so in no way have I maxed out the card. I then updated to 10.36 to see if the issue would be resolved… and the issue is still there. The graphics look better, I just get the little stutter, so there is something I am not doing right.
    Thank you for the work

    The last NVidia driver, 352.86, seems to improve the quality of textures, and of course there’s an improvement in FPS. That doesn’t necessarily mean that X-Plane 10.36 is made for this driver, but the GPU probably works better with the new driver.
    Finally, I think that X-Plane 10.40 could and should be better optimized so we can fly smoothly even in populated zones, and maybe this could pass through the use of DirectX, which is surely better than OpenGL. But I don’t know what the technical difficulties would be in programming the sim with DirectX.
    Thanks for your beautiful work!!!

    1. All the Mac and Linux folks would surely be very disappointed to hear that x-plane is going to directx 😉 And part of Ben likely just died a little …..

  4. If X-Plane used only DirectX, Mac and Linux users would ditch X-Plane. I use Linux (Lubuntu 14.04), and I like the extra framerate increase of a few frames per second.

    1. I have no idea why other games have much better graphics. Maybe it’s because of updates and old non-optimized code kept so plugins and 3rd parties continue to work.

      But still, there are lots of great technologies out there that allow developers to optimize resources, yet X-Plane stays behind, I think.

      It’s my main simulator now, running at 35 fps average (I have a GTX 970, overclocked a bit), but now I’m waiting for the new version of FS remade by Dovetail; I hope it has better performance and more realistic graphics, like fog and distance, lights and ambient occlusion. (E.g., right now moving at night on airports I can’t see the plane at all, even with HDR enabled.)


      1. @Ben, with which OpenGL Version/Extension is X-Plane 10 running exactly?

        1. At a minimum, we require OpenGL 2.0.

          On top of that, we will use the following extensions, as far as I can tell:


          Basically X-Plane pulls ad-hoc the GL 3 and GL 4 features it needs via extensions. Rather than have a million if() cases, we have a few predicates – e.g. we have a function that answers the question “can we do deferred rendering on this hardware” – it goes and looks for the entire set of extensions we need.
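The “few predicates” approach can be sketched in C. The token-matching helper below follows the standard way of scanning the space-separated string returned by glGetString(GL_EXTENSIONS); the particular extension set in the predicate is illustrative, not X-Plane’s actual requirement list:

```c
#include <string.h>
#include <stdbool.h>

/* Scan a space-separated extension string (such as the one returned
   by glGetString(GL_EXTENSIONS)) for an exact token match. */
bool has_extension(const char *ext_string, const char *name) {
    size_t len = strlen(name);
    const char *p = ext_string;
    while ((p = strstr(p, name)) != NULL) {
        /* match whole tokens only, not prefixes of longer names */
        if ((p == ext_string || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return true;
        p += len;
    }
    return false;
}

/* Hypothetical predicate in the style described: one function answers
   a whole capability question by checking every extension that
   capability needs.  The set below is invented for illustration. */
bool can_do_deferred_rendering(const char *ext_string) {
    return has_extension(ext_string, "GL_ARB_framebuffer_object") &&
           has_extension(ext_string, "GL_ARB_texture_float");
}
```

The whole-token check matters: a naive strstr would falsely report GL_ARB_framebuffer_object present when only a longer, differently-named extension containing that prefix is in the string.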


          * GL_ARB_framebuffer_object is generally supported by DX10 hardware. Technically X-Plane can (barely) run with DX9c-class cards and will detect GL_EXT_framebuffer_object and the other pieces that can be used to fake it.

  5. Yes, but graphically speaking it would be very, very much better with DirectX, which is well supported by new GPUs – DirectX 10, 10.1 and 11!!!

    1. I agree with you, but on the other hand, what to do with other platforms?

      I don’t know the % of users on each platform, and it’s a decision that would be based on those numbers.

  6. I think the release of X-Plane 10.40 will come after the release of DirectX 12 with Windows 10.
    And I hope that X-Plane 10 will soon get landscape at 8K resolution, well developed everywhere, including the Arctic and Antarctic.

  7. Just stumbled on this page and this comment.

    Isao says:
    May 23, 2015 at 11:02 am
    If X-Plane use only DirectX, Mac and Linux user will ditch X-Plane. I use Linux (Lubuntu 14.04) and I like some extra framerate increase by a few frames per second.

    Do we Mac people have to be concerned that X-Plane won’t work for us in the future?

    1. No. We are not dropping Mac support. (Look carefully at the comments to note that there is no original LR comment that spawned this.)

      You do have to be concerned about performance, but that’s always been true – see also people reporting way better fps under boot camp, etc.

  8. Hi, I have always been following the blog here. Nice to get some “insider” knowledge – good job, Ben and the LR team.

    I have NO hardware and software skills so this is just a thought…

    XP is using OpenGL 2.0 as a base and extension to pull calls from OpenGL 3 and 4.
    Isn’t this an inefficient way, using this old base?
    was looking at //
    OpenGL 2.0 dates back to 2004…
    Would it not be good to do a “clean-up” move to a 4.0 base??

    Please enlighten me *S*


    1. Three issues:

      1. We have a minimum set of hardware requirements that we shipped X-Plane 10 with. While we are still in the v10 run, we are not going to raise those hardware requirements.

      If we raise our minimum API version to 3.0 or 4.0 we won’t be able to run on that hardware. So no bump to 3.0 or 4.0 minimum in v10.

      2. It’s important that X-Plane be demo-able to users who have never done flight sim. A -lot- of those users will have Intel HD graphics on their CPUs and nothing else. So the lowest-end Intel GPU we want to support really matters; if we crank the spec up to GL 4.0 minimum, it’s going to be tough for people to try X-Plane without buying a new system first.

      3. On OS X, you can’t get the ARB_compatibility extension with GL 3.0 or 4.0. And without ARB_compatibility, pretty much every single plugin will break. So we use GL 2.1 + extensions.

      RE: efficiency, there’s no question that the rendering engine would be -easier- to support if we required a GL 4.0+ renderer. But it would also run on a lot less hardware; those multiple cases in the sim are serving multiple hardware ‘buckets’, and that’s pretty much just something we have to do as part of our business. We can’t mandate a super-computer for every user.

      1. Hmm
        So does this mean XP will always be catering to being accessible on Intel HD graphics?

        ugh. That’s like tying a racing boat to an anchor forever.

        1. We need to run on Intel. We don’t need to run on Intel at maxed out rendering settings.

          So Intel’s not holding us back in terms of what we can plan for at top settings; it has always been necessary to run on a spectrum of hardware.

  9. I would very much like X-Plane 10’s landscape graphics and resolution to reach 8K UHDTV (4320p)

    1. Kirill: please limit yourself to _one_ comment for each blog post. If you want to reply, reply -to- that post. Please do NOT post to the blog multiple times. (I have had to delete many of your ‘extra’ post copies.)

      1. Sorry, I just wanted to express my point of view; I probably overdid it.
        I just want X-Plane 10 to be the most advanced flight simulator in the world, both graphically and in terms of performance.

  10. The new 353.06 drivers fix an OpenGL compilation error. I tested this new driver with X-Plane 10.35 (I have not updated to 10.36 yet) on Windows 7. No crashes seen here.

Comments are closed.