Plugins can use OpenGL to draw directly inside X-Plane; this capability is almost fifteen years old and goes back to X-Plane 6. Plugin drawing typically falls into four categories:

  1. Drawing user interface (floating windows, HUD-like UI, map details, etc.).
  2. Drawing custom panel instruments (particularly 2-d glass displays).
  3. Drawing effects in 3-d (custom smoke or fire, or maybe diagram marks showing lift).
  4. Drawing solid stuff in 3-d (e.g. pushback trucks, parts of the airplane, etc.).

The first two use cases work pretty well; the third works barely, and the last one is a bit of a train wreck.

Problems With Plugin Drawing

OpenGL and 3-d hardware were very different when Sandy and I created the original plugin SDK 1.0. Hardware had a fixed-function pipeline built into the GPU itself, and your computer had one CPU core, so the issues I’m about to describe weren’t a problem back then. Here are some problems with plugin drawing:

  1. Increasing overhead just to draw. It used to be that we could just send a message to the plugin system at “good times” and let drawing happen. X-Plane now has to do a bunch of work to prepare OpenGL for plugin use; this synchronization work slows down X-Plane, and it’s getting worse as X-Plane’s internal design diverges from the canonical OpenGL 1.5 pipeline. (Example: we have to copy our matrices into the fixed function pipeline matrices before we call you, then copy them back if you call us with a drawing routine. Slow!)
  2. Plugins don’t have access to the fastest drawing paths. The less our drawing looks like OpenGL 1.5, the less efficient plugins are in comparison to X-Plane. This is getting worse over time too. (For example: if we want to render two VR eyes at once, we have to call your plugin twice, once for each one.)
  3. Plugins don’t have access to our lighting environment or G-Buffer formats, and thus can’t easily draw in a way that integrates with our 3-d world. This is getting worse as our rendering engine becomes more complicated.

So…this is not fantastic. The first two use cases (UI and panel) aren’t that badly affected, because between them they require only two or three call-outs to plugins, drawing in a very simple manner. The last two (the 3-d use cases) are quite problematic.

These costs are going to get worse when X-Plane moves to Metal and Vulkan; the cost of syncing over to OpenGL will be higher, plugins won’t even be drawing on the fastest APIs, and we may have to do some expensive texture synchronization.

Fixing The Problem

My goal isn’t to completely eliminate the costs of plugin drawing, but rather to minimize them by providing better alternatives for common tasks.

So in the long term, my plan is to add something to the SDK that we should have had for a while: the ability to create persistent object and effect systems in the X-Plane world.

Right now if you want to draw an object, you call XPLMDrawObjects – hopefully you’re doing this from just the right callbacks, but if you get this wrong, I forgive you; the rules for how to do this correctly in the v10 and v11 renderers are insanely complex, because the plugin API was invented fifteen years ago.

In the future, you will be able to simply create an object and X-Plane will draw it at all the right times, for any rendering mode, handling all of the special cases for HDR, shadows, reflections, you name it. You’ll just need to tell us when it moves. The same idea can be applied to particle systems for effects.
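To make the shape of that concrete, here is a sketch in C of what a persistent-instance API might look like. Every name below is made up for illustration – nothing here is a real SDK symbol, and the real API, whenever it ships, will certainly differ:

```c
/* Hypothetical persistent-instance API, per the idea above.  The shape is:
 * create an instance once, tell the sim when it moves, and let the sim
 * draw it in every render pass with no call-outs back into the plugin. */

typedef struct { float x, y, z, pitch, heading, roll; } DrawInfo;

#define MAX_INSTANCES 64

typedef struct {
    DrawInfo pos[MAX_INSTANCES];
    int      in_use[MAX_INSTANCES];
} InstancePool;

/* Create a persistent instance; returns a handle, or -1 if the pool is full. */
int instance_create(InstancePool *p)
{
    for (int i = 0; i < MAX_INSTANCES; ++i)
        if (!p->in_use[i]) { p->in_use[i] = 1; return i; }
    return -1;
}

/* The plugin's only ongoing duty: say where the object is now. */
void instance_set_position(InstancePool *p, int h, DrawInfo where)
{
    p->pos[h] = where;
}

/* The sim's side: draw every live instance in every render pass (shadows,
 * reflections, each VR eye) without calling the plugin.  Returns the
 * number of draws issued, purely for illustration. */
int engine_draw_all_passes(const InstancePool *p, int n_passes)
{
    int draws = 0;
    for (int pass = 0; pass < n_passes; ++pass)
        for (int i = 0; i < MAX_INSTANCES; ++i)
            if (p->in_use[i]) ++draws;   /* stand-in for the real draw call */
    return draws;
}
```

Contrast this with today’s model, where the plugin itself must call XPLMDrawObjects from exactly the right drawing callbacks, once per pass.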

The one open issue with this scheme is how animation will work. Currently animation is handled by reading datarefs – if you want to animate multiple instances from a plugin, you have to figure out (inside your dataref handler!) which instance is being drawn and return the right animation values.

Not only is this a complicated mess, but it’s also slow; it requires us to call your dataref callback many times from inside the drawing loop, slowing everything down.

What I’d like to have in the future is some kind of ‘animation block’ where:

  1. Your plugin queries the object to find out which animation values it needs.
  2. Each time you update a particular instance of that object, you provide a packed block of the values of all of those animations.
  3. We use the block directly and never have to call datarefs.

This technique isn’t just faster; it is also easier to make multi-core, and it will play nicely when we can instance animation in the future.
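A minimal sketch in C of what those three steps could look like – again, every name here is hypothetical, not a real SDK symbol:

```c
#include <string.h>

#define MAX_ANIMS 16

/* Step 1: the object reports which datarefs drive its animations; the
 * order of the names fixes the layout of the packed block. */
typedef struct {
    const char *dataref_names[MAX_ANIMS];
    int         n_anims;
} AnimSpec;

/* Step 2: one packed block per instance, one float per animation value,
 * in the order the AnimSpec dictates. */
typedef struct {
    float values[MAX_ANIMS];
} AnimBlock;

/* Plugin side: write every animation value for an instance in one shot,
 * whenever the instance is updated. */
void anim_block_update(AnimBlock *b, const float *vals, int n)
{
    memcpy(b->values, vals, (size_t)n * sizeof(float));
}

/* Sim side, step 3: during drawing, read straight out of the block;
 * no dataref callbacks fire from inside the render loop. */
float anim_block_read(const AnimBlock *b, int anim_index)
{
    return b->values[anim_index];
}
```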

The time frame for this is “during the v11 run hopefully” – that is, a new object API is not going to be in 11.0, and I can’t guarantee when it will be. I expect to support the old way (XPLMDrawObjects and drawing hooks) at least through the entire v11 run, if not longer; our expected failure mode is ‘if you do it the old way, the sim might be really slow’. The new APIs will be designed to be completely friendly with multi-monitor, VR, Vulkan, and multicore.


About Ben Supnik

Ben is a software engineer who works on X-Plane; he spends most of his days drinking coffee and swearing at the computer -- sometimes at the same time.

49 comments on “In the Long Term, Plugins Shouldn’t Draw In 3-D Directly”

  1. Hi Ben,

    Do you plan for a Vulkan rendering engine in X-plane 11?

    I’m quite disappointed with X-Plane 11 performance. At 4k resolution I get 20 fps – with very low GPU and CPU usage – when in a populated airport with world objects maxed, on a Titan X + i7 5820K + 48GB DDR4 system.

    Since neither the GPU nor the CPU is stressed by any means, I can only assume that there’s a single thread on the CPU doing the rendering work, which can’t cope with feeding instructions to the GPU. Vulkan should make it possible to overcome such issues on a multicore, multithreaded CPU.

    Thanks for your detailed technical responses so far. I hope the team focuses more on performance from now on.


    1. We do plan to do a Vulkan port.

      I’m not sure what’s up with your setup – I get 30 fps on an older, slower NVidia system at lower res… the result of higher res should be to increase GPU usage. So I’m thinking you’ve got settings turned on that you need to turn off.

      1. Swapping out an nvidia 780 to a 1070 for me showed no improvements in 4k speed.
        On an nvidia 780, switching from 2k to 4k gave a significant framerate drop.
        Changing the 780 to a 1070, the framerates were the same as the 780 for 2k and 4k.
        So if my cpu is the bottleneck, the cpu must be doing more work at 4k – my guess was some draw distance or lod was tied to the frame size … but you’re not expecting that?

        1. The LOD might be boosted up – X-Plane has some internal calculations based on the size of a pixel, e.g. objects can be seen from farther away at larger resolutions. It’s possible that at 4k this is making the object density go nuts. A good test would be to turn the world 3-d stuff ALL the way off and then re-compare the two cases…

  2. Hi Ben

    Reading your blog entry every time… Interesting! But being a real-life pilot and not into software coding… I (sometimes) have no idea what you guys are talking about *S*

    When you port to Vulkan… please get rid of all the old-fashioned ways of doing things… Make it high-tech and ready for the next many years… Multi-core CPU and SLI/Crossfire…

    keep up the great work

    Henrik (Denmark)

    1. Hi Henrik,

      SLI/Crossfire are _not_ on our radar. The use case of a user who has _two_ really huge expensive graphics cards and a single monitor is rare enough that it doesn’t justify the total mess that happens when you try to make the two cards render a single frame stream.

      1. So if I understand right, does X-Plane support multi-GPU with multi-monitor? As in, could I use two graphics cards to, say, drive 4 displays (2 per GPU)?

        1. We do not support multi-GPU multi-monitor right now. I’m contrasting it with SLI in that I think multi-GPU multi-monitor makes more sense for x-plane in the long term than multi-GPU single-monitor.

          1. Metal was built from scratch for multiple-GPU single-monitor (Metal came from iOS, and all iOS devices from late 2014 have only one monitor but multiple GPUs). It should be fairly easy for Laminar to support multiple-GPU single-monitor on, for example, the Mac Pro 2013. I think it is far more complex in Vulkan with multiple-GPU single-monitor.
            I do, however, agree with Ben, that stronger support for multi-monitor is the best way to go, right now.

          2. Hi Christer,

            iOS devices are not multi-GPU, and Metal provides _no_ API support to make a multi-GPU machine like the 2013 Mac Pro run with two GPUs, one monitor.

          3. Huh?
            Apple’s ARM A7 SoC has four GPU cores built in (PowerVR G6430). The later A-series processors also have four or more GPU cores (cluster configuration). At WWDC the year after (2014), a developer from Epic Games showed a game, Zen Garden, using Metal with an extreme performance boost (up to 10x faster).
            If I recall, Metal is using OpenCL and supports SIMD. Final Cut Pro and a few more apps use both GPUs in the Mac Pro. I also recall that developers need to write code to get the GPUs talking to each other (i.e. no API).

          4. Four GPU cores sure, but that is not the same as SLI GPUs at all.

            The PowerVR chips are meant to be modular – when they talk about a GPU core they’re really describing something a lot more like a “CUDA core” on an NVidia board – that is, a cluster of shading units that all collaborate to render output from a _single_ dispatched wide compute job. In other words, given 1000 pixels to shade, more cores all work in parallel on the same triangle to go faster.

            This kind of ‘more cores’ is great and is handled transparently on all APIs – that is, we never need to worry about this for any GPU on any API… the difference between an expensive and a cheap GPU on the desktop is the number of cores too – you can look up the shader core count on Wikipedia.

            By comparison, SLI/Crossfire/multi-GPU is a different beast – each GPU is basically a separate computer. In this case, the set of cores on each one has its own private VRAM (they don’t easily see each other’s VRAM) and its own command dispatcher, and the two don’t work on the same work.

            So to get faster frame scaling in the multiple-GPU case (meaning two complete computing devices, not just more cores sharing the same memory system), the app (or driver) has to coordinate sharing of data between the two sets of VRAM, dispatch different drawing to different GPUs separately, etc. It’s a very complicated mess – to the point where in Vulkan and Metal, the driver writers threw up their hands and said “app developers, do it yourself.”

            The speedup you saw in 2014 with Metal was almost certainly due to the game having lower CPU use with the Metal API than with GLES – if it’s 10x faster, that may say poor things about the original GLES driver – but there’s no question that on the CPU side Metal is more efficient than GLES, and CPU use is a consideration in rendering.

            Apps like Final Cut Pro use multiple GPUs by dispatching different compute jobs to different devices. If the app’s model is “send the frame to the GPU, process it, get it back” then it can do this with multiple GPUs.

            Consider X-Plane – we use the LAST frame’s image on the GPU to calculate exposure levels of the next frame. If we’re using two GPUs, last frame’s image is not available to the other GPU, so it can’t start the next frame until the last frame is done – no speedup. This is the kind of thing that makes SLI/Crossfire a mess.

  3. Hello,

    Sorry for being off-topic. Just question.

    Will X-Plane ever have a system that allows the user to configure keyboard and joystick bindings per aircraft, without needing an external plugin like X-Assign? It shouldn’t be that difficult, and it would help when using different models, without having to reconfigure everything each time we change aircraft.

    Thank you for your attention.

    1. That is something we have considered and may do in the future. For 11.0 we wanted to get the basic new UI working and we didn’t have time to spare for additional features. But we are aware of X-Assign — heck, I wrote something similar for joysticks myself back in the day before I worked for LR.

      One of the problems is that it isn’t obvious what the right model is for changing presets – automatic changing of configurations is useful for power users, but can also be astonishing.

      1. Thank you for your answer Ben. I’ll try to explain my idea on the subject:
        I’m not a programmer, just a simple user, but I’d like XP11 to look for and load the key/joystick binding configuration first from the aircraft folder and, if it isn’t found, use a common configuration. I’d place this bindings window on the Customize window of each aircraft (where the weapon button, weight, failures, etc. appear).

        Thank you again and sorry for being repetitive.

    2. DCS has this system where every aircraft has separate keybinds, and it has the opposite problem: you really have to reconfigure everything for every aircraft, which is a massive pain. If there is ever such functionality, I would like to see it somehow smarter: have a default preset for all aircraft and be able to create just a few binds for a specific aircraft. That way, if I wanted to change my “screenshot” key for example, I would not have to go and change it in every profile for every aircraft I fly.

      1. It could be done. A custom configuration for each model would override the default only for the functions it contains. For example, if the default says that hat 1 up is view up, and your aircraft’s custom configuration doesn’t specify it, the default for that function would be loaded.
        The system could also offer to change a key binding for all models…
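The fallback scheme being described here is easy to sketch in C. Nothing below is an actual X-Plane feature – the types and names are invented purely to illustrate resolving each function against the per-aircraft table first, then falling back to the defaults:

```c
#include <string.h>
#include <stddef.h>

/* A binding maps a sim function to a key or button. */
typedef struct { const char *function; const char *binding; } Bind;

/* Per-aircraft binds win; anything the aircraft table leaves unset falls
 * back to the default table.  NULL means the function is unbound. */
const char *resolve_binding(const char *function,
                            const Bind *aircraft, int n_aircraft,
                            const Bind *defaults, int n_defaults)
{
    for (int i = 0; i < n_aircraft; ++i)
        if (strcmp(aircraft[i].function, function) == 0)
            return aircraft[i].binding;
    for (int i = 0; i < n_defaults; ++i)
        if (strcmp(defaults[i].function, function) == 0)
            return defaults[i].binding;
    return NULL;
}
```

With this shape, changing the “screenshot” key once in the default table changes it for every aircraft that doesn’t explicitly override it.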

          1. Take xjoymap (bless you, Joan): create a .xjm file for your default aircraft, then one for some other aircraft, and assign your button to the xjoymap command. Now you have what you have been dreaming of.

  4. Please don’t lock out 3D drawing from plugins in future 3D API work. There are niceties that are just hard to do without direct access to the drawing infrastructure, such as the various environmental/cloud texture plugins and a whole bunch of other “nice things”. OTOH, I totally get it if you want to avoid making life difficult for yourself. We don’t need the plugin drawing interface to be absolutely beautiful (like abstracting all of the underlying APIs, which would be a monumental undertaking), but at least have something in place, even if it’s only a direct passthrough that makes plugin devs write for both Vulkan and Metal if they want to support all platforms.
    Completely cutting direct 3D drawing access off would make X-Plane’s 3rd party plugin ecosystem poorer.

    1. We’re not going for _zero_ plugin drawing callbacks – just fewer. If you want to go replace the entire cloud system, you kinda need a callback…and that’s okay, as the amount of work done in the callback will be a lot, so overhead isn’t a factor.

      I would like to get rid of the need to use a drawing callback to draw a 3-d object…that case is pretty silly.

      1. Thanks for the clarification and I fully support your effort to get the plugin drawing interface sanitized, it’s much needed work!

      2. Just to get you right – does that mean that plug-ins like “Autogate” etc. would die?

        Especially “Autogate” with the animated jetways and marshallers has become a high value standard for quality custom airport sceneries. Getting rid of this would be a major step back!

        Or will there be other ways to keep these outstanding add-on features alive? Maybe a native X-Plane feature like this?


        1. I’m not proposing we permanently break the API. I am proposing we provide a better way to do this. So autogate would have the option to update to a new API that would put it on a much better performance path.

        2. I don’t think the “Autogate” plugin uses any OpenGL calls to draw jetways. The plugin provides custom datarefs to be used by the objects, which are placed normally, like any other object.

  5. “I’m not proposing we permanently break the API.”

    Yes, I got that. You also wrote that the future “legacy” API support may lost thru XP 11. I just hope that the future methods will allow doing such things and that scenery authors will update their sceneries to those methods…

    Yes, I know you lost your crystal ball concerning my hope for scenery authors updating all items. I am familiar with crystal ball issues – I broke mine a long time ago… *g*


    1. “you also wrote that the future “legacy” API support may lost thru XP 11.”

      I’m not sure what you mean by that….I think you misinterpreted my post. I specifically said we’d support the existing (which will become legacy) APIs through the entire v11 run _if not longer_. So full v11 support is a given, and there could be support after that too.

      1. Vulkan should break any plugin that does its own OpenGL drawing, requiring any plugin to update to the new API. This means that maintaining legacy support means that it will be tied to OpenGL. With that the case, I don’t see a long life for anything that relies on legacy support.

      2. oh dear – that was a typo. I meant that the legacy support would lAst thru XP11, but I typed lOst. Sorry.

        Well, by now it is more than crystal clear… *g*


  6. Dear Ben,

    From this, will we be able to have a window for my 2-d interface on a second monitor (like you can do with other X-Plane windows)?

    If my understanding is correct, there is no change for normal 2-d windows using your widgets?


    1. We have a TODO item to expose multiple monitors to the plugin system.

      There is _no_ guarantee that existing code that uses widgets will be able to take advantage of such new capabilities in the future without modification.

      I don’t know what modifications may be needed, because we have not designed the new API.

      1. Thanks Ben,

        No problem; if both systems are alive together for a while, it will give me time to make the changes.

        Do you have an idea when multiple monitors will be available for plugins, please?
        Many GoodWay users are asking for it because you have introduced a very nice feature 🙂


  7. “”” You Don’t Need to Reinstall X-Plane to Fix It
    To get X-Plane back to its clean state, you can do this:

    Run the updater. If you’ve modified a file by accident it will ask if you want to replace it. Say yes.
    Delete all of the files in Output/preferences.”””

    This is an impertinence to expect. Many users have all their control units configured through painstaking work. My controllers took me many hours. And these should be deleted, and the whole procedure started from the beginning?

    1. You don’t _have_ to delete your prefs. I’m just saying that if you’re going to delete things, don’t delete the 60 GB of data you downloaded.

      The joystick and keyboard prefs are in their own files specifically so users can do partial resets of prefs if needed.

  8. Not related to this post, but have the new DSF tiles been uploaded? Can I redownload these now?

    1. I’m working on that now. We will be publishing an installer shortly with a new “update scenery” function that will get you the new scenery when it’s available.

        1. I just updated my global scenery for XP11pb13 and was pleasantly surprised to have the installer add several GBs of above-60-degrees tiles. The Global Scenery folder grew in size to 65.1 GBs.

        It seemed to me, though, that some Earth Nav Data subfolders that needed to be “fixed” were not. So I did a complete download of a second copy of XP11pb13 and, lo and behold, the Global Scenery folder in that install is only 49.5 GBs in size.

        What is going on?

        1. Hi Maurice,

          1. In the future, please DO NOT do a complete re-download of the product. The new installer will be out next week and can update scenery. We put this in so that users would not feel the need to ‘just redownload everything’ – it’s not necessary and chews up bandwidth for us.

          2. The recut scenery for pb13 is significantly smaller than pb1 – we fixed a DSF compression bug, and it resulted in enough free space to be < 60 GB _and_ go to 72N. So we got to have our cake and eat it. DSFs are about 20% smaller for the same data. In pb1, a 64-bit-related bug was disabling internal DSF data compression.

          3. I don’t know what you mean about the Earth Nav Data sub-folders. You should probably file a bug.

            1. 2. I am happy that you get to have your cake and eat it too. Can we have a piece of that cake also: how about (the now few) areas that were available in XP10 and are still missing from XP11 and would not add much, so that you can keep the whole thing < 60 GB?

            3. I am referring to my understanding that, in doing a recut, some DSF tiles were also being corrected or fixed (for tears or mis-fits, e.g.), as many had filed bug reports about them.

            1. The installer should not offer to install a second copy, it gives people ideas. 😉

  9. Hi Ben,

    Slightly off topic I know; but could we possibly have an update on the weather engine? Maybe I’m being slightly negative, but I find it rather difficult to get excited about minor tweaks, when the weather engine (by this I mean clouds), is still…after all these years, such a performance killer (even on gaming rigs).

    This is the one thing which stops myself and many others from enjoying fully the fantastic benefits of X-Plane 11. What makes it even more painful, is the fact that (unlike before) the user has no real control over cloud density; without tweaks.

    I know you and the team are working hard to improve things, but if you could possibly give us some kind of roadmap of where the weather is going, that would be ace.

    Apologies if this comes across as some kind of rant…it’s not; I just wanted to get it off my chest as I feel it’s important 🙂



    PS…I’ll send the flowers in the post, haha!!

  10. Did some testing and got this idea.
    Is it possible for you to somehow not have AA on for cloud rendering? If so, it would improve FPS vastly with big layers of clouds.
    Like disconnecting the AA from the clouds only.

    1. That’s part of what’s going on – the HDR AA other than FXAA is OGSSAA (ordered grid super sampling), which is fancy talk for just drawing things biiiig and scaling them down.

      Whenever possible, the b14 code separates clouds from the main screen, escaping the costs of the size of the screen, anti-aliasing, etc. But there are some views where this cannot be done.

      You should find that in some views, the penalty for 4x FSAA in HDR mode is much lower than it used to be with clouds – it was basically never usable with clouds.

  11. Is there a bit of Buzzwinkle in the X-Plane11 code… a bit snozzeled, but capable of still hanging around forever, but now has to be moved on.

    Great story SD

    1. I like to think that I am the Buzzwinkle of Laminar Research…I’ve been around forever, I’m often sloshed, and you have to hit me pretty hard on the head to get me to move. 🙂

Comments are closed.