It’s been very busy here internally, but a few things to mention.

X-Plane 11.55 – Crash Fix

We’re focusing almost all of our effort on future technology, but Stairport reported a bug severe enough that we went for a patch: X-Plane would randomly crash when plugins used the new “instancing” drawing APIs.

The instancing APIs are the Vulkan-compatible way to draw objects from plugins, the way to add particle effects via plugins, and will someday support sound as well. Simply put, instancing is meant to be the foundation for plugin-created dynamic content. Third parties did a great job of switching to instancing to be Vulkan compatible when we released X-Plane 11.50, so having this API be rock-solid is really important.

With X-Plane 11.55, correctly written add-ons should “just work.” The interaction between instancing and datarefs does sometimes confuse developers, so I’ll cover that in some nerdy detail in a future post.
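
In the meantime, here is a minimal sketch of the instancing flow for plugin authors – the XPLMInstance calls are the real SDK entry points, while the OBJ path and dataref name are made up for illustration:

```cpp
// Minimal XPLMInstance lifecycle: load once, position per frame, destroy on exit.
#include "XPLMInstance.h"
#include "XPLMScenery.h"

static XPLMObjectRef   s_obj  = nullptr;
static XPLMInstanceRef s_inst = nullptr;

// Null-terminated list of datarefs the OBJ animates with. With instancing, the
// plugin PUSHES values to the instance; the sim never reads your datarefs
// mid-draw. (The dataref name here is hypothetical.)
static const char * s_drefs[] = { "example/beacon_brightness", nullptr };

void start_drawing()
{
    s_obj = XPLMLoadObject("Resources/plugins/example/fuel_truck.obj"); // illustrative path
    if (s_obj)
        s_inst = XPLMCreateInstance(s_obj, s_drefs);
}

void update_position(float x, float y, float z) // e.g. from a flight-loop callback
{
    if (!s_inst) return;
    XPLMDrawInfo_t di = {};
    di.structSize = sizeof(di);
    di.x = x; di.y = y; di.z = z;             // local OpenGL coordinates, in meters
    di.pitch = di.heading = di.roll = 0.0f;
    const float dref_values[] = { 1.0f };     // one float per dataref in s_drefs
    XPLMInstanceSetPosition(s_inst, &di, dref_values);
}

void stop_drawing()
{
    if (s_inst) { XPLMDestroyInstance(s_inst); s_inst = nullptr; }
    if (s_obj)  { XPLMUnloadObject(s_obj);     s_obj  = nullptr; }
}
```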

The Latest Gateway Airports

While we were cutting a hot patch, we took another set of airports from the X-Plane Airport Scenery Gateway – 11.55 features over 1,000 new 3-D airports and 443 brand-new airports. I remain amazed at the Gateway community’s progress and results.

X-Plane 11.55 release notes can be found here.

What Is Photometric Rendering?

I’m excited to finally be able to talk about something I’ve been working on for a while now – the new photometric lighting pipeline. Here’s the preview video Chris and Thomson made:

X-Plane’s lighting and rendering have leveled up several times – in X-Plane 10 we moved to HDR with global lighting, and in X-Plane 11 we introduced Physically Based Rendering.

I know this kills, but it’s too soon to talk about release dates. I can say a little bit about what you’re seeing in the video though.

First, the new lighting pipeline is photometric. What that means is that color values during rendering match real-world values (in real-world units) throughout the entire rendering process. Rather than say “1.0 is a bright thing, and, um, 4.0 is a really bright thing”, with photometric rendering there are real answers. The sun is about 120,000 lux. The blue sky might be 8,000 cd/m^2. A landing light on the 737 might be 200,000 candela at its peak intensity.
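
To make those units concrete: for a point source, illuminance follows the inverse-square law, E = I / d². A toy example using the landing-light figure above (the distances are mine, purely illustrative):

```cpp
#include <cstdio>

// Illuminance (lux) arriving from a point source of luminous intensity
// I (candela) at distance d (meters): E = I / d^2 (inverse-square law).
double illuminance_lux(double intensity_cd, double distance_m)
{
    return intensity_cd / (distance_m * distance_m);
}

int main()
{
    // The 737 landing light from above, ~200,000 cd at peak:
    std::printf("%g lux at 100 m\n", illuminance_lux(200000.0, 100.0)); // 20 lux
    std::printf("%g lux at 20 m\n",  illuminance_lux(200000.0, 20.0));  // 500 lux
}
```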

The idea behind photometric rendering is to have all elements of the scene be calibrated to real world values so they all match each other. That cloud should match that sky and that airplane because they’re all in the same units. No more tweaks to try to make things match.

We shipped an HDR renderer years ago, but the new pipeline is, well, more HDR. A lot more HDR. Because we’re working in real world units, we have to maintain a wide HDR image from beginning right to the very end when we tone map. The result is that every part of the scene can have a wide dynamic range.

While our shipping pipeline makes displays look dim during the day by artificially darkening them (to give the appearance of wash-out), the new pipeline simply draws everything in real-world units – the camera is set for a daytime exposure (in real EV units, like a real camera) and the displays look dim because of the exposure.
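
For the curious, the usual way a renderer turns an EV100 camera setting into a linear exposure multiplier is the standard photographic formula – a sketch of the convention, not necessarily our exact code:

```cpp
#include <cmath>

// Saturation-based exposure (the ISO 12232 convention): a scene luminance of
// about 1.2 * 2^EV100 cd/m^2 maps to the very top of the displayable range.
float exposure_from_ev100(float ev100)
{
    float max_luminance = 1.2f * std::exp2(ev100); // cd/m^2 that just saturates
    return 1.0f / max_luminance;                   // scale HDR color by this
}

// A sunny exterior is around EV100 = 15, so max_luminance is ~39,000 cd/m^2.
// A cockpit display at a few hundred cd/m^2, multiplied by that tiny exposure,
// comes out dark on screen - no special-case display dimming required.
```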

The new pipeline features a new tone mapper – one thing you might notice from the videos is that colors with the new pipeline are richer. This is partly because the new tone mapper (which works well with real-world illumination values and is HDR-display-ready) does a better job of preserving saturation.
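
A toy illustration of why the choice of tone-map operator matters for saturation – these are generic Reinhard variants, not our actual operator:

```cpp
struct RGB { float r, g, b; };

// Per-channel Reinhard: each channel saturates toward 1.0 independently, so
// bright saturated colors drift toward white (they desaturate).
RGB tonemap_per_channel(RGB c)
{
    return { c.r / (1.0f + c.r), c.g / (1.0f + c.g), c.b / (1.0f + c.b) };
}

// Luminance-based Reinhard: compress luminance but keep the ratios between
// channels, which preserves hue and saturation much better.
RGB tonemap_luminance(RGB c)
{
    float lum = 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b; // Rec. 709 weights
    if (lum <= 0.0f) return { 0.0f, 0.0f, 0.0f };
    float scale = 1.0f / (1.0f + lum); // Reinhard: lum -> lum / (1 + lum)
    return { c.r * scale, c.g * scale, c.b * scale };
}
```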

Sun and sky colors in the new pipeline are driven by a new atmosphere and sky simulation – they’re not painted textures. We get the sun color from the composition of the atmosphere and the relative position of the sun and scenery.

Finally, the new pipeline can run screen space reflections (SSR), dynamic exposure, bloom and other effects (still to be previewed), all running with real world photometric HDR values.

Why Photometric Rendering?

The photometric renderer gets us a bunch of visual quality improvements and new effects that were high on our TODO list (and, based on the feedback site, yours too.)

Photometric rendering also serves as a foundation for a bunch of other features that are high on our priority list. That will be a topic of a future preview and a future blog post.

About Ben Supnik

Ben is a software engineer who works on X-Plane; he spends most of his days drinking coffee and swearing at the computer -- sometimes at the same time.

73 comments on “Crash Fixing, Airports, Photometric Lighting”

  1. Thank you thank you thank you thank you thank you thank you thank you. I knew I wasn’t crazy when I thought the sim currently lit the world with a moderately powerful stadium light.

    Any thoughts of supporting or not going out of your way to defeat EDR as a target when rendering to Metal? //prolost.com/blog/edr

    Also thank you thank you thank you thank you. (Quietly, setting up airport runway use rules is hella confusing, I can do it, but some sort of logic evaluator in WED would be sweet.)

    1. EDR – I think we will be able to support EDR with this new pipeline – probably not in the initial release, but at some point we’ll do a pass to support EDR on Metal and the various HDR output options on Vulkan. We’ll also at some point hopefully support gsync/freesync 2 on Vulkan and adaptive refresh rates (which are coming to the Mac) for capable hardware.

  2. The new sky model sounds more computationally demanding (sweeet). I recall in previous descriptions of this part of the simulation (years past) that you referenced a ‘GPU Gems’ article/algorithm because it was light on the CPU cycles.

    1. A gajillion years ago I coded the O’Neil algorithm from GPU Gems 2.

      //developer.nvidia.com/gpugems/gpugems2/part-ii-shading-lighting-and-shadows/chapter-16-accurate-atmospheric-scattering

      I gotta give Sean credit both because this is the great-grand-daddy of the current sky algos and because he was so nice and helpful in answering questions. In the end we never shipped it – if you look at the GPU Gems pics, you can’t help but notice that the colors are pretty wrong. This is inherent in the first-gen algos – they don’t do enough math to get the optics right. Eric Bruneton’s paper that followed set the baseline for real-time skies that look good enough to ship.

      So X-Plane 10 and 11 use a series of artist-rendered skydomes from a wide variety of weather + times of day, then the sim has some math that tries to blend them together in sane ways.

      The new algorithm is strictly mathematical, with some tweaking possible by e.g. modifying the contents of the atmosphere – there’s no blue texture to paint purple-ish in Photoshop. Thanks to compute shaders, I don’t think it will be a performance problem.

      1. Hi Ben,
        Great news for the sky!
        Are you also working on the general atmospheric scattering? There are a few problems with the shader in XP 11:
        — First, I don’t remember exactly what equation you used for in- and out-scattering, but it causes really hard cutouts (overexposed colors at the horizon). The correct formulas should use exponentials with negative exponents.
        — Second, there was a very relevant comment inside the shader that said the art controls for in-scattering should be different from those for out-scattering. Duplicating these constants is indeed the only way to get correct sunset lighting AND a realistic blueish tint at the horizon.

        thanks for the great work!
        Pascal

        1. The new lighting pipeline is generalized atmospheric scattering, so it’s totally different from what we ship now (and different from what we ship on mobile). I believe there should be a lot fewer problems with “really strong scattering results in goofy colors” the way we have now.

          1. Fantastic! Thanks!
            If I may ask another related question… will the clouds be integrated into the general lighting model and be affected by the scattering? (I think this could be very important to make them believable, especially far away)

          2. Yes. To make clouds look right, the light reflected or emitted from the clouds to the camera must be scattered by the intervening air. Here in the US Northeast (where it is hot and humid and the visibility isn’t amazing) this effect is very noticeable – the darker bases of the cloud are more “sky blue” than grey.

  3. A sky that realistically blends gently from light to dark? A night sky that looks like night – no more halo-pixelated smudges? If there are no more painted texture referents for colors, this will bring the environment rendering in line with blade element theory flight physics. Just my humble prognostication. No wonder you’ve been so quiet, Ben. Thanks for the update. Got enough coffee? 😉

    1. That’s actually a good analogy – the flight model runs in real physical units – we try to stay SI internally. Some of the systems are still “Ratios” (oil pressure – 0 = min, 1 = max) but Austin and Philipp have, over the last few years, been moving to real units for that too. This moves the lighting model in the same direction.

      There’s still gonna be a LOT of Penn and Teller sleight-of-hand tricks – that’s just what real-time graphics are. :-). But having a solid foundation makes staging those tricks a lot easier.

  4. I would like to ask: is there any chance you’re going to change the texture maps for rendering? I am missing AO – there is still a free alpha channel in the LIT map – and I would also swap glossiness and reflectivity for roughness and metalness.

    1. We are looking at various extensions to the texture mapping schemes. AO in the lit alpha channel is an interesting idea – you can actually get AO (sometimes) now in the “detail” shader, but it’s only available for autogen.

      The problem with the lit map is that we don’t need it when we need AO and vice versa. Is your goal to do an AO bake?

      (The other problem with the LIT texture is: we have on our road map a second UV map for OBJs – a huge texture space optimization would be to let the lit texture have its own UV map and not map the whole UV space when most geometry is not emissive. But this would break re-use of alpha as AO.)

      If the LIT texture is compressed (it usually is) the alpha isn’t free – if there’s no alpha you can use DXT1 and cut VRAM in half. So we may need to find some _other_ place to stash AO.
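
      The arithmetic behind “cut VRAM in half,” as a sketch (block sizes come from the DXT format specs):

      ```cpp
      #include <cstddef>
      #include <cstdio>

      // DXT1 packs each 4x4 texel block into 8 bytes (no alpha channel); DXT5
      // spends 16 bytes per block to add alpha - hence "cut VRAM in half."
      std::size_t dxt_bytes(int w, int h, bool has_alpha)
      {
          std::size_t blocks = std::size_t((w + 3) / 4) * std::size_t((h + 3) / 4);
          return blocks * (has_alpha ? 16u : 8u);
      }

      int main()
      {
          // A 2048x2048 LIT texture, base mip only (a full mip chain adds ~33%):
          std::printf("DXT5: %zu KB\n", dxt_bytes(2048, 2048, true)  / 1024); // 4096 KB
          std::printf("DXT1: %zu KB\n", dxt_bytes(2048, 2048, false) / 1024); // 2048 KB
      }
      ```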

      1. I think the same concept applied to AO only would have considerable gains for XP. The two things that really drive up texture sizes for both aircraft and scenery are

        a) Rasterized AO bakes into the diffuse map
        b) Substitution for detailing methods (e.g. detail maps and decals).

        Having XP produce either…
        i) A second UV map for AO, which can be additive/subtractive based on the current lighting.
        ii) More liberal use of SSAO with a finer resolution, not applied just to external objects.

        …would go a long way to really condense down the size of assets and provide a more consistent visual look. But I’m just an artist, not a programmer.

        On a different note, the new photometric lighting looks absolutely sublime. Can’t wait to see what else has been cooking!

        1. Those are all pretty sane ideas – I think we expect long term to have a little more flexibility with UV maps and channels in the material system. Not a full node system, but also not the limits we have now.

      2. Hi Ben!
        Are you considering multiple UV maps in general for materials too? Right now the normals + PBR are on the same texture; a separation of the two would be nice + possibly better for VRAM (two-channel normal map compression).

        1. We’re considering multiple UVs for OBJ models – we haven’t decided how the UV maps would be tied to the materials yet. In an _ideal_ world, each texture input could select from the A or B channel UV map and you could set it up the way you want it. But we haven’t written this code yet, so I can’t say what limits we’ll hit.

          The new renderer does have _more_ channels of “stuff” on a material (more on that in a future post) so flexibility is becoming more of an issue.

      3. As Dellanie mentioned, we are currently using ambient occlusion in our base colour texture, which isn’t bad, but I feel that in the future separate AO would give users an even slightly better experience with the shadows. A separate grayscale texture would definitely do the same job.

        But I fully understand you with the optimization, that’s our priority as well.

  5. Hi Ben, hi LR! First of all, I want to thank you. I really like what I’m seeing, I think this improvement was really needed in XP.

    I know this is just a preview, the weather system might not be implemented yet, and it’s probably still WIP. But I’ve got a question.

    Will we be able to see blue twilights instead of purple ones? Or in other words, will the sky colors after sunset be different depending on the ambient conditions and/or altitude? I noticed that most sims (including MSFS, which I think uses a technique similar to the one you previewed) tend to give a purple tone to this hour, a color that I rarely see in real life – at least not at ground level. Instead, I find it more similar to the blue hour linked below (there are some pictures too). And I noticed those purple colors in the preview too.

    Also, I see those blue tones more accentuated on cloudy days, but that might be a coincidence, since this is outside my knowledge. Below I leave you some links about what I found.

    //en.m.wikipedia.org/wiki/Blue_hour

    //en.m.wikipedia.org/wiki/Chappuis_absorption

    (Ignore the photography part if you want – I liked the explanation of twilights here)
    //www.photopills.com/articles/understanding-golden-hour-blue-hour-and-twilights

    Again, thank you, and patiently waiting for new previews 🙂

    1. We’ll see – we may have the same purple issues that other sky simulation algos have … the down-side of an algo-driven sky is that if your art director goes “I want the blue hour to be blue” you can’t just paint it that way in Photoshop. Twilight and darker are very difficult times to model because all of the light is very low level and thus small effects matter a lot.

      1. I think this is an inherent problem in RGB space vs “spectral” space. In RGB, you might hit 255 for blue, and any additional red component will send it purple. In HDR you can minimize that by blue being 4.0, 10.0 or whatever and red being .05.

        But either way, when this is summed in RGB, is it representing “purple” or “violet?” If it’s violet, then you need an additional “eye transform” which greatly reduces the brightness and redness due to our limited sensitivity to violet. If it’s purple, then the red needs to be boosted if it’s on the orange side of red, or if on the “extra red” side reduced because we are relatively insensitive at the edge of our red vision.

        Another thing this new system won’t give us is the effects of scattering on fluorescent light sources (like bridge lights, the Verrazano specifically). Up close, the Verrazano is blueish white (mostly white) but a few miles away, it’s green. Since the spectrum isn’t continuous, and scattering is exponential with wavelength, scattering the blue away doesn’t make it cyan, it just skips to next wavelength. In RGB whitish blue, say 240, 240, 255, scattering will make the light white, then yellow!

        This will be various levels of broken until compute power and technology get to the point where we can have ROY(aquamarine)(cyan)(ozone blue)(spectral blue)Violet sub-pixels and an internal representation of light values that stays spectral until the last step, when it’s rasterized to the available sub-pixels.

        But artists can and will fake it well with custom code for special cases.

        1. Whoops, forgot a “G” in my extended sub-pixel palette. A yellow-green might be useful too.

        2. So first, I completely understand your point that RGB rendering isn’t truly spectral – as long as we’re in RGB, we can’t get black grass under a sodium arc-lamp, for example.

          But I don’t understand the Verrazano bridge case. If we have an out-scatter filter function as an RGB triplet and our light is blue-ish white, I’d expect to get something that’s more of a neutral white moving toward yellow as we out-scatter. In real life, is there very little red component in the lights, and if so, why doesn’t it look more aqua or cyan up close? I guess I’d expect the scattering code to be close, if not perfect.

          Currently I’m not out-scattering billboards – but that is something we can easily do in the new system…I’ll add it to my list of things to try and see how noticeable it is. The spill light _is_ scattered because it hits the surface first, then the reflectance back to the camera is always scattered.
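
          For reference, the kind of out-scatter filter I mean is just Beer-Lambert with Rayleigh-ish coefficients – a toy sketch using textbook sea-level values, not X-Plane’s actual numbers:

          ```cpp
          #include <cmath>
          #include <cstdio>

          // Beer-Lambert out-scatter over distance d: T(lambda) = exp(-beta(lambda)*d).
          // Rayleigh beta scales roughly as 1/lambda^4, so blue attenuates fastest.
          int main()
          {
              const double beta[3] = { 5.8e-6, 13.5e-6, 33.1e-6 }; // R,G,B per meter
              double light[3]      = { 240.0, 240.0, 255.0 };      // blue-ish white
              const double d       = 5000.0;                       // 5 km of air

              for (int i = 0; i < 3; ++i)
                  light[i] *= std::exp(-beta[i] * d);

              // The triplet drifts warm (toward yellow) with distance - which is
              // why a spiky, non-continuous spectrum like a fluorescent source
              // can't be reproduced faithfully by scattering the RGB values.
              std::printf("R=%.0f G=%.0f B=%.0f\n", light[0], light[1], light[2]);
          }
          ```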

          1. In the Verrazano case, I believe it’s “white” because it’s super freaking bright blue/green… maybe next time I go to my dad’s I’ll try to use a spectroscope.

            Once you get far enough away for the out-scatter, there is really just bright green, not much yellow/red in the spectrum.

            I am not criticizing x-plane btw. Any pipeline that does math to rgb will result in oddities. It’s like sampling a C major chord by only measuring D, F# and A#. It works pretty ok for wide spectrum black body, but falls on its butt for sodium/neon/cfl.

            The applicability to x-plane is just the purpling of blue sometimes, and lights at night not changing color as experienced in real life because the RGB scatters, not the “light”. This is only just starting to be modeled by physically accurate path tracers. I don’t expect it to come to shaders for *years*

      2. Understandable. Thank you for the response. I know if anyone can make it work, it’s the LR team. Anyway, it’s a good improvement 🙂

    2. +1

      I rarely see purples in sunsets. It’s mostly shades of blue/red/orange, yellow, green.

  6. Awesome!

    It’s amazing how game graphics developed from putting characters on the screen (anyone remember redefining characters for graphics?) and picking cyan for the sky and dark green for the ground from 16 available colours. Simulating lighting so close to real-world physics in real time wasn’t even thinkable back then.

    Thanks for letting us look a bit behind the facades again, Ben!

  7. Hi Ben

    After updating to version 11.55, I noticed a large drop in FPS (between 15-20) compared to the same situation in the previous version: Toliss A319 at the default LEBL airport, without clouds.

    1. This was a known issue with the Toliss release – they released a patch a short while afterwards.

  8. I am confused – the 11.53.1 patch is online, but not 11.55 as the release notes describe. Is the new update not online yet?

      1. Yeah we don’t just give the beta to everyone without asking, nor do we spam everyone with “HEY, you wanna try some new beta software????”

        But once you are IN the beta, all future betas will auto-update so your beta stays current.

  9. Hello! The new lighting looks awesome. But will it be in X-Plane 11, or already in a new version of X-Plane? And if it’s not a secret, can we expect a beta soon? I check all the news every day and look forward to that day.

    Thank you!

    1. I’m sorry, but I just can’t make date announcements or anything like that. I’d say the new lighting is too big of a feature for a patch.

      1. When you say ‘too big of a feature for a patch’, is that a noncommittal way of hinting at something about X-Plane 12?

  10. Hi X-Plane team…
    Congratulations on fixing the bug and also releasing new Gateway airports.
    Please, the update never mentioned anything about the X-Plane Mobile bug that has been reported since April. When should mobile (Android) users expect an update?

  11. Hello everyone,

    Is the X-Plane 11.55 update already on Steam?

    Thanks

    David

  12. Looks fantastic!
    Will the upcoming changes to x-plane take account of the lack of new GPU availability?
    I’m still on a GTX 1070 because upgrading is prohibitively expensive (if I can find one!)
    Kind regards

    1. Sigh… it’ll run on a 1070 – that’s what I have. I can definitely say that we’re _annoyed_ that crypto is driving up GPU prices, and it’s a very different environment from five years ago, when owning a top-tier GPU wasn’t a reach. (There were also fewer tiers of GPUs, so the top tier wasn’t quite the cash-ectomy it is now.)

      With that in mind, we are still trying to move work to the GPU – GPUs have gotten expensive but even the intermediate ones are _really good_ and their capabilities are better revealed – now that we have Vulkan and they have async compute, we can squeeze a lot more out of them.

      1. >and they have async compute

        Angels came down from heaven and played that sweet music.

        >a 1070 – that’s what I have

        sad face.

        ___
        >dates

        While not wanting to rain on this all-star parade: there are a lot of known issues in the current 11.55 release notes, and quite a few known issues not in the release notes. Is there an expectation that these will eventually be resolved in the current version, or is the plan now all “next gen”, with them dealt with in a new major version?

        Or to put it another way:
        Should we now expect XP to get more buggy rather than less for the foreseeable future?

        On the one hand I want everything you just said six months ago; on the other hand I’d love to see some of the more annoying bugs fixed first – dark contrails, what’s left of the temporary terrain tears, and the broken windsocks.

  13. I would like to say that all the posts from Ben are very important to the X-Plane community. This long-awaited post is just a photometric glimpse into a future where X-Plane’s existing aerodynamic model, based on blade element theory, is matched by a mathematical lighting model as well. An exciting perspective. Thanks for the very curious movie and the slightly open door we could see it through.

  14. Under the new lighting model, will Laminar have to adjust their default scenery textures to account for the new lighting? And will those of us using ortho need to make any adjustments to our textures?

    1. Probably not. We may adjust our default textures to _make them better_ (I personally think the default textures could use some color management, but I leave these decisions to the art team because they can see more than 6 shades of color). I think the intent is that if you have reasonable sRGB albedo values in your albedo texture, you’ll be fine.

      E.g. if you took a photo of an X-Rite color checker, mapped the photo to sRGB in Photoshop, and color-corrected it so the swatches match what X-Rite publishes, you’d have a realistic sRGB albedo for your OBJ. That OBJ should look realistic and good in the new color pipeline.
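
      If you’re doing that calibration by hand, note that a photometric renderer lights with _linear_ albedo – the decode from authored sRGB is the standard transfer function. A sketch of that standard formula, not our exact shader code:

      ```cpp
      #include <cmath>

      // Standard sRGB-to-linear transfer function (IEC 61966-2-1). A photometric
      // renderer lights with linear albedo, so authored sRGB texels get decoded
      // roughly like this before any lighting math runs.
      float srgb_to_linear(float s) // s in [0, 1]
      {
          return (s <= 0.04045f) ? s / 12.92f
                                 : std::pow((s + 0.055f) / 1.055f, 2.4f);
      }

      // E.g. an 18% gray card authored at sRGB ~118/255 decodes to ~0.18 linear -
      // the albedo value the lighting math actually uses.
      ```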

  15. Hey Ben,
    Nice progress! It would be really nice and helpful for scenery developers if you could integrate multi-material export so we can export multiple objects that use two or more textures/draw calls.
    Importing each model with different specific textures is a really tedious process.

    1. What’s the use case? Do you have a big show-piece (e.g. an entire international terminal) built out of a few atlases, or are you trying to use your “brick” texture and your “concrete” texture on the same OBJ?

      1. While I’m not the one who asked, what you mentioned, Ben, is something that would be really helpful IMO.
        Think of building a terminal: it would be great to have stuff like clutter (AC units, furniture, etc.) in a big atlas that can be used across multiple buildings.

    2. Multi-material export is already possible by using an .agp file, which can act as a container for unlimited objects. For example, a big terminal where we have glass, metal,
      bricks, concrete, etc.: all the materials with their own textures should be organized into a Blender root collection, then exported as X-Plane .obj files and collected into an .agp file.

      1. I should also say – with the AGP you get a ground ortho layered on for free with pretty good horizontal alignment to the ortho.

  16. Hello,

    All this news sounds great to me.
    Just a question: can you give us some news about having X-Plane native on Apple Silicon? 🙂

    Kind Regards.

    André

      1. Thank you Ben, great news!
        I am now wondering how it performs – whether it is really better than x86 and what performance gain you get. 😀

        Kind Regards.

        1. You gotta compare Apples to Apples. (See what I did there?) And in the one comparison we have (an Apple M1 in a low-power use case vs. some kind of Intel i5 or i7 at low power, GPU on the CPU die) the M1 is definitely better. Faster for less battery use. No funny business – the M1’s a really great chip.

          How will a “bigger” Apple chip compare to an Intel or AMD gaming chip? Dunno, we haven’t seen one yet. But I suspect it will do well – since heat and power are the limiting factors on chips, Apple being better in perf per watt is, I think, going to turn into advantages across the board.

          The other thing that surprised me is the M1’s GPU does surprisingly well with GPU code that is not at all optimized for a no-VRAM tiling GPU that came from the mobile space. It’s not amazing, and it’s not replacing a discrete GPU with VRAM right now, but it punches above its weight and is a lot nicer than the Intel GPUs we’ve had on the mobo.

      2. Amazing – perhaps you can ignore my new post below! Any chance of beta access?

  17. Love reading this – it satisfies my inner geek. I am new to X-Plane and so far have been impressed by how the demo version works on my MacBook Pro M1 with 16GB RAM. Are you able to say if/when Laminar Research will release a universal binary that will work on both M1 and Intel without needing Rosetta translation? I’m itching to buy and that knowledge would tip me into clicking “Buy”.

    1. We are running native/universal2 in the lab already. So it’s a question of when it becomes available, not if.

      It doesn’t make as much of a difference as you might think, though. Yes, the native X-Plane runs faster than the Rosetta one on my Mac mini, but that is due to no plugins and crap running, because crap doesn’t come in universal2 yet. The frame gain is due to not having plugins slow things down, not due to the lack of Rosetta 2. Which actually just shows how damn impressive Rosetta 2 and the M1 chip really are.

  18. I’m late, because we had a heat wave here (sweating all day even with the computer off).
    Two comments:
    First, on the video: while I like classical music too, I don’t like that background sound. My proposal for a substitute would be Bach’s Air on the G String, maybe by Nigel Kennedy.
    Second, on the new model: so you basically changed the light definitions; what about the materials? Do they still have colors, or do they have a reflection function now? Likewise, if the view is like from a camera: can the user apply some exposure correction? Custom gamma?
    What about user-defined sunglasses?

    1. The material model is the same, and it already has a reflection function _and_ color.
      I expect we will ship a user-settable brightness control, but we are not going to ship a user-controllable gamma curve. Changing gamma is not a good proxy for brightness management.

  19. Got two questions:

    1) Will the photometric rendering benefit from hardware-accelerated ray tracing in the new GPUs from AMD and Nvidia?

    2) As far as this topic focuses on the rendering pipeline and promises better clouds: how does it relate to Austin’s recent mention of next-gen scenery? I can understand that the scenery will look fresher under the new rendering pipeline, but what is the intent or definition of next-gen scenery, apart from textures? I’d be happy to hear a bit about that.

    1. Photometric lighting is orthogonal to HW-accelerated ray tracing – they can be combined, but you could also build a non-photometric ray-traced renderer or a photometric renderer without ray tracing.

Comments are closed.