Nvidia announced their latest bitcoin graphics cards on August 20th at Gamescom this year. Along with the usual increase in transistors, they also disappointed all crypto miners by adding a feature that cannot (yet) be used to calculate cryptographic hashes: Ray Tracing! Ray tracing has long been seen as somewhat of a holy grail of graphics rendering, because it’s much closer to replicating the real world than traditional rasterization and shading. However, doing ray tracing in real time has been close to impossible so far. But hey, Nvidia just announced their new RTX GPUs that can do it, so when is X-Plane going to get a fancy ray traced renderer? This and various other questions that X-Plane users have asked, as well as some myths, shall be answered here! If you have a question that isn’t answered, feel free to ask it in the comments.

What Nvidia has shown is absolutely impressive. The fine print behind all the marketing hype, unfortunately, is that it can’t just be thrown in without engineering effort. The first thing needed is actual RTX hardware, which no one at LR currently has. The second thing needed is a Vulkan-based app; we are getting there, but not in any way that would support RTX. (The whole goal of the Vulkan renderer is to not change the way the world looks, so we’ll first need a shipping production Vulkan renderer.) But then… well, it’s not entirely clear what it takes to actually write a ray traced renderer in all of its details. Nvidia has not yet published the specification for the Vulkan extension (VK_NV_raytracing), but they have published slides from presentations. One thing is very clear: you can’t just copy and paste five lines of Nvidia sample code and suddenly wake up in a ray traced world.

What Nvidia provides is the scaffolding necessary to describe a scene, as well as new types of shaders that can cast rays from point A to point B and report back what they hit along the way. That is a huge amount of work the hardware is doing, but it’s not the promised “5 lines and you’ll have ray tracing in your application”. To adopt ray tracing you still have to write the whole ray tracer yourself, from scratch; the hardware just makes that feasible now. This is akin to implementing HDR or PBR: shaders are the base requirement for both, but once you have shaders you still need to actually implement HDR or PBR on top of them. Another analogy is being given a plot of land that can support a house. Sure, it’s great, now you have a place to build, but you still have to come up with a blueprint, pick materials and then actually build the thing. Implementing ray tracing will take a great amount of engineering effort; nobody is throwing in awesome reflections for free with every purchase of an RTX 2080 Ti!
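To make the house-building analogy concrete: the heart of any ray tracer is still an intersection routine plus a “closest hit” decision that you have to supply yourself. Here is a deliberately tiny CPU-side sketch in plain Python, purely illustrative; the real thing lives in ray generation and hit shaders on the GPU, and none of these names come from Nvidia’s API:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    `direction` is assumed to be normalized.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def trace(origin, direction, spheres):
    """Report the closest hit along a ray, like a 'closest hit shader' would."""
    best = None
    for center, radius, name in spheres:
        t = intersect_sphere(origin, direction, center, radius)
        if t is not None and (best is None or t < best[0]):
            best = (t, name)
    return best
```

What the RTX hardware accelerates is roughly the inner loop of `trace()` (against far cleverer data structures); what you do with the hit — shading, reflection bounces, shadow rays — is still entirely on you.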

The other thing that’s not entirely clear is how well ray tracing will even perform in an environment like X-Plane! Worlds in X-Plane are huge and open, not the small, tightly spaced scenes of a shooter. Lots of rays are needed, and they have to travel quite far, potentially intersecting large amounts of geometry. How well do the hardware and API scale up to these sizes? Only time will tell. That’s of course not to diminish Nvidia’s achievement here; it’s an incredible feat of technology in its own right, and this is just the first generation!

The other thing worth mentioning is that ray tracing is not just something Nvidia secretly cooked up in their basement for a decade. This is going to be an industry-wide thing, with APIs that will work across vendors! Historically, one vendor has come out with a fancy new way to do things, which then became the standard adopted by other vendors. Nvidia has come forward and offered their extension as the base for a core Khronos extension for Vulkan. They have a vested interest in making a cross-vendor, cross-platform API available.

Rasterizing renderers are unlikely to go anywhere in the foreseeable future. Rather, ray tracing can for the time being be used for additional effects that are otherwise hard to achieve. Clearly Nvidia is acknowledging this as well by providing a traditional rasterization engine that is by itself more powerful than previous-generation ones. This also means that if X-Plane were to adopt ray tracing tomorrow, you could still run it on your old hardware; you’d just get extra shiny on top if you have ray-tracing-capable hardware.

Last but not least, this is another reason why you should stay away from the shaders! One day we’ll wake up in the glorious Vulkan future which will open the door to the glorious ray tracing future. All of this means that we’ll have to keep changing our shaders.

About Sidney Just

Sidney is a software developer for X-Plane; As the Vulkanologist he is an expert on Pyroclastic flow and bitcoin mining hardware.

43 comments on “Nvidia RTX ends with an X, X-Plane starts with one. Perfect match?”

  1. great post btw

    Yeah, getting a solid base of a working Vulkan renderer comes first, but things like single-source lighting are huge for flight sims, as is getting cockpit lighting correct.

  2. I’d say get your Vulkan/Metal adoption over with and then focus your limited resources on more important things than adding more eye candy….

    Such as:
    – Extending the DSF spec so it allows scenery designers to finally patch the mesh in a practical and standardized way.
    – Provide a common scenery packaging format and built-in management interface that makes installing and managing custom scenery straightforward (ordering, exclusion debugging, switching between different base meshes etc.) without requiring the normal user to become an expert in X-Plane’s internal scenery formats and concepts.

    Just my $.02

    1. “I’d say get over with your Vulkan/Metal adoption and then focus your limited resources on more important things than adding more eye candy….”

      That’s the point of the whole post.

      In 2 years this blog will be infested with “implement ray tracing!! your sim sucks” type of comments.

  3. Hi Sidney,

    Thanks for the great write-up. Do you guys plan to increase the update rate of the current cubemap reflections or add cubemap blending so that the reflections don’t look like they’re running at 1 fps?

  4. Please answer me!!

    I have read a lot about Vulkan and what it will do (or what it is expected to do)

    I’m not here to ask for the launch date on X-Plane 11.

    I only want to know if, with Vulkan, we can expect at least a slight improvement in frame rates in the virtual cockpit, for example.

    A greeting

    1. Nobody knows when they will release X-Plane 11. Not even Laminar Research. BTW – I posted this message from the past, in 2009…

        1. @Coda and @Jeff

          “…on x-plane 11” not “…of x-plane 11”!

          The original poster is asking about release of vulkan ON x-plane 11 not OF x-plane 11 itself.

          If I remember correctly from the Las Vegas videos, Ben mentions the goal is to release in 2018… at the earliest 😉

  5. The issue of jagged shadows, inside and outside the airplane, bugs and bothers flight simmers on both platforms alike, X-Plane and P3D… (yes, I know this is an X-Plane blog)

    This is one shiny area where the new generation of GPUs can come to the rescue, through AI + ray tracing.

    For example, the RTX offloads anti-aliasing from the GPU’s shader cores onto the tensor cores, thus increasing frame rate by de-stressing the GPU… and there is more…

    Most importantly, 4K 60 fps will be mainstream in a matter of a couple of months.

  6. We are at the beginning of a new era. Ray tracing is the future, and it is fantastic to see Nvidia taking the risk to push this tech already in 2018. Ultimately ray tracing will make the job of engine and content developers easier, as much less fakery is necessary to achieve pleasant (read: physically correct) results.

    There are many ways in which even partial implementations of RT could benefit a flight simulator. Correct & soft cockpit shadows, cockpit night lighting. Real-time physically correct material reflections in the cockpit and for the external A/C models to name a few.

    I can’t wait to see this tech coming to X-plane in the not too distant future. The future is indeed looking bright!

  7. Hi guys,

    Awesome post and looking forward to the future ray tracing goodness that will hopefully come to XP in time!

    In the meantime, and kind of related to new technology so hopefully relevant here, I wonder if you could please advise on the best setup for an i9-7900X in terms of ITBM v3. My BIOS has a native mode, using windows drivers, and a mode requiring installation of the ITBM utility. The latter results in XP11 hammering a single core at 100% with some light work on supporting cores, whereas the former hits all 10 cores with a similar load.

    Which is the correct configuration for XP11 please?

    Thanks!

    Ady

  8. Does the current raster render engine ever cast rays for any reason? For example, shadow directions? Or is it all 3D to 2D transformation matrices?

    1. The rendering engine doesn’t cast any rays. I have seen some confusion online about ray tracing and real time reflections/shadows, with some people claiming that ray tracing has been the standard for years now and everyone has been doing it, because they show real time reflections. The truth is that current hardware is really bad at real time ray tracing, especially at the resolutions required. All games that I know of, including X-Plane, use a normal rasterizer to render everything.

      Reflections are achieved by having a second camera render the scene again into a texture, and then using that second camera’s output when drawing the final scene with reflections.
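      The mirroring math behind that second camera, for a flat water plane, is simple. A sketch in Python (illustrative only, not X-Plane’s actual code; the names are made up):

```python
def mirror_across_plane(point, plane_height):
    """Reflect a 3-D point across the horizontal plane y = plane_height."""
    x, y, z = point
    return (x, 2.0 * plane_height - y, z)

def reflection_camera(cam_pos, cam_dir, plane_height):
    """Build the second camera used for planar water reflections:
    position mirrored below the plane, vertical view direction flipped."""
    mirrored_pos = mirror_across_plane(cam_pos, plane_height)
    dx, dy, dz = cam_dir
    return mirrored_pos, (dx, -dy, dz)
```

      Rendering the scene from that mirrored camera into a texture, then sampling it when drawing the water surface, is what produces the reflection.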

      1. I think I would have a hard time telling the difference between the current X-Plane PBR shaders and a ray-traced version. All the demo videos with RTX have mainly focused on fancy reflections, which aren’t a huge fraction of the massive visible world encompassed by X-Plane. Dirty airplanes don’t produce clear reflections, which means fewer rays. Water can be decently approximated without rays. Seems like a hybrid raster+ray render engine might work nicely. I wonder if such a thing is even possible?

        1. Dirty planes don’t necessarily lead to fewer rays, and in the worst case you’ll have a bare-metal livery like the old Lufthansa ones. People would be rather upset if liveries performed differently from each other. The nice thing about PBR is that every pixel takes the same amount of time to compute, no matter if it’s super shiny or super dull.

          Regardless, hybrid renderers are the only thing that seems to be possible with the new RTX cards. The RTX engine and the normal rasterization engine on the cards run in parallel, and the two results are merged afterwards for the final scene. So that’s definitely something to consider when first adopting the new technology: how big is our ray budget, how much geometry can we throw at it, and where can we use it most effectively?
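          To put rough numbers on that ray budget (back-of-envelope only, using Nvidia’s ~10 gigarays/s marketing figure, which real scenes almost certainly won’t hit):

```python
def rays_per_pixel(gigarays_per_second, fps, width, height):
    """Back-of-envelope ray budget: how many rays per pixel per frame
    a given ray-throughput figure leaves you at a given resolution."""
    rays_per_frame = gigarays_per_second * 1e9 / fps
    return rays_per_frame / (width * height)

# At a (hypothetical) sustained 10 gigarays/s and 60 fps:
#   1080p (~2.07 M pixels): roughly 80 rays per pixel per frame
#   4K    (~8.29 M pixels): roughly 20 rays per pixel per frame
```

          Shadows, reflections and anti-aliasing all eat out of that same per-pixel budget, which is why the question of where to spend rays matters so much.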

  9. Currently reflections kill fps: I can go from 50 fps with no reflections to 30 fps with low reflections, which is a shame, as reflections really add to the scenery where there is water.
    Will ray tracing help with this, and/or with object shadows, which are another fps killer, or do reflections and shadows need some other fix?

    1. It should help; cube maps are slow and take a lot of GPU time to render.
      It would be offloaded to its own hardware, in the case of Nvidia.

  10. Spoiler-alert!
    I have been engaged in computer graphics in some way since the ’80s, so I am as excited as most of us here about the idea of real-time ray tracing. And hearing about RTX, my first thought was “what about X-Plane?!?”.

    But, let us face reality before we ruin the devs’ nerves by asking about RT on every topic from now on:

    1. The announced RTX cards really seem to be performance monsters, but nobody has seen them work in a real environment yet. And, as far as I got it, the supposed first games using RT will do some hybrid rendering, not pure RT.
    I believe, as with every new technology, it will take _at the very least_ one whole generation of RTX cards for the new tech to really start working.
    Not to talk about drivers, more sophisticated shaders and tuning tricks that have to evolve over some time (probably a few years).

    2. As Sidney said, XP has to go to Vulkan first. And then that has to be optimized for better performance (to make VR users and everyone else happy).

    3. Repeating again what Sidney said (as some may have overlooked it), flight sims are very different from most other games: tons of objects placed over a huge area. And all of them can be seen from very far and very close, with no limitations on where the camera is.
    RT needs to test a ray for every pixel on the screen (2 million in HD, 8 million in 4K) and see which object (and which triangle of the object’s model) the ray hits first. And then it is not over at all; usually lots of reflection rays have to be tested too. All that for _one_ pixel of millions, for every frame.
    The amount of work for the GPU really explodes with the number of objects. (To have at least a chance of reasonable FPS, Ben and his colleagues will have to find some very smart ways of grouping and ordering all the houses and trees.)

    So, let us stay excited but also be very patient! Ray tracing in X-Plane is not around the corner… (Sidney, please correct me, if I’m wrong!)

    1. RTX is a hybrid of raster and ray tracing; it’s not using it for everything, only where you want it to.

      Also, it doesn’t care how many objects are in the scene, only how many rays you can cast before it has to kick out the next frame.

      1. That’s unfortunately not true. The hardware will have to do some kind of spatial bucketing of the geometry supplied to it, and then determine what bucket a ray falls into. For each object in the bucket, the hardware has to check if the ray intersects or not. You don’t get unlimited objects free of charge, there is a limit. Not to mention that the further the ray has to travel, the more buckets potentially have to be checked before it intersects with anything.

        There might also be a limit on how big a level can be. We don’t know what kind of bucketing Nvidia uses internally on their hardware. We also don’t know how much memory the GPU has available for geometry.
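        A toy illustration of why ray length matters with spatial bucketing. This uses a uniform 2-D grid and a crude fixed-step march purely for illustration; real hardware almost certainly uses an exact traversal over something like a BVH:

```python
def build_grid(objects, cell_size):
    """Bucket objects (point, name) into a uniform 2-D grid by cell index."""
    grid = {}
    for (x, y), name in objects:
        key = (int(x // cell_size), int(y // cell_size))
        grid.setdefault(key, []).append(((x, y), name))
    return grid

def cells_along_ray(origin, direction, max_dist, cell_size):
    """Crudely walk a 2-D ray and list the grid cells it passes through.
    Every listed cell is a bucket whose contents must be intersection-tested."""
    cells, seen = [], set()
    steps = int(max_dist / (cell_size * 0.25)) + 1
    for i in range(steps + 1):
        t = i * max_dist / steps
        key = (int((origin[0] + t * direction[0]) // cell_size),
               int((origin[1] + t * direction[1]) // cell_size))
        if key not in seen:
            seen.add(key)
            cells.append(key)
    return cells
```

        Doubling how far a ray travels roughly doubles the number of buckets it visits, and every object in every visited bucket is a candidate intersection test; that is the cost that huge open worlds run into.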

    2. You hit the nail pretty much on the head! Personally I’m super excited for ray tracing! Unlike Ben I’m young enough to not view everything new with cynicism and can take it for the first gen tech that it is.

      Sure, RTX is definitely meant as a hybrid, which is why Nvidia _still_ has a fantastically powerful rasterization engine on the chip. But this is definitely the future and something that we as developers need to at the very least be aware of. It remains to be seen how useful the new first gen RTX cards are going to be, and what the other GPU vendors will come up with. Right now, all we can do is speculate how it will work and affect X-Plane, we need real hardware and real benchmarks to get an accurate picture.

      1. I really wonder how that hybrid rendering works. Is it like doing normal rasterization and bringing RT in when there is some reflecting surface? I’d believe that would be very ineffective and would require a lot of redundancy in VRAM.

        I also wonder if the visual quality of the first gens of real-time RT games might not be a little disappointing. Of course, reflections and shadows should be much better, but what about sharpness and clarity (something that rasterization is very good at)?
        When keeping the number of rays manageable, ray tracing becomes very grainy and noisy. Nvidia built in an obviously very good noise reduction to tackle that (in some demos they turned it on and off to show it). But even the best algorithms will make the image somewhat blurry. Ok, with motion you get away with a lot of blurriness, and we’re used to that with our mobiles and YouTube, but in games (and sims, forgive me…) we are used to perfect sharpness.

        Well, as you said, all a lot of speculation. We have to wait how that all works out in reality…

      2. Ray tracing is the technology of the future…and always will be! 😉

        Seriously though, I’m always excited to see new fixed function parts of the rendering pipeline…these programmable shader thingies have been given carte blanche for WAY too long. First tessellation, then ray casting. The future is fixed function!!

      3. Quote
        Unlike Ben I’m young enough to not view everything new with cynicism
        unQuote

        I recall playing MS Flight Sim ver 1.0 in 1990 on my 8086 with a CGA graphics card. I was 12 years old, but even with the poor graphics and vague, almost black scenery, it set off my wild imagination to play it at night in a dark room and feel like I was really there.

        Things have evolved, and will keep evolving… today the hunger for a more immersive experience keeps the ball rolling… you have a lucky starting point.

  11. Let’s just focus on more urgent graphical updates like clouds and water rendering instead of dreaming about real-time ray tracing.
    Please take a look at the Unigine 2 Sim cloud engine, or maybe license some 3rd-party tech.

    1. I’m really curious as to what Active Sky for X-Plane will deliver. If they are able to fix all this, it might not be justifiable for Laminar Research to work on it too, even though I totally agree with you.

  12. “One significant benefit of ray tracing is that its speed is minimally impacted as scenes get larger, unlike rasterized graphics. Each tree shown has 2 to 4 million triangles, over 20 different trees, making for about 100 million unique triangles along with the terrain. The forest then has over 80,000 instances, making the rendered scene over 300 billion triangles. All of this is without any culling or swapping, or level of detail – meaning all that geometry is there; all the time. And Lavina will eventually be able to handle much more geometry, as it will also support scenes too large for GPU memory by running “out of core” and using system RAM while maintaining most of its speed.”
    Phillip Miller, Chaos Group

    Ref:
    https://www.chaosgroup.com/blog/ray-traced-tendering-accelerates-to-real-time-with-project-lavina

    1. I think that blog post is a bit misleading about real-time ray tracing tech.
      1. By definition everything that is real-time ray traced on the new GPUs has culling – the culling hardware is _built into the GPU_. That’s the whole point of this new hardware – we get fixed function ray collisions in hardware and NV can speed that up by throwing more transistors at it in the future, or by using better algorithms. But let’s not kid ourselves – this “giant scene” is culled just like a rasterized one. (And note that plenty of rasterization-based 3-d engines are culling on the GPU and have been for a few years now – see the AMD Froblins demo a while back)

      2. It’s impossible to comment on these results via a YouTube video, because the video’s compression makes it impossible to evaluate anti-aliasing and temporal stability. (The temporal stability and anti-aliasing may be fantastic and it’s just the video codec chewing things up.) But naively, if you go to shoot rays at a tree with a million triangles and the tree is “really really small”, you have three choices:
      – Shoot a LOT of rays per pixel to anti-alias the tree. This will be slow – your FPS falls in proportion to all of the sub-pixel geometry you’re trying to render. This is great for non-real-time movie renders but not good for a game.
      – Shoot one ray – it will alias as the camera moves and slight camera movements change what leaf you’re hitting. This would result in flickering and shimmering.
      – Use some kind of proxy geometry when the tree is far away (e.g. a billboard with a bake of the tree). This is the standard solution for rasterizers and would work just fine for ray tracing too.

      In both of these cases (culling, LOD), they are problems for _both_ ray tracing and rasterization – they’re hard problems and there is no free lunch. So my pointing out that “this is a problem with ray tracing too” doesn’t diminish the value of ray tracing acceleration hardware. If anything, I am all in favor of seeing culling move to a fixed function unit – it’s a central rendering problem and I like having someone else invest a ton of resources in a specialized unit to do better than I can.*

      But this kind of “RTX will change the laws of physics for rendering” post is misleading; there’s no free lunch here.

      X-Plane’s engine is designed differently from shooter games because our scenes are much larger and our camera angles are more varied; that will be true in a rasterized or ray traced world. Since (in a hw ray traced world) the culling is being done at least partly by the GPU, it remains to be seen whether the hardware culler can adapt to our kinds of work-loads.

      * Typically a compute shader runs a culling pass against a crude depth buffer and writes out draw-indirect data using unordered shader writes with atomics to track the instances. This is the more modern version of a geometry shader being used to write out instance buffers and drop whole instances. My view is that culling _and_ ray spawning are both similar to tessellation: when the ‘number’ of things going on is going up and down, it’s better to let the GPU guys do their own hardware. Hence rasterization, tessellation, RTX in hardware. The vertex and fragment shader are 1:1 input/output blocks and thus can be efficiently sent out to the compute farms.
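      The cull-then-compact idea in that footnote can be sketched on the CPU. This is only an illustration in Python, not X-Plane code; on the GPU the append is an atomic counter bump into a draw-indirect buffer:

```python
def cull_instances(instances, depth_buffer):
    """CPU sketch of a GPU culling pass: keep an instance only if it is
    nearer than the occluder depth recorded for its screen cell, then
    compact the survivors into a draw list."""
    draw_list = []
    for inst in instances:
        occluder = depth_buffer.get(inst["cell"], float("inf"))
        if inst["depth"] <= occluder:      # crude depth-buffer test
            draw_list.append(inst["id"])   # "visible": emit a draw record
    return draw_list
```

      The interesting part is exactly what the footnote says: the output size is not known up front, which is why this kind of variable-rate work (culling, tessellation, ray spawning) benefits from dedicated hardware.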

      1. Thanks Ben for the educational insight.
        Indeed, that is a realistic look into the technology’s inner workings, as perceived by developers who really know where the wheels meet the rails, in terms of the underlying hardware interfacing with the software layer.

        The Froblins Demo (reminding of Goblins 1991 from Coktel vision).

        In real life, light is effectively an infinite resource; in hardware, 10 gigarays is still a limited budget which should be wisely spent on target objects based on some filtering criteria (like you said, akin to tessellation) that scale with an object’s distance to the camera.

        Yes true, as you have rightly stated:
        “Shoot one ray – it will alias as the camera moves and slight camera movements change what leaf you’re hitting. This would result in flickering and shimmering.”
        They are relying on the tensor AI cores to do the denoising image stabilization, but mainstream users do not have any AI-capable cards yet.

        I think that over the last month very few people have purchased an NV Volta or AMD Instinct… maybe a couple of years from now the situation will change.

        At best, we hope this new technology to give the airplane interior and airport environment a new depth.

        You are doing great to bring the whole engine to the next level.
        Keep up the good work

      2. What it looks like nV is doing is letting you set how far away you want things to be ray traced; they showed this in the BF V video, where some things far away weren’t showing reflections at all with RTX on.

        So there is your solution: once you get X meters away you just don’t render shadows or reflections for that object, which for a flight sim is fine. Once you’re up at, say, 1500 feet or so, I couldn’t care less about shadows on the ground, vs. when I’m on approach and right above buildings and trees.

        tl;dr you can set a clipping plane for how far away you shoot your rays

        1. To refine your “solution”, how would you then handle the shadows and reflections of mountains and clouds, for example?

          I think it would not be very convincing to see trees on the shore reflect on water, but not the clouds and not the mountains in the distance.

          Or think about cloud shadows.

          And BTW, “you just don’t render shadows or reflections for that object” is not how ray tracing works.

  13. The first few paragraphs essentially say it’s really difficult and you need a Vulkan this before you can have a ray-traced that. Looking back at X-Plane 10, the discussion was about 64-bit, and the developer article suggested a 64-bit architecture would only become worthwhile if navigation was celestial. But hey presto, X-Plane 11 is 64-bit…. Why can other software developers make 64-bit, Vulkan-ready titles when they are students (stage 9)? Given its scale (competing with Lockheed Martin), why doesn’t Laminar Research have access to the latest development tech before the general consumer? Finally, why can EA Games have all the tech in place while you are explaining to your following why it is so hard?

    1. Because they “don’t want to get in bed with Nvidia”, which I think is silly.

      Nvidia would throw money and dev help at LR if they asked for it; the catch is LR would have to put NV marketing on their product, which I don’t think is that much of a drawback.

      The other issue for the longest time was that LR wanted to keep things the same across OS versions, but with Apple pulling out of the performance computing world that doesn’t matter now.

      I still think LR should just drop any idea of supporting Metal and focus on Vulkan alone with OpenGL as a fallback, and then Vulkan only for XP12, whenever that happens.

      The thing is, a lot of 3rd-party devs don’t support their add-ons on Mac anyway, and not at all on Linux; it’s just not worth the time it takes for the 10 people that would be using it.

    2. What developer article are you reading that suggests 64 bits would only be useful for celestial navigation, exactly? Do you have a link? (It sounds like you might be mixing and matching 64-bit address spaces with 64-bit coordinate systems.)

      Anyway, I appreciate that you’re asking why we don’t get new tech out as fast as a company that brings in five billion dollars a year…it feels great to be called up to the big leagues!! 🙂

      1. Well, right now nV is throwing tons of their people at any devs willing to step up and put DLSS and RT in their stuff, for free.

        It’s just an email away.

        At the very least you guys should look at DLSS, since all you have to do is send nV a copy of the sim to train the AI, and nV sends you back the code to add.

Comments are closed.