One of the biggest requests we received with the previous VR previews (1, 2, 3, 4, 5) is to allow plugin-created windows to join the VR fun. I’m happy to say the initial implementation of this—allowing windows to float in 3-D space, Minority Report-style—is live in VR6. The changes are part of the SDK version 3.0.1 release. With XPLM301 defined for your project, you can use VR as “just another positioning mode,” similar to how we treat pop-out windows:

XPLMDataRef vr_dref = XPLMFindDataRef("sim/graphics/VR/enabled");
// Guard against a null dataref on sim versions that predate VR support
const bool vr_is_enabled = vr_dref && XPLMGetDatai(vr_dref);
if (vr_is_enabled)
{
    XPLMSetWindowPositioningMode(my_window, xplm_WindowVR, 0);
}

When a window is added to VR space, it will be attached to the user’s head-mounted display (HMD)—as they move their head, the window will move with it. The user can then move the window into a fixed position (relative to the aircraft) by grabbing the window with their controller (or the mouse, in 3-D mouse cursor mode) and dragging it around. This mirrors the way the built-in ATC window works right now.
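Putting the pieces together, here is a minimal sketch (assuming the XPLM301 headers; the window bounds and draw callback are placeholder values, and a real plugin would also supply mouse, key, and cursor callbacks) of creating a floating window and handing it over to VR when the user has a headset on:

```c
#include <string.h>
#include "XPLMDataAccess.h"
#include "XPLMDisplay.h"

static void dummy_draw(XPLMWindowID id, void * refcon) { /* placeholder draw callback */ }

static XPLMWindowID create_vr_aware_window(void)
{
    XPLMCreateWindow_t params;
    memset(&params, 0, sizeof(params));
    params.structSize = sizeof(params);
    params.left = 50;   params.bottom = 150;  /* placeholder 2-D bounds */
    params.right = 350; params.top = 300;
    params.visible = 1;
    params.drawWindowFunc = dummy_draw;
    params.decorateAsFloatingWindow = xplm_WindowDecorationRoundRectangle;
    params.layer = xplm_WindowLayerFloatingWindows;

    XPLMWindowID w = XPLMCreateWindowEx(&params);

    /* If the user is already in VR, attach the window to the headset;
       otherwise it stays a normal 2-D floating window. */
    XPLMDataRef vr_dref = XPLMFindDataRef("sim/graphics/VR/enabled");
    if (vr_dref && XPLMGetDatai(vr_dref))
        XPLMSetWindowPositioningMode(w, xplm_WindowVR, 0);
    return w;
}
```

Since the user can enter or leave VR mid-flight, you can poll the same dataref from a flight loop callback and switch the positioning mode between xplm_WindowVR and a 2-D mode accordingly.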

Limitations of VR windows

There are a few unsolved problems here—most of which are unsolved for both plugin windows and X-Plane’s built-in windows.

You can’t place windows in a specific location in VR

This is, admittedly, possible for X-Plane code, but we don’t use it, nor do we expose it to the plugin system, for one major reason: it’s really hard to reason about the visibility of a window in the 3-D cockpit!

Consider the case of the F-4, which has a small cockpit cluttered with lots of 3-D “stuff.” If you wanted to position a window programmatically within that cockpit, you’d have to contend with lots of possibilities:

  • the window is too large to fit anywhere in the cockpit (how do you choose which part of it will be cut off by the cockpit geometry?)
  • the window is too large to fit anywhere but 1 cm in front of the user’s face (how do you weigh the annoyance of this against the problem of having part of the window cut off?)
  • the window fits in the cockpit, but will be occluded by 3-D objects (how much occlusion is too much?)

These concerns are not just academic—without taking careful account of the cockpit geometry, it’s very easy to position a window in a way that it’s just not visible at all in some cockpits!

Thus, our current implementation represents the most straightforward solution we can come up with: by attaching to the HMD initially, we guarantee the window will be visible (given some amount of head movement). And, by allowing the user to position the window thereafter, we sidestep concerns about occlusion, intersection with 3-D objects, etc. Users can’t position a window such that they can never see it or interact with it again, and we leave it up to them to decide how much of the window really needs to be visible.

Of course, there are scenarios where you can absolutely reason about the best place for a window—e.g., if you’re shipping an aircraft-specific plugin. This is something we’re looking at supporting in the future, once our own internal format for handling positions gets locked down.

You can’t restore window positions from prefs

This follows straightforwardly from the fact that we don’t support programmatic positioning of windows, but it’s worth noting separately. The unfortunate consequence is that if 10 windows pop open at sim start, they will all be right in front of your face, and you’ll have to move them before getting on with your flight.

Eventually, we’d like to have an API for saving and restoring VR window positions on a per-aircraft basis—that is, your prefs for where a given window is located would be dependent on the aircraft you’re flying. In the C172, your prefs might have the window in the top left of the cockpit, while in the 737, it might be in the copilot’s seat.
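While that API doesn’t exist yet, the per-aircraft keying idea can be pictured with today’s SDK. A hypothetical sketch (the key format and the helper name prefs_key_for_window are invented for illustration; only XPLMGetNthAircraftModel is real SDK):

```c
#include <stdio.h>
#include "XPLMPlanes.h"

/* Hypothetical: build a per-aircraft prefs key for a window, so the saved
   position of "my_window" in the C172 is independent of the 737's. */
static void prefs_key_for_window(const char * window_name, char * out, size_t out_len)
{
    char acf_file[256];
    char acf_path[512];
    XPLMGetNthAircraftModel(XPLM_USER_AIRCRAFT, acf_file, acf_path);
    snprintf(out, out_len, "%s/%s", acf_file, window_name);
}
```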

You can’t lock the location of a window

On a 2-D monitor, we’re used to “flight overlays” that sit along some fixed edge of the screen and are always visible, waiting for interaction. There’s not a good VR equivalent for this, though—locking UI to, say, a fixed position relative to the HMD is kind of annoying as a user.

You can’t bind a window to a 3-D object

X-Plane’s “xPad” is an example of UI that’s integrated into the VR world. In essence, we have a 2-D window “stuck” to a 3-D tablet object, giving the illusion that you just have an interactive object. This isn’t possible in the current version of the SDK, but it may be someday.

About Tyler Young

Tyler is a software developer for X-Plane. Among other projects, he was in charge of the X-Plane 11 user interface and the massive multiplayer implementation.

34 comments on “VR Support for Plugins is Here!”

  1. This is really awesome. Can’t wait till HTC Vive releases the next-gen HMD, so I can get it and experience all this 🙂

    I’m always super curious as to what you guys will work on next. If it would be possible to give a small hint on what you are *planning* to work on next, that would be greatly appreciated. Emphasis on *planning* because, obviously, priorities can change at any time.

    Thank you!

    1. Ben’s said publicly that transitioning to next-gen graphics APIs (Vulkan/Metal) is the next major chunk of development scheduled for X-Plane 11. 🙂

      1. You guys are unstoppable! Also, I am very happy about the prioritisation of the GFX APIs.

      2. Awesome, thanks Tyler!!

        I hope the Vulkan API will help with squashing the cloud transparency bug 🙂

        Cheers!!

      3. Maybe only Vulkan is needed?

        On February 26, 2018, Khronos Group announced that the Vulkan API became available to all on macOS and iOS through the MoltenVK library, which enables Vulkan to run on top of Metal.[46] Previously MoltenVK was a proprietary and commercially licensed solution, but Valve made an arrangement with developer Brenwill Workshop Ltd to open source MoltenVK under the Apache 2.0 license and as a result the library will be available to all. Valve also announced that Dota 2 can as of 26 February 2018 run on macOS using the Vulkan API, which is based on MoltenVK.[47]

        1. Ben has already stated that it’s unlikely we use Vulkan on Mac. I might be misquoting him, but I believe he said Vulkan is a lower-level API than Metal. The reason to use Vulkan is to get direct access to the hardware, so if its implementation on Mac is going to go THROUGH Metal, it defeats the purpose of choosing the API…adding a layer of “unknown” in between which can lead to all kinds of ‘fun’.

          1. That’s right. While I understand how Khronos might have some desire to have Vulkan try to become a “universal” API since it already runs in a lot of places, Vulkan as an abstraction layer for other low-level APIs is totally weird – its whole purpose in life is to _not_ be an abstraction layer.

  2. I have a working VR window by modifying the Hello World (SDK 3) example. I can make it smaller and bigger and the text follows but have two issues.

    The red close button does not work. It will turn green when touched but the trigger will not close the window.

    Second, the vr_is_enabled check always tests false, even though I am clearly in VR mode. When I look at the dataref using DataRefTool it is set to true, so I wonder if the issue is that I am checking in XPluginStart and need to do a deferred initialization?

    One thing that would be nice is sample code along the lines of the ATC window, with check boxes and radio buttons, as a good starting point for developers.

    Thanks Bill

    1. The issue with the close button will be fixed in the upcoming beta. Re: the VR enabled test, you can wait for the first “scenery loaded” message (per the new VR Window Sample), but note that this, um, embarrassingly won’t work in VR6—it results in the window being created in the holodeck, not the sim itself. Again, a fix is coming in the next beta.
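The deferred-initialization pattern described above might look like this (a sketch using the real XPLM_MSG_SCENERY_LOADED message; the window-creation step is elided):

```c
#include "XPLMDataAccess.h"
#include "XPLMPlugin.h"

static XPLMDataRef s_vr_dref = NULL;

PLUGIN_API void XPluginReceiveMessage(XPLMPluginID from, int msg, void * param)
{
    /* By the time the first scenery-loaded message arrives, the sim is fully
       up and sim/graphics/VR/enabled reflects the user's actual state. */
    if (msg == XPLM_MSG_SCENERY_LOADED)
    {
        if (!s_vr_dref)
            s_vr_dref = XPLMFindDataRef("sim/graphics/VR/enabled");
        if (s_vr_dref && XPLMGetDatai(s_vr_dref))
        {
            /* Safe to create (or re-position) VR windows here. */
        }
    }
}
```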

      1. I got the VR Window Sample to build and tested it and did find it in the holodeck.

        I build for Windows using mingw-w64 on Linux, and after fixing a link issue it built fine.

        Thanks for the feedback and looking forward to the upcoming beta.

        Bill

      2. Making some good progress in testing how to get Xchecklist to show up in VR.

        So when will I be able to test in the cockpit and not in the holodeck?

        Will it be X-Plane 11.20vr7 or 11.20b1?

        Thanks Bill

        1. Hi,
          Since sparker had discovered this issue, I wanted to replicate it, but on my system the sample code produces a window in the sim itself (and not in the holodeck at all). So I sent my build to him, but when he runs it the error persists on his system. Could it have something to do with the fact that I’m on the Rift and he is on Vive/SteamVR? Moreover, on my system the “Toggle VR” button doesn’t work, and yields some strange behavior as documented in this (small) thread https://forums.x-plane.org/index.php?/forums/topic/143397-vr-support-for-plugins/. Do you need a bug report, or are these oddities already part of your workflow?

          1. @JrKok, if you’re set to bypass the main menu (per the Settings > General screen), I believe you should indeed see the window in the sim. The fix coming in the next release will make this happen for everyone, regardless of settings.

            Please do file a bug if the Toggle VR button isn’t working for you.

          2. @Tyler Young, Thank you very much for your answer ! I always start in the main menu (no bypass), I’ve tried whatever I could imagine to evoke the VR Window in the holodeck, no success. I’ve filed a bug for the toggle VR button.

          3. In further testing I have modified the VR sample so I can run it in both VR and 2d.

            As I am trying to make a checklist, I have check boxes that change state when clicked. In 2-D, as soon as I click on the box with the mouse, it changes state. In VR, it takes many clicks of the touch controller to do the same thing.

            I have filed a bug report with my source and built code.

            Bill

  3. A few days ago I came across this snippet regarding the release of Vulkan 1.1:

    (https://arstechnica.com/gadgets/2018/03/vulkan-1-1-adds-multi-gpu-directx-compatibility-as-khronos-looks-to-the-future/)

    “Vulkan 1.1 is also better for Virtual Reality applications. VR requires rendering two different perspectives of the same 3D scene, one for each eye. This is possible today through brute force—first, submit all the commands to draw the left eye to the GPU, then submit all the commands for the right eye. With Vulkan 1.1, developers can use multiview, where a single set of rendering commands produces multiple, slightly different outputs with a single call.”

    This sounds to me like a huge potential performance gain. Could you maybe comment on this?

    1. In our current OpenGL VR implementation, we absolutely _do not_ submit all of the commands to draw each eye serially. What we do is use hardware instancing to draw twice as many _copies_ of everything, with alternating copies going to the left and right eyes. The cost of this is:
      – Twice as much _vertex_ work on the GPU – we are building two complete sets of vertices, because your eyes don’t see things in the same positions.
      – As much shading work as the HMD takes – this is dominated by the headset res.
      – Almost no increase in CPU work compared to mono rendering, because we don’t have to do any CPU work twice – we just let the GPU know that we want 2x the copies.

      We do have to compute slightly more transform state info because we have to prep both eyes, but this is a small fraction of the CPU-side work of rendering.

      From what I can tell, the original Vulkan 1.0 spec is _completely_ capable of supporting this path, which requires:
      – Hardware instancing
      – Multiple viewports
      – Writing the viewport ID from the vertex shader.
      Vulkan 1.0 definitely has the first two, and it probably has the last one, I just couldn’t find a spec ref super fast.

      My understanding is that multiview is “better” than this technique in that it can share some of the vertex computing and temp storage (between the vertex and fragment shaders) between the two eyes. This strikes me as a marginal improvement over what can already be done with instancing in Vulkan 1.0, GL, and Metal 2.0. (Metal 2 really did make Metal VR-ready, because an instancing + v-shader-viewport path wasn’t available in Metal 1.0.)

      So there’s no magic here – GPUs for the last few generations have had adequate tech for VR.
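The index math behind that instancing trick can be sketched on its own (a simplified illustration, not X-Plane’s actual renderer, and the function names are invented): issue 2× the instances, and derive the eye and the source object from each instance ID exactly as the vertex shader would.

```c
/* With 2x instancing for stereo, instance i renders object (i / 2) into
   eye (i % 2): even instances go to the left eye, odd to the right. The
   CPU still issues one draw call per mesh; only the instance count doubles. */
static int eye_for_instance(int instance_id)    { return instance_id % 2; }
static int object_for_instance(int instance_id) { return instance_id / 2; }
```

This is why the CPU cost stays nearly unchanged while the GPU does twice the vertex work.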

  4. I am a Vive user and I have the F-35 payware model.

    I asked the developer about the HMD in that package in this thread on x-plane.org

    https://forums.x-plane.org/index.php?/forums/topic/143627-f-35-hmd-in-vr/

    But I’m not sure I understand. Does that mean I have to wait for Laminar to enable something? This plane is amazing and functions well in VR, except for the missing HMD info, so I’m just trying to understand. I will update to the 1.21 version of the plane as suggested and see what happens.

  5. Sorry, my first language is Jira…
    Problem: UI dialog boxes can be cut off by the scenery (I’m guessing)
    Problem: VR controller can be lost behind scenery, common if using real joysticks when the controller needs to be put down.

    Are you considering a draw mode where, after the main scene is drawn, you can then draw again ignoring the z-buffer, allowing you to put windows at any distance (yep, it can give bad 3-D, but better than losing the window) and drawing outlines of the controllers (a ghost mode, like the Oculus guardian system I’m-yet-to-see-cuz-of-the-bugs)?
    ….sorry, just thinking out loud! Definitely not solutionizing…..

  6. Hey guys,
    Now that you’ve switched over to the Oculus SDK, can we see virtual hands in the cockpit rather than those silly controllers? The Oculus has great native support for gripping and pointing that I think would be really useful in the cockpit. For example, I could push a button just like I would push a button in real life: simply extend the index finger, get near the button, and depress the switch. Some feedback when pushing the button would seal the deal. This feels much more intuitive and realistic than getting near a switch and squeezing a trigger (as is done now). Also, I can grab the xPad using the “make a fist” gesture with the Touch controller. Again, it feels much more intuitive than what is currently implemented. Hope all is well. And thanks for listening.

    Cheers.

    1. This is great when it works, but often pressing switches this way is hit and miss. I’d love to see a decent implementation of this though.

  7. I know this is off-topic, but I couldn’t find another location to pose this question.

    In the “vrconfig.txt” file there are several entries for each viewpoint. I am working on a vrconfig file for a 3rd party aircraft but so far have only been able to set the viewpoint “circle” (the blue one that shows up when you move the touch controller joystick to select your viewing position) but am not able to actually get it to select. I’m assuming that would be the “AABB” entry but there are a lot of parameters there and I’m not sure what they all do. Additionally, the last three entries are all set to “0.0” in most cases but I’ve come across one that has a different value for the first of these three entries.

    The bottom line – and the question – is: is this file documented somewhere? I’ve found some blog entries which show most, if not all, of the manipulator values but nothing on the viewpoint entries.

    If such documentation is not currently available, can it be made available, please?

    Thanks

    1. It’s partially documented but I haven’t finished it yet. It will be released during the beta test period. The AABB is simple. Min-X, Min-Y, Min-Z, Max-X, Max-Y, Max-Z. The Preset XYZ is the location of the pilot’s head and the Psi/The/Phi are the orientation of the head once the teleport to the hotspot has been completed.

      1. Perfect! That’s exactly what I needed to know. I was partially there by the time I read your reply, but this gives the details I was hoping for.

  8. A weird moire pattern appears when using normal maps with ATTR_draped .obj’s. It also affects X-Plane’s default .net roads, which use normal maps. However, it doesn’t affect .pol’s. The pattern is only visible when the camera is close to the draped .obj/.net road.

    Are you already aware of this, or should I file a bug report?

  9. Good job! I know you have a million things on your plate, but any guesses on when you might open this up for other VR/AR HMDs, possibly via OSVR, or even a “you’re on your own, don’t call us” guide to modifying your shaders? I’m doing stereo now with custom post-processing in XP 10 plugins, but would love to piggyback on your efforts.

  10. The VR support is really amazing! I’ve actually bought X-Plane because of its native support, so I’m quite new to this simulator and have never developed plugins for it so far.

    In another comment it was stated that it might be possible to add 3D objects with 2D windows in the future, exactly like the existing map tablet. I have another idea that might be simpler: it would be cool if plugins were allowed to add new menu entries to the existing tablet, for example to implement an iPad-like screen that allows the user to browse charts etc. Charts and routes are the only things that break my immersion so far, and I would love to develop something simple to just get PDFs or a browser into X-Plane’s VR through a plugin.
