****EDIT: There’s apparently an issue for people running the Vive and WMR where they’re seeing reduced resolution. We’re looking into it and will post an update as soon as possible.
****EDIT2: We’ve found the issue affecting Vive and WMR. We’re testing a fix internally and will release an update hopefully in the next 24 hours. Please do not submit any more bug reports about Vive/WMR resolution.
11.20 VR4 is Live on the servers. Aside from the usability fixes that Ben already mentioned, the major ‘feature’ in VR4 is…Oculus users will no longer need SteamVR. If you downloaded it just for X-Plane, go ahead and remove it. It will no longer be necessary.
As we said we would do from the very beginning, we investigated the relative performance of X-Plane through the native Oculus SDK versus SteamVR and what we found, through data collection, is that the overall experience for Oculus users was better by going through the Oculus SDK directly. I know many of you are thinking “Duh! I told you that a month ago ya big dummy!” and yes…yes you did. Fortunately/Unfortunately, we try not to make engineering decisions based on gut feelings and anecdotal evidence when we have a way to collect actual numbers. We wanted to tackle a majority of the usability issues affecting everyone before we looked into performance.
In the various A/B tests that we performed, we found that going to the Oculus SDK directly got us about 25% improvement in frame rate. This does not necessarily indicate that there’s anything wrong with SteamVR itself. There are several factors influencing the performance in VR. First, Oculus has their “home”, that little bachelor pad where you hang out while waiting for games to load. SteamVR has their “home” as well. When you use SteamVR, BOTH are running on your machine. Those houses are not free and X-Plane is already CPU bottlenecked so anything consuming CPU resources is going to directly affect performance. (I noticed an Autodesk updater in my task manager that was stealing 5% of my CPU consistently. That too was decreasing my performance….every bit matters!). Going directly to the Oculus SDK removes the SteamVR house from the equation.
Sure, getting a 25% improvement is a huge win, but that’s NOT the biggest win. The biggest win, in my opinion, is that Asynchronous Space Warp (ASW) works MUCH better, even at very low frame rates down to about 22.5fps. It appears as though the timing of the frames is critical for ASW to work properly. Being at 22.5, 30, 45, 90fps feels smooth! Being in between those frame timings seems to make ASW lose its mind, creating an annoying judder; the opposite of what ASW is supposed to be doing for us. Oculus seems to be V-Syncing us to hit those intervals, allowing their algorithms to make reliable timing decisions and predictions. It’s my suspicion that SteamVR was just not hitting those intervals, causing ASW to flip out.
TLDR; Performance for Oculus will be on par with what Vive users have been seeing all along. The smoothness of the rendering seems consistent even down to 22.5fps. If you’re a Vive user, you will still, of course, need SteamVR as that IS your native SDK. If you’re a WMR user, you will still need SteamVR. I have not seen any reprojection issues with WMR like we have with Oculus. Supposedly the upcoming versions of SteamVR have some performance improvements coming for WMR users as well so we’ll be sticking with SteamVR for all headsets other than Oculus. That can always change in the future…based on data.
We’re starting internal and private testing of VR preview 4 – if it goes well, we’ll release it early next week. A few notes on usability:
Have Your Mouse Cake and Eat It
We got a metric ton of “bug reports” that users couldn’t click “Disable VR” when using the 3-d mouse. I put bug reports in air quotes because disabling click zones on the main monitor when using the 3-d mouse was totally intentional! (In other words, I broke that button on purpose.) My thinking was that you might click on “Disable VR” by accident while in the headset because you don’t know what the 2-d mouse is hovering over while clicking.
In VR4, we have a solution to the problem of whether the 2-d or 3-d mouse is the “intended” cursor when a click happens: we read the headset’s “on the user’s head” sensor and disable the 3-d mouse (temporarily) when you take the headset off. So you can be clicking in 3-d, take off the headset and immediately click “Disable VR” and we figure it out. This feature is great for developers who need quick access to 2-d debug tools like the texture browser or DataRef editor.
Plugin developers: the title says it all – don’t register a drawing callback that doesn’t actually draw anything.
When I write it like that, it sounds a little bit silly, but some plugins register a drawing callback on startup, because they might draw something – but then only do the drawing when certain things happen. I am guilty of this myself – libxplanemp registers its drawing callbacks at init time even if there are no multiplayer planes to draw.
Most plugins did this for simplicity and to avoid any strange rules about when you could register things – the original 1.0 SDK was a lot pickier than what we have now. And back in 1537 when the French invaded Flanders, the cost of a no-op draw callback was pretty cheap – just the cost of the function call-out and a few instructions of bookkeeping.
Over time, as X-Plane’s renderer has become more complex, this has become not the case. X-Plane now does some significant work to save and restore GL state when we hit a drawing callback, and that overhead is going to become significantly more expensive when we move to Vulkan/Metal.
To optimize this, Tyler modified the drawing callback code in 11.10 – starting with 11.10, if we find that no plugin has registered a given drawing callback, we skip the entire process, including the OpenGL bookkeeping.
So if your plugin goes in and registers a callback for every single drawing callback type, then we have to fall back to the slow path, carefully stashing our GL state and prepping for your code to run at each drawing callback point.
Here’s my guidance:
- Never register for a drawing callback you’re not going to use.
- Do delay registering for a drawing callback until it is possible to need it.
- Don’t try to optimize your drawing callbacks by registering and de-registering per frame.
So as an example, it would be reasonable for a multiplayer API to register a drawing callback when the first airplane enters the system and then keep it around until the last airplane goes away, even if that airplane is off screen.
(The reason for that last rule is: if we have to make an expensive graphics transition once we discover that drawing callbacks will be needed, we don’t want to get thrashed by drawing callbacks disappearing and then re-appearing over and over.)
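The multiplayer example above can be sketched in plain C. The two `XPLM*_stub` functions below are stand-ins for the real `XPLMRegisterDrawCallback`/`XPLMUnregisterDrawCallback` calls in `XPLMDisplay.h` (stubbed so the sketch compiles on its own), and `plane_added`/`plane_removed` are hypothetical hooks your plugin would call from its own multiplayer bookkeeping:

```c
/* Lazy draw-callback registration: register on the FIRST plane,
 * unregister only when the LAST plane is gone. */

static int g_registered  = 0;  /* is our draw callback currently installed? */
static int g_plane_count = 0;  /* multiplayer planes currently in the system */

/* Stand-ins for the real XPLMDisplay.h calls. */
static void XPLMRegisterDrawCallback_stub(void)   { g_registered = 1; }
static void XPLMUnregisterDrawCallback_stub(void) { g_registered = 0; }

/* Register only when the first plane appears - never at plugin startup. */
static void plane_added(void)
{
    if (g_plane_count++ == 0)
        XPLMRegisterDrawCallback_stub();
}

/* Unregister only when the last plane is gone. A plane going off screen
 * does NOT unregister - per the third rule above, that would thrash
 * X-Plane's expensive graphics-state transitions. */
static void plane_removed(void)
{
    if (--g_plane_count == 0)
        XPLMUnregisterDrawCallback_stub();
}
```

The key property: the callback stays registered for the entire lifetime of “there is anything that might draw,” and registration changes happen on rare events (planes joining or leaving), never per frame.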
I don’t think there’s been a subject that has attracted as much attention on the developer blog as VR – 274 comments on the announcement and another 90 on the next VR topic. We’ve also received tons of bug reports, emails, and Chris has even sat down and watched endless video commentary on YouTube about our VR preview. (Just a reminder, this is a PREVIEW, not a Beta. In a Beta, we expect the features to be mostly complete, though perhaps with some bugs. In this preview, we’re still actively working on the features!)
A lot of the discussion has been about how you interact with the virtual world; Chris and I have spent literally hours discussing this since VR1. In this post, I want to share our thinking and what we’re working on for VR2.
We can view VR interaction in terms of four use cases: fully virtual, virtual with touch and hardware, no touch controllers, and HOTAS. Let’s look at them in detail.
The case that works best in VR1 is a “fully virtual” work-flow – take two touch controllers and sit in your room and you can operate the entire aircraft. There are some rough edges that we are looking at.
- Lots of users complained that the yoke movement in VR is always a joystick motion (rotation controls pitch and roll), even if the aircraft has a yoke you pull toward you to climb. Chris’s always-a-joystick choice was based on ergonomics: since it’s fully based on angle, you can move your hand to rest on something while flying without changing your pitch. This reduces the fatigue that would come with having to hover your hand over the exact position if you were using the yoke in a realistic way. With that said, we’re looking at providing both a “realistic” mode (where motion tracks the virtual yoke, matching our other manipulators) and an “ergonomic” mode, where the control is always a joystick and you can fly from any virtual hand position.
- There’s no way to input rudder right now if you don’t own physical pedals. Tyler is working on making the touch controllers configurable as joystick devices. With this, you can map rudder onto your joystick or D-pad, and also use some of those buttons for common functions. (This capability is useful for two-controller use cases; users with one controller and physical hardware will probably want to keep most of the VR controller buttons mapped to VR-specific functions, as some of these cannot be moved to a conventional joystick.)
Controllers and Hardware
We heard about this case in the private beta: users with touch controllers and physical hardware have a problem when the virtual control is located inside the physical control in the real world. This happens if you are flying the Cessna and have a CH Products Yoke, for example.
To solve this, we’re adding the ability to “touch” cockpit elements from a distance via the controller lasers. It’s a little harder to use than grabbing the controls directly (because small wrist motions are amplified by distance) but it lets you get at controls that would otherwise be blocked.
Laser grabbing also turns out to be really handy for airliners – the gear handle is a stretch from the captain’s chair in the 737; now you can grab it by laser.
Laser grabbing should work seamlessly with conventional direct manipulation of controls, so for users who are happy with the direct interaction in the default fleet, there should be no loss here, and no need to revert to a mouse because some of the controls are physically blocked.
Mouse Control to Major Tom
Some users don’t have touch controllers, just an HMD, a mouse and real flight hardware (e.g. a USB yoke). For these users, we’re working on 3-d mouse interaction with the sim. I am not sure how capable it will be, but if you used to fly with a monitor, stick, and mouse, now you can fly with an HMD, stick, and mouse; anything you would have done with the mouse before (UI clicking, 3-d cockpit interaction) stays on the mouse.
Some users fly with only physical hardware (e.g. a Warthog or some other HOTAS setup). This is the group of users that Chris and I are, frankly, kind of perplexed by. If you flew X-Plane HOTAS before VR, X-Plane is, right now, just as capable in VR; anything mapped to your USB hardware still works on your USB hardware, and if you have to go pick up a touch controller, you would have had to pick up the mouse anyway. Once we have mouse support, the mouse can be “the thing you get for UI” instead of the touch controller.
We are looking at HMD-mounted UI selection, e.g. look at the UI and click a button – Rift users may be familiar with this from the “look at this warning message for 2 seconds” notice at startup.
My personal opinion: having used half-written mouse code, touch controllers, and HMD-aimed selection, I thought HMD-aimed selection was not only the worst of the three ways to interact, but not in the same league as mouse or controllers in terms of precision. But it does provide a way (not possible right now without VR) to stay HOTAS for the entire use of X-Plane, which could be valuable.
I’m not sure how soon we’ll cut VR preview 2 – I wanted to get it done this week, but I think it’s going to be some time next week; we’re working on these issues now, but it’s hard to say when it’s “done” – we might try all of our “really good ideas” and realize they need more work. That’s how user-interface is…you have to try it to know whether the idea works.
(This post was edited to reflect a discussion Ben and I had)
A constant request for XPlane2Blender and the other exporters maintained by Laminar Research is an OBJ import feature. So, one last time, here is the news: at this point an importer is basically never going to be part of XPlane2Blender, but it could be its own project! It could take years of development to polish well, however.
There are a few projects started on the forums that you can check out if you’d like!
Why Is This Such A Hard Problem?
It is simple to read the vertex table, attributes, and animation keyframe table and turn them into Blender data; in fact, some projects can already do this. The trouble is getting back the exact, or nearly exact, .blend file.
OBJs Don’t Save Everything You Want And Need As An Artist
Some of the things you could never import from an OBJ:
- Window Settings
- Text Blocks with scripts, notes, or annotations
- Any Blender settings that XPlane2Blender doesn’t care about
- Object, layer, material, and texture names
- A bunch of settings that can’t easily be inferred (more on that later)
- Blender’s parent-child relationships
If the OBJ was produced with comments and correct indentation, you might be able to get some of these things back (likely just a few names). A large, complex .blend file without this stuff is an unorganized nightmare and would make the rest of the manual reverse engineering process even harder.
Multiple .blend files Can Produce The Same OBJ Content
If A.blend, B.blend, and C.blend can produce the same OBJ, which one should the OBJ be reverse engineered back into? The relationship between XPlane2Blender settings and what appears in the OBJ can be very esoteric, and there is not always a 1:1 relationship. Some ATTR_s only appear when certain combinations of settings are used. You may find there are absolutely no good defaults.
A Moderately Smart Importer Would Need To Know Massive Amounts of History Of All Exporters
In order to perfectly solve these ambiguities and produce .blend files similar to the one that originally exported the OBJ, it would be incredibly useful to know the behavior of the exporter that wrote the OBJ in question. This means an importer would need to know all the bugs and features of every exporter, and we don’t even know that after developing the exporter. Bugs are still waiting to be discovered, or are being relied on by artists until they have to be turned into features. Our exporter currently struggles to take its own past bugs into account, and that’s the exporter!
This creates its own ambiguities to solve: “Is this OBJ’s mention of deprecated ATTR_s a choice the artist made despite the deprecation warning, or was it a bug that this was still getting written, or was this OBJ written before the deprecation?” Now you’re dealing not only with whether the OBJ is valid, but with what it is supposed to look like.
Optimizations Create Further Challenges
Exporters often apply optimizations to improve an OBJ’s performance and quality. For instance:
- Removing duplicated vertices, keyframes, attributes
- Rounding or changing data on the fly
- Ignoring or appending ATTRs to handle deprecations or obsolete OBJ features
Now you will have even less information, or seemingly incorrect information! OBJs are meant for X-Plane, not humans. As such, exporters can take many liberties with the content of OBJs as long as the result matches what the artist meant. This can lead to very complex optimizations that might even break our own guidelines, all to deliver the best results (on time) to the consumer and our artists. It makes developing an importer that reproduces the original OBJ, either exactly or just visually matching (let alone animated or textured properly), even harder.
In a raw .blend file, objects are like separate Lego bricks; exporting bakes them into one solid piece. Going in reverse will likely not recover the same neat separation. Blender is not smart enough to tell what is a wing shape, what is a wheel shape, what is a throttle lever shape. It may be impossible to separate the vertices back into these distinct meshes (especially after optimizations)! Making a “really smart” importer that is shape aware is a brutally hard problem that the world’s best computer scientists are still attempting to solve. It may not be a challenge you want to take on.
We May Build A Reasonable Importer One Day!
With all these challenges, an importer would have to be willing to skip some edge cases and not attempt to reverse engineer an OBJ back to the exact .blend file that made it. For instance, a 3-year-old OBJ would not be reverse engineered into the .blend file that produced it 3 years ago, especially since an artist would want to update their assets anyway! Having art assets “marooned” on other exporters, or simply “dead”, is a pretty terrible waste, likely more painful than hand-fixing a lot of little things. As Ben pointed out, “If we can’t make a direct import [.skp->.blend, .ac->.blend] path, OBJ import is the least bad option.”
2.49 to 2.7x Converter
A 2.49 converter is a much more manageable project (though far on the back burner). It is simpler because the 2.49 exporter is no longer in active development and you are converting .blend to .blend, not .obj to .blend. Blender to Blender is something Blender is already very good at.
For airplane authors, the biggest question VR brings up is: what do I have to do to my aircraft to make this work? This is the first of three posts explaining not only how manipulation works with VR, but why we made our decisions and how we got here. Bear with me – this will end with specific recommendations for what to do with your aircraft.
A Long Time Ago…
Old-timers like me will remember that when the 3-d cockpit first came out, almost all of the “brains” of it were powered by clicking on the 2-d panel, which was “mapped” to the 3-d cockpit. X-Plane would find out where your mouse click intersected the 3-d cockpit, find the texture location on that triangle, and map that back to the 2-d panel.
This seemed pretty cool when it came out 15 years ago, but as authors started making more sophisticated 3-d cockpits, it became clear that we needed a way to specify how the mouse worked directly with the 3-d cockpit shape, because the mapping back to the 2-d wouldn’t allow the user to do basic things like dragging a throttle.
Thus in X-Plane 9 we added the first set of manipulators – tags on the 3-d mesh of a cockpit specifying what happens when the user clicks on it. The manipulator could specify a dataref to modify, a command to actuate, and some data about what kind of mouse click operation was happening.
The key words here are mouse click. The original manipulators were designed entirely in terms of a mouse on a 2-d screen; they were designed to be exactly as powerful as a 2-d panel system, which was of course also totally mouse-centric. The relationship between the generic instruments and the original manipulators is pretty tight.
Moving to Mechanisms
The good part of the original manipulator system was that it was very flexible – mouse-centric handlers were a low level tool around which authors could do whatever they wanted.
The down-side of this design was that mouse-centric handlers were a low level tool around which authors could do whatever they wanted. We examined our own default fleet of a dozen or so aircraft and found that no two aircraft’s cockpits operated the same way, and almost all of them had at least some aspect that was hard to use – a poor manipulator choice for the job.
Knobs were, in particular, quite difficult – the original 2-d panel system used click-and-hold over a knob to actuate it, but authors working in 3-d had often used drag actions to do knobs, and the drag actions would respond differently depending on camera angle. We received approximately 8,452,124 bug reports that the knobs in the 747 were hard to use specifically because of this.
So during X-Plane 10, we added some new manipulators to X-Plane, and they had a very different philosophy: the manipulator described what was happening in the simulated aircraft, and X-Plane picked a mouse behavior to make that work. The new manipulators described knobs that turned and switches that either flipped left-right or up-down. These manipulators reacted to the scroll wheel automatically because X-Plane knows what the knob is and therefore what a reasonable scroll-wheel interaction should be. (By comparison, with the old-style manipulators, the author has to specify what a reasonable scroll-wheel action is.)
Real Interaction Things Before It Was Cool
As it turns out, mechanism-centric manipulators are a lot better for VR than mouse-centric ones. Consider two cases:
- The 2-d axis manipulator (ATTR_manip_drag_xy) specifies what happens when the mouse moves up-down and left-right, relative to the screen. What on earth does that mean in VR? Even if we made something that tracks up-down and left-right relative to the screens of the HMD, the results would be completely weird because the “thing” in the cockpit would not move the way your hand was moving. With a mouse, a mis-match between mouse and aircraft isn’t surprising, but in VR, it’s weird to reach out and touch something and have it behave the wrong way.
- The knob manipulator. It describes a knob that does things when turned clockwise or counter-clockwise. This is pretty easy to deal with – you click it with your controller and turn your wrist clock-wise or counter-clockwise and it turns. Because we know what the mechanism is (and not just what the mouse should do) we can make a good decision about how to handle this manipulator in VR.
As it turns out, we ran into this problem of not knowing what to do with the mouse, and needing to know what the mechanism was, before we even started looking at VR. The exact same problem (“I want to touch the 3-d cockpit as if it were a thing and have it respond in the expected way”) exists in VR and on X-Plane Mobile!
Because X-Plane mobile runs on a touch screen and you have your physical finger on a switch, there are a lot of cases where the switch has to track really well and work the right way. If the switch goes up and down, you want to flick your finger up to turn the switch on; if it goes left-right you want to flick left and right, and anything else is astonishing.
So X-Plane mobile set us on this path toward mechanism-based manipulators, and VR further drove us in that direction, since both have the same physical, non-mouse environment where a user directly interacts with the 3-d cockpit.
Guidance For Authors
So as an author, what should you do when working on a 3-d cockpit? My answer is: use some of the manipulators, but not others, to model the way things work in 3-d.
Use these manipulators ALWAYS
These manipulators are the best way to create certain physical mechanisms – use them every time this case applies to your aircraft.
ATTR_manip_drag_axis. Use this for a physical mechanism that slides along a linear path, like the throttle handle of a Cessna.
ATTR_manip_drag_rotate. Use this for a physical mechanism that rotates around a fixed point, like the throttle handles of a 737, or a trim wheel. (Prefer this to the old technique of putting a drag axis along the edge of the throttle handles.)
ATTR_manip_noop. Use this to make a manipulator that blocks the mechanism behind it from being touched, e.g. the safety guard plate over a switch.
ATTR_manip_command. Use this for push buttons that actuate momentarily while a button is held down, like a button to motor a starter that you push in and hold down.
ATTR_manip_command_knob, ATTR_manip_command_switch_up_down, ATTR_manip_command_switch_left_right. Use these for knobs and switches with 3 or more positions that rotate or move. Pick the right manipulator for the mechanism!
ATTR_manip_axis_knob, ATTR_manip_axis_switch_up_down, ATTR_manip_axis_switch_left_right. These provide an alternative to the above manipulators when you must use a dataref. We recommend commands over datarefs any time you have a choice; the command system provides better interoperability between custom plugins, custom aircraft, and custom hardware.
ATTR_manip_command_switch_left_right2, ATTR_manip_command_switch_up_down2, ATTR_manip_command_knob2. Use these for knobs and switches with exactly two positions. You can use these to get a simple click-to-toggle with the mouse (quicker and easier for mouse users) while getting real 3-d movement in VR.
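To make the “always” list concrete, here is a hypothetical fragment of exporter output showing a few of these manipulators attached to cockpit meshes. In practice XPlane2Blender writes these lines for you; the cursor names, commands, dataref, and TRIS ranges below are illustrative only, and the exact argument order for each ATTR_manip_ directive is defined by the OBJ8 specification:

```text
# Throttle handle: slides along a linear path (drag axis along -Z)
ATTR_manip_drag_axis    hand  0 0 -0.1  0 1  sim/cockpit2/engine/actuators/throttle_ratio_all  Throttle
TRIS 0 36
# Starter button: momentary command, held while the mouse/controller is down
ATTR_manip_command      hand  sim/starters/engage_starter_1  Engage starter 1
TRIS 36 12
# Multi-position knob: one command per direction of rotation
ATTR_manip_command_knob hand  sim/radios/obs1_up  sim/radios/obs1_down  OBS knob
TRIS 48 24
```

Note that each manipulator attribute applies to the TRIS that follow it, which is why picking the manipulator per mechanism (rather than per mouse gesture) maps so naturally onto the mesh.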
Use These Manipulators Sometimes
These manipulators are not perfect fits with physical motions and may require tweaking for VR, but they’re the best options we have now for certain problems. Try to use something from the always list instead if you can. Do not use these manipulators if the mechanism you are simulating matches something from the list above.
ATTR_manip_drag_xy. This is the only 2-axis drag we have, and is the best choice for eyeball vents and the yoke. The yoke is special-cased in VR and should be based on a 2-d xy manipulator. We are looking into more powerful multi-dimensional manipulation in VR, but this work won’t be complete for 11.20.
ATTR_manip_command_axis. This manipulator runs a command once based on moving a lever down or up. You should probably use a drag axis or command up-down switch, but there may be some odd mechanisms in aircraft where this manipulator is a good fit. It should not be your first-choice go-to manipulator.
ATTR_manip_push, ATTR_manip_radio, ATTR_manip_toggle. These manipulators can serve as alternatives for push-button style controls when you absolutely have to write to a dataref and not use a command. WARNING: Do not ever use these manipulators for things that move, e.g. don’t use these for banana switches, spinning knobs, or things like that.
Do not use these manipulators
These manipulators were for mouse-centric movement and should be replaced with knob or switch manipulators.
ATTR_manip_delta, ATTR_manip_wrap. These were designed to simulate knobs by click-and-hold – use the knob manipulators instead.
ATTR_manip_drag_axis_pix. This was designed to simulate a knob by click-and-drag – use the knob manipulators instead.
We Love Commands
As you may have noticed from this list and the ever-growing size of X-Plane’s built-in command list, we love commands – they have become our go-to mechanism for our aircraft. Commands have a few advantages over data-refs:
- There is a clear design in the plugin system for changing a command’s behavior. So a third party aircraft can easily “take over” the existing engine start command. This is very, very difficult to accomplish via datarefs.
- Commands can be mapped directly to joystick and keyboard hardware. There is no way to bind a joystick to a dataref.*
- Commands can be mapped to sound to capture the act of the user “doing” the thing, regardless of whether it works. For example, if you want the click of the starter switch regardless of whether the starter motor engages (maybe the battery is dead), commands make this easy to do. Often there is no dataref for ‘the user is trying but it’s not working’.
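The first advantage, “taking over” a command, can be sketched in plain C. In the real SDK you would look the command up with `XPLMFindCommand` and install your handler with `XPLMRegisterCommandHandler` (whose callback also receives the command ref); here the dispatcher is a stand-in so the sketch compiles on its own, and all the counter names are hypothetical:

```c
#include <stddef.h>

typedef enum { xplm_CommandBegin, xplm_CommandContinue, xplm_CommandEnd } XPLMCommandPhase;

static int g_default_ran   = 0;  /* X-Plane's built-in starter behavior */
static int g_custom_ran    = 0;  /* our replacement behavior            */
static int g_clicks_played = 0;  /* switch click: the user DID the thing */

/* A "before" handler: returning 0 consumes the command, so X-Plane's
 * default behavior is skipped; returning 1 lets it run as usual. */
static int StarterHandler(XPLMCommandPhase phase, void *refcon)
{
    (void)refcon;
    if (phase == xplm_CommandBegin) {
        g_clicks_played++;   /* play the click even if the battery is dead */
        g_custom_ran = 1;    /* our own engine-start logic goes here       */
    }
    return 0;                /* consumed: default starter never engages    */
}

/* Stand-in for X-Plane dispatching a command press to "before" handlers. */
static void dispatch(XPLMCommandPhase phase)
{
    if (StarterHandler(phase, NULL) != 0)
        g_default_ran = 1;   /* only runs if the handler passes it through */
}
```

This is exactly the interoperability point above: the aircraft replaces the behavior, while hardware bindings and sounds keyed to the command keep working, something that is very hard to arrange when everyone writes to datarefs directly.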
In X-Plane 11 we’ve been moving to commands to get our planes ready for VR, the new manipulators, and FMOD sound all at once.
In the next post I’ll cover some of the issues specific to VR and the 3-d cockpit.
* Developers who write plugins that provide external interfaces to X-Plane: make sure you have a way to provide access to commands and not just datarefs! There are things you can do only with commands and not datarefs.
X-Plane 11’s native VR support uses SteamVR, so let me clear this up now:
You will be able to use VR directly in X-Plane 11 with any distribution of our sim:
- An X-Plane 11 digital download from Laminar Research (or any other reseller) if you have a 24-digit product key.
- An X-Plane 11 DVD set from Laminar Research (or any other reseller) if you have DVDs.
- X-Plane 11 on Steam if you bought X-Plane on Steam.
VR will work for X-Plane 11 no matter how you bought it. You do not have to have bought X-Plane from Steam to use VR.
(This post does not contain any guidance on supported hardware – I’d like to wait until we get further into beta and get some feedback on how system requirements affect actual users. This is just a reassurance that you didn’t buy X-Plane “the wrong way” for VR.)
What Is SteamVR
SteamVR is a free download from Steam’s app store that X-Plane (and many other games) use to talk to your VR hardware. We use SteamVR for both the HTC Vive and the Oculus Rift – X-Plane’s native VR support requires that you have both the Oculus drivers and SteamVR.
X-Plane 11.11 is now final – you’ll be notified to auto-update. 11.11 is a small bug fix release that fixes a few key issues we didn’t find out about in time to get into 11.10. Here are the release notes.
Our VR private beta program is underway – we’ve been cutting VR private betas in parallel to 11.11 testing; I’ll post more on VR over the next few days.
- Bone animation bugs, where multiple nested bones or meshes attached to Armature datablocks would not export animations correctly
- #317 Custom lights will now never autocorrect due to old hidden data being considered
- #313 Custom light properties work again
UPDATE: After receiving around 70 requests, we are closing applications for private beta testing at this time.
X-Plane’s native VR is nearly ready for beta. Possibly as soon as next week, we will be moving into private beta testing of VR-capable builds. We are looking for a small number of volunteers to participate in the initial private beta.