X-Plane 910 was an update to X-Plane 9 for our professional customers. But all of the new features that they got in 910, everyone got in 920. Here’s how it happened:
X-Plane 9 had a very long beta, and the end of that beta was mostly spent with a finished sim and me trying to fix the pixel shaders for five thousand flavors of video card, driver, and operating system. During this time, Austin started work on new systems modeling features for professional level sims. We branched the code, with my work going into 900 and his going into 910.
When he finished his systems code and I got the pixel shaders fixed, the two were combined into what became 920.
So that’s how we “lost” the 910 version number – some professional customers have the version number, but everyone got the features.
There are two ways to make 3-d instruments in your 3-d cockpit:
- Create 2-d instruments on a panel and use the “panel texture” (ATTR_cockpit or ATTR_cockpit_region in your OBJ) to show those 2-d instruments in the 3-d cockpit.
- Model the instruments in 3-d using animation.
So…which gives better framerate? Well, it turns out that they are actually almost the same…a few details:
- If your card can’t directly render-to-texture, there is an extra step for the panel texture. But that would be a weird case – all modern cards can render directly to textures unless you have hosed drivers.
- For very small amounts of geometry, there’s pretty much no difference between rotating a needle using the CPU and telling the GPU to do it by changing the coordinate system (see the sketch after this list).
- The panel texture does put pressure on VRAM – if you’ve had to go to a 2048×2048 panel texture to have enough space, it’s going to hurt you.
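To make that needle comparison concrete, here is a rough sketch of the two approaches in old-school fixed-function OpenGL. The function names and data layout are purely illustrative; this is not how X-Plane or any exporter actually structures the code.

    #include <cmath>
    #include <GL/gl.h>   // header name varies by platform (e.g. OpenGL/gl.h on the Mac)

    // Option 1: rotate the needle's vertices on the CPU, then hand them to the driver.
    void draw_needle_cpu(float angle_deg, const float (*verts)[2], int count)
    {
        float a = angle_deg * 3.14159265f / 180.0f;
        glBegin(GL_TRIANGLES);
        for (int i = 0; i < count; ++i)
        {
            float x = verts[i][0] * std::cos(a) - verts[i][1] * std::sin(a);
            float y = verts[i][0] * std::sin(a) + verts[i][1] * std::cos(a);
            glVertex2f(x, y);
        }
        glEnd();
    }

    // Option 2: leave the vertices alone and have the GPU rotate the coordinate system.
    void draw_needle_gpu(float angle_deg, const float (*verts)[2], int count)
    {
        glPushMatrix();
        glRotatef(angle_deg, 0.0f, 0.0f, 1.0f);
        glBegin(GL_TRIANGLES);
        for (int i = 0; i < count; ++i)
            glVertex2f(verts[i][0], verts[i][1]);
        glEnd();
        glPopMatrix();
    }

For a needle’s worth of vertices, the trigonometry on the CPU and the matrix change on the GPU cost about the same.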
Both approaches are actually quite inefficient – you get the best vertex throughput on the card when you have at least 100 vertices per batch. But if a panel has 800 batches, you don’t necessarily want 100 vertices in each one – you’d pick up 80,000 vertices just trying to “utilize” the graphics card. That’s not a huge number, but it’s big enough to consider. Panels have enough moving parts that they’re going to push the CPU more than the GPU.
A number of authors like the 3-d approach because they are more comfortable with 3-d tools, and because it can look sharper (since there is no intermediate limiting texture resolution).
There is only one case where I would advise against the 3-d approach: if it takes a huge number of animation commands to accomplish what can be done with one generic instrument, use the panel texture; the generic instruments are all coded cleanly, and none of them take that much CPU power. But some of them produce effects that would be relatively difficult to reproduce with animation.
The X-Plane export plugin for AC3D doesn’t handle panel textures very well. The current plugin tries to identify cases where you have used your panel background as a texture – this cues it to generate ATTR_cockpit. This scheme has a number of problems:
1. The search paths for the panel background are not up-to-date. The plugin doesn’t know about the new naming conventions or the cockpit_3d folder.
2. The scheme doesn’t properly address panel regions – there is some support for them, but it doesn’t work well.
3. Most important: panel editing is not WYSIWYG. Since you are using the panel background as your texture, you can’t actually see where all the moving parts are! Doh!
That last point is perhaps the most important one, and it is why, for the next version of the plugin, I am introducing panel previews.
Basically a panel preview is a screenshot of your panel with the instruments on it, sitting in your cockpit_3d/-PANELS- folder. AC3D will recognize and use the panel preview when possible. This will solve problems (1) and (3) – there will be only one naming convention for previews, and they will be screenshots of the panel in action, so you can texture with a preview of the instruments.
Plane-Maker 930 will contain a facility to generate panel previews; if you are using X-Plane 920, you can generate the preview manually by taking a screenshot in X-Plane.
For panel regions, we will have one preview file for each region (e.g. Panel_preview0.png, panel_preview1.png). This addresses issue 2 – usage of the region previews will invoke ATTR_cockpit_region.
Finally, I am moving the panel sub-region information from the preferences to the .ac file (hopefully) so that it will be saved with your plane.
Hopefully this will make for a much simpler workflow. To make a 3-d cockpit, you will simply pick “generate previews” in Plane-Maker and then start using the previews as textures.
I really like what he’s done with the new page – see it for yourself here!
In a previous post I ranted about the hidden cost of adding new third-party-accessible features to X-Plane – that cost being the cost of supporting the APIs way into the future so that third party content doesn’t break.
Even trickier are accidental contracts – that is, unintentional behaviors that the sim exposes that third parties take advantage of. This can be particularly tricky for us because we might not even realize that the behaviors are happening.
In 921 we allowed the HUD to be visible in the 3-d cockpit for planes that use our “default” 3-d cockpit, like the 777. Ignoring that this was a truly silly feature to do, one of the side effects of the code change was that EFIS glass instruments now render correctly over transparent parts of the panel texture on some hardware. Javier immediately jumped on this to build a 3-d HUD.
The appearance of EFIS glass instruments in that region was totally unintentional, and this is about the last way I wanted to implement 3-d HUDs. But now that it’s in there, I have to ask: do I want to break Javier’s airplane, which a lot of users will like? Probably not!
Another accidental contract is the draw order of the cockpit objects. Basically, due to a coincidence of how the code is structured, the external cockpit object is drawn before attached objects, but the internal cockpit object is drawn after. There is no good reason for this, but now that authors have built airplanes based on this behavior, we have to preserve it, because changing draw order can break translucent geometry.
X-Plane is full of this kind of thing – and all of these hidden conventions make it tricky to restructure subsections of the code.
Sometimes you have to break a few eggs to make an omelette. Or at least, you have to consider whether breaking them is acceptable. Often I hit cases where the cost of supporting a legacy feature is somewhat painful.
One way to decide what to do is to change the feature early in beta, see who squawks, and then change it back if necessary. There are two such cases I am looking at for 930.
Glass Instruments
It turns out that glass instruments fade to black, not to transparency. This is a little bit weird, because that means they will leave black footprints if they are on top of a non-black background. My guess is that most people use them on black screens and thus did not notice.
If people really need fade-to-black glass instruments, I’ll just create a new lighting type (glass-transparent), but if everyone can live with fading to transparent, it’s certainly the more useful case and probably what most people always wanted.
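To illustrate the difference (this is just a sketch of the idea, not X-Plane’s code), the two fade behaviors look roughly like this per pixel:

    // A minimal sketch of why fade-to-black leaves "footprints" on a non-black background.
    // fade = 1.0 means fully lit, fade = 0.0 means fully faded out.
    struct RGBA { float r, g, b, a; };

    RGBA fade_to_black(RGBA c, float fade)        // current behavior: the color dims, coverage stays
    {
        return { c.r * fade, c.g * fade, c.b * fade, c.a };
    }

    RGBA fade_to_transparent(RGBA c, float fade)  // proposed behavior: the coverage dims instead
    {
        return { c.r, c.g, c.b, c.a * fade };
    }

With fade-to-black, a fully faded instrument still covers whatever is behind it with opaque black pixels; with fade-to-transparent, the background shows through.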
Separate Specular Hilights
For as long as I’ve been involved, X-Plane’s specular hilights have been modulated by the object or airplane texture color. In other words, if you paint your airplane red, you get red hilights, and if you paint it black, you get no hilights at all.
This is not a very good way to do things for a few reasons:
- Under this scheme, you can’t make a shiny black object.
- Someday we will add gloss maps – but the glossy part of the gloss map will be defeated by the black texture.
So for 930, I am looking at not modulating specular hilights by texture. (This is called “separate specular hilights” in OpenGL lingo.) My guess is that they will look enough better in almost every case that people would rather have it this way.
Should specular hilights be white for a black object? Yes! A specular hilight is a simulated intense reflection from a very far away, very bright object (the sun). So it should take its color from the sun, not the object itself. To this end, I have also (finally) set the specular hilights to take on the daylight sun color, so that they get fainter and yellowish at dusk. This makes dusk and dawn look a little bit less strange.
(Nerd note: Technically, for the day texture to be an albedo texture, it shouldn’t affect specular hilights.)
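In rough per-channel terms, the two approaches look like this (an illustrative sketch only; in fixed-function OpenGL the equivalent switch is glLightModeli(GL_LIGHT_MODEL_COLOR_CONTROL, GL_SEPARATE_SPECULAR_COLOR)):

    // Modulated specular: the texture multiplies everything, so a black texture
    // kills the hilight along with the diffuse lighting.
    float modulated_specular(float tex, float diffuse, float specular)
    {
        return tex * (diffuse + specular);
    }

    // Separate specular: the hilight is added after the texture, so a shiny black
    // object can still show a white (sun-colored) hilight.
    float separate_specular(float tex, float diffuse, float specular)
    {
        return tex * diffuse + specular;
    }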
When you make an airplane, you customize some of the images and sounds by simply putting a wav or png file in your aircraft package, with the same name as X-Plane’s default. This is “file name” substitution.
What is good about file name substitution is that it is very, very simple.
But file name substitution has some limitations:
- If you need to provide multiple versions of a file, there is no way to do this. You can at most replace one file with one other file.
- There is a risk of “file name collision” – so file name substitution is only appropriate when we can be sure that a folder is only used for one simple purpose.
- The file name cannot easily encode a lot of information about how the file is to be used. We use _LIT to indicate an emissive texture, but imagine trying to encode every aspect of a .ter file in a file name (all of the projection parameters, physics parameters, texture clamping and alpha management, paged texture loading). You’d end up with a file name like my_tex_na_42.23E_72.32W_conc_pd_5000_4000_nw_LIT.png. You can’t tell me that that’s an easy file to work with!
The scenery system hits all three of these limitations, and it deals with them in two ways:
- Texture files are almost always referenced from a text file. The text file provides a place to put all of the important parameters about the texture.
- The library system maps art assets to virtual file paths, avoiding collisions and allowing multiple files to be mapped to one virtual path (sketched below).
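As a toy illustration of that second idea (the paths and code here are made up for the example; this is not how X-Plane implements its library), a virtual path can map to more than one real file:

    #include <iostream>
    #include <map>
    #include <string>

    int main()
    {
        // virtual path -> real file on disk; a multimap allows several real files per virtual path
        std::multimap<std::string, std::string> library;
        library.insert({"lib/example/vehicles/pushback.obj", "My Package/objects/pushback_red.obj"});
        library.insert({"lib/example/vehicles/pushback.obj", "Other Package/art/pushback_blue.obj"});

        // scenery only ever references the virtual path, so real file names never collide
        auto range = library.equal_range("lib/example/vehicles/pushback.obj");
        for (auto it = range.first; it != range.second; ++it)
            std::cout << it->first << " -> " << it->second << "\n";
    }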
Cristianno wrote this awesome tutorial, which shows how the library system works.
That’s Mr. Bedo to you!
Sometimes I end up learning the name for a computer graphics idea or technique long after I use it. So I was amused the other day to find out that the fancy computer-graphics name for the “daytime” textures in X-Plane (you know, the normal ones for OBJs and airplanes) is albedo textures – or rather, they define the albedo component of the lighting equation.
This is handy because I had a nice big fancy word for _LIT textures: emissive. So now, armed with both technical terms, we can accurately describe the two parts of lighting that X-Plane usually specifies by texture: albedo and emissive.
Basically, the albedo texture tells you what color you see when light shines on the object (or rather, how much of that light is reflected back to the viewer diffusely) – it represents color information that does not generate its own light. The emissive texture describes light created by the object, and thus visible under any circumstances.
X-Plane’s lighting equation is thus typically:
albedo * brightness at that point + emissive
Where “brightness at that point” is the sum of all of the lighting effects from any number of lights, the sun, ambient light, etc.
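In code form (just a per-channel sketch of the equation above, not X-Plane’s shader):

    // albedo and emissive come from the textures; brightness is the sum of sun,
    // ambient, and any other lights hitting the point.
    float shade(float albedo, float brightness, float emissive)
    {
        float lit = albedo * brightness + emissive;
        return lit;   // the monitor effectively clamps this to the 0..1 (black..white) range
    }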
Now there is something interesting about this lighting equation: the emissive component is added to the “modulated” albedo component. So what happens if:
- Albedo is 100% (that is, white)
- The brightness of the sun is 100% bright (really bright day) and
- Emissive is non-zero?
Answer: the total lighting is more than 100%!
100% of what? These lighting values are described in terms of the range of color your monitor can output, from black (0%) to white (100%). So if we have a lighting value of 120%, basically it shows up as white (100%) and the “extra” white is lost. The result is a loss of color accuracy and detail.
For 930 I have a to-do item to scale down all emissive light by a constant factor when the day time overall brightness is high.
The idea is this: in the real world, your eyes have non-linear, adjustable sensitivity to light (and the sun is really, really bright). So when the sun is out, the amount of light added by a neon sign is trivial compared to the light already on the sign from the sun. At night the sign’s light is much more significant because it is relatively dark (thus the sign is in a more sensitive part of your vision and your eyes are adjusted).
In X-Plane, scaling down the emissive texture during the day will simulate its lesser effect during the day.
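One way the scaling could work (a sketch of the idea only; the numbers and the function are made up, not what will ship in 930):

    // day_brightness: 0.0 at night, 1.0 in full sun
    float emissive_scale(float day_brightness)
    {
        const float night_scale = 1.0f;   // full LIT effect in the dark
        const float day_scale   = 0.5f;   // reduced LIT effect in bright daylight
        return night_scale + (day_scale - night_scale) * day_brightness;
    }
    // final = albedo * brightness + emissive * emissive_scale(day_brightness)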
One more note on emissive vs. albedo texturing: ATTR_emission_rgb basically sends a certain portion of the day time (albedo) texture to the emissive part of X-Plane’s lighting. But the emissive (LIT) texture is still used. So if you use ATTR_emission_rgb, don’t both set the emission level to full (1.0 1.0 1.0) and use a very bright _LIT texture; the result will be more than 100% brightness.
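In other words (again just a per-channel sketch, with made-up names, of the behavior described above):

    // The emissive term ends up being the LIT texel plus the emission_rgb-scaled albedo texel.
    float total_emissive(float albedo_texel, float lit_texel, float emission_rgb)
    {
        return lit_texel + emission_rgb * albedo_texel;
    }
    // e.g. emission_rgb = 1.0, a bright LIT texel of 0.8, and an albedo texel of 0.9
    // give 0.8 + 1.0 * 0.9 = 1.7, far past full white, so detail is lost.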
I’m pretty gun-shy about posting new features to this blog before they are released. One reason is that a fair number of the things I code never make it into the final X-Plane because they just don’t perform as expected. But the converse of that is: there should be no problem posting about what failed.
One idea that I believe now will not make it into the sim is dual-core pipelined rendering. Let me clarify what I mean by that.
As I have blogged before, object throughput is one of the hardest things to improve in X-Plane. That code has been tuned over and over, and it’s getting to be like squeezing water from a rock. That’s where dual-core pipelined rendering comes in. The idea is pretty simple. Normally, the way X-Plane draws the frame is this:
    for (Object * obj : scene)       // names here are illustrative, not the real code
        if (is_on_screen(obj))       // culling, accelerated by a quadtree
            driver_draw(obj);        // "hey, go draw this OBJ"
Now, the decision about whether objects are on screen (culling) is actually heavily optimized with a quadtree, so it’s not that expensive. But still, when we look at the loop, one core is spending all of its time both (1) deciding what is visible and (2) telling the video driver to go draw the object.
So the idea of the pipelined render is to have one core decide what’s on screen and then send that to another core that talks to the video driver. Sort of a bucket-brigade for visible objects. The idea would be that instead of each frame taking the sum of the time to cull and draw, each frame should take whichever one is longer, and that’s it.
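For the curious, here is a toy sketch of that bucket brigade: one thread culls and feeds a queue, and a second thread drains the queue and talks to the driver. This is emphatically not X-Plane’s code – the types and names are made up, and all of the real difficulty (and overhead) is in making something like this safe and fast.

    #include <condition_variable>
    #include <mutex>
    #include <queue>
    #include <thread>
    #include <vector>

    struct Obj { int id; };

    // Hypothetical stand-ins for the real culling test and driver submission:
    bool is_on_screen(const Obj &) { return true; }
    void driver_draw(const Obj &)  { /* submit the OBJ's batches to the driver */ }

    std::queue<const Obj *>  visible;        // the bucket brigade of culled objects
    std::mutex               mut;
    std::condition_variable  cv;
    bool                     culling_done = false;

    void cull_thread(const std::vector<Obj> & scene)
    {
        for (const Obj & o : scene)
            if (is_on_screen(o))
            {
                { std::lock_guard<std::mutex> lock(mut); visible.push(&o); }
                cv.notify_one();
            }
        { std::lock_guard<std::mutex> lock(mut); culling_done = true; }
        cv.notify_one();
    }

    void draw_thread()
    {
        while (true)
        {
            std::unique_lock<std::mutex> lock(mut);
            cv.wait(lock, [] { return !visible.empty() || culling_done; });
            if (visible.empty()) return;     // culling is finished and the queue is drained
            const Obj * o = visible.front();
            visible.pop();
            lock.unlock();
            driver_draw(*o);                 // only this thread ever talks to the driver
        }
    }

    int main()
    {
        std::vector<Obj> scene(1000);
        std::thread culler([&] { cull_thread(scene); });
        draw_thread();                       // the thread that owns the GL context draws
        culler.join();
    }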
The problem is: the idea doesn’t actually work very well. First, the math above is wrong: the time it takes to run is the time of the longer process plus the waiting time. If you are at the end of a bucket brigade putting out the fire, you waste time waiting until that first bucket goes down the line. In practice the real problem though is that on the kinds of machines that are powerful enough to be limited only by object count, the culling phase is really fast. If it takes 1 ms to cull and 19 ms to draw, and we wait for 0.5 ms, the savings of this scheme is only 2.5%.
Now 2.5% is better than nothing, but there’s another problem: this scheme assumes that we have two cores with nothing to do but draw. This is true sometimes, but if you have a dual-core machine and you just flew over a DSF boundary, or there are heavy forests, or a lot of complex airports, or you have paged-texture orthophoto scenery, then that second core really isn’t free some of the time, and at least some frames will pick up an extra delay: the delay waiting for the second core to finish the last thing it was doing (e.g. building one taxiway, or one forest stand) and be ready to help render.
And we lose due to one more problem: the actual cost of rendering goes up because of the overhead of having to make it work on two cores. Nothing gums up tight, fast, inlined code like making it thread-safe.
So in the end I suspect that this idea won’t ever make it into the sim…the combination of little benefit, interference by normal multi-core processing, and slow-down to the code in all cases means it just doesn’t quite perform the way we hoped.
I am still trying to use multiple cores as much as possible. But I believe that the extra cores are better spent preparing scenery than trying to help with that main render. (For example, having more cores to recompute the forest meshes more frequently lowers the total forest load on the first CPU, indirectly improving fps.)
My family visited DC this weekend and we went out to Udvar-Hazy, the extension of the Smithsonian aerospace museum out near Dulles international airport. My dad took this picture.

That is one of the two radar antennas (and the telescoping arm) used to scan the earth as part of the Shuttle Radar Topography Mission (SRTM). The SRTM is basically the first really good quality most-of-the-earth elevation dataset, and it is the main (but not only) source of elevation data for the X-Plane global scenery.
The telescoping mast shown in the picture (horizontal) extends one of the two radar antennas away from the shuttle when in orbit; had they not been able to retract the antenna they would have had to detach it and leave it in space. Fortunately the mechanism worked properly, so they were able to bring the antenna back for posterity.