We posted the system requirements for X-Plane 11 today. Here are a few notes on them.
This should be a surprise to no one: X-Plane 11 will be 64-bit only. Add-ons have already gone 64-bit only, over 90% of our user base is already running a 64-bit operating system, and we need 64-bit addressing to utilize the RAM that the sim needs and everyone already has.
Windows: No More XP or Vista
For Windows, we are dropping XP and Vista support and requiring Windows 7 or newer. XP reached end of life at Microsoft a while ago and is therefore not safe to use (it no longer receives security updates).
OS X: Yosemite and Newer
For OS X, we are dropping a number of OS X versions and requiring Yosemite (10.10) or higher. Apple has increased the tempo of OS releases in the last few years, and they don’t provide new drivers for old operating systems, so we are pre-emptively trimming the set of supported operating systems to reduce the number of different 3-d drivers we have to test.
Linux: Proprietary Drivers Required
On Linux, we will continue to support only the proprietary 3-d drivers from AMD and NVidia; these drivers use the same OpenGL stack, so they let us support Linux without the cost of additional 3-d driver testing. We don’t officially support the Mesa/Gallium stack for Intel GPUs, but X-Plane Linux users have done a bunch of work to make this unofficially work, and we do our best to not undo their work.
We’re setting the minimum graphics card at the AMD HD 5000-series for the red team and the GeForce 400-series for the green team. This ensures that we only support cards with reasonably current drivers, DX11-class capabilities, etc. For Intel, you’ll need at least an HD 2000 series or newer; figuring out your Intel motherboard graphics is really tricky because their numbering scheme is crazy, but if you don’t have at least some kind of “HD” graphics, you definitely can’t run X-Plane 11.
We recommend a newer graphics card, e.g. one from the DX12-class generations or newer. When it comes to graphics, basically more is more, so whether you need a Titan or Fury or a similarly monstrous card depends on things like how big your monitor is.
CPU requirements are the messiest part of the spec and the source of most of our internal discussion. Simply put, there isn’t a good way for us to state which CPUs will work well with X-Plane. X-Plane itself has a huge range of CPU use depending on configuration, and CPUs have a huge range of actual performance that can be hard to predict from the simple headline numbers. Clock rate alone is not indicative of performance, nor is core count.
A recommended system is pretty simple: we recommend the Intel i5 6600K, the current top-speed gamer-targeted i5. You can go lower or older and lose significant performance, or you can go faster and start to pay a lot more money. If you want to invest in 8 Xeon cores, it may help… but we aren’t going to tell you to spend that kind of money for a little more performance.
Here are my practical recommendations for X-Plane 11:
- If your machine is just barely getting by with X-Plane 10 at the lowest settings, and those hardware requirements seem high because your machine was built several years ago, you may need to upgrade for X-Plane 11. In this case, it could be a good time to upgrade OS and multiple components.
- If your machine runs X-Plane 10 well, it will almost certainly run X-Plane 11 in some form, with the exception of the very oldest graphics cards. (If you have one of those, I would say your definition for ‘run well’ is a lot lower than mine is.)
- If you need to purchase new hardware, I strongly recommend running X-Plane 11 on your existing hardware first and examining performance of the demo (when available) to see where you’ll need to upgrade.
Real-world performance varies hugely with what you are doing and your particular system components, so trying the demo will tell you more than we can hope to figure out from specs.
We’ve seen a few reports (and Philipp has experienced first hand) that X-Plane will crash on startup with the very latest Catalyst drivers that came out last week. The crash appears to only happen if you have a machine with a discrete AMD GPU (e.g. a 7970M) and a built-in GPU on your CPU for low-power use.
We are investigating this with AMD now; in the meantime please use the previous Catalyst driver if you are seeing crashes.
If you have a desktop machine, don’t have an AMD card, or are on Mac or Linux, this does not affect you.
TL;DR: 10.36 works around the latest NVidia driver – let X-Plane auto-update and everything will just work the way it should.
X-Plane 10.36 is out now – it’s a quick patch of X-Plane 10.35 that works around what I believe to be a driver bug* in the new NVidia GeForce 352.86 drivers.
10.36 has been posted using our regular update process and has been pushed to Steam, so if you’re running 10.35, you’ll be prompted to update. The update is very small – about 10-12 MB of download.
With this patch, you can now run the latest NVidia drivers. I have no idea if those drivers are good (I have anecdotal reports that they’re both better and worse than the last drivers, but these kinds of reports often have a large ‘noise’ factor**).
We patched X-Plane because we can cut an X-Plane patch faster than NVidia can re-issue the driver, and the driver issue was causing X-Plane to not start at all for affected users, which was turning into a customer support mess. Past NVidia-specific patches have been to fix bugs in X-Plane, but in this case, we’re simply steering around a pot-hole. I hope that NVidia will get their driver fixed relatively soon so that people installing from DVDs with older versions of the sim won’t be stuck.
Update: NVidia fixed this bug in the new 353.06 drivers!
[OpenGL, Windows 8.1 -x86/x64]: GLSL shader compile error. 
* The bug is that #defines defined within a function body don’t macro-substitute, while #defines outside a function body do. The work-around is to move some #defines out of function bodies. If anyone can find a reason why #defines can’t legally appear inside function bodies, please shout at me – but it’s a pre-processor; it shouldn’t know or care about function bodies.
** We’ve had reports of huge fps improvements and losses on beta updates where we’ve made only cosmetic changes to the sim.
I’ll post about X-Plane 10.40 next week – but just a quick note: apparently the Rift will ship Windows-only first:
Our development for OS X and Linux has been paused in order to focus on delivering a high quality consumer-level VR experience at launch across hardware, software, and content on Windows. We want to get back to development for OS X and Linux but we don’t have a timeline.
That’s from a much longer blog post describing the Rift’s hardware requirements and the difficulty of moving that many pixels at that high of a framerate with ultra-low latency. (The tl;dr version is that if you have a Windows laptop you probably won’t be able to run the Rift on it.)
I don’t know what the failure mode will be for not meeting the requirements, e.g. will the Rift simply not run, or will it do the best it can with degraded performance?
To state the obvious, if there isn’t an Oculus Rift SDK and driver for OS X or Linux, then X-Plane can’t support those operating systems.
When it turns out that a bug that we thought was in an OpenGL driver is actually in X-Plane, I try to make a point of blogging it publicly; it’s really, really easy for app developers to blame bugs and weird behavior on the driver writers, who in turn aren’t in a position to respond. The driver writers bust their nuts to develop drivers quickly for the latest hardware that are simultaneously really fast and don’t crash. That is not an easy task, and it’s not fair for us app developers to blame them for our own bugs.
So with that in mind, what did I screw up this time? This time it was a bug in the framerate test that caused me to mis-diagnose the performance of hardware instancing in NVidia drivers on OS X. Thanks to Rob-ART Morgan of Barefeats for catching this – Rob-ART uses the X-Plane framerate test as one of his standard tests of new Macs.
Here’s the TL;DR version: hardware instancing is actually a win on modern NVidia cards on OS X (GeForce 4nn and newer); I will update X-Plane to use hardware instancing on this hardware in our next patch. What follows are the gory (and perhaps tediously boring) details.
What Is Hardware Instancing
Hardware instancing is the ability to tell the graphics card to draw a lot of copies of one object with a single instruction. (We are asking the GPU to draw many “instances” of one object.) Hardware instancing lets X-Plane draw more objects with lower CPU use. X-Plane’s rendering engine will use hardware instancing for simple scenery objects* when available; this is what makes possible the huge amounts of buildings, houses, street signs, and other 3-d detail in X-Plane 10. X-Plane has supported hardware instancing since version 10.0.
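As a rough sketch of why this lowers CPU use, here is a toy cost model (the overhead figure is invented, and this is not real OpenGL; in real code the single call would be something like glDrawElementsInstanced):

```python
# Toy CPU-cost model: every driver call carries a fixed overhead, so
# drawing N copies one call at a time costs N times that overhead,
# while instancing draws all N copies with a single call.
# (Overhead figure is hypothetical; this is not real OpenGL code.)

CALL_OVERHEAD_US = 5.0  # made-up per-draw-call CPU cost in microseconds

def cpu_cost_one_call_per_object(num_instances):
    # One glDrawElements-style call per scenery object.
    return num_instances * CALL_OVERHEAD_US

def cpu_cost_instanced(num_instances):
    # One glDrawElementsInstanced-style call for all copies.
    return 1 * CALL_OVERHEAD_US

# 75,000 autogen objects, the kind of count v10 can push on screen:
print(cpu_cost_one_call_per_object(75_000))  # 375000.0
print(cpu_cost_instanced(75_000))            # 5.0
```

The per-instance data (transforms, etc.) still has to reach the GPU, but it goes in one buffer rather than one API call per object.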
The bug is pretty subtle: when we run the framerate test, we do not set the world level of detail explicitly; instead it gets set by X-Plane’s code to set up default rendering settings for a new machine. This “default code” looks at the machine’s hardware capabilities to pick settings.
The problem is: when you disable hardware instancing (via the command line, explicit code in X-Plane, or by using really old hardware) X-Plane puts your hardware into a lower “bucket” and picks lower world level of detail settings.
Thus when you disable hardware instancing, the framerate test is running on lower settings, and produces higher framerate numbers! This makes it look like turning off instancing is actually an improvement in fps, when actually it’s just doing better at an easier test. On my RetinaBook Pro (650M GPU) I get just over 20 fps with instancing disabled vs 17.5 fps with instancing enabled. But the 20 fps is due to the lower world LOD setting that X-Plane picks. If I correctly set the world LOD to “very high” and disable instancing, I get 16.75 fps. Instancing was actually a win after all.
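The shape of the bug can be sketched like this (function names are hypothetical and the fps numbers are taken loosely from my RetinaBook Pro measurements; the real default-settings code is more involved):

```python
# Sketch of the settings-bucket bug. Disabling instancing drops the
# machine into a lower hardware "bucket", which picks a lower world
# level of detail -- and the fps test never set the LOD explicitly.

def pick_default_world_lod(instancing_available):
    # Hypothetical stand-in for X-Plane's default-settings code.
    return "very high" if instancing_available else "high"

def run_fps_test(instancing_available, world_lod=None):
    if world_lod is None:
        # The fps test inherited whatever default was picked above.
        world_lod = pick_default_world_lod(instancing_available)
    # Rough numbers from the 650M measurements in the post:
    measured_fps = {
        ("high", False): 20.0,
        ("very high", False): 16.75,
        ("very high", True): 17.5,
    }
    return measured_fps[(world_lod, instancing_available)]

# Apples to oranges: disabling instancing *looks* like a win...
assert run_fps_test(False) > run_fps_test(True)
# ...but at the same "very high" LOD, instancing actually wins:
assert run_fps_test(True, "very high") > run_fps_test(False, "very high")
```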
Was Instancing Always a Win?
No. The origin of this mess was the GeForce 8800, where instancing was being emulated by Apple’s OpenGL layer. If instancing is going to be software emulated, we might as well not use it; our own work-around when instancing is not available is as fast as Apple’s emulation and has the option to cull off-screen objects, making it even faster. So I wrote some code to detect a GeForce 8800-type GPU and ignore Apple’s OpenGL instancing emulation. Hence the message “Disabling instancing for DX10 NV hw – it is software emulated.”
I believe the limitations of the 8800 are shared with the subsequent 9nnn cards, the 1nn, 2nn and 3nn, ending in the 330M on OS X.
The Fermi cards and later (4nn and later) are fundamentally different and can hardware instance at full power. At the time they first became available (as after-market cards for the Mac Pro) it appeared that there was a significant penalty for instancing with the Fermi cards as well. Part of this was no doubt due to the framerate test bug, but part may also have been a real driver issue. I went back and tried to re-analyze this case (and I revisited my original bug report to Apple), but X-Plane itself has also changed quite a bit since then, so it’s hard to tell how much of what I saw was a real driver problem and how much was the fps test.
Since the 480 first became available on the Mac, NVidia has made significant improvements to their OS X drivers; one thing is clear: as of OS X 10.9.5 instancing is a win and any penalty is an artifact of the fps test.
What About Yosemite?
I don’t know what the effect of instancing is on Yosemite; I wanted to re-examine this bug before I updated my laptop to Yosemite. That will be my next step and will give me a chance to look at a lot of the weird Yosemite behavior users are reporting.
What Do I Need To Do?
You do not need to make any changes on your own machine. If you have an NVidia Mac, you’ll get a small (e.g. < 5%) improvement in fps in the next minor patch when we re-enable instancing.
* In order to draw an object with hardware instancing, it needs to avoid a bunch of object features: no animation, no attributes, etc. Basically the object has to be simple enough to send to the GPU in a single instruction. Our artists specifically worked to make sure that most of the autogen objects were instancing-friendly.
A few weeks ago AMD posted a beta driver for newer Radeon HD cards that fixed some drawing artifacts with X-Plane. They have now released an official, non-beta WHQL driver that works with X-Plane. So if you have a newer AMD card on Windows, I suggest updating to the new Catalyst 14.4 driver.
An update on the state of drivers:
- AMD’s latest Catalyst Beta (Catalyst 14.2 V1.3) fixes the translucency artifacts in HDR mode. This driver also supports the newest cards and has correct brightness levels in HDR mode, so if you’re an AMD user on Windows, this is the driver to use. This change hasn’t made it to the Linux AMD proprietary driver, but it probably will soon.
- I have received reports of faint red lines on the latest NVidia Windows WHQL drivers (334.89) under cloudy conditions, but neither Chris nor I have been able to reproduce them. If you can reproduce this, please file an X-Plane bug. (I have not reported this to NVidia because I can’t reproduce it.)
- Some users have reported crashes with Intel HD 4000 GPUs on Windows; getting the latest drivers from Intel seems to fix the issue. I don’t have good info on what versions work/don’t work but it appears that plenty of machines have shipped with old drivers for their motherboard graphics. I believe X-Plane does run correctly with the Intel HD4000 series GPUs on Windows as long as the right driver is installed.
- OS X 10.9.2 is out, and I think it may have new drivers (the NVidia .kext files changed versions), but I don’t see any change in framerate for either NV or AMD cards.
Update: NVidia has been able to reproduce the red lines bug – we’re still working out the details of what’s going on, but it’s a known issue now. Thanks to everyone who reported it.
If you find a driver bug on Windows or Linux, please report it to us (via our bug reporter) and to NVidia, Intel, or AMD. I try to bring known bugs directly to the driver teams, but having the bug in their public bug reports is good too – it makes it clear that real users are seeing real problems with a shipping product.
ppjoy users on Windows have been experiencing a crash on startup; this was a bug in X-Plane 10.10/10.11, induced by particular virtual HID devices that only ppjoy could make. I found the problem and it will be fixed in 10.20.
In the meantime, if you need to use ppjoy and want to work around the problem, set your hat switches to discrete directions, not analog. (X-Plane can’t use an analog hat switch anyway; most people have this setting because it is a ppjoy default.)
As a side rant to ppjoy users: I was a bit horrified with the process of installing ppjoy. ppjoy is an unsigned driver so I had to turn off driver signing in Windows. ppjoy is also, as far as I can tell, not hosted anywhere official. So I had to install an unsigned driver off of a file locker onto my Windows machine with the safeties off.
To be clear, I do not think that this is the author’s fault. He is making freeware, and the only thing that would remedy these problems is money. I do not and cannot expect him to give up not only his time (to code) but also pay to solve the distribution problems of official hosting and buying a signing certificate.
Still, the process of taking off all of the safeties to put random third party binary software on my Windows box was unnerving and not something I would ever do as an end-user.
As far as I know, the ppjoy crash and the PS3 controller crash are the only two known regression bugs* with joystick hardware, and they’ll both be fixed in 10.20. (Linux users, needing to edit udev rules to use hardware is not something that we consider to be a bug – see this post.)
When will 10.20 go final? Real soon now. Plugin authors, if you aren’t already running on 10.20 betas, you should have been doing that weeks ago.
* Regression bug means: it used to work in 10.05 and stopped working in 10.10 when we rewrote the joystick code.
Please read the requirements below twice – this is a very particular setup I need. To run this test you must have all of the following:
- A functioning hackintosh or a Mac Pro running the latest Mountain Lion release (OS X 10.8.x), and
- An NVidia 560, 570, 580, 660, 670, or 680 desktop GPU in that machine, and
- The machine set up to also run Windows Vista or Windows 7 in Boot Camp, and
- The 10.20 beta, or the willingness to run it (at least as a demo), and
- The ability to run command-line fps tests if given instructions.
Please do not reply to this if you do not meet all of the hardware requirements; I am looking for someone who can do a clean apples-to-apples comparison of NVidia driver performance between Mac and Windows.
Please do not comment on the post if you do not meet these requirements; I am going to clip the comments pretty tightly on this one.
EDIT: thanks, but a user already sent me the test data I needed!
I’ve seen a few bug reports complaining about ‘flicker’ with HDR enabled. It took me a few tries to understand that the users were not actually reporting Z-thrash (which is what I think of when someone says ‘flicker’), but were actually reporting temporal aliasing of anisotropic meshes like roofs and roads.
Ants are alienating an icy tropical metro what now?!?
Sorry, graphics programmers have lots of big words for things that aren’t actually that complicated. (Seriously, we call a daytime texture an “albedo”. Who is Mr. Bedo anyway??) But basically the issue is this:
- We have something that appears long and thin on the screen, like the roof of a far off building (wide, but not tall, in pixels) or a road (again, wide, but not tall – a road might be 20 pixels wide but only 1 pixel tall on screen). Anisotropic just means different lengths in different dimensions, more or less.
- The road or roof will be rendered in a stair-step manner, as the graphics card tries to approximate a diagonal with pixels.
- As the camera moves, which pixels form the stair-step will change every frame, causing parts of the road or roof to flicker into and out of existence on a per frame basis.
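The three steps above can be modeled in a few lines (the road size and drift rate are made up, chosen purely to show the effect):

```python
# Toy flicker model: a road 0.4 px tall drifts 0.3 px per frame as the
# camera moves. The pixel is drawn only when its centre (y = 0.5) falls
# inside the road, so the road pops in and out of existence frame to
# frame. (All numbers are invented, for illustration only.)

def road_visible(road_top, road_height=0.4, pixel_center=0.5):
    return road_top <= pixel_center < road_top + road_height

# Wrap at 1.0 so the road stays within our single model pixel's period:
frames = [road_visible((0.3 * f) % 1.0) for f in range(5)]
print(frames)  # [False, True, False, False, True] - flicker
```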
Going for a Swim
In the old days, this effect used to be called ‘swimming’. A diagonal line would appear to ‘swim’ as the stair-step pattern changed as the camera changed. The swimming was annoying, but if you had a lame graphics card, you could live with it.
The problem is that in X-Plane 10, a lot of the meshes we draw are a lot smaller. As we build more 3-d detail and improve the engine to draw more detail on screen, the result is a lot of really small things. In X-Plane 9 we could draw 5-10k objects; now we can draw over 75k objects. That means that individual objects on screen may be 1/10th of their size (since there are more of them).
So instead of having big objects with big triangles that ‘swim’, we have tiny triangles that flicker out of existence entirely.
One reason I haven’t blogged about this before is because there are a ton of different full-screen anti-aliasing technologies out there and the prospect of explaining them was daunting. Fortunately Matt Pettineo did an awesome job with this post. Go read it; I’ll wait here.
The main idea is that full-screen anti-aliasing draws everything bigger and then down-sizes it to get softer edges. Diagonals don’t look stair-stepped, and a tiny roof won’t flicker into and out of existence because, relative to the larger size at which it was drawn, the roof is no longer tiny. In other words, 4x MSAA makes everything 4x less tiny from the perspective of a triangle being ‘too small to not flicker’.
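Here is the same idea in a toy model (sizes and drift rate are made up; the modulo stands in for the road crossing into the neighbouring pixel row):

```python
# Why supersampling tames flicker: a road 0.5 px tall drifts 0.3 px per
# frame. At 1x, a single sample at the pixel centre either hits or
# misses, so the road pops in and out. With two sample rows per pixel
# (a 4x-SSAA-style vertical doubling) averaged down, the road always
# contributes partial brightness. (Invented numbers, illustration only.)

ROAD_HEIGHT = 0.5

def hit(road_top, y):
    # Positions wrap mod 1.0: the road crossing into the next pixel row
    # is modeled as wrapping around within this one.
    return (y - road_top) % 1.0 < ROAD_HEIGHT

def brightness_1x(road_top):            # one sample at the pixel centre
    return 1.0 if hit(road_top, 0.5) else 0.0

def brightness_ssaa(road_top):          # two sample rows, averaged down
    samples = [hit(road_top, 0.25), hit(road_top, 0.75)]
    return sum(samples) / len(samples)

tops = [(0.3 * f) % 1.0 for f in range(5)]
print([brightness_1x(t) for t in tops])    # [0.0, 1.0, 0.0, 0.0, 1.0]
print([brightness_ssaa(t) for t in tops])  # [0.5, 0.5, 0.5, 0.5, 0.5]
```

The 1x image flashes between fully lit and fully dark; the supersampled one holds a steady half-brightness, which is exactly the softer, stable result you see on screen.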
The second reason why I am getting bug reports about flicker (besides a larger number of smaller triangles) in v10 is that HDR mode doesn’t use conventional MSAA. For various technical reasons, MSAA works poorly with a deferred renderer, and HDR is a deferred renderer. So like many games today, X-Plane’s problem is to anti-alias without letting the hardware do it. If you’re used to 16x MSAA from your graphics card, HDR with no FSAA is a rude surprise.
Current Option Number One: FXAA
FXAA is an anti-aliasing shader written by Timothy Lottes at NVidia. FXAA is typical of a whole category of anti-aliasing solutions in that it is a post-processing solution – that is, it takes an aliased, jagged image and attempts to smooth out the image after the fact. (MLAA is also in this category.)
FXAA has a few things going for it that are good:
- It’s very fast. The cost of enabling FXAA is very low compared to other anti-aliasing algorithms.
- It doesn’t consume any extra memory or VRAM.
- It produces smooth diagonal lines, more so than lower levels of FSAA.
It does, however, have one major down-side: because it doesn’t actually draw the scene at a higher resolution, any mesh that is so small that it is flickering is still small, and thus it will still flicker. On any given frame, the roof will have no jagged edges, but the roof may simply not exist in some frames. If the roof isn’t drawn at all, FXAA can’t fix it in a post-process.
So FXAA is fast and cheap and makes still images look nice, but it can’t deal with temporal artifacts, that is, flicker between frames.
Current Option Number Two: SSAA 4X
4x SSAA simply means we draw the entire world at double the resolution in each dimension, and then down-size it later. Jagged edges become blurred in the down-size and thus aliasing is reduced. (Nerd note: when technical papers talk about OGSSAA, they mean ordered grid super-sampled anti-aliasing, which just means the image is bigger. 🙂)
The up-side to SSAA is that it reduces flicker. Because the drawn image is bigger, very small elements won’t flicker (since they are bigger when drawn).
The down-side is the cost: 4x SSAA is the same as doubling your screen res in both dimensions. And if you’ve experimented with monitor resolutions, you know that once you are GPU bound, doubling the resolution in both dimensions uses 4x the VRAM and cuts your framerate to a quarter of what it was.
So the big problem with 4x SSAA is cost. Since we’ve improved HDR performance in 10.10r3 I’ve seen more users reporting the use of 4x SSAA. But it’s not cheap.
Newer, Better Options
I have two new tricks for HDR FSAA that I’m hoping to roll into 10.20. (They’re new to X-Plane; I am sure someone else has coded these things in other games before.)
First: FXAA and SSAA can be used at the same time in the engine, for better quality than either can provide on their own. SSAA does better at fixing temporal artifacts like flicker (because it makes things ‘4x bigger’ relative to the size at which they flicker) but FXAA does better at making diagonals ‘less jagged’. Now you can have both. (10.20 features a newer version of FXAA.)
Second: I realized that our aliasing is anisotropic (there’s that word again), meaning it’s not the same in both directions. X-Plane’s worst aliasing comes from long, thin, horizontal screen elements like roads and rooftops. Therefore having more anti-aliasing vertically than horizontally is a win.
So rather than just offering SSAA 4x (twice the resolution both horizontally and vertically), we can now do 2x (vertical only) and 8x (2x horizontal, 4x vertical). This provides a range of options: 2x SSAA will be affordable to some users who can’t run 4x SSAA at decent framerates, while 8x SSAA should provide anti-flicker for urban scenes similar to non-HDR with 16x MSAA, for those who have a big new graphics card.
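The modes above boil down to per-axis scale factors, with cost growing as their product (the mode table is from the text; the helper function and resolution are just for illustration):

```python
# SSAA modes as (horizontal, vertical) scale factors. The cost of a
# mode scales with the product of the two, i.e. the total pixel count.
SSAA_MODES = {
    "2x": (1, 2),  # vertical only - cheapest anti-flicker option
    "4x": (2, 2),  # classic ordered-grid supersampling
    "8x": (2, 4),  # 2x horizontal, 4x vertical - extra vertical AA
}

def supersampled_pixels(mode, width, height):
    h, v = SSAA_MODES[mode]
    return (width * h) * (height * v)

base = 1920 * 1080  # an illustrative monitor resolution
for mode in ("2x", "4x", "8x"):
    print(mode, supersampled_pixels(mode, 1920, 1080) // base)  # 2, 4, 8
```

Note that the extra samples go preferentially into the vertical axis, since that is where the thin horizontal roads and rooftops alias worst.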
I posted a set of technical test pictures here.
What about TXAA?
NVidia has announced a new anti-aliasing technology, called TXAA. At this point there isn’t enough technical information published about it for me to comment meaningfully on it. My understanding is that TXAA is not yet available for OpenGL based games.
I can say that in the future we will continue to try to adopt the best anti-aliasing technology we can find, and the problem X-Plane faces (anti-aliasing for a deferred renderer) is not at all unique to X-Plane, so it is likely that there will be good solutions forthcoming. Far smarter minds are working on this problem at ATI and NVidia as we speak.