As X-Plane 10 gets closer to shipping, we’re getting a better idea of what its performance characteristics are going to be.  This will be the first of several posts on hardware and X-Plane 10.  Spoiler alert: the buying advice I can give is at best fairly vague and obvious (don’t buy an old piece of junk DirectX 7.1 graphics card for $29) – once the sim ships and people start posting real performance comparisons, I suspect community data will be a much richer source of information for planning a new system (or a system upgrade). Please don’t ask whether your particular graphics card will work well with X-Plane 10.

Last week I had a chance to bring X-Plane 10 to ATI and run the sim on a few different modern GPUs.  It’s the Mac team that is nearby, but ATI’s Mac drivers are of high enough quality that platform isn’t really an issue.

We ran X-Plane on a current-generation Mac Pro with a single Radeon HD 5870, driving a 2560 x 1440 display.  The 5870 is a very solid card – to do better you have to spend a lot of money.

Here’s what we found.  Bear in mind that the art asset set is only partially complete, and the shaders are still being debugged and optimized, so performance could still change in either direction.

Geometry: How Much Stuff Can You See?

First: geometry and 3-d.

  • I could turn up almost everything that creates geometry.  To keep fps smooth I had to turn down something one notch – either world LOD, visibility, or autogen density – but only one notch.  This is a big step forward from X-Plane 9.  (But also remember this is a current generation CPU.)
  • Shadows took a big bite out of the geometry budget – basically the sim has to draw geometry again to build the shadows, amplifying the cost of your objects and trees.  Current generation hardware simply doesn’t have enough kick to max shadows out and max all of the 3-d out.  Not yet.*
  • PCIe x16 matters – we saw a significant reduction in performance on the newer x8 PCIe Thunderbolt Mac laptops.  You definitely want PCIe x16!

Geometry in X-Plane 10 is a little bit different from X-Plane 9 – it can be bottlenecked not only by your CPU, but also by PCIe bus bandwidth.  Because CPU use is more efficient in version 10, we sometimes get stuck when there isn’t enough bandwidth to transfer geometry from the CPU to the GPU.**
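
To make the bus cost concrete, here is a minimal OpenGL sketch – purely illustrative, not X-Plane’s code – of the two ways vertex data can reach the GPU.  A static buffer crosses the PCIe bus once; a streamed buffer re-crosses it every frame, which is exactly where an x8 link starts to hurt.

    // Illustrative only; assumes an existing OpenGL context (via GLEW here).
    #include <GL/glew.h>
    #include <vector>

    GLuint make_static_vbo(const std::vector<float>& verts) {
        GLuint vbo;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        // GL_STATIC_DRAW hints that the driver may park the data in VRAM;
        // after this one-time upload, drawing costs no bus bandwidth.
        glBufferData(GL_ARRAY_BUFFER, verts.size() * sizeof(float),
                     verts.data(), GL_STATIC_DRAW);
        return vbo;
    }

    void stream_verts_every_frame(GLuint vbo, const std::vector<float>& verts) {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        // GL_STREAM_DRAW: the data is respecified each frame, so the whole
        // mesh is pushed across the PCIe bus again and again -- this path
        // can bottleneck on bus bandwidth even when the CPU is keeping up.
        glBufferData(GL_ARRAY_BUFFER, verts.size() * sizeof(float),
                     verts.data(), GL_STREAM_DRAW);
    }

The more of each frame’s geometry that has to take the second path, the more the bus matters.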

(Please note that the fast path for geometry in X-Plane 10 requires OBJs to be stripped of most attributes; if you load a version 9 DSF and put objects on insane, you’ll get high CPU use and low fps because the objects are going to hit the slow code path.  The autogen in X-Plane 10 is being completely redone, so it should all hit the fast path.)
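
For the curious: hardware instancing is the canonical shape of this kind of fast path – one draw call emits thousands of copies of a mesh, with the per-instance transforms living in their own buffer, so the CPU does almost no work per object.  A hedged sketch (my illustration – the sim’s actual code path may differ):

    // Requires GL 3.3 / ARB_instanced_arrays; illustrative only.
    #include <GL/glew.h>

    void draw_objects_instanced(GLuint mesh_vao, GLuint instance_xform_vbo,
                                GLsizei index_count, GLsizei instance_count) {
        glBindVertexArray(mesh_vao);
        glBindBuffer(GL_ARRAY_BUFFER, instance_xform_vbo);
        // A 4x4 per-instance model matrix occupies four vec4 attribute
        // slots (here, locations 4..7).
        for (int i = 0; i < 4; ++i) {
            glEnableVertexAttribArray(4 + i);
            glVertexAttribPointer(4 + i, 4, GL_FLOAT, GL_FALSE,
                                  16 * sizeof(float),
                                  (const void*)(sizeof(float) * 4 * i));
            glVertexAttribDivisor(4 + i, 1);  // advance once per instance
        }
        // One call, many objects.  This is also why attributes inside an
        // OBJ hurt: each state change forces a separate, CPU-driven draw.
        glDrawElementsInstanced(GL_TRIANGLES, index_count,
                                GL_UNSIGNED_INT, nullptr, instance_count);
    }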

Hardware Advice: Make sure you have your video card in a PCIe x16 slot!

The new Thunderbolt Mac laptops have only x8 PCIe to their discrete GPUs; this may turn out to be the limiting factor for X-Plane.

Pixels: How Much Do Effects Cost?

How about shader effects?  Some of the new effects in X-Plane 10 (atmospheric scattering, fade-out level of detail, FXAA, linear lighting, deferred rendering) use only pixel shading power on the GPU.  We turned on pretty much every effect that isn’t CPU/bus based (that is, the GPU pixel effects) and found that we could run with all of them at 20-30 fps.  Heck, we even turned on a few things that aren’t going to be ready for 10.0 but may come out during the version run.
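
To see why these are pure GPU costs, consider deferred rendering: the scene is drawn once into a set of screen-sized textures (a “G-buffer”), and lighting is then computed per pixel from those textures.  Here is a hypothetical G-buffer setup – X-Plane’s real buffer layout is not public, so treat this strictly as a sketch:

    // Hypothetical G-buffer for deferred shading; needs a GL 3.x context.
    #include <GL/glew.h>

    GLuint make_gbuffer(int w, int h, GLuint tex[3]) {
        GLuint fbo;
        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);

        // Three screen-sized targets, e.g. albedo, normals, position.
        const GLenum ifmt[3] = { GL_RGBA8, GL_RGBA16F, GL_RGBA16F };
        glGenTextures(3, tex);
        for (int i = 0; i < 3; ++i) {
            glBindTexture(GL_TEXTURE_2D, tex[i]);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
            glTexImage2D(GL_TEXTURE_2D, 0, ifmt[i], w, h, 0,
                         GL_RGBA, GL_FLOAT, nullptr);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0 + i,
                                   GL_TEXTURE_2D, tex[i], 0);
        }

        // Depth buffer so the geometry pass can depth-test normally.
        GLuint depth;
        glGenRenderbuffers(1, &depth);
        glBindRenderbuffer(GL_RENDERBUFFER, depth);
        glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, w, h);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                                  GL_RENDERBUFFER, depth);

        const GLenum bufs[3] = { GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1,
                                 GL_COLOR_ATTACHMENT2 };
        glDrawBuffers(3, bufs);  // geometry pass writes all three at once
        return fbo;
    }

The lighting pass then reads those textures and does all of its work per pixel – which is why the cost scales with resolution and GPU shader power, not with how much scenery you loaded.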

This is pretty different from X-Plane 9.  With X-Plane 9, you can turn on all of the GPU effects (there is really only one, “per-pixel lighting”), crank 16x FSAA, run at 2560 x 1024, and get 27 fps on my older Radeon 4870.  In other words, X-Plane 9 can’t even keep a previous-generation GPU busy at very high settings.

(To compare, the HD 5870 has double the shader cores of the 4870 – 1600 vs. 800 – and a 13% faster clock – 850 vs. 750 MHz – so it is at least double the GPU for fill and shading.  In other words, the test machine we used would have run X-Plane 9 at perhaps 45 fps when completely maxed out on a huge display.)
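
A quick back-of-the-envelope check of that ratio, using the public specs and the usual 2 FLOPs per ALU per clock for a multiply-add:

    #include <cstdio>

    int main() {
        // HD 4870: 800 ALUs @ 750 MHz; HD 5870: 1600 ALUs @ 850 MHz.
        double gflops_4870 = 800  * 0.750 * 2;  // = 1200 GFLOPS
        double gflops_5870 = 1600 * 0.850 * 2;  // = 2720 GFLOPS
        std::printf("5870 vs 4870: %.2fx\n",
                    gflops_5870 / gflops_4870);  // ~2.27x
        return 0;
    }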

In other words, X-Plane 9 couldn’t max out a GPU; X-Plane 10 can.

If you’ve been using a “medium”-tier card (e.g. a GTX 560 instead of a GTX 580, or a 5650 instead of a 5870), this is going to mean less “stuff” in X-Plane 10: less eye candy, or less resolution, or lower fps.  With X-Plane 9, a top-end video card would be bored; with X-Plane 10 it might have something to do.  I think this may surprise some users who have been buying cheaper video cards and “getting away with it” for a while now.

Hardware Advice: if you want the highest quality visuals at a high resolution, get a full-powered video card!

I still concur with Austin’s advice to the news list to buy the second most expensive video card.  Looking at the price and specs of the Radeon 6970 vs. the 6950, you pay a 40% premium to get about 19% more fill rate.  More importantly, the HD 6990 is actually just two video cards jammed together using internal CrossFire, at over double the price of one card.  And since X-Plane does not yet leverage CrossFire, you’d be paying a lot of money without seeing any fill-rate improvement.

The one case where you might want to go really crazy is if you want to drive an insane number of pixels, e.g. multiple huge monitors.  Your framerate is going to be inversely proportional to the number of pixels on screen, so if you are getting 30 fps with one 2560 x 1440 monitor, you’re going to get 15 fps if you try to run at 5120 x 1440 on two monitors.
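
That rule of thumb is simple enough to write down – a rough model that assumes you are purely pixel-bound (geometry cost does not scale with resolution):

    #include <cstdio>

    // Fill-bound scaling: fps falls linearly as the pixel count rises.
    double estimated_fps(double fps_ref, double ref_pixels, double new_pixels) {
        return fps_ref * (ref_pixels / new_pixels);
    }

    int main() {
        double one_monitor  = 2560.0 * 1440.0;
        double two_monitors = 5120.0 * 1440.0;
        // The example from the paragraph above: 30 fps -> 15 fps.
        std::printf("%.0f fps\n",
                    estimated_fps(30.0, one_monitor, two_monitors));
        return 0;
    }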

Epilogue: Are We Winning?

Something to think about: is it good that X-Plane uses more GPU and less CPU?  I think the answer is “yes” – these performance numbers are the payoff of a steady process of modernizing our rendering engine.

You can max out X-Plane 9 with a cheaper video card; not so with X-Plane 10.  But this isn’t because X-Plane 10 is more of a pig; it’s because X-Plane 10 provides new options (options you can turn off) that X-Plane 9 did not have.  With X-Plane 9, it was as if we had decided for you to turn off advanced rendering settings, leaving your GPU idle.

I believe that the X-Plane 10 rendering engine must scale: it should be able to max out a very powerful system (and provide increased visual quality in doing so), but also run on a less powerful system by backing off on rendering quality.

The Radeon HD 2600 has between 144 and 192 GFLOPS of computing power; the 5870 has 2720 GFLOPS – at least 14 times more computing power between an older iMac (but still one that can run HDR mode if you’re a little crazy) and the current maxed-out Mac Pro.  (If you have the HD 2400 in your iMac, that gap widens to 68x!)  It is in the face of that performance gap between high- and low-end systems that we are trying to make X-Plane as scalable as possible.

* PCIe 3.0 will be out one of these days – I don’t know when, and I haven’t had a chance to run X-Plane on such beastly hardware, but it is my hope that we’ll get additional shadowing performance via the increased geometry budget.

** Why not keep all of the geometry in VRAM?  That’s a long sad story about the OpenGL API that we’ll have to save for another day.

About Ben Supnik

Ben is a software engineer who works on X-Plane; he spends most of his days drinking coffee and swearing at the computer -- sometimes at the same time.

51 comments on “X-Plane 10 and GPU Power”

  1. Fantastic post and news Ben! No questions from me, just keep up the good work.
    PS I’m really looking fwd to decent AI planes. FSX ones are ‘thick as bricks’

  2. The other side could be that if your graphics card is limited by your CPU in X-Plane 9, it will still be able to run X-Plane 10, as the CPU is used more efficiently than before. Could I be right?

    By the way: what is more important? Pixel fill rate, or FLOPS, or …? What’s probably the bottleneck?

  3. Would SLI and CrossFireX be potential solutions for the PCIe bandwidth problem (geometry drawing), considering there are systems out there that can do 16x/16x? Or does it only really matter in terms of driving higher-resolution output?

    1. In theory, yes, but mobos that do 16x/16x are rare. Vendors can get away with 8x / 8x SLI because in AFR mode you only need half the bandwidth _if_ you are pixel bound, and being pixel bound is what SLI/Crossfire target.

      I think PCIe 3.0 will be a much more reachable solution for more users when it comes out.

  4. Where can we find out whether the GPU in the MacBooks is PCIe x8? What about iMacs? Why would Apple only use x8?

    1. Apple System Profiler tells you your PCIe slot speed – for example, my early Core 2 MBP says “PCIe lane width: x16” under Graphics/Displays.

      I don’t think anyone in the company has a latest gen (thunderbolt-equipped) MBP – if anyone out there has one, please post a snapshot of your graphics/display page to confirm/correct the bus speed.

      1. Indeed, the GPU in the current high-end 15″ MBP (the thunderbolt one) has only an 8x connection. I’m wondering whether a 16x “slot” would be much of an improvement, though – the Radeon HD 6750M isn’t really all that powerful, after all, especially when compared to desktop chips.

        I was quite surprised, by the way, that I can more or less max out X-Plane 9 on a notebook (except for downtown Manhattan and the like – too many objects still kill it big time). But I still maintain that notebooks are not for flight simming anyway, or gaming in general, for that matter, so I don’t really mind that X-Plane 10 will change that. Quite to the contrary: like you say, X-Plane should really be able to scale to the highest-end GPUs available in a year or so.

        Judith

        1. 16x vs 8x – it depends on what you like. If you like geometry, the slot matters – the i7 will push enough batches to jam the bus at 8x – at least we think.

          If you care about effects – HDR, scattering, etc. – you may run out of pixels on a mobile chip. That of course also depends on what res you run. But plenty of users just want more 3-d geometry.

          1. I also confirmed: the 2011 MBP high-end 15″ i7 has an x8 slot.

            Now, considering the posts from many months ago suggesting v10 would likely run better – or at the same settings or higher – on the same setup: is this still true? Despite being a laptop rather than a desktop (obviously an iMac or Mac Pro would work better), will users with a Thunderbolt MBP and 1024 MB of VRAM get a nice-looking, well-running experience with v10? I hope this article is not basically saying users without x16 are screwed or must spend tons of $$ to keep up. If I can’t run max settings on everything, oh well – it’s not like I do now. Still, the experience I’m getting with the 2011 MBP’s dedicated card is a huge step up from the ’08 MacBook’s integrated graphics, you know? That’s the premise: worth buying, good experience 🙂 Good to read about demo news.

            Thanks Ben

          2. I don’t know if this detail will help give a simple answer, but currently I am getting about 30 fps with storm clouds and heavy rain using the highly detailed CRJ-200 that jRollon made, with 20 km visibility. If I use a less detailed plane I can get 40-50, and when I turn some settings down, use fewer buildings, etc., I can get 60 fps.

    2. The thing is that Thunderbolt is basically an extension of PCIe.
      8 lanes are dedicated to the Thunderbolt port on newer MacBooks…

  5. Curious how it would run on the GT 330M included in the 2010 MacBook Pros.

    Per ASP it looks like it’s x16, and the i7 models shipped with 512 MB VRAM:


    Chipset Model: NVIDIA GeForce GT 330M
    Type: GPU
    Bus: PCIe
    PCIe Lane Width: x16
    VRAM (Total): 512 MB
    Vendor: NVIDIA (0x10de)
    Device ID: 0x0a29
    Revision ID: 0x00a2
    ROM Revision: 3560
    gMux Version: 1.9.21

  6. Unfortunately it is x8.

    Latest Thunderbolt 17-inch 2.2 GHz MBP model.
    But because of the 1.8x resolution difference – 2560×1600 on my 2.93 GHz 4-core Mac Pro with the 5870 vs. 1920×1200 with the 6750M – I get better fps on the MBP.

    The interesting thing is I get better fps on my first-gen Win7 i7, overclocked to 4.2 GHz, with the 2 GB 5870 Eyefinity 6 (six outputs) driving more than 11 Mpix across 5 displays (3x 2560x1440 equivalent…).

    I’ve also observed that the GPU was never more than 60% loaded. I could go with 16x FSAA at around 5500x2000 resolution and only lose a few fps.

    Speaking of very cool technology like Eyefinity 6 and bezel correction (rendering the bezels), it would be nice to have movable windows, because many times the windows are spanned across the monitors and you can’t see the text or buttons without actually moving the window… That’s why compatibility with Eyefinity / bezel correction would be a very nice addition.

    Thanks for the good news; at last we’ll have a scalable system and our GPU fans will kick in (now idling at 40% :).

  7. I never had the chance to get XP 9 running fully maxed out on pretty decent hardware (i7 950 – 8 GB RAM – NV GTX 460). I’m on Linux 64.

    The AA level is 2x, 16x aniso, at HD resolution. And on very poly-intensive scenes, like big airports (LFPG) in the Paris scenery, the framerate drops so much that, if I want to fly those kinds of heavy areas, I have to decrease geometric detail big time.

    Though I understand you can’t absolutely assure me it will run smoothly under the same conditions, I hope XP 10 will not have that kind of bottleneck on my machine. To test it for myself when it’s out – is there any XP10 demo planned?

    I’m waiting for your next post about the VBO implementation problem in XP – too bad there is such a bugger. I love this blog; thanks for all the work you’re doing, Ben and the others :).

    1. X-Plane 10 has a time demo, like X-Plane 9 – but this time the time demo is a bit revised to be more useful for testing hardware. (The original time demo was aimed at finding problems with the various parts of X-Plane itself.) We’ll have some reasonably good docs on the time demo when the sim is available.

      I don’t know what you’re referring to though re: VBO problems…I think our VBOs work correctly.

      1. Thank you for your reply about the demo.

        And for the VBO thing, I was referring to this:
        ** Why not keep all of the geometry in VRAM? That’s a long sad story about the OpenGL API that we’ll have to save for another day.

        Bye :).

  8. It’ll also be interesting to see how much additional CPU power will be required by ATC/AI.
    In The Other Flight Simulator, ATC/AI is maybe the biggest single CPU hit.

  9. Hi Ben,

    Thanks for the writeup. This is getting pretty exciting.

    Question about comparing performance of video cards – how can I compare my current setup (iMac i7, 27″ – I think it’s a 512 MB ATI Radeon HD 4850, but I’m not at my computer to check) to what you or others may run in a tower?

    I’m not sure how to factor in CPU, RAM, density altitude, etc.

    thanks-
    Andrew

    1. Once the demo is out, you’ll be able to run the framerate test and compare to other users. I can post my framerate results too if it’s useful to people, but my hardware is neither particularly rare nor impressive.

  10. Will you guys include an “optimise” section – e.g. so we could edit a small matrix of what we find most important (fps vs. scenery detail vs. number of aircraft – just high-level stuff) and then XP would optimise for our system? We’d even say “use 90% of the machine” to give some wiggle room.

    1. I have thought that maybe we could write a script or wizard that does that – just launches the sim with the timedemo and different options 200 times overnight – you’d come back in the morning with the “hot spots” in the matrix to try for yourself.

      1. That sounds fantastic (and technical)! Might save coding time for Xmas by just doing the simple matrix with pre-set option sets, then allowing the user to navigate to a sub-menu with the option to “upgrade me” or “downgrade me”, up and down the ladder of, say, 20 pre-set options.

        Naturally, and of course full score on that – an “auto-optimise” based on telemetry gained from live running on the computer is way cool. Kind of evolutionary. But sounds a little harder – and might be prone to crashes.

        Although note I’ve only had XP crash on me once in 2 years (unlike me crashing on XP, which has been a little more frequent)

  11. I definitely agree that X-Plane becoming more GPU-bound is a good thing and that it means the engine is being brought up to date. After Intel and AMD gave up on the GHz wars, the GPU is where most of the exciting stuff has been happening.

    Traditionally, flight sims have mostly ignored the GPU, resulting in sims that require an extremely fast CPU – usually overclocked far beyond what even “Extreme Edition” CPUs run at by default – but that work fine with even a mid-range video card from several years ago. It looks like XP10 will be more balanced, making all of the components of your computer work together for the best and smoothest visuals.

  12. Hi Ben,

    thanks a lot for the updates! So, X-Plane is going to become PCIe bandwidth bound; therefore I’d better wait for PCIe 3.0 to become available. In the meantime, I wonder if my good old PC will be enough to run X-Plane 10. To be more specific, I wonder what minimum Shader Model the GPU must support in order to run X-Plane 10. Will my good old Nvidia GF 7800 GTX (Shader Model 3.0) be able to at least barely run X-Plane 10, even with almost no eye candy?

  13. My 2010 iMac says…

    “ATI Radeon HD 5750:

    Chipset Model: ATI Radeon HD 5750
    Type: GPU
    Bus: PCIe
    PCIe Lane Width: x16
    VRAM (Total): 1024 MB
    Vendor: ATI (0x1002)
    Device ID: 0x68a1
    Revision ID: 0x0000
    ROM Revision: 113-B9710C-238
    EFI Driver Version: 01.00.417
    Displays:
    iMac:
    Resolution: 2560 x 1440
    Pixel Depth: 32-Bit Color (ARGB8888)

    2.93 GHz Intel i7 16GB 1333 MHz DDR3

    Will I be able to squeeze XP 10 onto my
    OWC Mercury Extreme Pro SSD:

    Capacity: 240.06 GB (240,057,409,536 bytes)
    Model: OWC Mercury Extreme Pro SSD

    Looking forward to X-P 10

  14. Will I need to get a DVD drive for X-Plane?

    I recently built a new computer, and since I haven’t used an optical drive in a while, I opted not to include one. I installed Windows and Linux from USB thumb drives, games come from Steam or direct digital purchase, and movies stream through Netflix. Would it be possible to purchase X-Plane on a 16GB USB stick (lots of region scenery at USB3 speeds!), or maybe a digital purchase/download option like that of Steam?

  15. How many cores / %CPU is X-Plane 10 using on your Mac Pro? I’m really excited about X-Plane being able to make use of the hardware that I have available.

    1. It depends on what I’m doing. During DSF load: all of them. Flying with low rendering settings and no AI – one plus a flicker. I haven’t had time to set up a good real-world examination of multi-core performance.

  16. Researching the different graphics chips, I came across the Intel HD Graphics 3000, which sits directly on the die of a Sandy Bridge i7 chip. The interesting thing is that they say it is integrated as part of the main CPU. I quote:
    “Alike its predecessor it does not have a dedicated graphics memory, but, can access the Level-3 cache of the CPU (which is now called ‘Last Level Cache’ – LLC) of up to 8 MB, depending on the CPU. Further memory requirements are covered by accessing the RAM of the computer.”
    So how is this interpreted by XP10 – is it good or bad, faster or slower, excellent or keep-as-far-away-as-possible, or even usable?
    I bring this up because most people would want to max out to run XP10 at its best, but the fact remains that many will have to settle for middle ground – is this graphics chip a reasonable compromise?

    1. The GPU integrated into the Sandy Bridge CPUs, despite the marketing stuff that marketing folks could tell you about technical improvements, is there to do the job you would typically demand of an integrated GPU: perform the most basic graphics tasks (i.e. displaying the operating system UI) and little more.
      The lack of dedicated memory, together with the low number of shader units, automatically excludes any chance of such GPUs performing acceptably in almost any 3D application.
      I have a PC at home with an AMD 780G chipset (featuring an integrated Radeon HD 3200 – DX10 compliant). It struggles with the elementary 3D games that come with Windows 7. I can barely run the good old X-Plane 8 at 1024×768 (by “barely” I mean that I squeeze out 15-20 fps with texture resolution and scenery details at low settings). I finally threw in an old dedicated GPU (an Nvidia GeForce 7800 GTX w/ 256 MB of video memory – that’s why I asked Ben if this hardware can cope with X-Plane 10).

      The only task the integrated Sandy Bridge GPU is good for is video transcoding (the Intel integrated GPU has a dedicated hardware unit for this – way faster than any software or GPGPU based solution). For any other purpose… go with dedicated GPUs.

      Best regards,
      Filippo

  17. Hi Ben,

    does your statement “your framerate is going to be inversely proportional to the number of pixels on screen” refer to everything you said about XP10’s GPU usage?

    Since I believe that a lot of users won’t run XP10 at 2560×1440 (I myself will run it at 1680×1080), that would lower the GPU cost by about 50%.
    That would probably allow maxing out XP10 with a medium-range card like a GTX 560 or an AMD 6870 and still having a smooth flight, right? Or isn’t it that easy, because geometry doesn’t worry about pixels? Any estimate?

    Best regards
    Flo

    1. Yep. If you don’t need huge res, you don’t need as much fill rate, and you don’t need as much GPU.

      I can’t say how well a mid-range GPU will do with v10 as I haven’t tried it. In some cases the mid-range cards are _heavily_ cut down. For example, that insane res I tried is only 2.8x the pixels of 1280×1024 (that is, a simple 19″ LCD display). By comparison, the 6970 has 3.5x the compute power of the 6670. In other words, the bottom half of the product spectrum falls apart fast. (But you could get something more mid-range like a 6790 and perhaps be okay.)

      And you are correct: geometry doesn’t worry about pixels. Making the screen smaller won’t help you with any geometry-related issues.

  18. Two questions occurred to me from this post:

    1) What is the implication for home cockpit building with multiple monitors? Can we now use the scalability to our advantage so that a 6-core i7 + high-end GPU can be used to drive multiple visuals with one computer? Three visuals using 2 cores per screen and one or more GPUs driving the monitors, potentially trading off some of the eye candy for the large visual with realistic perspective by turning off some of the effects?

    2) What kind of role the GPU architecture plays here? Do you expect ATI’s approach to excel in one area of rendering XP10 and NVidia architecture to be optimal for some other tasks? If so, what would these areas be? Or do you expect one or the other approach to generally outperform the other one when it comes to XP10?

    1. (1) X-Plane 10 does not support multiple renders to multiple screens, or even distinct 3-d views out of one machine. In the long term, I’m not sure that having a huge machine with a huge CPU and a huge GPU will be that great because you can’t get the PCIe bus bandwidth or memory bandwidth to match. I think there is some potential to scale up the number of views off of “single big machines” but only to a limited extent.

      (2) I don’t know – at this point both vendors have (with DX11) fully embraced instancing and tessellation and high geometry throughput, and their high-end cards are loaded with shader power. So I think “how new” and “how powerful” matter more than “which vendor” for performance.

      1. What is described above in (1) is what I’m looking for (building a home cockpit from one single computer), and it might actually be achievable with a couple of existing solutions from ATI (Eyefinity + CrossFireX) or NVIDIA (2D Surround + SLI). I’m planning on doing that myself, and I’ll learn how it goes during the process.

        1. I currently have my XP9 cockpit on a single i7-980X + GTX580 + ATI HD 4800, so that the GTX580 drives 3 monitors through a Matrox TH2Go for the visuals and the ATI card takes care of the instrument panel via Simavio. This works great except for the fact that the visual is one 3-d view stretched over 3 screens covering about 95 degrees of FOV. The problem, of course, is that the visual is not very realistic towards the edges of the display when it comes to the sizes of objects and the behavior of the horizon when turning and climbing, etc. It would be better to have the 3 screens as 3 different views angled at -32.5, 0, and 32.5 degrees, with 32.5 degrees of FOV on each. Ben’s answer is very informative for me because it means that it is not possible to achieve this with one computer in XP10 either, so now I know that I need to go computer shopping for 2 more computers to get the kind of visual that I want. The implication of the second answer is not as clear cut, but it seems that perhaps I could do OK by having GTX580s in the new computers as well.

  19. Just wanted to add that yes, that will require a really beefy configuration, and can be easily bottlenecked by the bandwidth issues that Ben mentioned.

    Hoping that LGA2011 gets some of that addressed next year with quad-channel memory (bringing bandwidth from 25.6 GB/s to 51.2 GB/s) and 40-lane PCIe 3.0 bus (from 5 GT/s to 8 GT/s).
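
    For reference, the effective-bandwidth arithmetic behind those numbers, assuming the usual encoding overheads (8b/10b for PCIe 2.0, 128b/130b for PCIe 3.0):

        #include <cstdio>

        int main() {
            // GT/s x lanes x encoding efficiency / 8 bits-per-byte = GB/s.
            double pcie2_x16 = 5.0 * 16 * (8.0 / 10.0)    / 8;  // ~8.0 GB/s
            double pcie3_x16 = 8.0 * 16 * (128.0 / 130.0) / 8;  // ~15.8 GB/s
            std::printf("PCIe 2.0 x16: %.1f GB/s; PCIe 3.0 x16: %.1f GB/s\n",
                        pcie2_x16, pcie3_x16);
            return 0;
        }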

  20. Firstly, thanks for all the hard work and resulting info – really interesting.

    As I am about to buy a new 2560×1440 monitor, I have a question about colour depth:

    I have a choice of 8- or 10-bit colour (16.7 million or 1.07 billion colours), so will X-Plane 10 support 10 bits, and will I notice the difference? I believe my HD 4870 GPU will support 10 bits at this resolution.

  21. Ben;

    I am not here to ask “when do we get the demo?!!”  But based on all the posts you’ve done explaining how X-Plane works, I have to say that when the day arrives, I will be tremendously excited. I have an 8-core Mac Pro with a 5870, and I am, uh, interested in seeing how hard I can push it.

    (Well, I can do that easily with multithreaded rendering, but I mean… you know… with a little more immediate feedback!)

    Thanks for balancing the CPU/GPU behavior of X-Plane (this is the part that is on topic). It’s been a little weird to be able to run X-Plane 9 at full speed with Blender rendering in the background, while the GPU is barely even aware that something is happening.

  22. Who do we think will buy that DirectX 7 card when there are much better, more recent ones out there? Now looking forward to the release of DirectX 11.
