Chris and I wince whenever we see a forum post like this*:

Every time I start up the left engine of the KX-1000 turbojet, rain appears on my right windscreen.  I hope 64-bit fixes that!

Seriously though, 64-bit seems to have gained a reputation in the community as the messiah that will cure all evil.  This reputation is unfounded.

64-bit will do one and only one thing: allow X-Plane to use significantly more virtual memory without crashing.  If you crash because you run out of address space, 64-bit will help you.  It is not going to fix anything else.
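
For the nerds, here is what that limit looks like in practice – a minimal C++ sketch (illustrative only, not X-Plane code) that prints a process’s pointer size and theoretical address space:

    // Illustrative only -- not X-Plane code.  A 32-bit process has 2^32
    // bytes (4 GB) of virtual address space, and the OS reserves a chunk
    // of that for itself; a 64-bit process has far more address space
    // than any current machine can back with RAM.
    #include <cstdio>

    int main() {
        // sizeof(void*) is 4 in a 32-bit process and 8 in a 64-bit one.
        std::printf("pointer size: %zu bytes\n", sizeof(void *));
        std::printf("theoretical address space: %llu GB\n",
                    (unsigned long long)1 << (sizeof(void *) * 8 - 30));
        return 0;
    }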

Here are some things 64-bit will not do:

  • 64-bit will not make X-Plane faster.  Using more memory doesn’t make things faster.  (Nerds: using x86_64 doesn’t make things faster.  We’re not register bound because secretly x86 chips do crazy register renaming.  We are, however, L2 bound, and making all of our pointer-based structures 2x in size isn’t a move in the right direction – see the sketch after this list.)  Past tests indicate that 64-bit is performance neutral.
  • It won’t fix any bugs.  If a system is broken in 32-bits, it will be equally broken in 64 bits.
  • It won’t make anything look any better.  The same settings should produce the same image in 32-bits or 64-bits.
  • It won’t bring world peace.  For that you need 128 bits.
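
To make the “pointers get 2x bigger” point concrete, here is a toy sketch (an illustrative structure, not an actual X-Plane data structure) of how a pointer-heavy node fattens up in a 64-bit build:

    // Illustrative only -- not an actual X-Plane data structure.  Every
    // pointer doubles from 4 to 8 bytes when recompiled for 64-bit, so
    // pointer-heavy nodes get fatter and fewer of them fit in L2.
    #include <cstdio>

    struct SceneNode {
        SceneNode *parent;    // 4 bytes in a 32-bit build, 8 in 64-bit
        SceneNode *child;
        SceneNode *sibling;
        float      lod_near;  // the payload stays the same size
        float      lod_far;
    };

    int main() {
        // Typically 20 bytes in a 32-bit build vs. 32 bytes in a 64-bit
        // build (exact values depend on padding).
        std::printf("sizeof(SceneNode) = %zu bytes\n", sizeof(SceneNode));
        return 0;
    }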

Actually, that second-to-last statement is not quite true.  With 64 bits, some settings will become possible for some users who were strictly memory bound before.  For example, if you couldn’t run extreme texture res and extreme forests together only due to memory limits, 64 bits will let you use extreme res and forests, and that probably will look better than having to pick one or the other.

Of course, with the address space limit removed, users will run into a series of new limitations:

  • Some users will run out of physical memory.  Fortunately RAM is very cheap, but it’s not easy to upgrade in a laptop or iMac.  (I stand corrected: apparently iMac RAM upgrades are now slip-in, which is cool.  I think a while ago you had to pop the case, which wasn’t much fun.  The TiBooks used to have a RAM drop-in under a pop-off keyboard, which was cool but put constraints on how strong the keyboard could be.  But I digress.)
  • Some users will not be able to show all of that extra 3-d “stuff” at reasonable framerates.
  • Some users will run out of VRAM at the higher texture resolutions.

There’s always one thing that limits performance…64-bit simply ensures that it isn’t the needless artificial limit of a 32-bit process.

* Whenever we see such a post by accident while looking at online shoe catalogs.  We do not read the forums!  Do not assume that a forum post is a bug report.  Insert the rest of the rant here, etc.

About Ben Supnik

Ben is a software engineer who works on X-Plane; he spends most of his days drinking coffee and swearing at the computer -- sometimes at the same time.

49 comments on “Things 64-Bit Will Not Do”

    1. Pretty much all laptops nowadays take the same SO-DIMMs for upgrades as well; the exceptions are the new Retina-display MacBook Pros and most ultrabooks and netbooks.

  1. Actually, for peace in the world, I guess not even a 1024-bit system would do…
    For the rest, well, I am very pleased that my right turbotwin-lowbypass-ramjet engine will start in heavy snow while using Train Simulator on the old 24″ iMac in the other room booted in Boot Camp…! Just kidding, obviously!
    Actually I have a 27″ iMac and I am constantly getting errors due to RAM address-space issues, so I am glad. Thank you.

    1. Just for reference: well before you max out the storage that 128 bits can address, you would need so much matter that your storage devices would collapse into a black hole.

  2. When world peace is so “close” … with 128 bits … then why bother with 64-bit at all? Even if it “only” takes a “few” extra years … I would vote for it 😉!

  3. Hi! While you’re talking about releases, I was wondering: is it planned to release the tool Laminar uses for generating roads out of OSM data soon? XPOSM doesn’t seem to be compatible with XP10 and I would like to release scenery with some road updates.
    Thanks!

    1. Hi,

      Well, the _source_ is available now – in fact, it has been since 10.0 shipped. But the tool we use to make our DSFs isn’t really easy to use. I have not yet figured out how we will package up our road algorithms into an easy-to-use tool, but we will get there someday.

      If a road scenery pack made with XPOSM works with v9 but not v10, file a bug. But I suspect the real problem is XPOSM not understanding v10 DSFs – I think XPOSM tries to pre-drape the roads (the only way to make roads in v9); in v10 roads are supposed to be draped by X-Plane.

  4. Is there any estimated release date for 64-bit? X-Plane is just not any fun anymore; I have come so close several times to shifting over to the dark side but keep hanging on. Some light at the end of the tunnel would be nice.

    1. We are not announcing a schedule for 64-bit at this time. I don’t think 64-bit is that far out, but I will not post an ‘estimated’ date – inevitably the estimate will be wrong and everyone will want to know why they were misled. So no dates until we have a solid one.

  5. How many files will be required to differentiate between 32-bit and 64-bit mode? Considering I’ll probably run two betas of each, apart from plugins, is it only the X-Plane.app that is different?

    1. Basically you’ll just get two apps in the outer level for your OS – you’ll be able to switch between 32 and 64 bits easily in a single install.

  6. Correction here: “Nerds: using IA64 doesn’t make things faster.”
    Mainstream chips are not IA64, but x86_64, and I presume you never ran XP on a true IA64 machine.

    And about performance, there should be a small gain if you compared un-optimized builds in x86 and x86_64. But for high-performance builds I assume you already did optimized builds in x86. By optimized I mean with extra compilation flags enabled (SSE instructions, build tricks…). And in that case, yeah, there is no step up in performance.

    Anyway, I think you’re right to explain what 64-bit is not to the almost-non-geeks; we never know for sure :).

    1. Oh shoot — IA64 is, like, Itanic, right? Yeah we haven’t even started on that port yet. 😉 And we only compared optimized-to-optimized; our approach to performance is to write C++ (and the occasional intrinsic) with the optimizer as our target audience. If the optimizer can unroll a loop, we won’t do it ourselves just to speed up the unoptimized debug build. The optimized build is the one that is basically the same speed since it is L2 bound.
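
      To illustrate (a toy sketch, not actual X-Plane source): we write the obvious loop and leave the unrolling to the compiler.

          // Toy example -- not actual X-Plane source.  Write the simple,
          // obvious loop; with optimization on (-O2/-O3) the compiler will
          // typically unroll and vectorize it on its own.  Hand-unrolling
          // would mostly speed up the unoptimized debug build, which isn't
          // the build that ships.
          #include <cstddef>

          void scale(float *v, std::size_t n, float k) {
              for (std::size_t i = 0; i < n; ++i)
                  v[i] *= k;
          }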

      1. Apologies. Should have been more clear. Sandy Barbour stated “… until X-Plane, the plugin SDK and my plugin are 64 bits on linux, trying to get [the python plugin] to work is always going to be a pain.”

        My question is whether the 64-bit release will address the first two. I’m sure Sandy will take care of the third as soon as he can.

  7. Hi !

    Well, this news is great; I hope that X-Plane will be more stable and will handle more graphics at extreme resolution.

    I am a fan of open-source programming and I use Linux distros daily.
    Will this patch be for Linux and Mac OS X?

    According to Phoronix.com, some OpenGL games (not every game) run faster in a Linux environment than on Windows or Mac OS X, and they see a big future in this part of the Linux world (don’t forget that Valve is going to release a Linux version of the Steam client soon).
    I should also mention that you have some problems with the ATI driver. I tried to benchmark and compare X-Plane with the same settings on the other OSes and the result was awful. The other option for the driver is the open-source one (xf86-video-ati) with Mesa. They have a lot of work to do in order to complete it (note that the latest version finally gained OpenGL 3 support) but they see a good future for this too.
    Will X-Plane gain support for the open-source drivers?

    Maybe it’s time for better support for the Linux version of X-Plane, as I can find a lot of bugs (I will try to make a list of as many bugs as I can find). Maybe there is a huge advantage to using a Linux distro, but we cannot answer that.
    It’s just a thought.
    I am so excited by your work – go on!

    1. Hi,

      I compared 10.10r3 on my HD7970 on Win7/64 and Ubuntu 12.04 LTS 64-bit and found that, both with and without pinned memory, their performance is within 1 fps (and the fps test isn’t good enough to measure more precisely). So Linux is not faster or slower. (However, pinned memory is borked on the Linux driver, so the real user experience is much worse since this driver extension isn’t available.)

      I have no intention to support the open-source drivers. It’s just a question of resources; a major driver of development time is debugging shader/OpenGL code on multiple GPUs and platforms; adding new combinations (Linux open source + back-end hw) is a loss – it doesn’t make sense for us to defer other work to spend time debugging an alternate driver stack.

      If the drivers just worked, then we’d say “well go for it”, but since they don’t we tell users “just go use the proprietary drivers”. The proprietary drivers share GL code with the desktop which minimizes debugging.

      If the open source drivers were fps competitive we’d pay more attention but I just don’t see them ever getting there; ATI and NV redo their GPUs so rapidly that it takes a huge concentration of firepower to develop a fully optimized back-end. I fear that by the time the open source driver catches up, the card is several generations old.

    2. The general reason is that most Linux ports do not support the same graphics features, due to poor Linux video driver support. You simply can’t use all the features of high-end graphics cards in Linux.

      The games aren’t faster; they’re just doing less. That’s not going to change before you get feature parity with the OS X and Windows drivers, which isn’t going to happen anytime soon.

      1. This has -not- been my experience with OpenGL — the Linux NV and ATI proprietary drivers support the same GL implementation (with perhaps 1-2 months of version skew if the drivers are on different release schedules) as Windows. So for example, as far as we know, the tessellator is waiting for us on Linux and Windows. The case of pinned memory being broken only on Linux ATI drivers is the rare exception – virtually every other driver issue we’ve had has been equal on Linux and Windows. (See also sRGB – equally borked on Lin and Win. I have tried a beta Windows driver that fixes it and will be very surprised if the same driver rev for Linux doesn’t fix it as well.) In a few cases, bugs we’ve seen have been Windows specific with Linux being unaffected.

        1. Ben,
          It’s interesting that you don’t see much, if any, difference in performance between platforms. On my system, Linux (highly optimized Gentoo) is almost 50% faster than any version of Windows. This is on a Q9300 @ 3 GHz with a GTX 470. All platforms are running the most current NVIDIA drivers.

  8. You should also note that there will be no new drivers for older graphics cards, so you cannot wait for a driver update to fix bugs.
    I think the final driver for the Radeon HD 2xxx–4xxx series is Catalyst 12.4.

  9. Hi Ben,

    I know I’m off topic, but speaking about graphics cards and drivers I was eager to ask you this question 🙂

    I’ve been surfing the net to learn something about deferred rendering (which to my understanding is used by X-Plane in HDR mode). I found some contributions (forums, blogs) hinting that deferred rendering is not really “SLI/Crossfire-friendly” because of its internal workflow; which seems to be consistent with the current situation of X-Plane, that (for now) doesn’t get significant advantages from multiple GPUs.

    Considering that more and more 3D games are gradually switching to deferred rendering engines (one example for all: Unreal Engine 3), if what I read is true, should I deduce that SLI/Crossfire configurations are gradually becoming useless in favour of high-power single-GPU configurations? Or is it something temporary, until future enhancements in drivers allow deferred rendering engines to fully exploit multi-GPU configs?

    … or maybe I’ve completely misunderstood the situation, which could be possible given that I’m no expert at all in this field…

    1. Hi Filippo,

      HDR is indeed a deferred renderer – it’s a deferred renderer that resolves into an HDR surface with a linear color space. Now…

      1. It is my understanding that X-Plane _will_ scale with SLI now; the reason NV did not default us to using SLI in their profiles is that we are typically CPU bound on the type of system that they test (one big monitor, one bad-ass video card). For example, if you run a GeForce 680 at 1920 x 1200 with a lot of rendering settings, you’re going to max out on CPU due to objs and shadows, not GPU due to HDR and clouds.

      2. SLI/Crossfire come in a number of flavors: AFR (alternate frame) should be friendly with deferred rendering engines if they are coded correctly. AFR is unfriendly to temporal anti-aliasing – the driver has to do some special work to make that case work. Deferred renderers are very unfriendly to _split frame_ SLI and CrossFire, but this case is highly non-optimal anyway because the driver has to push the entire frame to BOTH GPUs (double the “push”), whereas each frame goes out over the bus only once in AFR.

      Anyway, if you found a post where someone who works on Unreal 3 says “We don’t play nice with AFR” I’d be curious to see it, but two-pass rendering techniques like deferred rendering should not be fundamentally problematic.

      What IS problematic with deferred rendering is anti-aliasing…the G-Buffer has to have its store in “multi-sample” mode – which means 16x FSAA + deferred rendering is 16x the g-buffer storage. Who cares? Well, the problem is that G-Buffers tend to be pretty huge – at a minimum typically 4x the cost of a regular framebuffer. So FSAA + deferred is very frame-buffer expensive, which is why the anti-aliasing options in HDR mode are not as nice (from an anti-aliasing perspective) as in non-HDR mode.
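
      Back-of-the-envelope numbers (the per-pixel sizes here are illustrative assumptions, not X-Plane’s actual G-Buffer layout):

          // Illustrative arithmetic only -- assumed sizes, not X-Plane's
          // actual G-Buffer layout.
          #include <cstdio>

          int main() {
              const double pixels   = 1920.0 * 1200.0;  // one big monitor
              const double fb_bpp   = 4.0;              // plain RGBA8 framebuffer
              const double gbuf_bpp = 4.0 * fb_bpp;     // G-Buffer ~4x a framebuffer
              const double msaa     = 16.0;             // 16x multi-sampled G-Buffer
              const double mb       = 1024.0 * 1024.0;

              std::printf("framebuffer:         %6.1f MB\n", pixels * fb_bpp / mb);
              std::printf("G-Buffer, 1 sample:  %6.1f MB\n", pixels * gbuf_bpp / mb);
              std::printf("G-Buffer, 16x MSAA:  %6.1f MB\n",
                          pixels * gbuf_bpp * msaa / mb);
              return 0;
          }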

      1. Hi Ben,

        I’m very grateful for your precious insight!

        Basically this is good news for me, since in my recent experiments it looks like I’m strongly GPU bound at the moment (CPU is an Intel Core i7 3770 – Ivy Bridge; GPU is an AMD Radeon HD 7870): X-Plane never reports CPU utilization above 0.6, while the framerate never goes above 50 fps unless I turn off quite a bit of the eye candy (clouds below 30%, no cars, town and forest objects set to “sparse”). I was in doubt whether going for a second 7870 could improve my situation: based on what you wrote, there is a chance it could.

        Anyway, I’m aware you are not really satisfied with the current Catalyst driver (using 12.8 at the moment here)… I’ll wait for possible improvements in the rendering path on AMD hardware before going CrossFire…

        Thanks!
        Filippo

        1. Hi Filippo,

          It might make more sense to go for a single bigger GPU (in your case a 7970?) than two smaller ones. But check Wikipedia for specs and local prices, of course.

          cheers
          ben

          1. Hi Ben,

            To be honest, I decided to buy this model because at the time Radeon GPUs seemed to be the ones that didn’t have glitches (back then you were struggling with screen artifacts on NV hardware). Unfortunately it looks like I was too conservative with the model… now I must live with it; my wife would kill me if she found extra expenses on our bank account…

  10. Ben I’m wondering if the 64-bit version will do anything to decrease the blur we see out the window at high altitudes. Any comments? 🙂 Thanks!

  11. Ben,

    I’m curious about your test comparison between 32-bit and 64-bit. Was this comparing a 32-bit Win 7 OS against a 64-bit OS, or did you use the 32-bit WoW functionality of the Win 7 64-bit system? My understanding is that when using WoW, the system actually has to work a little harder, which decreases performance compared to a native 32-bit build. It may be splitting frog hairs, but I have multiple machines and am about to start putting together a new one, wanting to use one of them primarily for flight sim. My original thought was to use the 64-bit system for the master when 10.20 is released and a 32-bit one for the scenery rendering, but perhaps this is unnecessary?

    -Chris

    1. Hi Comicus,

      What the WoW layer does is translate 32-bit system API calls into their 64-bit equivalents and convert the return values to make them usable by a 32-bit application. This is indeed overhead, but it is so small that it goes almost unnoticed, at least from a “human” point of view (using benchmarks you could detect some difference).

      When I first had the opportunity to “put my hands” on a 64-bit OS (Windows XP x64), I tested it with Flight Simulator 2004 and got exactly the same performance as I used to see in my previous 32-bit Windows XP Home Edition.

      The little performance penalty introduced by the WoW layer can be offset by better performance in other areas of the operating system and/or device drivers. Another variable is how the specific CPU you are using performs in 32-bit mode compared to 64-bit. When Intel produced their first 64-bit-capable CPUs, the Pentium 4 600 series, they were dramatically slower in 64-bit mode (benchmarks showed about 20%), probably because the design was fundamentally a 32-bit CPU “quickly patched” to support 64-bit extensions. Current CPUs offer almost the same performance in 32- and 64-bit modes.
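
      For the curious, a 32-bit program can ask Windows whether it is running under the WoW layer. A minimal sketch using the documented IsWow64Process API:

          // Minimal sketch: a process asking whether it runs under WoW64.
          // IsWow64Process reports TRUE only for 32-bit code running on
          // 64-bit Windows; it reports FALSE for native processes.
          #include <windows.h>
          #include <cstdio>

          int main() {
              BOOL wow64 = FALSE;
              if (IsWow64Process(GetCurrentProcess(), &wow64))
                  std::printf("running under WoW64: %s\n", wow64 ? "yes" : "no");
              return 0;
          }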

      There are some advanced techniques to gain slight performance advantages in number-crunching applications, especially if floating-point calculations are involved, by converting these into fixed-point arithmetic using 64-bit integers. I don’t know if such an approach has been (or will be) used in the X-Plane code.
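
      As a sketch of the idea (illustrative code, not from X-Plane): a value is stored as a 64-bit integer count of small fixed fractions, so addition and comparison stay in the fast integer units.

          // Illustrative fixed-point sketch -- not X-Plane code.  Values are
          // stored as 64-bit integer multiples of 1/65536, so add, subtract
          // and compare are plain integer operations; only the conversions
          // touch floating point.
          #include <cstdint>
          #include <cstdio>

          const std::int64_t kOne = 1 << 16;  // 16 fractional bits

          std::int64_t to_fixed(double d)        { return (std::int64_t)(d * kOne); }
          double       to_double(std::int64_t f) { return (double)f / kOne; }

          int main() {
              std::int64_t a = to_fixed(1234.5);
              std::int64_t b = to_fixed(0.25);
              std::printf("%f\n", to_double(a + b));  // prints 1234.750000
              return 0;
          }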

      Best regards,
      Filippo
