I think I have randomly mentioned this to various developers, so I might as well randomly mention it to all developers. A new particle system for X-Plane is in our road-map – that is, a particle system where authors can control the properties and graphics in detail (similar to what you’d get in one of the AAA game engines).
If you have feature requests or ideas for things you need for a particle system, please write something short and coherent and email it to me; I can file it with my design notes for later integration.
When X-Plane checks X-Plane.com for updates, it calls the server with an identifier for itself that contains a little bit of information about the host machine it is running on: operating system type and version, whether it is the 64-bit or 32-bit version, and whether it is running as a demo.* (X-Plane does not send any personally identifying information about you, but the server can see your IP address because all servers can see the IP addresses of all incoming network traffic.)
I sometimes get asked by third party developers: what percentage of users are using 64-bit, or what percentage of users are on Mac or Linux. So I wrote a script to analyze the incoming data and break it down by platform, etc. Here are the results. (I have done this kind of analysis before, but this is the first time I wrote a good script to filter out possibly confounding data.)
Platform Breakdown
The data set consists of 24,917 unique IP addresses that ran a non-demo, global X-Plane 10.25r1 in the last week of 2013. This excludes users who have the regional version, a demo, don’t have their DVD in the drive, are running an old version, or who don’t have net connectivity. So that’s a big enough sample to get good data, even though it’s only a fraction of the total X-Plane 10 copies sold. Here’s the platform break-down:
IBM: 65.7%
APL: 32.2%
LIN: 2.0%
This matches the trend we’ve seen ever since FSX development was halted: growing market share for Windows. (We used to be 60-40 Windows/Mac back in the day.) Since X-Plane 10 is selling better than X-Plane 9, I believe that what we are seeing is growth, and the growth is disproportionately among Windows users. The Linux share appears to have shrunk, but it’s hard to tell since past data wasn’t as carefully analyzed. (The highest percent I have ever seen for Linux was 5%, but from this data I estimate the error bars on old data might be +/- a few percent, so who knows.)
64-Bit Adoption
Here’s the rate of 64-bit adoption for each OS.
All OSes: 82.0%
APL: 85.1%
IBM: 80.7%
LIN: 72.7%
It doesn’t surprise me that OS X has the highest 64-bit adoption – every Mac is running a 64-bit operating system and OS X has the least available address space. What does surprise me is that Linux has the lowest 64-bit adoption rate, since Linux users have had the strongest desire for 64-bit. (This desire is, I think, driven by the difficulty of setting up cross-64/32-bit operation on modern distributions.)
Operating System Breakdown
We don’t have a break-down of Linux distros or kernels – the Linux version of the sim doesn’t report that – but we do have operating system versions for Windows and Mac. On Windows, the _32 and _64 suffixes tell whether the user is running the 32- or 64-bit “edition” of the OS.
The numbers include the 64-bit adoption rate for that particular OS; naturally the 64-bit adoption rate on the 32-bit editions of Windows is 0% because those OSes can’t run 64-bit X-Plane. Fortunately 64-bit editions of Windows are becoming the norm – of the users running Windows 8, over 98% have the 64-bit edition. On OS X, every version of the OS is 64-bit capable.
Windows:
5.1_32: 1.8% (64-bit: 0.0%)
5.1_64: 0.2% (64-bit: 0.0%)
5.2_64: 0.1% (64-bit: 100.0%)
6.0_32: 0.9% (64-bit: 0.0%)
6.0_64: 0.8% (64-bit: 77.8%)
6.1_32: 4.1% (64-bit: 0.0%)
6.1_64: 66.0% (64-bit: 87.2%)
6.2_32: 0.4% (64-bit: 0.0%)
6.2_64: 25.7% (64-bit: 87.3%)
OS X:
10.6.5: 0.1% (64-bit: 42.9%)
10.6.6: 0.0% (64-bit: 100.0%)
10.6.8: 8.8% (64-bit: 78.9%)
10.7.0: 0.1% (64-bit: 28.6%)
10.7.2: 0.1% (64-bit: 100.0%)
10.7.3: 0.0% (64-bit: 100.0%)
10.7.4: 0.1% (64-bit: 81.8%)
10.7.5: 11.3% (64-bit: 78.7%)
10.8.0: 0.0% (64-bit: 100.0%)
10.8.1: 0.0% (64-bit: 0.0%)
10.8.2: 0.4% (64-bit: 88.2%)
10.8.3: 0.3% (64-bit: 88.0%)
10.8.4: 0.7% (64-bit: 84.7%)
10.8.5: 15.0% (64-bit: 82.0%)
10.9.0: 8.0% (64-bit: 84.3%)
10.9.1: 54.7% (64-bit: 88.5%)
10.9.2: 0.2% (64-bit: 100.0%)
Hopefully this is useful for third parties in deciding what operating systems and platforms to support.
* This is a standard practice – the update check runs over HTTP, just like your web browser.
In a past life, before I worked for Laminar Research, I was a plugin developer. So I’ll admit up-front that I might be the most guilty individual when it comes to this particular problem: log spam.
What do I mean by log spam? By log spam I mean printing information to Log.txt that isn’t necessary.
Log.txt exists so that we (and third party developers) can look at a single machine-generated file and get a strong idea of what a user did with X-Plane for diagnostic purposes. Therefore Log.txt needs to contain three kinds of information:
- Basic configuration information. We print your graphics card, your OS, etc. (In 10.30 we’ll even get your CPU type right on OS X. 🙂) It’s more reliable to log this than to send a user into ten different control panels to gather information.
- Very basic event logging, e.g. what did the user do? Where did they start their airplane, what scenery was loaded?
- All abnormal conditions, including error alerts that came up.
In particular, what we do not need to do is log the blow-by-blow progress of loading in excruciating detail.
If I may use my own sins as an example, older versions of XSquawkBox shipped with debug code left on that dumped the contents of its entire ICAO dictionary – thousands of lines of text. This information isn’t configuration (it’s not unique to particular installs), it isn’t an event – it is part of the event of loading XSquawkBox – and it’s not an error condition. What a mess! (Fortunately Wade has fixed this and turned such logging off in the newest XSB betas.)
Programmers write this kind of logging code to help trace program execution, but it’s important to turn it off – if we don’t, the log files get so verbose that everyone spends a ton of time just fishing through ten tons of junk.
Some more guidelines:
- Log the presence of files or sub-modules that are user-configurable, but do so briefly. It’s reasonable for SASL or Gizmo to list scripts that do not come with the base package, or for XSquawkBox or Pilots Edge to list CSLs found. (This information lets a developer see what a user has done to modify the plugin.)
- Don’t log the presence of required files unless an error condition occurs. An empty log can simply mean “everything went okay.” (The sim will already confirm successful plugin loading on a per-plugin basis.)
- Log rare events, e.g. a flight starting, a network connection being opened. Don’t log events that occur at high frequency, like receipt of a single network packet.
X-Plane is guilty of log-spam too; I’m slowly trying to cut down on this. For example, spurious autogen-tree warnings are now gone from the log in 10.25r1.
Support Debugging
Sometimes you need all of that logging. My suggestion is: if your add-on is complicated, make a specific interface to get verbose diagnostics that users can turn on as needed. For example, X-Plane has an option in “operations and warnings” to dump verbose play-by-play network data to Log.txt. When you enable this, you can get a ton of information about network operations, but the log will also be totally over-run. The option is meant only for developers who are integrating with X-Plane’s UDP networking, and thus it has to be turned on by hand.
The OBJ engine actually also has such a trick – if you put the single word DEBUG (all caps) at the end of your .obj file, the entire OBJ contents will be output in detail to Log.txt. Again, this is a ton of information, available when needed, but off by default.
You can support verbose logging in your add-on with a script variable, a dataref, or a change to a preferences file. Pick an easy mechanism, so that users can turn on verbose logging only when they need it.
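To sketch the idea (everything here is hypothetical – these are not X-Plane APIs, just one way an add-on might gate its diagnostics): a single verbosity flag, checked by one logging helper, is enough. In a real plugin the flag could be backed by a dataref or a preferences-file entry so users can flip it without recompiling.

```c
#include <stdio.h>
#include <stdarg.h>

/* Hypothetical add-on logging: one flag gates all verbose output. */
static int g_verbose_logging = 0;   /* off by default */

/* Always-on logging: configuration, rare events, error conditions. */
static void xlog(const char* fmt, ...) {
    va_list ap;
    va_start(ap, fmt);
    vprintf(fmt, ap);
    va_end(ap);
}

/* Verbose diagnostics: returns 1 if the line was emitted, 0 if it
   was suppressed, so callers (and tests) can observe the gate. */
static int xlog_verbose(const char* fmt, ...) {
    if (!g_verbose_logging)
        return 0;
    va_list ap;
    va_start(ap, fmt);
    vprintf(fmt, ap);
    va_end(ap);
    return 1;
}
```

With the flag off, high-frequency chatter like `xlog_verbose("got packet %d\n", n)` costs one branch per call and writes nothing to the log.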
Not Everyone Has Grep
Finally, a quick aside to my fellow professional programmers. You might be rolling your eyes and going: “Ben you idiot, it’s easy to filter the log. Everyone should just prefix their log entry and use grep.”
If we were all programmers, I would agree 100%. But please understand that aircraft and scenery authors have to look through the log to get diagnostic information about art-asset problems, and this is an audience that is sometimes not comfortable with command-line tools, huge 25 MB text files, massive amounts of text, etc. So we are trying to keep Log.txt more human-readable than, for example, dmesg on Linux.
X-Aviation just posted an update to SkyMaxx Pro – the new 1.1 patch brings big performance improvements and fixes rendering problems with HDR. From the user reports I’ve read, performance in HDR mode is good. (I hate to see users have to pick between HDR and third party add-ons; we want HDR to be the basis for superior next-generation aircraft and scenery.)
Getting a plugin that draws in 3-d to work with HDR mode requires some caution; the plugin APIs for drawing were designed in X-Plane 6, when ‘deferred rendering’ didn’t even exist as a concept.
I have updated the X-Plane SDK tech-note on 3-d drawing to contain modern guidance on how to cope with 3-d drawing callbacks. In particular, you get called twice when your plugin requests to draw in 3-d:
- The first callback is for drawing “solid” stuff that will be affected by spill lights; the only safe thing you can do from this callback is to call XPLMDrawObject.
- The second callback is after lights have been mixed; at this point normal OpenGL drawing works reliably (albeit without spill lights being mixed in). This is the time to draw translucent prop discs, coach marks and labels, clouds, smoke and particle systems, etc.
If your plugin does any 3-d drawing (e.g. custom particle system drawing or any kind of effects code), please review the tech note, and email me if you have questions. The next-gen CSL code sample that is linked from the article is tested and works correctly too.
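The two-pass rule above can be sketched as pure logic. This is not SDK code – the enum and function names below are mine, and how your callback learns which pass it is in is covered by the tech note; the point is simply that the two passes must do different work:

```c
/* Hypothetical two-pass dispatch mirroring the HDR drawing rule:
   pass 1 = "solid" (only object drawing, a la XPLMDrawObject, is safe),
   pass 2 = "blend" (normal OpenGL drawing of translucent stuff works). */
typedef enum { PASS_SOLID = 1, PASS_BLEND = 2 } DrawPass;

static int g_objects_drawn     = 0;
static int g_translucent_drawn = 0;

static void draw_solid_objects(void)  { g_objects_drawn++;     }  /* stand-in for XPLMDrawObject */
static void draw_translucent_fx(void) { g_translucent_drawn++; }  /* prop discs, labels, smoke... */

static void my_draw_callback(DrawPass pass) {
    if (pass == PASS_SOLID) {
        /* Before lights are mixed: only "solid" object drawing is safe. */
        draw_solid_objects();
    } else {
        /* After lights are mixed: regular OpenGL drawing is reliable now. */
        draw_translucent_fx();
    }
}
```

Plugins that draw everything in a single callback (the pre-HDR habit) end up doing translucent drawing in the solid pass, which is exactly what breaks under deferred rendering.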
The HDR check-box turns on HDR mode in X-Plane 10. But what is HDR mode?
High Dynamic Range (HDR) mode in X-Plane actually enables many features at the same time. We put in a single check-box because the features all fit well together, and we wanted to keep the interface simple.
- High-dynamic range rendering. In HDR mode, the world is rendered across a wider spectrum of brightness (just like real HDR photography)*, to better capture lighting levels. The result is then compressed to low dynamic range (which is what your monitor does).
- Tone mapping with real-time exposure. The HDR image is converted to LDR via a tone mapping algorithm that attempts to preserve the character of the image while keeping the dynamic range within limits of your screen. The level of exposure changes in real-time: if you look at a bright scene and then duck the camera behind a building, you’ll see your eyes slowly adjust up, then everything gets dark when you stare at the sun. (The level-adjustment algorithm runs entirely on the GPU for performance.)
- X-Plane uses deferred shading for all rendering except for translucent glass in aircraft. Deferred shading is a 2-pass approach to rendering that is commonly used in today’s shooter games, because it allows us to efficiently draw huge numbers of real 3-d spill lights. In the case of X-Plane, you get a lot more night lighting.
- The individual puffs in our 3-d clouds have softened intersections with the terrain to avoid ugly cloud-ground collisions in low-cloud conditions. (See pages 1 and 4 of the article for before/after pics.)
- We emit heat blur from the engine particles. The ability to add blur effects is a capability of the deferred rendering engine; something I’d like to do more with in the future.
- Optional atmospheric scattering. This effect enables a more advanced lighting and fog shader that runs on the GPU and models the diffusion of light through the atmosphere as it hits oxygen and water particles. Atmospheric scattering is what makes the far view ‘bluer’ than the near view.
So the moral of the story is: the HDR check-box actually enables a lot of effects!
Why Did You Put Everything in One Check-Box?
We had a few reasons to put a lot of effects into one bucket:
- Some of those effects require other effects. For example, you can’t have the heat blur or soft cloud puffs without the deferred renderer.
- We wanted to keep the UI simple. Our view is that the rendering settings screen is way too complicated and hard to tune, something we’re still looking to improve. So any time we add features, we have to ask: can we make this simpler?**
- There are fewer combinations for us to test if there are fewer settings. Sometimes we have bugs based on combinations of features, so having the features all enabled at the same time simplifies testing – which is extra important when we already have to test GPU features against different hardware and drivers.
* GPU nerds: The final render surface is in 16-bit floating point format.
** If you really want to get tweaky, most of the sub-effects within HDR mode can be accessed by settings.txt. But if you edit settings.txt and your computer transforms into an alien cyborg and destroys humanity, don’t hold me responsible.
In a previous post I discussed a new facility in X-Plane 10.25r1 (coming real soon) to disable aircraft-attached objects for performance optimization. There is a second use for this feature: performance analysis.
This post is targeted at aircraft authors, particularly authors who create complex aircraft with Lua scripts or plugins.
The basic idea here is to remove work from X-Plane and measure the improvement in performance. I strongly recommend you look at X-Plane’s framerate in milliseconds. An improvement of 1 frame per second is a big improvement at 15 fps and almost nothing at 60 fps. But saving 5 ms is always a good thing, no matter what the framerate.
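To see why milliseconds are the better unit, here is the arithmetic: a frame at F fps takes 1000/F ms, so the same “+1 fps” is worth wildly different amounts of real time depending on where you start.

```c
/* Milliseconds per frame at a given frame rate. */
static double ms_per_frame(double fps) {
    return 1000.0 / fps;
}

/* How many milliseconds a 1-fps improvement saves at a given base rate. */
static double ms_saved_by_one_fps(double fps) {
    return ms_per_frame(fps) - ms_per_frame(fps + 1.0);
}
```

Going from 15 to 16 fps saves about 4.2 ms per frame; going from 60 to 61 fps saves about 0.27 ms – a roughly 15x difference for the “same” 1 fps gain.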
The GPU Is Confusing
Your graphics card is basically a second computer, with its own memory, its own bus, and its own CPU (in this case called a GPU). Ideally the CPU on your computer and GPU on your graphics card are both working at the same time, in parallel, to present each frame of X-Plane.
OpenGL drivers accomplish this by building a “todo” list for the GPU, which the GPU then works through on its own time. Ideally the CPU is off computing flight models and running plugins while the GPU is still shading mountain pixels on screen.
For any given frame, however, one of the CPU or GPU will have more work than the other, and the other will be idle some of the time.
- If you have a lot of GPU work (e.g. 4x SSAA HDR with clouds on a huge screen) and not much CPU work (e.g. no autogen) then the CPU will have to wait. It will give all of its instructions to the GPU for a frame, then the next one, then the next one and eventually the GPU will go “stop! I’m not done with the frames you gave me” and the CPU will wait.
- If you have a lot of CPU work (e.g. lots of objects and shadows and plugins) but not much GPU work (e.g. a small screen without HDR on a powerful graphics card) the GPU will wait; it will finish its entire todo list and then go “uh, now what?”. Until the CPU can get more instructions ready, the GPU is idle.
Here’s the key point: your framerate is determined by the processor that is not idle. If your GPU is idle and your CPU is not, you are “CPU bound”; if your CPU is idle and your GPU is not, you are GPU bound. Optimizing the use of the processor that is idle will not improve framerate at all.
Viewing Specific Processor Load in X-Plane
X-Plane’s “frame rate” data output gives you two useful numbers for figuring out where X-Plane is spending its time:
- “frame time” gives you the total time to draw one frame, in milliseconds.
- “CPU load” gives you the fraction of that frame time that the CPU was busy.
For example, my copy of X-Plane is showing 9 ms frame time and .78 CPU load. That means that the GPU needs 9 ms to draw the frame, but the CPU needs 7 ms to draw the frame – I am GPU bound. (I am also in the middle of the ocean, hence the low numbers.)
Unfortunately if you are CPU bound (CPU load > 0.95 or so) there is no current display for the GPU’s utilization; we are working on that for X-Plane 10.30.
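Given the two data outputs described above (frame time in total ms, CPU load as the busy fraction), the split works out like this:

```c
/* Split X-Plane's "frame rate" data output into CPU time.
   frame_ms: total time per frame; cpu_load: fraction the CPU was busy. */
static double cpu_ms(double frame_ms, double cpu_load) {
    return frame_ms * cpu_load;
}

/* 1 = CPU bound (CPU nearly saturated), 0 = GPU bound.
   The 0.95 threshold is the rough cutoff suggested in the text. */
static int is_cpu_bound(double cpu_load) {
    return cpu_load > 0.95;
}
```

Plugging in the example above: cpu_ms(9.0, 0.78) ≈ 7 ms of CPU work inside a 9 ms frame, so the GPU is the limiter.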
Analyze Performance By Subtraction
Putting it all together:
- You can calculate the actual CPU time spent in your add-on from the CPU load and frame time data outputs.
- You can disable your add-on to see how much CPU time is now used; the difference is the CPU time your add-on is consuming.
- You can change settings to be GPU bound (and confirm that by seeing CPU load < 0.95). Once you are GPU bound, the improvement in frame time when you turn off your add-on shows how much GPU time your add-on was consuming.
Armed with this knowledge, you can find the real cost in milliseconds of GPU and CPU time of the various parts of your add-on. You can find out what costs a lot (the panel? The 3-d? The systems modeling?) and then focus your optimizations in places where they will matter.
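The subtraction itself is trivial, but writing it down keeps the bookkeeping honest (this assumes you measure the same scene, at the same settings, both times):

```c
/* CPU ms attributable to an add-on: measure frame time and CPU load
   with the add-on enabled and disabled, then subtract the CPU times. */
static double addon_cpu_ms(double frame_ms_on,  double load_on,
                           double frame_ms_off, double load_off) {
    return frame_ms_on * load_on - frame_ms_off * load_off;
}

/* GPU ms attributable to an add-on: valid only while GPU bound
   (CPU load < ~0.95 in BOTH measurements), because only then does
   frame time track GPU time directly. */
static double addon_gpu_ms(double frame_ms_on, double frame_ms_off) {
    return frame_ms_on - frame_ms_off;
}
```

For example, if the sim runs at 12 ms / 0.90 load with your add-on and 9 ms / 0.78 load without it, your add-on is costing about 3.8 ms of CPU time per frame.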
Support Performance Analysis in Your Add-Ons
In order to tell what part of your add-on is consuming CPU and GPU time, you need to be able to separate your add-on into its sub-components and disable each one.
If your add-on makes heavy use of plugin code or scripts, I recommend putting in datarefs that turn off entire sub-sections of processing. Good choices might be:
- Bypassing custom drawing of glass displays in an aircraft.
- Bypassing per-frame systems calculations.
- Bypassing per-frame audio code, or shutting off the entire audio subsystem.
- Turning off any 2-d overlay UI.
You can then use DataRefEditor to turn off each part of your system and see how much framerate comes back. If you get 5 ms of CPU time back from turning off your 2-d UI, you can say “wow, the UI should not be so CPU expensive” and investigate.
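Here is a self-contained sketch of the kill-switch idea. Everything is hypothetical: in a real plugin each flag would be published as a writable int dataref (via XPLMRegisterDataAccessor) so that DataRefEditor can flip it, and the “work” would be your actual per-frame code.

```c
/* Hypothetical per-subsystem kill switches for performance analysis.
   Each flag stands in for a writable int dataref that a user with
   DataRefEditor could toggle while the sim is running. */
static int g_enable_glass_displays = 1;
static int g_enable_systems_calc   = 1;
static int g_enable_audio          = 1;
static int g_enable_overlay_ui     = 1;

static int g_work_units = 0;   /* stand-in for real per-frame work done */

static void per_frame(void) {
    if (g_enable_glass_displays) g_work_units += 4;  /* custom glass drawing */
    if (g_enable_systems_calc)   g_work_units += 2;  /* systems simulation   */
    if (g_enable_audio)          g_work_units += 1;  /* audio update         */
    if (g_enable_overlay_ui)     g_work_units += 1;  /* 2-d overlay UI       */
}
```

Turning one flag off at a time while watching frame time tells you which subsystem the milliseconds are actually going to.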
X-Plane Supports Performance Analysis
The technique of shutting a system off and measuring the increase in performance is exactly how we do performance analysis on X-Plane itself, and as a result X-Plane ships with a pile of “art controls” that disable various pieces of the simulator.
This article provides a big list of art controls you can use to disable parts of your aircraft and measure the performance improvement.
Here’s where object-kill datarefs come in: the “cost” of drawing an object is mostly inside X-Plane (in driver and OpenGL code) but also in the time spent calling dataref handlers that drive animation. With an object-kill dataref, you can turn off your objects one at a time and see which ones consume a lot of CPU and GPU time. If an object is found to be expensive, it might be worth an optimization pass in a 3-d editor, or some time spent to improve the scripts that control its datarefs.
I have a lot to cover here – a little for everyone I think.
10.25 Release Candidate 1 Is Up
If you are a third party developer using 10.22, and you haven’t participated in 10.25 betas, please go do so now. You can get the beta by running the installer and clicking “get betas”. (If you run the beta, you are auto-notified to update.)
This build sneaks in object-killing in Plane-Maker; thanks to the aircraft developers who took time to privately test this feature last week!
A Fix to the Plugin SDK
This section is just for the programmers. I investigated a three-way conflict between X-Plane 10.25, Gizmo 64-bit and the new 64-bit XSquawkBox, and what I found was a bug in the C++ wrappers that ship with the X-Plane SDK headers. XSquawkBox was using them, but they were not correctly updated for 64-bit.
They are now. So if you use the “XPC” C++ wrappers in your plugin, please go get the new headers!
I’ve written about this before on the X-Plane dev email list, but the short of it is that ‘long’ as a datatype is not safe for plugin use. A long is 64-bits on Mac/Linux but 32-bits on Windows. If you use long, your data structures change size, which is never what you want.
The SDK widget APIs sometimes store widget IDs (which are pointers) in integer slots. In order for this to work, the slots must be pointer-sized – 64 bits in a 64-bit build. The old SDK (and XPC wrappers) used ‘long’ for this, but the correct type is intptr_t. The SDK made this change a while ago, the XPC wrappers have made it now, and you should be sure that your plugin isn’t using “long” with the SDK.
The failure mode of mixing ‘long’ and pointers on Windows is exciting: the upper 32 bits of the address of the widget get cut off. As long as you allocate your widgets early, your widget address is probably less than 2^32, and you are okay. But if your plugin loads later, your widget IDs (which are secretly pointers to structs) will be above 2^32, and converting them to long changes the address, crashing the sim.
This is exactly why Gizmo appeared to be “crashing” XSquawkBox: XSquawkBox was using ‘long’; if Gizmo ran first and allocated memory (which Gizmo is well within its rights to do!) then XSquawkBox’s widget IDs would be greater than 2^32 and the ‘long’ bug would kick in.
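The truncation is easy to demonstrate. On 64-bit Windows (LLP64), ‘long’ is 32 bits; the sketch below uses int32_t to stand in for Windows’ long, so it behaves the same on any 64-bit platform:

```c
#include <stdint.h>

/* Buggy round-trip: store a pointer-sized widget ID in a 32-bit slot,
   the way 'long' behaves on 64-bit Windows (LLP64). */
static uintptr_t roundtrip_long(uintptr_t widget_id) {
    int32_t slot = (int32_t)widget_id;   /* upper 32 bits are cut off */
    return (uintptr_t)(uint32_t)slot;
}

/* Correct round-trip: intptr_t is pointer-sized on every platform. */
static uintptr_t roundtrip_intptr(uintptr_t widget_id) {
    intptr_t slot = (intptr_t)widget_id;
    return (uintptr_t)slot;
}
```

A widget allocated below 2^32 survives both round-trips; one allocated above 2^32 only survives the intptr_t version – exactly the “works if it loads early, crashes if it loads late” behavior described above.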
(I don’t know when Wade will release an updated XSquawkBox, and I do not plan to discuss XSquawkBox any more on this blog. You can follow XSB here.)
Whose Bug Is It Anyway?
The XSquawkBox + Gizmo crash illustrates an important point: if two add-ons work by themselves but crash when used together, we can’t know which add-on is at fault without more investigation.
In this case, the bug was in XSquawkBox. But before I investigated on my computer, Ben Russell reported to me that removing some initialization code from Gizmo also “fixed” the problem (in that it made the symptoms disappear). Yet we know from investigation in the code that XSquawkBox had the bug (using long for pointers on Windows).
The moral of the story is: if two add-ons crash together, we can’t tell which add-on is at fault by observing which add-on changes to “fix” the problem. It is very common in the OpenGL world for the driver team to change the driver to work around buggy apps, and for apps to work around problems in buggy drivers. A change to code might be a fix for a bug, but it might be a work-around, avoiding a bug that still exists.
Here’s my take-away point: identifying a conflict between two programs is a way to narrow down a bug, but it is not a fix. We (Laminar Research) almost always ask you to remove add-ons when you see a crash. This is not a fix! We want you to remove add-ons to identify the conflict between X-Plane and a particular add-on (or between two add-ons). The next step is for us to figure out why the add-on might crash X-Plane or vice versa. Typically we prefer to contact the add-on maker directly to get technical information. What we are looking for is an identified conflict with the minimum number of variables.
There are a number of changes to how X-Plane 10.22 beta 1 handles memory for LuaJIT plugins.
Windows and Linux 64-bit: X-Plane Manages Memory
Previously, 64-bit Windows and Linux LuaJIT plugins had to allocate their own memory, and often they did not succeed. 64-bit LuaJIT can only use certain special areas* of memory; if, by the time a LuaJIT-based plugin loads, some other code has used all of that memory, then the LuaJIT plugin cannot operate.
With 10.22 beta 1, X-Plane pre-allocates as much low memory as it can and then provides it to LuaJIT plugins on demand.
This change should fix problems where LuaJIT-based plugins run out of memory, fail to load, etc. on Windows with lots of scenery packs and rendering settings cranked up.
If you ship a plugin that uses LuaJIT, make sure your plugin can use the memory provided by X-Plane. The process for doing this was defined during the X-Plane 10.20 beta and has not changed, so plugins that are coded correctly will just work.
OS X 64-bit: Crash Fix
Previously on OS X, when a LuaJIT plugin used up all of the low memory that X-Plane had reserved, X-Plane would crash. This was a bug in our code; X-Plane now simply tells the plugin “sorry, we’re out of memory for you.”
I have said this before in forum posts and I will say it again: your plugin should not exhaust Lua memory! There is typically over 1 GB of LuaJIT memory available; if your plugin exhausts it all, your plugin is doing something wrong.
So it’s good that this won’t crash, but if there were plugins that were causing this crash, those plugins probably need to be carefully examined – their memory use was way too high!
New Stats to Monitor Memory Use
There are two new “stats” in DataRefEditor (pick the “show stats” sub-menu option) for Lua memory use: lua/total_bytes_alloc and lua/total_bytes_alloc_maximum. The first tells you how many bytes of memory all Lua plugins are currently using; the second tells you the peak value ever recorded. A few notes:
- This only measures memory use provided by X-Plane. So 32-bit plugins will show “0” for both, because in 32-bit plugins, X-Plane does not provide memory to Lua.
- Lua is garbage-collected, meaning it allocates memory for a while, then periodically throws out unused stuff. So it is normal to see this value slowly rise over time, then periodically drop down by quite a bit. It is not normal to see these values increase indefinitely without ever dropping down.
- If your 64-bit Windows plugin uses LuaJIT but registers “0” for lua/total_bytes_alloc, your plugin is not getting memory from X-Plane and is not working correctly; fix your plugin ASAP!
- This memory includes allocations by Lua scripts. It does not include memory for textures, sounds, and other “native” resources provided by SASL or Gizmo. So you should not see a 4 MB allocation just because you made a 1024 x 1024 texture, for example.
* The lowest 2 GB of virtual memory, to be specific. Most other code running in X-Plane (the sim itself, the OpenGL driver, OpenAL, the operating system, other plugins) can use any memory, but they tend to consume some of this “LuaJIT-friendly” memory during normal operations. Thus X-Plane’s strategy is to pre-allocate the LuaJIT-friendly memory and reserve it only for LuaJIT use.
X-Plane 10.21 rc2 is out; this recut of the release candidate backs out most of my changes to the lights; in hindsight my change was too ambitious/crazy at way too late a point in the release process. The runway lights will still look better in 4x SSAA, but (like X-Plane 10.20/10.11) they will look dimmer if your monitor is bigger.
We’ll do something more involved with the lights for 10.30 when we have time for a proper beta test, and when Alex is around to look at my changes and tell me I’m an idiot.
I seem to be having the same conversation with lots of third party developers via email, so I’m going to write up some of my recent thinking on Lua – if nothing else, it will save me typing the same thing over and over.
One thing was very clear in the X-Plane 10.20 beta: while authors can’t agree on Gizmo vs. SASL*, the entire authoring community is surprisingly united around Lua as a scripting language – it’s everywhere in the payware add-on market.
But Lua is being used for a lot of different things – and these levels of usage are paralleled in Gizmo and SASL; my opinion on the use of Lua must be qualified by which usage we are talking about.
- The simplest level is “datarefs and math”. At this point, to fully utilize the aircraft SDK, an author needs to be able to create datarefs and do very simple math with them – something that is not possible inside an OBJ or panel.** Right now we don’t have an easy way to do this basic dataref math. Writing a C plugin and compiling for 3 platforms is a huge hurdle to jump just to wire up an animation to a custom rotary switch.
- Gizmo and SASL both provide ‘add-on’ SDKs – custom sound APIs, particle systems, physics, gauges – basically replacement SDKs for parts of the sim’s own extension system where authors wanted more than we provide. This stuff isn’t Lua at all – the underlying systems are coded in C++ and thus can only be maintained by the C++ developer who writes the Lua-based plugin. The development cost (in man-hours) to do a particle system in Gizmo or SASL is about the same as it would be to build it into the sim.
- Some authors have written fairly huge scripts in Lua – for example, doing complete systems simulations or navigation code in Lua. (At least, that’s what I am told, e.g. the JAR A320 – I haven’t read the Lua scripts myself.) This is “Lua as language for a huge program.”
This is my current thinking on these three tasks:
- Datarefs and math are a great use of Lua – it lowers the bar for authors, it’s exactly what scripting languages in games are meant for, and the underlying code in C++ is finite, limited, and therefore not a maintenance problem. I don’t know what LR’s relation to Lua, Gizmo, SASL, or scripting is yet, but I have been saying for a while (internally in company discussions) that we need something easier than C for this.
- I think that if an authoring SDK is limited in X-Plane, we (LR) need to make it better. For the most useful things people are doing with Gizmo and SASL, we often have similar features on our road map for X-Plane itself. (But note that these features aren’t necessarily coming soon – authors get a time-to-market advantage by using these outside SDKs.) I consider this plugin code to be a possible maintenance problem. For example, you can write graphics code in a plugin, but it may not integrate well with next-generation rendering engine features.
- I don’t see huge plugins or huge scripts as something LR should get involved in. If you want to make a truly huge or complicated add-on, that’s great, but it’s a big undertaking and it just takes a development team. I don’t know if Lua is good for huge development; the people who say no (people like me) have no serious Lua experience, and the people who say yes have no serious C++ experience. So far I’ve only heard from people who have lived on one side of the fence.
Anyway, one thing is clear: having LuaJIT work in a plugin is a necessity for X-Plane now; with 10.20 we’ve sunk the engineering cost to make this happen. I do not yet know how else (or if) we will leverage Lua.
* Don’t even bother to comment on Gizmo vs. SASL – I will delete any attempts to start a Gizmo vs. SASL discussion so fast that your head will spin around 360 degrees!
** No math in OBJs or panels. An OBJ is a 3-d model – it is data that you view, not a simulation to be executed! We do not want to mix visualization with simulation or data with code!