Category: Development

A New Broken Record

For years now I’ve been harping about ways to keep the number of batches down in your scenery. A batch is a single submission of triangles to the graphics card for drawing. Batches get rendered fast even if they contain a ton of triangles, but changing modes between batches is not very fast, so a few large batches is hugely better for performance than a large number of tiny batches.

To play that broken record one more time, there are two ways that you (a scenery designer) can cut down the number of batches:

  • Use a small number of larger textures instead of a large number of small textures, preferably sharing textures between similar scenery elements that are placed nearby. X-Plane will do its best to merge the content that uses those textures into single batches. We call this the “crayon rule.”
  • Use fewer attributes in your objects. Attributes usually require a new batch (after the graphics card’s mode has been changed due to the attribute). So if you’ve got 1000 attributes in your object, you’ve got a problem. (The cost is sketched below.)
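
To make that concrete, here is a rough sketch in C++/OpenGL terms (not X-Plane’s actual rendering code) of why attribute and texture changes hurt: every state change ends the current batch and forces an extra draw call.

```cpp
#include <GL/gl.h>

// Two textures (or two attribute states) -> two batches, with a slow
// mode change between them.
void draw_unbatched(GLuint tex_a, GLuint tex_b)
{
    glBindTexture(GL_TEXTURE_2D, tex_a);
    glDrawElements(GL_TRIANGLES, 30000, GL_UNSIGNED_INT, (const void *)0);

    glBindTexture(GL_TEXTURE_2D, tex_b);        // mode change between batches
    glDrawElements(GL_TRIANGLES, 30000, GL_UNSIGNED_INT, (const void *)0);
}

// The same triangles sharing one big texture (one "crayon"): a single
// batch and no mode change.
void draw_batched(GLuint atlas)
{
    glBindTexture(GL_TEXTURE_2D, atlas);
    glDrawElements(GL_TRIANGLES, 60000, GL_UNSIGNED_INT, (const void *)0);
}
```

The triangle counts barely matter; it’s the number of bind/draw pairs that adds up.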

Well, with X-Plane 9 I have a new broken record: avoid overdraw!

Overdraw is the process of drawing pixels on top of other pixels on screen. It happens any time we use blending to do translucency, and any time we use polygon offset to build the image in layers.

Overdraw is bad because with X-Plane 9’s pixel shaders, most users are limited by how fast the graphics card can fill in pixels (pixel fill rate), since those complex shaders run for every pixel. If you are at a screen res of 1200×1024 looking at the ground with no objects, that might be 1.2 million pixels to fill. But if there is an overlay polygon covering the ground, we have 2.4 million pixels to fill! That’s a huge framerate hit.
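
The arithmetic fits in one line; this hypothetical helper just formalizes the reasoning above:

```cpp
// Pixel-shader work per frame scales with resolution times the average
// number of layers drawn per pixel (the overdraw factor).
double pixels_shaded(int width, int height, double avg_layers)
{
    return double(width) * double(height) * avg_layers;
}
// pixels_shaded(1200, 1024, 1.0) -> ~1.2 million shader runs per frame
// pixels_shaded(1200, 1024, 2.0) -> ~2.4 million: half the effective fill rate
```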

Right now there’s not much you can do about overdraw. Once MeshTool comes out I will post some guidelines on how you can limit overdraw.

We took a step in the v9 global scenery to limit overdraw: in X-Plane 8 the global scenery tried to hide the repetition of flat textures by drawing them over each other with offsets. In X-Plane 9 this is done in a pixel shader (the texture is analyzed and swizzled in the shader and then drawn once), cutting down the number of times we must draw.

If you turn pixel shaders on and off in a flat area like Kansas and compare screenshots, you might see this: the farm textures are more repetitive without shaders. This gives everyone faster fps (with or without shaders) by eliminating overdraw.

Posted in Development, Scenery by | Comments Off on A New Broken Record

Scenery and the New Plugin SDK

Sandy and I are working on a major revision to the plugin SDK (all the old stuff will work, we’re just adding new things) that should be available in X-Plane 9 soon. The new “2.0” SDK APIs include some new functionality for working on scenery.

  • Plugins can find the height of the ground at a given location, which is necessary to draw in the 3-d world in a realistic way (e.g. vehicles that drive on the ground in a sloped airport environment).
  • Plugins can load and draw OBJ files using X-Plane’s built-in facilities. I’ve posted OBJ drawing code in the past, but this makes things even easier.
  • Plugins can look up virtual paths in the library to find artwork from scenery packages.

This makes a number of scenery-system concepts available to plugins.
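
As an example, here is roughly what a ground-height query looks like with the new probe API. The 2.0 SDK is still in development as I write this, so treat the names here (XPLMCreateProbe, XPLMProbeTerrainXYZ, and friends) as a sketch of the design rather than final documentation:

```cpp
#include "XPLMScenery.h"   // the new 2.0 scenery APIs

// Returns the height of the terrain (in OpenGL "local" coordinates)
// under the point x,y,z, or y itself if no terrain is found.
float ground_height(float x, float y, float z)
{
    static XPLMProbeRef probe = XPLMCreateProbe(xplm_ProbeY);

    XPLMProbeInfo_t info;
    info.structSize = sizeof(info);
    if (XPLMProbeTerrainXYZ(probe, x, y, z, &info) == xplm_ProbeHitTerrain)
        return info.locationY;   // the probe also reports the surface normal, etc.
    return y;
}
```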

I’ve been resisting OBJ-drawing support in the SDK for a while, but a few things changed my mind:

  • We’ve moved as much drawing in X-Plane to OBJs as possible and it’s been a big win. A lot of the dynamic elements are OBJs; they’re used in scenery, cockpits, and airplanes. Using OBJ files means our artists (who are not programmers) can customize just about every aspect of the sim. So by providing plugins with a file format that has a rich tool chain, hopefully we are helping third parties streamline content development. (A sketch of the new calls follows this list.)
  • With pixel shaders, X-Plane’s 3-d drawing environment has become complex and hard for third parties to safely augment. By encoding drawing at a higher level via pre-built OBJs (which can be animated via plugin-driven datarefs) we can insulate plugins from drawing-environment changes.
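
To give a feel for the OBJ-drawing side, here is a sketch against the same in-development 2.0 APIs (the OBJ path is a hypothetical placeholder, and this would be called from a drawing callback):

```cpp
#include "XPLMScenery.h"

// Load an OBJ once and draw a single instance of it at a given location.
void draw_my_truck(float x, float y, float z, float heading)
{
    static XPLMObjectRef obj =
        XPLMLoadObject("Resources/plugins/my_plugin/truck.obj");  // hypothetical path

    XPLMDrawInfo_t where;
    where.structSize = sizeof(where);
    where.x = x;   where.y = y;   where.z = z;
    where.pitch = 0.0f;  where.heading = heading;  where.roll = 0.0f;

    XPLMDrawObjects(obj, 1, &where, 1 /* lit */, 0 /* local coordinates */);
}
```

Because X-Plane does the drawing, the OBJ picks up our lighting, shaders, and any future drawing-environment changes for free.
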
Posted in Development, File Formats, Scenery, Tools by | 1 Comment

It’s Time to Try Nine

If you create plugins, airplanes, or scenery for X-Plane and haven’t tried your add-ons with X-Plane 9, please do so soon! It’s much easier for us to fix backward-compatibility problems while we’re still in beta. Beta 14 introduced some bugs (that should be fixed in beta 15 real soon), but I think we’re reaching the point where you can do compatibility testing.

We’re working on a public Linux beta – see here.

Posted in Development, File Formats, Panels by | Comments Off on It’s Time to Try Nine

Beware the Forceware

Brett and a few other users pointed out to me that the nVidia ForceWare 169.21 release appears to hose video for X-Plane. If you have an nVidia card, don’t update, or you may want to roll back a version.

This kind of thing has happened in the past – hopefully a revision will come out fairly soon. (But I do not have contact with the nVidia driver team for Windows on this…)

Also some users are seeing corrupt startup screens with ATI hardware – apparently some configs don’t react well to us turning vsync off. Not quite sure what we’ll do about that yet, but hopefully a fix will be in the next few betas.

Posted in Development by | 3 Comments

Is Bigger Always Better?

We’ve been preaching “one big texture, not lots of little textures” for a while now, and generally speaking, packing a lot of art into one big texture makes life easier for X-Plane, because it can draw more triangles at once before it has to tell the card to change what it’s doing. Inside the company we call this the “crayon rule.”

Now the total set of geometry and textures that X-Plane needs to use for one frame is the “working set” – you can think of it as the crayons that you keep out of the box because you need them all the time. And as I said before, if the working set becomes too big, your framerate dies.

Now with large panels we’re seeing a new phenomenon, one of the first cases where the crayon rule might not hold, and the reason is the working set.

When you make an airplane with a large panel in version 9, you can either use ATTR_cockpit, which lets you use the entire panel as a texture, or you can use ATTR_cockpit_region, which will let you use several parts of the panel. Each ATTR_cockpit_region is a texture change, so that’s more crayons. And yet ATTR_cockpit_region is usually faster.

The reason is two-fold:

  1. You can often use cockpit regions that don’t cover the entire cockpit texture. Large panels are rounded up to 2048 if they are larger than 1024 in any dimension, so the “wasted space” in a 1600×1600 panel is actually quite huge. If you can get away with some smaller regions, your total panel texture area is smaller because there isn’t wasted space due to this rounding, and you can also skip things like windows. Prepping the panel texture takes time, and it’s done once for lit and once for non-lit elements, so it adds up!
  2. It turns out there are two categories of textures that contribute to the working set, static textures and dynamic ones, and their impact on VRAM is very different. Dynamic textures are much more expensive. The panel texture is dynamic and it’s uncompressed, so it really costs a fortune: 32 MB of VRAM for a 1600×1600 panel, as sketched below. That’s not a lot for a static texture, but for a dynamic one that’ll kill you.
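
Here is a minimal sketch of that 32 MB figure, assuming uncompressed RGBA and the rounding rule from point 1 (one day copy plus one lit copy):

```cpp
// Back-of-the-envelope VRAM cost of a large panel texture.
int panel_vram_bytes(int panel_w, int panel_h)
{
    // Dimensions over 1024 get rounded up to 2048 (other power-of-two
    // rounding omitted for brevity).
    int w = (panel_w > 1024) ? 2048 : 1024;
    int h = (panel_h > 1024) ? 2048 : 1024;
    return w * h * 4 /* RGBA bytes */ * 2 /* day + lit copies */;
}
// panel_vram_bytes(1600, 1600) == 2048 * 2048 * 4 * 2 == 32 MB, all dynamic
```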

Here are the details on dynamic vs. static textures: the OpenGL driver keeps a backup copy of each texture in main memory, so that if it has to purge VRAM (to make room for more stuff) it still has the texture. As it “swaps” textures, the process is simply to send textures as needed from main memory to VRAM. No big deal.

But with a dynamic texture, the texture has been modified in VRAM! So the copy in system memory is old and stale. The graphics card thus must send the texture back to main memory, consuming twice as much bus bandwidth as normal. (To free 16 MB of VRAM and refill it takes 32 MB of transfer: 16 MB to copy the old texture back to system RAM and another 16 to send the new textures to VRAM.) On non-PCIe cards, this back-transfer might run at 1/8th the speed of the transfer to the card, so this is even worse on AGP.
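
In OpenGL terms the difference looks roughly like this (a sketch, not our actual texture manager): a static texture is uploaded once and the driver’s backup copy stays valid, while a dynamic texture is re-rendered into VRAM every frame with something like glCopyTexSubImage2D, invalidating that backup.

```cpp
#include <GL/gl.h>

// Static: upload once; the driver's system-RAM backup stays valid, so
// a VRAM purge only costs a cheap re-send later.
void upload_static(GLuint tex, const void *pixels)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 2048, 2048, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}

// Dynamic: modified in VRAM each frame, so the backup goes stale and a
// purge now costs a read-back *plus* a re-upload.
void update_dynamic(GLuint tex)
{
    glBindTexture(GL_TEXTURE_2D, tex);   // texture storage already allocated
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 2048, 2048);
}
```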

Thus the driver does its best not to throw out dynamic textures. And this is why the panel texture is so expensive. That P180 will cause X-Plane to make two 16-MB dynamic textures, and those textures will cause 32 MB of VRAM to be basically off the table. That’s less space for the other textures to swap in and out of. This kind of “permanent allocation” makes the VRAM budget tighter for all other drawing operations.

Given the right combination of large panels, large res, pixel shader effects (which make more dynamic textures), clouds, and FSAA, you can easily get even a 256 MB card to a state where the free space into which static textures are shuffled becomes horribly small, and the framerate just dies.

So the moral of the story is: yes, it can be worth 4 crayons (using panel regions) to avoid the huge cost of dynamic textures from large panels.

As for static textures (regular DDS files) that are 2048×2048, the jury is still out, but my guess is they don’t represent a huge performance problem. As one user pointed out to me, they’re only 2 MB when compressed (maybe more with alpha), so they’re not insanely huge, and they can be swapped out.

Posted in Development, Scenery by | Comments Off on Is Bigger Always Better?

Script Files and Options

Sometimes we find that users’ machines cannot run without hiding OpenGL driver features from X-Plane. That is, the computer says it can support VBOs, but when the sim asks for a VBO, something really bad happens. X-Plane (since mid-version 8) accepts a series of command-line options that cause the sim to ignore a given feature.

These kinds of bugs come and go as drivers are updated, the sim changes which technology it uses, and hardware cycles through the user base. The biggest one we’re seeing right now is that the new iMacs show runway lights as white squares unless the sim is run with the --no_sprites option.

We’re trying a new way to address these problems. In the past we would give users the command-line option; now we are building double-clickable script files that launch the sim with the appropriate options. Theoretically this…

  • Is less error prone for users.
  • Is quicker for users who may have to use the command-line option every time they launch the sim until a driver update becomes available.
  • Is quicker for us (in that we spend less time mailing out instructions and helping users who are unfamiliar with a command-line environment).

We did consider some other options, but this seemed like the least evil. The runners-up:

  • Just turning off the hardware feature permanently. We ruled this out because the performance hit would be significant and affect all users.
  • Attempt to identify and auto-turn-off known bad options. We do this in some cases, but it requires changes to the sim code, so it does no good if a new video card comes out after a given X-Plane release and introduces new problems; it’s not a total solution. Also, since a bug might be resolved in the field, if X-Plane auto-avoids certain configurations, we have to patch the sim once the configuration starts to work again.
  • Provide a user interface to turn these options off inside the sim. This was ruled out for two reasons: first, some of these options cause the sim to crash before a user can ever get to the rendering settings. Second, turning off these kinds of options can really kill performance, so leaving them in sight produces a whole new tech-support problem. (A user tells us their framerate is awful at the lowest settings… perhaps they turned off the hardware-acceleration options in an attempt to get the lowest settings… etc.)

We’ll see how well this approach works. So far it seems to be working better than handing out command-line options.

(Of course if we can address the problem by working around it with a change to our code, we almost always choose this option.)

Posted in Development by | Comments Off on Script Files and Options

NVidia: 2 Ben: 0

I found the root cause of another NVidia specific bug, and once again it’s my own stupid code. If you Google for driver bugs, you’ll find plenty of grumpy developers ranting about how card X does this wrong thing and card Y does that wrong thing…I figure it’s only fair to follow up and say “yep, that one was mine.”

Like the previous nVidia-only crash, this was a case where X-Plane was always doing something wrong, but only some drivers had problems with the behavior. So the crash was NVidia-specific, but X-Plane-caused.

I believe that this bug was manifesting itself either as a message that “scenery shift took more than 30 seconds” or as some kind of crash. One of the problems was that the diagnostics for this particular bit of code were really bad. So we’ve improved things a bunch…

  • There is more careful error checking during scenery shift, and those error messages are reported.
  • If the sim does crash, some new code will output a crash log on Windows that helps us isolate what actually happened.

Beta 12 will be out soon with the fix for the code that caused problems on NV hardware, as well as the improved diagnostics. So you may find that the sim just works better, but if it does still crash or report errors, please tell us – now we’ll have log files that will let us diagnose the problem a lot faster!

Posted in Development by | Comments Off on NVidia: 2 Ben: 0

“Driver Bugs”

I have spent perhaps the last two weeks tracking down driver-related problems. But the term “driver bug” is heavily overused (blog around and you’ll see that many OpenGL developers get frustrated…). A few examples of the gray areas:

  • Sometimes there will be a bug in application code that shows up only on certain hardware. Drivers are concerned with making video cards go fast, not spanking programmers who don’t know what they’re doing. This is exactly what happened to me in beta 2 with the crash-on-nVidia-with-C172 bug. This was just plain broken code in X-Plane, but for some reason the ATI drivers didn’t have a problem with it (probably because they were performing an optimization which let them ignore the bogus call I made). NVidia specific, but not NVidia’s fault!
  • On OS X with the Radeon 9600XT, runway lights don’t show up. Adding an extra line to the pixel shaders “fixes” the problem. I believe the problem is in the driver (or the shader compiler more specifically) but by changing the code to the shader we work around the problem. A change in X-Plane addresses the problem, but not X-Plane’s fault!
  • The “36061” errors that some users have been seeing turned out to occur because (through a very convoluted chain of events) X-Plane was asking the video card to operate in a mode it did not support. Turns out this can be fixed by changing X-Plane’s code (the fix will be in beta 12) or by getting new drivers. This one wasn’t really a driver bug – more that the drivers were limited in a way X-Plane did not expect. (Our fault for being picky! See the sketch below.)
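
For the curious: 36061 is the decimal value of GL_FRAMEBUFFER_UNSUPPORTED_EXT (0x8CDD) from the EXT_framebuffer_object extension, which matches the description above – the driver saying “this combination of formats isn’t one I support.” A minimal sketch of the kind of check involved (on Windows the EXT entry points must be fetched via wglGetProcAddress; omitted here for brevity):

```cpp
#include <stdio.h>
#include <GL/gl.h>
#include <GL/glext.h>

// After setting up an off-screen framebuffer, ask the driver whether
// it actually supports this configuration.
void check_fbo_status(void)
{
    GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
    if (status != GL_FRAMEBUFFER_COMPLETE_EXT)
        fprintf(stderr, "FBO incomplete, status = %d\n", (int)status);  // 36061 here
}
```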

The situation is fundamentally tricky – games are integrators of other people’s technology – as such, we get blamed by the end user for a fault anywhere in the system. At the same time, it’s way too easy to turn around and blame the part supplier, and unfair when the source of the bug hasn’t been identified.

I am looking now at problems on Windows with dual core machines and nVidia cards. The problem goes away both by changing a registry setting that affects the driver and by changing X-Plane’s code. So I think it’s too soon to tell on this one.

Posted in Development by | Comments Off on “Driver Bugs”

I will reply (soon)

At this point my in-box has approximately 180 emails from the month of December regarding X-Plane 9. So while I appreciate all of the feedback we’ve gotten (bug reports, performance, etc.) it’s going to take a while to reply. If you haven’t heard from me, don’t panic! I hope to answer a whole pile of emails next week.

In the meantime, I’ve been working on improved crash logging on Windows. Right now when we have a crash on Windows, all we know is (1) that X-Plane crashed and (2) which DLL we crashed in (which is always us or the video driver – not useful).

Coming soon: X-Plane will catch the fatal crash, examine memory to see what was going on, examine its own EXE to figure out the names of the functions involved, and output it all to a crash log that users can send us to give a much clearer picture of what happened. This information is called a “backtrace” – we’ve had it on the Mac for a while (OS X provides backtraces automatically with a crash) and it’s really useful.*
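
A minimal sketch of the catching half on Windows (not our actual implementation): install an unhandled-exception filter, record where the fault happened, and then let the process die. The symbol lookup is the part we had to roll ourselves.

```cpp
#include <windows.h>
#include <stdio.h>

// Runs after any unhandled fault; logs the crash address before exit.
static LONG WINAPI crash_filter(EXCEPTION_POINTERS *info)
{
    FILE *f = fopen("crash_log.txt", "w");
    if (f)
    {
        fprintf(f, "Fault at %p, exception code 0x%08lX\n",
                info->ExceptionRecord->ExceptionAddress,
                info->ExceptionRecord->ExceptionCode);
        // A real backtrace would now walk the stack frames and map each
        // return address to a function name via the EXE's symbol table.
        fclose(f);
    }
    return EXCEPTION_EXECUTE_HANDLER;   // let the process terminate
}

void install_crash_logger(void)
{
    SetUnhandledExceptionFilter(crash_filter);
}
```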

So my top priority is all of the users who are seeing problems during scenery load, and a new build with a back-trace should help to reveal what’s really going on.

I’m also working on putting additional timing and performance information into the sim so we can learn more about why some users have poor performance. So far here’s what I’m seeing:

  • 8800 users seem to have great performance. If you have this card and don’t have good fps, adjust your X-Plane settings and NV control panel settings – this card can bring it.
  • 8600 users sometimes have performance problems – not sure why.
  • Older nVidia GPUs (7600, 6800) sometimes have performance problems with the new eye-candy features – I am investigating.
  • The pixel shaders seem to slow down the new HD2x00 Radeons a lot more than expected…I still need to investigate this. This is the most surprising datapoint – the X1600 does very well, so I would expect newer GPUs to at least have that level of performance. I think this is something we might be able to address.

However, not all of the reports are consistent, so I think it’s too soon to make calls on recommended hardware. The only thing that’s clear is that most 8800 users who we do careful perf experiments with end up with huge framerates.

* Microsoft provides some back-trace facilities, but since we don’t use their compiler tools, we had to roll our own.

Posted in Development by | Comments Off on I will reply (soon)

V-Sync – Problematic in Practice

To those who have sent performance info: thank you, but you probably won’t hear from me for a week. I’m up to my eyeballs in reports and it’s going to take a while to get through them.

I finally found the code that allows X-Plane to turn off V-Sync. This should help nVidia users who are having framerate problems.

The basic idea is this:

  • X-Plane tells the graphics card to draw a lot of stuff.
  • The video card accumulates this “todo” list and works on it while X-Plane runs.
  • X-Plane indicates that the entire frame is done and tells the card to show the results.
  • Eventually, some time later, the card finishes the todo list and then shows the finished frame to the user.

V-Sync relates to the question of when this last step happens. When V-Sync (vertical sync) is off, the card shows the results as soon as it is done drawing.

But when V-Sync is on, when the card finishes drawing the world, it then waits until the monitor is done drawing its frame, and then shows the results. Without V-Sync we can have a situation where the top half of the monitor is showing a new frame and the bottom half is showing an old frame. (This is called “tearing”.)

So normally V-Sync is good because it prevents tearing. But the problem with V-Sync is that a frame can only be shown when the monitor refreshes. The video card has to wait until this happens, and this slows our framerate down.

In particular, most users have their monitors set to 60 hz. If X-Plane can only produce frames at 50 hz, the video card will have to further slow the framerate down to 30 hz (one X-Plane frame for every two monitor refreshes). If X-Plane falls below 30 hz, we end up at 20 hz (one X-Plane frame for every three monitor refreshes), and if X-Plane goes below 20 hz, we clamp at 15.
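
The clamping rule fits in one expression; this little helper (an illustration, not sim code) reproduces the numbers above:

```cpp
#include <cmath>

// With V-Sync on, a frame is only shown on a monitor refresh, so the
// displayed rate is the refresh rate divided by the (rounded-up)
// number of refresh intervals each frame takes to render.
double vsync_fps(double refresh_hz, double render_fps)
{
    return refresh_hz / std::ceil(refresh_hz / render_fps);
}
// vsync_fps(60, 50) == 30   vsync_fps(60, 25) == 20   vsync_fps(60, 19) == 15
```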

So when V-Sync is on, there can be large framerate hits for small losses of performance in the sim, and a real risk of getting locked around 20 fps.

(The minimum framerate in X-Plane is intentionally set to 19 so that we won’t fog up if the video card clamps us at 20 fps.)

So when beta 11 comes out, you may get some framerate back if you haven’t already hacked your graphics card’s control panel settings. If you still want v-sync, you can always set it this way in the control panel. But most users I’ve talked to are happy to have it off.

In an only vaguely related note, one of the reasons to have a high framerate is to have a smooth flight model. But Austin has now put a new setting in the operations-and-warnings dialog box: you can pick how many times per graphics frame the physics run. The normal ratio is 1:1, but for fighter and aerobatic pilots, you might find that you can get a nice feel at lower fps (20-30) by setting a higher ratio.
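
Conceptually the new setting works like this (a sketch; run_flight_model and render_frame are stand-ins, not real functions): the physics takes several smaller steps for each rendered frame, so the flight model stays smooth even when the graphics are slow.

```cpp
void run_flight_model(double dt);   // stand-in for one physics step
void render_frame(void);            // stand-in for drawing one frame

// One graphics frame: N physics steps, each 1/N of the frame's duration.
void sim_frame(double frame_dt, int physics_per_frame)
{
    for (int i = 0; i < physics_per_frame; ++i)
        run_flight_model(frame_dt / physics_per_frame);
    render_frame();
}
```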

Posted in Development by | 3 Comments