Order-independent transparency (OIT): first, the shiny robot.
So first, what’s so special about this? Well, if you’ve ever worked with a lot of translucency in X-Plane, you know that it doesn’t work very well – you invariably get some surfaces that disappear at some camera angles.
The problem is that current GL rendering requires translucent surfaces to be drawn from farthest to nearest, and which surfaces are far and which are near changes as the camera moves. There are lots of tricks for getting the draw order mostly right, but in the end it’s somewhere between a huge pain in the ass and impossible.
What’s cool about the robot demo is that the graphics card draws the translucency correctly even when it is not submitted from back to front, which means the app can just shovel piles of translucent triangles into the card and let the hardware sort it out (literally).
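To make the contrast concrete, here is a minimal sketch (not X-Plane code) of the CPU-side work that OIT makes unnecessary: re-sorting every translucent surface from farthest to nearest each time the camera moves.

```python
import math

def depth_sort(triangles, camera):
    """Sort translucent triangles back-to-front for the given camera.

    Each triangle is a tuple of three (x, y, z) vertices; the sort key
    is the distance from the camera to the triangle's centroid. The
    sort must be redone whenever the camera moves, and it can still
    fail for intersecting or mutually overlapping triangles.
    """
    def centroid_distance(tri):
        cx = sum(v[0] for v in tri) / 3.0
        cy = sum(v[1] for v in tri) / 3.0
        cz = sum(v[2] for v in tri) / 3.0
        return math.dist(camera, (cx, cy, cz))

    # Farthest first, so nearer surfaces blend over farther ones.
    return sorted(triangles, key=centroid_distance, reverse=True)

# Two triangles at different depths along -z, camera at the origin.
near = ((0, 0, -1), (1, 0, -1), (0, 1, -1))
far  = ((0, 0, -5), (1, 0, -5), (0, 1, -5))
print(depth_sort([near, far], (0, 0, 0))[0] == far)  # far one drawn first
```

With hardware OIT, the whole sort (and the per-frame CPU cost of redoing it) simply goes away.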
X-Plane is currently riddled with transparency-order bugs, and the only thing we can do is burn a pile of CPU and add a ton of complexity to solve some of them partly. That proposition doesn’t make me happy.
So I am keeping an eye on hardware-accelerated OIT – it’s a case where a GPU feature would make it easier for modelers to create great looking content.
This feature is not in the WED developer preview (because I just coded it) but: WED 1.1 will feature “click-to-split” for edges. With WED 1.1 you can option-click the edge of a polygon or line feature (but not a must-be-straight entity like a runway) to insert and drag a split point.
This will hopefully be a lot easier than the current, convoluted WED 1.0 technique of selecting the two surrounding vertices, picking “split”, and then repositioning the new vertex.
Hackers: this feature is not yet checked into the tree, so … building from source won’t help you. It’ll be available some time in the next week.
I realize as I write this that I am going to get some comments mocking the fact that X-Plane 940 is on RC, um…13. I don’t decide the milestones for X-Plane, nor do I decide the version numbers. If you want to discuss why X-Plane is 940 (and not 930 or 950) and why it goes beta and RC when it does, email Austin. What follows is all about the scenery tools, not X-Plane.
With that out of the way: I have released a WorldEditor 1.1 developer preview. So I wanted to explain in a little bit more detail what the difference is between a developer preview and a beta. Here is an approximation of the standard definitions of “milestones” – they are what I use for WED.
- Development: not all features are coded, no guarantees about bugs.
- Alpha: all features are coded, no bug is so severe that you can’t at least try a feature. (For example, if WED crashed on startup, it would not be alpha, because you could not test saving files.)
- Beta: all requirements of alpha, also no bugs that cause program crash or data loss.
- Release: no open bugs.
This all applies to known bugs. Beta software may crash and cause data loss – it’s just that we wouldn’t have put it out as beta knowing that this happened.
WED 1.1 is still in phase (1) – development, and the build I posted is a developer preview – a cut of whatever code I had lying around. So: I can’t promise it isn’t going to trash your data or crash! Be even more cautious with the developer preview than you would with a normal beta. You don’t want to run a five-hour session without saving your work, and you want to back up your work often – the “save” command might trash your entire project.
Why do a developer preview if it’s still so buggy? Some users who know how to compile WED from its source code are already using WED 1.1 and they seem to be enjoying it. So far it appears not to be lethally broken. Given that and the fact that most of the uncoded WED 1.1 features are usability and edge-cases, it seemed like the developer preview could be useful for getting earlier feedback.
One last note: the manual is not updated at all, nor is there any documentation on the new features. Let me be clear: no tech support or help is provided whatsoever. Do not email me or X-Plane tech support with “how do I use WED 1.1” questions. If you cannot figure out how to use WED 1.1 on your own, don’t use the developer preview.
We’ve made some progress in transitioning XSquawkBox to be under Wade’s primary management – read more here.
First, I want to thank the X-Plane users who have gone through piles and piles of regression tests to isolate some of the peskier bugs. Often a bug only occurs on hardware that we don’t have in-house, so this detailed reporting is a huge help in shipping a program that has to run on a huge number of configurations.
Sometimes a user will offer to work around a bug by changing an add-on, or just dropping the add-on. But…I have learned the hard way: never ignore a bug you don’t understand.
First, the bug might run much deeper than the reported case. Perhaps it is actually affecting dozens of other add-ons.
If we don’t understand the bug, how can we say “this is so unimportant that it can be ignored”?
Now some bugs, once diagnosed, may prove to be not worth fixing. But…until the bug is fully understood, we have to take the time to dig in. We can’t just give up because the bug seems unimportant.
This is off topic, and I try to keep the off-topic political crap on this blog to a minimum. But I do want to put a plug in for Freecycle.
Freecycle is basically a series of locally based mailing lists for the free exchange of things you would have thrown out. My wife and I use it every time we move – we get packing materials from other people who have just moved, and then give them back to our new local group when we are done. We also freecycle all of “that old junk” that we realize we shouldn’t move with us to the next location.
If you are ever in a situation where you need to throw out items that are potentially useful but just not worth the cost of carrying anymore, please consider Freecycle – it strikes me as a reasonably reliable way to keep items out of landfills. (Especially hard-to-recycle mixed-material items like electronics.)
One note of caution: freecycle is just a group of people exchanging “stuff” in their spare time – typically it could be 2-3 days before someone can pick up your donated item. So…if you have a lot of stuff to give away, start early. I made this mistake in DC and wasn’t able to give away all of the items because people couldn’t pick them up soon enough.
Laminar Research has closed its Washington, DC satellite office and opened a new office outside Boston. Okay – that is completely cheeky – LR employees all work at home, and I have moved from the DC area to the Boston area.
This has put me a little bit behind schedule with, well, everything. I have four or so interesting example plugins almost ready to be published, WED is ready for some kind of developer preview, and I have some X-Plane 940 bug reports to go through. I hope to get the back-log cleared out this week. But first priority: unpacking the computers (and figuring out how to get an internet drop into the new home office).
If you have ever textured an airplane, then you know that you can’t use the same texture for both sides of the plane if there is writing on the fuselage. The writing will be horizontally mirrored on one side.
The same thing goes for normal maps – you can’t mirror a normal map without getting bogus results. Think of your normal map as a real 3-d (slightly extruded) piece of metal. If the text reads the wrong way, the metal must be flipped, with its exterior side facing the interior of the airplane.
Having your normal map flipped does more than swap “inset” and “extruded” – it completely hoses the lighting calculations. In some cases this will be obvious (no shine where there should be shine), but in others you will see shine when the sun is in a position that shouldn’t produce it.
The moral of the story is: you can’t recycle your textures by flipping if you want to use normal maps.
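Here is a minimal sketch (not X-Plane’s shader, and with made-up texel and light values) of why a mirrored normal map hoses the lighting: after mirroring, the tangent-space X component of each decoded normal points the wrong way, so the diffuse term N·L comes out wrong for any light that isn’t dead-on.

```python
def decode(rgb):
    """Map an 8-bit normal-map texel (0..255) to a normal in (-1..1)."""
    return tuple(c / 127.5 - 1.0 for c in rgb)

def diffuse(n, l):
    """Lambertian diffuse term: clamp(N . L, 0)."""
    return max(0.0, sum(a * b for a, b in zip(n, l)))

texel = (200, 128, 230)   # a stored normal leaning toward +X
light = (0.6, 0.0, 0.8)   # light arriving from the +X side

n_correct = decode(texel)
# Mirroring the texture flips the geometry, but the stored red channel
# is untouched -- the decoded X component now points the wrong way:
n_mirrored = (-n_correct[0], n_correct[1], n_correct[2])

print(diffuse(n_correct, light) > diffuse(n_mirrored, light))  # True
```

The mirrored normal leans away from the light instead of toward it, which is exactly the “shine in the wrong place” symptom described above.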
Now before you come at me with pitch-forks, I don’t hate photo-realistic scenery, and X-Plane’s performance with orthophoto-based scenery is very, very good. But…the term “Photo-Realistic”…it makes me crazy. Here’s why:
First, photo-realistic scenery links the use of photographs in scenery to realism in its very name, and I don’t buy it.
Yes, some photo-based scenery packages are realistic looking, by today’s standard of flight simulation. Some are not. Just look at any old photo-realistic package to see what I mean…realistic is a relative term, defined by how much fidelity we expect, and that expectation has steadily gone up. Even with a modern package, a photo-based scenery pack might not be realistic if the photos are not used well.
(For example, is a package that uses orthophotos on the mesh but provides no 3-d in a city still considered realistic now? What kind of review would such a package get?)
Nor do photos have a monopoly on realism. They can look nice when well used, but I would put Sergio’s custom panel work up against any photo-based panel. (Sergio does not manipulate photos for his panels; he constructs them from scratch. He has thousands of photos for reference, but the pixels you see did not originate in any photo.)
Second, the term photo-realistic (in the scenery world) is most commonly applied to scenery that applies orthophotos to the terrain mesh in a non-repeating way. But orthophoto base meshes don’t have a monopoly on the use of photographs, which can be used to form land-class textures or to texture objects.
Okay, so “hate” is a strong word. But I feel some frustration whenever I see scenery discussed in terms of “photorealism”.
I’ve had a few inquiries about environment maps, normal/specular maps in more places than just OBJs, etc. The short answer is: a lot of these future rendering engine enhancements are good ideas that we like, but we have other features that are already partially implemented that we need to productize first.
In particular, one of those half-finished features not only needs to be shipped, but also affects the way just about every other rendering effect works. So it’s better to finish these features first, and build new effects within the context of these “new rules”.
Here are some of the ideas that I’ve heard kicked around:
- Environment maps on the plane – I like it, it’s not that hard to do, and the framerate hit could be ramped up and down with detail. See above about finishing other features first.
- Next-gen texturing on runways (wet runways, environment maps, bump mapping) – I like all of it. The runways really need to be addressed comprehensively, not piece-wise, in order to find a rendering configuration that meets our scalability and efficiency needs. (In other words, we can’t just burn tons of VRAM on the runways, and we need a way to render them that works on low and high end computers with one set of art assets.)
- Normal mapping on the ground. I like it, but I wonder if this isn’t part of a bigger idea: procedural texturing on the ground. E.g., if we want to add detail on the ground, can we add it with multiple layers at different resolutions, with a shader adding yet more detail?
Generally speaking, it’s not that hard to push a feature out from one part of the rendering engine to another. For example, normal maps on facades (if anyone cares 🙂) would make sense and be a fairly trivial change. I hesitate on polygons only because the ground might require something a little bit more sophisticated.
Going Crazy With Choices
Here’s a straw man of why polygons might be different: draped polygons can’t have specular shininess right now, and I don’t think anyone is complaining. So it’s a bit of a waste to use the alpha channel of a normal map for shininess. Perhaps it could be used for something else like an environment mapping parameter.
Hrm…but we would like to have environment mapping on airplane objects too someday. Well, we could go two ways with that. We could use the alpha channel for both shininess and environment mapping…not totally unreasonable, but it wouldn’t let us have a glossy non-reflective material, e.g. aluminum vs. shiny white paint.
Math nerds will realize that the blue channel in normal maps can be “reconstituted” from the red and green channels by the GPU (at the cost of a tiny bit of GPU power). That would give us two channels to have fun with – blue and alpha. Perhaps one could be shininess and one could be an environment mapping material.
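For the non-math-nerds, a minimal sketch of the trick: because the stored normal has unit length and its Z (blue) component is non-negative for a surface facing the viewer, blue can be recomputed from red and green, freeing that channel for other data.

```python
import math

def reconstruct_z(x, y):
    """Given the X and Y components of a unit normal (-1..1), recover Z.

    Valid because x*x + y*y + z*z == 1 and z >= 0 for a tangent-space
    normal that faces outward from the surface.
    """
    # Clamp against tiny negative values from 8-bit quantization error.
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))

# A normal leaning toward +X: X=0.6, Y=0.0 implies Z=0.8.
print(abs(reconstruct_z(0.6, 0.0) - 0.8) < 1e-9)  # True
```

The GPU does this per-texel in the shader, at the cost of a multiply-add and a square root.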
Well, shininess is still no good on the ground. But…perhaps that would be a good place to store dynamic snow accumulation? Hrm…
The point of this stream-of-consciousness rant is that the design of any one rendering engine feature is heavily influenced by its neighbors. We’ll get all of these effects someday. If there are features that are really easy, we can get them into the sim quickly, but the only obvious one I see now is using bump maps on other OBJ-like entities (which at this point would mean facades).