In my post on 64-bit computing and X-Plane, there’s a point that’s implicit: there is a cost (in development time) to adopting any new technology, and it takes away from other things. I’ve been slow to work on 64-bit X-Plane because it would take away from things like texture paging and generic instruments. Similarly, there is a cost every time we do a build for each configuration we support, so we would pay for 64-bit continuously, by supporting six configurations instead of three (three operating systems x two “bit widths”, 32 and 64 bits).
We have a similar problem with graphics hardware, but it’s even more evil. Moore’s Law more or less says that in a given period of time, computer technology gets twice as fast. In the case of graphics cards, each generation of cards (coming out about every 12-18 months) is twice as fast as the last.
This has some scary implications for X-Plane. Consider these stats for video cards (taken from Wikipedia):
Card      Date   Fill rate     Bus           Memory bandwidth
GF3       01Q4    1,920 MT/s   AGP 4x          8.0 GB/s
GF4 Ti    03Q1    2,400 MT/s   AGP 8x         10.0 GB/s
GF5950    03Q4    3,800 MT/s   AGP 8x         30.4 GB/s
GF6800    04Q2    7,200 MT/s   PCIe x16       35.2 GB/s
GF7900    06Q1   15,600 MT/s   PCIe x16       51.2 GB/s
GF8800    06Q4   36,800 MT/s   PCIe x16       86.4 GB/s
GF9800    08Q2   47,232 MT/s   PCIe 2.0 x16   70.4 GB/s
Let’s assume we support any video card from the last five years (in truth we support more than that). In 2006, the fill-rate gap between the best card and the oldest supported card was 13,680 megatexels per second.
Now in 2008 the gap is 43,432 megatexels per second!
In other words, the gap between the best and worst cards we might support has more than tripled in only two years!
This is no surprise – since cards get twice as fast with every revision, the gap for a given number of generations also gets twice as wide.
What this means for us, programming X-Plane, is that building a single simulator that runs on both the very best and the very worst hardware is becoming increasingly difficult, as performance at the high end runs away.
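The arithmetic behind this runaway gap is easy to sketch. As a toy model (the numbers and the fixed support window here are illustrative assumptions, not X-Plane policy): if fill rate doubles each generation and we support cards up to four generations old, the absolute gap between the newest and oldest supported card also doubles every generation.

```python
def fill_rate(base_rate, generations):
    """Fill rate after some number of doubling generations (toy model)."""
    return base_rate * (2 ** generations)

def support_gap(base_rate, newest_gen, window=4):
    """Gap between the newest card and the oldest card still supported."""
    oldest_gen = max(0, newest_gen - window)
    return fill_rate(base_rate, newest_gen) - fill_rate(base_rate, oldest_gen)

# Each new generation doubles the gap itself:
# support_gap(1920, 5) == 2 * support_gap(1920, 4)
```

That last comment is the whole point: the high/low spread is geometric, not linear, so every hardware generation makes the “one sim for everyone” problem twice as hard.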
As the 920 beta finishes up, I am working on some WED features. Future WED updates will be smaller and more frequent; I think it’s more useful to get at least some features into WED quickly, rather than holding them until I have one huge update.
So this list is short – that doesn’t mean that other features won’t happen – they just won’t happen as soon. In particular, I am very aware that WED needs better taxiway sign editing and better polygon editing. Those will have to wait though.
WED 1.1 will provide overlay editing. WED will not be a replacement for OverlayEditor – Marginal has done a really great job with his tools. However, having overlay editing in WED will allow people making airports to see complete apt.dat 850 data and overlay data at the same time. This is particularly useful for authors making custom scenery where the apt.dat format is augmented with custom DSF overlay pavement and lines.*
WED 1.1 will have 3 features:
Note that while roads and beaches are technically allowed in overlays, they are not on the list. The reason is that right now both of them require elevation data embedded in the road and beach geometry itself. Since WED doesn’t have a base mesh, it can’t sanely provide this information.
Eventually we will have some way to “drape” roads – at that point it will make sense to provide WED road editing too. But let’s not delay at least some overlay editing that long!
I don’t know exactly how long this work will take – my rough guesstimate is about 2 weeks. But…that depends on my working on WED for two weeks straight without interruption!
* Every type of element in apt.dat can also be created with custom-provided artwork and an overlay DSF — draped polygons for taxiways, draped lines for taxiway lines, object strings for taxiway lights, etc. The idea is to provide a way for authors who want to extend realism beyond the scope of apt.dat 850 to insert custom artwork. Apt.dat 850 is not a modeling format, so you cannot provide PNG files with it.
If you look at the LOWI demo area, you’ll see that some of the pavement in that layout is in the apt.dat file, but some pavement is in the DSF overlay. Creating this demo area required a bunch of DSF2Text hacking by Sergio and myself. With WED 1.1 it will be possible to do this completely using WED.
I don’t usually blog about plugin issues, but this one falls into limbo, somewhere between plugins, aircraft design and OBJ animation. This is written for plugin developers; if you don’t write plugins, you can tune out.
Plugin Drawing
Plugin drawing is documented in a few tech notes – I strongly recommend reading them all.
The basic idea is this: to draw in a plugin, you register a callback. X-Plane tells you “it’s time to draw” and you draw.
The first rule of drawing callbacks is: you only draw. You don’t do anything else.
The second rule of drawing callbacks is: you only draw. You don’t do anything else.
There are a number of good reasons for this.
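The shape of the rule can be sketched like this. This is a Python sketch with hypothetical names (the real API is the C-based XPLM SDK, and the real drawing is OpenGL): the draw callback is a pure consumer of state that was computed earlier, in the flight loop.

```python
class GaugePlugin:
    """Sketch: simulation state is computed in the flight loop;
    the draw callback only reads that state and draws."""

    def __init__(self):
        self.needle_angle = 0.0   # state the draw callback may read

    def flight_loop(self, dt):
        # All "systems modeling" happens here, once per frame.
        self.needle_angle = self.compute_needle_angle(dt)
        return -1.0  # hypothetical "call me again next frame"

    def compute_needle_angle(self, dt):
        # Toy model: needle sweeps 10 degrees per second.
        return (self.needle_angle + 10.0 * dt) % 360.0

    def draw_callback(self):
        # Rule one and rule two: only draw. Read cached state,
        # emit geometry, change nothing.
        draw_needle(self.needle_angle)

def draw_needle(angle):
    # Stand-in for real OpenGL drawing calls.
    pass
```

The key property: you can call `draw_callback` zero times or ten times per frame and the simulation state comes out the same.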
Surprise
Now here’s the surprising thing: your dataref callback is actually a drawing callback, sort of!
Object datarefs are called back during object drawing. Therefore they have all of the above problems that drawing callbacks have. Here is my recommendation:
If you create a custom dataref that is used by an OBJ for animation, have the dataref accessor return a cached value from a variable, and update that variable from the flight loop.
This avoids the risk of your “systems modeling” code being called more than once per frame, etc.
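In Python-flavored pseudocode (the real code is C against the XPLM dataref accessors; all names here are hypothetical), the recommendation looks like this:

```python
class EngineTempDataref:
    """Custom dataref backing an OBJ animation: the read accessor
    returns a cached value; the systems model runs once per flight loop."""

    def __init__(self):
        self.cached_temp = 0.0
        self.model_runs = 0

    def flight_loop(self, dt):
        # Runs exactly once per frame: this is where the cache is updated.
        self.model_runs += 1
        self.cached_temp = self.simulate_engine(dt)

    def dataref_read(self):
        # Called during object drawing, possibly many times per frame.
        # Just return the cache: no simulation, no side effects.
        return self.cached_temp

    def simulate_engine(self, dt):
        # Toy stand-in for the expensive systems model.
        return self.cached_temp + 5.0 * dt
```

However many times the OBJ engine reads the dataref while drawing, the systems model still runs exactly once per frame.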
When you update X-Plane, it scans about 10,000 files before it runs the update. This scan checks every single byte of every file. As a result, it’s a little bit slow – it takes a few minutes.
With the new installer, when you add and remove scenery, we do the same scan on the installed scenery. This scan is very slow – 78 GB is a lot of data!
When the next installer comes out (2.05, which will be cut after 920 goes final), the add-remove scenery function will do a quick scan to see what files exist. This is a lot faster – even with full scenery the scan takes only 10-20 seconds on a slower computer.
I can’t pull the same trick for updates – we need to detect any possible defect in a file during update to ensure that the install is correct!
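The difference between the two scans can be sketched like so (hypothetical code; the real installer’s manifest format and checksums are not public): a quick scan only checks which files exist, while a verifying scan reads every byte, so in-place corruption passes the first but fails the second.

```python
import hashlib
import os

def quick_scan(root, manifest):
    """Return manifest files that are missing on disk (fast: no reads)."""
    return [name for name in manifest
            if not os.path.exists(os.path.join(root, name))]

def verify_scan(root, manifest):
    """Return files missing or not matching their manifest checksum.
    Reads every byte of every file, so it is much slower."""
    bad = []
    for name, digest in manifest.items():
        path = os.path.join(root, name)
        if not os.path.exists(path):
            bad.append(name)
            continue
        with open(path, "rb") as f:
            if hashlib.sha256(f.read()).hexdigest() != digest:
                bad.append(name)
    return bad
```

Add–remove scenery only needs to know what is installed, so the quick scan is enough; an update must catch a file that exists but is damaged, which only the full scan can do.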
We get a number of questions about whether X-Plane takes advantage of 64-bit CPUs. The short answer is: “no, not yet”, but here are the details.
So the short answer is: you’re not going to get an fps boost from 64 bit. Sorry.
I’m back from vacation and trying to catch up on email and close out 920. I’ve received a number of emails regarding the 3-d cockpit, the big questions being:
The answer is unfortunately “not much for now” and “no”. Let me explain what’s going on with the 3-d cockpit and what we’re thinking for a long term strategy.
The 3-d cockpit lighting environment must work the same for the panel texture and object textures. This is necessary to keep the brightness of the finished cockpit consistent between the two texture sources. With the current 3-d system (e.g. what you can do in 864), the brightness of the panel texture and the rest of the object often don’t match.
If we don’t provide the spot lighting features for all panel textures then we have the problem of inconsistent lighting, which makes the feature fairly useless.
It’s that time of the year again…I’ll be out of the office next week, and more importantly, away from all cell and email contact.
Austin and Randy will be at EAA – stop by and say hi if you’re there. Once Austin gets back, the last few betas should resume – we’re close to an RC on 920 but not quite there.
I just found two bugs in the scenery system in beta 7 (both will be fixed in beta 8):
One of our users did the right thing: even though it wasn’t his scenery, he reported the problem and sent me the packages, rather than fixing the problem and going on.
My goal with scenery is to avoid regression bugs – that is, if it works in 900, it should work in 920. So I was glad to get these reports. I cringe when I read posts where people report “fixing” problems with content that are new to 920. Those are my bugs to fix!
My previous post on scenery tools stirred up some discussion; y-man brought up a point fundamental enough that I think it warrants explanation.
The question is whether to fix broken DSFs by editing the source data or by editing the DSFs themselves.
Let me be clear: both are viable options, both have limitations, and neither is possible today. So in choosing one over the other, I am picking the strategy that I think is superior, and discounting the other not because I think it is useless, but because it is less useful and development time is very limited (doing both would take away from other good features).
Basically the two ways to fix a DSF are:
I strongly believe we must pursue the second strategy for this simple reason: if we correct a DSF but don’t fix the source data, the same mistakes will be made in the future.
We have to keep recutting the global scenery to keep up with:
To lose any of these would be a big setback in scenery quality, so not recutting DSFs isn’t a great option. (Furthermore, improvements in scenery often come from new global data, so picking user changes over new data would be a tough choice.)
By letting users change the source data, we can have the best of both worlds: problems are fixed while new technology is adopted.
To answer y-man’s direct questions:
“But that data is not available to us users is it? If it is, where is it, and where is the spec to work with it?”
It is not available yet! I am proposing to focus on making the data available and creating the infrastructure to share data and receive improvements (choice 2) rather than providing a DSF mesh editor (choice 1).
“Alternatively, a user who goes thru the trouble of correcting base DSF could send in the modified DSF, or may be even a diff between before and after states of DSF text files, and attach an explanation of purpose.”
Here’s the problem: we can’t preserve the diff to the DSF and apply it to future renderings.
Imagine, for example, that you relocate 500 mesh vertices in an existing DSF to correct a coastline. (I am being generous here and assuming a clear, single, unifying edit; some authors would more likely move the vertices to make a number of separate improvements.)
In the meantime, another author creates an airport nearby, and someone else improves the SRTM data (there is at least one person credited in our about box who has been collecting void-filled SRTM tiles). The nearby airport’s pavement changing size and the raw elevation changing height greatly change how the mesh is generated, such that 400 of the 500 vertices that were moved simply no longer exist, and 450 new vertices now appear in the area that were not in the original DSF.
This case is known as a “merge conflict” in computer programming terms (and happens when two changes to a program are “merged”). The problem is that we can’t sanely know what to do with our 500 edited vertices in this case. Do we take the 100 vertices that still exist and move them without the others? What if that produces very strange results? (Triangles might become inside-out because some vertices are moved a lot but their neighbors are not because they did not match the diff.)
We could try to apply some kind of change to the new vertices similar to what happened to the old, but how do we know if this is making things better or worse? What if that change simply deforms the outline of the airport that was added?
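A toy sketch of the merge problem (entirely hypothetical data; real DSF meshes are not id-tagged dictionaries like this): a diff recorded against the old mesh is keyed to vertices that mostly no longer exist after a recut, so only a fraction of it can be applied at all.

```python
def apply_vertex_diff(mesh, diff):
    """Apply recorded vertex moves to a (possibly recut) mesh.

    mesh: {vertex_id: (lon, lat)} for the newly recut mesh.
    diff: {vertex_id: (dlon, dlat)} recorded against the OLD mesh.
    Returns the ids whose vertices vanished in the recut."""
    orphaned = []
    for vid, (dx, dy) in diff.items():
        if vid in mesh:
            x, y = mesh[vid]
            mesh[vid] = (x + dx, y + dy)
        else:
            orphaned.append(vid)   # this vertex no longer exists
    return orphaned
```

Moving only the surviving vertices while their vanished neighbors stay put is exactly how triangles end up inside-out; the diff carries no information about what the edit *meant*, only which points it touched.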
I can go on and on with these kinds of examples, but the point is this: you can’t unbake a cake. A DSF is similar; you can’t necessarily recover the sources that were processed together to form the final product.
This is why it’s important that we create infrastructure to correct source data, rather than focus on editing the final DSFs. I can assure you that my negative attitude toward editing DSFs is not a negative attitude toward user participation!
There are still a lot of details to be worked out about how we can work together. Who will own the data, and on what terms? How will it be distributed? How will it be shared?
Unfortunately I can’t answer those questions yet. I still need to do more research into what is technologically possible – then we can figure out how to proceed.