No More Instrument Limit
X-Plane 902 has an instrument limit of 400 instruments for the 2-d panel and 391 instruments for the 3-d panel. 920 will remove these limits, allowing you to build very complex panels via generic instruments.
I spent part of the morning tracking down installer issues. Here are some things I have learned:
If you have Windows Vista with User Account Control (UAC) enabled, there’s a good chance that the installer may get caught on permissions problems. I don’t fully understand what’s going on, but I don’t think it’s a coincidence that it is US GraphSim customers who see this.
Basically those DVDs, which went to press a little bit early, default to installing X-Plane on the C drive, which is not a generally accessible area. UAC then makes some guesses and allows some but not all operations (or something equally weird) and the user gets stuck.
I am still investigating, but I think the fix is to install to your home folder.
Some users have bad DVDs. It happens, and I think it happens more now that we’re dual layer. There’s also a variance in the sensitivity of drives – some users send us back damaged disks and they look bad but we can still read them.
Tech support will tell you this, but if you can’t install, one test is to simply copy the entire DVD to your hard drive; if the copy fails mid-way with “I/O error” or some other message about damaged media, call up tech support and they’ll swap you a new disk.
We try to find duplication facilities that do high quality work, but bad disks do happen. Tech support should be able to make this right for you.
If you are developing an add-on for X-Plane, you should always check the log file (Log.txt) after running. (Remember, the sim must quit before the log file is completely written to disk.)
Posting errors to the log gives us a way to provide verbose feedback to authors when the sim hits a problem without making the user experience too horrible; minor errors are logged and major errors are mentioned to the user only once per package and logged.
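The reporting policy described above can be sketched roughly as follows. This is a hypothetical illustration of the behavior, not X-Plane’s actual code; the class and method names are mine. Minor errors go only to Log.txt, while major errors are logged and also shown to the user, but at most once per add-on package.

```python
class ErrorReporter:
    """Sketch of a log-first error policy: minor errors are silent,
    major errors alert the user only once per package."""

    def __init__(self, log_path="Log.txt"):
        self.log_path = log_path
        self.alerted_packages = set()  # packages that already showed a dialog

    def log(self, message):
        with open(self.log_path, "a") as f:
            f.write(message + "\n")

    def minor_error(self, package, message):
        # Minor errors are logged silently for the author to find.
        self.log(f"[{package}] {message}")

    def major_error(self, package, message):
        # Major errors are always logged, but the user sees a dialog
        # only the first time a given package misbehaves.
        self.log(f"[{package}] MAJOR: {message}")
        if package not in self.alerted_packages:
            self.alerted_packages.add(package)
            print(f"Problem in package '{package}': {message}")  # stand-in for a dialog
```

The point of the once-per-package throttle is exactly the trade-off described: authors get verbose feedback in the log, while users are not buried in repeated alerts.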
In a previous post I said that our tech support guys will trouble-shoot the most likely problems first (based on what we see in our entire user base) – they’re playing the odds.
I seem to be in a philosophical mood these days with my blog posts…thought for the day: the human mind easily goes from the specific to the general. Our brains are generalizing machines, pattern matchers finding the rule in the noise.
I am also deferring work on dataref-driven textures; we’ll get there eventually, and the infrastructure from the pager will make it easier. But dataref-driven textures really need to be available in a lot more places – it’s a bigger, more complex feature* and I can’t keep adding scope to 920.
I’ve been reading Fooled by Randomness (highly recommended – it will either change your life or you won’t finish it because Taleb’s writing style annoys you) – and it has me thinking about the nature of certainty in software development. Consider two approaches to uncertainty and how they are completely at odds with each other.
So playing the odds, the right thing to do when presented by a third party with weird behavior is to wait and see who else reports it; the frequency of reports gives us information about the likely resolution.
Since X-Plane 9 went final I’ve been going in about 5 different directions with the code, and as a result my posts have diverged pretty far from my charter within the company, namely scenery. Here’s some of what I’m looking at on the scenery front:
Texture Paging
Texture paging is a new feature where X-Plane can change the resolution of orthophotos while you fly. The big limitation of orthophoto-scenery right now is that X-Plane loads every orthophoto at the maximum user-defined resolution when the underlying DSF/ENV is loaded. This can cause X-Plane to run out of virtual address space and crash. With texture paging, only the nearby textures are loaded at high resolution.
Far away textures may be low res, but you’ll never notice because they are far away (and thus you’re seeing them over only a few pixels on screen anyway).
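The core idea can be sketched with a toy distance-to-resolution function. This is not X-Plane’s actual algorithm (the real paging spec isn’t published yet); it just illustrates the principle: every time the distance to a tile doubles past some near threshold, the texture can drop one resolution level (half the pixels per side) without the difference being visible on screen. The threshold and floor values are invented for the example.

```python
import math

def pick_resolution(full_res, distance, near_distance=1000.0, min_res=64):
    """Return the texture resolution (pixels per side) to load for a
    tile at `distance` meters, halving once per doubling of distance."""
    if distance <= near_distance:
        return full_res
    # Number of times the distance has doubled beyond the near threshold.
    levels_down = int(math.log2(distance / near_distance))
    res = full_res >> levels_down  # halve resolution per doubling
    return max(res, min_res)
```

For example, a 2048-pixel orthophoto tile 8 km away would load at 2048 >> 3 = 256 pixels per side, a 64x reduction in memory.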
The cost of this technique is that textures are being loaded over and over and over. But this cost is made less painful by two features:
This feature will require modification of scenery packs in that you’ll have to add some “paging” information to your .ter files; I will try to get a preliminary spec posted soon. Because you only have to modify text files, it should be possible to create “paging patches” to existing DSF orthophoto sceneries that are fairly small.
I do not know if paging will be available for .pol files. My guess is yes, but I reiterate that using .pol files for large areas of orthophotos is a bad idea! Make a new mesh!
Improved Multi-Core Performance
This is going to be an ongoing process through the entire v9 run and beyond, but basically with every change to our rendering engine (and we make some serious modifications with almost every patch to keep it modern and competitive) we try to move toward using the new hardware (meaning machines with 2-4 cores) more efficiently. Some of this will be in 920, although my guess is we’ll need at least one more patch after 920 to really see the improvement.
Tools
It’s going to be a little bit before I can put more time into the various scenery tools. My top priority is to keep up with user feedback and work on MeshTool. Hopefully we’ll also declare WED final soon. But for now, since I am working on cockpit and airplane modeling features, my next set of work will probably be for the airplane authors.
Shaders
I do want to put additional shader work into v9. I realize I am at the edge of provoking a bunch of rants in the comments section, so let me save you some time:
“Hey Ben, stop working on eye candy and create some more tools. I don’t want a shader feature that makes my 1.2 ghz G4 with a GeForce 2 MX run any slower. You should finish the tools so they do exactly what I want!”
Okay, I admit, that was totally unfair…there is a lot of truth in the complaints about shaders vs. tools.
So in planning what goes out when, I look for clumps of features and tools that can go together to make some sense: WED to use apt.dat 850, texture paging to go with MeshTool. It wouldn’t make sense to defer texture paging to make the next tool while MeshTool is waiting for engine enhancements.
* A DDS already has all of the smaller-version pre-compressed textures in the file. So loading a DDS at low res involves loading a small amount of data from disk to memory and sending it to the graphics card.
By comparison, a PNG file only contains the maximum size, so to load a PNG at low res, we load the largest size, shrink it on the fly, then apply DDS compression.
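The arithmetic behind this can be made concrete. The sketch below (illustrative only, not loader code) computes the compressed size of each mip level and the byte offset of a given level within the pixel data, assuming DXT compression: 4x4 texel blocks at 8 bytes per block for DXT1 (16 for DXT5). Because a DDS stores levels largest-first, loading a low-res level is just a seek past the bigger ones plus a small read.

```python
def mip_size(width, height, bytes_per_block=8):
    """Compressed byte size of one mip level (DXT1 by default)."""
    blocks_w = max(1, (width + 3) // 4)
    blocks_h = max(1, (height + 3) // 4)
    return blocks_w * blocks_h * bytes_per_block

def offset_of_mip(width, height, mip, bytes_per_block=8):
    """Byte offset of mip level `mip` past the DDS header: the sum of
    all larger levels, which the loader can seek over instead of reading."""
    offset = 0
    for level in range(mip):
        offset += mip_size(width >> level, height >> level, bytes_per_block)
    return offset
```

For a 1024x1024 DXT1 texture, level 0 is 512 KB; to load only the 256x256 level, a loader would seek past 512 KB + 128 KB of larger levels and read just 32 KB, with no decompression or rescaling at all.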
For those who posted comments, sorry it took so long to moderate them – for some reason my spam filter decided that notifications of comments are, well, spam, so I just found them now. I should have known people would have jumped into a Vista-bashing thread. 🙂
There is an X-Plane 9.02 beta 1 posted – like 901 we’ve been pretty quiet about this, but you can get it by enabling “get betas” and running the X-Plane updater. Please give it a try. Like 901 it is a small change for the purpose of localization, but it actually has an interesting feature pair.
This is part of some rework we did to provide better language support. So…you should be able to run X-Plane no matter what weird characters* are in your folder names, name your airplanes funny things, and see diacritical marks. 902 uses a font that provides all of the Latin and Greek/Cyrillic code pages.
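The class of bug this rework fixes is easy to demonstrate. The snippet below is an illustration in Python, not X-Plane code: a file name containing a non-ASCII character is stored on disk as UTF-8 bytes, but if the program reads it back assuming a single-byte code page (Latin-1 here), the name turns to mojibake and the file can no longer be matched or opened by name.

```python
# A hypothetical aircraft path with a diacritical mark in it.
name = "Flugzeuge/Zürich.acf"

raw = name.encode("utf-8")        # what actually sits on disk
wrong = raw.decode("latin-1")     # naive single-code-page reading
right = raw.decode("utf-8")       # Unicode-aware reading

assert right == name
assert wrong != name              # "Zürich" has become "ZÃ¼rich"
```

Handling paths and names as Unicode end to end, plus shipping a font covering the Latin and Greek/Cyrillic code pages, is what makes “weird characters” in folder and airplane names safe.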
Also, I have heard reports of improvements from updated video drivers.
As always, bugs in the X-Plane beta should go to our bug report form, on the X-Plane contacts page.
* You might accuse me of being American-centric for calling diacritical marks and Greek letters weird – but the truth is I am computer-centric…anything that is not in the original ASCII set is weird. 🙂
I’ll warn you in advance: this is going to start off topic and go way off topic. “Catching up” with the changes to Mac OS, Windows and Linux has me thinking about the nature of technology. I feel a little bit guilty about this post because it’s going to turn into a rant about Vista, and ranting about Vista is like shooting fish in a barrel. On the other hand, having used Vista, well, I have a lot of rant to give.
One of the most important things to understand about technology (and computers are no exception) is that changes in the scale of the technology change the very nature of the technology. That is, as you make computers faster and cheaper, at some point the sum of all of those small improvements changes the fundamental nature of the beast. We’ve seen this as the computer transformed from mainframe to desktop (which is really just a change in cost and size), finding an entirely new audience, and now again as the computer changes from what we know of now as a computer to cell phones, MP3 players, and other small, mobile devices.
“Commodification” is what happens when, as things get better, cheaper, faster, etc., consumers stop caring about the marginal improvement. Back in the days of Windows 95 and 386s, there were ways you could improve the operating system and hardware in substantial ways; a doubling in processor speed and a rewrite of the operating system got you protected memory, which meant less data loss.
A few years ago we reached the point where desktop hardware became commodified. For the average user, 1.8 vs 2.2 ghz makes no difference at all. It’s a question of how quickly your computer can wait for keystrokes and data from the internet. (Answer: even a lowly Celeron is light-years faster than the I/O devices it typically has to talk to. Even if you’re the last kid in your class at Harvard, you’re going to be bored discussing politics with a bunch of four-year-olds.) At that point things became very difficult for major vendors like IBM (sold out), HP and Compaq (merged), Gateway (bought out of its misery), etc. The price of a desktop plummeted from over $1000 to less than $400.
I believe we’ve reached the point where operating systems have become a commodity as well.
And this is why life is not so good for Microsoft. In a non-commodified market, you can charge a premium for incremental improvements over the competition. That’s a game Microsoft can play – they have a lot of capital to invest in changing their operating system, as long as they are rewarded with a lot of cash for doing it. (And normally they are – about six billion dollars for a major OS revision, I’m told.)
The problem is that operating systems are now a commodity. Simply put, users don’t need a new operating system. There are no big ticket features missing from OS X 10.4, Windows XP, or Linux 2.6. This makes Microsoft’s business model fundamentally vulnerable to Linux for the first time. If the name of the game is incremental refinement rather than big new features, that’s a game that Linux, with its army of distributed bug fixers and free source code, is going to win.
When I looked at Windows XP and Ubuntu 6.06 I was afraid that Linux wouldn’t gain traction in the desktop market. I blamed the adoption of X11, the KDE/GNOME schism, and the Linux community’s being made up of shell nerds for a desktop experience that was merely tolerable.
But look where we are now: Vista is a vehicle for bloat. Combine “we make money by shipping major features” and “there are no more major features to ship” and you get Vista…an attempt to change a lot of things when you should have left things alone.*
By comparison, Ubuntu pretty much just works – you put the live CD in your machine, it asks you some questions and installs…it knows about more hardware, has fewer bugs, more drivers, and a better user experience. In a commodified operating-system space, the only thing to do is try to avoid a bad user experience – if you can’t offer a really juicy carrot to users, try to avoid hitting them with a stick.
And it is in this environment that the Mac is actually gaining market share. Apple’s business model has always been at odds with the industry. Complete vertical integration meant higher costs, lack of market share, and out-of-date technology – back when having more for less meant something, that was a real weakness, and explains why the Mac never dominated in market share.
But what a difference a decade makes! Hardware is now commodified (and Apple is integrated at the system-building level, leveraging cheap third party parts like they always should have). Operating systems are commodified. But on the one frontier left, quality of user experience, Apple’s vertical integration gives it an immense advantage.
The question is: why does an operating system “just work”?
I don’t know what Microsoft’s future is, but it can’t be very good. At some point they are going to have to transition from a “major revision for cash” to an “incremental tuning” approach to operating systems. As long as they have market share, they still get the “Windows tax” – that is, their OEM pricing from major vendors on every new computer that is built. It’s going to be harder and harder to convince the entire world to make a major jump (see how well XP to Vista went). In this situation, they’d be better off with a more solid operating system. It’s unfortunate that they’re going to have to try to sustain market share with Vista.
Their best-case scenario is that they eventually get Vista back to an XP-quality experience, in which case all they’ve done is spend a huge amount of R&D money and pissed off a lot of customers to maintain the status quo.
* I have mixed opinions on Vista’s video-driver-model change. But that’s a different post.
Austin posted another State-of-the-Union yesterday, and he was good about only mentioning things that are fairly close to completion (e.g. the panel texture optimizations) as well as things that are general (e.g. threading optimizations). After a long beta, when 9.0 went final I sort of went crazy and started a whole series of new projects at the same time; here are some things that I have in progress:
That’s a bit much for the next patch, so it’s likely that only some of these things will actually make it into the next patch, and I’m not sure what you’ll see. A lot of sim work goes in as a series of small independent pieces; only the last parts of a feature are user-visible.
For example, the first part of the texture work was simply rearranging how the code was structured to make new things possible. Change to the user experience: none. The second part changed threaded handling of textures, which at least shows up as performance improvements. But both set the stage for new features later.
So even if a lot of the above doesn’t make it into the next major patch, a lot of ground work is going in, setting us up for features later.