The short answer is: this is not a very good idea.
With OS X, this configuration is supported: OS X will cleverly copy graphics output from one video card to another to make the system work. You will take an fps hit when this happens.
With Vista, this configuration isn’t supported. (Snarky comment: it is lame that Microsoft completely rewrote their video driver infrastructure and went backward in terms of configuration support.)
With Linux, I have no idea if this configuration can run. I do know that trying to change my configuration hosed Ubuntu thoroughly and I decided not to break my Linux boxes any more, having spent plenty of time doing that already in the last few days.
For X-Plane, we can’t handle this case very well (at best you get the framerate hit) because we need to share textures between the IOS screen and main screen. So if you are trying to set up an IOS screen, you really do need a dual-headed graphics card. For what it’s worth, every card I’ve gotten in the last few years has had two video outputs.
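For the curious, here is a minimal sketch of what "sharing textures between the IOS screen and main screen" means at the OpenGL level: two windows whose contexts share one object space. This is not X-Plane's actual code, and GLFW is just a stand-in windowing toolkit, but it shows the mechanism: if the two windows end up on different GPUs, the driver has to mirror or copy every shared texture between cards, which is where the framerate hit comes from.

```cpp
// Illustrative sketch only (GLFW used as a generic example toolkit, not X-Plane's code):
// two windows whose OpenGL contexts share textures, the way an IOS window
// shares panel/scenery textures with the main window.
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit())
        return 1;

    // Main flight window, with its own OpenGL context.
    GLFWwindow* main_win = glfwCreateWindow(1024, 768, "Main", nullptr, nullptr);

    // IOS window: the 5th argument asks for a context that shares
    // textures, buffers, etc. with the main window's context.
    GLFWwindow* ios_win = glfwCreateWindow(1024, 768, "IOS", nullptr, main_win);

    if (!main_win || !ios_win) {
        glfwTerminate();
        return 1;
    }

    // ...render loop would go here. Any texture uploaded in one context is
    // visible by name in the other, so if each window is driven by a
    // different GPU, the driver must keep a copy on (or stream to) both.

    glfwTerminate();
    return 0;
}
```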
Ben, do you mean ‘two video cards from two vendors’ or just ‘two video cards’? What I’m asking is whether the problem is only when mixing cards from ATI and NVIDIA, or whether it applies to any two separate video cards regardless of vendor.
Cheers,
Two cards from two vendors hoses Linux and Windows.
Two cards, period, penalizes X-Plane…various driver implementations will work, but you’ll be paying some kind of performance penalty, depending on driver quality.
Two cards, same vendor, with SLI or Crossfire should theoretically not penalize you.
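If you want to see which vendor’s driver is actually servicing the context your app renders on (purely illustrative, not something X-Plane exposes), the standard OpenGL vendor/renderer strings tell you. This assumes a current GL context has already been created by whatever toolkit you use:

```cpp
// Print which driver/card is behind the current OpenGL context.
// Include paths differ per platform (e.g. OpenGL/gl.h on the Mac,
// windows.h before GL/gl.h on Windows); Linux shown here.
#include <cstdio>
#include <GL/gl.h>

void print_gl_driver_info() {
    // These strings come straight from the installed driver.
    printf("GL_VENDOR:   %s\n", (const char*) glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char*) glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char*) glGetString(GL_VERSION));
}
```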
With Xorg (Ubuntu) it should just work for 2D, but I’m not so sure about 3D. Mixing the ATI and NVIDIA proprietary drivers is probably a bad idea, but an NVIDIA card plus an ATI card that uses the open-source driver would probably work; a rough config sketch is below.
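To make the commenter’s point concrete, an xorg.conf fragment for the two-card case might look roughly like this. The driver names, BusIDs, and identifiers are placeholders, not a tested configuration; check your own bus IDs with lspci.

```
# Sketch only: two cards, two screens, one X server.
# Swapping "radeon" for the proprietary fglrx driver alongside "nvidia"
# is the "probably a bad idea" case mentioned above.

Section "Device"
    Identifier "Card0"
    Driver     "nvidia"        # proprietary NVIDIA driver
    BusID      "PCI:1:0:0"     # placeholder
EndSection

Section "Device"
    Identifier "Card1"
    Driver     "radeon"        # open-source ATI driver
    BusID      "PCI:2:0:0"     # placeholder
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Card0"
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "Card1"
EndSection

Section "ServerLayout"
    Identifier "TwoCards"
    Screen  0  "Screen0"
    Screen  1  "Screen1" RightOf "Screen0"
EndSection
```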