Hi, I'm starting a project I've always wanted to do: creating, in real time, a virtual cockpit that moves as the camera moves. Or I should say, one that looks attached to the plane even as the camera moves. It should be a very immersive experience.

It's not too difficult, as we have the camera's angles (from the pan-tilt servos). The cockpit will be rendered on a PC, which will mix it with the video from the FPV camera. The position of the cockpit over the video will change as the camera moves, so that the cockpit looks fixed relative to the plane.
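To make the idea concrete, here's a minimal sketch (in Python, with made-up scale values) of the core mapping: the camera's pan angle, read back from the servo, becomes a horizontal pixel offset for the cockpit overlay so it stays visually attached to the airframe.

```python
# Hypothetical sketch: convert the camera pan angle into a horizontal
# pixel offset for the cockpit overlay. px_per_deg is a placeholder
# that would come from calibration, not a real measured value.

def cockpit_offset_px(pan_deg, px_per_deg=8.0):
    """Camera panned left -> cockpit drawn further right, and vice versa."""
    return -pan_deg * px_per_deg

# e.g. camera panned 10 degrees left (pan_deg = -10):
print(cockpit_offset_px(-10.0))  # 80.0 (cockpit shifts 80 px to the right)
```

The sign flip is the whole trick: the overlay moves opposite to the camera, so it appears stationary relative to the plane.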

Virtual instruments such as an airspeed indicator, altimeter and variometer can then be included in this virtual cockpit with a good level of realism. As the cockpit is created by the PC, we can record the video with or without the added cockpit. It's also possible during the flight to make the cockpit a little transparent, or to hide/show it.

Legibility can be maximized, since it's not an OSD in the plane sent through a video link, but an image created and positioned by the PC on top of the video. So we have very good resolution and can use the full color range for photorealism.

Some effects can be added to increase the realism, such as glass reflections on the instruments or on the virtual canopy.

As we know the motor speed (from PropAway, www.propaway.com), it's also possible to reproduce a different motor sound, such as a real plane's engine. We'll know the speed, so we can also simulate wind sounds. (Of course we could simply use a microphone in the plane, but the sound of an RC motor is so different from a real plane's.)

A realistic HUD can be simulated, or different instruments, or a glass cockpit. Actually, by creating the virtual cockpit on the PC, there are few limitations to what can be done.

Any comments or ideas? I'm surprised it doesn't already exist, as (in my imagination at least) it looks like a lot of fun.

To create the HUD on the PC I guess you will need the plane's telemetry data, right? As long as ImmersionRC are willing to release the protocol to decode the EzOSD audio stream however, I guess this should be easily possible. Neat!

I use X-Plane and my Eagletree data equipment for exactly what you are proposing. The only problem is that processing lag will get you. An EXTREMELY fast machine with a really good professional-level display card is a must. I use nVidia 3D workstation cards. Additionally, it needs an external video mixer that is "green screen" capable.

X-Plane is perfect for this since it was designed from the outset to be hacked, input- and output-wise.

Good luck; there is a LOT of homework and hard work involved. Hope you have some time and deep pockets. I have been at it for over 7 years now and still have no satisfactory solution below cost levels that cause nose-bleeds just thinking about them. It's pretty easy to do in a lab setting, but making it portable gets insanely costly.


Have fun and be safe.

Rob

Thanks Rob for this info. I understand the complexity of the subject, and I wanted to do something intermediate, not too complex:

- we have the exact position and orientation of the camera, so I think it's less resource-intensive to use this information than to use augmented reality. In theory it should also work better in the presence of sun flares, etc. And I would like to show a complete 360° (or 180°) canopy, not just instruments in front of the pilot.

- phase 1: I'm not going to create real 3D, but rather something akin to 'sprites': according to the camera orientation, the dashboard will be displayed in 2D at the correct position. I don't plan to use a real flight simulator, but rather a computer-vision library (Emgu CV), with which I already have some experience. (This library includes more than needed, and can be used for augmented reality later if required. It can also correct the wide lens 'barrel' effect.) I'm confident that this phase will not be CPU-intensive, even if some real needles or numbers are displayed on the 'photo' dashboard.

- phase 2: depending on the result of phase 1, some improvements will be added, mainly transforming the 2D dashboard into a flat 3D dashboard. I mean it's still a flat dashboard, but a perspective transform is applied according to the camera position to blend it better into the video. If we need to accelerate, we can use only the pan angle and precalculate the dashboard for every possible angle. For example, with a 100 kB dashboard image precalculated for every half degree over 180°, that represents only 180 * 2 * 100 kB = 36 MB, which easily fits in memory.

- phase 3: based on the previous results, we'll know the time it takes to compute/display this dashboard (it should be less than 1/25 or 1/30 s); other 'panels' are then added following the same principle. Depending on the CPU, we'll see if it's possible to multiply the number of polygons to create curved surfaces, reflections, etc.

- phase 4: other effects and instruments are added, and we'll see if a 3D video card can be used. It may sound strange, but I'd rather not start by using a 3D card. I think it's better to start with something small that works, and then incrementally add features or improve them.
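The phase 1 'sprite' compositing can be sketched in a few lines; shown here in Python/NumPy rather than Emgu CV, with hypothetical names, it's just alpha-blending the dashboard photo onto the frame at the computed position:

```python
import numpy as np

def overlay_sprite(frame, sprite_bgr, sprite_alpha, x, y):
    """Alpha-blend a 2D cockpit sprite onto a video frame at (x, y).

    frame:        HxWx3 uint8 background (the FPV video frame)
    sprite_bgr:   hxwx3 uint8 cockpit image
    sprite_alpha: hxw float in [0, 1] (1 = opaque); scale it down
                  to make the whole cockpit semi-transparent.
    """
    h, w = sprite_bgr.shape[:2]
    roi = frame[y:y + h, x:x + w].astype(np.float32)
    a = sprite_alpha[..., None]  # broadcast alpha over the 3 color channels
    blended = a * sprite_bgr.astype(np.float32) + (1.0 - a) * roi
    frame[y:y + h, x:x + w] = blended.astype(np.uint8)
    return frame

# Tiny demo: a 2x2 opaque white sprite on a black 4x4 frame.
frame = np.zeros((4, 4, 3), np.uint8)
sprite = np.full((2, 2, 3), 255, np.uint8)
alpha = np.ones((2, 2), np.float32)
overlay_sprite(frame, sprite, alpha, 1, 1)
print(frame[1, 1])  # [255 255 255]
```

The per-pixel alpha is also what makes the "hide/show" and semi-transparent cockpit modes almost free: just scale the alpha plane.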
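The phase 2 precalculation budget can be sanity-checked in a few lines (the 100 kB per warped image is the post's assumption, not a measurement), along with the shape of a half-degree lookup cache:

```python
# Back-of-the-envelope check of the precalculation budget described in
# phase 2, plus quantizing a pan angle to the nearest precomputed step.

STEP_DEG = 0.5        # one precomputed dashboard image per half degree
PAN_RANGE_DEG = 180   # full pan range covered
IMAGE_KB = 100        # assumed size of one warped dashboard image

n_images = int(PAN_RANGE_DEG / STEP_DEG)   # 360 images
total_mb = n_images * IMAGE_KB / 1000.0    # 36.0 MB, matching the post

def cache_key(pan_deg):
    """Quantize a pan angle to the nearest precomputed half-degree step."""
    return round(pan_deg / STEP_DEG) * STEP_DEG

print(n_images, total_mb, cache_key(12.26))  # 360 36.0 12.5
```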

Well, this is only 50% of the difficulty: I plan to make my own vario, altimeter, airspeed indicator and telemetry system. I know it looks complicated, but I've already worked a lot on this and have the vario and airspeed sensor already tested.
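For the homemade instruments, the core math is standard: here is a sketch, assuming an ISA atmosphere and a pitot/static sensor pair, of converting pressure readings into altitude and airspeed (function names and constants are mine, not from the post):

```python
import math

def pressure_altitude_m(p_pa, p0_pa=101325.0):
    """Pressure altitude in meters from static pressure (ISA barometric model)."""
    return 44330.0 * (1.0 - (p_pa / p0_pa) ** 0.1903)

def airspeed_ms(q_pa, rho=1.225):
    """Indicated airspeed (m/s) from pitot dynamic pressure, sea-level density."""
    return math.sqrt(2.0 * q_pa / rho)

print(round(pressure_altitude_m(101325.0)))  # 0 at the sea-level reference
print(round(airspeed_ms(100.0), 1))          # 12.8 m/s for 100 Pa dynamic pressure
```

A vario is then just the altitude signal differentiated (and filtered) over time.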

I'm working to comply with FAA regulations as they will be in 12-24 months. I intend to keep flying and get paid for it.

X-Plane is already FAA certified so most of that worry is over.

With X-Plane, all it needs is X, Y, Z orientation and current position to do all the work: paint your environment, update your instruments and cockpit decor, and even correctly shadow the interior for the direction you are traveling, accurate to your geographic position, time of day and seasonal changes.

Yes, this is a BIG project. Looking forward to seeing it evolve!

Today I've been working a lot on the virtual cockpit project.
I have overlaid the cockpit on the video, for now from a 2D cockpit photo, and it moves almost as needed. I mean, today it just moves horizontally, without rotation or perspective, following the camera's orientation (if the camera looks left, the cockpit logically appears more and more on the right, staying with the plane).

The PropAway has been modified to send the camera's position (for testing purposes, it currently just pans slowly right and left in a continuous cycle of about 6 seconds).

Doing this takes 10-11 ms per frame, which is perfect, and no specific optimizations have been made. The time depends on many factors; here I display the original 640*480 from the camera at 1000*750, which is more comfortable.

Still a lot of work to do, but I was happy to see that computing a frame does not take too long. Of course we'll see when adding needles on the panel, the perspective transform, etc.

One detail to mention: the camera pan servo was a bit slower than the display of the cockpit, which looked bad. I added slight filtering on the position where I display the cockpit, to match the servo's reaction speed. For example, when given a signal to go from full left to full right, the cockpit would move there immediately, but the servo needed a fraction of a second to do so. Now, with the (digital) filter, they move at the same speed. A visible difference is that the video is slightly blurred when the camera moves, while the cockpit always looks crisp. Maybe I'll add a slight motion blur when moving.
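The filtering described above can be as simple as a first-order low-pass (exponential smoothing); a minimal sketch, with the smoothing factor as a placeholder to be tuned against the real servo's response:

```python
# First-order low-pass filter: the displayed cockpit position chases the
# commanded pan value at roughly the servo's speed instead of jumping
# instantly. alpha is a made-up placeholder to tune against the hardware.

class ServoLagFilter:
    def __init__(self, alpha=0.25, initial=0.0):
        self.alpha = alpha      # 0 < alpha <= 1; smaller = slower response
        self.value = initial

    def update(self, target):
        """Move the filtered value a fraction of the way toward the target."""
        self.value += self.alpha * (target - self.value)
        return self.value

# Commanded pan jumps from 0 to 100; the filtered value eases toward it,
# one step per video frame:
f = ServoLagFilter(alpha=0.5)
print([round(f.update(100.0), 1) for _ in range(4)])  # [50.0, 75.0, 87.5, 93.8]
```

Called once per frame, this converges exponentially toward the commanded position, which is a reasonable first approximation of how an analog servo slews.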

Another thing to manage is the lens distortion, as I'm using a wide-angle lens. It's not a big issue, as the image library I'm using (OpenCV) has functions to measure and correct it.
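For reference, the 'barrel' effect follows the radial model that OpenCV's calibration functions fit (cv2.calibrateCamera estimates the coefficients, cv2.undistort applies them). Here is that model for a single normalized point, with made-up coefficient values:

```python
# Radial distortion model (the first two terms of the model OpenCV fits).
# k1 and k2 are made-up placeholders; real values come from calibration.

def distort_point(x, y, k1=-0.2, k2=0.05):
    """Apply radial distortion to a normalized image point (x, y)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A point at the optical center is unaffected; with k1 < 0 (barrel
# distortion) points near the edge are pulled toward the center, which is
# exactly why the cockpit must move non-linearly as the camera pans.
print(distort_point(0.0, 0.0))  # (0.0, 0.0)
```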


I agree that X-Plane is an excellent product (I also have the 9 DVDs), but I prefer to do something a bit lighter, not FAA-compliant but more targeted at FPV use. For example, specific indicators for battery usage (remaining mAh, etc.) that would not exist on a real plane.

I'm sure you already know this, but the FAA-approved version of X-Plane requires special hardware (a certified computer). In other words the FAA has to approve the software and its installation and the computer (the entire package, software and hardware).

The consumer version of X-Plane is definitely not FAA-certified for anything.


I am aware of the certification requirements for official training, yes. It is way easier to make a silk purse from real silk than from some poor animal's ear, any day.

Starting with an already certified platform helps a lot when it comes time for acceptance.

Besides, the folks who wrote this know way more about most of it than I do anyway.

Here are several screenshots; I'd prefer to post a video in a few days, when I've added the vertical movement and rotation of the cockpit. (Remember, I've only been working on the virtual cockpit since yesterday!)

Positioning the cockpit correctly is not simple, because it doesn't move uniformly horizontally when the pan servo is used. Because of the lens distortion, the movement is relatively faster near the center and slower when looking toward the wings. Secondly, there is a small vertical movement when panning horizontally.

So I've put some visual markers where the cockpit should be on the plane, and reprogrammed PropAway to pan the camera very slowly over the whole horizontal range. At the same time, I click on the screen to indicate where the markers are. The software then records the position (x and y) and the current pan servo value. I export these values to Excel to do curve fitting (see photo) and get the formulas (cockpit x and y position as a function of the pan value).
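The Excel curve-fitting step could equally be done in code; here's a sketch with numpy.polyfit, using made-up calibration points in place of the real clicked marker positions:

```python
import numpy as np

# Fit a polynomial mapping the pan servo value to the cockpit's x position
# on screen. These sample points are invented for illustration; the real
# ones come from clicking the markers during the slow calibration sweep.

servo = np.array([1000, 1250, 1500, 1750, 2000], dtype=float)  # pulse width, us
x_pos = np.array([620.0, 480.0, 320.0, 160.0, 20.0])           # marker x, px

coeffs = np.polyfit(servo, x_pos, deg=2)   # quadratic is usually enough
x_of = np.poly1d(coeffs)                   # callable: servo value -> x position

print(round(float(x_of(1500))))  # 320, matching the sample point
```

The same fit, run separately on the y coordinates, gives the small vertical correction; a quadratic (or cubic, if the lens distortion is strong) usually captures the non-linearity well.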

In the future the markers could be detected automatically, removing the need for the Excel step. But it's a one-time calibration, and then the system can be used.

As mentioned, I don't plan to detect the markers during the flight; there is no need, as we know the camera's orientation exactly.

The cockpit photo has been modified slightly to extend it on the left and right sides; it's a first draft and should be improved in many ways.