Step 3 - apply a corrective (barrel) distortion to compensate for the distortion introduced by the Rift's large-FOV lenses

Steps 2 and 3 are here to adjust the rendering to the optics of the Rift.

In the Oculus SDK there is a C library (source and compiled) that can be used in existing projects. However, as we do not have access to the source code of SpaceEngine, this can only be implemented by SpaceEngineer. It does not *seem* very complicated and there are many working examples in the wild, but I gladly admit I am a noob regarding shaders!
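The corrective distortion of step 3 boils down to a radial polynomial applied per pixel. Here is a minimal sketch of that math in Python (the coefficients below are illustrative placeholders; the real values come from the Oculus SDK's HMD info, not from me):

```python
# Sketch of the radial "barrel" correction the Rift needs. The lenses add
# pincushion distortion, so the image is pre-distorted the opposite way.
# Coefficients k are ILLUSTRATIVE, not the actual DK1 values.

def barrel_distort(x, y, k=(1.0, 0.22, 0.24, 0.0)):
    """Map an undistorted viewport coordinate (centered on the lens axis)
    to its pre-distorted position, scaling by a polynomial in r^2."""
    r2 = x * x + y * y  # squared distance from the lens center
    scale = k[0] + k[1] * r2 + k[2] * r2 * r2 + k[3] * r2 * r2 * r2
    return x * scale, y * scale

# The center is unchanged; points near the edge are pushed outward.
print(barrel_distort(0.0, 0.0))
print(barrel_distort(0.5, 0.0))
```

In a real implementation this runs in a GLSL fragment shader over the rendered frame, once per eye, but the formula is the same.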

Still, I was able to accomplish the first step in Celestia with a simple CelX script, thanks to split views and Lua support in Celestia (you can hook into the main loop to add your own code, with some limitations of course). If I am not mistaken, there is no Lua hook in SpaceEngine? That would be a nice feature, even outside the Oculus scope.

As fullmetaljackass suggested, the first step could be done with a GLSL shader (like proj_fisheye in the shaders folder of SpaceEngine), but unfortunately I have to admit my skills are insufficient to work this out... (I don't know exactly what data is passed to the shader, so I am working in the dark, and again I am by no means a shader developer.)
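For anyone wanting to experiment, the first step (side-by-side stereo) is conceptually simple: one camera becomes two, offset by half the inter-pupillary distance, each rendered into half the window. A tiny Python sketch of the bookkeeping (function names here are hypothetical, not SpaceEngine or Oculus API):

```python
# Sketch of step 1 (side-by-side stereo): split the window into two
# viewports and offset each eye's camera along the camera's right vector.

IPD = 0.064  # typical human inter-pupillary distance, in meters

def eye_viewports(width, height):
    """Left and right half-screen viewports as (x, y, w, h) tuples."""
    half = width // 2
    return {"left": (0, 0, half, height), "right": (half, 0, half, height)}

def eye_offsets(ipd=IPD):
    """Per-eye camera translation along the right vector, in meters."""
    return {"left": -ipd / 2.0, "right": +ipd / 2.0}

vp = eye_viewports(1280, 800)  # the DK1 panel resolution
off = eye_offsets()
print(vp["left"], off["left"])
```

The actual rendering would then draw the scene twice per frame, once per viewport with the corresponding camera offset.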

The same goes for step 3, with the possibility of shader injection (post-processing by an external shader after SpaceEngine has finished rendering). There are some nifty third-party tools and drivers (like Vireio), but most of them are made for DirectX, not OpenGL, although I was able to find GLSL examples with Google. I also found a SE forum thread discussing external shaders.

Once a properly duplicated, distorted and corrected image is rendered, transferring it to the Rift is simply a right-click in Windows to extend/duplicate the screen onto another display device.

BUT, to be fully honest, adapting SpaceEngine to the Oculus Rift would be even trickier than those three steps. For one, ergonomics would be heavily impacted by wearing the headset: not seeing the keyboard, UI text difficult to read, implementing head movement as a substitute for the mouse left-click-and-drag, etc.

There is also a fair risk of nausea: right now one cannot change the camera direction during "goto" travel, so the user would have to keep their head perfectly still after hitting the Goto key (G). That seems impossible to me, given the eye candy of lightspeed travel in SpaceEngine...! There might actually be a technical reason for not being able to look around during "goto" travel; if not, it could be an opportunity for even more eye candy. Actually, any automatic movement similar to the Goto key (like the horizontal-level key, End) would foster simulator sickness (it's called kinetosis: it happens when your brain cannot match perceived visual movement with physical movement).

To make a very long story short: a first implementation of the Rift drivers in the core or -even better for testing by the community- through an external, properly coded shader (like proj_fisheye.glsl) would be a huge step, and from what I've seen in the comments there would be quite a few beta-testers around.

Please SpaceEngineer, open Pandora's box and make it happen...!

Hey! As you can see, I've added a PayPal donation button on the website's main page. And thanks to some amazing guys, within just a few hours I'm halfway to purchasing the Oculus Rift Dev Kit. Yes, this will be the first thing I spend your donations on.

Hey SpaceEngineer, please contact them before you buy the Oculus! If you show them your work, I think they will be more than happy to send you a free sample, or at least one at a reduced price, so that you can try it before you invest in it! Make a deal with them - both of you can benefit from this.

No, I am not sure. But you can try. If I were them, I would probably (definitely, but I am biased) want software like SE to be compatible with my product - it would be a great selling point. And vice versa. Conclusion: they can benefit from you, and you can benefit from them.

I am heading to the PayPal button as soon as I finish writing this, because it's a great way to say thank you for this beautiful piece of software, and because you will certainly fall for the potential of the Oculus in SpaceEngine once you receive the devkit!

Make sure you try Titans of Space, as suggested by others in this thread, and also Outerra, which now has an Oculus mode. No Google Earth mode yet, but that could change sooner or later.

Let me know if I can be of any help: I have given 50+ demos of the Oculus devkit to individuals, companies and research labs since receiving my own kit, and it gave me some insight into what works and what doesn't (from the user's point of view).

And like many other forum members, I would beta-test an Oculus mode for SE anytime.

And thanks to some amazing guys, within just a few hours I'm halfway to purchasing the Oculus Rift Dev Kit. Yes, this will be the first thing I spend your donations on.

Speaking for myself, it doesn't matter to me whether you spend the money on that, on your electricity bill or on a date as long as you don't intend to give up SE.

How do you envision stereo vision for SE? I mean, in a realistic mode, stereo will only be useful at the very surface of planets, but I imagine users will ask to experience 3D in space as well. In an adaptive mode, perhaps the distance between the eyes could change seamlessly from a few centimeters to parsecs depending on the distance to the nearest object, with some filtering.
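The adaptive mode above can be sketched in a few lines: scale the virtual eye separation with the distance to the nearest object, and low-pass filter it so the stereo baseline never jumps. All constants here are guesses for illustration, not anything from SE:

```python
# Adaptive stereo baseline: eye separation grows with the distance to the
# nearest object, clamped between human IPD and a galactic-scale maximum,
# then exponentially smoothed ("with some filtering"). Values are made up.

def target_separation(nearest_distance_m, ratio=0.01,
                      min_sep=0.064, max_sep=3.0e17):  # max ~10 parsecs
    """Eye separation proportional to the nearest object's distance."""
    return min(max(nearest_distance_m * ratio, min_sep), max_sep)

def smooth(current, target, alpha=0.1):
    """Exponential smoothing: move a fraction alpha toward the target
    each frame, so the baseline changes seamlessly."""
    return current + alpha * (target - current)

sep = 0.064  # start at human scale
for d in (10.0, 1.0e7, 1.0e7):  # walking, then jumping to planetary scales
    sep = smooth(sep, target_separation(d))
print(sep)
```

The `ratio` and `alpha` parameters would need tuning by experiment; too aggressive a ratio and everything looks like a toy model, too slow a filter and the world visibly "breathes" when you fly past an object.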

in a realistic mode, stereo will only be useful at the very surface of planets

Stereo could be useful with spacecraft, but the real strength of Rift with SE would be the head tracking. Imagine hovering in space over a planet, aurora shimmering beneath you, stars spread across the sky, and just looking around with your head as if you were actually there. It would be a dramatic improvement in immersion.

Absolutely. My three-monitor setup feels fairly immersive, but the Rift will be much better. I was just anticipating some of the expectations, since 3D movies and planetariums usually show the universe as seen through the eyes of a giant with asteroid-sized eyeballs.

VR was a big thing in the early 90's and was supposed to change everything, but the hype quickly died. I don't think the computing power at the time was good enough. Besides, the technology was just too expensive. That has changed, so it will be interesting to see what the Rift leads to.

To get the immersion to work well I think it will be important for SE to run at a steady non-tearing 60 Hz framerate.

I think, SpaceEngineer, that you could have an adjustable binocular gap (the distance between the eyes) that works a bit like the zoom mode in SE now: have a default (human) width, but if you hold a key and scroll the mouse wheel, you can grow yourself to galactic scales and observe the 3D relationships between large-scale objects. Then click the mouse wheel to bring it all back down to human scale. Imagine how mind-bending that would be... maybe keep a barf bag nearby.
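The key+scroll idea above is easy to prototype on paper: a multiplicative step per wheel tick covers an enormous range in a few dozen ticks. A quick sketch (all names and factors are invented for illustration):

```python
# Scroll-wheel binocular gap: each wheel tick up doubles the eye
# separation, each tick down halves it; a wheel click would reset
# to the default human gap. Factors here are made-up examples.

HUMAN_GAP_M = 0.064  # default (human) eye separation in meters

def scroll_gap(gap, ticks, factor=2.0):
    """Return the new gap after `ticks` wheel steps (negative = down)."""
    return gap * (factor ** ticks)

gap = HUMAN_GAP_M
gap = scroll_gap(gap, 60)  # 60 ticks up: a factor of 2**60
print(gap > 9.5e15)        # now wider than a light-year in meters
```

A multiplicative (exponential) step is the natural choice here: a linear step would take millions of ticks to go from centimeters to parsecs.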

I know I posted this XKCD on the previous page, but I think it's so damn relevant.

I'm loving this thread too: it is rather speculative, but it might help if one day SpaceEngineer makes an Oculus mode for SE.

Speaking of verticality and the vastness of space, I am not sure if it was intentional, but in Titans of Space the avatar blocks the view when you try to look below your feet, so you don't get too much of the feeling of being in an infinitely vast emptiness in all directions, and you keep a sense of verticality and a ground below your feet (your avatar is visibly seated in some kind of spaceship cockpit). I know from experience that you can easily fall when the view in the Oculus suggests that your orientation is changing when it is not (like with the roller coaster demo). SpaceEngine might actually be better in the Oculus when lying on the ground or a mattress...! Or in a comfortable chair you can lean back in.

Regarding sizes and distances: within the Oculus, Titans of Space gave me the impression of a sun the size of a house, an Earth the size of a car, and so on. These sizes are similar to what the author says he used, and he explains that the size comparison in the demo is not accurate. Further into the demo, stars bigger than the sun appeared to me as big as a six-story building (that's already quite big, as a suggestion to the brain!).

It works really well for me in Titans of Space because the user keeps a comparison of the sizes of things and the distances between them. As a rough estimate, I would say the demo gives the impression that the solar system is the size of a village (a few hundred meters across). For me, things looked visually big when I moved my head and the view of the celestial body in front of me changed very little (the object didn't move much in the view, it never disappeared, and it was difficult or impossible to see its limits). This is what one should experience when looking at Earth from low orbit (~400 km): it is so big that it can fill the natural ~180° human field of view.
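That low-orbit claim is easy to sanity-check with a bit of geometry: the angular diameter of a sphere of radius R seen from altitude h is 2·asin(R / (R + h)). A quick Python check:

```python
# How much of the view does Earth fill from ~400 km up (ISS altitude)?
# Angular diameter of a sphere of radius R seen from altitude h.
import math

def angular_diameter_deg(radius_km, altitude_km):
    return math.degrees(2.0 * math.asin(radius_km / (radius_km + altitude_km)))

earth = angular_diameter_deg(6371.0, 400.0)  # mean Earth radius, LEO altitude
print(round(earth))  # ~140 degrees: most of the natural field of view
```

So from 400 km up, Earth subtends roughly 140°, close to filling the whole ~180° human field of view, which matches the impression described above.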

But Titans of Space stays within our solar system. SpaceEngine takes us to many other places, outside our solar system, even outside our galaxy. That should really be something to experience in VR, like the XKCD cartoon.