We just opened our new GitHub repositories a few weeks ago, and little by little we have been filling them with new and helpful things. We expect to continue doing that!

The first one to join was Flare3D Labs. This repo includes all kinds of experimental code (deferred lighting, Adobe AIR Gamepad, SAO, etc.) that you can use as an inspirational starting point for your own creations.

We then added a new Flare3D Engine repo where you can find parts of the core engine, either to extend its functionality or as part of the documentation to better understand how things work internally. For example, all material filters and their FLSL shaders can be found there.

We are waiting for you on GitHub

If you want to be updated with the latest examples, please follow us on GitHub. If you like the content, please feed our ego by giving us a star on your preferred repo. And if you think you are ready to add some amazing stuff, we will be glad to receive your contributions!

Visit us on GitHub regularly; cool things will be published soon!

If you want to stay tuned for the latest news and announcements, follow us on Facebook and Twitter.

From the first time I saw the holodeck in Star Trek: The Next Generation, I've wanted to pursue virtual reality.

In the '90s, during the first VR hype, I was 12 or 13 years old when the local computer shop showcased Forte's huge VFX-1 HMD for PC. It really was not that good, and I certainly did not have the funds to buy something like that. But it always stayed in the back of my head.

When NVIDIA started experimenting with the first stereo drivers that worked with shutter glasses (before slow LCDs became the standard), I immediately went and bought them. Battlefield 1942 looked gorgeous in stereo 3D.

After graduating, I started working as a Flash developer at an awesome multimedia company called BUT (www.but.be). When the first 3D engines for Flash came out, before Stage3D came on the scene, I created some demos with stereo views in anaglyph 3D.

I was reading the MTBS forums (http://www.mtbs3d.com/) at the time Palmer Luckey was contemplating his first Rift prototype and took it to Kickstarter. I asked my boss for a developer kit and could not wait for it to arrive. After playing around with it for a few evenings, I wanted to create things myself, so I looked around to see whether somebody had already created a Native Extension for AIR. Jonathan Hart had just started his ANE, and a few other people contributed code. I set out to create the distortion shader (in AGAL) and spent many, many nights on it until the basics finally came together.

Palmer Luckey and an earlier Oculus Rift prototype

By then I had found out about Flare3D and its FLSL language. I asked for help on the forums, and just minutes later Ariel came up with a working distortion shader with chromatic aberration. I felt like all my AGAL studying had been for nothing, but I was sold on Flare3D. The support is just great!

Are you working on an Oculus Rift project?

While the implementation is already really usable, there is still work to be done; latency tracking is next on the to-do list. But the most important bits are done, including Timewarp.

I’ll soon be making a demo project so we can pitch ideas to clients. Personally I would really like to make a space shooter. Sitting in a cockpit looking into the void of space is just something that is made for VR.

Jonathan Hart just completed work on integrating the Oculus into the award-winning VR-based music video for Beck's "Sound and Vision", produced by Stopp Family and directed by Chris Milk, which won 2 Webby Awards last year. This piece also uses Flare3D, and in a very novel way. http://www.stopp.se/chris-milk-beck-hello-again/

Awesome interactive 360 music video for Beck's "Sound and Vision" also uses Flare3D

Beyond that, work is also already underway to support the Samsung Gear VR and to integrate Oculus' Android SDK as a platform target for the ANE. Exciting times are ahead!

What does the Oculus Rift DK2 offer compared with the previous version?

The DK1 had a 1280×800, 60 Hz LCD display. The DK2 has a 75 Hz full HD OLED display that is capable of showing a much crisper image. When Adobe AIR is delivering 75 fps and you turn on the low-persistence mode, the difference from the first developer kit is huge. Low persistence shows the rendered frame for only a fraction of the time it would normally be visible to the eye (at 75 Hz, each frame lasts 1000/75 ≈ 13.3 ms). The result is that the image does not smear like on the DK1 when looking around.

Besides the better visual quality, it also has an external camera that tracks the user's position, which makes a huge difference in not getting sick.

The DK2's only downside is that it has a lower FOV than the first developer kit. But it's important to remember that all of this is still for developers. The final consumer version's specs are in no way locked down, and I'm sure they'll be improving almost every aspect of the current developer kit for the first consumer version. For example, the new prototype Oculus showed last week has completely different optics, a higher refresh rate (90 Hz is hinted), and it looks like it has a higher resolution.

Is the example ready, or does it still need a few tweaks to be considered finished?

It needs tweaking. The perspective projection is calculated in Flare3D, while Oculus provides a projection matrix from the SDK. I just can't make sense of the one they are providing; it's something that needs looking at so the setup will automatically work for future versions.

Flare3D and Oculus Rift DK2

Adobe AIR does vsync, but in my test setup it locks to the refresh rate of the slowest monitor, which in many cases is 60 Hz. That might be an area where AIR could be improved: it should check which display it is running on and adjust accordingly. That way, when you go fullscreen on the Oculus, it starts running at 75 fps. The current workaround is to buy a decent high-refresh-rate main monitor, or to set the Rift as the only display, which is highly impractical. According to Adobe, there is no frame rate cap on AIR for desktop, which means the 90 Hz consumer version of the Oculus is already feasible: the world just needs hardware that refreshes at that frequency.
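Until AIR adjusts automatically, you can pick the Rift's display by hand from AIR's screen list. A minimal sketch of the idea, assuming the DK2 is the attached screen reporting full HD bounds (the 1920×1080 check and the function name are illustrative, not part of the demo):

```actionscript
import flash.display.Screen;
import flash.display.Stage;
import flash.display.StageDisplayState;
import flash.geom.Rectangle;

// Find a display whose resolution matches the DK2 (full HD),
// move the window there, go fullscreen, and request 75 fps.
function fullscreenOnRift( stage:Stage ):void
{
	for each ( var screen:Screen in Screen.screens )
	{
		var b:Rectangle = screen.bounds;
		if ( b.width == 1920 && b.height == 1080 )
		{
			// Move the window onto the Rift's display first...
			stage.nativeWindow.x = b.x;
			stage.nativeWindow.y = b.y;
			// ...then fullscreen it and match the panel's refresh rate.
			stage.displayState = StageDisplayState.FULL_SCREEN_INTERACTIVE;
			stage.frameRate = 75;
			return;
		}
	}
}
```

Matching on bounds is fragile if another full HD monitor is attached, which is exactly why automatic per-display detection inside AIR would be the nicer fix.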

What was your experience integrating Flare3D with the Oculus Rift?

Flare3D's API and the FLSL language are perfect for the job, but what is even better is the active community and the lively forum where you can go for help. The scriptable IDE is also great and allowed me to write an importer for the Oculus demo scene. I'm really looking forward to the next version. People jumping in and creating FXAA shaders that work together with the distortion shaders really gives you the energy to keep working on it.

Full example and Sources

The example, ANE, and source code are available on his GitHub repository here:

One of the latest features released with AIR is Adobe AIR Gamepad. Basically, if you have Adobe AIR installed on your Android phone or tablet, you can use it as a joystick in your game. The integration is really easy (check out Adobe's tutorial here) and opens up a lot of possibilities: you can add joystick functionality to your game, or use the device as a second screen for interactive applications.
To play the demo, follow these steps:
Important: Flash Player 14 (or higher) is required

Type the code that appears on your device into the demo and press "connect".


Once you get connected, your device will vibrate and its screen will show the touch areas for walking and looking around in the demo.

The demo

This demo uses our "Deferred Lighting" example. You will find the demo's source code at Flare3D Labs, in a new folder called "gamepad" that has been added into src.
The VirtualPad class receives a gamepad reference and handles the input in the same way. This class was originally made to detect traditional touch events, but with a few changes it was adapted to work with AIR Gamepad. For complementary information, you can check out the AIR Gamepad docs here.

Interpreting the input

AIR Gamepad recognizes multi-touch and gesture input and sends it to your application, so you can catch the input as if your app were running on the mobile device itself. Once the connection between the app and the device has been established, you can skin the device's screen by sending it an image. In our demo, we send an image that indicates the touch areas on the screen.
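Since the companion device's touches arrive as ordinary touch events, handling them looks much like plain AS3 multi-touch code. A hypothetical sketch of the idea (the left/right region split and the function names are illustrative, not the actual VirtualPad implementation):

```actionscript
import flash.display.Stage;
import flash.events.TouchEvent;
import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;

// Illustrative only: route raw touch points to two screen regions,
// left half = walk pad, right half = look pad.
function setupPads( stage:Stage ):void
{
	Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
	stage.addEventListener( TouchEvent.TOUCH_BEGIN, onTouch );
	stage.addEventListener( TouchEvent.TOUCH_MOVE,  onTouch );

	function onTouch( e:TouchEvent ):void
	{
		if ( e.stageX < stage.stageWidth * 0.5 )
			onWalk( e.stageX, e.stageY );   // movement input
		else
			onLook( e.stageX, e.stageY );   // camera input
	}
}

// Hypothetical handlers; in the demo this kind of input ends up
// driving the character controller and the camera.
function onWalk( x:Number, y:Number ):void { /* move the player */ }
function onLook( x:Number, y:Number ):void { /* rotate the view */ }
```

The image you send to skin the device's screen should match whatever regions your handler expects, so the player knows where to touch.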

The contest has come to an end, and we can say it was a great experience for us; we'll certainly repeat it in the future!
Although the contest's goal seemed easy to achieve, the reality was different: only three participants were able to finish their versions of BREAKOUT. It just wasn't right to choose two winners among three participants. Thus… we decided (only this time) to declare all three finalists winners!

Now it’s time to meet the winners:

Tim Budds

Tim did a great job! Basically, he took the example and turned it into a full-grown game!

Respecting the traditional gameplay, his "Breakout 3D" includes 8 levels, power-ups (some of them really funny), support for keyboard and mouse, and a mobile version for Android!

Gabriel Walter

Gabriel presents "Wonder Runner 3D", and we are still trying to figure out what kind of game it is! It's 100% surreal: you play a man running across an oneiric world searching for a golden ball… Yes, it's a crazy idea, but it has potential!

Wandah W.

Our dear Wandah is a usual suspect in the Flare3D world 😉, with proven past experience using Flare3D. Wandah spent 10 hours converting our breakout example into a mix between a football game and a tower defense! The game recreates the final match of the latest World Cup, between Argentina and Germany; it includes cute graphics, soccer sounds, splash screens, and menus, and it runs perfectly!

It is now time to think about the future! Keep in touch; we are cooking something special for the next mini Challenge!