Lantern: a devlog

With this article, I'm starting a new adventure! I want to begin a new personal project revolving around SceneKit (3D scenes on iOS), Blender and visual effects. The initial idea is to create a small real-time 3D scene with a carefully crafted look and atmosphere, running on iOS.
I will try to post a log each week with details on my progress and my "discoveries", but other projects might disrupt this schedule; in particular, I'm planning an update for my fishtank tracking application, Aquarii.

General idea

I wanted to create a peaceful scene and to convey a mood through lighting, colours, and so on. I'm quite inspired here by meditation/zen apps [1], where you can interact with a relaxing scenery. I chose to display a lantern at night, outside in a windy place, maybe near the seashore. I don't want to be too precise yet; everything can evolve in the future.

First try, first scene

I started by playing a bit with SceneKit and Swift. To test the general layout, I placed a cube on a plane, lit by a spotlight.
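In Swift, such a test scene can be sketched in a few lines. This is only an illustration of the setup, not the project's actual code; all positions and sizes are arbitrary:

```swift
import SceneKit

// A cube sitting on a plane, lit by a spotlight aimed at it.
let scene = SCNScene()

let cube = SCNNode(geometry: SCNBox(width: 1, height: 1, length: 1, chamferRadius: 0))
cube.position = SCNVector3(0, 0.5, 0)        // rest the cube on the ground
scene.rootNode.addChildNode(cube)

let ground = SCNNode(geometry: SCNPlane(width: 10, height: 10))
ground.eulerAngles.x = -.pi / 2              // planes face +Z, so lay it flat
scene.rootNode.addChildNode(ground)

let spot = SCNNode()
spot.light = SCNLight()
spot.light?.type = .spot
spot.position = SCNVector3(0, 5, 5)
spot.constraints = [SCNLookAtConstraint(target: cube)]  // keep it aimed at the cube
scene.rootNode.addChildNode(spot)
```

The look-at constraint on the spotlight saves us from computing its orientation by hand.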

SceneKit represents a scene as a graph, with nodes linked to geometry, materials, lights and cameras. A few primitives are available, and you can load Collada files (.dae). You can also display stats about the rendering, inspect the OpenGL call stack and see the content of each buffer.
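Loading a Blender export and turning on the statistics overlay looks roughly like this (the asset name is an illustrative placeholder):

```swift
import SceneKit

// Load a Collada file bundled with the app; the filename is illustrative.
let scene = SCNScene(named: "lantern.dae")

// SCNView can show live rendering statistics (fps, draw calls, ...):
let sceneView = SCNView(frame: .zero)
sceneView.scene = scene
sceneView.showsStatistics = true
```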
An important thing to note is that SceneKit runs extremely slowly in the iOS simulator, where everything (including the rendering) is emulated on the CPU: even for a simple scene on a powerful computer, the framerate will suffer dramatically. On a real device, the same scene runs smoothly at 60 fps [2], and this really is the best way to analyze and debug scenes.

Camera moves

The camera interactions are simple; I wanted to allow the user to rotate around the object, with the camera always focused on the lantern. One could achieve this with the prebuilt look-at constraint. But to make the relation between the panning value and the camera rotation/position easier to compute, I've chosen to attach the camera to a node centered on the lantern, so that the camera orbits on a sphere around it, and to rotate this node in sync with the user's gesture.

Also, the user's moves are limited when zooming and rotating, to avoid going through the lantern or the ground. When a limit is reached, the camera is brought back to a valid position with a spring effect.
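A rough sketch of this orbit rig, with a simplified pan handler (names like `lanternNode` and all the constants are placeholders, not the project's actual code):

```swift
import SceneKit
import UIKit

// The camera is a child of an empty node centered on the lantern;
// panning rotates that node, so the camera moves on a sphere.
let orbitNode = SCNNode()
orbitNode.position = lanternNode.position    // pivot on the lantern

let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()
cameraNode.position = SCNVector3(0, 0, 5)    // orbit radius
orbitNode.addChildNode(cameraNode)
scene.rootNode.addChildNode(orbitNode)

// In the pan-gesture handler, map the finger translation to Euler angles,
// clamping the elevation so the camera never dips below the ground:
func handlePan(_ gesture: UIPanGestureRecognizer) {
    let t = gesture.translation(in: gesture.view)
    orbitNode.eulerAngles.y -= Float(t.x) * 0.01
    let elevation = orbitNode.eulerAngles.x - Float(t.y) * 0.01
    orbitNode.eulerAngles.x = max(-1.2, min(0.0, elevation))
    gesture.setTranslation(.zero, in: gesture.view)
}
```

For the spring effect, one way is to animate the node back to the clamped angles inside an `SCNTransaction` with an easing timing function when the gesture ends.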

The lantern, first model

The 3D model is quite basic for now; it was built using Blender. I had not used Blender since high school, so it was a complete rediscovery; I especially like the Edit mode capabilities when modeling. There are a few caveats when exporting a Collada file for SceneKit, though: the shading and normals have to be correctly set up before exporting, as SceneKit won't do any kind of normal reconstruction or interpolation.

I might add the possibility to choose from a set of lanterns, or to generate random lamps using a simple grammar with building blocks. This will be done later, as I first want to focus on having a clean and coherent mood for the scene.

Rocks and grass

To add a bit of context to the scene, I added rocks and a few blades of grass.
The rocks are basic polyhedra edited in Blender, whereas the grass is made of triangle-based shapes.

In order to animate the grass, one could use a physical simulation with wind, as SceneKit includes a physics engine, or import a rigged animation. But the simplest way is to modify the positions of the vertices in the vertex shader. SceneKit implements its own private rendering pipeline, but it fortunately allows us to inject code at four entry points in the shaders attached to any geometry:

geometry modifier: corresponds to the vertex shader

surface modifier: modifies the material applied to the object

lighting modifier: replaces the default lighting model used by SceneKit

fragment modifier: called in the fragment shader

For each of these modifiers, SceneKit provides a few useful values (position, normal, texture coordinates, transformation matrices, etc. [3]), and one can also set uniform variables using standard key-value coding in Swift/Objective-C.
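Setting such a uniform from Swift looks roughly like this; both the GLSL snippet and the `speed` uniform name are illustrative, and `material` is assumed to be an existing `SCNMaterial`:

```swift
import SceneKit

// Declare a custom uniform in the modifier source, then drive it from
// Swift with key-value coding. SceneKit matches the key to the uniform name.
let surfaceModifier = """
uniform float speed;
_surface.diffuse.rgb *= 0.5 + 0.5 * sin(u_time * speed);
"""
material.shaderModifiers = [.surface: surfaceModifier]
material.setValue(NSNumber(value: 2.0), forKey: "speed")
```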

Here, we want to move the vertices over time, depending on their height above the ground. We can compute a horizontal offset in GLSL and inject that code as a geometry modifier.
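A minimal sketch of such a modifier follows. The constants are illustrative and `grassMaterial` is assumed to be the grass's `SCNMaterial`; `u_time` is a uniform SceneKit provides to shader modifiers:

```swift
import SceneKit

// Geometry modifier (GLSL) injected on the grass material: each vertex is
// pushed sideways by a sine of time, with an amplitude growing with its
// height, so the blade tips sway more than the roots.
let windModifier = """
float amplitude = 0.1 * _geometry.position.y;
_geometry.position.x += amplitude * sin(2.0 * u_time + _geometry.position.x);
"""
grassMaterial.shaderModifiers = [.geometry: windModifier]
```

Adding the vertex's own x position to the phase desynchronizes the blades, so the field doesn't sway as one rigid block.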

Conclusion

And that's all for the first log! Since then I've had the time to add lighting animations and particle effects, so those are among the topics I'll cover in the next article, (hopefully) next week. Below is a small animated preview, enjoy!
And please let me know what you think of my project! [4]