
This is a simple ray tracing demo done entirely in a fragment shader over a fullscreen quad.

The idea is that, for each fragment, the shader shoots one ray from the eye, through the screen and into the scene; checks which objects it intersects, if any; and based on this calculates the color for the fragment. This is a good resource if you want to read more about ray tracing.

The materials are all based on procedural textures. I will assume you have some knowledge of Perlin noise and procedural textures; if not, you can check this famous presentation by Ken Perlin himself from 1999, where he explains the concept and presents some of its applications. I’ll just quote the presentation and say that noise is a controlled random primitive: a pseudo-random function where all the apparently random variations are the same size and roughly isotropic. By itself it doesn’t do much beyond creating a simple pattern; its power comes from combining noise at different frequencies and feeding it to other constructs to produce interesting variations. It can be used to generate textures, models, animations, etc.
The implementation of Perlin noise I used was developed by Stefan Gustavson.

Let’s dig into the code…

First let’s see the functions that combine noise at different frequencies, which serve as the basis for the different procedural textures used. These are explained in the presentation I mentioned above.
The function ‘cnoise’ (not listed here) is the basic perlin noise primitive.
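The classic way to combine noise at different frequencies is a fractal sum: each octave doubles the frequency and halves the amplitude. A minimal sketch, assuming a ‘cnoise’ primitive returning values in roughly [-1, 1] (the function name ‘sumNoise’, the octave count and the weights here are illustrative, not necessarily the demo’s exact values):

```glsl
// Fractal sum of noise octaves: each octave has double the frequency
// and half the amplitude of the previous one.
float sumNoise(vec3 p) {
    float sum = 0.0;
    float amplitude = 0.5;
    float frequency = 1.0;
    for (int i = 0; i < 4; i++) {
        sum += amplitude * cnoise(p * frequency);
        frequency *= 2.0;
        amplitude *= 0.5;
    }
    return sum;
}
```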

The function intersectWithScene just checks whether the ray hits the sphere, the plane, or nothing, and returns an object id based on this.
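The dispatch could look roughly like this (a sketch: the object ids, parameter layout and the intersection helpers are assumptions, not the demo’s actual code):

```glsl
// Try each primitive and return an id for the closest hit:
// 0 = nothing, 1 = sphere, 2 = plane. t receives the hit distance.
int intersectWithScene(vec3 rayOrigin, vec3 rayDir, out float t) {
    float tSphere, tPlane;
    bool hitSphere = intersectSphere(rayOrigin, rayDir, tSphere); // assumed helper
    bool hitPlane  = intersectPlane(rayOrigin, rayDir, tPlane);   // assumed helper
    if (hitSphere && (!hitPlane || tSphere < tPlane)) {
        t = tSphere;
        return 1;
    }
    if (hitPlane) {
        t = tPlane;
        return 2;
    }
    return 0;
}
```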
What’s more interesting are the functions that return the color for each surface. This is the function for the sky:

It has four layers: the clouds, the sky color, the sun and its rays. Let’s see each one in more detail.

The clouds are created with a fractal sum of 3D noise values. The input to the sumNoise function, which was listed above, is the point where the ray hit the sky (imagine a sky dome). This point is scaled down to ‘zoom in’ on the noise space, and then the values are animated in all three dimensions; otherwise we would get static clouds.
Then we multiply the result by a smoothstep that depends on the altitude of the point in the sky, to prevent the clouds from going below the horizon.
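A sketch of that layer, assuming sumNoise from above (the scale factors, animation speed and smoothstep edges are illustrative):

```glsl
// Clouds: fractal noise sampled at the sky-dome hit point, offset by
// time so the clouds drift, then faded out near the horizon.
vec3 cloudsColor(vec3 skyPoint, float time) {
    vec3 p = skyPoint * 0.001 + vec3(time * 0.02); // zoom in and animate
    float n = sumNoise(p);
    n *= smoothstep(0.0, 0.3, normalize(skyPoint).y); // no clouds below horizon
    return vec3(max(n, 0.0));
}
```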

The sky layer is just a gradient between two colors, the horizon color and the sky color.
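In GLSL a gradient like that is a single mix driven by the altitude of the view direction (the colors here are illustrative placeholders):

```glsl
// Sky gradient: blend from the horizon color at altitude 0
// to the sky color straight up.
vec3 skyGradient(vec3 dir) {
    vec3 horizonColor = vec3(0.85, 0.88, 0.95); // illustrative
    vec3 skyColor     = vec3(0.25, 0.45, 0.80); // illustrative
    return mix(horizonColor, skyColor, clamp(dir.y, 0.0, 1.0));
}
```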

The sun and rays layers are computed in the following way. We take the dot product of the normalized directions from the ray’s origin to the sun position and to the sky point we are shading. As you should know from your algebra lessons, this dot product is the cosine of the angle between these two vectors. We feed this value to the pow function and get a circular shape that is at full intensity at the sun position and falls off to zero as we move away from it. This falloff is controlled by the pow exponent: a small value for the rays results in a bigger shape, and a bigger value for the sun renders a smaller, more concentrated circle.
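That is, something along these lines (the exponents and colors are illustrative; only the dot-product-plus-pow construction comes from the text):

```glsl
// Sun and rays: cosine of the angle between the view direction and the
// sun direction, sharpened with pow. A low exponent gives the wide
// glow of the rays; a high exponent gives the small, sharp sun disc.
vec3 sunAndRays(vec3 rayDir, vec3 sunDir) {
    float cosAngle = max(dot(normalize(rayDir), normalize(sunDir)), 0.0);
    float rays = pow(cosAngle, 8.0);      // wide falloff
    float sun  = pow(cosAngle, 1500.0);   // concentrated circle
    return vec3(1.0, 0.9, 0.6) * rays + vec3(1.0) * sun;
}
```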

Finally we combine the four layers by adding the clouds to the other three layers scaled by an alpha value. This alpha is taken to be 1 – clouds.x, and its purpose is to remove some black spots from our sky, which would be really weird.
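The combination step itself is a one-liner; assuming the layers are held in vec3 variables named as below (the names are illustrative), it reduces to:

```glsl
// Clouds occlude the other layers, so sky, sun and rays are
// scaled by alpha = 1 - clouds.x before being added in.
float alpha = 1.0 - clouds.x;
vec3 color = clouds + alpha * (sky + sun + rays);
```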

Moving on, let’s see the function that outputs the plane color. Explained in the comments inside the code: