Following one man's task of building a virtual world from the comfort of his pajamas. Discusses Procedural Terrain, Vegetation and Architecture generation. Also OpenCL, Voxels and Computer Graphics in general.

Thursday, February 2, 2017

The very-far-away

The previous post described a new system that allows rendering rich surfaces, which we call "meta-materials", using low-resolution geometry. Meta-materials cover viewing distances from roughly 10 to 100 meters. What about anything more distant than that, that is, the range from tens of kilometers down to 100 meters?

It turns out the same system applies. You can think of these as "meta-meta-materials"; we just do not call them that because one "meta" in a name is already too much. We have multiple objects that fit this description. A terrain biome is one example.

In this post, you can see the results of applying this method to biome objects. All images are in faux solid color, which we use to make sure feature placement is correct.

Here is a single biome and the amount of geometry it takes to represent it:

In order to capture the detail, this biome also uses 1024x1024 texture maps for diffuse color, normals and other maps required for physically based rendering. Terrain voxels, which are generated on the fly, emit UV coordinate pairs which link the voxel's position in the world with the right section of these texture maps.

Here you can see multiple biomes in the same image, again in faux color, covering an area of approximately 3000 square kilometers:

Since most of the detail is carried by the textures, it is possible to use much coarser geometry. The following images show that we can crank up the mesh simplification and still obtain fairly good-looking features:

If you are a creator of worlds, this feature is entirely transparent to you: the detail textures are generated automatically. In fact, all the content in these images was produced by our procedural algorithms, but even if you had custom-made maps, you would not need to be concerned about creating and maintaining the detail textures.

As I said in the previous post, this is a technique frequently used in modern polygon-based terrain rendering. The key here is that it now works on voxel terrain. These environments can be modified in real time by players: they can harvest materials, dig trenches, even blow entire craters.

What about even larger scales? How well can you handle something like a Dyson sphere, zooming from a human-sized character all the way out? Or a realistic Saturn-style ring system? Applying things like the Worm Day example to planet-sized objects rather than mountain-sized ones?

For entirely procedural objects, the system can scale pretty much as far as you want. I will cover this in more detail in future posts, but the overall idea is that when things are far enough away, they are replaced by a new material that contains a "projected" version of the higher-frequency detail. You can do this swap many times over.
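The repeated swap described above can be sketched as picking a "meta level" from the viewing distance. The 100-meter base range comes from the post; the factor-of-ten step between successive levels is purely my assumption to make the example concrete:

```python
import math

# Hypothetical sketch of the repeated swap: every time the viewer is
# roughly 10x farther away, geometry is replaced by a coarser material
# that "projects" the higher-frequency detail. BASE_RANGE is from the
# post; STEP is an assumed ratio, not a documented engine constant.

BASE_RANGE = 100.0   # meters: closest distance covered by meta-materials
STEP = 10.0          # assumed ratio between successive meta levels

def meta_level(distance):
    """0 = full geometry, 1 = meta-material, 2 = meta-meta (biome), ..."""
    if distance < BASE_RANGE:
        return 0
    return 1 + int(math.log(distance / BASE_RANGE, STEP))

for d in (50, 500, 5000, 50000):
    print(d, meta_level(d))
```

Because each level only needs to bake the output of the level below it, the chain can keep going to planetary scales without any single level getting more expensive.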

When you have user-made structures, say you have harvested one quarter of Saturn's rings, it is trickier because we do not have a stage in the process to bake this information. But it is feasible anyway. We already compute higher LODs on the fly; it would be a matter of computing these other forms of LOD as well.

The Saturn's rings example reminds me of one case of LOD you haven't mentioned: how do you handle, or plan to handle, things that look like solid objects up close but like a partially transparent texture from far away? Saturn's rings, forest canopies, rows of pillars thin enough that the distance between them would be sub-pixel, and so on.

((I wish I could play around with this stuff more. I can't really justify buying a devkit, even one as reasonably priced as this, just for hobbyist messing around, with no actual game in mind and not working in the games industry at all.))