Following one man's task of building a virtual world from the comfort of his pajamas. Discusses Procedural Terrain, Vegetation and Architecture generation. Also OpenCL, Voxels and Computer Graphics in general.

Monday, January 30, 2012

This introduction was also long overdue. For more than a year I had this program that allows me to work on a definition for the procedural world, then schedule it as a render job in the Voxel Farm and finally show the results on screen. Many of the screenshots you have seen in the past were taken from the viewer in this program, but the UI for it remained hidden. Until today.

I give you Voxel Studio:

This program has a series of editors for the different things that make up the procedural world. In the previous screenshot you can see the Terrain Layers editor, which allows you to define the multiple layers of the terrain. For instance, there is one layer for each type of rock. The placement of these layers is determined by masks, which are the output of an earlier process. It also covers Vegetation and Architecture, but I plan to show that later this year.

Something I like about this program is that it uses the Voxel Farm to create previews. Right now I have only three good machines in the farm, so a single scene like the one in the screenshot still takes close to a minute to render. If I had six machines I would wait only half that time.

The Renders view will show you a list of the jobs currently in execution inside the farm:

Only one render job is running at this moment, but there could be several at the same time. The green cube shows a world chunk that has just been received from the farm.

The material editor allows you to define the multiple layers that make up each material:

These layers follow the same model I introduced in an earlier post. They are rendered in real time, although there are no shadows or global illumination unless you perform a deeper render.

If you are only working on the materials there is no need to recompute the geometry of the scene, so this ends up being a pretty neat and fast workflow.

If you want to make sure you are working on the right layer of terrain or material, you can always ask the program to highlight it. The following screenshot shows the same scene as before, but the "Volcanic Talus" layer appears highlighted in red:

In the future I will post more about this program. If I get enough time I will capture a video so you can see a quick session.

In the poll at the top of this site there is an option that reads "Content creation application (like WorldMachine)". This option has received one third of the votes compared to the most popular option. This is a lot of votes once you consider it is a tool and not a game like other entries in the poll. Voxel Studio is that application.

Like anything else in this project, I'm not sure if it will ever be released, but I'm closer to having this finished than a game or anything else. It could be a nice procedural creation suite, including both the studio and the farm. If you see yourself using an application like this, I would like to hear about what features you'd like in it. Please leave your suggestions as comments and we will sort out what is feasible and what is not.

Thursday, January 19, 2012

This post is about something I have not discussed before, but somehow it has been present on every screenshot or video I have posted for a while now.

For more than a year I have been building a little farm of machines. They run a series of programs I wrote for the procedural generation. They work in parallel, sometimes doing the same task over different locations of the virtual world, sometimes running tasks very different from one another. Their efforts are highly coordinated. Thanks to this I can get large portions of terrain, forests and buildings generated in very little time. I can see the results of the changes I make to the world definition without having to wait too long.

Best of all, this setup allows me to throw more nodes into the network at any time. I only have three decent machines in the farm right now. They are old gaming rigs I found on Kijiji around Montreal. The minimum spec is 8 Gigs RAM, 3 or 4 cores and an ATI video card better than or equal to a 4770. I need them to have GPUs because some algorithms use OpenCL. I cannot afford to get too many of them right now, but having software that scales over multiple systems is already saving me time.

What I like the most is that it really feels like a single very powerful machine. The existence of the farm is completely transparent to the application I use to design virtual worlds. I will cover this application in a future post; you have already seen many screenshots taken from it without knowing.

I would like to introduce you to some different animals I keep in this farm and explain a little about what they do.

Dispatcher

The dispatcher controls everything that happens in the farm. There is a single instance of this process for the entire farm. At any time the dispatcher keeps track of the different jobs currently active. Each job may be at a different stage. The dispatcher knows which farm worker it should direct the next request to. All the coordination happens over TCP/IP. The dispatcher listens on two different ports. One is for the farm workers to report their progress and get new work assignments. The other one is so clients of the farm can request new jobs and also query the status of ongoing jobs.
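The report/assign cycle above can be sketched as follows. This is a minimal, in-memory stand-in for the real process: the names are hypothetical, there is no TCP/IP, and each stage of a job is treated as an independent task.

```python
import collections

class Dispatcher:
    """Toy sketch of the farm dispatcher: tracks active jobs and hands
    work to whichever worker reports in. (Hypothetical interface; the
    real dispatcher coordinates everything over TCP/IP.)"""

    def __init__(self):
        self.pending = collections.deque()  # tasks waiting for a worker
        self.active = {}                    # worker id -> task in progress

    def submit(self, job_id, stages):
        # A job is broken into stages (contour, decimate, ...).
        for stage in stages:
            self.pending.append((job_id, stage))

    def request_work(self, worker_id):
        # A worker reported in and asks for its next assignment.
        if not self.pending:
            return None
        task = self.pending.popleft()
        self.active[worker_id] = task
        return task

    def report_done(self, worker_id):
        # The worker finished; it becomes free for the next assignment.
        return self.active.pop(worker_id, None)
```

In the real farm the client port would feed `submit` and the worker port would drive `request_work` and `report_done`.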

Contour

Several layers make up the virtual world. Some are terrain layers, some vegetation, some are buildings and roads. All these layers have something in common: they represent a volume with an inside and an outside. Contouring is the process that finds the surface dividing the inside from the outside. The world is broken into many Octree cells. Each contour worker can process a cell individually. It knows which layers intersect the cell, so it runs an algorithm known as Dual Contouring on the contents of the cell. The result is a very detailed polygonal mesh.
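To give a feel for this stage, here is a minimal 2D analogue of Dual Contouring. The real worker operates on 3D Octree cells and solves for sharp features; this sketch just places one vertex per grid cell at the mean of the surface crossings along the cell's edges.

```python
def dual_contour_2d(f, n, lo, hi):
    # Sample a scalar field f(x, y) on an n x n grid over [lo, hi]^2 and
    # place one vertex in every cell the surface f = 0 passes through.
    h = (hi - lo) / n

    def point(i, j):
        return (lo + i * h, lo + j * h)

    verts = {}
    for i in range(n):
        for j in range(n):
            corners = [point(i, j), point(i + 1, j),
                       point(i + 1, j + 1), point(i, j + 1)]
            vals = [f(x, y) for x, y in corners]
            crossings = []
            for a in range(4):
                b = (a + 1) % 4
                if (vals[a] < 0) != (vals[b] < 0):
                    # Linear interpolation of the zero crossing on this edge.
                    t = vals[a] / (vals[a] - vals[b])
                    (ax, ay), (bx, by) = corners[a], corners[b]
                    crossings.append((ax + t * (bx - ax), ay + t * (by - ay)))
            if crossings:
                verts[(i, j)] = (sum(x for x, _ in crossings) / len(crossings),
                                 sum(y for _, y in crossings) / len(crossings))
    return verts
```

Running it on the field x² + y² − 1 yields a ring of vertices close to the unit circle, one per cell the circle crosses.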

Decimate

The meshes produced by the contour phase are very dense. If they were fed to the next processing stages they would slow them down. For this reason they go through a phase of decimation. This is a fast Multiple-Choice mesh optimization that preserves topology and only removes those triangles that make very little difference to the mesh. The resulting mesh is very close to the original, but the number of triangles is drastically reduced.
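The multiple-choice idea can be shown on a 2D polyline instead of a triangle mesh (an assumption I make here to keep the sketch short): rather than maintaining a global priority queue of collapse candidates, each step samples a handful of random candidates and removes the one that changes the shape the least.

```python
import random

def multiple_choice_decimate(points, target, k=8, seed=0):
    # Multiple-choice simplification on a polyline: sample k random
    # interior points and remove the one whose removal deviates least
    # from the original shape. Endpoints are never removed.
    rng = random.Random(seed)
    pts = list(points)

    def error(i):
        # Distance from pts[i] to the line through its two neighbours.
        (x0, y0), (x1, y1), (x2, y2) = pts[i - 1], pts[i], pts[i + 1]
        num = abs((x2 - x0) * (y1 - y0) - (y2 - y0) * (x1 - x0))
        den = ((x2 - x0) ** 2 + (y2 - y0) ** 2) ** 0.5
        return num / den if den else 0.0

    while len(pts) > target:
        candidates = rng.sample(range(1, len(pts) - 1), min(k, len(pts) - 2))
        del pts[min(candidates, key=error)]
    return pts
```

Points lying on straight runs have zero error and disappear first, so corners survive; the mesh version does the same with edge collapses and a geometric error metric.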

Reduce

I use a LOD system to replace several distant small cells with a single larger cell. Since they are Octree cells, this means combining eight child cells into one larger parent cell. Even though it covers eight times the space, the parent cell must be similar in byte size to a single child cell. This means the eight children must be brought together and compressed. The compression at this phase does change the mesh topology; otherwise it would be impossible to achieve the target sizes. The resulting parent cells are then combined again into an even larger parent cell, and so on, until the highest LOD cells are obtained.
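In spirit, the reduce cascade looks like this. It is a deliberately crude sketch: dropping every other item stands in for the real topology-changing compression, and the "triangles" are just placeholder values.

```python
def reduce_to_parent(children, budget):
    # Merge eight child cells and coarsen the result until the parent
    # fits roughly the same per-cell budget as a single child.
    merged = [tri for child in children for tri in child]
    while len(merged) > budget:
        merged = merged[::2]  # stand-in for real mesh compression
    return merged
```

Because every parent fits the same budget, parents can be merged again into grandparents with the same routine, all the way up to the highest LOD cells.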

Project

This process takes a high-resolution mesh from the decimate or reduce phases and creates a very simplified mesh out of it. Then it projects the excess geometry onto a normal map. The results are compressed as I described before and stored in a cell definition file. These are the files that are sent to the client for rendering. At this point the processing for a single cell is pretty much done.
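One small, standard piece of that projection step is how a sampled normal ends up in the map: each component is remapped from [−1, 1] to a byte. A sketch of just that remap (the sampling itself, finding the detailed surface under each texel of the simplified mesh, is omitted):

```python
def encode_normal(nx, ny, nz):
    # Remap a unit normal from [-1, 1] per component to an RGB texel.
    return tuple(round((c * 0.5 + 0.5) * 255) for c in (nx, ny, nz))
```

This is why flat areas of a normal map look uniformly light blue: the "straight up" normal (0, 0, 1) encodes to roughly (128, 128, 255).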

I have not covered here the generation of cities, architecture, forests and other elements. They blend into this sequence and also live in the farm, but I think they deserve a dedicated post.

Probably the most interesting aspect of writing a collective of programs like this was making it reliable. Since I was targeting unreliable hardware to begin with, I realized failure had to be an integral part of the design. I devised a system where none of these processes expects a proper shutdown. They could just vaporize at any point. Actually, I did not implement a way for them to exit gracefully. When one needs to close, the process is simply killed. The collective has to be resilient enough that no data corruption arises from such a failure.
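One common way to get that kind of crash tolerance when writing results to disk is the write-then-rename pattern. A sketch, under the assumption that cell results are stored as plain files:

```python
import os
import tempfile

def atomic_write(path, data):
    # Write to a temp file in the same directory, then rename it into
    # place. If the process is killed mid-write, readers see either the
    # old file or the new one, never a half-written mix.
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())
        os.replace(tmp, path)  # atomic rename
    except BaseException:
        os.unlink(tmp)
        raise
```

With every worker writing this way, killing any process at any point leaves only complete files behind, which is exactly the property a crash-only design needs.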