Update to FG. I decided to use the noisy AO results for computing the FG points after all; it's not that bad when I also include the view-ray intersection points. The image below shows this update. The scene is from a Digital Tutors tutorial DVD.

Update to FG. The sample points are now based not only on view-ray intersections, but also on points laid out on the geometry (back-face culled). This produces nice results, as you get more points in complex areas, and the distribution increases as the rays form more grazing angles to the geometry. There are still some artifacts, though; these come from the points being uniformly placed, so patterns can be seen in the interpolation. Hopefully some randomness will sort this out. I also tried using ambient occlusion to create a buffer that can be used to distribute points. It worked great, but it's just very slow to compute. If I reduce the samples I get a noisier buffer, and if I reduce the resolution of the buffer other artifacts show up. I'm still going to play around with it some more and try to reach a good compromise.
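One common way to add that randomness while keeping even coverage is jittered (stratified) placement: one sample per grid cell, offset randomly within the cell. A minimal sketch, not the blog's actual code (the renderer places points on geometry too, which this omits):

```cpp
#include <random>
#include <vector>

struct Point2 { float x, y; };

// Jittered grid: one point per cell of an nx-by-ny grid over [0,1)^2,
// offset by a uniform random amount within its cell. This keeps the even
// coverage of a uniform layout while breaking up the repeating patterns
// that show through the irradiance interpolation.
std::vector<Point2> jitteredGrid(int nx, int ny, unsigned seed) {
    std::mt19937 rng(seed);
    std::uniform_real_distribution<float> jitter(0.0f, 1.0f);
    std::vector<Point2> pts;
    pts.reserve(static_cast<size_t>(nx) * ny);
    for (int j = 0; j < ny; ++j)
        for (int i = 0; i < nx; ++i)
            pts.push_back({ (i + jitter(rng)) / nx,
                            (j + jitter(rng)) / ny });
    return pts;
}
```

Each point stays inside its own cell, so the worst-case gap between samples is still bounded, unlike purely random placement.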

Some early renders using my Final Gather implementation. The first shot shows just the indirect illumination with no interpolation. The second shot uses interpolation over 3 points, and the third over 10 points. The final shot includes the direct illumination too. You can see the FG points I'm sampling in the top viewport (the green points), which are currently sampled uniformly across the viewing plane (using a ratio of 1/5 to pixels in the shots below). Unfortunately, even with many FG points and a lot of interpolation, the results still contain noticeable low-frequency noise. Hopefully this will be reduced by importance sampling the FG points instead of distributing them uniformly.
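The "interpolation of 3 points" / "10 points" idea can be sketched as blending the cached irradiance of the k nearest FG points, here with inverse-distance (Shepard) weights. The actual renderer may weight differently (e.g. also by normal agreement), and caches full RGB rather than the scalar used here for brevity:

```cpp
#include <algorithm>
#include <cmath>
#include <utility>
#include <vector>

struct FGPoint {
    float x, y, z;   // world-space position of the cached sample
    float irr;       // cached irradiance (scalar for brevity)
};

// Interpolate indirect illumination at a shading point from its k nearest
// FG points, weighting each by the inverse of its distance.
float interpolateIrradiance(const std::vector<FGPoint>& cache,
                            float px, float py, float pz, int k) {
    std::vector<std::pair<float, float>> d2irr;  // (distance^2, irradiance)
    d2irr.reserve(cache.size());
    for (const FGPoint& p : cache) {
        float dx = p.x - px, dy = p.y - py, dz = p.z - pz;
        d2irr.push_back({ dx * dx + dy * dy + dz * dz, p.irr });
    }
    if ((int)d2irr.size() > k)
        std::partial_sort(d2irr.begin(), d2irr.begin() + k, d2irr.end());
    else
        k = (int)d2irr.size();

    float wSum = 0.0f, sum = 0.0f;
    for (int i = 0; i < k; ++i) {
        float w = 1.0f / (std::sqrt(d2irr[i].first) + 1e-4f); // avoid /0
        wSum += w;
        sum  += w * d2irr[i].second;
    }
    return wSum > 0.0f ? sum / wSum : 0.0f;
}
```

A real cache would find the nearest points with a spatial structure (e.g. a kd-tree) rather than the linear scan above.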

Quick caustic test. This doesn't actually use any features beyond what I had when I first started; I just upped the indirect samples. You can also see the reflection of the light off the back ball being refracted through the closer ball and projected onto the right wall.

Added environment maps to be used as environment lights. Again, without enough samples the results are quite noisy. One way around this could be to use a blurred version of the map, such as a lower mipmap level; another is to importance sample the image, but this only seems to work well for low-frequency maps. However, it does produce more interesting and realistic lighting, especially with reflective materials.

I added functionality to bring in meshes using my 3ds Max exporter, to render more interesting objects. This also meant introducing an acceleration structure due to the number of triangles. I implemented both an octree and a kd-tree; both work well, but the kd-tree is more efficient and renders slightly faster, as its nodes take only 4 bytes each and live in a fixed array rather than a linked list, so they're a lot more cache friendly. I also implemented the traversal using a while loop instead of stack recursion, which provided a small improvement too.
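The post doesn't show the node layout, but one plausible way to pack a kd-tree node into 4 bytes is to share a single 32-bit word between the split axis and a child/triangle index, with children addressed as offsets into the flat node array. A hypothetical sketch along those lines (the split position itself would have to live elsewhere, e.g. in a parallel array):

```cpp
#include <cstdint>

// Hypothetical 4-byte kd-tree node. The low 2 bits hold the split axis
// (0, 1, or 2), with 3 marking a leaf. The high 30 bits hold either the
// index of the far child in the flat node array (inner node) or the index
// of the leaf's first triangle. The near child sits at the next array
// slot, so it needs no pointer at all.
struct KdNode {
    uint32_t bits;

    void initInner(uint32_t axis, uint32_t farChild) {
        bits = (farChild << 2) | axis;   // axis in 0..2
    }
    void initLeaf(uint32_t firstTri) {
        bits = (firstTri << 2) | 3u;     // 3 marks a leaf
    }
    bool     isLeaf()   const { return (bits & 3u) == 3u; }
    uint32_t axis()     const { return bits & 3u; }
    uint32_t farChild() const { return bits >> 2; }
    uint32_t firstTri() const { return bits >> 2; }
};
```

Because the nodes are contiguous and tiny, eight of them fit in a single 32-byte span, which is what makes the flat-array layout so much kinder to the cache than chasing linked-list pointers.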