HSpark - a particle system compiler

Summary

HSpark (Haskell Spark) is a particle system compiler. It is intended
as a tool to encourage reuse of code and formulas between different
particle systems while providing aggressive space/time optimization.
It is also designed to be simple and efficient to integrate with a
larger system, such as a scene graph or a game engine.

The definitions of particles and emitters are written in a
special-purpose language hosted in Haskell, a general-purpose, purely
functional programming language well suited to embedded languages and
compilation tasks. The particle definitions are given in a declarative
way: one says what the behaviour is, not how it should be calculated.
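To illustrate what "declarative" means here, the following is a minimal
sketch of how a particle behaviour might be expressed as a function of
time rather than as a step-by-step update loop. This is hypothetical
code, not HSpark's actual API: the Vec3 and Behaviour types and the
position function are made up for the example.

```haskell
-- Hypothetical types for the sketch (not HSpark's real ones).
data Vec3 = Vec3 Double Double Double deriving (Eq, Show)

type Time = Double
type Behaviour a = Time -> a

-- Position given declaratively: initial position plus velocity scaled
-- by time, with gravity acting on the y component. The compiler, not
-- the user, decides how and where this gets evaluated each frame.
position :: Vec3 -> Vec3 -> Behaviour Vec3
position (Vec3 px py pz) (Vec3 vx vy vz) t =
  Vec3 (px + vx * t)
       (py + vy * t - 0.5 * 9.81 * t * t)
       (pz + vz * t)

main :: IO ()
main = print (position (Vec3 0 0 0) (Vec3 1 2 0) 1.0)
```

Because the behaviour is just a pure function of time, the compiler is
free to simplify it algebraically or move it to the GPU without
changing its meaning.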

From the user-provided behaviours for emitters and particles, HSpark
generates code in C++ and/or GPU-asm, ready for inclusion in your
application. The generated class will fall back to CPU processing
should the hardware lack sufficient capabilities, and will select the
optimal way of transferring data to the GPU. The generated code is
kept as lean as possible with respect to memory footprint and
calculation overhead.

HSpark is chiefly meant to aid in the creation of
particle systems while maintaining sufficient performance. It is of
course possible to edit the output (C++/GPU-asm) files and thus
hand-optimize that extra bit before final release.

HSpark originated as a project in the Advanced Functional Programming
course at Chalmers held by Koen Claessen, who is also the supervisor
for this thesis project. The idea behind creating something like
HSpark was born when it became clear that the particle systems in
Reaper had huge potential for reuse, but each system differed in ways
that were hard to express in a straightforward manner in C++. A
compile-time (template-based) linear algebra package could not do
enough, and writing a complete compile-time system, or even a separate
compiler, in C++ did not strike me as a very pleasant job. Overall,
the task seemed interesting and fun, and also very well suited to
Haskell's strengths.

HSpark is heavily inspired by several predecessors in animation and
compilation, such as Pan, Fran and Vertigo, all of which are, more or
less, the work of Conal Elliott.

Features (thesis project)

Static typing.

Local algebraic optimization with constant folding and if-floating.

Global optimization with common subexpression elimination.

Numerical ODE-solver, allowing (restricted) recursive expressions.

Renders using OpenGL and extensions.

Small library of emitters and behaviours.
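The local algebraic optimization mentioned above can be sketched with
a tiny constant folder. The Expr type and fold function below are
illustrative only, not HSpark's internal representation; they show the
kind of simplification (evaluating constant subtrees and applying
identities like x*1 = x) that such a pass performs.

```haskell
-- A toy expression type standing in for a compiler's intermediate form.
data Expr
  = Const Double
  | Var String
  | Add Expr Expr
  | Mul Expr Expr
  deriving (Eq, Show)

-- Fold children first, then simplify the node: combine constant
-- operands, and apply the identities x+0 = x, x*1 = x and x*0 = 0.
fold :: Expr -> Expr
fold (Add a b) = case (fold a, fold b) of
  (Const x, Const y) -> Const (x + y)
  (Const 0, e)       -> e
  (e, Const 0)       -> e
  (e1, e2)           -> Add e1 e2
fold (Mul a b) = case (fold a, fold b) of
  (Const x, Const y) -> Const (x * y)
  (Const 1, e)       -> e
  (e, Const 1)       -> e
  (Const 0, _)       -> Const 0
  (_, Const 0)       -> Const 0
  (e1, e2)           -> Mul e1 e2
fold e = e

main :: IO ()
main =
  -- (t * 1) + (2 * 3)  ==>  t + 6
  print (fold (Add (Mul (Var "t") (Const 1))
               (Mul (Const 2) (Const 3))))
  -- prints: Add (Var "t") (Const 6.0)
```

Running the whole expression tree through such a pass before code
generation shrinks the per-particle work the generated C++/GPU code
has to do each frame.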

Wishlist (future work)

Outputting a complete GLUT application.

Calculation on CPU or GPU, depending on hardware.

Optional bounding volume, allowing external occlusion culling.

Extensive emitter/behaviour library.

Bounding volume approximation.

Symbolic ODE solver, making more expressions state-less.

Rendering using DirectX and/or user-provided mechanisms.

Support for high-level shading languages, such as Cg or those of OpenGL 2.0 and DirectX 9.

More back-ends (C, Python and maybe C# and Java ...).

Back-ends supporting SIMD technology such as SSE or 3DNow!.

Test lab that allows on-the-fly modification and compilation.

Current status

The thesis work is completed and the result is quite pleasing, at least
with regard to the language. The compiler needs some more work to handle
all systems correctly. The language design was considerably harder than
expected, so there was not time to implement all intended (and sexy)
features, which means that the compiler and its output are not yet
useful in a real-world situation. Hopefully, there will be time to
turn HSpark into a Real Useful Tool (tm) some day.