Bacteria farming and Software design.

This is an article about my creative process behind Paleodictyon, a site-specific installation that I developed in collaboration with Yannick Jacquet and Thomas Vaquié. Since this was our first big project integrating Cinder in production from the early stages, and because I used it to create most of the visual content, this post is going to have a big emphasis on creative coding and software development. I’ll try not to get too technical, but still, you’ve been warned: this is a geeky post!

Inspiration.

After a few brainstorming sessions with the rest of the team, it became clear pretty quickly that we were going to work with themes inspired by the breathtaking differences in scale found in nature. Looking at the curves of the architecture, the idea of flow came immediately to mind, and we started looking into the sea and the crazy organisms that populate it. I did some research on deep-sea organisms and found a couple of articles about Paleodictyon nodosum, its incredible habitat and its supposed faculty for bacteria farming. Without getting too much into detail here, the similarities that we found between this organism and Shigeru Ban’s architecture seemed to be too much of a coincidence not to look into.

Our storyboard gave me some pretty clear leads on how to build the software. I had to come up with visual and technical solutions to make our ideas possible; ideas like the concept of multiple individuals creating, on a higher scale, a big complex organism, or the idea of a “skin”: a structure constantly rearranging itself in reaction to different stimuli, and so forth. From that point it was easy to see that I was going to play around with agents, particles and group behaviour. But since we’d all been experimenting a lot with these themes over the last decade, I really wanted to push this further. I had to find a way to make it more interesting for me and not just create the nth implementation of Craig Reynolds’s steering behaviours.

From storyboard to software.

Instead of hard-coding a particle system as I usually would, I decided that it was time for a more modular approach to designing particle animation, and I invested quite some time in trying to find the right solution in terms of usability and creative possibilities. I might have been wrong, but it was pretty clear to me at the time that in order to achieve this I would need a good user interface.

I quickly implemented a typical scene explorer, similar to the ones you find in most graphics software, and came up with some easy-to-use code for creating new objects that could be added to a scene. The hierarchies that a scene graph offers allowed me to design a number of objects and rearrange at will how those objects could influence each other. I quickly decided to limit myself to a small number of object types per scene: particle groups, particle behaviours and effectors felt like a good starting point to build what I wanted. It summarised quite well those ideas of internal/external worlds and stimuli that we had in our storyboard.

Complexification.

When it comes to programming physical processes, I’ve always been fascinated by how powerful combining different layers of complexity can be. Each layer brings its own set of rules and surprises, and combining them can sometimes result in something greater than the sum of the parts; that is where interesting and unexpected things happen.

Usually the first layer that I play with gives each particle different properties: sizes, masses or shapes. Press play and see what happens, what kind of interesting patterns or animations emerge when exploring those different parameters. Sometimes it really does feel like putting your finger into a Petri dish just to see what might happen.

Another layer that I wanted to add was the ability to create separate groups of particles and apply behaviours or constraints to a group rather than to every single individual. This layer may seem quite simple or obvious at first, but it allowed some really nice things to happen and helped a lot in creating complex interactions between particles. Make a small group act like a flock of fish and another one more like a fluid, and you already have some nice interactions.

That is where the design I chose (keeping particle properties, behaviours and constraints as separate concepts) came in really handy. Playing with particles and giving them different properties, sizes and shapes is always interesting, but the fun really starts when you can mix different groups of behaviours together.

Timeline animation.

Why bother with coded animation when you can do it with a timeline? This might seem trivial, but the level of complexity increased quite drastically when I added a time dimension to those two layers.

I knew that injecting animated data into a physical simulation can often lead to surprising results, but I was still amazed to see how changing those behaviours over time would create such unexpected reactions. Anyone who has played with Craig Reynolds’s steering behaviours knows how a small set of rules can create compelling animations, even when none of the rules’ parameters are animated. If you are that kind of person, then you can probably imagine how animating those parameters creates surprisingly organic and complex animations. This system helped me create the different reactions that we wanted for our living organism: skin contraction and dilatation, the structure’s construction, reorganisation and deconstruction, and other organic animations. This was already a big part of our idea of an organism reacting to external stimuli. Here are a couple of examples of that “ever-changing state” structure:

Node Graph and sound design.

There was quite a lot of ping-pong between our composer, Thomas Vaquié, and myself. More than ever, the music that he wrote influenced our approach to visual production. Beyond a highly collaborative way of working together, we wanted to give the music a literal role in the piece, making it one of the actual inputs in our (eco)system, like the very stimuli I was talking about previously. The music quickly became the main antagonist in our story, attracting and repelling those organisms and controlling their every move. It also helped a lot in structuring our narrative around the birth, life and death of this weird organism, and even led to an interesting new aspect: the balance between two other worlds, light and darkness.

This is one of the reasons why we needed such a strong symbiosis between music and visuals. Over the last few years I’ve been experimenting a lot with audio and particle systems as part of my ongoing collaboration with Murcof, and I really wanted to try something new in terms of creation and experimentation possibilities. This is where a node graph came in really handy, allowing me to visually route any part of the audio to any part of the visual/physical system. When you are used to redesigning the code every time you want the music to influence a part of the animation, a user interface like this one is definitely a huge time saver and gives much more space for experimentation.

Stimuli and working with motion designers.

From the start, it was pretty clear that Yannick was going to focus on the more graphical and geometric parts of the piece, and that I would take care of the procedural and organic parts. Instead of handing Yannick a four-week-old piece of software full of bugs, I decided to put my efforts into building bridges between my software and the tools that Yannick would be using. To fully develop our story, it was really important for us to make those two worlds meet, fight and live together, not only in terms of collaboration and compositing techniques but also conceptually.

I decided to look into computer vision to find simple ways to work with Yannick’s footage rather than the other way around. I built an OpenCV module that took care of analysing Yannick’s videos and extracting interesting data that I could use for my animations. This idea gave birth to a nice list of new effects: some would extract polygons from the videos to create collisions, others would use grayscale gradients to influence the strength of another effect, etc.

It ended up being a really powerful tool, allowing me to use those graphical animations to physically collide with particles, scaring them off or attracting them. After a few tweaks to the computer vision engine, the result was quite convincing. I could just click one button, import new videos, assign them to different effects and, voilà, meet interactive physical compositing!

This module was the last piece I added to the software. Because time is often the main constraint on this kind of project, especially when you are the lead and only developer, there’s always a moment when you have to stop building new toys and start playing with the ones you already have!

The software that I built for this project was made possible thanks to the huge efforts and energy of Andrew Bell, the Barbarians and the amazing Cinder community. A big thanks to the whole team for creating such a powerful framework! Cinder rules!

I used the Gwen GUI library to build the user interface. Gwen is a small library written by Garry Newman, and it is definitely worth having a look at! I started playing with it several months before this project and had to hack it quite a lot to make the timeline and node-graph widgets possible, but it is without a doubt a really nice piece of code!