Spotlight #20: Rebirth by Patryk Kizny

This time we are sharing an experimental project by DitoGear’s Patryk Kizny. “Rebirth” is a philosophical story about the human desire for transcendence and the Faustian dream of achieving immortality through creation, told at a symbolic level.

And what’s your latest project? Share your story in the next spotlight.

Natalia Brząkała: Could you tell us something about the background of this project? How long have you been working on “Rebirth”?

Patryk Kizny: It started back in 2010, when we shot the short film “The Chapel”, which from today’s perspective I consider a teaser and a starting point for “Rebirth”. We completed the film around September 2010, and its popularity peaked around February 2011. I knew I wanted to get back to that place and tell a deeper, richer story. What was missing for me in “The Chapel” was mostly human presence.

“The Chapel” attracted interest from the international heritage documentation industry, and I presented it at a range of events. That’s how I discovered laser scanning. Once I saw the technology, I knew I wanted to make use of it for the film industry.

In late November 2011 we were back at the church doing the laser scanning, thanks to the involvement of Jan Kanngießer and Mathias Ganspöck of EKG Baukultur. At that point I did not yet have a precise vision of the film; it crystallized over the next half year.

NB: What is the story about?

PK: The main character, the Architect, dreams of achieving immortality through the creation of an extraordinary temple filled with light and glory. We follow the birth and materialisation of the idea through the use of laser scanning data and pointcloud visuals. But immortality comes at a price, and success becomes a curse. Witnessing his firstborn’s glory quickly turning to decay and agony leads the Architect to pose deep philosophical questions about mankind’s responsibility for the future before History. But is it a real death? The cycle of life closes its passage with the Architect’s ideas and soul reborn as a beautiful, mysterious dancer.

NB: How did you feel about the Protestant temple in Zeliszow when you saw it for the first time? Did your perception of the place change while working on this project?

PK: Well, Robert and I were amazed as soon as we entered the place. Did my perception change? No, I don’t think so. I have felt a very strong connection with that place since the very first moments. It sparked the excitement for shooting “The Chapel” and brought the determination to finish “Rebirth”. Today, as the church is being renovated, I think I’ll miss the decaying roof a bit.

NB: What makes the project unique?

PK: Well, first of all – the temple itself. It opens doors and astounds people. Second – the experimental formal approach: not many short films make use of such a wide range of technologies and aesthetics. Third – the innovative use of laser scanning data and visualization; nothing comparable has been done before (there were two prior uses of LiDAR data in total, both in TVCs produced by The Mill, but they embraced the rough look of pointcloud data as it comes out of the scanner and processing software).

And last, but not least – both films have proven that through artistic vision and execution, art can impact the reality around it. The films built social awareness of the forgotten temple and, long story short, contributed significantly to its preservation and restoration. Before “The Chapel”, nobody knew about the temple. Today it is being renovated by an independent organization that took it over from the local authorities. That is something no film festival prize or audience size can compare to. There is no greater reward for a filmmaker.

NB: Please tell us about the people involved in “Rebirth”. How did you manage to draw their attention?

PK: The project would not have been possible without the involvement of the entire team. There was no budget, so the team consisted mostly of volunteers. I think it was the strength of the idea, the magic of the place and my genuine intentions that made it possible.

NB: Which parts of the production were the most challenging for you?

PK: The live-action shoot was the toughest shoot of my career – that’s for sure. The reality of guerilla shooting at its worst, and a lot of bad luck. We had to deal with extreme weather conditions, electricity issues and a crusade of local authorities kicking us out of the place – enough to ruin an 8-hour shooting day (November in Poland is not generous with daylight). Without getting too much into the details, I am glad we brought back any footage at all.

The other thing was post-production – particularly challenging was developing the workflows for cinematic LiDAR data visualization.

NB: Was the project more technically or artistically demanding for you?

PK: That’s a good question. I think both. There was a tremendous amount of work on the LiDAR data visualization and VFX, but also bringing so many different techniques together in a way that is justified and makes sense for the story was quite a challenge.

NB: Please tell us about the different techniques used in this project.

PK: Of course there are the HDR motion-controlled timelapses. Some of them are reused and reprocessed shots from “The Chapel”, but there are a lot of new shots too. That’s complemented with live-action footage.

Then there are the LiDAR data visualizations, and finally motion capture, photogrammetry and particle simulations. Quite a lot of stuff for a 10-minute film.

NB: Could you describe the workflow of the most sophisticated technique used in Rebirth?

PK: Let’s have a look at the LiDAR data visualizations and the whole VFX part.

LiDAR (laser scanning) is an engineering technology, and the resulting pointclouds are not visually pleasing. One of my goals was to transform them into art and blend them seamlessly with the cinematic language. That poses lots of challenges.

The software for processing LiDAR data does not have visualization tools tailored to the VFX industry. The resulting images are very rough. I needed to learn how to clean up and process the entire dataset – roughly 500 million 3D points – and find the tools and workflows to visualize it. After long research and many trials, often with alpha versions of unreleased software from the labs, I finally settled on Krakatoa, the Thinkbox Software particle renderer. It’s damn fast and allows for node-based and scripted manipulation of the datasets.

The process involved preparing the pointclouds in the engineering software, then custom scripting to process the ASCII files and remove incompatible data lines. The data was then brought into Autodesk 3ds Max and Krakatoa for visualization.
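As a rough illustration of that cleanup step – assuming a hypothetical seven-field `X Y Z intensity R G B` ASCII export, not necessarily the format used in production – a filter script might look like this:

```python
# Hypothetical cleanup pass for an ASCII pointcloud export.
# Assumes a 7-field "X Y Z intensity R G B" layout; the real
# production format and tooling may have differed.
EXPECTED_FIELDS = 7

def clean_ascii_pointcloud(src_path, dst_path):
    """Copy only the lines that parse as a full numeric record."""
    kept = dropped = 0
    with open(src_path) as src, open(dst_path, "w") as dst:
        for line in src:
            fields = line.split()
            try:
                if len(fields) != EXPECTED_FIELDS:
                    raise ValueError
                [float(f) for f in fields]  # every field must be numeric
            except ValueError:
                dropped += 1
                continue
            dst.write(line)
            kept += 1
    return kept, dropped
```

Running such a pass per scan keeps the downstream importer from choking on scanner headers or truncated records.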

One of the challenges was script-based color grading of the pointcloud data – as crazy as it sounds. Scanning the temple took about 16 hours – from early morning to late night – and included about 20 different scan positions. The pointcloud color is based on the superimposed image coming from the scanner’s visual camera. Now, the early scans did have visual captures along with them, but for the afternoon-to-night scans we only had point intensity, because it was far too dark for visual data. I took the color data from a few of the first scans and then projected and interpolated it onto all the other scans. Finally, there was relighting and color fine-tuning to make it look good – all of which required a node-based and script-based approach and a lot of maths. Forget about tools like DaVinci Resolve.
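The projection step can be sketched, very loosely, as a nearest-neighbor color transfer: each intensity-only point borrows RGB from the closest colored point, modulated by its own intensity. This is only to show the idea – the production pipeline used node- and script-based tools, and a brute-force search like the one below would never scale to 500M points (a KD-tree or spatial hash would be needed):

```python
def nearest_color(point, colored_points):
    """Borrow RGB from the nearest colored point (brute force;
    illustration only, not the production approach)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    nearest = min(colored_points, key=lambda p: dist2(point, p[:3]))
    return nearest[3:]  # (r, g, b)

def colorize_scan(intensity_points, colored_points):
    """intensity_points: [(x, y, z, intensity)] from the dark scans;
    colored_points: [(x, y, z, r, g, b)] from the daylight scans.
    Returns points with projected color scaled by local intensity."""
    out = []
    for x, y, z, i in intensity_points:
        r, g, b = nearest_color((x, y, z), colored_points)
        out.append((x, y, z, r * i, g * i, b * i))
    return out
```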

Now, even 500M points fall short when it comes to nicely filling an HD frame. You still see single-pixel points, which do not look good and won’t survive video compression. The salvation was Krakatoa’s depth of field, which can take a single out-of-focus point and replace it with a semi-transparent disk. The result is far smoother, but it is expensive. Do the maths yourself – take a single point from the original 500M-point dataset and draw each of them 20M times, then multiply that by thousands of frames. That results in an insane amount of points to draw – even for a renderer as good as Krakatoa running on a network of several high-end workstations at our studio.
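The cost he alludes to is easy to put into back-of-envelope form. The disk size for each point follows from the thin-lens circle of confusion, and the total work is just points × disk area × frames. All figures below are illustrative assumptions, not production numbers:

```python
def circle_of_confusion(aperture_d, focal_len, focus_dist, point_dist):
    """Thin-lens circle-of-confusion diameter (same units as inputs)
    for a point at point_dist when the lens focuses at focus_dist."""
    return (aperture_d
            * abs(point_dist - focus_dist) / point_dist
            * focal_len / (focus_dist - focal_len))

def dof_render_cost(num_points, avg_disk_pixels, num_frames):
    """Back-of-envelope pixel-draw count when every point is splatted
    as a semi-transparent disk instead of a single pixel."""
    return num_points * avg_disk_pixels * num_frames

# Illustrative only: 500M points, ~40-pixel disks, a 750-frame shot
# (30 s at 25 fps) comes to 1.5e13 pixel draws.
draws = dof_render_cost(500_000_000, 40, 750)
```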

If visualizing pointclouds was a challenge, I wanted to take it further, so let’s have a look at the collapse scene. It’s a 30-second, single-shot, all-CGI scene in which the temple collapses and the angels and the dancer appear out of the smoke and ashes. The workflow involved applying physics simulations to every single point making up the dataset and processing it all the way through. Then add smoke simulations, photogrammetrically reconstructed angels and a motion-captured dancer, everything brought together within the established pointcloud/particle aesthetics. All together, it took about a month just for the computation and rendering, and the particle caches total up to 4TB of data.
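The per-point simulation idea can be sketched as a naive explicit-Euler step applied over the whole cloud. The actual scene used proper rigid-body and smoke solvers; gravity-only free fall here is just a stand-in to show what "physics on every point" means:

```python
def step_points(points, velocities, gravity=-9.81, dt=1 / 25.0):
    """One explicit-Euler integration step for every point in the cloud.
    points: [(x, y, z)], velocities: [(vx, vy, vz)]; dt matches a
    25 fps frame. Gravity-only stand-in for the real solvers."""
    new_pts, new_vels = [], []
    for (x, y, z), (vx, vy, vz) in zip(points, velocities):
        vz += gravity * dt  # accelerate downward
        new_pts.append((x + vx * dt, y + vy * dt, z + vz * dt))
        new_vels.append((vx, vy, vz))
    return new_pts, new_vels
```

At 500M points per frame and hundreds of frames, even a step this trivial explains why the caches ran to terabytes.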

NB: Were there any failures?

PK: Mostly the live-action shoot. It was hard, and I am not very pleased with the outcome. I did my best to make up for it in color grading but, anyway, there are limits to doing cinematography in post. What’s shot badly is shot badly. Period.

NB: What role did the DitoGear equipment play in your production? Please share your experience.

PK: Well, “The Chapel” was the first serious battle test for the OmniSliders, and a huge thing in making them popular. That holds all the way through “Rebirth” too – the film would not have been possible without them. But for “Rebirth” we also used the never-released DitoGear™ HexaCrane prototype – a crane with a 6 m arm suited for both timelapse and live action. We used it for shooting the live action this time, and I believe it was a great addition to the shoot.

Interested in Evolution Motion Control Kit?

Check DitoGear™ Evolution Motion Control Kits

NB: Did you manage to express your vision through the final result of Rebirth? Are there any things you would do differently today?

PK: For sure, I’d love to have had better conditions, better equipment and more time to do the live-action part at a quality that corresponds better to my skills and expectations. We used the RED One and the 5DS, but the EPIC was already on the market. The footage could have been technically better if shot with today’s cameras. And it could have been 10 times better if we had had better shooting conditions and been more prepared.

But yes, absolutely – the final delivers all I wanted to convey.

NB: Are you currently working on anything interesting?

PK: There are always lots of ideas in my head. We’re currently working on a new short involving generative imagery (I’m being vague on purpose), as well as a very interesting project suited for VR that explores music through a spatial dimension.

NB: Patryk, thank you for sharing the production details with us. Congratulations on being awarded at the 12 Months Film Festival!