Okay, great: you can project onto facades and surfaces in ways that make the image tailor-made for the surface. What else can you do?

Rafal Bielski and a small team from Poland provide a glimpse of a more awesome, more futuristic future. Here, projection mapping isn’t content with a still, static surface like a building. The surface and the projection can both move, aided by robotic motors. As the image tracks the moving object, the combined result comes closer to the dream of transforming the physical reality around us with digital visuals.

How cool is this? Well, for starters, that video above is not something done in post. It’s not a special effect. It’s live, real-time projection you’re watching – really. (No one hiding behind the curtain, either.)

Rafal explains that the result is “a combination of interactive real-time 3D projection, robotics and augmented reality”:

MPS is the realization of an idea of pushing projection mapping one step forward, making it even more spectacular and involving. In short, we have synchronized the rotation of a stepper motor with the “geometric” projection, with respect to the position of the viewer. This is the moment where we get the spectator to “believe” that this is a metallic, weird box rotating in front of the window of the skyscraper. It was shocking to me when I made the very first panel and showed our friends that Vimeo flick. Most of them — like 80% — thought it was all post-production, a 3D animation (the industry the three of us come from), and that Arek Rekita was just keyed in there. So imagine the sensation when you are actually standing in front of it. Its physical form is real, and I think you can sense those qualities even through the computer’s screen.

The ability to visually influence human beings is weakening. We are getting more and more resistant to that form of communication. (When was the last time you heard that a movie had “good CG FX, really, go see it”?) In the film industry, I hope we are getting back to the era of a good script over spectacular effects, but in advertising, there is a whole new world coming. I could rant about it for hours, so let’s get back to MPS. The upper projection-surface part is a 50 kg vinyl polymer structure, reaching 3.5 m in height. The base is a custom-made device built from certified components, mainly a stepper motor connected to and controlled via an Arduino.

The PC has twin GeForce cards driving three projectors – BenQ W1100 and wide-lens Mitsubishi 230u-st models. That worked well, but the system has also run with Christie Roadster monsters, not shown in this video. Of course, we went with Derivative TouchDesigner as our interactive visual programming environment. It was a spectacular learning process: adapting third-party libraries, synchronizing devices, and writing tools to speed up calibration. Yeah, we developed our own calibration software: it’s a semi-automatic little tool that works live. We are going to introduce it on our Facebook profile soon. Stay tuned.
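The core trick Rafal describes — keeping the “geometric” projection locked to the motor’s rotation, rendered from the spectator’s eye point — can be sketched in a few lines. To be clear, this is not MPS’s actual code: the step count, microstepping, rotation axis, and pinhole projection below are all illustrative assumptions.

```python
import math

STEPS_PER_REV = 200 * 16  # assumption: 200-step motor with 16x microstepping

def motor_angle(step_count):
    """Current rotation of the surface, in radians, from the stepper's step count."""
    return 2 * math.pi * step_count / STEPS_PER_REV

def rotate_y(point, angle):
    """Rotate a 3D point around the vertical (Y) axis by the motor's angle."""
    x, y, z = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, c * z - s * x)

def project(point, viewer, focal=1.0):
    """Simple pinhole projection relative to the viewer's position. Rendering
    from the spectator's eye point is what sells the illusion that the image
    is 'painted on' the physical object."""
    x, y, z = (p - v for p, v in zip(point, viewer))
    return (focal * x / z, focal * y / z)

def frame(model_points, step_count, viewer):
    """Each frame: read the step count reported by the motor controller,
    rotate the virtual model by the same angle, and re-project for the
    tracked viewer position."""
    angle = motor_angle(step_count)
    return [project(rotate_y(p, angle), viewer) for p in model_points]
```

In a real rig the step count would arrive from the Arduino over serial and the render would happen inside TouchDesigner; the point here is only the synchronization loop: physical angle in, matching virtual transform out, every frame.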

More is coming, too. Rafal tells CDM, “in the near future, we will present our in-house software for semiautomatic projector/3D calibration, and more.” Yes. We’ll look forward to that.

Seems to be the start of the next “something very cool,” but I wish the camera had been stationary so we could get a better idea of the sculpture’s motion. My impression from the video is that it actually wasn’t moving very much. As a practical matter, a smart object made of a video-active fabric (so it is self-contained) would obviate the need for all the extra hassle of projecting/tracking. You’d want wireless data control then, but I guess only the military has that stuff at the moment?