Kinect hacking, worldwide, continues unabated, taking the tool from proof-of-concept depth tests to something that could actually provide the basis of real media. The killer app, it seems, will be interactive art, as Kinect’s ability to interact with someone in space is ideal for the kinds of interactions designers need – perhaps, ironically, more so than in games.

Most promising of all, Robert Praxmarer writes to share his team’s work on skeletal tracking, top. This is really the magic of Kinect: beyond just tracking depth, once you have a notion of a human skeleton, you can really begin to interact with movement, opening up seriously powerful possibilities for spatial interaction and dance. Robert comes from a “newly-formed new media research lab in Austria called CADET – Center for Advances in Digital Entertainment Technologies.”
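To make the idea concrete, here is a minimal sketch of how interaction logic sits on top of skeletal tracking. The joint names and metre-scaled coordinates are assumptions for illustration; middleware such as OpenNI/NITE exposes comparable per-joint positions, and once you have them, a gesture check is plain geometry:

```python
# Hedged sketch: detecting a simple gesture ("both hands raised") from
# tracked skeleton joints. Joint names and the y-up, metre-scaled frame
# are assumptions, not any specific middleware's API.

def hands_raised(joints, margin=0.05):
    """Return True if both hands are above the head by at least `margin` metres.

    `joints` maps joint names to (x, y, z) positions with y pointing up.
    """
    head_y = joints["head"][1]
    return all(joints[hand][1] > head_y + margin
               for hand in ("left_hand", "right_hand"))

# Example frame (hypothetical values, in metres):
frame = {
    "head": (0.0, 1.60, 2.0),
    "left_hand": (-0.3, 1.80, 1.9),
    "right_hand": (0.3, 1.75, 1.9),
}
print(hands_raised(frame))  # both hands are above the head here
```

The same pattern scales up: velocity between frames gives you movement dynamics, and comparing limb vectors gives you poses, which is exactly the vocabulary a dance or spatial-interaction piece needs.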

Game design with something like Kinect is no small challenge, which is why the launch lineup on the Xbox itself – with the possible exception of Harmonix’s dance game, Dance Central – has been deemed disappointing by many early adopters and critics. To really explore the medium’s possibilities, it may be absolutely essential to get independent designers working on the problem, freed from the pressures of the commercial game development pipeline. Accordingly, the Austrian team isn’t just doing interesting analyses. They’re putting that work to the test with an open-source Kinect game that uses generative realtime graphics and a rich musical soundtrack.

And thanks to this work being open source, you can expect to build on the results yourself, standing on each other’s shoulders instead of each other’s toes.

Lyserg21 is an attempt to make an open-source Kinect game with stunning generative realtime graphics. It can be compared to Child of Eden, but I want it to find its own style and game mechanics. Basically, it will be an audio/visual tunnel trip like REZ and other classics…

If someone wants to join in developing or producing assets, get in touch …

Aside from a few of these examples and a handful of others I've seen, it seems to me that most of these things could be done with OpenCV or other existing camera-tracking tools, albeit with a bit more work to make up for the lack of an IR depth sensor. Maybe I'm missing something here, but I'm really excited to see more work involving the Z-axis.
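The Z-axis point is worth spelling out: with an RGB camera, isolating a performer from the background means background subtraction, lighting control, or markers, while with a depth map it collapses to a threshold on distance. A minimal sketch, using a tiny hypothetical depth frame (millimetre values invented for illustration, with 0 standing in for "no reading"):

```python
# Hedged sketch: depth-based foreground segmentation. With per-pixel
# distances, pulling out whoever stands in the interaction zone is a
# simple range test - no background model or lighting control needed.
# The frame below is a made-up 4x4 depth map, values in millimetres;
# 0 marks an invalid reading (an assumption for this sketch).

def segment_foreground(depth, near=400, far=1500):
    """Mask pixels whose depth falls inside the [near, far] zone (in mm).

    `depth` is a 2D list of per-pixel distances; invalid pixels (0) fall
    outside the range and are excluded automatically.
    """
    return [[near <= d <= far for d in row] for row in depth]

depth_frame = [
    [0,    2500, 2500, 2500],
    [2500,  900,  950, 2500],
    [2500,  900,  950, 2500],
    [0,    2500, 2500, 2500],
]
mask = segment_foreground(depth_frame)
print(sum(cell for row in mask for cell in row))  # 4 pixels in the zone
```

A 2D tracker can approximate this with careful background modelling, but the depth version is robust to lighting changes and to backgrounds that look like the performer, which is exactly the gap the comment above is pointing at.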

CDM is an online magazine for creative technology, from music and DJing to motion and more.