Kinect Skeleton Test from CADET on Vimeo.

Kinect hacking, worldwide, continues unabated, taking the tool from proof-of-concept depth tests to something that could actually form the basis of real media. The killer app, it seems, will be interactive art, as Kinect’s ability to track someone in space is ideal for the kinds of interactions designers need – perhaps, ironically, more so than in games.

Most promising of all, Robert Praxmarer writes to share his team’s work on skeletal tracking, top. This is really the magic of Kinect: beyond just tracking depth, once you have a notion of a human skeleton, you can really begin to interact with movement, opening up seriously powerful possibilities for spatial interaction and dance. Robert comes from a “newly-formed research new media lab in Austria called CADET – Center for Advances in Digital Entertainment Technologies.”
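To make that concrete: once a skeleton library (like the one CADET plans to release) hands you joint positions, simple poses and gestures fall out of basic geometry. Here's a minimal sketch – the joint names, coordinate frame, and data layout are illustrative assumptions, not a real API.

```python
# Hypothetical sketch: with per-joint 3D positions, pose detection is
# just coordinate comparisons. Joint names and the y-up frame are
# assumptions, not taken from any actual skeleton library.

def hands_above_head(joints):
    """Return True if both hands are higher than the head.

    `joints` maps joint names to (x, y, z) tuples; y grows upward
    (an assumption about the hypothetical library's coordinate frame).
    """
    head_y = joints["head"][1]
    return (joints["left_hand"][1] > head_y and
            joints["right_hand"][1] > head_y)

# Example skeleton frame (meters; made-up values):
frame = {
    "head":       (0.0, 1.6, 2.0),
    "left_hand":  (-0.4, 1.8, 1.9),
    "right_hand": (0.5, 1.7, 1.9),
}
print(hands_above_head(frame))  # True for this frame
```

A dance piece would build on exactly this kind of test – triggering sound or visuals when a performer's joints cross spatial thresholds.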

Game design with something like Kinect is no small challenge, which is why the launch lineup on the Xbox itself – with the possible exception of Harmonix’s dance game, Dance Central – has been deemed disappointing by many early adopters and critics. To really explore the medium’s possibilities, it may be absolutely essential to get independent designers working on the problem, freed from the pressures of the commercial game development pipeline. Accordingly, the Austrian team isn’t just doing interesting analyses. They’re putting that work to the test with an open-source Kinect game that uses generative realtime graphics and a rich musical soundtrack.

First visual tests for kinect game protoype from CADET on Vimeo.

And thanks to this work being open source, you can expect to build on the results yourself, standing on each other’s shoulders instead of each other’s toes.

Lyserg21 is an attempt to make an open source Kinect game with stunning generative realtime graphics. It can be compared with Child of Eden, but I want it to find its own style and game mechanics. Basically, it will be an audio/visual tunnel trip akin to REZ and other classics…

If someone wants to join in developing or producing assets, get in touch …

Stay tuned for our skeleton library coming soon @ cadet.at !!!!!!!

Credits:
Coding: Robert Praxmarer
Video: Lisa Eidenhammer
Photos: Matthias Paul Hempt
Music: Squarepusher – Brainbug; Monolake – Plumbicon;
remixed by Robert Praxmarer

http://cadet.at

Other developments look promising, as well. Here’s a quick round-up:

vvvv + Kinect, on their forums

Kinect + lighting, with DMX (via OpenDMX):
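The lighting idea is straightforward to sketch: map a depth reading from the Kinect onto an 8-bit DMX channel level. The depth range, channel assignment, and mapping below are my assumptions for illustration – a real rig would push the resulting 512-byte universe out through OpenDMX hardware each frame.

```python
# Sketch of the Kinect-to-DMX idea: convert a tracked depth reading
# (millimeters) into a DMX level (0-255). Range and channel choice
# are assumptions; actual output would go over OpenDMX hardware.

def depth_to_dmx(depth_mm, near=500, far=4000):
    """Map depth in mm to a DMX level: full at `near`, zero at `far`.

    Kinect's usable range is roughly 0.5-4 m, hence the defaults.
    """
    clamped = min(max(depth_mm, near), far)
    level = (far - clamped) / (far - near)  # 1.0 near, 0.0 far
    return round(level * 255)

# A 512-channel DMX universe, with channel 1 driven by the sensor:
universe = [0] * 512
universe[0] = depth_to_dmx(1200)  # dancer ~1.2 m from the sensor
```

So a performer walking toward the sensor would smoothly bring the light up to full – no fader operator required.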

Kinect object recognition, seeming a bit like the early training tests of the fictional HAL computer, via OMG! Ubuntu:

Ubuntu users, check out the PPA for easy multi-touch Kinect support.

Discussing Processing + Java + Kinect on the Processing Forum

We could even see Kinect support in a browser, using JavaScript, all via an open source library you can find right now:

DepthJS from Fluid Interfaces on Vimeo.

Flight404 is doing some beautiful and creative work, experimenting with the aesthetic possibilities, from making puffy, cloud-like versions of himself to setting himself on fire:

Fat Cat from flight404 on Vimeo.

Conflagration, alternate from flight404 on Vimeo.

In other news:

Microsoft claims they left Kinect open by design (I’m guessing not everyone at Microsoft is entirely of the same mind there, but okay), complete with a video round-up [Ars Technica]

Our friend Filip at Creative Applications has been really on top of this. See:

Kinect – One Week Later [Processing, oF, Cinder, MaxMSP]

And best of all, he’s doing a master list of projects:

http://www.creativeapplications.net/kinect/

Keep those tips coming; in about a week, as things fall into place, I’m planning a “where to start” guide for the many, many platforms that are adding Kinect libraries.

  • http://www.enjoytheblank.com Mike Cohen

    Aside from a few of these examples and a handful of others I've seen, it seems to me that most of these things can be done with OpenCV or other existing camera tracking, albeit with a bit more work to make up for the IR sensor. Maybe I'm missing something here, but I'm really excited to see more work involving the Z-axis.
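The commenter's point about the Z-axis is worth making concrete. With plain RGB, separating a person from the background means background models, color keying, or markers; with a depth map, it's a single threshold on distance. A toy sketch, using a one-dimensional "depth row" of made-up millimeter readings for brevity:

```python
# Depth-based segmentation in one line of logic: anything closer than
# a threshold is foreground. (Kinect reports 0 for "no reading", so
# zeros are treated as background.)

def segment_foreground(depth_row, max_mm=1500):
    """Mark readings closer than max_mm as foreground (True)."""
    return [0 < d < max_mm for d in depth_row]

row = [0, 3200, 3100, 1200, 1180, 1150, 3050, 0]  # person ~1.2 m away
print(segment_foreground(row))
# [False, False, False, True, True, True, False, False]
```

Doing the same robustly from an RGB image alone is a genuinely hard computer-vision problem – which is exactly why the depth axis is the exciting part.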